Search (7 results, page 1 of 1)

  • × theme_ss:"Elektronisches Publizieren"
  • × theme_ss:"Informetrie"
  • × type_ss:"a"
  1. Costas, R.; Perianes-Rodríguez, A.; Ruiz-Castillo, J.: On the quest for currencies of science : field "exchange rates" for citations and Mendeley readership (2017) 0.09
    0.08645598 = product of:
      0.17291196 = sum of:
        0.13832192 = weight(_text_:fields in 4051) [ClassicSimilarity], result of:
          0.13832192 = score(doc=4051,freq=8.0), product of:
            0.31604284 = queryWeight, product of:
              4.951651 = idf(docFreq=849, maxDocs=44218)
              0.06382575 = queryNorm
            0.43766826 = fieldWeight in 4051, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              4.951651 = idf(docFreq=849, maxDocs=44218)
              0.03125 = fieldNorm(doc=4051)
        0.034590032 = weight(_text_:22 in 4051) [ClassicSimilarity], result of:
          0.034590032 = score(doc=4051,freq=2.0), product of:
            0.2235069 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.06382575 = queryNorm
            0.15476047 = fieldWeight in 4051, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.03125 = fieldNorm(doc=4051)
      0.5 = coord(2/4)
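    The score breakdown above is Lucene's ClassicSimilarity (TF-IDF) explanation: each matching term contributes a fieldWeight (sqrt(tf) × idf × fieldNorm) multiplied by a queryWeight (idf × queryNorm), and the clause sum is scaled by the coordination factor coord(matching clauses / total clauses). A minimal Python sketch, using only the numbers reported above, reproduces the 0.0865 score for this record; the function and variable names are illustrative and not part of any search-engine API.

    ```python
    import math

    def clause_score(freq, idf, query_norm, field_norm):
        """One ClassicSimilarity clause: queryWeight * fieldWeight."""
        tf = math.sqrt(freq)                  # 2.828427 for freq=8.0
        query_weight = idf * query_norm       # 0.31604284 for the "fields" term
        field_weight = tf * idf * field_norm  # 0.43766826 for the "fields" term
        return query_weight * field_weight

    # Values copied from the explanation for doc 4051 above.
    fields_term = clause_score(8.0, 4.951651, 0.06382575, 0.03125)  # ~0.13832192
    term_22 = clause_score(2.0, 3.5018296, 0.06382575, 0.03125)     # ~0.034590032

    # Two of the four query clauses matched, so coord(2/4) = 0.5 scales the sum.
    print(0.5 * (fields_term + term_22))  # ~0.08645598, the score shown above
    ```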
    
    Abstract
    Purpose: The introduction of "altmetrics" as new tools to analyze scientific impact within the reward system of science has challenged the hegemony of citations as the predominant source for measuring scientific impact. Mendeley readership has been identified as one of the most important altmetric sources, with several features that are similar to citations. The purpose of this paper is to perform an in-depth analysis of the differences and similarities between the distributions of Mendeley readership and citations across fields.
    Design/methodology/approach: The authors analyze two issues, using in each case a common analytical framework for both metrics: the shape of the distributions of readership and citations, and the field normalization problem generated by differences in citation and readership practices across fields. For the first issue the authors use the characteristic scores and scales method, and for the second the measurement framework introduced in Crespo et al. (2013).
    Findings: There are three main results. First, the citation and Mendeley readership distributions exhibit a strikingly similar degree of skewness in all fields. Second, the results on "exchange rates (ERs)" for Mendeley readership empirically support the possibility of comparing readership counts across fields, as well as the field normalization of readership distributions using ERs as normalization factors. Third, field normalization using field mean readerships as normalization factors leads to comparably good results.
    Originality/value: These findings open up challenging new questions, particularly regarding the possibility of obtaining conflicting results from field-normalized citation and Mendeley readership indicators; this suggests the need to better determine the role of the two metrics in capturing scientific recognition.
    Date
    20. 1.2015 18:30:22
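    The field-normalization step described in the Findings of record 1 can be made concrete with a small sketch. This is a hedged illustration of the mean-based variant only (dividing each paper's readership count by its field's mean readership, as the abstract describes); the field names and counts are invented for illustration, and the exchange-rate variant of Crespo et al. (2013) is not reproduced here.

    ```python
    from statistics import mean

    def field_normalize(counts_by_field):
        """Divide each paper's count by its field's mean count so that
        normalized values are comparable across fields."""
        normalized = {}
        for field, counts in counts_by_field.items():
            factor = mean(counts)  # field mean readership as the normalization factor
            normalized[field] = [c / factor for c in counts]
        return normalized

    # Hypothetical readership counts for two fields with different reading practices.
    readers = {
        "cell_biology": [40, 10, 2, 0],  # field mean = 13
        "mathematics": [8, 3, 1, 0],     # field mean = 3
    }
    print(field_normalize(readers))
    ```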
  2. Frandsen, T.F.: The integration of open access journals in the scholarly communication system : three science fields (2009) 0.05
    0.05187072 = product of:
      0.20748287 = sum of:
        0.20748287 = weight(_text_:fields in 4210) [ClassicSimilarity], result of:
          0.20748287 = score(doc=4210,freq=8.0), product of:
            0.31604284 = queryWeight, product of:
              4.951651 = idf(docFreq=849, maxDocs=44218)
              0.06382575 = queryNorm
            0.65650237 = fieldWeight in 4210, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              4.951651 = idf(docFreq=849, maxDocs=44218)
              0.046875 = fieldNorm(doc=4210)
      0.25 = coord(1/4)
    
    Abstract
    The greatest number of open access journals (OAJs) is found in the sciences, and their influence is growing. However, there are only a few studies on the acceptance, and thus the integration, of these OAJs in the scholarly communication system, and even fewer provide insight into the differences across disciplines. This study analyzes citing behaviour in journals within three science fields: biology, mathematics, and pharmacy and pharmacology. It is a statistical analysis of OAJs as well as non-OAJs, covering both the citing and the cited side of journal-to-journal citations. The multivariate linear regression reveals many similarities in citing behaviour across fields and media, but it also points to great differences: the integration of OAJs in the scholarly communication system varies considerably across fields. The implications for bibliometric research are discussed.
  3. Zahedi, Z.; Costas, R.; Wouters, P.: Mendeley readership as a filtering tool to identify highly cited publications (2017) 0.03
    0.030565115 = product of:
      0.12226046 = sum of:
        0.12226046 = weight(_text_:fields in 3837) [ClassicSimilarity], result of:
          0.12226046 = score(doc=3837,freq=4.0), product of:
            0.31604284 = queryWeight, product of:
              4.951651 = idf(docFreq=849, maxDocs=44218)
              0.06382575 = queryNorm
            0.38684773 = fieldWeight in 3837, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              4.951651 = idf(docFreq=849, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3837)
      0.25 = coord(1/4)
    
    Abstract
    This study presents a large-scale analysis of the distribution and presence of Mendeley readership scores over time and across disciplines. We study whether Mendeley readership scores (RS) can identify highly cited publications more effectively than journal citation scores (JCS). Web of Science (WoS) publications with digital object identifiers (DOIs) published during the period 2004-2013 and across five major scientific fields were analyzed. The main result of this study shows that RS are more effective (in terms of precision/recall values) than JCS at identifying highly cited publications across all fields of science and publication years. The findings also show that 86.5% of all the publications are covered by Mendeley and have at least one reader. The share of publications with Mendeley RS increases from 84% in 2004 to 89% in 2009, and then decreases from 88% in 2010 to 82% in 2013. However, publications from 2010 onwards exhibit, on average, a higher density of readership scores than of citation scores. This indicates that RS are more prevalent than citations for recent publications and could hence work as an early indicator of research impact. These findings highlight the potential and value of Mendeley as a tool for scientometric purposes, and particularly as a relevant tool for identifying highly cited publications.
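    The evaluation described in the abstract above compares readership scores and journal citation scores as filters for highly cited publications in terms of precision and recall. A minimal sketch of that comparison follows, assuming "highly cited" means the top share of publications by citations and that the filter simply takes the top share by the candidate metric; the threshold and the toy data are assumptions, not the study's protocol.

    ```python
    def top_share(scores, share=0.1):
        """IDs of the papers in the top `share` fraction by the given metric."""
        ranked = sorted(scores, key=scores.get, reverse=True)
        return set(ranked[: max(1, int(len(ranked) * share))])

    def precision_recall(selected, relevant):
        """Precision and recall of a selected set against the highly cited set."""
        hits = len(selected & relevant)
        return hits / len(selected), hits / len(relevant)

    # Toy example: citations define the "highly cited" set; readership is the filter.
    citations = {"p1": 120, "p2": 3, "p3": 45, "p4": 0, "p5": 9}
    readers = {"p1": 300, "p2": 10, "p3": 20, "p4": 1, "p5": 80}

    highly_cited = top_share(citations, 0.2)                         # {"p1"}
    print(precision_recall(top_share(readers, 0.2), highly_cited))   # (1.0, 1.0)
    ```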
  4. Kousha, K.; Thelwall, M.; Abdoli, M.: Goodreads reviews to assess the wider impacts of books (2017) 0.02
    0.0216128 = product of:
      0.0864512 = sum of:
        0.0864512 = weight(_text_:fields in 3768) [ClassicSimilarity], result of:
          0.0864512 = score(doc=3768,freq=2.0), product of:
            0.31604284 = queryWeight, product of:
              4.951651 = idf(docFreq=849, maxDocs=44218)
              0.06382575 = queryNorm
            0.27354267 = fieldWeight in 3768, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.951651 = idf(docFreq=849, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3768)
      0.25 = coord(1/4)
    
    Abstract
    Although peer review and citation counts are commonly used to help assess the scholarly impact of published research, informal reader feedback might also be exploited to help assess the wider impacts of books, such as their educational or cultural value. The social website Goodreads seems to be a reasonable source for this purpose because it includes a large number of book reviews and ratings by many users inside and outside academia. To check this, Goodreads book metrics were compared with different book-based impact indicators for 15,928 academic books across broad fields. Goodreads engagements were numerous enough in the arts (85% of books had at least one), humanities (80%), and social sciences (67%) for use as a source of impact evidence. Low and moderate correlations between Goodreads book metrics and scholarly or non-scholarly indicators suggest that reader feedback in Goodreads reflects the many purposes of books rather than a single type of impact. Although Goodreads book metrics can be manipulated, they could be used guardedly by academics, authors, and publishers in evaluations.
  5. Walters, W.H.; Linvill, A.C.: Bibliographic index coverage of open-access journals in six subject areas (2011) 0.01
    0.010809385 = product of:
      0.04323754 = sum of:
        0.04323754 = weight(_text_:22 in 4635) [ClassicSimilarity], result of:
          0.04323754 = score(doc=4635,freq=2.0), product of:
            0.2235069 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.06382575 = queryNorm
            0.19345059 = fieldWeight in 4635, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4635)
      0.25 = coord(1/4)
    
    Abstract
    We investigate the extent to which open-access (OA) journals and articles in biology, computer science, economics, history, medicine, and psychology are indexed in each of 11 bibliographic databases. We also look for variations in index coverage by journal subject, journal size, publisher type, publisher size, date of first OA issue, region of publication, language of publication, publication fee, and citation impact factor. Two databases, Biological Abstracts and PubMed, provide very good coverage of the OA journal literature, indexing 60-63% of all OA articles in their disciplines. Five databases provide moderately good coverage (22-41%), and four provide relatively poor coverage (0-12%). OA articles in biology journals, English-only journals, high-impact journals, and journals that charge publication fees of $1,000 or more are especially likely to be indexed. Conversely, articles from OA publishers in Africa, Asia, or Central/South America are especially unlikely to be indexed. Four of the 11 databases index commercially published articles at a substantially higher rate than articles published by universities, scholarly societies, nonprofit publishers, or governments. Finally, three databases (EBSCO Academic Search Complete, ProQuest Research Library, and Wilson OmniFile) provide less comprehensive coverage of OA articles than of articles in comparable subscription journals.
  6. Moed, H.F.; Halevi, G.: On full text download and citation distributions in scientific-scholarly journals (2016) 0.01
    0.010809385 = product of:
      0.04323754 = sum of:
        0.04323754 = weight(_text_:22 in 2646) [ClassicSimilarity], result of:
          0.04323754 = score(doc=2646,freq=2.0), product of:
            0.2235069 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.06382575 = queryNorm
            0.19345059 = fieldWeight in 2646, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0390625 = fieldNorm(doc=2646)
      0.25 = coord(1/4)
    
    Date
    22. 1.2016 14:11:17
  7. Ortega, J.L.: The presence of academic journals on Twitter and its relationship with dissemination (tweets) and research impact (citations) (2017) 0.01
    0.010809385 = product of:
      0.04323754 = sum of:
        0.04323754 = weight(_text_:22 in 4410) [ClassicSimilarity], result of:
          0.04323754 = score(doc=4410,freq=2.0), product of:
            0.2235069 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.06382575 = queryNorm
            0.19345059 = fieldWeight in 4410, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4410)
      0.25 = coord(1/4)
    
    Date
    20. 1.2015 18:30:22