Search (2 results, page 1 of 1)

  • theme_ss:"Informetrie"
  • theme_ss:"Informationsmittel"
  1. Teplitskiy, M.; Lu, G.; Duede, E.: Amplifying the impact of open access : Wikipedia and the diffusion of science (2017) 0.02
    0.024234561 = product of:
      0.048469122 = sum of:
        0.048469122 = product of:
          0.096938245 = sum of:
            0.096938245 = weight(_text_:policy in 3782) [ClassicSimilarity], result of:
              0.096938245 = score(doc=3782,freq=2.0), product of:
                0.2727254 = queryWeight, product of:
                  5.361833 = idf(docFreq=563, maxDocs=44218)
                  0.05086421 = queryNorm
                0.35544267 = fieldWeight in 3782, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.361833 = idf(docFreq=563, maxDocs=44218)
                  0.046875 = fieldNorm(doc=3782)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
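
    The tree above is standard Lucene ClassicSimilarity explain output, and the arithmetic can be reproduced from the listed components: tf is the square root of the term frequency, idf is 1 + ln(maxDocs / (docFreq + 1)), queryWeight is idf × queryNorm, fieldWeight is tf × idf × fieldNorm, and each coord(1/2) factor halves the result because only one of two query clauses matched at that level. Below is a minimal Python sketch re-deriving the displayed score; all numbers are copied from the output above, this is an illustrative reconstruction rather than the engine's own code, and the same decomposition applies to the second entry further down.

      import math

      # Components copied from the explain output for doc 3782
      max_docs, doc_freq = 44218, 563
      freq = 2.0
      field_norm = 0.046875
      query_norm = 0.05086421

      idf = 1 + math.log(max_docs / (doc_freq + 1))   # 5.361833
      tf = math.sqrt(freq)                            # 1.4142135
      query_weight = idf * query_norm                 # 0.2727254
      field_weight = tf * idf * field_norm            # 0.35544267
      weight = query_weight * field_weight            # 0.096938245
      score = weight * 0.5 * 0.5                      # two coord(1/2) factors
      print(score)                                    # ~0.0242346, matching the 0.024234561 shown above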
    
    Abstract
    With the rise of Wikipedia as a first-stop source for scientific information, it is important to understand whether Wikipedia draws upon the research that scientists value most. Here we identify the 250 most heavily used journals in each of 26 research fields (4,721 journals, 19.4M articles) indexed by the Scopus database, and test whether topic, academic status, and accessibility make articles from these journals more or less likely to be referenced on Wikipedia. We find that a journal's academic status (impact factor) and accessibility (open access policy) both strongly increase the probability of it being referenced on Wikipedia. Controlling for field and impact factor, the odds that an open access journal is referenced on the English Wikipedia are 47% higher compared to paywall journals. These findings provide evidence that a major consequence of open access policies is to significantly amplify the diffusion of science, through an intermediary like Wikipedia, to a broad audience.
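
    A note on reading the headline figure: "47% higher odds" is an odds ratio of roughly 1.47, not a 47-percentage-point increase in probability. A small illustrative calculation follows, using a made-up 10% baseline chance that a paywall journal is referenced; the baseline is hypothetical, not a figure from the paper.

      # Converting the reported odds ratio (~1.47) into probabilities
      # under an assumed, purely illustrative 10% baseline.
      odds_ratio = 1.47
      p_paywall = 0.10
      odds_paywall = p_paywall / (1 - p_paywall)   # 0.111
      odds_oa = odds_paywall * odds_ratio          # 0.163
      p_oa = odds_oa / (1 + odds_oa)               # ~0.14
      print(f"referenced: {p_oa:.1%} (open access) vs {p_paywall:.1%} (paywall)")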
  2. Meho, L.I.; Rogers, Y.: Citation counting, citation ranking, and h-index of human-computer interaction researchers : a comparison of Scopus and Web of Science (2008) 0.01
    0.008614248 = product of:
      0.017228495 = sum of:
        0.017228495 = product of:
          0.03445699 = sum of:
            0.03445699 = weight(_text_:22 in 2352) [ClassicSimilarity], result of:
              0.03445699 = score(doc=2352,freq=2.0), product of:
                0.1781178 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.05086421 = queryNorm
                0.19345059 = fieldWeight in 2352, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2352)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    This study examines the differences between Scopus and Web of Science in the citation counting, citation ranking, and h-index of 22 top human-computer interaction (HCI) researchers from EQUATOR - a large British Interdisciplinary Research Collaboration project. Results indicate that Scopus provides significantly more coverage of HCI literature than Web of Science, primarily due to coverage of relevant ACM and IEEE peer-reviewed conference proceedings. No significant differences exist between the two databases if citations in journals only are compared. Although broader coverage of the literature does not significantly alter the relative citation ranking of individual researchers, Scopus helps distinguish between the researchers in a more nuanced fashion than Web of Science in both citation counting and h-index. Scopus also generates significantly different maps of citation networks of individual scholars than those generated by Web of Science. The study also presents a comparison of h-index scores based on Google Scholar with those based on the union of Scopus and Web of Science. The study concludes that Scopus can be used as a sole data source for citation-based research and evaluation in HCI, especially when citations in conference proceedings are sought, and that researchers should manually calculate h scores instead of relying on system calculations.
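
    The closing recommendation to calculate h scores manually rather than rely on system calculations comes down to applying the standard definition, that a researcher has index h if h of their papers have at least h citations each, to a merged and deduplicated citation list. A minimal sketch of that calculation is given below; the citation counts in the example are invented.

      def h_index(citations):
          """Largest h such that h papers have at least h citations each."""
          counts = sorted(citations, reverse=True)
          h = 0
          for rank, cites in enumerate(counts, start=1):
              if cites >= rank:
                  h = rank
              else:
                  break
          return h

      # Hypothetical merged citation counts for one researcher
      print(h_index([42, 18, 17, 9, 6, 6, 3, 1]))  # 6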