Search (2 results, page 1 of 1)

  • theme_ss:"Informationsmittel"
  • theme_ss:"Informetrie"
  1. Meho, L.I.; Rogers, Y.: Citation counting, citation ranking, and h-index of human-computer interaction researchers : a comparison of Scopus and Web of Science (2008)
    Abstract
    This study examines the differences between Scopus and Web of Science in the citation counting, citation ranking, and h-index of 22 top human-computer interaction (HCI) researchers from EQUATOR, a large British Interdisciplinary Research Collaboration project. Results indicate that Scopus provides significantly more coverage of the HCI literature than Web of Science, primarily owing to its coverage of relevant ACM and IEEE peer-reviewed conference proceedings. No significant differences exist between the two databases when only journal citations are compared. Although broader coverage of the literature does not significantly alter the relative citation ranking of individual researchers, Scopus distinguishes between researchers in a more nuanced fashion than Web of Science in both citation counting and h-index. Scopus also generates significantly different maps of the citation networks of individual scholars than Web of Science does. The study also compares h-index scores based on Google Scholar with those based on the union of Scopus and Web of Science. It concludes that Scopus can be used as a sole data source for citation-based research and evaluation in HCI, especially when citations in conference proceedings are sought, and that researchers should calculate h scores manually instead of relying on system calculations (a minimal sketch of that calculation follows the result list).
  2. Zhao, D.; Strotmann, A.: Intellectual structure of information science 2011-2020 : an author co-citation analysis (2022)
    Abstract
    Purpose: This study continues a long history of author co-citation analysis of the intellectual structure of information science into the 2011-2020 time period. It also examines changes in this structure from 2006-2010 through 2011-2015 to 2016-2020. The results contribute to a better understanding of the information science research field.

    Design/methodology/approach: Well-established procedures and techniques for author co-citation analysis were followed (the pairwise counting step is sketched after the result list). Full records of research articles in core information science journals published during 2011-2020 were retrieved and downloaded from the Web of Science database. The roughly 150 most highly cited authors in each of the two five-year time periods were selected from this dataset to represent the field, and their co-citation counts were calculated. Each co-citation matrix was input into SPSS for factor analysis, and the results were visualized in Pajek. Factors were interpreted as specialties and labeled after an examination of articles written by the authors who load primarily on each factor.

    Findings: The two-camp structure of information science continued to be clearly present. Bibliometric indicators for research evaluation dominated the Knowledge Domain Analysis camp during both five-year time periods, whereas interactive information retrieval (IR) dominated the IR camp during 2011-2015 but shared dominance with information behavior during 2016-2020. Bridging between the two camps became increasingly weak and was provided only by the scholarly communication specialty during 2016-2020. The IR systems specialty drifted further away from the IR camp. The information behavior specialty experienced a deep slump in its evolution during 2011-2020. Altmetrics grew to dominate the Webometrics specialty and drove its sharp rise during 2016-2020.

    Originality/value: Author co-citation analysis (ACA) is effective in revealing the intellectual structures of research fields. Most related studies used term-based methods to identify individual research topics but did not examine the interrelationships between these topics or the overall structure of the field. The few studies that did discuss the overall structure paid little attention to the effect of changes in the source journals on the results. The present study avoids these problems and continues a long history of benchmark contributions to a better understanding of the information science field using ACA.
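The manual h-index calculation recommended in the first abstract reduces to a simple procedure: sort a researcher's citation counts in descending order and take the largest rank h at which the h-th paper still has at least h citations. A minimal Python sketch of that calculation (the function name and sample counts are illustrative, not taken from the study):

```python
def h_index(citation_counts):
    """Largest h such that at least h papers have >= h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citation_counts, reverse=True), start=1):
        if cites >= rank:
            h = rank   # this paper still clears the threshold
        else:
            break      # counts are sorted, so no later paper can
    return h

# Illustrative example: merged citation counts for one researcher,
# e.g. deduplicated from the union of Scopus and Web of Science.
print(h_index([48, 33, 30, 12, 7, 7, 4, 1]))  # -> 6
```

Because the counts are sorted once and scanned in order, the loop can stop at the first paper that falls below its rank, which is why checking a database's system-reported value by hand is cheap.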
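The co-citation counting step in the second abstract can likewise be sketched under one common counting rule: two authors are co-cited whenever both appear among the cited authors of the same citing article, and the matrix cell for a pair is the number of citing articles in which they co-occur. The abstract does not specify the exact counting variant, so this is an assumption; the sample author names and input shape are hypothetical, not the study's Web of Science export format:

```python
from collections import Counter
from itertools import combinations

def cocitation_counts(reference_lists):
    """For each pair of cited authors, count the citing articles
    that reference both (each article contributes at most once)."""
    counts = Counter()
    for refs in reference_lists:
        # set() ensures an author cited several times in one article
        # is counted once; sorted() canonicalizes the pair order.
        for pair in combinations(sorted(set(refs)), 2):
            counts[pair] += 1
    return counts

# Hypothetical input: cited authors per citing article.
articles = [
    ["Zhao D", "Strotmann A", "White HD"],
    ["White HD", "Zhao D"],
    ["Strotmann A", "White HD"],
]
for pair, n in sorted(cocitation_counts(articles).items()):
    print(pair, n)
```

The symmetric matrix assembled from such pairwise counts is what the study describes feeding into factor analysis (SPSS) and visualization (Pajek).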