Search (6 results, page 1 of 1)

  • Filter: type_ss:"m"
  • Filter: theme_ss:"Informetrie"
  1. Diodato, V.: Dictionary of bibliometrics (1994) 0.02
    0.019541508 = product of:
      0.09770754 = sum of:
        0.09770754 = weight(_text_:22 in 5666) [ClassicSimilarity], result of:
          0.09770754 = score(doc=5666,freq=2.0), product of:
            0.18038483 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.051511593 = queryNorm
            0.5416616 = fieldWeight in 5666, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.109375 = fieldNorm(doc=5666)
      0.2 = coord(1/5)
    
    Footnote
    Review in: Journal of library and information science 22(1996) no.2, p.116-117 (L.C. Smith)
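The score breakdown above is Lucene's ClassicSimilarity (TF-IDF) explain output for the term "22". As a sanity check, the tree can be reproduced in a few lines of Python, a sketch using Lucene's documented formulas idf = 1 + ln(maxDocs / (docFreq + 1)) and tf = sqrt(freq); every constant below is copied from the explain tree, nothing is estimated:

```python
import math

# Reproduce the ClassicSimilarity explain tree for result 1 (doc 5666).
# Constants (docFreq, maxDocs, freq, fieldNorm, queryNorm, coord) are
# taken verbatim from the explain output above.

def idf(doc_freq: int, max_docs: int) -> float:
    """Lucene ClassicSimilarity idf: 1 + ln(maxDocs / (docFreq + 1))."""
    return 1.0 + math.log(max_docs / (doc_freq + 1))

def tf(freq: float) -> float:
    """Lucene ClassicSimilarity tf: sqrt(freq)."""
    return math.sqrt(freq)

query_norm = 0.051511593   # queryNorm from the explain tree
field_norm = 0.109375      # fieldNorm(doc=5666)

idf_term = idf(3622, 44218)                     # ~3.5018296
query_weight = idf_term * query_norm            # ~0.18038483
field_weight = tf(2.0) * idf_term * field_norm  # ~0.5416616
coord = 1 / 5                                   # coord(1/5): 1 of 5 query terms matched

score = query_weight * field_weight * coord     # ~0.019541508
print(f"{score:.9f}")
```

Multiplying queryWeight by fieldWeight gives the 0.09770754 shown at the top of the tree; the coord(1/5) factor then yields the final document score of 0.02 (rounded).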
  2. De Bellis, N.: Bibliometrics and citation analysis : from the Science citation index to cybermetrics (2008) 0.02
    0.015058323 = product of:
      0.07529161 = sum of:
        0.07529161 = weight(_text_:index in 3585) [ClassicSimilarity], result of:
          0.07529161 = score(doc=3585,freq=6.0), product of:
            0.2250935 = queryWeight, product of:
              4.369764 = idf(docFreq=1520, maxDocs=44218)
              0.051511593 = queryNorm
            0.3344904 = fieldWeight in 3585, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              4.369764 = idf(docFreq=1520, maxDocs=44218)
              0.03125 = fieldNorm(doc=3585)
      0.2 = coord(1/5)
    
    Content
    Contents: Biblio/sciento/infor-metrics : terminological issues and early historical developments -- The empirical foundations of bibliometrics : the Science citation index -- The philosophical foundations of bibliometrics : Bernal, Merton, Price, Garfield, and Small -- The mathematical foundations of bibliometrics -- Maps and paradigms : bibliographic citations at the service of the history and sociology of science -- Impact factor and the evaluation of scientists : bibliographic citations at the service of science policy and management -- On the shoulders of dwarfs : citation as rhetorical device and the criticisms to the normative model -- Measuring scientific communication in the twentieth century : from bibliometrics to cybermetrics.
    Object
    Science Citation Index
  3. Theories of informetrics and scholarly communication : a Festschrift in honor of Blaise Cronin (2016) 0.01
    0.012295068 = product of:
      0.06147534 = sum of:
        0.06147534 = weight(_text_:index in 3801) [ClassicSimilarity], result of:
          0.06147534 = score(doc=3801,freq=4.0), product of:
            0.2250935 = queryWeight, product of:
              4.369764 = idf(docFreq=1520, maxDocs=44218)
              0.051511593 = queryNorm
            0.27311024 = fieldWeight in 3801, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              4.369764 = idf(docFreq=1520, maxDocs=44218)
              0.03125 = fieldNorm(doc=3801)
      0.2 = coord(1/5)
    
    Content
    Frontmatter -- Foreword -- Prologue -- Contents -- Introduction -- Part I: Critical informetrics -- The Incessant Chattering of Texts -- Informetrics Needs a Foundation in the Theory of Science -- Part II: Citation theories -- Referencing as Cooperation or Competition -- Semiotics and Citations -- Data Citation as a Bibliometric Oxymoron -- Part III: Statistical theories -- Type-Token Theory and Bibliometrics -- From a Success Index to a Success Multiplier -- From Matthew to Hirsch: A Success-Breeds-Success Story -- Information's Magic Numbers: The Numerology of Information Science -- Part IV: Authorship theories -- Authors as Persons and Authors as Bundles of Words -- The Angle Sum Theory: Exploring the Literature on Acknowledgments in Scholarly Communication -- The Flesh of Science: Somatics and Semiotics -- Part V: Knowledge organization theories -- Informetric Analyses of Knowledge Organization Systems (KOSs) -- Information, Meaning, and Intellectual Organization in Networks of Inter-Human Communication -- Modeling the Structure and Dynamics of Science Using Books -- Part VI: Altmetric theories -- Webometrics and Altmetrics: Home Birth vs. Hospital Birth -- Scientific Revolution in Scientometrics: The Broadening of Impact from Citation to Societal -- Altmetrics as Traces of the Computerization of the Research Process -- Interpreting Altmetrics: Viewing Acts on Social Media through the Lens of Citation and Social Theories -- Biographical information for the editor and contributors -- Index
  4. Scholarly metrics under the microscope : from citation analysis to academic auditing (2015) 0.01
    0.011166576 = product of:
      0.05583288 = sum of:
        0.05583288 = weight(_text_:22 in 4654) [ClassicSimilarity], result of:
          0.05583288 = score(doc=4654,freq=2.0), product of:
            0.18038483 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.051511593 = queryNorm
            0.30952093 = fieldWeight in 4654, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0625 = fieldNorm(doc=4654)
      0.2 = coord(1/5)
    
    Date
    22.1.2017 17:12:50
  5. Tüür-Fröhlich, T.: The non-trivial effects of trivial errors in scientific communication and evaluation (2016) 0.01
    0.008693925 = product of:
      0.043469626 = sum of:
        0.043469626 = weight(_text_:index in 3137) [ClassicSimilarity], result of:
          0.043469626 = score(doc=3137,freq=2.0), product of:
            0.2250935 = queryWeight, product of:
              4.369764 = idf(docFreq=1520, maxDocs=44218)
              0.051511593 = queryNorm
            0.1931181 = fieldWeight in 3137, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.369764 = idf(docFreq=1520, maxDocs=44218)
              0.03125 = fieldNorm(doc=3137)
      0.2 = coord(1/5)
    
    Abstract
    "Thomson Reuters' citation indexes, i.e. SCI, SSCI and AHCI, are said to be "authoritative". Because of the huge influence of these databases on the global academic evaluation of productivity and impact, Terje Tüür-Fröhlich conducted case studies on the data quality of Social Sciences Citation Index (SSCI) records, investigating articles from the social sciences and law. The main findings: SSCI records contain tremendous amounts of "trivial errors", not only the misspellings and typos previously mentioned in the bibliometrics and scientometrics literature, but also fatal errors that have not been described in that literature at all. Tüür-Fröhlich found more than 80 fatal mutations and mutilations of Pierre Bourdieu (e.g. "Atkinson", "Pierre, B." and "Pierri, B."). SSCI even generated zombie references (phantom authors and works) through the confusion of data fields - a deadly sin for a database producer - as fragments of patent laws were indexed as fictional author surnames/initials. Additionally, horrific OCR errors (e.g. "nuxure" instead of "Nature" as journal title) were identified. Tüür-Fröhlich's extensive quantitative case study of an article in the Harvard Law Review produced a devastating finding: only 1% of all correct references from the original article were indexed by SSCI without any mistake or error. Many scientific communication experts and database providers believe that errors in databases are of little importance: there are many errors, yes, but they would counterbalance each other, would not result in citation losses, and would not affect retrieval and evaluation outcomes. Terje Tüür-Fröhlich claims the contrary: errors and inconsistencies are not evenly distributed but linked to language biases and publication cultures."
  6. Gingras, Y.: Bibliometrics and research evaluation : uses and abuses (2016) 0.01
    0.008693925 = product of:
      0.043469626 = sum of:
        0.043469626 = weight(_text_:index in 3805) [ClassicSimilarity], result of:
          0.043469626 = score(doc=3805,freq=2.0), product of:
            0.2250935 = queryWeight, product of:
              4.369764 = idf(docFreq=1520, maxDocs=44218)
              0.051511593 = queryNorm
            0.1931181 = fieldWeight in 3805, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.369764 = idf(docFreq=1520, maxDocs=44218)
              0.03125 = fieldNorm(doc=3805)
      0.2 = coord(1/5)
    
    Abstract
    The research evaluation market is booming. "Ranking," "metrics," "h-index," and "impact factors" are reigning buzzwords. Government and research administrators want to evaluate everything -- teachers, professors, training programs, universities -- using quantitative indicators. Among the tools used to measure "research excellence," bibliometrics -- aggregate data on publications and citations -- has become dominant. Bibliometrics is hailed as an "objective" measure of research quality, a quantitative measure more useful than "subjective" and intuitive evaluation methods such as peer review that have been used since scientific papers were first published in the seventeenth century. In this book, Yves Gingras offers a spirited argument against an unquestioning reliance on bibliometrics as an indicator of research quality. Gingras shows that bibliometric rankings have no real scientific validity, rarely measuring what they pretend to. Although the study of publication and citation patterns, at the proper scales, can yield insights on the global dynamics of science over time, ill-defined quantitative indicators often generate perverse and unintended effects on the direction of research. Moreover, abuse of bibliometrics occurs when data is manipulated to boost rankings. Gingras looks at the politics of evaluation and argues that using numbers can be a way to control scientists and diminish their autonomy in the evaluation process. Proposing precise criteria for establishing the validity of indicators at a given scale of analysis, Gingras questions why universities are so eager to let invalid indicators influence their research strategy.