Search (8 results, page 1 of 1)

  • type_ss:"el"
  • theme_ss:"Informetrie"
  1. Van der Veer Martens, B.: Do citation systems represent theories of truth? (2001) 0.02
    0.019610625 = product of:
      0.03922125 = sum of:
        0.03922125 = product of:
          0.0784425 = sum of:
            0.0784425 = weight(_text_:22 in 3925) [ClassicSimilarity], result of:
              0.0784425 = score(doc=3925,freq=4.0), product of:
                0.14336278 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04093939 = queryNorm
                0.54716086 = fieldWeight in 3925, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.078125 = fieldNorm(doc=3925)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    22. 7.2006 15:22:28
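    The score breakdown shown above for result 1 is Lucene's ClassicSimilarity (tf-idf) explanation. As a sanity check, the arithmetic can be reproduced in a few lines of Python; the constants are copied from the explain output, and the idf formula is Lucene's `1 + ln(maxDocs / (docFreq + 1))`:

    ```python
    import math

    # Constants copied from the explain output for result 1 (term "22", doc 3925)
    max_docs, doc_freq = 44218, 3622
    freq = 4.0
    field_norm = 0.078125
    query_norm = 0.04093939

    idf = 1 + math.log(max_docs / (doc_freq + 1))   # ~= 3.5018296
    tf = math.sqrt(freq)                            # ClassicSimilarity: tf = sqrt(freq) = 2.0

    query_weight = idf * query_norm                 # ~= 0.14336278
    field_weight = tf * idf * field_norm            # ~= 0.54716086
    raw_score = query_weight * field_weight         # ~= 0.0784425

    # coord(1/2) applied twice: one of two query clauses matched at each level
    final_score = raw_score * 0.5 * 0.5             # ~= 0.019610625
    print(final_score)
    ```

    The two `coord(1/2)` factors halve the score twice because only one of two query clauses matched at each level of the Boolean query.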
  2. Braun, S.: Manifold: a custom analytics platform to visualize research impact (2015) 0.01
    
    Abstract
    The use of research impact metrics and analytics has become an integral component of many aspects of institutional assessment. Many platforms, both proprietary and open source, currently exist to provide such analytics; however, their functionality does not always cover uniquely specific needs. In this paper, I describe a novel web-based platform, named Manifold, that I built to serve custom research impact assessment needs in the University of Minnesota Medical School. Built on a standard LAMP architecture, Manifold automatically pulls publication data for faculty from Scopus through APIs, calculates impact metrics through automated analytics, and dynamically generates report-like profiles that visualize those metrics. Work on this project has yielded many lessons about the challenges of sustainability and scalability in developing a system of this magnitude.
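    The abstract describes Manifold's pipeline only at a high level; no code accompanies it. As an illustration of the first step (pulling one faculty member's publications), here is a minimal sketch. The endpoint is Elsevier's public Scopus Search API, but the `AU-ID` query shape and the helper below are illustrative assumptions, not Manifold's actual implementation:

    ```python
    from urllib.parse import urlencode

    # Illustrative sketch only: Manifold's real code is not shown in the abstract.
    # The endpoint is Elsevier's public Scopus Search API; querying by author ID
    # (AU-ID) is an assumption about how one author's publications are fetched.
    SCOPUS_SEARCH = "https://api.elsevier.com/content/search/scopus"

    def scopus_search_url(author_id: str, count: int = 25) -> str:
        """Build a Scopus Search API URL for one author's publications."""
        params = {"query": f"AU-ID({author_id})", "count": count}
        return f"{SCOPUS_SEARCH}?{urlencode(params)}"

    # A real request would also send the institution's API key header, e.g.
    #   requests.get(url, headers={"X-ELS-APIKey": api_key})
    print(scopus_search_url("7004212771"))
    ```

    The author ID shown is a placeholder; in practice the institution's key and each faculty member's Scopus author ID would come from configuration.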
  3. Schreiber, M.: Restricting the h-index to a citation time window : a case study of a timed Hirsch index (2014) 0.01
    
    Abstract
    The h-index has been shown to increase in many cases mostly because of citations to rather old publications. This inertia can be circumvented by restricting the evaluation to a citation time window. Here I report the results of an empirical study analyzing the evolution of the timed h-index defined in this way as a function of the length of the citation time window.
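    The timed h-index described in the abstract is simply the ordinary h-index computed after discarding citations that fall outside the window. A minimal sketch, where the data layout (a list of citation years per paper) is an assumption for illustration:

    ```python
    def h_index(citation_counts):
        # h = the largest h such that at least h papers have >= h citations
        counts = sorted(citation_counts, reverse=True)
        h = 0
        for rank, c in enumerate(counts, start=1):
            if c >= rank:
                h = rank
            else:
                break
        return h

    def timed_h_index(citation_years_per_paper, window_start, window_end):
        # Keep only citations whose year falls inside the time window,
        # then compute the ordinary h-index on the reduced counts.
        windowed = [sum(window_start <= y <= window_end for y in years)
                    for years in citation_years_per_paper]
        return h_index(windowed)
    ```

    For example, a paper cited in 2010, 2011, and 2012 contributes only two citations to a 2011-2012 window.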
  4. Schmitz, J.; Arning, U.; Peters, I.: handbuch.io : Handbuch CoScience / Messung von wissenschaftlichem Impact (2015) 0.01
    
  5. Abdelkareem, M.A.A.: In terms of publication index, what indicator is the best for researchers indexing, Google Scholar, Scopus, Clarivate or others? (2018) 0.01
    
    Abstract
    I believe that Google Scholar is the most popular academic indexing service for researchers and citations. However, some other indexing services may be more rigorous than Google Scholar, though not as popular. Other indexing websites, such as Scopus and Clarivate, provide more statistical figures for scholars, institutions, and even journals. As for publication citations, Google Scholar consistently shows higher citation counts for a paper than other indexing websites do, since it covers most publication platforms and can therefore count citations easily, while other databases only count citations coming from journals that are already indexed in their own database.
  6. Lamb, I.; Larson, C.: Shining a light on scientific data : building a data catalog to foster data sharing and reuse (2016) 0.01
    
  7. Metrics in research : for better or worse? (2016) 0.01
    
    Content
    Contents:
    • Metrics in Research - For better or worse? / Jozica Dolenc, Philippe Hünenberger, Oliver Renn
    • A brief visual history of research metrics / Oliver Renn, Jozica Dolenc, Joachim Schnabl
    • Bibliometry: The wizard of O's / Philippe Hünenberger
    • The grip of bibliometrics - A student perspective / Matthias Tinzl
    • Honesty and transparency to taxpayers is the long-term fundament for stable university funding / Wendelin J. Stark
    • Beyond metrics: Managing the performance of your work / Charlie Rapple
    • Scientific profiling instead of bibliometrics: Key performance indicators of the future / Rafael Ball
    • More knowledge, less numbers / Carl Philipp Rosenau
    • Do we really need BIBLIO-metrics to evaluate individual researchers? / Rüdiger Mutz
    • Using research metrics responsibly and effectively as a researcher / Peter I. Darroch, Lisa H. Colledge
    • Metrics in research: More (valuable) questions than answers / Urs Hugentobler
    • Publication of research results: Use and abuse / Wilfred F. van Gunsteren
    • Wanted: Transparent algorithms, interpretation skills, common sense / Eva E. Wille
    • Impact factors, the h-index, and citation hype - Metrics in research from the point of view of a journal editor / Renato Zenobi
    • Rashomon or metrics in a publisher's world / Gabriella Karger
    • The impact factor and I: A love-hate relationship / Jean-Christophe Leroux
    • Personal experiences bringing altmetrics to the academic market / Ben McLeish
    • Fatally attracted by numbers? / Oliver Renn
    • On computable numbers / Gerd Folkers, Laura Folkers
    • ScienceMatters - Single observation science publishing and linking observations to create an internet of science / Lawrence Rajendran
  8. Scientometrics pioneer Eugene Garfield dies : Eugene Garfield, founder of the Institute for Scientific Information and The Scientist, has passed away at age 91 (2017) 0.01
    
    Content
    Cf. also Open Password, no. 167 of 01.03.2017: "Eugene Garfield, founder and pioneer of citation indexing and citation analysis, without whom information science would look different today, has died at the age of 91. He is survived by his wife, three sons, a daughter, a stepdaughter, two granddaughters, and two great-grandchildren. Garfield earned his first degree, a bachelor's in chemistry, at Columbia University in New York City in 1949. In 1954 he added a degree in library science, and in 1961 he received his doctorate in structural linguistics. By his own account, he was neither particularly good nor particularly happy as a chemistry student. His moment of revelation came at a meeting of the American Chemical Society, when he discovered that searching for literature might be a way to make a living: "So I went to the Chairman of the meeting and said: 'How do you get a job in this racket?'" From 1955 Garfield initially worked as a consultant for pharmaceutical companies, where he specialized in scientific information by working through the contents of the relevant journals. In 1955 he proposed his groundbreaking idea in Science: to systematically record the citations of scientific publications and to make the connections between citations visible. In 1960 Garfield founded the Institute for Scientific Information, whose CEO he remained until 1992. In 1964 he launched the Science Citation Index. Further indexes followed, including the Social Sciences Citation Index (from 1973), the Arts and Humanities Citation Index (from 1978), and the Journal Citation Reports. These indexes were combined in the Web of Science and made electronically accessible as a database. This made it possible for researchers to find the literature relevant to them "at their fingertips" and to orient themselves within it.
    Beyond that, the rankings derived from Garfield's metrics made it possible to measure the relative scientific importance of scholarly contributions, authors, institutions, regions, and countries.