Search (9 results, page 1 of 1)

  • author_ss:"Wouters, P."
  • theme_ss:"Informetrie"
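
The two active facet filters above correspond to repeated `fq` clauses in the underlying Solr request. A minimal sketch of how such a filtered query URL could be assembled (the endpoint, core name, and field names other than the two shown are hypothetical assumptions):

```python
from urllib.parse import urlencode

# Hypothetical Solr endpoint and core name; the two facets above
# become repeated fq (filter query) parameters.
params = urlencode([
    ("q", "*:*"),
    ("fq", 'author_ss:"Wouters, P."'),
    ("fq", 'theme_ss:"Informetrie"'),
    ("rows", "10"),
    ("debugQuery", "true"),  # requests per-document score explanations
])
query_url = "http://localhost:8983/solr/literature/select?" + params
```

With `debugQuery=true`, Solr returns the kind of per-document scoring breakdown shown in the results below.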
  1. Costas, R.; Zahedi, Z.; Wouters, P.: The thematic orientation of publications mentioned on social media : large-scale disciplinary comparison of social media metrics with citations (2015) 0.02
    0.01938208 = product of:
      0.03876416 = sum of:
        0.03876416 = sum of:
          0.0075639198 = weight(_text_:a in 2598) [ClassicSimilarity], result of:
            0.0075639198 = score(doc=2598,freq=10.0), product of:
              0.053105544 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.046056706 = queryNorm
              0.14243183 = fieldWeight in 2598, product of:
                3.1622777 = tf(freq=10.0), with freq of:
                  10.0 = termFreq=10.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.0390625 = fieldNorm(doc=2598)
          0.03120024 = weight(_text_:22 in 2598) [ClassicSimilarity], result of:
            0.03120024 = score(doc=2598,freq=2.0), product of:
              0.16128273 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046056706 = queryNorm
              0.19345059 = fieldWeight in 2598, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0390625 = fieldNorm(doc=2598)
      0.5 = coord(1/2)
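
The explain tree above follows Lucene's ClassicSimilarity: each term leaf is queryWeight × fieldWeight, where queryWeight = idf × queryNorm and fieldWeight = sqrt(termFreq) × idf × fieldNorm, and the document score is the sum of the leaves times the coord factor. A minimal sketch that re-derives this document's score from the constants printed above:

```python
import math

def term_score(freq, idf, query_norm, field_norm):
    # ClassicSimilarity leaf: queryWeight * fieldWeight
    query_weight = idf * query_norm
    field_weight = math.sqrt(freq) * idf * field_norm  # tf = sqrt(termFreq)
    return query_weight * field_weight

# Constants taken directly from the explain output for doc 2598
leaf_a = term_score(10.0, 1.153047, 0.046056706, 0.0390625)   # _text_:a
leaf_22 = term_score(2.0, 3.5018296, 0.046056706, 0.0390625)  # _text_:22
doc_score = (leaf_a + leaf_22) * 0.5  # coord(1/2): 1 of 2 clauses matched
```

Run against the printed constants, the three intermediate values reproduce 0.0075639198, 0.03120024, and the final 0.01938208 shown for this record.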
    
    Abstract
    Purpose - The purpose of this paper is to analyze the disciplinary orientation of scientific publications mentioned on different social media platforms, focusing on their differences and similarities with citation counts. Design/methodology/approach - Social media metrics and readership counts, associated with 500,216 publications and their citation data from the Web of Science database, were collected from Altmetric.com and Mendeley. Results are presented through descriptive statistical analyses together with science maps generated with VOSviewer. Findings - The results confirm Mendeley as the most prevalent social media source, with characteristics similar to citations in its distribution across fields and its density in average values per publication. The humanities, natural sciences, and engineering disciplines have a much lower presence of social media metrics. Twitter has a stronger focus on general medicine and the social sciences. Other sources (blog, Facebook, Google+, and news media mentions) are more prominent for multidisciplinary journals. Originality/value - This paper reinforces the relevance of Mendeley as a social media source for analytical purposes from a disciplinary perspective, being particularly relevant for the social sciences (together with Twitter). Key implications for the use of social media metrics in the evaluation of research performance (e.g. the concentration of some social media metrics, such as blogs and news items, around multidisciplinary journals) are identified.
    Date
    20. 1.2015 18:30:22
    Type
    a
  2. Zahedi, Z.; Costas, R.; Wouters, P.: Mendeley readership as a filtering tool to identify highly cited publications (2017) 0.00
    0.0020714647 = product of:
      0.0041429293 = sum of:
        0.0041429293 = product of:
          0.008285859 = sum of:
            0.008285859 = weight(_text_:a in 3837) [ClassicSimilarity], result of:
              0.008285859 = score(doc=3837,freq=12.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.15602624 = fieldWeight in 3837, product of:
                  3.4641016 = tf(freq=12.0), with freq of:
                    12.0 = termFreq=12.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=3837)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    This study presents a large-scale analysis of the distribution and presence of Mendeley readership scores over time and across disciplines. We study whether Mendeley readership scores (RS) can identify highly cited publications more effectively than journal citation scores (JCS). Web of Science (WoS) publications with digital object identifiers (DOIs), published during the period 2004-2013 and across five major scientific fields, were analyzed. The main result of this study shows that RS are more effective than JCS (in terms of precision/recall values) at identifying highly cited publications across all fields of science and publication years. The findings also show that 86.5% of all publications are covered by Mendeley and have at least one reader. The share of publications with Mendeley RS increased from 84% in 2004 to 89% in 2009, then decreased from 88% in 2010 to 82% in 2013. However, publications from 2010 onwards exhibit on average a higher density of readership than of citation scores. This indicates that, compared to citation scores, RS are more prevalent for recent publications and hence could work as an early indicator of research impact. These findings highlight the potential and value of Mendeley as a tool for scientometric purposes, and particularly as a relevant tool to identify highly cited publications.
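
The precision/recall comparison described above can be made concrete with a small sketch; the publication sets here are hypothetical stand-ins for a readership-based filter and the actual set of highly cited publications:

```python
def precision_recall(flagged, relevant):
    # flagged: publications a filter (e.g. top Mendeley readership) selects
    # relevant: publications that are actually highly cited
    true_positives = len(flagged & relevant)
    precision = true_positives / len(flagged) if flagged else 0.0
    recall = true_positives / len(relevant) if relevant else 0.0
    return precision, recall

# Hypothetical publication identifiers
rs_filter = {"p1", "p2", "p3"}      # flagged by readership scores
highly_cited = {"p2", "p3", "p4"}   # actually highly cited
p, r = precision_recall(rs_filter, highly_cited)
```

A filter is "more effective" in the study's sense when it achieves higher precision and/or recall against the same highly cited set than the journal-based filter does.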
    Type
    a
  3. Hicks, D.; Wouters, P.; Waltman, L.; Rijcke, S. de; Rafols, I.: The Leiden Manifesto for research metrics : 10 principles to guide research evaluation (2015) 0.00
    0.0020506454 = product of:
      0.004101291 = sum of:
        0.004101291 = product of:
          0.008202582 = sum of:
            0.008202582 = weight(_text_:a in 1994) [ClassicSimilarity], result of:
              0.008202582 = score(doc=1994,freq=6.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.1544581 = fieldWeight in 1994, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1994)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Research evaluation has become routine and often relies on metrics, but it is increasingly driven by data rather than by expert judgement. As a result, the procedures that were designed to increase the quality of research are now threatening to damage the scientific system. To support researchers and managers, five experts led by Diana Hicks, professor in the School of Public Policy at Georgia Institute of Technology, and Paul Wouters, director of CWTS at Leiden University, have proposed ten principles for the measurement of research performance: the Leiden Manifesto for Research Metrics, published as a comment in Nature.
    Type
    a
  4. Wouters, P.: The signs of science (1998) 0.00
    0.001913537 = product of:
      0.003827074 = sum of:
        0.003827074 = product of:
          0.007654148 = sum of:
            0.007654148 = weight(_text_:a in 1023) [ClassicSimilarity], result of:
              0.007654148 = score(doc=1023,freq=4.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.14413087 = fieldWeight in 1023, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0625 = fieldNorm(doc=1023)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Since the 'Science Citation Index' emerged within the system of scientific communication in 1964, an intense controversy about its character has been raging: in what sense can citation analysis be trusted? This debate can be characterized as the confrontation of different perspectives on science. Discusses the citation representation of science: the way the citation creates a new reality of, as well as in, the world of science; the main features of this reality; and some implications for science and science policy.
    Type
    a
  5. Waltman, L.; Calero-Medina, C.; Kosten, J.; Noyons, E.C.M.; Tijssen, R.J.W.; Eck, N.J. van; Leeuwen, T.N. van; Raan, A.F.J. van; Visser, M.S.; Wouters, P.: The Leiden ranking 2011/2012 : data collection, indicators, and interpretation (2012) 0.00
    0.0018909799 = product of:
      0.0037819599 = sum of:
        0.0037819599 = product of:
          0.0075639198 = sum of:
            0.0075639198 = weight(_text_:a in 514) [ClassicSimilarity], result of:
              0.0075639198 = score(doc=514,freq=10.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.14243183 = fieldWeight in 514, product of:
                  3.1622777 = tf(freq=10.0), with freq of:
                    10.0 = termFreq=10.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=514)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    The Leiden Ranking 2011/2012 is a ranking of universities based on bibliometric indicators of publication output, citation impact, and scientific collaboration. The ranking includes 500 major universities from 41 different countries. This paper provides an extensive discussion of the Leiden Ranking 2011/2012. The ranking is compared with other global university rankings, in particular the Academic Ranking of World Universities (commonly known as the Shanghai Ranking) and the Times Higher Education World University Rankings. The comparison focuses on the methodological choices underlying the different rankings. Also, a detailed description is offered of the data collection methodology of the Leiden Ranking 2011/2012 and of the indicators used in the ranking. Various innovations in the Leiden Ranking 2011/2012 are presented. These innovations include (1) an indicator based on counting a university's highly cited publications, (2) indicators based on fractional rather than full counting of collaborative publications, (3) the possibility of excluding non-English language publications, and (4) the use of stability intervals. Finally, some comments are made on the interpretation of the ranking and a number of limitations of the ranking are pointed out.
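
Fractional counting, innovation (2) above, divides each collaborative publication's credit equally among the contributing universities instead of giving each a full count. A minimal sketch with hypothetical sample data:

```python
from collections import Counter

# Publications listed with the universities contributing to each
# (hypothetical sample data).
publications = [
    ["Leiden", "Delft"],             # two collaborators, 1/2 credit each
    ["Leiden"],                      # single-university publication
    ["Leiden", "Delft", "Utrecht"],  # three collaborators, 1/3 each
]

full = Counter()        # full counting: every collaborator gets 1
fractional = Counter()  # fractional counting: credit 1/n per collaborator

for universities in publications:
    for u in universities:
        full[u] += 1
        fractional[u] += 1 / len(universities)
```

Under full counting a heavily collaborating university is credited once per paper; under fractional counting its credit shrinks with the number of partners, which is why the two schemes can rank universities differently.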
    Type
    a
  6. Costas, R.; Zahedi, Z.; Wouters, P.: Do "altmetrics" correlate with citations? : extensive comparison of altmetric indicators with citations from a multidisciplinary perspective (2015) 0.00
    0.0018909799 = product of:
      0.0037819599 = sum of:
        0.0037819599 = product of:
          0.0075639198 = sum of:
            0.0075639198 = weight(_text_:a in 2214) [ClassicSimilarity], result of:
              0.0075639198 = score(doc=2214,freq=10.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.14243183 = fieldWeight in 2214, product of:
                  3.1622777 = tf(freq=10.0), with freq of:
                    10.0 = termFreq=10.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2214)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    An extensive analysis of the presence of different altmetric indicators provided by Altmetric.com across scientific fields is presented, focusing particularly on their relationship with citations. Our results confirm that the presence and density of social media altmetric counts are still very low and infrequent among scientific publications, with 15%-24% of publications showing some altmetric activity, concentrated in the most recent publications, although their presence is increasing over time. Publications from the social sciences, humanities, and the medical and life sciences show the highest presence of altmetrics, indicating their potential value and interest for these fields. The analysis of the relationships between altmetrics and citations confirms previous claims of positive but relatively weak correlations, thus supporting the idea that altmetrics do not reflect the same kind of impact as citations. Also, altmetric counts do not always provide better filtering of highly cited publications than journal citation scores. Altmetric scores (particularly mentions in blogs) are able to identify highly cited publications with higher levels of precision than journal citation scores (JCS), but with a lower level of recall. The value of altmetrics as a complementary tool to citation analysis is highlighted, although more research is suggested to disentangle the potential meaning and value of altmetric indicators for research evaluation.
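
Correlations between altmetric counts and citations in this literature are typically rank-based; a minimal Spearman sketch, assuming no tied values (real data would need average ranks for ties, and the sample counts here are hypothetical):

```python
def ranks(values):
    # 1-based ranks; assumes no ties (hypothetical simplification)
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for position, i in enumerate(order):
        r[i] = position + 1
    return r

def spearman(x, y):
    # Spearman rho via the squared rank-difference formula (no ties)
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d_squared = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d_squared / (n * (n * n - 1))

tweets = [1, 5, 2, 8]      # hypothetical altmetric counts
citations = [2, 9, 3, 20]  # hypothetical citation counts
rho = spearman(tweets, citations)
```

A rank correlation deliberately ignores the heavy skew of both count distributions, which is why it is preferred over Pearson correlation in these comparisons.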
    Type
    a
  7. Franssen, T.; Wouters, P.: Science and its significant other : representing the humanities in bibliometric scholarship (2019) 0.00
    0.0016913437 = product of:
      0.0033826875 = sum of:
        0.0033826875 = product of:
          0.006765375 = sum of:
            0.006765375 = weight(_text_:a in 5409) [ClassicSimilarity], result of:
              0.006765375 = score(doc=5409,freq=8.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.12739488 = fieldWeight in 5409, product of:
                  2.828427 = tf(freq=8.0), with freq of:
                    8.0 = termFreq=8.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5409)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    The cognitive and social structures, and publication practices, of the humanities have been studied bibliometrically for the past 50 years. This article explores the conceptual frameworks, methods, and data sources used in bibliometrics to study the nature of the humanities, and its differences and similarities in comparison with other scientific domains. We give a historical overview of bibliometric scholarship between 1965 and 2018 that studies the humanities empirically and distinguishes between two periods in which the configuration of the bibliometric system differs remarkably. The first period, 1965 to the 1980s, is characterized by bibliometric methods embedded in a sociological theoretical framework, the development and use of the Price Index, and small samples of journal publications from which references are used as data sources. The second period, the 1980s to the present day, is characterized by a new intellectual hinterland-that of science policy and research evaluation-in which bibliometric methods become embedded. Here metadata of publications becomes the primary data source with which publication profiles of humanistic scholarly communities are analyzed. We unpack the differences between these two periods and critically discuss the analytical avenues that different approaches offer.
    Type
    a
  8. Wouters, P.; Vries, R. de: Formally citing the Web (2004) 0.00
    0.001353075 = product of:
      0.00270615 = sum of:
        0.00270615 = product of:
          0.0054123 = sum of:
            0.0054123 = weight(_text_:a in 3093) [ClassicSimilarity], result of:
              0.0054123 = score(doc=3093,freq=8.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.10191591 = fieldWeight in 3093, product of:
                  2.828427 = tf(freq=8.0), with freq of:
                    8.0 = termFreq=8.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.03125 = fieldNorm(doc=3093)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    How do authors refer to Web-based information sources in their formal scientific publications? It is not yet well known how scientists and scholars actually include new types of information sources, available through the new media, in their published work. This article reports on a comparative study of the lists of references in 38 scientific journals in five different scientific and social scientific fields. The fields are sociology, library and information science, biochemistry and biotechnology, neuroscience, and the mathematics of computing. As is well known, references, citations, and hyperlinks play different roles in academic publishing and communication. Our study focuses on hyperlinks as attributes of references in formal scholarly publications. The study developed and applied a method to analyze the differential roles of publishing media in the analysis of scientific and scholarly literature references. The existing secondary databases that include reference and citation data (the Web of Science) cannot be used for this type of research. By automatically processing and analyzing the full text of scientific and scholarly articles, we were able to extract the references, and the hyperlinks contained in these references, in relation to other features of the scientific and scholarly literature. Our findings show that hyperlinking references are indeed, as expected, abundantly present in the formal literature. They also tend to cite more recent literature than the average reference. The large majority of the references are to Web instances of traditional scientific journals. Other types of Web-based information sources are less well represented in the lists of references, except in the case of pure e-journals. We conclude that this can be explained by taking the role of the publisher into account. Indeed, it seems that the shift from print-based to electronic publishing has created new roles for the publisher. By shaping the way scientific references hyperlink to other information sources, the publisher may have a large impact on the availability of scientific and scholarly information.
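
The extraction step described above (pulling references, and the hyperlinks they contain, out of full text) can be sketched minimally; the reference strings and the URL pattern here are hypothetical simplifications of what full-text processing would actually face:

```python
import re

# Hypothetical reference strings; the study processed full article text
references = [
    "Smith, J. (2002). Web science today. http://example.org/websci",
    "Doe, A. (2001). Print only. Journal of Things, 5(2), 1-10.",
    "Lee, K. (2003). E-journal piece. https://ejournal.example/42",
]

URL_PATTERN = re.compile(r"https?://\S+")

# Keep only references that carry a hyperlink
hyperlinked = [ref for ref in references if URL_PATTERN.search(ref)]
share = len(hyperlinked) / len(references)
```

Counting hyperlinked references against the total, per journal and per field, yields exactly the kind of share the study compares across disciplines.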
    Type
    a
  9. Fang, Z.; Costas, R.; Tian, W.; Wang, X.; Wouters, P.: How is science clicked on Twitter? : click metrics for Bitly short links to scientific publications (2021) 0.00
    0.0011959607 = product of:
      0.0023919214 = sum of:
        0.0023919214 = product of:
          0.0047838427 = sum of:
            0.0047838427 = weight(_text_:a in 265) [ClassicSimilarity], result of:
              0.0047838427 = score(doc=265,freq=4.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.090081796 = fieldWeight in 265, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=265)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    To provide some context for the potential engagement behavior of Twitter users around science, this article investigates how Bitly short links to scientific publications embedded in scholarly Twitter mentions are clicked on Twitter. Based on the click metrics of over 1.1 million Bitly short links referring to Web of Science (WoS) publications, our results show that around 49.5% of them were not clicked by Twitter users. For those Bitly short links with clicks from Twitter, the majority of their Twitter clicks accumulated within a short period of time after they were first tweeted. Bitly short links to publications in the field of Social Sciences and Humanities tend to attract more clicks from Twitter than those in other subject fields. This article also assesses the extent to which Twitter clicks are correlated with other impact indicators. Twitter clicks are weakly correlated with scholarly impact indicators (WoS citations and Mendeley readers), but moderately correlated with other Twitter engagement indicators (total retweets and total likes). In light of these results, we highlight the importance of paying more attention to the click metrics of URLs in scholarly Twitter mentions, to improve our understanding of the more effective dissemination and reception of science information on Twitter.
    Type
    a