Search (10 results, page 1 of 1)

  • author_ss:"Wouters, P."
  1. Frandsen, T.F.; Wouters, P.: Turning working papers into journal articles : an exercise in microbibliometrics (2009) 0.03
    0.034861263 = product of:
      0.052291892 = sum of:
        0.032007344 = weight(_text_:on in 2757) [ClassicSimilarity], result of:
          0.032007344 = score(doc=2757,freq=8.0), product of:
            0.109763056 = queryWeight, product of:
              2.199415 = idf(docFreq=13325, maxDocs=44218)
              0.04990557 = queryNorm
            0.29160398 = fieldWeight in 2757, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              2.199415 = idf(docFreq=13325, maxDocs=44218)
              0.046875 = fieldNorm(doc=2757)
        0.020284547 = product of:
          0.040569093 = sum of:
            0.040569093 = weight(_text_:22 in 2757) [ClassicSimilarity], result of:
              0.040569093 = score(doc=2757,freq=2.0), product of:
                0.1747608 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04990557 = queryNorm
                0.23214069 = fieldWeight in 2757, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2757)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
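    The breakdown above is standard Lucene/Solr "explain" output for ClassicSimilarity (TF-IDF) scoring. A minimal Python sketch, assuming the usual ClassicSimilarity conventions visible in the tree (tf = sqrt(freq), queryWeight = idf * queryNorm, fieldWeight = tf * idf * fieldNorm, with coord factors for partially matched clauses), reproduces the 0.0349 score of this first result from the numbers shown:

      import math

      def clause_weight(freq, idf, query_norm, field_norm):
          # One weight(_text_:term) clause: queryWeight * fieldWeight
          query_weight = idf * query_norm                      # 2.199415 * 0.04990557 ~ 0.10976
          field_weight = math.sqrt(freq) * idf * field_norm    # tf(freq) = sqrt(freq)
          return query_weight * field_weight

      query_norm = 0.04990557
      w_on = clause_weight(8.0, 2.199415, query_norm, 0.046875)         # ~0.032007 ("on" clause)
      w_22 = clause_weight(2.0, 3.5018296, query_norm, 0.046875) * 0.5  # ~0.020285 ("22" clause, coord(1/2))
      score = (w_on + w_22) * (2.0 / 3.0)                               # coord(2/3)
      print(score)                                                      # ~0.034861, as shown above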
    
    Abstract
    This article focuses on the process of scientific and scholarly communication. Data on open access publications on the Internet not only provides a supplement to the traditional citation indexes but also enables analysis of the microprocesses and daily practices that constitute scientific communication. The article examines a stage in the life cycle of scientific and scholarly information that precedes the publication of formal research articles in the scientific and scholarly literature. Binomial logistic regression models are used to analyse the precise mechanisms at work in the transformation of a working paper (WP) into a journal article (JA) in the field of economics. The study unveils a fine-grained process of adapting WPs to their new context as JAs by deleting and adding literature references, a process perhaps best captured by the term 'sculpting'.
    Date
    22. 3.2009 18:59:25
  2. Costas, R.; Zahedi, Z.; Wouters, P.: The thematic orientation of publications mentioned on social media : large-scale disciplinary comparison of social media metrics with citations (2015) 0.03
    0.031149916 = product of:
      0.046724875 = sum of:
        0.029821085 = weight(_text_:on in 2598) [ClassicSimilarity], result of:
          0.029821085 = score(doc=2598,freq=10.0), product of:
            0.109763056 = queryWeight, product of:
              2.199415 = idf(docFreq=13325, maxDocs=44218)
              0.04990557 = queryNorm
            0.271686 = fieldWeight in 2598, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              2.199415 = idf(docFreq=13325, maxDocs=44218)
              0.0390625 = fieldNorm(doc=2598)
        0.01690379 = product of:
          0.03380758 = sum of:
            0.03380758 = weight(_text_:22 in 2598) [ClassicSimilarity], result of:
              0.03380758 = score(doc=2598,freq=2.0), product of:
                0.1747608 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04990557 = queryNorm
                0.19345059 = fieldWeight in 2598, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2598)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Abstract
    Purpose - The purpose of this paper is to analyze the disciplinary orientation of scientific publications that were mentioned on different social media platforms, focusing on their differences and similarities with citation counts. Design/methodology/approach - Social media metrics and readership counts, associated with 500,216 publications and their citation data from the Web of Science database, were collected from Altmetric.com and Mendeley. Results are presented through descriptive statistical analyses together with science maps generated with VOSviewer. Findings - The results confirm Mendeley as the most prevalent social media source, with characteristics similar to those of citations in its distribution across fields and its density in average values per publication. The humanities, natural sciences, and engineering disciplines have a much lower presence of social media metrics. Twitter has a stronger focus on general medicine and the social sciences. Other sources (blog, Facebook, Google+, and news media mentions) are more prominent for multidisciplinary journals. Originality/value - This paper reinforces the relevance of Mendeley as a social media source for analytical purposes from a disciplinary perspective, being particularly relevant for the social sciences (together with Twitter). Key implications for the use of social media metrics in the evaluation of research performance (e.g. the concentration of some social media metrics, such as blogs, news items, etc., around multidisciplinary journals) are identified.
    Date
    20. 1.2015 18:30:22
  3. Waltman, L.; Calero-Medina, C.; Kosten, J.; Noyons, E.C.M.; Tijssen, R.J.W.; Eck, N.J. van; Leeuwen, T.N. van; Raan, A.F.J. van; Visser, M.S.; Wouters, P.: The Leiden ranking 2011/2012 : data collection, indicators, and interpretation (2012) 0.01
    0.009940362 = product of:
      0.029821085 = sum of:
        0.029821085 = weight(_text_:on in 514) [ClassicSimilarity], result of:
          0.029821085 = score(doc=514,freq=10.0), product of:
            0.109763056 = queryWeight, product of:
              2.199415 = idf(docFreq=13325, maxDocs=44218)
              0.04990557 = queryNorm
            0.271686 = fieldWeight in 514, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              2.199415 = idf(docFreq=13325, maxDocs=44218)
              0.0390625 = fieldNorm(doc=514)
      0.33333334 = coord(1/3)
    
    Abstract
    The Leiden Ranking 2011/2012 is a ranking of universities based on bibliometric indicators of publication output, citation impact, and scientific collaboration. The ranking includes 500 major universities from 41 different countries. This paper provides an extensive discussion of the Leiden Ranking 2011/2012. The ranking is compared with other global university rankings, in particular the Academic Ranking of World Universities (commonly known as the Shanghai Ranking) and the Times Higher Education World University Rankings. The comparison focuses on the methodological choices underlying the different rankings. Also, a detailed description is offered of the data collection methodology of the Leiden Ranking 2011/2012 and of the indicators used in the ranking. Various innovations in the Leiden Ranking 2011/2012 are presented. These innovations include (1) an indicator based on counting a university's highly cited publications, (2) indicators based on fractional rather than full counting of collaborative publications, (3) the possibility of excluding non-English language publications, and (4) the use of stability intervals. Finally, some comments are made on the interpretation of the ranking and a number of limitations of the ranking are pointed out.
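    Innovation (2) above, fractional counting, divides each collaborative publication's weight over its contributing universities instead of crediting every university in full. A minimal sketch with hypothetical university names and toy data (not Leiden Ranking data):

      from collections import defaultdict

      # Each toy publication lists the universities affiliated with its authors.
      publications = [
          {"title": "Paper A", "universities": ["Univ X"]},
          {"title": "Paper B", "universities": ["Univ X", "Univ Y"]},
          {"title": "Paper C", "universities": ["Univ X", "Univ Y", "Univ Z"]},
      ]

      full = defaultdict(float)        # full counting: each affiliated university gets 1
      fractional = defaultdict(float)  # fractional counting: 1 is split over the n universities
      for pub in publications:
          n = len(pub["universities"])
          for univ in pub["universities"]:
              full[univ] += 1.0
              fractional[univ] += 1.0 / n

      print(dict(full))        # {'Univ X': 3.0, 'Univ Y': 2.0, 'Univ Z': 1.0}
      print(dict(fractional))  # {'Univ X': 1.83..., 'Univ Y': 0.83..., 'Univ Z': 0.33...}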
  4. Fang, Z.; Costas, R.; Tian, W.; Wang, X.; Wouters, P.: How is science clicked on Twitter? : click metrics for Bitly short links to scientific publications (2021) 0.01
    0.008890929 = product of:
      0.026672786 = sum of:
        0.026672786 = weight(_text_:on in 265) [ClassicSimilarity], result of:
          0.026672786 = score(doc=265,freq=8.0), product of:
            0.109763056 = queryWeight, product of:
              2.199415 = idf(docFreq=13325, maxDocs=44218)
              0.04990557 = queryNorm
            0.24300331 = fieldWeight in 265, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              2.199415 = idf(docFreq=13325, maxDocs=44218)
              0.0390625 = fieldNorm(doc=265)
      0.33333334 = coord(1/3)
    
    Abstract
    To provide some context for the potential engagement behavior of Twitter users around science, this article investigates how Bitly short links to scientific publications embedded in scholarly Twitter mentions are clicked on Twitter. Based on the click metrics of over 1.1 million Bitly short links referring to Web of Science (WoS) publications, our results show that around 49.5% of them were not clicked by Twitter users. For those Bitly short links with clicks from Twitter, the majority of their Twitter clicks accumulated within a short period of time after they were first tweeted. Bitly short links to publications in the field of Social Sciences and Humanities tend to attract more clicks from Twitter than links to publications in other subject fields. This article also assesses the extent to which Twitter clicks are correlated with other impact indicators. Twitter clicks are weakly correlated with scholarly impact indicators (WoS citations and Mendeley readers), but moderately correlated with other Twitter engagement indicators (total retweets and total likes). In light of these results, we highlight the importance of paying more attention to the click metrics of URLs in scholarly Twitter mentions, to improve our understanding of the effective dissemination and reception of science information on Twitter.
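    The correlations reported above compare per-publication counts of different indicators. A minimal sketch of that kind of comparison with invented toy counts (Spearman rank correlation is assumed here; the abstract does not name the exact coefficient):

      from scipy.stats import spearmanr

      clicks    = [0, 3, 12, 1, 0, 45, 2, 7]   # Twitter clicks on Bitly links (toy data)
      retweets  = [1, 4, 20, 0, 1, 60, 3, 9]   # a Twitter engagement indicator (toy data)
      citations = [2, 0, 5, 1, 3, 4, 0, 8]     # a scholarly impact indicator (toy data)

      print(spearmanr(clicks, retweets))   # rank correlation between clicks and retweets
      print(spearmanr(clicks, citations))  # rank correlation between clicks and citations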
  5. Wouters, P.: The signs of science (1998) 0.01
    0.007112743 = product of:
      0.021338228 = sum of:
        0.021338228 = weight(_text_:on in 1023) [ClassicSimilarity], result of:
          0.021338228 = score(doc=1023,freq=2.0), product of:
            0.109763056 = queryWeight, product of:
              2.199415 = idf(docFreq=13325, maxDocs=44218)
              0.04990557 = queryNorm
            0.19440265 = fieldWeight in 1023, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.199415 = idf(docFreq=13325, maxDocs=44218)
              0.0625 = fieldNorm(doc=1023)
      0.33333334 = coord(1/3)
    
    Abstract
    Since the 'Science Citation Index' emerged within the system of scientific communication in 1964, an intense controversy about its character has been raging: in what sense can citation analysis be trusted? This debate can be characterized as the confrontation of different perspectives on science. Discusses the citation representation of science: the way the citation creates a new reality of, as well as in, the world of science; the main features of this reality; and some implications for science and science policy.
  6. Costas, R.; Zahedi, Z.; Wouters, P.: Do "altmetrics" correlate with citations? : extensive comparison of altmetric indicators with citations from a multidisciplinary perspective (2015) 0.01
    0.0062868367 = product of:
      0.01886051 = sum of:
        0.01886051 = weight(_text_:on in 2214) [ClassicSimilarity], result of:
          0.01886051 = score(doc=2214,freq=4.0), product of:
            0.109763056 = queryWeight, product of:
              2.199415 = idf(docFreq=13325, maxDocs=44218)
              0.04990557 = queryNorm
            0.1718293 = fieldWeight in 2214, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.199415 = idf(docFreq=13325, maxDocs=44218)
              0.0390625 = fieldNorm(doc=2214)
      0.33333334 = coord(1/3)
    
    Abstract
    An extensive analysis of the presence of different altmetric indicators provided by Altmetric.com across scientific fields is presented, particularly focusing on their relationship with citations. Our results confirm that the presence and density of social media altmetric counts are still very low and not very frequent among scientific publications: 15%-24% of publications show some altmetric activity, concentrated among the most recent publications, although their presence is increasing over time. Publications from the social sciences, humanities, and the medical and life sciences show the highest presence of altmetrics, indicating their potential value and interest for these fields. The analysis of the relationships between altmetrics and citations confirms previous claims of positive correlations, although these are relatively weak, thus supporting the idea that altmetrics do not reflect the same kind of impact as citations. Also, altmetric counts do not always filter highly-cited publications better than journal citation scores. Altmetric scores (particularly mentions in blogs) are able to identify highly-cited publications with higher levels of precision than journal citation scores (JCS), but they have a lower level of recall. The value of altmetrics as a complementary tool to citation analysis is highlighted, although more research is needed to disentangle the potential meaning and value of altmetric indicators for research evaluation.
  7. Thelwall, M.; Wouters, P.; Fry, J.: Information-centered research for large-scale analyses of new information sources (2008) 0.01
    0.00622365 = product of:
      0.01867095 = sum of:
        0.01867095 = weight(_text_:on in 1969) [ClassicSimilarity], result of:
          0.01867095 = score(doc=1969,freq=2.0), product of:
            0.109763056 = queryWeight, product of:
              2.199415 = idf(docFreq=13325, maxDocs=44218)
              0.04990557 = queryNorm
            0.17010231 = fieldWeight in 1969, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.199415 = idf(docFreq=13325, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1969)
      0.33333334 = coord(1/3)
    
    Abstract
    New mass publishing genres, such as blogs and personal home pages, provide a rich source of social data that has yet to be fully exploited by the social sciences and humanities. Information-centered research (ICR) not only provides a genuinely new and useful information science research model for this type of data, but can also contribute to the emerging e-research infrastructure. Nevertheless, ICR should not be conducted on a purely abstract level, but should relate to potentially relevant problems.
  8. Hicks, D.; Wouters, P.; Waltman, L.; Rijcke, S. de; Rafols, I.: The Leiden Manifesto for research metrics : 10 principles to guide research evaluation (2015) 0.01
    0.00622365 = product of:
      0.01867095 = sum of:
        0.01867095 = weight(_text_:on in 1994) [ClassicSimilarity], result of:
          0.01867095 = score(doc=1994,freq=2.0), product of:
            0.109763056 = queryWeight, product of:
              2.199415 = idf(docFreq=13325, maxDocs=44218)
              0.04990557 = queryNorm
            0.17010231 = fieldWeight in 1994, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.199415 = idf(docFreq=13325, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1994)
      0.33333334 = coord(1/3)
    
    Abstract
    Research evaluation has become routine and often relies on metrics. But it is increasingly driven by data rather than by expert judgement. As a result, the procedures that were designed to increase the quality of research are now threatening to damage the scientific system. To support researchers and managers, five experts led by Diana Hicks, professor in the School of Public Policy at Georgia Institute of Technology, and Paul Wouters, director of CWTS at Leiden University, have proposed ten principles for the measurement of research performance: the Leiden Manifesto for Research Metrics, published as a comment in Nature.
  9. Talja, S.; Vakkari, P.; Fry, J.; Wouters, P.: Impact of research cultures on the use of digital library resources (2007) 0.00
    0.0044454644 = product of:
      0.013336393 = sum of:
        0.013336393 = weight(_text_:on in 590) [ClassicSimilarity], result of:
          0.013336393 = score(doc=590,freq=2.0), product of:
            0.109763056 = queryWeight, product of:
              2.199415 = idf(docFreq=13325, maxDocs=44218)
              0.04990557 = queryNorm
            0.121501654 = fieldWeight in 590, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.199415 = idf(docFreq=13325, maxDocs=44218)
              0.0390625 = fieldNorm(doc=590)
      0.33333334 = coord(1/3)
    
  10. Zahedi, Z.; Costas, R.; Wouters, P.: Mendeley readership as a filtering tool to identify highly cited publications (2017) 0.00
    0.0044454644 = product of:
      0.013336393 = sum of:
        0.013336393 = weight(_text_:on in 3837) [ClassicSimilarity], result of:
          0.013336393 = score(doc=3837,freq=2.0), product of:
            0.109763056 = queryWeight, product of:
              2.199415 = idf(docFreq=13325, maxDocs=44218)
              0.04990557 = queryNorm
            0.121501654 = fieldWeight in 3837, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.199415 = idf(docFreq=13325, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3837)
      0.33333334 = coord(1/3)
    
    Abstract
    This study presents a large-scale analysis of the distribution and presence of Mendeley readership scores over time and across disciplines. We study whether Mendeley readership scores (RS) can identify highly cited publications more effectively than journal citation scores (JCS). Web of Science (WoS) publications with digital object identifiers (DOIs) published during the period 2004-2013 and across five major scientific fields were analyzed. The main result of this study shows that RS are more effective (in terms of precision/recall values) than JCS at identifying highly cited publications across all fields of science and publication years. The findings also show that 86.5% of all the publications are covered by Mendeley and have at least one reader. The share of publications with Mendeley RS increased from 84% in 2004 to 89% in 2009, and then decreased from 88% in 2010 to 82% in 2013. However, publications from 2010 onwards exhibit, on average, a higher density of readership than of citation scores. This indicates that, compared to citation scores, RS are more prevalent for recent publications and hence could serve as an early indicator of research impact. These findings highlight the potential and value of Mendeley as a tool for scientometric purposes and particularly as a relevant tool to identify highly cited publications.
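    The precision/recall comparison described above treats an indicator threshold as a binary filter for highly cited publications. A minimal sketch with hypothetical publication identifiers (not the study's data):

      def precision_recall(flagged, relevant):
          # Precision: share of flagged publications that are truly highly cited.
          # Recall: share of highly cited publications that the filter catches.
          tp = len(flagged & relevant)
          precision = tp / len(flagged) if flagged else 0.0
          recall = tp / len(relevant) if relevant else 0.0
          return precision, recall

      highly_cited   = {"p1", "p2", "p3", "p4"}        # ground truth (toy example)
      flagged_by_rs  = {"p1", "p2", "p3", "p9"}        # selected by a readership-score threshold
      flagged_by_jcs = {"p1", "p5", "p6", "p7", "p8"}  # selected by a journal-citation-score threshold

      print(precision_recall(flagged_by_rs, highly_cited))   # (0.75, 0.75)
      print(precision_recall(flagged_by_jcs, highly_cited))  # (0.2, 0.25)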