Search (10 results, page 1 of 1)

  • author_ss:"Larivière, V."
  • year_i:[2010 TO 2020}
  1. Haustein, S.; Sugimoto, C.; Larivière, V.: Social media in scholarly communication : Guest editorial (2015) 0.07
    Abstract
    This year marks 350 years since the inaugural publications of both the Journal des Sçavans and the Philosophical Transactions, first published in 1665 and considered the birth of the peer-reviewed journal article. This form of scholarly communication has not only remained the dominant model for disseminating new knowledge (particularly for science and medicine), but has also increased substantially in volume. Derek de Solla Price - the "father of scientometrics" (Merton and Garfield, 1986, p. vii) - was the first to document the exponential increase in scientific journals and showed that "scientists have always felt themselves to be awash in a sea of the scientific literature" (Price, 1963, p. 15), as, for example, expressed at the 1948 Royal Society's Scientific Information Conference: Not for the first time in history, but more acutely than ever before, there was a fear that scientists would be overwhelmed, that they would be no longer able to control the vast amounts of potentially relevant material that were pouring forth from the world's presses, that science itself was under threat (Bawden and Robinson, 2008, p. 183).
    One of the solutions to help scientists filter the most relevant publications and, thus, to stay current on developments in their fields during the transition from "little science" to "big science", was the introduction of citation indexing as a Wellsian "World Brain" (Garfield, 1964) of scientific information: It is too much to expect a research worker to spend an inordinate amount of time searching for the bibliographic descendants of antecedent papers. It would not be excessive to demand that the thorough scholar check all papers that have cited or criticized such papers, if they could be located quickly. The citation index makes this check practicable (Garfield, 1955, p. 108). In retrospect, citation indexing can be perceived as a pre-social web version of crowdsourcing, as it is based on the concept that the community of citing authors outperforms indexers in highlighting cognitive links between papers, particularly on the level of specific ideas and concepts (Garfield, 1983). Over the last 50 years, citation analysis and, more generally, bibliometric methods have developed from information retrieval tools into research evaluation metrics, where they are presumed to make scientific funding more efficient and effective (Moed, 2006). However, the dominance of bibliometric indicators in research evaluation has also led to significant goal displacement (Merton, 1957) and the oversimplification of notions of "research productivity" and "scientific quality", creating adverse effects such as salami publishing, honorary authorships, citation cartels, and misuse of indicators (Binswanger, 2015; Cronin and Sugimoto, 2014; Frey and Osterloh, 2006; Haustein and Larivière, 2015; Weingart, 2005).
    Furthermore, the rise of the web, and subsequently, the social web, has challenged the quasi-monopolistic status of the journal as the main form of scholarly communication and citation indices as the primary assessment mechanisms. Scientific communication is becoming more open, transparent, and diverse: publications are increasingly open access; manuscripts, presentations, code, and data are shared online; research ideas and results are discussed and criticized openly on blogs; and new peer review experiments, with open post publication assessment by anonymous or non-anonymous referees, are underway. The diversification of scholarly production and assessment, paired with the increasing speed of the communication process, leads to an increased information overload (Bawden and Robinson, 2008), demanding new filters. The concept of altmetrics, short for alternative (to citation) metrics, was created out of an attempt to provide a filter (Priem et al., 2010) and to steer against the oversimplification of the measurement of scientific success solely on the basis of number of journal articles published and citations received, by considering a wider range of research outputs and metrics (Piwowar, 2013). Although the term altmetrics was introduced in a tweet in 2010 (Priem, 2010), the idea of capturing traces - "polymorphous mentioning" (Cronin et al., 1998, p. 1320) - of scholars and their documents on the web to measure "impact" of science in a broader manner than citations was introduced years before, largely in the context of webometrics (Almind and Ingwersen, 1997; Thelwall et al., 2005):
    There will soon be a critical mass of web-based digital objects and usage statistics on which to model scholars' communication behaviors - publishing, posting, blogging, scanning, reading, downloading, glossing, linking, citing, recommending, acknowledging - and with which to track their scholarly influence and impact, broadly conceived and broadly felt (Cronin, 2005, p. 196). A decade after Cronin's prediction and five years after the coining of altmetrics, the time seems ripe to reflect upon the role of social media in scholarly communication. This Special Issue does so by providing an overview of current research on the indicators and metrics grouped under the umbrella term of altmetrics, on their relationships with traditional indicators of scientific activity, and on the uses that are made of the various social media platforms - on which these indicators are based - by scientists of various disciplines.
    Date
    20.1.2015 18:30:22
  2. Mongeon, P.; Larivière, V.: Costly collaborations : the impact of scientific fraud on co-authors' careers (2016) 0.04
    Abstract
    Over the past few years, several major scientific fraud cases have shocked the scientific community. The number of retractions each year has also increased tremendously, especially in the biomedical field, and scientific misconduct accounts for more than half of those retractions. It is assumed that co-authors of retracted papers are affected by their colleagues' misconduct, and the aim of this study is to provide empirical evidence of the effect of retractions in biomedical research on co-authors' research careers. Using data from the Web of Science, we measured the productivity, impact, and collaboration of 1,123 co-authors of 293 retracted articles for a period of 5 years before and after the retraction. We found clear evidence that collaborators do suffer consequences of their colleagues' misconduct and that a retraction for fraud has greater consequences than a retraction for error. Our results also suggest that the extent of these consequences is closely linked to the ranking of co-authors on the retracted paper, being felt most strongly by first authors, followed by last authors, with the impact being least pronounced for middle authors.
  3. Larivière, V.; Sugimoto, C.R.; Bergeron, P.: In their own image? : a comparison of doctoral students' and faculty members' referencing behavior (2013) 0.01
  4. Atanassova, I.; Bertin, M.; Larivière, V.: On the composition of scientific abstracts (2016) 0.01
    Abstract
    Purpose - Scientific abstracts reproduce only part of the information and the complexity of argumentation in a scientific article. The purpose of this paper is to provide a first analysis of the similarity between the text of scientific abstracts and the body of articles, using sentences as the basic textual unit. It contributes to the understanding of the structure of abstracts. Design/methodology/approach - Using sentence-based similarity metrics, the authors quantify the phenomenon of text re-use in abstracts and examine the positions of the sentences that are similar to sentences in abstracts in the introduction, methods, results and discussion structure, using a corpus of over 85,000 research articles published in the seven Public Library of Science journals. Findings - The authors provide evidence that 84 percent of abstracts have at least one sentence in common with the body of the paper. Studying the distributions of sentences in the body of the articles that are re-used in abstracts, the authors show that there exists a strong relation between the rhetorical structure of articles and the zones that authors re-use when writing abstracts, with sentences mainly coming from the beginning of the introduction and the end of the conclusion. Originality/value - Scientific abstracts contain what is considered by the author(s) to be the information that best describes a document's content. This is a first study that examines the relation between the contents of abstracts and the rhetorical structure of scientific articles. The work might provide new insight for improving automatic abstracting tools as well as information retrieval approaches in which text organization and structure are important features.
  5. Vincent-Lamarre, P.; Boivin, J.; Gargouri, Y.; Larivière, V.; Harnad, S.: Estimating open access mandate effectiveness : the MELIBEA score (2016) 0.01
  6. Larivière, V.; Sugimoto, C.R.; Cronin, B.: A bibliometric chronicling of library and information science's first hundred years (2012) 0.01
    Abstract
    This paper presents a condensed history of Library and Information Science (LIS) over the course of more than a century using a variety of bibliometric measures. It examines in detail the variable rate of knowledge production in the field, shifts in subject coverage, the dominance of particular publication genres at different times, prevailing modes of production, interactions with other disciplines, and, more generally, observes how the field has evolved. It shows that, despite a striking growth in the number of journals, papers, and contributing authors, a decrease was observed in the field's market-share of all social science and humanities research. Collaborative authorship is now the norm, a pattern seen across the social sciences. The idea of boundary crossing was also examined: in 2010, nearly 60% of authors who published in LIS also published in another discipline. This high degree of permeability in LIS was also demonstrated through reference and citation practices: LIS scholars now cite and receive citations from other fields more than from LIS itself. Two major structural shifts are revealed in the data: in 1960, LIS changed from a professional field focused on librarianship to an academic field focused on information and use; and in 1990, LIS began to receive a growing number of citations from outside the field, notably from Computer Science and Management, and saw a dramatic increase in the number of authors contributing to the literature of the field.
  7. Larivière, V.; Gingras, Y.; Sugimoto, C.R.; Tsou, A.: Team size matters : collaboration and scientific impact since 1900 (2015) 0.01
    Abstract
    This article provides the first historical analysis of the relationship between collaboration and scientific impact using three indicators of collaboration (number of authors, number of addresses, and number of countries) derived from articles published between 1900 and 2011. The results demonstrate that an increase in the number of authors leads to an increase in impact, from the beginning of the last century onward, and that this is not due simply to self-citations. A similar trend is also observed for the number of addresses and number of countries represented in the byline of an article. However, the constant inflation of collaboration since 1900 has resulted in diminishing citation returns: Larger and more diverse (in terms of institutional and country affiliation) teams are necessary to realize higher impact. The article concludes with a discussion of the potential causes of the impact gain in citations of collaborative papers.
  8. Shu, F.; Julien, C.-A.; Larivière, V.: Does the Web of Science accurately represent chinese scientific performance? (2019) 0.01
    Abstract
    With the significant development of China's economy and scientific activity, its scientific publication activity is experiencing a period of rapid growth. However, measuring China's research output remains a challenge because Chinese scholars may publish their research in either international or national journals, yet no bibliometric database covers both the Chinese and English scientific literature. The purpose of this study is to compare the Web of Science (WoS) with a Chinese bibliometric database in terms of authors and their performance, demonstrate the extent of the overlap between the two groups of most productive Chinese authors in the international and Chinese bibliometric databases, and determine how different disciplines may affect this overlap. The results of this study indicate that Chinese bibliometric databases, or a combination of WoS and Chinese bibliometric databases, should be used to evaluate Chinese research performance, except in the few disciplines in which Chinese research performance could be assessed using WoS alone.
  9. Kirchik, O.; Gingras, Y.; Larivière, V.: Changes in publication languages and citation practices and their effect on the scientific impact of Russian science (1993-2010) (2012) 0.00
    Abstract
    This article analyzes the effects of publication language on the international scientific visibility of Russia using the Web of Science (WoS). Like other developing and transition countries, it is subject to a growing pressure to "internationalize" its scientific activities, which primarily means a shift to English as a language of scientific communication. But to what extent does the transition to English improve the impact of research? The case of Russia is of interest in this respect, as the existence of many combinations of national journals and languages of publication (namely, Russian and English, including translated journals) provides a kind of natural experiment to test the effects of language and publisher's country on the international visibility of research through citations, as well as on the referencing practices of authors. Our analysis points to the conclusion that the production of original English-language papers in foreign journals is a more efficient strategy of internationalization than the mere translation of domestic journals. If the objective of a country is to maximize the international visibility of its scientific work, then the efforts should go into the promotion of publication in reputed English-language journals to profit from the added effect provided by the Matthew effect of these venues.
  10. Mohammadi, E.; Thelwall, M.; Haustein, S.; Larivière, V.: Who reads research articles? : an altmetrics analysis of Mendeley user categories (2015) 0.00
    Abstract
    Little detailed information is known about who reads research articles and the contexts in which research articles are read. Using data about people who register in Mendeley as readers of articles, this article explores different types of users of Clinical Medicine, Engineering and Technology, Social Science, Physics, and Chemistry articles inside and outside academia. The majority of readers for all disciplines were PhD students, postgraduates, and postdocs but other types of academics were also represented. In addition, many Clinical Medicine articles were read by medical professionals. The highest correlations between citations and Mendeley readership counts were found for types of users who often authored academic articles, except for associate professors in some sub-disciplines. This suggests that Mendeley readership can reflect usage similar to traditional citation impact if the data are restricted to readers who are also authors without the delay of impact measured by citation counts. At the same time, Mendeley statistics can also reveal the hidden impact of some research articles, such as educational value for nonauthor users inside academia or the impact of research articles on practice for readers outside academia.