Search (8 results, page 1 of 1)

  • author_ss:"Gingras, Y."
  • language_ss:"e"
  • year_i:[2010 TO 2020}
  1. Kirchik, O.; Gingras, Y.; Larivière, V.: Changes in publication languages and citation practices and their effect on the scientific impact of Russian science (1993-2010) (2012) 0.00
    0.0039210445 = product of:
      0.015684178 = sum of:
        0.015684178 = product of:
          0.031368356 = sum of:
            0.031368356 = weight(_text_:science in 284) [ClassicSimilarity], result of:
              0.031368356 = score(doc=284,freq=6.0), product of:
                0.124457374 = queryWeight, product of:
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.047248192 = queryNorm
                0.25204095 = fieldWeight in 284, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=284)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Abstract
    This article analyzes the effects of publication language on the international scientific visibility of Russia using the Web of Science (WoS). Like other developing and transition countries, Russia is subject to growing pressure to "internationalize" its scientific activities, which primarily means shifting to English as the language of scientific communication. But to what extent does the transition to English improve the impact of research? The case of Russia is of interest in this respect because the existence of many combinations of national journals and publication languages (namely Russian and English, including translated journals) provides a kind of natural experiment for testing the effects of language and publisher's country on the international visibility of research through citations, as well as on the referencing practices of authors. Our analysis points to the conclusion that producing original English-language papers in foreign journals is a more efficient internationalization strategy than merely translating domestic journals. If the objective of a country is to maximize the international visibility of its scientific work, then efforts should go into promoting publication in reputed English-language journals, so as to profit from the citation advantage conferred by the Matthew effect of these venues.
    Source
    Journal of the American Society for Information Science and Technology. 63(2012) no.7, S.1411-1419
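
The explain tree under result 1 can be reproduced by hand. A minimal sketch of Lucene's ClassicSimilarity arithmetic, using only the constants shown in the tree (tf = sqrt(freq), queryWeight = idf × queryNorm, fieldWeight = tf × idf × fieldNorm, with the two coord factors applied last):

```python
import math

# Constants taken from the explain tree for result 1 (term "science", doc 284)
freq = 6.0                 # termFreq of "science" in the document
idf = 2.6341193            # idf(docFreq=8627, maxDocs=44218)
query_norm = 0.047248192   # queryNorm
field_norm = 0.0390625     # fieldNorm(doc=284), encodes field length

tf = math.sqrt(freq)                     # ClassicSimilarity tf = sqrt(freq) = 2.4494898
query_weight = idf * query_norm          # 0.124457374
field_weight = tf * idf * field_norm     # 0.25204095
score = query_weight * field_weight      # 0.031368356

# coord(1/2) then coord(1/4): only one of two, then one of four, query clauses matched
final = score * 0.5 * 0.25               # 0.0039210445
print(final)
```

The product matches the top line of the explain tree (0.0039210445); the "0.00" shown next to each title is simply this value rounded to two decimals.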
  2. Larivière, V.; Gingras, Y.: The impact factor's Matthew Effect : a natural experiment in bibliometrics (2010) 0.00
    
    Abstract
    Since the publication of Robert K. Merton's theory of cumulative advantage in science (Matthew Effect), several empirical studies have tried to measure its presence at the level of papers, individual researchers, institutions, or countries. However, these studies seldom control for the intrinsic quality of papers or of researchers - better (however defined) papers or researchers could receive higher citation rates because they are indeed of better quality. Using an original method for controlling for the intrinsic value of papers - identical duplicate papers published in different journals with different impact factors - this paper shows that the journal in which a paper is published has a strong influence on its citation rate: duplicate papers published in high-impact journals obtain, on average, twice as many citations as their identical counterparts published in journals with lower impact factors. The intrinsic value of a paper is thus not the only reason it gets cited; there is a specific Matthew Effect attached to journals, which gives papers published there an added value over and above their intrinsic quality.
    Source
    Journal of the American Society for Information Science and Technology. 61(2010) no.2, S.424-427
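
The natural experiment described in the abstract reduces to a comparison over matched pairs of identical papers. A minimal sketch of that comparison; the citation counts below are invented, purely for illustration:

```python
# Hypothetical duplicate pairs: (citations of the copy in the higher-impact
# journal, citations of the identical copy in the lower-impact journal)
pairs = [(40, 18), (12, 7), (55, 30), (8, 3)]

# Per-pair citation ratio, then the average advantage of the high-IF venue
ratios = [hi / lo for hi, lo in pairs]
mean_ratio = sum(ratios) / len(ratios)
print(f"high-impact copies receive {mean_ratio:.1f}x the citations of their duplicates")
```

Because each pair consists of the same text, any systematic ratio above 1 is attributable to the venue rather than to paper quality, which is the study's "roughly twice as many citations" finding.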
  3. Larivière, V.; Lozano, G.A.; Gingras, Y.: Are elite journals declining? (2014) 0.00
    
    Abstract
    Previous research indicates that during the past 20 years, the highest-quality work has been published in an increasingly diverse and larger group of journals. In this article, we examine whether this diversification has also affected the handful of elite journals that are traditionally considered to be the best. We examine citation patterns during the past 40 years of seven long-standing traditionally elite journals and six journals that have been increasing in importance during the past 20 years. To be among the top 5% or 1% most-cited papers, a paper now needs about twice as many citations as it did 40 years ago. Since the late 1980s and early 1990s, elite journals have been publishing a decreasing proportion of these top-cited papers. This also applies to the two journals typically considered the top venues and often used as bibliometric indicators of "excellence": Science and Nature. On the other hand, several new and established journals are publishing an increasing proportion of the most-cited papers. These changes bring new challenges and opportunities for all parties. Journals can enact policies to increase or maintain their relative position in the journal hierarchy. Researchers now have the option to publish in more diverse venues, knowing that their work can still reach the same audiences. Finally, evaluators and administrators need to know that although there will always be a certain prestige associated with publishing in "elite" journals, journal hierarchies are in constant flux.
    Source
    Journal of the Association for Information Science and Technology. 65(2014) no.4, S.649-655
  4. Larivière, V.; Gingras, Y.; Sugimoto, C.R.; Tsou, A.: Team size matters : collaboration and scientific impact since 1900 (2015) 0.00
    
    Source
    Journal of the Association for Information Science and Technology. 66(2015) no.7, S.1323-1332
  5. Bertin, M.; Atanassova, I.; Gingras, Y.; Larivière, V.: The invariant distribution of references in scientific articles (2016) 0.00
    
    Source
    Journal of the Association for Information Science and Technology. 67(2016) no.1, S.164-177
  6. Gingras, Y.: Bibliometrics and research evaluation : uses and abuses (2016) 0.00
    
    Abstract
    The research evaluation market is booming. "Ranking," "metrics," "h-index," and "impact factors" are reigning buzzwords. Government and research administrators want to evaluate everything -- teachers, professors, training programs, universities -- using quantitative indicators. Among the tools used to measure "research excellence," bibliometrics -- aggregate data on publications and citations -- has become dominant. Bibliometrics is hailed as an "objective" measure of research quality, a quantitative measure more useful than "subjective" and intuitive evaluation methods such as peer review that have been used since scientific papers were first published in the seventeenth century. In this book, Yves Gingras offers a spirited argument against an unquestioning reliance on bibliometrics as an indicator of research quality. Gingras shows that bibliometric rankings have no real scientific validity, rarely measuring what they purport to measure. Although the study of publication and citation patterns, at the proper scales, can yield insights on the global dynamics of science over time, ill-defined quantitative indicators often generate perverse and unintended effects on the direction of research. Moreover, abuse of bibliometrics occurs when data are manipulated to boost rankings. Gingras looks at the politics of evaluation and argues that using numbers can be a way to control scientists and diminish their autonomy in the evaluation process. Proposing precise criteria for establishing the validity of indicators at a given scale of analysis, Gingras questions why universities are so eager to let invalid indicators influence their research strategy.
    Series
    History and foundations of information science
  7. Larivière, V.; Gingras, Y.: On the prevalence and scientific impact of duplicate publications in different scientific fields (1980-2007) (2010) 0.00
    
    Abstract
    Purpose - The issue of duplicate publications has received a lot of attention in the medical literature, but much less in the information science community. This paper aims to analyze the prevalence and scientific impact of duplicate publications across all fields of research between 1980 and 2007. Design/methodology/approach - The approach is a bibliometric analysis of duplicate papers based on their metadata. Duplicate papers are defined as papers published in two different journals having: the exact same title; the same first author; and the same number of cited references. Findings - In all fields combined, the prevalence of duplicates is one out of 2,000 papers, but it is higher in the natural and medical sciences than in the social sciences and humanities. A very high proportion (>85 percent) of these papers are published the same year or one year apart, which suggests that most duplicate papers were submitted simultaneously. Furthermore, duplicate papers are generally published in journals with impact factors below the average of their field and obtain fewer citations. Originality/value - The paper provides clear evidence that the prevalence of duplicate papers is low and, more importantly, that the scientific impact of such papers is below average.
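
The duplicate definition in the abstract (exact same title, same first author, same number of cited references) is effectively a grouping key. A minimal sketch; the field names and records below are hypothetical, for illustration only:

```python
from collections import defaultdict

def find_duplicates(papers):
    """Group papers by the study's duplicate key: exact title,
    first author, and number of cited references."""
    groups = defaultdict(list)
    for p in papers:
        key = (p["title"], p["first_author"], p["n_refs"])
        groups[key].append(p)
    # Any key shared by more than one paper marks a duplicate set
    return [g for g in groups.values() if len(g) > 1]

# Hypothetical bibliographic records
papers = [
    {"title": "On X", "first_author": "Smith", "n_refs": 30, "journal": "A"},
    {"title": "On X", "first_author": "Smith", "n_refs": 30, "journal": "B"},
    {"title": "On Y", "first_author": "Lee",   "n_refs": 12, "journal": "C"},
]
dups = find_duplicates(papers)
print(dups)
```

Note that the key deliberately avoids fuzzy matching: requiring an exact title and an identical reference count keeps false positives near zero, at the cost of missing lightly revised duplicates.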
  8. Lozano, G.A.; Larivière, V.; Gingras, Y.: The weakening relationship between the impact factor and papers' citations in the digital age (2012) 0.00
    
    Source
    Journal of the American Society for Information Science and Technology. 63(2012) no.11, S.2140-2145