Search (13 results, page 1 of 1)

  • author_ss:"Larivière, V."
  1. Larivière, V.; Gingras, Y.; Archambault, E.: The decline in the concentration of citations, 1900-2007 (2009) 0.05
    Abstract
    This article challenges recent research (Evans, 2008) reporting that the concentration of cited scientific literature increases with the online availability of articles and journals. Using Thomson Reuters' Web of Science, the present article analyses changes in the concentration of citations received (2- and 5-year citation windows) by papers published between 1900 and 2005. Three measures of concentration are used: the percentage of papers that received at least one citation (cited papers); the percentage of papers needed to account for 20%, 50%, and 80% of the citations; and the Herfindahl-Hirschman index (HHI). These measures are used for four broad disciplines: natural sciences and engineering, medical fields, social sciences, and the humanities. All these measures converge and show that, contrary to what was reported by Evans, the dispersion of citations is actually increasing.
    Date
    22. 3.2009 19:22:35
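The three concentration measures named in the abstract are straightforward to reproduce from a list of per-paper citation counts. A minimal sketch (the function name and sample data are illustrative, not taken from the paper):

```python
def concentration_measures(citations):
    """Concentration of a citation distribution, given per-paper counts:
    share of papers cited at least once, smallest share of papers that
    accounts for 20%/50%/80% of all citations, and the
    Herfindahl-Hirschman index over per-paper citation shares."""
    n = len(citations)
    total = sum(citations)
    # Percentage of papers that received at least one citation.
    cited_share = sum(1 for c in citations if c > 0) / n
    ranked = sorted(citations, reverse=True)

    def papers_for(share):
        # Walk down the ranked list until the cumulative citations
        # reach the requested share of the total.
        acc = 0
        for i, c in enumerate(ranked, start=1):
            acc += c
            if acc >= share * total:
                return i / n
        return 1.0

    hhi = sum((c / total) ** 2 for c in citations)
    return cited_share, papers_for(0.2), papers_for(0.5), papers_for(0.8), hhi
```

A higher HHI and smaller `papers_for` shares indicate more concentrated citations; the paper's finding is the opposite trend, i.e. growing dispersion over time.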
  2. Haustein, S.; Sugimoto, C.; Larivière, V.: Social media in scholarly communication : Guest editorial (2015) 0.03
    Abstract
    One of the solutions to help scientists filter the most relevant publications and, thus, to stay current on developments in their fields during the transition from "little science" to "big science", was the introduction of citation indexing as a Wellsian "World Brain" (Garfield, 1964) of scientific information: It is too much to expect a research worker to spend an inordinate amount of time searching for the bibliographic descendants of antecedent papers. It would not be excessive to demand that the thorough scholar check all papers that have cited or criticized such papers, if they could be located quickly. The citation index makes this check practicable (Garfield, 1955, p. 108). In retrospect, citation indexing can be perceived as a pre-social web version of crowdsourcing, as it is based on the concept that the community of citing authors outperforms indexers in highlighting cognitive links between papers, particularly on the level of specific ideas and concepts (Garfield, 1983). Over the last 50 years, citation analysis and, more generally, bibliometric methods have developed from information retrieval tools to research evaluation metrics, where they are presumed to make scientific funding more efficient and effective (Moed, 2006). However, the dominance of bibliometric indicators in research evaluation has also led to significant goal displacement (Merton, 1957) and the oversimplification of notions of "research productivity" and "scientific quality", creating adverse effects such as salami publishing, honorary authorships, citation cartels, and misuse of indicators (Binswanger, 2015; Cronin and Sugimoto, 2014; Frey and Osterloh, 2006; Haustein and Larivière, 2015; Weingart, 2005).
    Furthermore, the rise of the web, and subsequently, the social web, has challenged the quasi-monopolistic status of the journal as the main form of scholarly communication and citation indices as the primary assessment mechanisms. Scientific communication is becoming more open, transparent, and diverse: publications are increasingly open access; manuscripts, presentations, code, and data are shared online; research ideas and results are discussed and criticized openly on blogs; and new peer review experiments, with open post-publication assessment by anonymous or non-anonymous referees, are underway. The diversification of scholarly production and assessment, paired with the increasing speed of the communication process, leads to increased information overload (Bawden and Robinson, 2008), demanding new filters. The concept of altmetrics, short for alternative (to citation) metrics, was created out of an attempt to provide a filter (Priem et al., 2010) and to steer against the oversimplification of the measurement of scientific success solely on the basis of number of journal articles published and citations received, by considering a wider range of research outputs and metrics (Piwowar, 2013). Although the term altmetrics was introduced in a tweet in 2010 (Priem, 2010), the idea of capturing traces - "polymorphous mentioning" (Cronin et al., 1998, p. 1320) - of scholars and their documents on the web to measure "impact" of science in a broader manner than citations was introduced years before, largely in the context of webometrics (Almind and Ingwersen, 1997; Thelwall et al., 2005):
    There will soon be a critical mass of web-based digital objects and usage statistics on which to model scholars' communication behaviors - publishing, posting, blogging, scanning, reading, downloading, glossing, linking, citing, recommending, acknowledging - and with which to track their scholarly influence and impact, broadly conceived and broadly felt (Cronin, 2005, p. 196). A decade after Cronin's prediction and five years after the coining of altmetrics, the time seems ripe to reflect upon the role of social media in scholarly communication. This Special Issue does so by providing an overview of current research on the indicators and metrics grouped under the umbrella term of altmetrics, on their relationships with traditional indicators of scientific activity, and on the uses that are made of the various social media platforms - on which these indicators are based - by scientists of various disciplines.
    Date
    20. 1.2015 18:30:22
  3. Shu, F.; Julien, C.-A.; Larivière, V.: Does the Web of Science accurately represent Chinese scientific performance? (2019) 0.02
    Abstract
    With the significant development of China's economy and scientific activity, its scientific publication activity is experiencing a period of rapid growth. However, measuring China's research output remains a challenge because Chinese scholars may publish their research in either international or national journals, yet no bibliometric database covers both the Chinese and English scientific literature. The purpose of this study is to compare Web of Science (WoS) with a Chinese bibliometric database in terms of authors and their performance, demonstrate the extent of the overlap between the two groups of most productive Chinese authors in both international and Chinese bibliometric databases, and determine how different disciplines may affect this overlap. The results of this study indicate that Chinese bibliometric databases, or a combination of WoS and Chinese bibliometric databases, should be used to evaluate Chinese research performance except in the few disciplines in which Chinese research performance could be assessed using WoS only.
    Object
    Web of Science
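The overlap the abstract measures can be sketched as a set comparison of the k most productive authors in each database. A minimal sketch (author names and counts below are invented, not from the study):

```python
def top_author_overlap(counts_a, counts_b, k=100):
    """Jaccard overlap of the top-k most productive authors
    (by paper count) in two bibliometric databases."""
    def top_k(counts):
        # Author names sorted by descending paper count, truncated to k.
        return set(sorted(counts, key=counts.get, reverse=True)[:k])
    a, b = top_k(counts_a), top_k(counts_b)
    return len(a & b) / len(a | b)
```

An overlap near 1.0 would suggest WoS alone suffices for that discipline; a low overlap supports the paper's conclusion that a Chinese database must be consulted as well.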
  4. Larivière, V.; Macaluso, B.: Improving the coverage of social science and humanities researchers' output : the case of the Érudit journal platform (2011) 0.02
    Abstract
    In non-English-speaking countries the measurement of research output in the social sciences and humanities (SSH) using standard bibliographic databases suffers from a major drawback: the underrepresentation of articles published in local, non-English, journals. Using papers indexed (1) in a local database of periodicals (Érudit) and (2) in the Web of Science, assigned to the population of university professors in the province of Québec, this paper quantifies, for individual researchers and departments, the importance of papers published in local journals. It also analyzes differences across disciplines and between French-speaking and English-speaking universities. The results show that, while the addition of papers published in local journals to bibliometric measures has little effect when all disciplines are considered and for anglophone universities, it increases the output of researchers from francophone universities in the social sciences and humanities by almost a third. It also shows that there is very little relation, at the level of individual researchers or departments, between the output indexed in the Web of Science and the output retrieved from the Érudit database; a clear demonstration that the Web of Science cannot be used as a proxy for the "overall" production of SSH researchers in Québec. The paper concludes with a discussion on these disciplinary and language differences, as well as on their implications for rankings of universities.
    Object
    Web of Science
  5. Archambault, E.; Campbell, D.; Gingras, Y.; Larivière, V.: Comparing bibliometric statistics obtained from the Web of Science and Scopus (2009) 0.01
    Abstract
    For more than 40 years, the Institute for Scientific Information (ISI, now part of Thomson Reuters) produced the only available bibliographic databases from which bibliometricians could compile large-scale bibliometric indicators. ISI's citation indexes, now regrouped under the Web of Science (WoS), were the major sources of bibliometric data until 2004, when Scopus was launched by the publisher Reed Elsevier. For those who perform bibliometric analyses and comparisons of countries or institutions, the existence of these two major databases raises the important question of the comparability and stability of statistics obtained from different data sources. This paper uses macrolevel bibliometric indicators to compare results obtained from the WoS and Scopus. It shows that the correlations between the measures obtained with both databases for the number of papers and the number of citations received by countries, as well as for their ranks, are extremely high. There is also a very high correlation when countries' papers are broken down by field. The paper thus provides evidence that indicators of scientific production and citations at the country level are stable and largely independent of the database.
    Object
    Web of Science
  6. Hu, B.; Dong, X.; Zhang, C.; Bowman, T.D.; Ding, Y.; Milojevic, S.; Ni, C.; Yan, E.; Larivière, V.: A lead-lag analysis of the topic evolution patterns for preprints and publications (2015) 0.01
    Abstract
    This study applied LDA (latent Dirichlet allocation) and regression analysis to conduct a lead-lag analysis to identify different topic evolution patterns between preprints and papers from arXiv and the Web of Science (WoS) in astrophysics over the last 20 years (1992-2011). Fifty topics in arXiv and WoS were generated using an LDA algorithm and then regression models were used to explain 4 types of topic growth patterns. Based on the slopes of the fitted equation curves, the paper redefines the topic trends and popularity. Results show that arXiv and WoS share similar topics in a given domain, but differ in evolution trends. Topics in WoS lose their popularity much earlier and their durations of popularity are shorter than those in arXiv. This work demonstrates that open access preprints have stronger growth tendency as compared to traditional printed publications.
  7. Larivière, V.; Gingras, Y.: On the relationship between interdisciplinarity and scientific impact (2009) 0.01
    Abstract
    This article analyzes the effect of interdisciplinarity on the scientific impact of individual articles. Using all the articles published in Web of Science in 2000, we define the degree of interdisciplinarity of a given article as the percentage of its cited references made to journals of other disciplines. We show that although for all disciplines combined there is no clear correlation between the level of interdisciplinarity of articles and their citation rates, there are nonetheless some disciplines in which a higher level of interdisciplinarity is related to higher citation rates. For other disciplines, citations decline as interdisciplinarity grows. One characteristic is visible in all disciplines: Highly disciplinary and highly interdisciplinary articles have a low scientific impact. This suggests that there might be an optimum of interdisciplinarity beyond which the research is too dispersed to find its niche and under which it is too mainstream to have high impact. Finally, the relationship between interdisciplinarity and scientific impact is highly determined by the citation characteristics of the disciplines involved: Articles citing citation-intensive disciplines are more likely to be cited by those disciplines and, hence, obtain higher citation scores than would articles citing non-citation-intensive disciplines.
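The interdisciplinarity measure defined in the abstract reduces to a single ratio per article. A minimal sketch (discipline labels are illustrative):

```python
def interdisciplinarity(article_discipline, cited_journal_disciplines):
    """Degree of interdisciplinarity of one article: the share of its
    cited references made to journals of other disciplines (0.0-1.0)."""
    if not cited_journal_disciplines:
        return 0.0
    outside = sum(1 for d in cited_journal_disciplines
                  if d != article_discipline)
    return outside / len(cited_journal_disciplines)
```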
  8. Kirchik, O.; Gingras, Y.; Larivière, V.: Changes in publication languages and citation practices and their effect on the scientific impact of Russian science (1993-2010) (2012) 0.01
    Abstract
    This article analyzes the effects of publication language on the international scientific visibility of Russia using the Web of Science (WoS). Like other developing and transition countries, it is subject to a growing pressure to "internationalize" its scientific activities, which primarily means a shift to English as a language of scientific communication. But to what extent does the transition to English improve the impact of research? The case of Russia is of interest in this respect as the existence of many combinations of national journals and languages of publications (namely, Russian and English, including translated journals) provides a kind of natural experiment to test the effects of language and publisher's country on the international visibility of research through citations as well as on the referencing practices of authors. Our analysis points to the conclusion that the production of original English-language papers in foreign journals is a more efficient strategy of internationalization than the mere translation of domestic journals. If the objective of a country is to maximize the international visibility of its scientific work, then the efforts should go into the promotion of publication in reputed English-language journals to profit from the added effect provided by the Matthew effect of these venues.
  9. Haustein, S.; Peters, I.; Sugimoto, C.R.; Thelwall, M.; Larivière, V.: Tweeting biomedicine : an analysis of tweets and citations in the biomedical literature (2014) 0.01
    Abstract
    Data collected by social media platforms have been introduced as new sources for indicators to help measure the impact of scholarly research in ways that are complementary to traditional citation analysis. Data generated from social media activities can be used to reflect broad types of impact. This article aims to provide systematic evidence about how often Twitter is used to disseminate information about journal articles in the biomedical sciences. The analysis is based on 1.4 million documents covered by both PubMed and Web of Science and published between 2010 and 2012. The number of tweets containing links to these documents was analyzed and compared to citations to evaluate the degree to which certain journals, disciplines, and specialties were represented on Twitter and how far tweets correlate with citation impact. With less than 10% of PubMed articles mentioned on Twitter, its uptake is low in general but differs between journals and specialties. Correlations between tweets and citations are low, implying that impact metrics based on tweets are different from those based on citations. A framework using the coverage of articles and the correlation between Twitter mentions and citations is proposed to facilitate the evaluation of novel social-media-based metrics.
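The correlation between tweets and citations described in the abstract is typically computed as a rank correlation over per-article counts (counts are highly skewed). A self-contained Spearman sketch in pure Python; the helper names are my own, and in practice a library routine would be used:

```python
def rankdata(xs):
    """Average ranks, 1-based; tied values share the mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        # Extend j over the run of tied values.
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rankdata(x), rankdata(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5
```

Applied per journal or specialty to (tweet count, citation count) pairs, a low coefficient supports the paper's conclusion that tweet-based metrics capture something different from citations.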
  10. Larivière, V.; Sugimoto, C.R.; Macaluso, B.; Milojević, S.; Cronin, B.; Thelwall, M.: arXiv E-prints and the journal of record : an analysis of roles and relationships (2014) 0.01
    Abstract
    Since its creation in 1991, arXiv has become central to the diffusion of research in a number of fields. Combining data from the entirety of arXiv and the Web of Science (WoS), this article investigates (a) the proportion of papers across all disciplines that are on arXiv and the proportion of arXiv papers that are in the WoS, (b) the elapsed time between arXiv submission and journal publication, and (c) the aging characteristics and scientific impact of arXiv e-prints and their published version. It shows that the proportion of WoS papers found on arXiv varies across the specialties of physics and mathematics, and that only a few specialties make extensive use of the repository. Elapsed time between arXiv submission and journal publication has shortened but remains longer in mathematics than in physics. In physics, mathematics, as well as in astronomy and astrophysics, arXiv versions are cited more promptly and decay faster than WoS papers. The arXiv versions of papers, both published and unpublished, have lower citation rates than published papers, although there is almost no difference in the impact of the arXiv versions of published and unpublished papers.
  11. Mongeon, P.; Larivière, V.: Costly collaborations : the impact of scientific fraud on co-authors' careers (2016) 0.01
    Abstract
    Over the past few years, several major scientific fraud cases have shocked the scientific community. The number of retractions each year has also increased tremendously, especially in the biomedical field, and scientific misconduct accounts for more than half of those retractions. It is assumed that co-authors of retracted papers are affected by their colleagues' misconduct, and the aim of this study is to provide empirical evidence of the effect of retractions in biomedical research on co-authors' research careers. Using data from the Web of Science, we measured the productivity, impact, and collaboration of 1,123 co-authors of 293 retracted articles for a period of 5 years before and after the retraction. We found clear evidence that collaborators do suffer consequences of their colleagues' misconduct and that a retraction for fraud has higher consequences than a retraction for error. Our results also suggest that the extent of these consequences is closely linked with the ranking of co-authors on the retracted paper, being felt most strongly by first authors, followed by the last authors, with the impact being less important for middle authors.
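The 5-years-before / 5-years-after design in the abstract can be sketched as a simple window comparison over a co-author's yearly paper counts (the sample numbers below are invented for illustration):

```python
def productivity_window(pubs_by_year, retraction_year, window=5):
    """Paper counts in the 'window' years before and after a retraction;
    the retraction year itself is excluded from both windows."""
    before = sum(n for y, n in pubs_by_year.items()
                 if retraction_year - window <= y < retraction_year)
    after = sum(n for y, n in pubs_by_year.items()
                if retraction_year < y <= retraction_year + window)
    return before, after
```

The study applies the same before/after logic to impact and collaboration indicators, then compares the drop across author-rank positions.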
  12. Vincent-Lamarre, P.; Boivin, J.; Gargouri, Y.; Larivière, V.; Harnad, S.: Estimating open access mandate effectiveness : the MELIBEA score (2016) 0.01
    Abstract
    MELIBEA is a directory of institutional open-access policies for research output that uses a composite formula with eight weighted conditions to estimate the "strength" of open access (OA) mandates (registered in ROARMAP). We analyzed total Web of Science-(WoS)-indexed publication output in years 2011-2013 for 67 institutions in which OA was mandated to estimate the mandates' effectiveness: How well did the MELIBEA score and its individual conditions predict what percentage of the WoS-indexed articles is actually deposited in each institution's OA repository, and when? We found a small but significant positive correlation (0.18) between the MELIBEA "strength" score and deposit percentage. For three of the eight MELIBEA conditions (deposit timing, internal use, and opt-outs), one value of each was strongly associated with deposit percentage or latency ([a] immediate deposit required; [b] deposit required for performance evaluation; [c] unconditional opt-out allowed for the OA requirement but no opt-out for deposit requirement). When we updated the initial values and weights of the MELIBEA formula to reflect the empirical association we had found, the score's predictive power for mandate effectiveness doubled (0.36). There are not yet enough OA mandates to test further mandate conditions that might contribute to mandate effectiveness, but the present findings already suggest that it would be productive for existing and future mandates to adopt the three identified conditions so as to maximize their effectiveness, and thereby the growth of OA.
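MELIBEA's "strength" score is, in essence, a normalized weighted composite of mandate conditions. A generic sketch of such a score; the condition names, values, and weights below are placeholders, not the published MELIBEA formula:

```python
def composite_score(conditions, weights):
    """Normalized weighted sum of condition values.
    'conditions' maps each condition name to a value in [0, 1];
    'weights' maps the same names to their relative weights."""
    total_weight = sum(weights.values())
    return sum(weights[name] * conditions[name]
               for name in weights) / total_weight
```

The paper's re-weighting step amounts to refitting these weights so the score best predicts the observed deposit percentages.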
  13. Chen, L.; Ding, J.; Larivière, V.: Measuring the citation context of national self-references : how a web journal club is used (2022) 0.01