This database contains over 40,000 documents on topics from the fields of descriptive cataloguing, subject indexing, and information retrieval.
© 2015 W. Gödert, TH Köln, Institut für Informationswissenschaft / Powered by litecat, BIS Oldenburg (as of 4 June 2021)
1 Bertin, M. ; Atanassova, I. ; Gingras, Y. ; Larivière, V.: ¬The invariant distribution of references in scientific articles.
In: Journal of the Association for Information Science and Technology. 67(2016) no.1, S.164-177.
Abstract: The organization of scientific papers typically follows a standardized pattern, the well-known IMRaD structure (introduction, methods, results, and discussion). Using the full text of 45,000 papers published in the PLoS series of journals as a case study, this paper investigates, from the viewpoint of bibliometrics, how references are distributed along the structure of scientific papers as well as the age of these cited references. Once the sections of articles are realigned to follow the IMRaD sequence, the position of cited references along the text of articles is invariant across all PLoS journals, with the introduction and discussion accounting for most of the references. It also provides evidence that the age of cited references varies by section, with older references being found in the methods and more recent references in the discussion. These results provide insight into the different roles citations have in the scholarly communication process.
Content: Cf. http://onlinelibrary.wiley.com/doi/10.1002/asi.23367/abstract.
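The tallying step described in the abstract - realigning sections to the IMRaD sequence and counting where citations fall - can be sketched as follows; the input format and sample data are illustrative assumptions, not the study's actual pipeline.

```python
from collections import Counter

# Canonical IMRaD order used to realign sections before counting.
IMRAD = ["introduction", "methods", "results", "discussion"]

def references_per_section(citations):
    """Tally in-text citations by the IMRaD section they occur in.

    `citations` is a list of (section_name, reference_id) pairs,
    an illustrative input format for a parsed full-text article.
    """
    counts = Counter(section for section, _ in citations)
    # Realign the tally to the IMRaD sequence.
    return {s: counts.get(s, 0) for s in IMRAD}

# Toy data: most citations fall in the introduction and discussion,
# the pattern the paper reports for the PLoS corpus.
cites = [("introduction", 1), ("introduction", 2), ("methods", 3),
         ("discussion", 1), ("discussion", 4)]
print(references_per_section(cites))
```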
2 Gingras, Y.: Bibliometrics and research evaluation : uses and abuses.
Cambridge, MA : MIT Press, 2016. xii, 119 S.
(History and foundations of information science)
Abstract: The research evaluation market is booming. "Ranking," "metrics," "h-index," and "impact factors" are reigning buzzwords. Government and research administrators want to evaluate everything -- teachers, professors, training programs, universities -- using quantitative indicators. Among the tools used to measure "research excellence," bibliometrics -- aggregate data on publications and citations -- has become dominant. Bibliometrics is hailed as an "objective" measure of research quality, a quantitative measure more useful than "subjective" and intuitive evaluation methods such as peer review that have been used since scientific papers were first published in the seventeenth century. In this book, Yves Gingras offers a spirited argument against an unquestioning reliance on bibliometrics as an indicator of research quality. Gingras shows that bibliometric rankings have no real scientific validity, rarely measuring what they pretend to. Although the study of publication and citation patterns, at the proper scales, can yield insights on the global dynamics of science over time, ill-defined quantitative indicators often generate perverse and unintended effects on the direction of research. Moreover, abuse of bibliometrics occurs when data is manipulated to boost rankings. Gingras looks at the politics of evaluation and argues that using numbers can be a way to control scientists and diminish their autonomy in the evaluation process. Proposing precise criteria for establishing the validity of indicators at a given scale of analysis, Gingras questions why universities are so eager to let invalid indicators influence their research strategy.
Content: The origins of bibliometrics -- What bibliometrics teach us about the dynamics of science -- The proliferation of research evaluation -- The evaluation of research evaluation -- Conclusion: the universities' new clothes?
Note: Review in: JASIST 68(2017) no.9, S.2290-2292 (Judit Bar-Ilan). Original title: Dérives de l'évaluation de la recherche.
LCSH: Bibliometrics ; Research / Evaluation ; Education, Higher / Research / Evaluation ; Universities and colleges / Research / Evaluation
RSWK: Bibliometrie / Missbrauch / Forschung / Erfolgskontrolle
BK: 02.13 (Wissenschaftspraxis) ; 81.80 (Hochschulen / Fachhochschulen) ; 83.31 (Wirtschaftswachstum)
DDC: 020.727 / dc23
RVK: AK 28100 ; AN 96300 ; AN 96800 ; QB 100
3 Larivière, V. ; Gingras, Y. ; Sugimoto, C.R. ; Tsou, A.: Team size matters : collaboration and scientific impact since 1900.
In: Journal of the Association for Information Science and Technology. 66(2015) no.7, S.1323-1332.
Abstract: This article provides the first historical analysis of the relationship between collaboration and scientific impact using three indicators of collaboration (number of authors, number of addresses, and number of countries) derived from articles published between 1900 and 2011. The results demonstrate that an increase in the number of authors leads to an increase in impact, from the beginning of the last century onward, and that this is not due simply to self-citations. A similar trend is also observed for the number of addresses and number of countries represented in the byline of an article. However, the constant inflation of collaboration since 1900 has resulted in diminishing citation returns: Larger and more diverse (in terms of institutional and country affiliation) teams are necessary to realize higher impact. The article concludes with a discussion of the potential causes of the impact gain in citations of collaborative papers.
Content: Cf. http://onlinelibrary.wiley.com/doi/10.1002/asi.23266/abstract.
4 Larivière, V. ; Lozano, G.A. ; Gingras, Y.: Are elite journals declining?
In: Journal of the Association for Information Science and Technology. 65(2014) no.4, S.649-655.
Abstract: Previous research indicates that during the past 20 years, the highest-quality work has been published in an increasingly diverse and larger group of journals. In this article, we examine whether this diversification has also affected the handful of elite journals that are traditionally considered to be the best. We examine citation patterns during the past 40 years of seven long-standing traditionally elite journals and six journals that have been increasing in importance during the past 20 years. To be among the top 5% or 1% cited papers, papers now need about twice as many citations as they did 40 years ago. Since the late 1980s and early 1990s, elite journals have been publishing a decreasing proportion of these top-cited papers. This also applies to the two journals that are typically considered as the top venues and often used as bibliometric indicators of "excellence": Science and Nature. On the other hand, several new and established journals are publishing an increasing proportion of the most-cited papers. These changes bring new challenges and opportunities for all parties. Journals can enact policies to increase or maintain their relative position in the journal hierarchy. Researchers now have the option to publish in more diverse venues knowing that their work can still reach the same audiences. Finally, evaluators and administrators need to know that although there will always be a certain prestige associated with publishing in "elite" journals, journal hierarchies are in constant flux.
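The top-5% and top-1% thresholds the abstract refers to amount to a simple percentile cut over citation counts. A minimal sketch with toy data, not the authors' code (tie handling is ignored):

```python
def top_share_threshold(citation_counts, share):
    """Minimum citations needed to be among the top `share` of papers.

    E.g. share=0.05 gives the top-5% threshold discussed in the abstract.
    A plain sort-based cut; papers tied at the threshold are not treated
    specially.
    """
    ranked = sorted(citation_counts, reverse=True)
    k = max(1, int(len(ranked) * share))  # number of papers in the top share
    return ranked[k - 1]

counts = [0, 1, 2, 3, 5, 8, 13, 21, 34, 55]  # toy citation counts
print(top_share_threshold(counts, 0.10))  # top 10% of 10 papers -> 1 paper
```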
5 Kirchik, O. ; Gingras, Y. ; Larivière, V.: Changes in publication languages and citation practices and their effect on the scientific impact of Russian science (1993-2010).
In: Journal of the American Society for Information Science and Technology. 63(2012) no.7, S.1411-1419.
Abstract: This article analyzes the effects of publication language on the international scientific visibility of Russia using the Web of Science (WoS). Like other developing and transition countries, Russia is subject to growing pressure to "internationalize" its scientific activities, which primarily means a shift to English as the language of scientific communication. But to what extent does the transition to English improve the impact of research? The case of Russia is of interest in this respect, as the existence of many combinations of national journals and languages of publication (namely, Russian and English, including translated journals) provides a kind of natural experiment to test the effects of language and publisher's country on the international visibility of research through citations, as well as on the referencing practices of authors. Our analysis points to the conclusion that the production of original English-language papers in foreign journals is a more efficient strategy of internationalization than the mere translation of domestic journals. If the objective of a country is to maximize the international visibility of its scientific work, then the efforts should go into the promotion of publication in reputed English-language journals to profit from the Matthew effect attached to these venues.
6 Lozano, G.A. ; Larivière, V. ; Gingras, Y.: ¬The weakening relationship between the impact factor and papers' citations in the digital age.
In: Journal of the American Society for Information Science and Technology. 63(2012) no.11, S.2140-2145.
Abstract: Historically, papers have been physically bound to the journal in which they were published; but in the digital age papers are available individually, no longer tied to their respective journals. Hence, papers now can be read and cited based on their own merits, independently of the journal's physical availability, reputation, or impact factor (IF). We compare the strength of the relationship between journals' IFs and the actual citations received by their respective papers from 1902 to 2009. Throughout most of the 20th century, papers' citation rates were increasingly linked to their respective journals' IFs. However, since 1990, the advent of the digital age, the relation between IFs and paper citations has been weakening. This began first in physics, a field that was quick to make the transition into the electronic domain. Furthermore, since 1990 the overall proportion of highly cited papers coming from highly cited journals has been decreasing and, of these highly cited papers, the proportion not coming from highly cited journals has been increasing. Should this pattern continue, it might bring an end to the use of the IF as a way to evaluate the quality of journals, papers, and researchers.
Subject area: Informetrics ; Electronic publishing
7 Larivière, V. ; Gingras, Y.: ¬The impact factor's Matthew Effect : a natural experiment in bibliometrics.
In: Journal of the American Society for Information Science and Technology. 61(2010) no.2, S.424-427.
Abstract: Since the publication of Robert K. Merton's theory of cumulative advantage in science (the Matthew Effect), several empirical studies have tried to measure its presence at the level of papers, individual researchers, institutions, or countries. However, these studies seldom control for the intrinsic quality of papers or of researchers - better (however defined) papers or researchers could receive higher citation rates because they are indeed of better quality. Using an original method for controlling for the intrinsic value of papers - identical duplicate papers published in different journals with different impact factors - this paper shows that the journal in which a paper is published has a strong influence on its citation rate: duplicate papers published in high-impact journals obtain, on average, twice as many citations as their identical counterparts published in journals with lower impact factors. The intrinsic value of a paper is thus not the only reason it gets cited or not; there is a specific Matthew Effect attached to journals, which gives papers published there an added value over and above their intrinsic quality.
8 Larivière, V. ; Gingras, Y.: On the prevalence and scientific impact of duplicate publications in different scientific fields (1980-2007).
In: Journal of documentation. 66(2010) no.2, S.179-190.
Abstract: Purpose - The issue of duplicate publications has received a lot of attention in the medical literature, but much less in the information science community. This paper aims to analyze the prevalence and scientific impact of duplicate publications across all fields of research between 1980 and 2007. Design/methodology/approach - The approach is a bibliometric analysis of duplicate papers based on their metadata. Duplicate papers are defined as papers published in two different journals having: the exact same title; the same first author; and the same number of cited references. Findings - In all fields combined, the prevalence of duplicates is one out of 2,000 papers, but is higher in the natural and medical sciences than in the social sciences and humanities. A very high proportion (>85 percent) of these papers are published the same year or one year apart, which suggests that most duplicate papers were submitted simultaneously. Furthermore, duplicate papers are generally published in journals with impact factors below the average of their field and obtain lower citations. Originality/value - The paper provides clear evidence that the prevalence of duplicate papers is low and, more importantly, that the scientific impact of such papers is below average.
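The duplicate definition given in the abstract (same title, same first author, same number of cited references) translates directly into a grouping key. The dictionary-based sketch below uses hypothetical field names, not the study's actual data model:

```python
from collections import defaultdict

def find_duplicates(papers):
    """Group papers by the three-part key used in the article: identical
    title, same first author, same number of cited references.

    `papers` is a list of dicts with hypothetical field names; titles are
    normalized (whitespace, case) before comparison.
    """
    groups = defaultdict(list)
    for p in papers:
        key = (p["title"].strip().lower(), p["first_author"], p["n_refs"])
        groups[key].append(p["journal"])
    # Only keys that occur in two or more journals count as duplicates.
    return {k: v for k, v in groups.items() if len(v) >= 2}

papers = [
    {"title": "On X", "first_author": "Smith", "n_refs": 30, "journal": "A"},
    {"title": "on x ", "first_author": "Smith", "n_refs": 30, "journal": "B"},
    {"title": "On Y", "first_author": "Jones", "n_refs": 12, "journal": "C"},
]
print(find_duplicates(papers))
```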
9 Wallace, M.L. ; Gingras, Y. ; Duhon, R.: ¬A new approach for detecting scientific specialties from raw cocitation networks.
In: Journal of the American Society for Information Science and Technology. 60(2009) no.2, S.240-246.
Abstract: We use a technique recently developed by V. Blondel, J.-L. Guillaume, R. Lambiotte, and E. Lefebvre (2008) to detect scientific specialties from author cocitation networks. This algorithm has distinct advantages over most previous methods used to obtain cocitation clusters since it avoids the use of similarity measures, relies entirely on the topology of the weighted network, and can be applied to relatively large networks. Most importantly, it requires no subjective interpretation of the cocitation data or of the communities found. Using two examples, we show that the resulting specialties are the smallest coherent groups of researchers (within a hierarchy of cluster sizes) and can thus be identified unambiguously. Furthermore, we confirm that these communities are indeed representative of what we know about the structure of a given scientific discipline and that as specialties, they can be accurately characterized by a few keywords (from the publication titles). We argue that this robust and efficient algorithm is particularly well-suited to cocitation networks and that the results generated can be of great use to researchers studying various facets of the structure and evolution of science.
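The input to the Blondel et al. community-detection step is a weighted cocitation network. A minimal sketch of that network construction (edge weight = number of reference lists in which two authors are cocited) follows; the clustering step itself would then run on these weights via a library implementation, and is not reproduced here:

```python
from collections import Counter
from itertools import combinations

def cocitation_weights(reference_lists):
    """Build weighted author-cocitation edges.

    Two authors are cocited each time they appear together in one
    paper's reference list; the edge weight counts such co-occurrences.
    The Blondel et al. (2008) algorithm would then detect communities
    on this weighted network (illustrative construction only).
    """
    weights = Counter()
    for refs in reference_lists:
        # Deduplicate within one reference list, then count each pair once.
        for a, b in combinations(sorted(set(refs)), 2):
            weights[(a, b)] += 1
    return weights

refs = [["Garfield", "Price", "Merton"], ["Garfield", "Price"]]
print(cocitation_weights(refs)[("Garfield", "Price")])  # cocited twice
```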
10 Larivière, V. ; Gingras, Y. ; Archambault, E.: ¬The decline in the concentration of citations, 1900-2007.
In: Journal of the American Society for Information Science and Technology. 60(2009) no.4, S.858-862.
Abstract: This article challenges recent research (Evans, 2008) reporting that the concentration of cited scientific literature increases with the online availability of articles and journals. Using Thomson Reuters' Web of Science, the present article analyses changes in the concentration of citations received (2- and 5-year citation windows) by papers published between 1900 and 2005. Three measures of concentration are used: the percentage of papers that received at least one citation (cited papers); the percentage of papers needed to account for 20%, 50%, and 80% of the citations; and the Herfindahl-Hirschman index (HHI). These measures are used for four broad disciplines: natural sciences and engineering, medical fields, social sciences, and the humanities. All these measures converge and show that, contrary to what was reported by Evans, the dispersion of citations is actually increasing.
Subject area: Informetrics ; Citation indexing
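Two of the three concentration measures named in the abstract - the share of cited papers and the Herfindahl-Hirschman index - can be sketched in a few lines; the toy counts are illustrative, not from the study:

```python
def hhi(citation_counts):
    """Herfindahl-Hirschman index over papers' citation shares:
    the sum of squared shares. Higher values mean citations are
    concentrated on fewer papers."""
    total = sum(citation_counts)
    if total == 0:
        return 0.0
    return sum((c / total) ** 2 for c in citation_counts)

def share_cited(citation_counts):
    """Percentage of papers that received at least one citation."""
    return 100 * sum(1 for c in citation_counts if c > 0) / len(citation_counts)

counts = [10, 5, 5, 0, 0]  # toy citation counts
print(round(hhi(counts), 3))  # 0.375
print(share_cited(counts))    # 60.0
```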
11 Archambault, E. ; Campbell, D. ; Gingras, Y. ; Larivière, V.: Comparing bibliometric statistics obtained from the Web of Science and Scopus.
In: Journal of the American Society for Information Science and Technology. 60(2009) no.7, S.1320-1326.
Abstract: For more than 40 years, the Institute for Scientific Information (ISI, now part of Thomson Reuters) produced the only available bibliographic databases from which bibliometricians could compile large-scale bibliometric indicators. ISI's citation indexes, now regrouped under the Web of Science (WoS), were the major sources of bibliometric data until 2004, when Scopus was launched by the publisher Reed Elsevier. For those who perform bibliometric analyses and comparisons of countries or institutions, the existence of these two major databases raises the important question of the comparability and stability of statistics obtained from different data sources. This paper uses macrolevel bibliometric indicators to compare results obtained from the WoS and Scopus. It shows that the correlations between the measures obtained with both databases for the number of papers and the number of citations received by countries, as well as for their ranks, are extremely high. There is also a very high correlation when countries' papers are broken down by field. The paper thus provides evidence that indicators of scientific production and citations at the country level are stable and largely independent of the database.
Object: Web of Science ; Scopus
12 Larivière, V. ; Gingras, Y.: On the relationship between interdisciplinarity and scientific impact.
In: Journal of the American Society for Information Science and Technology. 61(2010) no.1, S.126-131.
Abstract: This article analyzes the effect of interdisciplinarity on the scientific impact of individual articles. Using all the articles published in Web of Science in 2000, we define the degree of interdisciplinarity of a given article as the percentage of its cited references made to journals of other disciplines. We show that although for all disciplines combined there is no clear correlation between the level of interdisciplinarity of articles and their citation rates, there are nonetheless some disciplines in which a higher level of interdisciplinarity is related to higher citation rates. For other disciplines, citations decline as interdisciplinarity grows. One characteristic is visible in all disciplines: Highly disciplinary and highly interdisciplinary articles have a low scientific impact. This suggests that there might be an optimum of interdisciplinarity beyond which the research is too dispersed to find its niche and under which it is too mainstream to have high impact. Finally, the relationship between interdisciplinarity and scientific impact is highly determined by the citation characteristics of the disciplines involved: Articles citing citation-intensive disciplines are more likely to be cited by those disciplines and, hence, obtain higher citation scores than would articles citing non-citation-intensive disciplines.
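The interdisciplinarity measure defined in the abstract (percentage of an article's cited references made to journals of other disciplines) can be sketched as follows, assuming each cited journal has already been assigned to one discipline:

```python
def interdisciplinarity(cited_journal_disciplines, own_discipline):
    """Degree of interdisciplinarity of one article: the percentage of
    its cited references made to journals of other disciplines.

    `cited_journal_disciplines` lists the discipline of each cited
    journal (illustrative input; the discipline assignment itself is
    taken as given).
    """
    if not cited_journal_disciplines:
        return 0.0
    outside = sum(1 for d in cited_journal_disciplines if d != own_discipline)
    return 100 * outside / len(cited_journal_disciplines)

# A physics article citing two physics journals and two from elsewhere.
print(interdisciplinarity(["physics", "chemistry", "physics", "biology"],
                          "physics"))  # 50.0
```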
13 Larivière, V. ; Archambault, E. ; Gingras, Y.: Long-term variations in the aging of scientific literature : from exponential growth to steady-state science (1900-2004).
In: Journal of the American Society for Information Science and Technology. 59(2008) no.2, S.288-296.
Abstract: Despite a very large number of studies on the aging and obsolescence of scientific literature, no study has yet measured, over a very long time period, the changes in the rates at which scientific literature becomes obsolete. This article studies the evolution of the aging phenomenon and, in particular, how the age of cited literature has changed over more than 100 years of scientific activity. It shows that the average and median ages of cited literature have undergone several changes over the period. Specifically, both World War I and World War II had the effect of significantly increasing the age of the cited literature. The major finding of this article is that contrary to a widely held belief, the age of cited material has risen continuously since the mid-1960s. In other words, during that period, researchers were relying on an increasingly old body of literature. Our data suggest that this phenomenon is a direct response to the steady-state dynamics of modern science that followed its exponential growth; however, we also have observed that online preprint archives such as arXiv have had the opposite effect in some subfields.
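The average and median age of cited literature tracked in this study reduce to citing year minus cited year; a minimal sketch, assuming the cited years have already been extracted from the reference lists:

```python
from statistics import mean, median

def reference_ages(citing_year, cited_years):
    """Ages of cited references: citing year minus cited publication year."""
    return [citing_year - y for y in cited_years]

# Toy example: one 2008 paper citing four earlier works.
ages = reference_ages(2008, [2006, 2000, 1990, 2004])
print(mean(ages), median(ages))
```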
14 Larivière, V. ; Archambault, E. ; Gingras, Y. ; Vignola-Gagné, E.: ¬The place of serials in referencing practices : comparing natural sciences and engineering with social sciences and humanities.
In: Journal of the American Society for Information Science and Technology. 57(2006) no.8, S.997-1004.
Abstract: Journal articles constitute the core documents for the diffusion of knowledge in the natural sciences. It has been argued that the same is not true for the social sciences and humanities where knowledge is more often disseminated in monographs that are not indexed in the journal-based databases used for bibliometric analysis. Previous studies have made only partial assessments of the role played by both serials and other types of literature. The importance of journal literature in the various scientific fields has therefore not been systematically characterized. The authors address this issue by providing a systematic measurement of the role played by journal literature in the building of knowledge in both the natural sciences and engineering and the social sciences and humanities. Using citation data from the CD-ROM versions of the Science Citation Index (SCI), Social Science Citation Index (SSCI), and Arts and Humanities Citation Index (AHCI) databases from 1981 to 2000 (Thomson ISI, Philadelphia, PA), the authors quantify the share of citations to both serials and other types of literature. Variations in time and between fields are also analyzed. The results show that journal literature is increasingly important in the natural and social sciences, but that its role in the humanities is stagnant and has even tended to diminish slightly in the 1990s. Journal literature accounts for less than 50% of the citations in several disciplines of the social sciences and humanities; hence, special care should be used when using bibliometric indicators that rely only on journal literature.
Discipline: Natural sciences ; Social sciences
Object: Social Sciences Citation Index ; Science Citation Index ; Arts and Humanities Citation Index