Search (13 results, page 1 of 1)

  • author_ss:"Larivière, V."
  1. Larivière, V.; Gingras, Y.; Archambault, E.: The decline in the concentration of citations, 1900-2007 (2009) 0.04
    0.037729513 = product of:
      0.075459026 = sum of:
        0.04883048 = weight(_text_:social in 2763) [ClassicSimilarity], result of:
          0.04883048 = score(doc=2763,freq=2.0), product of:
            0.1847249 = queryWeight, product of:
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.046325076 = queryNorm
            0.26434162 = fieldWeight in 2763, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.046875 = fieldNorm(doc=2763)
        0.026628546 = product of:
          0.053257093 = sum of:
            0.053257093 = weight(_text_:22 in 2763) [ClassicSimilarity], result of:
              0.053257093 = score(doc=2763,freq=4.0), product of:
                0.16222252 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046325076 = queryNorm
                0.32829654 = fieldWeight in 2763, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2763)
          0.5 = coord(1/2)
      0.5 = coord(2/4)
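
    (Editorial note: the relevance breakdown above is Lucene's ClassicSimilarity (TF-IDF) explain output. The short Python sketch below merely reproduces that arithmetic as a reading aid; the function and variable names are illustrative, and only the numeric inputs - term frequency, idf, queryNorm, fieldNorm and the coord factors - are taken from the explain tree for document 2763.)

    import math

    def classic_similarity_term_score(freq, idf, query_norm, field_norm):
        # One weight(_text_:term ...) clause of a ClassicSimilarity explain tree:
        #   queryWeight = idf * queryNorm
        #   fieldWeight = sqrt(freq) * idf * fieldNorm
        #   score       = queryWeight * fieldWeight
        query_weight = idf * query_norm
        field_weight = math.sqrt(freq) * idf * field_norm
        return query_weight * field_weight

    # Values copied from the explain output above (document 2763, result 1).
    query_norm = 0.046325076
    field_norm = 0.046875
    w_social = classic_similarity_term_score(2.0, 3.9875789, query_norm, field_norm)  # ~0.04883048
    w_22 = classic_similarity_term_score(4.0, 3.5018296, query_norm, field_norm)      # ~0.053257093

    # The "22" clause is wrapped in coord(1/2); the outer sum is wrapped in coord(2/4).
    total = (w_social + w_22 * 0.5) * 0.5  # ~0.037729513, the document score shown above
    print(w_social, w_22, total)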
    
    Abstract
    This article challenges recent research (Evans, 2008) reporting that the concentration of cited scientific literature increases with the online availability of articles and journals. Using Thomson Reuters' Web of Science, the present article analyses changes in the concentration of citations received (2- and 5-year citation windows) by papers published between 1900 and 2005. Three measures of concentration are used: the percentage of papers that received at least one citation (cited papers); the percentage of papers needed to account for 20%, 50%, and 80% of the citations; and the Herfindahl-Hirschman index (HHI). These measures are used for four broad disciplines: natural sciences and engineering, medical fields, social sciences, and the humanities. All these measures converge and show that, contrary to what was reported by Evans, the dispersion of citations is actually increasing.
    Date
    22. 3.2009 19:22:35
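
    (Editorial note: among the concentration measures named in the abstract of result 1, the Herfindahl-Hirschman index is the sum of squared shares of total citations. The following is a minimal sketch under that definition only; the data are hypothetical and this is not the authors' code.)

    def herfindahl_hirschman(citations):
        # Sum of squared citation shares: close to 1 when a few papers capture
        # most citations, close to 1/N when citations are evenly dispersed.
        total = sum(citations)
        if total == 0:
            return 0.0
        return sum((c / total) ** 2 for c in citations)

    # Hypothetical per-paper citation counts for one publication year.
    cites = [120, 40, 15, 5, 3, 1, 0, 0]
    print(herfindahl_hirschman(cites))  # higher value -> more concentrated citations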
  2. Haustein, S.; Sugimoto, C.; Larivière, V.: Social media in scholarly communication : Guest editorial (2015) 0.03
    0.034609746 = product of:
      0.06921949 = sum of:
        0.059804883 = weight(_text_:social in 3809) [ClassicSimilarity], result of:
          0.059804883 = score(doc=3809,freq=12.0), product of:
            0.1847249 = queryWeight, product of:
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.046325076 = queryNorm
            0.32375106 = fieldWeight in 3809, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.0234375 = fieldNorm(doc=3809)
        0.009414612 = product of:
          0.018829225 = sum of:
            0.018829225 = weight(_text_:22 in 3809) [ClassicSimilarity], result of:
              0.018829225 = score(doc=3809,freq=2.0), product of:
                0.16222252 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046325076 = queryNorm
                0.116070345 = fieldWeight in 3809, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0234375 = fieldNorm(doc=3809)
          0.5 = coord(1/2)
      0.5 = coord(2/4)
    
    Abstract
    One of the solutions to help scientists filter the most relevant publications and, thus, to stay current on developments in their fields during the transition from "little science" to "big science", was the introduction of citation indexing as a Wellsian "World Brain" (Garfield, 1964) of scientific information: It is too much to expect a research worker to spend an inordinate amount of time searching for the bibliographic descendants of antecedent papers. It would not be excessive to demand that the thorough scholar check all papers that have cited or criticized such papers, if they could be located quickly. The citation index makes this check practicable (Garfield, 1955, p. 108). In retrospect, citation indexing can be perceived as a pre-social web version of crowdsourcing, as it is based on the concept that the community of citing authors outperforms indexers in highlighting cognitive links between papers, particularly on the level of specific ideas and concepts (Garfield, 1983). Over the last 50 years, citation analysis and, more generally, bibliometric methods have developed from information retrieval tools to research evaluation metrics, where they are presumed to make scientific funding more efficient and effective (Moed, 2006). However, the dominance of bibliometric indicators in research evaluation has also led to significant goal displacement (Merton, 1957) and the oversimplification of notions of "research productivity" and "scientific quality", creating adverse effects such as salami publishing, honorary authorships, citation cartels, and misuse of indicators (Binswanger, 2015; Cronin and Sugimoto, 2014; Frey and Osterloh, 2006; Haustein and Larivière, 2015; Weingart, 2005).
    Furthermore, the rise of the web, and subsequently, the social web, has challenged the quasi-monopolistic status of the journal as the main form of scholarly communication and citation indices as the primary assessment mechanisms. Scientific communication is becoming more open, transparent, and diverse: publications are increasingly open access; manuscripts, presentations, code, and data are shared online; research ideas and results are discussed and criticized openly on blogs; and new peer review experiments, with open post publication assessment by anonymous or non-anonymous referees, are underway. The diversification of scholarly production and assessment, paired with the increasing speed of the communication process, leads to an increased information overload (Bawden and Robinson, 2008), demanding new filters. The concept of altmetrics, short for alternative (to citation) metrics, was created out of an attempt to provide a filter (Priem et al., 2010) and to steer against the oversimplification of the measurement of scientific success solely on the basis of number of journal articles published and citations received, by considering a wider range of research outputs and metrics (Piwowar, 2013). Although the term altmetrics was introduced in a tweet in 2010 (Priem, 2010), the idea of capturing traces - "polymorphous mentioning" (Cronin et al., 1998, p. 1320) - of scholars and their documents on the web to measure "impact" of science in a broader manner than citations was introduced years before, largely in the context of webometrics (Almind and Ingwersen, 1997; Thelwall et al., 2005):
    There will soon be a critical mass of web-based digital objects and usage statistics on which to model scholars' communication behaviors - publishing, posting, blogging, scanning, reading, downloading, glossing, linking, citing, recommending, acknowledging - and with which to track their scholarly influence and impact, broadly conceived and broadly felt (Cronin, 2005, p. 196). A decade after Cronin's prediction and five years after the coining of altmetrics, the time seems ripe to reflect upon the role of social media in scholarly communication. This Special Issue does so by providing an overview of current research on the indicators and metrics grouped under the umbrella term of altmetrics, on their relationships with traditional indicators of scientific activity, and on the uses that are made of the various social media platforms - on which these indicators are based - by scientists of various disciplines.
    Date
    20. 1.2015 18:30:22
    Footnote
    Part of a special issue: Social Media Metrics in Scholarly Communication: exploring tweets, blogs, likes and other altmetrics. The article is freely available.
  3. Sugimoto, C.R.; Work, S.; Larivière, V.; Haustein, S.: Scholarly use of social media and altmetrics : A review of the literature (2017) 0.03
    0.029902441 = product of:
      0.119609766 = sum of:
        0.119609766 = weight(_text_:social in 3781) [ClassicSimilarity], result of:
          0.119609766 = score(doc=3781,freq=12.0), product of:
            0.1847249 = queryWeight, product of:
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.046325076 = queryNorm
            0.6475021 = fieldWeight in 3781, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.046875 = fieldNorm(doc=3781)
      0.25 = coord(1/4)
    
    Abstract
    Social media has become integrated into the fabric of the scholarly communication system in fundamental ways, principally through scholarly use of social media platforms and the promotion of new indicators on the basis of interactions with these platforms. Research and scholarship in this area have accelerated since the coining of, and subsequent advocacy for, altmetrics, that is, research indicators based on social media activity. This review provides an extensive account of the state of the art in both scholarly use of social media and altmetrics. The review consists of two main parts: the first examines the use of social media in academia, reviewing the various functions these platforms have in the scholarly communication process and the factors that affect this use. The second part reviews empirical studies of altmetrics, discussing the various interpretations of altmetrics, data collection and methodological limitations, and differences according to platform. The review ends with a critical discussion of the implications of this transformation in the scholarly communication system.
  4. Larivière, V.; Archambault, E.; Gingras, Y.; Vignola-Gagné, E.: The place of serials in referencing practices : comparing natural sciences and engineering with social sciences and humanities (2006) 0.03
    0.026915273 = product of:
      0.10766109 = sum of:
        0.10766109 = weight(_text_:social in 5107) [ClassicSimilarity], result of:
          0.10766109 = score(doc=5107,freq=14.0), product of:
            0.1847249 = queryWeight, product of:
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.046325076 = queryNorm
            0.5828185 = fieldWeight in 5107, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5107)
      0.25 = coord(1/4)
    
    Abstract
    Journal articles constitute the core documents for the diffusion of knowledge in the natural sciences. It has been argued that the same is not true for the social sciences and humanities where knowledge is more often disseminated in monographs that are not indexed in the journal-based databases used for bibliometric analysis. Previous studies have made only partial assessments of the role played by both serials and other types of literature. The importance of journal literature in the various scientific fields has therefore not been systematically characterized. The authors address this issue by providing a systematic measurement of the role played by journal literature in the building of knowledge in both the natural sciences and engineering and the social sciences and humanities. Using citation data from the CD-ROM versions of the Science Citation Index (SCI), Social Science Citation Index (SSCI), and Arts and Humanities Citation Index (AHCI) databases from 1981 to 2000 (Thomson ISI, Philadelphia, PA), the authors quantify the share of citations to both serials and other types of literature. Variations in time and between fields are also analyzed. The results show that journal literature is increasingly important in the natural and social sciences, but that its role in the humanities is stagnant and has even tended to diminish slightly in the 1990s. Journal literature accounts for less than 50% of the citations in several disciplines of the social sciences and humanities; hence, special care should be used when using bibliometric indicators that rely only on journal literature.
    Object
    Social Sciences Citation Index
  5. Larivière, V.; Macaluso, B.: Improving the coverage of social science and humanities researchers' output : the case of the Érudit journal platform (2011) 0.02
    0.017620182 = product of:
      0.07048073 = sum of:
        0.07048073 = weight(_text_:social in 4943) [ClassicSimilarity], result of:
          0.07048073 = score(doc=4943,freq=6.0), product of:
            0.1847249 = queryWeight, product of:
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.046325076 = queryNorm
            0.3815443 = fieldWeight in 4943, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4943)
      0.25 = coord(1/4)
    
    Abstract
    In non-English-speaking countries the measurement of research output in the social sciences and humanities (SSH) using standard bibliographic databases suffers from a major drawback: the underrepresentation of articles published in local, non-English, journals. Using papers indexed (1) in a local database of periodicals (Érudit) and (2) in the Web of Science, assigned to the population of university professors in the province of Québec, this paper quantifies, for individual researchers and departments, the importance of papers published in local journals. It also analyzes differences across disciplines and between French-speaking and English-speaking universities. The results show that, while the addition of papers published in local journals to bibliometric measures has little effect when all disciplines are considered and for anglophone universities, it increases the output of researchers from francophone universities in the social sciences and humanities by almost a third. It also shows that there is very little relation, at the level of individual researchers or departments, between the output indexed in the Web of Science and the output retrieved from the Érudit database; a clear demonstration that the Web of Science cannot be used as a proxy for the "overall" production of SSH researchers in Québec. The paper concludes with a discussion on these disciplinary and language differences, as well as on their implications for rankings of universities.
  6. Haustein, S.; Peters, I.; Sugimoto, C.R.; Thelwall, M.; Larivière, V.: Tweeting biomedicine : an analysis of tweets and citations in the biomedical literature (2014) 0.02
    0.017620182 = product of:
      0.07048073 = sum of:
        0.07048073 = weight(_text_:social in 1229) [ClassicSimilarity], result of:
          0.07048073 = score(doc=1229,freq=6.0), product of:
            0.1847249 = queryWeight, product of:
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.046325076 = queryNorm
            0.3815443 = fieldWeight in 1229, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1229)
      0.25 = coord(1/4)
    
    Abstract
    Data collected by social media platforms have been introduced as new sources for indicators to help measure the impact of scholarly research in ways that are complementary to traditional citation analysis. Data generated from social media activities can be used to reflect broad types of impact. This article aims to provide systematic evidence about how often Twitter is used to disseminate information about journal articles in the biomedical sciences. The analysis is based on 1.4 million documents covered by both PubMed and Web of Science and published between 2010 and 2012. The number of tweets containing links to these documents was analyzed and compared to citations to evaluate the degree to which certain journals, disciplines, and specialties were represented on Twitter and how far tweets correlate with citation impact. With less than 10% of PubMed articles mentioned on Twitter, its uptake is low in general but differs between journals and specialties. Correlations between tweets and citations are low, implying that impact metrics based on tweets are different from those based on citations. A framework using the coverage of articles and the correlation between Twitter mentions and citations is proposed to facilitate the evaluation of novel social-media-based metrics.
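
    (Editorial note: purely as an illustration of the kind of comparison described in the abstract above - not the study's pipeline or data - the sketch below computes Twitter coverage and a Spearman rank correlation between tweet and citation counts for a hypothetical set of articles; it assumes SciPy is installed.)

    from scipy.stats import spearmanr

    # Hypothetical per-article counts: (tweets, citations).
    articles = [(0, 12), (3, 5), (1, 40), (0, 0), (15, 7), (2, 18), (0, 3), (1, 1)]
    tweets = [t for t, _ in articles]
    citations = [c for _, c in articles]

    # Coverage: share of articles mentioned at least once on Twitter.
    coverage = sum(1 for t in tweets if t > 0) / len(tweets)

    # Rank correlation between tweet counts and citation counts.
    rho, p_value = spearmanr(tweets, citations)
    print(f"coverage={coverage:.1%}  spearman_rho={rho:.2f}  p={p_value:.2f}")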
  7. Haustein, S.; Bowman, T.D.; Holmberg, K.; Tsou, A.; Sugimoto, C.R.; Larivière, V.: Tweets as impact indicators : Examining the implications of automated "bot" accounts on Twitter (2016) 0.02
    0.017264182 = product of:
      0.06905673 = sum of:
        0.06905673 = weight(_text_:social in 2502) [ClassicSimilarity], result of:
          0.06905673 = score(doc=2502,freq=4.0), product of:
            0.1847249 = queryWeight, product of:
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.046325076 = queryNorm
            0.3738355 = fieldWeight in 2502, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.046875 = fieldNorm(doc=2502)
      0.25 = coord(1/4)
    
    Abstract
    This brief communication presents preliminary findings on automated Twitter accounts distributing links to scientific articles deposited on the preprint repository arXiv. It discusses the implications of the presence of such bots from the perspective of social media metrics (altmetrics), where mentions of scholarly documents on Twitter have been suggested as a means of measuring impact that is both broader and timelier than citations. Our results show that automated Twitter accounts generate a considerable number of tweets linking to scientific articles and that they behave differently than common social bots, which has critical implications for the use of raw tweet counts in research evaluation and assessment. We discuss some definitions of Twitter cyborgs and bots in scholarly communication and propose distinguishing between different levels of engagement, that is, differentiating between merely tweeting bibliographic information and discussing or commenting on the content of a scientific work.
  8. Larivière, V.; Sugimoto, C.R.; Cronin, B.: A bibliometric chronicling of library and information science's first hundred years (2012) 0.01
    0.014386819 = product of:
      0.057547275 = sum of:
        0.057547275 = weight(_text_:social in 244) [ClassicSimilarity], result of:
          0.057547275 = score(doc=244,freq=4.0), product of:
            0.1847249 = queryWeight, product of:
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.046325076 = queryNorm
            0.3115296 = fieldWeight in 244, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.0390625 = fieldNorm(doc=244)
      0.25 = coord(1/4)
    
    Abstract
    This paper presents a condensed history of Library and Information Science (LIS) over the course of more than a century using a variety of bibliometric measures. It examines in detail the variable rate of knowledge production in the field, shifts in subject coverage, the dominance of particular publication genres at different times, prevailing modes of production, interactions with other disciplines, and, more generally, observes how the field has evolved. It shows that, despite a striking growth in the number of journals, papers, and contributing authors, a decrease was observed in the field's market-share of all social science and humanities research. Collaborative authorship is now the norm, a pattern seen across the social sciences. The idea of boundary crossing was also examined: in 2010, nearly 60% of authors who published in LIS also published in another discipline. This high degree of permeability in LIS was also demonstrated through reference and citation practices: LIS scholars now cite and receive citations from other fields more than from LIS itself. Two major structural shifts are revealed in the data: in 1960, LIS changed from a professional field focused on librarianship to an academic field focused on information and use; and in 1990, LIS began to receive a growing number of citations from outside the field, notably from Computer Science and Management, and saw a dramatic increase in the number of authors contributing to the literature of the field.
  9. Larivière, V.; Sugimoto, C.R.; Bergeron, P.: In their own image? : a comparison of doctoral students' and faculty members' referencing behavior (2013) 0.01
    0.01220762 = product of:
      0.04883048 = sum of:
        0.04883048 = weight(_text_:social in 751) [ClassicSimilarity], result of:
          0.04883048 = score(doc=751,freq=2.0), product of:
            0.1847249 = queryWeight, product of:
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.046325076 = queryNorm
            0.26434162 = fieldWeight in 751, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.046875 = fieldNorm(doc=751)
      0.25 = coord(1/4)
    
    Abstract
    This article compares doctoral students' and faculty members' referencing behavior through the analysis of a large corpus of scientific articles. It shows that doctoral students tend to cite more documents per article than faculty members, and that the literature they cite is, on average, more recent. It also demonstrates that doctoral students cite a larger proportion of conference proceedings and journal articles than faculty members, and that faculty members are more likely than doctoral students to self-cite and to cite theses. Analysis of the impact of cited journals indicates that, in health research, faculty members tend to cite journals with slightly lower impact factors, whereas in the social sciences and humanities faculty members cite journals with higher impact factors. Finally, it provides evidence that, in every discipline, faculty members tend to cite a higher proportion of clinical/applied research journals than doctoral students. This study contributes to the understanding of referencing patterns and age stratification in academia. Implications for understanding the information-seeking behavior of academics are discussed.
  10. Lisée, C.; Larivière, V.; Archambault, E.: Conference proceedings as a source of scientific information : a bibliometric analysis (2008) 0.01
    0.010173016 = product of:
      0.040692065 = sum of:
        0.040692065 = weight(_text_:social in 2356) [ClassicSimilarity], result of:
          0.040692065 = score(doc=2356,freq=2.0), product of:
            0.1847249 = queryWeight, product of:
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.046325076 = queryNorm
            0.22028469 = fieldWeight in 2356, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.0390625 = fieldNorm(doc=2356)
      0.25 = coord(1/4)
    
    Abstract
    While several authors have argued that conference proceedings are an important source of scientific knowledge, the extent of their importance has not been measured in a systematic manner. This article examines the scientific impact and aging of conference proceedings compared to those of scientific literature in general. It shows that the relative importance of proceedings is diminishing over time and currently represents only 1.7% of references made in the natural sciences and engineering, and 2.5% in the social sciences and humanities. Although the scientific impact of proceedings is losing ground to other types of scientific literature in nearly all fields, it has grown from 8% of the references in engineering papers in the early 1980s to its current 10%. Proceedings play a particularly important role in computer sciences, where they account for close to 20% of the references. This article also shows that not unexpectedly, proceedings age faster than cited scientific literature in general. The evidence thus shows that proceedings have a relatively limited scientific impact, on average representing only about 2% of total citations, that their relative importance is shrinking, and that they become obsolete faster than the scientific literature in general.
  11. Larivière, V.; Gingras, Y.: On the prevalence and scientific impact of duplicate publications in different scientific fields (1980-2007) (2010) 0.01
    0.010173016 = product of:
      0.040692065 = sum of:
        0.040692065 = weight(_text_:social in 3622) [ClassicSimilarity], result of:
          0.040692065 = score(doc=3622,freq=2.0), product of:
            0.1847249 = queryWeight, product of:
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.046325076 = queryNorm
            0.22028469 = fieldWeight in 3622, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3622)
      0.25 = coord(1/4)
    
    Abstract
    Purpose - The issue of duplicate publications has received a lot of attention in the medical literature, but much less in the information science community. This paper aims to analyze the prevalence and scientific impact of duplicate publications across all fields of research between 1980 and 2007. Design/methodology/approach - The approach is a bibliometric analysis of duplicate papers based on their metadata. Duplicate papers are defined as papers published in two different journals having: the exact same title; the same first author; and the same number of cited references. Findings - In all fields combined, the prevalence of duplicates is one out of 2,000 papers, but it is higher in the natural and medical sciences than in the social sciences and humanities. A very high proportion (>85 percent) of these papers are published in the same year or one year apart, which suggests that most duplicate papers were submitted simultaneously. Furthermore, duplicate papers are generally published in journals with impact factors below the average of their field and obtain fewer citations. Originality/value - The paper provides clear evidence that the prevalence of duplicate papers is low and, more importantly, that the scientific impact of such papers is below average.
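
    (Editorial note: the abstract above gives an operational definition of a duplicate - same exact title, same first author, and same number of cited references, published in two different journals. As a hedged illustration with hypothetical records and field names, the sketch below groups papers by exactly that key.)

    from collections import defaultdict

    def find_duplicates(papers):
        # Group papers by (normalized title, first author, number of cited references);
        # keep only groups that span at least two different journals.
        groups = defaultdict(list)
        for p in papers:
            key = (p["title"].strip().lower(), p["first_author"], p["n_refs"])
            groups[key].append(p)
        return [g for g in groups.values() if len({p["journal"] for p in g}) >= 2]

    # Hypothetical records illustrating the matching rule.
    papers = [
        {"title": "On X", "first_author": "Doe, J.", "n_refs": 30, "journal": "Journal A", "year": 2003},
        {"title": "On X", "first_author": "Doe, J.", "n_refs": 30, "journal": "Journal B", "year": 2004},
        {"title": "On Y", "first_author": "Roe, R.", "n_refs": 12, "journal": "Journal A", "year": 2005},
    ]
    print(find_duplicates(papers))  # the two "On X" records form one duplicate group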
  12. Mohammadi, E.; Thelwall, M.; Haustein, S.; Larivière, V.: Who reads research articles? : an altmetrics analysis of Mendeley user categories (2015) 0.01
    0.010173016 = product of:
      0.040692065 = sum of:
        0.040692065 = weight(_text_:social in 2162) [ClassicSimilarity], result of:
          0.040692065 = score(doc=2162,freq=2.0), product of:
            0.1847249 = queryWeight, product of:
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.046325076 = queryNorm
            0.22028469 = fieldWeight in 2162, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.0390625 = fieldNorm(doc=2162)
      0.25 = coord(1/4)
    
    Abstract
    Little detailed information is known about who reads research articles and the contexts in which research articles are read. Using data about people who register in Mendeley as readers of articles, this article explores different types of users of Clinical Medicine, Engineering and Technology, Social Science, Physics, and Chemistry articles inside and outside academia. The majority of readers for all disciplines were PhD students, postgraduates, and postdocs but other types of academics were also represented. In addition, many Clinical Medicine articles were read by medical professionals. The highest correlations between citations and Mendeley readership counts were found for types of users who often authored academic articles, except for associate professors in some sub-disciplines. This suggests that Mendeley readership can reflect usage similar to traditional citation impact if the data are restricted to readers who are also authors without the delay of impact measured by citation counts. At the same time, Mendeley statistics can also reveal the hidden impact of some research articles, such as educational value for nonauthor users inside academia or the impact of research articles on practice for readers outside academia.
  13. Siler, K.; Larivière, V.: Varieties of diffusion in academic publishing : how status and legitimacy influence growth trajectories of new innovations (2024) 0.01
    0.010173016 = product of:
      0.040692065 = sum of:
        0.040692065 = weight(_text_:social in 1206) [ClassicSimilarity], result of:
          0.040692065 = score(doc=1206,freq=2.0), product of:
            0.1847249 = queryWeight, product of:
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.046325076 = queryNorm
            0.22028469 = fieldWeight in 1206, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1206)
      0.25 = coord(1/4)
    
    Abstract
    Open Access (OA) publishing has progressed from an initial fringe idea to a still-growing, major component of modern academic communication. The proliferation of OA publishing provides a context in which to examine how new innovations and institutions develop. Based on analyses of 1,296,304 articles published in 83 OA journals, we analyze changes in the institutional status, gender, age, citedness, and geographical locations of authors over time. Generally, OA journals tended towards core-to-periphery diffusion patterns. Specifically, the shares of authors with high-status institutional affiliations, of male authors, and of highly cited authors tended to decrease over time. Despite these general tendencies, there was substantial variation in the diffusion patterns of OA journals. Some journals exhibited no significant demographic changes, and a few exhibited periphery-to-core diffusion patterns. We find that although both highly legitimate and less legitimate journals generally exhibit core-to-periphery diffusion patterns, there are still demographic differences between such journals. Institutional and cultural legitimacy, or the lack thereof, affects the social and intellectual diffusion of new OA journals.