Search (27 results, page 1 of 2)

  • Filter: author_ss:"Kousha, K."
  1. Kousha, K.; Thelwall, M.: An automatic method for assessing the teaching impact of books from online academic syllabi (2016) 0.01
    0.0149040185 = product of:
      0.044712055 = sum of:
        0.009977593 = weight(_text_:in in 3226) [ClassicSimilarity], result of:
          0.009977593 = score(doc=3226,freq=10.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.16802745 = fieldWeight in 3226, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3226)
        0.03473446 = product of:
          0.06946892 = sum of:
            0.06946892 = weight(_text_:ausbildung in 3226) [ClassicSimilarity], result of:
              0.06946892 = score(doc=3226,freq=2.0), product of:
                0.23429902 = queryWeight, product of:
                  5.3671665 = idf(docFreq=560, maxDocs=44218)
                  0.043654136 = queryNorm
                0.29649687 = fieldWeight in 3226, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.3671665 = idf(docFreq=560, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=3226)
          0.5 = coord(1/2)
      0.33333334 = coord(2/6)
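    The breakdown above is Lucene's ClassicSimilarity explanation of how this hit's relevance score is obtained: a tf-idf weight for each matching query term, scaled by coordination factors. The short Python sketch below re-computes it; all constants are copied from the breakdown, and the helper name term_score is introduced here purely for illustration.

      import math

      # Re-computation of hit 1's score, assuming Lucene's ClassicSimilarity
      # (tf-idf) exactly as shown in the breakdown above.
      QUERY_NORM = 0.043654136   # queryNorm, shared by every query term
      FIELD_NORM = 0.0390625     # fieldNorm(doc=3226), length normalisation

      def term_score(freq, idf):
          """One term's contribution: queryWeight * fieldWeight."""
          tf = math.sqrt(freq)                  # tf(freq) = sqrt(termFreq)
          query_weight = idf * QUERY_NORM       # idf * queryNorm
          field_weight = tf * idf * FIELD_NORM  # tf * idf * fieldNorm
          return query_weight * field_weight

      score_in = term_score(freq=10.0, idf=1.3602545)               # _text_:in
      score_ausbildung = 0.5 * term_score(freq=2.0, idf=5.3671665)  # _text_:ausbildung, coord(1/2)
      total = (score_in + score_ausbildung) * (2.0 / 6.0)           # coord(2/6)
      print(total)   # ~0.0149, matching the listed score of 0.0149040185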
    
    Abstract
    Scholars writing books that are widely used to support teaching in higher education may be undervalued because of a lack of evidence of teaching value. Although sales data may give credible evidence for textbooks, these data may poorly reflect educational uses of other types of books. As an alternative, this article proposes a method to search automatically for mentions of books in online academic course syllabi based on Bing searches for syllabi mentioning a given book, filtering out false matches through an extensive set of rules. The method had an accuracy of over 90% based on manual checks of a sample of 2,600 results from the initial Bing searches. Over one third of about 14,000 monographs checked had one or more academic syllabus mention, with more in the arts and humanities (56%) and social sciences (52%). Low but significant correlations between syllabus mentions and citations across most fields, except the social sciences, suggest that books tend to have different levels of impact for teaching and research. In conclusion, the automatic syllabus search method gives a new way to estimate the educational utility of books in a way that sales data and citation counts cannot.
    Theme
    Ausbildung
  2. Li, X.; Thelwall, M.; Kousha, K.: The role of arXiv, RePEc, SSRN and PMC in formal scholarly communication (2015) 0.01
    0.0091357 = product of:
      0.027407099 = sum of:
        0.012620768 = weight(_text_:in in 2593) [ClassicSimilarity], result of:
          0.012620768 = score(doc=2593,freq=16.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.21253976 = fieldWeight in 2593, product of:
              4.0 = tf(freq=16.0), with freq of:
                16.0 = termFreq=16.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=2593)
        0.014786332 = product of:
          0.029572664 = sum of:
            0.029572664 = weight(_text_:22 in 2593) [ClassicSimilarity], result of:
              0.029572664 = score(doc=2593,freq=2.0), product of:
                0.15286934 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043654136 = queryNorm
                0.19345059 = fieldWeight in 2593, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2593)
          0.5 = coord(1/2)
      0.33333334 = coord(2/6)
    
    Abstract
    Purpose: The four major Subject Repositories (SRs), arXiv, Research Papers in Economics (RePEc), Social Science Research Network (SSRN) and PubMed Central (PMC), are all important within their disciplines, but no previous study has systematically compared how often they are cited in academic publications. In response, the purpose of this paper is to report an analysis of citations to SRs from Scopus publications, 2000-2013. Design/methodology/approach: Scopus searches were used to count the number of documents citing the four SRs in each year. A random sample of 384 documents citing the four SRs was then visited to investigate the nature of the citations. Findings: Each SR was most cited within its own subject area but attracted substantial citations from other subject areas, suggesting that they are open to interdisciplinary uses. The proportion of documents citing each SR is continuing to increase rapidly, and the SRs all seem to attract substantial numbers of citations from more than one discipline. Research limitations/implications: Scopus does not cover all publications, and most citations to documents found in the four SRs presumably cite the published version, when one exists, rather than the repository version. Practical implications: SRs are continuing to grow and do not seem to be threatened by institutional repositories, and so research managers should encourage their continued use within their core disciplines, including for research that aims at an audience in other disciplines. Originality/value: This is the first simultaneous analysis of Scopus citations to the four most popular SRs.
    Date
    20. 1.2015 18:30:22
    Object
    Research Papers in Economics
  3. Thelwall, M.; Kousha, K.; Abdoli, M.; Stuart, E.; Makita, M.; Wilson, P.; Levitt, J.: Why are coauthored academic articles more cited : higher quality or larger audience? (2023) 0.01
    0.008863994 = product of:
      0.02659198 = sum of:
        0.011805649 = weight(_text_:in in 995) [ClassicSimilarity], result of:
          0.011805649 = score(doc=995,freq=14.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.19881277 = fieldWeight in 995, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=995)
        0.014786332 = product of:
          0.029572664 = sum of:
            0.029572664 = weight(_text_:22 in 995) [ClassicSimilarity], result of:
              0.029572664 = score(doc=995,freq=2.0), product of:
                0.15286934 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043654136 = queryNorm
                0.19345059 = fieldWeight in 995, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=995)
          0.5 = coord(1/2)
      0.33333334 = coord(2/6)
    
    Abstract
    Collaboration is encouraged because it is believed to improve academic research, supported by indirect evidence in the form of articles with more coauthors being more cited. Nevertheless, this might not reflect quality but increased self-citations or the "audience effect": citations from increased awareness through multiple author networks. We address this with the first science-wide investigation into whether author numbers associate with journal article quality, using expert peer quality judgments for 122,331 articles from the 2014-20 UK national assessment. Spearman correlations between author numbers and quality scores show moderately strong positive associations (0.2-0.4) in the health, life, and physical sciences, but weak or no positive associations in engineering and social sciences, with weak negative/positive or no associations in various arts and humanities, and a possible negative association for decision sciences. This gives the first systematic evidence that greater numbers of authors associate with higher quality journal articles in the majority of academia outside the arts and humanities, at least for the UK. Positive associations between team size and citation counts in areas with little association between team size and quality also show that audience effects or other nonquality factors account for the higher citation rates of coauthored articles in some fields.
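    The Spearman correlations reported above are rank correlations between per-article author counts and peer-review quality scores. A minimal sketch of computing such a coefficient, using invented data and assuming SciPy is available:

      from scipy.stats import spearmanr

      # Invented data purely for illustration: author counts per article and
      # REF-style peer-review quality scores (1-4).
      author_counts  = [1, 2, 3, 4, 5, 6, 8, 10]
      quality_scores = [2, 2, 3, 3, 3, 4, 4, 4]

      rho, p_value = spearmanr(author_counts, quality_scores)  # rank correlation
      print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")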
    Date
    22. 6.2023 18:11:50
  4. Kousha, K.; Thelwall, M.: How is science cited on the Web? : a classification of Google unique Web citations (2007) 0.01
    0.008254642 = product of:
      0.024763925 = sum of:
        0.009977593 = weight(_text_:in in 586) [ClassicSimilarity], result of:
          0.009977593 = score(doc=586,freq=10.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.16802745 = fieldWeight in 586, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=586)
        0.014786332 = product of:
          0.029572664 = sum of:
            0.029572664 = weight(_text_:22 in 586) [ClassicSimilarity], result of:
              0.029572664 = score(doc=586,freq=2.0), product of:
                0.15286934 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043654136 = queryNorm
                0.19345059 = fieldWeight in 586, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=586)
          0.5 = coord(1/2)
      0.33333334 = coord(2/6)
    
    Abstract
    Although the analysis of citations in the scholarly literature is now an established and relatively well understood part of information science, not enough is known about citations that can be found on the Web. In particular, are there new types of Web citation, and if so, are these trivial or potentially useful for studying or evaluating research communication? We sought evidence based upon a sample of 1,577 Web citations of the URLs or titles of research articles in 64 open-access journals from biology, physics, chemistry, and computing. Only 25% represented intellectual impact, from references of Web documents (23%) and other informal scholarly sources (2%). Many of the Web/URL citations were created for general or subject-specific navigation (45%) or for self-publicity (22%). Additional analyses revealed significant disciplinary differences in the types of Google unique Web/URL citations as well as some characteristics of scientific open-access publishing on the Web. We conclude that the Web provides access to a new and different type of citation information, one that may therefore enable us to measure different aspects of research, and the research process in particular; but to obtain good information, the different types should be separated.
  5. Kousha, K.; Thelwall, M.: Assessing the impact of disciplinary research on teaching : an automatic analysis of online syllabuses (2008) 0.00
    0.0023517415 = product of:
      0.014110449 = sum of:
        0.014110449 = weight(_text_:in in 2383) [ClassicSimilarity], result of:
          0.014110449 = score(doc=2383,freq=20.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.2376267 = fieldWeight in 2383, product of:
              4.472136 = tf(freq=20.0), with freq of:
                20.0 = termFreq=20.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=2383)
      0.16666667 = coord(1/6)
    
    Abstract
    The impact of published academic research in the sciences and social sciences, when measured, is commonly estimated by counting citations from journal articles. The Web has now introduced new potential sources of quantitative data online that could be used to measure aspects of research impact. In this article we assess the extent to which citations from online syllabuses could be a valuable source of evidence about the educational utility of research. An analysis of online syllabus citations to 70,700 articles published in 2003 in the journals of 12 subjects indicates that online syllabus citations were sufficiently numerous to be a useful impact indicator in some social sciences, including political science and information and library science, but not in others, nor in any sciences. This result was consistent with current social science research having, in general, more educational value than current science research. Moreover, articles frequently cited in online syllabuses were not necessarily highly cited by other articles. Hence it seems that online syllabus citations provide a valuable additional source of evidence about the impact of journals, scholars, and research articles in some social sciences.
  6. Kousha, K.; Thelwall, M.; Abdoli, M.: The role of online videos in research communication : a content analysis of YouTube videos cited in academic publications (2012) 0.00
    0.0023517415 = product of:
      0.014110449 = sum of:
        0.014110449 = weight(_text_:in in 382) [ClassicSimilarity], result of:
          0.014110449 = score(doc=382,freq=20.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.2376267 = fieldWeight in 382, product of:
              4.472136 = tf(freq=20.0), with freq of:
                20.0 = termFreq=20.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=382)
      0.16666667 = coord(1/6)
    
    Abstract
    Although there is some evidence that online videos are increasingly used by academics for informal scholarly communication and teaching, the extent to which they are used in published academic research is unknown. This article explores the extent to which YouTube videos are cited in academic publications and whether there are significant broad disciplinary differences in this practice. To investigate, we extracted the URL citations to YouTube videos from academic publications indexed by Scopus. A total of 1,808 Scopus publications cited at least one YouTube video, and there was a steady upward growth in citing online videos within scholarly publications from 2006 to 2011, with YouTube citations being most common within arts and humanities (0.3%) and the social sciences (0.2%). A content analysis of 551 YouTube videos cited by research articles indicated that in science (78%) and in medicine and health sciences (77%), over three fourths of the cited videos had either direct scientific (e.g., laboratory experiments) or scientific-related contents (e.g., academic lectures or education) whereas in the arts and humanities, about 80% of the YouTube videos had art, culture, or history themes, and in the social sciences, about 63% of the videos were related to news, politics, advertisements, and documentaries. This shows both the disciplinary differences and the wide variety of innovative research communication uses found for videos within the different subject areas.
  7. Kousha, K.; Thelwall, M.: Google book search : citation analysis for social science and the humanities (2009) 0.00
    0.0022310577 = product of:
      0.0133863455 = sum of:
        0.0133863455 = weight(_text_:in in 2946) [ClassicSimilarity], result of:
          0.0133863455 = score(doc=2946,freq=18.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.22543246 = fieldWeight in 2946, product of:
              4.2426405 = tf(freq=18.0), with freq of:
                18.0 = termFreq=18.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=2946)
      0.16666667 = coord(1/6)
    
    Abstract
    In both the social sciences and the humanities, books and monographs play significant roles in research communication. The absence of citations from most books and monographs from the Thomson Reuters/Institute for Scientific Information databases (ISI) has been criticized, but attempts to include citations from or to books in the research evaluation of the social sciences and humanities have not led to widespread adoption. This article assesses whether Google Book Search (GBS) can partially fill this gap by comparing citations from books with citations from journal articles to journal articles in 10 science, social science, and humanities disciplines. Book citations were 31% to 212% of ISI citations and, hence, numerous enough to supplement ISI citations in the social sciences and humanities covered, but not in the sciences (3%-5%), except for computing (46%), due to numerous published conference proceedings. A case study was also made of all 1,923 articles in the 51 information science and library science ISI-indexed journals published in 2003. Within this set, highly book-cited articles tended to receive many ISI citations, indicating a significant relationship between the two types of citation data, but with important exceptions that point to the additional information provided by book citations. In summary, GBS is clearly a valuable new source of citation data for the social sciences and humanities. One practical implication is that book-oriented scholars should consult it for additional citations to their work when applying for promotion and tenure.
  8. Thelwall, M.; Kousha, K.: Online presentations as a source of scientific impact? : an analysis of PowerPoint files citing academic journals (2008) 0.00
    0.0021034614 = product of:
      0.012620768 = sum of:
        0.012620768 = weight(_text_:in in 1614) [ClassicSimilarity], result of:
          0.012620768 = score(doc=1614,freq=16.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.21253976 = fieldWeight in 1614, product of:
              4.0 = tf(freq=16.0), with freq of:
                16.0 = termFreq=16.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1614)
      0.16666667 = coord(1/6)
    
    Abstract
    Open-access online publication has made available an increasingly wide range of document types for scientometric analysis. In this article, we focus on citations in online presentations, seeking evidence of their value as nontraditional indicators of research impact. For this purpose, we searched for online PowerPoint files mentioning any one of 1,807 ISI-indexed journals in ten science and ten social science disciplines. We also manually classified 1,378 online PowerPoint citations to journals in eight additional science and social science disciplines. The results showed that very few journals were cited frequently enough in online PowerPoint files to make impact assessment worthwhile, with the main exceptions being popular magazines like Scientific American and Harvard Business Review. Surprisingly, however, there was little difference overall in the number of PowerPoint citations to science and to the social sciences, and also in the proportion representing traditional impact (about 60%) and wider impact (about 15%). It seems that the main scientometric value for online presentations may be in tracking the popularization of research, or for comparing the impact of whole journals rather than individual articles.
  9. Kousha, K.; Thelwall, M.: An automatic method for extracting citations from Google Books (2015) 0.00
    0.0021034614 = product of:
      0.012620768 = sum of:
        0.012620768 = weight(_text_:in in 1658) [ClassicSimilarity], result of:
          0.012620768 = score(doc=1658,freq=16.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.21253976 = fieldWeight in 1658, product of:
              4.0 = tf(freq=16.0), with freq of:
                16.0 = termFreq=16.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1658)
      0.16666667 = coord(1/6)
    
    Abstract
    Recent studies have shown that counting citations from books can help scholarly impact assessment and that Google Books (GB) is a useful source of such citation counts, despite its lack of a public citation index. Searching GB for citations produces approximate matches, however, and so its raw results need time-consuming human filtering. In response, this article introduces a method to automatically remove false and irrelevant matches from GB citation searches in addition to introducing refinements to a previous GB manual citation extraction method. The method was evaluated by manual checking of sampled GB results and comparing citations to about 14,500 monographs in the Thomson Reuters Book Citation Index (BKCI) against automatically extracted citations from GB across 24 subject areas. GB citations were 103% to 137% as numerous as BKCI citations in the humanities, except for tourism (72%) and linguistics (91%), 46% to 85% in social sciences, but only 8% to 53% in the sciences. In all cases, however, GB had substantially more citing books than did BKCI, with BKCI's results coming predominantly from journal articles. Moderate correlations between the GB and BKCI citation counts in the social sciences and humanities nevertheless suggest that the two sources may measure different aspects of impact.
  10. Mohammadi, E.; Thelwall, M.; Kousha, K.: Can Mendeley bookmarks reflect readership? : a survey of user motivations (2016) 0.00
    0.0021034614 = product of:
      0.012620768 = sum of:
        0.012620768 = weight(_text_:in in 2897) [ClassicSimilarity], result of:
          0.012620768 = score(doc=2897,freq=16.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.21253976 = fieldWeight in 2897, product of:
              4.0 = tf(freq=16.0), with freq of:
                16.0 = termFreq=16.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=2897)
      0.16666667 = coord(1/6)
    
    Abstract
    Although Mendeley bookmarking counts appear to correlate moderately with conventional citation metrics, it is not known whether academic publications are bookmarked in Mendeley in order to be read or not. Without this information, it is not possible to give a confident interpretation of altmetrics derived from Mendeley. In response, a survey of 860 Mendeley users shows that it is reasonable to use Mendeley bookmarking counts as an indication of readership because most (55%) users with a Mendeley library had read or intended to read at least half of their bookmarked publications. This was true across all broad areas of scholarship except for the arts and humanities (42%). About 85% of the respondents also declared that they bookmarked articles in Mendeley to cite them in their publications, but some also bookmark articles for use in professional (50%), teaching (25%), and educational activities (13%). Of course, it is likely that most readers do not record articles in Mendeley and so these data do not represent all readers. In conclusion, Mendeley bookmark counts seem to be indicators of readership leading to a combination of scholarly impact and wider professional impact.
  11. Thelwall, M.; Kousha, K.; Abdoli, M.; Stuart, E.; Makita, M.; Wilson, P.; Levitt, J.: Do altmetric scores reflect article quality? : evidence from the UK Research Excellence Framework 2021 (2023) 0.00
    0.0019676082 = product of:
      0.011805649 = sum of:
        0.011805649 = weight(_text_:in in 947) [ClassicSimilarity], result of:
          0.011805649 = score(doc=947,freq=14.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.19881277 = fieldWeight in 947, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=947)
      0.16666667 = coord(1/6)
    
    Abstract
    Altmetrics are web-based quantitative impact or attention indicators for academic articles that have been proposed to supplement citation counts. This article reports the first assessment of the extent to which mature altmetrics from Altmetric.com and Mendeley associate with individual article quality scores. It exploits expert norm-referenced peer review scores from the UK Research Excellence Framework 2021 for 67,030+ journal articles in all fields 2014-2017/2018, split into 34 broadly field-based Units of Assessment (UoAs). Altmetrics correlated more strongly with research quality than previously found, although less strongly than raw and field normalized Scopus citation counts. Surprisingly, field normalizing citation counts can reduce their strength as a quality indicator for articles in a single field. For most UoAs, Mendeley reader counts are the best altmetric (e.g., three Spearman correlations with quality scores above 0.5), tweet counts are also a moderate strength indicator in eight UoAs (Spearman correlations with quality scores above 0.3), ahead of news (eight correlations above 0.3, but generally weaker), blogs (five correlations above 0.3), and Facebook (three correlations above 0.3) citations, at least in the United Kingdom. In general, altmetrics are the strongest indicators of research quality in the health and physical sciences and weakest in the arts and humanities.
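    The field-normalized citation counts mentioned above divide an article's raw citation count by the average citation count of its field (and usually its publication year), so that 1.0 means "cited at the field average". A minimal sketch of that idea, with invented numbers purely for illustration:

      from collections import defaultdict

      # Hypothetical (field, citation count) pairs, purely for illustration.
      articles = [("health sciences", 40), ("health sciences", 10),
                  ("history", 3), ("history", 1)]

      by_field = defaultdict(list)
      for field, cites in articles:
          by_field[field].append(cites)

      field_mean = {f: sum(c) / len(c) for f, c in by_field.items()}

      for field, cites in articles:
          normalised = cites / field_mean[field]   # 1.0 = field average
          print(f"{field}: {cites} citations -> normalised {normalised:.2f}")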
  12. Thelwall, M.; Kousha, K.: Academia.edu : Social network or Academic Network? (2014) 0.00
    0.001821651 = product of:
      0.010929906 = sum of:
        0.010929906 = weight(_text_:in in 1234) [ClassicSimilarity], result of:
          0.010929906 = score(doc=1234,freq=12.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.18406484 = fieldWeight in 1234, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1234)
      0.16666667 = coord(1/6)
    
    Abstract
    Academic social network sites Academia.edu and ResearchGate, and reference sharing sites Mendeley, Bibsonomy, Zotero, and CiteULike, give scholars the ability to publicize their research outputs and connect with each other. With millions of users, these are a significant addition to the scholarly communication and academic information-seeking eco-structure. There is thus a need to understand the role that they play and the changes, if any, that they can make to the dynamics of academic careers. This article investigates attributes of philosophy scholars on Academia.edu, introducing a median-based, time-normalizing method to adjust for time delays in joining the site. In comparison to students, faculty tend to attract more profile views, but female philosophers did not attract more profile views than did males, suggesting that academic capital drives philosophy uses of the site more than does friendship and networking. Secondary analyses of law, history, and computer science confirmed the faculty advantage (in terms of higher profile views) except for females in law and females in computer science. There was also a female advantage for both faculty and students in law and computer science as well as for history students. Hence, Academia.edu overall seems to reflect a hybrid of scholarly norms (the faculty advantage) and a female advantage that is suggestive of general social networking norms. Finally, traditional bibliometric measures did not correlate with any Academia.edu metrics for philosophers, perhaps because more senior academics use the site less extensively or because of the range of informal scholarly activities that cannot be measured by bibliometric methods.
  13. Kousha, K.; Thelwall, M.; Rezaie, S.: Assessing the citation impact of books : the role of Google Books, Google Scholar, and Scopus (2011) 0.00
    0.0016629322 = product of:
      0.009977593 = sum of:
        0.009977593 = weight(_text_:in in 4920) [ClassicSimilarity], result of:
          0.009977593 = score(doc=4920,freq=10.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.16802745 = fieldWeight in 4920, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4920)
      0.16666667 = coord(1/6)
    
    Abstract
    Citation indicators are increasingly used in some subject areas to support peer review in the evaluation of researchers and departments. Nevertheless, traditional journal-based citation indexes may be inadequate for the citation impact assessment of book-based disciplines. This article examines whether online citations from Google Books and Google Scholar can provide alternative sources of citation evidence. To investigate this, we compared the citation counts to 1,000 books submitted to the 2008 U.K. Research Assessment Exercise (RAE) from Google Books and Google Scholar with Scopus citations across seven book-based disciplines (archaeology; law; politics and international studies; philosophy; sociology; history; and communication, cultural, and media studies). Google Books and Google Scholar citations to books were 1.4 and 3.2 times more common than were Scopus citations, and their medians were more than twice and three times as high as were Scopus median citations, respectively. This large number of citations is evidence that in book-oriented disciplines in the social sciences, arts, and humanities, online book citations may be sufficiently numerous to support peer review for research evaluation, at least in the United Kingdom.
  14. Kousha, K.; Thelwall, M.: Disseminating research with web CV hyperlinks (2014) 0.00
    0.0016629322 = product of:
      0.009977593 = sum of:
        0.009977593 = weight(_text_:in in 1331) [ClassicSimilarity], result of:
          0.009977593 = score(doc=1331,freq=10.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.16802745 = fieldWeight in 1331, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1331)
      0.16666667 = coord(1/6)
    
    Abstract
    Some curricula vitae (web CVs) of academics on the web, including homepages and publication lists, link to open-access (OA) articles, resources, abstracts in publishers' websites, or academic discussions, helping to disseminate research. To assess how common such practices are and whether they vary by discipline, gender, and country, the authors conducted a large-scale e-mail survey of astronomy and astrophysics, public health, environmental engineering, and philosophy across 15 European countries and analyzed hyperlinks from web CVs of academics. About 60% of the 2,154 survey responses reported having a web CV or something similar, and there were differences between disciplines, genders, and countries. A follow-up outlink analysis of 2,700 web CVs found that a third had at least one outlink to an OA target, typically a public eprint archive or an individual self-archived file. This proportion was considerably higher in astronomy (48%) and philosophy (37%) than in environmental engineering (29%) and public health (21%). There were also differences in linking to publishers' websites, resources, and discussions. Perhaps most important, however, the amount of linking to OA publications seems to be much lower than allowed by publishers and journals, suggesting that many opportunities for disseminating full-text research online are being missed, especially in disciplines without established repositories. Moreover, few academics seem to be exploiting their CVs to link to discussions, resources, or article abstracts, which seems to be another missed opportunity for publicizing research.
  15. Kousha, K.; Thelwall, M.: Are wikipedia citations important evidence of the impact of scholarly articles and books? (2017) 0.00
    0.0016629322 = product of:
      0.009977593 = sum of:
        0.009977593 = weight(_text_:in in 3440) [ClassicSimilarity], result of:
          0.009977593 = score(doc=3440,freq=10.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.16802745 = fieldWeight in 3440, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3440)
      0.16666667 = coord(1/6)
    
    Abstract
    Individual academics and research evaluators often need to assess the value of published research. Although citation counts are a recognized indicator of scholarly impact, alternative data is needed to provide evidence of other types of impact, including within education and wider society. Wikipedia is a logical choice for both of these because the role of a general encyclopaedia is to be an understandable repository of facts about a diverse array of topics and hence it may cite research to support its claims. To test whether Wikipedia could provide new evidence about the impact of scholarly research, this article counted citations to 302,328 articles and 18,735 monographs in English indexed by Scopus in the period 2005 to 2012. The results show that citations from Wikipedia to articles are too rare for most research evaluation purposes, with only 5% of articles being cited in all fields. In contrast, a third of monographs have at least one citation from Wikipedia, with the most in the arts and humanities. Hence, Wikipedia citations can provide extra impact evidence for academic monographs. Nevertheless, the results may be relatively easily manipulated and so Wikipedia is not recommended for evaluations affecting stakeholder interests.
  16. Thelwall, M.; Kousha, K.; Stuart, E.; Makita, M.; Abdoli, M.; Wilson, P.; Levitt, J.: In which fields are citations indicators of research quality? (2023) 0.00
    0.0016629322 = product of:
      0.009977593 = sum of:
        0.009977593 = weight(_text_:in in 1033) [ClassicSimilarity], result of:
          0.009977593 = score(doc=1033,freq=10.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.16802745 = fieldWeight in 1033, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1033)
      0.16666667 = coord(1/6)
    
    Abstract
    Citation counts are widely used as indicators of research quality to support or replace human peer review and for lists of top cited papers, researchers, and institutions. Nevertheless, the relationship between citations and research quality is poorly evidenced. We report the first large-scale science-wide academic evaluation of the relationship between research quality and citations (field normalized citation counts), correlating them for 87,739 journal articles in 34 field-based UK Units of Assessment (UoA). The two correlate positively in all academic fields, from very weak (0.1) to strong (0.5), reflecting broadly linear relationships in all fields. We give the first evidence that the correlations are positive even across the arts and humanities. The patterns are similar for the field classification schemes of Scopus and Dimensions.ai, although varying for some individual subjects and therefore more uncertain for these. We also show for the first time that no field has a citation threshold beyond which all articles are excellent quality, so lists of top cited articles are not pure collections of excellence, and neither is any top citation percentile indicator. Thus, while appropriately field normalized citations associate positively with research quality in all fields, they never perfectly reflect it, even at high values.
  17. Kousha, K.; Thelwall, M.: Google Scholar citations and Google Web/URL citations : a multi-discipline exploratory analysis (2007) 0.00
    0.0012881019 = product of:
      0.007728611 = sum of:
        0.007728611 = weight(_text_:in in 337) [ClassicSimilarity], result of:
          0.007728611 = score(doc=337,freq=6.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.1301535 = fieldWeight in 337, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=337)
      0.16666667 = coord(1/6)
    
    Abstract
    We use a new data gathering method, "Web/URL citation," and Google Scholar to compare traditional and Web-based citation patterns across multiple disciplines (biology, chemistry, physics, computing, sociology, economics, psychology, and education) based upon a sample of 1,650 articles from 108 open access (OA) journals published in 2001. A Web/URL citation of an online journal article is a Web mention of its title, URL, or both. For each discipline, except psychology, we found significant correlations between Thomson Scientific (formerly Thomson ISI, here: ISI) citations and both Google Scholar and Google Web/URL citations. Google Scholar citations correlated more highly with ISI citations than did Google Web/URL citations, indicating that the Web/URL method measures a broader type of citation phenomenon. Google Scholar citations were more numerous than ISI citations in computer science and the four social science disciplines, suggesting that Google Scholar is more comprehensive for social sciences and perhaps also when conference articles are valued and published online. We also found large disciplinary differences in the percentage overlap between ISI and Google Scholar citation sources. Finally, although we found many significant trends, there were also numerous exceptions, suggesting that replacing traditional citation sources with the Web or Google Scholar for research impact calculations would be problematic.
  18. Thelwall, M.; Kousha, K.: ResearchGate articles : age, discipline, audience size, and impact (2017) 0.00
    0.0012881019 = product of:
      0.007728611 = sum of:
        0.007728611 = weight(_text_:in in 3349) [ClassicSimilarity], result of:
          0.007728611 = score(doc=3349,freq=6.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.1301535 = fieldWeight in 3349, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3349)
      0.16666667 = coord(1/6)
    
    Abstract
    The large multidisciplinary academic social website ResearchGate aims to help academics to connect with each other and to publicize their work. Despite its popularity, little is known about the age and discipline of the articles uploaded and viewed in the site and whether publication statistics from the site could be useful impact indicators. In response, this article assesses samples of ResearchGate articles uploaded at specific dates, comparing their views in the site to their Mendeley readers and Scopus-indexed citations. This analysis shows that ResearchGate is dominated by recent articles, which attract about three times as many views as older articles. ResearchGate has uneven coverage of scholarship, with the arts and humanities, health professions, and decision sciences poorly represented and some fields receiving twice as many views per article as others. View counts for uploaded articles have low to moderate positive correlations with both Scopus citations and Mendeley readers, which is consistent with them tending to reflect a wider audience than Scopus-publishing scholars. Hence, for articles uploaded to the site, view counts may give a genuinely new audience indicator.
  19. Kousha, K.; Thelwall, M.; Abdoli, M.: Goodreads reviews to assess the wider impacts of books (2017) 0.00
    0.0012881019 = product of:
      0.007728611 = sum of:
        0.007728611 = weight(_text_:in in 3768) [ClassicSimilarity], result of:
          0.007728611 = score(doc=3768,freq=6.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.1301535 = fieldWeight in 3768, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3768)
      0.16666667 = coord(1/6)
    
    Abstract
    Although peer-review and citation counts are commonly used to help assess the scholarly impact of published research, informal reader feedback might also be exploited to help assess the wider impacts of books, such as their educational or cultural value. The social website Goodreads seems to be a reasonable source for this purpose because it includes a large number of book reviews and ratings by many users inside and outside of academia. To check this, Goodreads book metrics were compared with different book-based impact indicators for 15,928 academic books across broad fields. Goodreads engagements were numerous enough in the arts (85% of books had at least one), humanities (80%), and social sciences (67%) for use as a source of impact evidence. Low and moderate correlations between Goodreads book metrics and scholarly or non-scholarly indicators suggest that reader feedback in Goodreads reflects the many purposes of books rather than a single type of impact. Although Goodreads book metrics can be manipulated, they could be used guardedly by academics, authors, and publishers in evaluations.
  20. Kousha, K.; Thelwall, M.: Can Amazon.com reviews help to assess the wider impacts of books? (2016) 0.00
    0.0012620769 = product of:
      0.0075724614 = sum of:
        0.0075724614 = weight(_text_:in in 2768) [ClassicSimilarity], result of:
          0.0075724614 = score(doc=2768,freq=4.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.12752387 = fieldWeight in 2768, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.046875 = fieldNorm(doc=2768)
      0.16666667 = coord(1/6)
    
    Abstract
    Although citation counts are often used to evaluate the research impact of academic publications, they are problematic for books that aim for educational or cultural impact. To fill this gap, this article assesses whether a number of simple metrics derived from Amazon.com reviews of academic books could provide evidence of their impact. Based on a set of 2,739 academic monographs from 2008 and a set of 1,305 best-selling books in 15 Amazon.com academic subject categories, the existence of significant but low or moderate correlations between citations and numbers of reviews, combined with other evidence, suggests that online book reviews tend to reflect the wider popularity of a book rather than its academic impact, although there are substantial disciplinary differences. Metrics based on online reviews are therefore recommended for the evaluation of books that aim at a wide audience inside or outside academia when it is important to capture the broader impacts of educational or cultural activities and when they cannot be manipulated in advance of the evaluation.