Search (27 results, page 1 of 2)

  • author_ss:"Thelwall, M."
  1. Didegah, F.; Thelwall, M.: Co-saved, co-tweeted, and co-cited networks (2018) 0.06
    0.06346937 = product of:
      0.12693875 = sum of:
        0.12693875 = sum of:
          0.08539981 = weight(_text_:journals in 4291) [ClassicSimilarity], result of:
            0.08539981 = score(doc=4291,freq=2.0), product of:
              0.25656942 = queryWeight, product of:
                5.021064 = idf(docFreq=792, maxDocs=44218)
                0.05109862 = queryNorm
              0.33285263 = fieldWeight in 4291, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                5.021064 = idf(docFreq=792, maxDocs=44218)
                0.046875 = fieldNorm(doc=4291)
          0.041538943 = weight(_text_:22 in 4291) [ClassicSimilarity], result of:
            0.041538943 = score(doc=4291,freq=2.0), product of:
              0.17893866 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.05109862 = queryNorm
              0.23214069 = fieldWeight in 4291, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046875 = fieldNorm(doc=4291)
      0.5 = coord(1/2)
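    The breakdown above is Lucene's ClassicSimilarity (TF-IDF) explain output. As a minimal sketch, assuming the standard ClassicSimilarity definitions tf = sqrt(freq) and idf = 1 + ln(maxDocs/(docFreq+1)) (the constants MAX_DOCS, QUERY_NORM and the helper term_weight below are illustrative names taken from the figures in the tree), the displayed score for this hit can be reproduced as follows:

      import math

      # Corpus-level constants taken from the explain tree above (hit 1, doc 4291)
      MAX_DOCS   = 44218
      QUERY_NORM = 0.05109862

      def term_weight(doc_freq, freq, field_norm):
          """One term's ClassicSimilarity contribution: queryWeight * fieldWeight."""
          idf = 1.0 + math.log(MAX_DOCS / (doc_freq + 1))  # ~5.021064 for "journals"
          tf = math.sqrt(freq)                             # ~1.4142135 for freq=2.0
          query_weight = idf * QUERY_NORM                  # ~0.2566 for "journals"
          field_weight = tf * idf * field_norm             # ~0.3329 for "journals"
          return query_weight * field_weight

      w_journals = term_weight(doc_freq=792,  freq=2.0, field_norm=0.046875)  # ~0.0854
      w_22       = term_weight(doc_freq=3622, freq=2.0, field_norm=0.046875)  # ~0.0415
      score = 0.5 * (w_journals + w_22)  # coord(1/2) halves the summed clause weights
      print(round(score, 8))             # ~0.0635, matching the displayed 0.06346937 up to rounding
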
    
    Abstract
    Counts of tweets and Mendeley user libraries have been proposed as altmetric alternatives to citation counts for the impact assessment of articles. Although both have been investigated to discover whether they correlate with article citations, it is not known whether users tend to tweet or save (in Mendeley) the same kinds of articles that they cite. In response, this article compares pairs of articles that are tweeted, saved to a Mendeley library, or cited by the same user, but possibly a different user for each source. The study analyzes 1,131,318 articles published in 2012, with minimum tweeted (10), saved to Mendeley (100), and cited (10) thresholds. The results show surprisingly minor overall overlaps between the three phenomena. The importance of journals for Twitter and the presence of many bots at different levels of activity suggest that this site has little value for impact altmetrics. The moderate differences between patterns of saving and citation suggest that Mendeley can be used for some types of impact assessments, but sensitivity is needed for underlying differences.
    Date
    28. 7.2018 10:00:22
  2. Kousha, K.; Thelwall, M.: How is science cited on the Web? : a classification of Google unique Web citations (2007) 0.05
    0.052891143 = product of:
      0.105782285 = sum of:
        0.105782285 = sum of:
          0.0711665 = weight(_text_:journals in 586) [ClassicSimilarity], result of:
            0.0711665 = score(doc=586,freq=2.0), product of:
              0.25656942 = queryWeight, product of:
                5.021064 = idf(docFreq=792, maxDocs=44218)
                0.05109862 = queryNorm
              0.2773772 = fieldWeight in 586, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                5.021064 = idf(docFreq=792, maxDocs=44218)
                0.0390625 = fieldNorm(doc=586)
          0.03461579 = weight(_text_:22 in 586) [ClassicSimilarity], result of:
            0.03461579 = score(doc=586,freq=2.0), product of:
              0.17893866 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.05109862 = queryNorm
              0.19345059 = fieldWeight in 586, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0390625 = fieldNorm(doc=586)
      0.5 = coord(1/2)
    
    Abstract
    Although the analysis of citations in the scholarly literature is now an established and relatively well understood part of information science, not enough is known about citations that can be found on the Web. In particular, are there new Web types, and if so, are these trivial or potentially useful for studying or evaluating research communication? We sought evidence based upon a sample of 1,577 Web citations of the URLs or titles of research articles in 64 open-access journals from biology, physics, chemistry, and computing. Only 25% represented intellectual impact, from references of Web documents (23%) and other informal scholarly sources (2%). Many of the Web/URL citations were created for general or subject-specific navigation (45%) or for self-publicity (22%). Additional analyses revealed significant disciplinary differences in the types of Google unique Web/URL citations as well as some characteristics of scientific open-access publishing on the Web. We conclude that the Web provides access to a new and different type of citation information, one that may therefore enable us to measure different aspects of research, and the research process in particular; but to obtain good information, the different types should be separated.
  3. Thelwall, M.; Kousha, K.: Online presentations as a source of scientific impact? : an analysis of PowerPoint files citing academic journals (2008) 0.04
    0.039783288 = product of:
      0.079566576 = sum of:
        0.079566576 = product of:
          0.15913315 = sum of:
            0.15913315 = weight(_text_:journals in 1614) [ClassicSimilarity], result of:
              0.15913315 = score(doc=1614,freq=10.0), product of:
                0.25656942 = queryWeight, product of:
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.05109862 = queryNorm
                0.6202343 = fieldWeight in 1614, product of:
                  3.1622777 = tf(freq=10.0), with freq of:
                    10.0 = termFreq=10.0
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1614)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
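    Hit 3 matches only the "journals" clause, but with freq=10.0, so tf = sqrt(10) ≈ 3.1622777; combined with the smaller fieldNorm of 0.0390625 and the two 0.5 factors shown in the tree above, the same illustrative helper from hit 1 reproduces this score (a sketch under the same ClassicSimilarity assumptions):

      w_journals = term_weight(doc_freq=792, freq=10.0, field_norm=0.0390625)  # ~0.1591
      score = 0.5 * 0.5 * w_journals  # the two coord/product levels above -> ~0.0398
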
    
    Abstract
    Open-access online publication has made available an increasingly wide range of document types for scientometric analysis. In this article, we focus on citations in online presentations, seeking evidence of their value as nontraditional indicators of research impact. For this purpose, we searched for online PowerPoint files mentioning any one of 1,807 ISI-indexed journals in ten science and ten social science disciplines. We also manually classified 1,378 online PowerPoint citations to journals in eight additional science and social science disciplines. The results showed that very few journals were cited frequently enough in online PowerPoint files to make impact assessment worthwhile, with the main exceptions being popular magazines like Scientific American and Harvard Business Review. Surprisingly, however, there was little difference overall in the number of PowerPoint citations to science and to the social sciences, and also in the proportion representing traditional impact (about 60%) and wider impact (about 15%). It seems that the main scientometric value for online presentations may be in tracking the popularization of research, or for comparing the impact of whole journals rather than individual articles.
  4. Levitt, J.M.; Thelwall, M.: Is multidisciplinary research more highly cited? : a macrolevel study (2008) 0.04
    0.039783288 = product of:
      0.079566576 = sum of:
        0.079566576 = product of:
          0.15913315 = sum of:
            0.15913315 = weight(_text_:journals in 2375) [ClassicSimilarity], result of:
              0.15913315 = score(doc=2375,freq=10.0), product of:
                0.25656942 = queryWeight, product of:
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.05109862 = queryNorm
                0.6202343 = fieldWeight in 2375, product of:
                  3.1622777 = tf(freq=10.0), with freq of:
                    10.0 = termFreq=10.0
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2375)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Interdisciplinary collaboration is a major goal in research policy. This study uses citation analysis to examine diverse subjects in the Web of Science and Scopus to ascertain whether, in general, research published in journals classified in more than one subject is more highly cited than research published in journals classified in a single subject. For each subject, the study divides the journals into two disjoint sets called Multi and Mono. Multi consists of all journals in the subject and at least one other subject whereas Mono consists of all journals in the subject and in no other subject. The main findings are: (a) For social science subject categories in both the Web of Science and Scopus, the average citation levels of articles in Mono and Multi are very similar; and (b) for Scopus subject categories within life sciences, health sciences, and physical sciences, the average citation level of Mono articles is roughly twice that of Multi articles. Hence, one cannot assume that in general, multidisciplinary research will be more highly cited, and the converse is probably true for many areas of science. A policy implication is that, at least in the sciences, multidisciplinary researchers should not be evaluated by citations on the same basis as monodisciplinary researchers.
  5. Didegah, F.; Thelwall, M.: Determinants of research citation impact in nanoscience and nanotechnology (2013) 0.04
    0.036979202 = product of:
      0.073958404 = sum of:
        0.073958404 = product of:
          0.14791681 = sum of:
            0.14791681 = weight(_text_:journals in 737) [ClassicSimilarity], result of:
              0.14791681 = score(doc=737,freq=6.0), product of:
                0.25656942 = queryWeight, product of:
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.05109862 = queryNorm
                0.5765177 = fieldWeight in 737, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.046875 = fieldNorm(doc=737)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    This study investigates a range of metrics available when a nanoscience and nanotechnology article is published to see which metrics correlate more with the number of citations to the article. It also introduces the degree of internationality of journals and references as new metrics for this purpose. The journal impact factor; the impact of references; the internationality of authors, journals, and references; and the number of authors, institutions, and references were all calculated for papers published in nanoscience and nanotechnology journals in the Web of Science from 2007 to 2009. Using a zero-inflated negative binomial regression model on the data set, the impact factor of the publishing journal and the citation impact of the cited references were found to be the most effective determinants of citation counts in all four time periods. In the entire 2007 to 2009 period, apart from journal internationality and author numbers and internationality, all other predictor variables had significant effects on citation counts.
  6. Shema, H.; Bar-Ilan, J.; Thelwall, M.: Do blog citations correlate with a higher number of future citations? : Research blogs as a potential source for alternative metrics (2014) 0.03
    0.030193392 = product of:
      0.060386784 = sum of:
        0.060386784 = product of:
          0.12077357 = sum of:
            0.12077357 = weight(_text_:journals in 1258) [ClassicSimilarity], result of:
              0.12077357 = score(doc=1258,freq=4.0), product of:
                0.25656942 = queryWeight, product of:
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.05109862 = queryNorm
                0.47072473 = fieldWeight in 1258, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.046875 = fieldNorm(doc=1258)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Journal-based citations are an important source of data for impact indices. However, the impact of journal articles extends beyond formal scholarly discourse. Measuring online scholarly impact calls for new indices, complementary to the older ones. This article examines a possible alternative metric source, blog posts aggregated at ResearchBlogging.org, which discuss peer-reviewed articles and provide full bibliographic references. Articles reviewed in these blogs therefore receive "blog citations." We hypothesized that articles receiving blog citations close to their publication time receive more journal citations later than the articles in the same journal published in the same year that did not receive such blog citations. Statistically significant evidence for articles published in 2009 and 2010 supports this hypothesis for seven of 12 journals (58%) in 2009 and 13 of 19 journals (68%) in 2010. We suggest, based on these results, that blog citations can be used as an alternative metric source.
  7. Maflahi, N.; Thelwall, M.: When are readership counts as useful as citation counts? : Scopus versus Mendeley for LIS journals (2016) 0.03
    0.030193392 = product of:
      0.060386784 = sum of:
        0.060386784 = product of:
          0.12077357 = sum of:
            0.12077357 = weight(_text_:journals in 2495) [ClassicSimilarity], result of:
              0.12077357 = score(doc=2495,freq=4.0), product of:
                0.25656942 = queryWeight, product of:
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.05109862 = queryNorm
                0.47072473 = fieldWeight in 2495, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2495)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    In theory, articles can attract readers on the social reference sharing site Mendeley before they can attract citations, so Mendeley altmetrics could provide early indications of article impact. This article investigates the influence of time on the number of Mendeley readers of an article through a theoretical discussion and an investigation into the relationship between counts of readers of, and citations to, 4 general library and information science (LIS) journals. For this discipline, it takes about 7 years for articles to attract as many Scopus citations as Mendeley readers, and after this the Spearman correlation between readers and citers is stable at about 0.6 for all years. This suggests that Mendeley readership counts may be useful impact indicators for both newer and older articles. The lack of dates for individual Mendeley article readers and an unknown bias toward more recent articles mean that readership data should be normalized individually by year, however, before making any comparisons between articles published in different years.
  8. Thelwall, M.: Results from a web impact factor crawler (2001) 0.03
    0.025161162 = product of:
      0.050322324 = sum of:
        0.050322324 = product of:
          0.10064465 = sum of:
            0.10064465 = weight(_text_:journals in 4490) [ClassicSimilarity], result of:
              0.10064465 = score(doc=4490,freq=4.0), product of:
                0.25656942 = queryWeight, product of:
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.05109862 = queryNorm
                0.39227062 = fieldWeight in 4490, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4490)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Web impact factors, the proposed web equivalent of impact factors for journals, can be calculated by using search engines. It has been found that the results are problematic because of the variable coverage of search engines as well as their ability to give significantly different results over short periods of time. The fundamental problem is that although some search engines provide a functionality that is capable of being used for impact calculations, this is not their primary task and therefore they do not give guarantees as to performance in this respect. In this paper, a bespoke web crawler designed specifically for the calculation of reliable WIFs is presented. This crawler was used to calculate WIFs for a number of UK universities, and the results of these calculations are discussed. The principal findings were that with certain restrictions, WIFs can be calculated reliably, but do not correlate with accepted research rankings owing to the variety of material hosted on university servers. Changes to the calculations to improve the fit of the results to research rankings are proposed, but there are still inherent problems undermining the reliability of the calculation. These problems still apply if the WIF scores are taken on their own as indicators of the general impact of any area of the Internet, but with care would not apply to online journals.
  9. Kousha, K.; Thelwall, M.: Assessing the impact of disciplinary research on teaching : an automatic analysis of online syllabuses (2008) 0.03
    0.025161162 = product of:
      0.050322324 = sum of:
        0.050322324 = product of:
          0.10064465 = sum of:
            0.10064465 = weight(_text_:journals in 2383) [ClassicSimilarity], result of:
              0.10064465 = score(doc=2383,freq=4.0), product of:
                0.25656942 = queryWeight, product of:
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.05109862 = queryNorm
                0.39227062 = fieldWeight in 2383, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2383)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    The impact of published academic research in the sciences and social sciences, when measured, is commonly estimated by counting citations from journal articles. The Web has now introduced new potential sources of quantitative data online that could be used to measure aspects of research impact. In this article we assess the extent to which citations from online syllabuses could be a valuable source of evidence about the educational utility of research. An analysis of online syllabus citations to 70,700 articles published in 2003 in the journals of 12 subjects indicates that online syllabus citations were sufficiently numerous to be a useful impact indicator in some social sciences, including political science and information and library science, but not in others, nor in any sciences. This result was consistent with current social science research having, in general, more educational value than current science research. Moreover, articles frequently cited in online syllabuses were not necessarily highly cited by other articles. Hence it seems that online syllabus citations provide a valuable additional source of evidence about the impact of journals, scholars, and research articles in some social sciences.
  10. Haustein, S.; Peters, I.; Sugimoto, C.R.; Thelwall, M.; Larivière, V.: Tweeting biomedicine : an analysis of tweets and citations in the biomedical literature (2014) 0.03
    0.025161162 = product of:
      0.050322324 = sum of:
        0.050322324 = product of:
          0.10064465 = sum of:
            0.10064465 = weight(_text_:journals in 1229) [ClassicSimilarity], result of:
              0.10064465 = score(doc=1229,freq=4.0), product of:
                0.25656942 = queryWeight, product of:
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.05109862 = queryNorm
                0.39227062 = fieldWeight in 1229, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1229)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Data collected by social media platforms have been introduced as new sources for indicators to help measure the impact of scholarly research in ways that are complementary to traditional citation analysis. Data generated from social media activities can be used to reflect broad types of impact. This article aims to provide systematic evidence about how often Twitter is used to disseminate information about journal articles in the biomedical sciences. The analysis is based on 1.4 million documents covered by both PubMed and Web of Science and published between 2010 and 2012. The number of tweets containing links to these documents was analyzed and compared to citations to evaluate the degree to which certain journals, disciplines, and specialties were represented on Twitter and how far tweets correlate with citation impact. With less than 10% of PubMed articles mentioned on Twitter, its uptake is low in general but differs between journals and specialties. Correlations between tweets and citations are low, implying that impact metrics based on tweets are different from those based on citations. A framework using the coverage of articles and the correlation between Twitter mentions and citations is proposed to facilitate the evaluation of novel social-media-based metrics.
  11. Maflahi, N.; Thelwall, M.: How quickly do publications get read? : the evolution of Mendeley reader counts for new articles (2018) 0.03
    0.025161162 = product of:
      0.050322324 = sum of:
        0.050322324 = product of:
          0.10064465 = sum of:
            0.10064465 = weight(_text_:journals in 4015) [ClassicSimilarity], result of:
              0.10064465 = score(doc=4015,freq=4.0), product of:
                0.25656942 = queryWeight, product of:
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.05109862 = queryNorm
                0.39227062 = fieldWeight in 4015, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4015)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Within science, citation counts are widely used to estimate research impact but publication delays mean that they are not useful for recent research. This gap can be filled by Mendeley reader counts, which are valuable early impact indicators for academic articles because they appear before citations and correlate strongly with them. Nevertheless, it is not known how Mendeley readership counts accumulate within the year of publication, and so it is unclear how soon they can be used. In response, this paper reports a longitudinal weekly study of the Mendeley readers of articles in 6 library and information science journals from 2016. The results suggest that Mendeley readers accrue from when articles are first available online and continue to steadily build. For journals with large publication delays, articles can already have substantial numbers of readers by their publication date. Thus, Mendeley reader counts may even be useful as early impact indicators for articles before they have been officially published in a journal issue. If field normalized indicators are needed, then these can be generated when journal issues are published using the online first date.
  12. Kousha, K.; Thelwall, M.: Google Scholar citations and Google Web/URL citations : a multi-discipline exploratory analysis (2007) 0.02
    0.017791625 = product of:
      0.03558325 = sum of:
        0.03558325 = product of:
          0.0711665 = sum of:
            0.0711665 = weight(_text_:journals in 337) [ClassicSimilarity], result of:
              0.0711665 = score(doc=337,freq=2.0), product of:
                0.25656942 = queryWeight, product of:
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.05109862 = queryNorm
                0.2773772 = fieldWeight in 337, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=337)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    We use a new data gathering method, "Web/URL citation," and Google Scholar to compare traditional and Web-based citation patterns across multiple disciplines (biology, chemistry, physics, computing, sociology, economics, psychology, and education) based upon a sample of 1,650 articles from 108 open access (OA) journals published in 2001. A Web/URL citation of an online journal article is a Web mention of its title, URL, or both. For each discipline, except psychology, we found significant correlations between Thomson Scientific (formerly Thomson ISI, here: ISI) citations and both Google Scholar and Google Web/URL citations. Google Scholar citations correlated more highly with ISI citations than did Google Web/URL citations, indicating that the Web/URL method measures a broader type of citation phenomenon. Google Scholar citations were more numerous than ISI citations in computer science and the four social science disciplines, suggesting that Google Scholar is more comprehensive for social sciences and perhaps also when conference articles are valued and published online. We also found large disciplinary differences in the percentage overlap between ISI and Google Scholar citation sources. Finally, although we found many significant trends, there were also numerous exceptions, suggesting that replacing traditional citation sources with the Web or Google Scholar for research impact calculations would be problematic.
  13. Kousha, K.; Thelwall, M.: Google book search : citation analysis for social science and the humanities (2009) 0.02
    0.017791625 = product of:
      0.03558325 = sum of:
        0.03558325 = product of:
          0.0711665 = sum of:
            0.0711665 = weight(_text_:journals in 2946) [ClassicSimilarity], result of:
              0.0711665 = score(doc=2946,freq=2.0), product of:
                0.25656942 = queryWeight, product of:
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.05109862 = queryNorm
                0.2773772 = fieldWeight in 2946, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2946)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    In both the social sciences and the humanities, books and monographs play significant roles in research communication. The absence of citations from most books and monographs from the Thomson Reuters/Institute for Scientific Information databases (ISI) has been criticized, but attempts to include citations from or to books in the research evaluation of the social sciences and humanities have not led to widespread adoption. This article assesses whether Google Book Search (GBS) can partially fill this gap by comparing citations from books with citations from journal articles to journal articles in 10 science, social science, and humanities disciplines. Book citations were 31% to 212% of ISI citations and, hence, numerous enough to supplement ISI citations in the social sciences and humanities covered, but not in the sciences (3%-5%), except for computing (46%), due to numerous published conference proceedings. A case study was also made of all 1,923 articles in the 51 information science and library science ISI-indexed journals published in 2003. Within this set, highly book-cited articles tended to receive many ISI citations, indicating a significant relationship between the two types of citation data, but with important exceptions that point to the additional information provided by book citations. In summary, GBS is clearly a valuable new source of citation data for the social sciences and humanities. One practical implication is that book-oriented scholars should consult it for additional citations to their work when applying for promotion and tenure.
  14. Kousha, K.; Thelwall, M.: Disseminating research with web CV hyperlinks (2014) 0.02
    0.017791625 = product of:
      0.03558325 = sum of:
        0.03558325 = product of:
          0.0711665 = sum of:
            0.0711665 = weight(_text_:journals in 1331) [ClassicSimilarity], result of:
              0.0711665 = score(doc=1331,freq=2.0), product of:
                0.25656942 = queryWeight, product of:
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.05109862 = queryNorm
                0.2773772 = fieldWeight in 1331, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1331)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Some curricula vitae (web CVs) of academics on the web, including homepages and publication lists, link to open-access (OA) articles, resources, abstracts in publishers' websites, or academic discussions, helping to disseminate research. To assess how common such practices are and whether they vary by discipline, gender, and country, the authors conducted a large-scale e-mail survey of astronomy and astrophysics, public health, environmental engineering, and philosophy across 15 European countries and analyzed hyperlinks from web CVs of academics. About 60% of the 2,154 survey responses reported having a web CV or something similar, and there were differences between disciplines, genders, and countries. A follow-up outlink analysis of 2,700 web CVs found that a third had at least one outlink to an OA target, typically a public eprint archive or an individual self-archived file. This proportion was considerably higher in astronomy (48%) and philosophy (37%) than in environmental engineering (29%) and public health (21%). There were also differences in linking to publishers' websites, resources, and discussions. Perhaps most important, however, the amount of linking to OA publications seems to be much lower than allowed by publishers and journals, suggesting that many opportunities for disseminating full-text research online are being missed, especially in disciplines without established repositories. Moreover, few academics seem to be exploiting their CVs to link to discussions, resources, or article abstracts, which seems to be another missed opportunity for publicizing research.
  15. Shema, H.; Bar-Ilan, J.; Thelwall, M.: How is research blogged? : A content analysis approach (2015) 0.02
    0.017791625 = product of:
      0.03558325 = sum of:
        0.03558325 = product of:
          0.0711665 = sum of:
            0.0711665 = weight(_text_:journals in 1863) [ClassicSimilarity], result of:
              0.0711665 = score(doc=1863,freq=2.0), product of:
                0.25656942 = queryWeight, product of:
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.05109862 = queryNorm
                0.2773772 = fieldWeight in 1863, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1863)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Blogs that cite academic articles have emerged as a potential source of alternative impact metrics for the visibility of the blogged articles. Nevertheless, to evaluate more fully the value of blog citations, it is necessary to investigate whether research blogs focus on particular types of articles or give new perspectives on scientific discourse. Therefore, we studied the characteristics of peer-reviewed references in blogs and the typical content of blog posts to gain insight into bloggers' motivations. The sample consisted of 391 blog posts from 2010 to 2012 in Researchblogging.org's health category. The bloggers mostly cited recent research articles or reviews from top multidisciplinary and general medical journals. Using content analysis methods, we created a general classification scheme for blog post content with 10 major topic categories, each with several subcategories. The results suggest that health research bloggers rarely self-cite and that the vast majority of their blog posts (90%) include a general discussion of the issue covered in the article, with more than one quarter providing health-related advice based on the article(s) covered. These factors suggest a genuine attempt to engage with a wider, nonacademic audience. Nevertheless, almost 30% of the posts included some criticism of the issues being discussed.
  16. Barjak, F.; Li, X.; Thelwall, M.: Which factors explain the Web impact of scientists' personal homepages? (2007) 0.01
    0.014233301 = product of:
      0.028466603 = sum of:
        0.028466603 = product of:
          0.056933206 = sum of:
            0.056933206 = weight(_text_:journals in 73) [ClassicSimilarity], result of:
              0.056933206 = score(doc=73,freq=2.0), product of:
                0.25656942 = queryWeight, product of:
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.05109862 = queryNorm
                0.22190176 = fieldWeight in 73, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.021064 = idf(docFreq=792, maxDocs=44218)
                  0.03125 = fieldNorm(doc=73)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    In recent years, a considerable body of Webometric research has used hyperlinks to generate indicators for the impact of Web documents and the organizations that created them. The relationship between this Web impact and other, offline impact indicators has been explored for entire universities, departments, countries, and scientific journals, but not yet for individual scientists-an important omission. The present research closes this gap by investigating factors that may influence the Web impact (i.e., inlink counts) of scientists' personal homepages. Data concerning 456 scientists from five scientific disciplines in six European countries were analyzed, showing that both homepage content and personal and institutional characteristics of the homepage owners had significant relationships with inlink counts. A multivariate statistical analysis confirmed that full-text articles are the most linked-to content in homepages. At the individual homepage level, hyperlinks are related to several offline characteristics. Notable differences regarding total inlinks to scientists' homepages exist between the scientific disciplines and the countries in the sample. There also are both gender and age effects: fewer external inlinks (i.e., links from other Web domains) to the homepages of female and of older scientists. There is only a weak relationship between a scientist's recognition and homepage inlinks and, surprisingly, no relationship between research productivity and inlink counts. Contrary to expectations, the size of collaboration networks is negatively related to hyperlink counts. Some of the relationships between hyperlinks to homepages and the properties of their owners can be explained by the content that the homepage owners put on their homepage and their level of Internet use; however, the findings about productivity and collaborations do not seem to have a simple, intuitive explanation. Overall, the results emphasize the complexity of the phenomenon of Web linking, when analyzed at the level of individual pages.
  17. Thelwall, M.; Ruschenburg, T.: Grundlagen und Forschungsfelder der Webometrie [Foundations and research fields of webometrics] (2006) 0.01
    0.0138463145 = product of:
      0.027692629 = sum of:
        0.027692629 = product of:
          0.055385258 = sum of:
            0.055385258 = weight(_text_:22 in 77) [ClassicSimilarity], result of:
              0.055385258 = score(doc=77,freq=2.0), product of:
                0.17893866 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.05109862 = queryNorm
                0.30952093 = fieldWeight in 77, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=77)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    4.12.2006 12:12:22
  18. Levitt, J.M.; Thelwall, M.: Citation levels and collaboration within library and information science (2009) 0.01
    0.012238529 = product of:
      0.024477057 = sum of:
        0.024477057 = product of:
          0.048954114 = sum of:
            0.048954114 = weight(_text_:22 in 2734) [ClassicSimilarity], result of:
              0.048954114 = score(doc=2734,freq=4.0), product of:
                0.17893866 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.05109862 = queryNorm
                0.27358043 = fieldWeight in 2734, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2734)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Collaboration is a major research policy objective, but does it deliver higher quality research? This study uses citation analysis to examine the Web of Science (WoS) Information Science & Library Science subject category (IS&LS) to ascertain whether, in general, more highly cited articles are more highly collaborative than other articles. It consists of two investigations. The first investigation is a longitudinal comparison of the degree and proportion of collaboration in five strata of citation; it found that collaboration in the highest four citation strata (all in the most highly cited 22%) increased in unison over time, whereas collaboration in the lowest citation strata (un-cited articles) remained low and stable. Given that over 40% of the articles were un-cited, it seems important to take into account the differences found between un-cited articles and relatively highly cited articles when investigating collaboration in IS&LS. The second investigation compares collaboration for 35 influential information scientists; it found that their more highly cited articles on average were not more highly collaborative than their less highly cited articles. In summary, although collaborative research is conducive to high citation in general, collaboration has apparently not tended to be essential to the success of current and former elite information scientists.
    Date
    22. 3.2009 12:43:51
  19. Thelwall, M.; Buckley, K.; Paltoglou, G.: Sentiment in Twitter events (2011) 0.01
    0.010384736 = product of:
      0.020769471 = sum of:
        0.020769471 = product of:
          0.041538943 = sum of:
            0.041538943 = weight(_text_:22 in 4345) [ClassicSimilarity], result of:
              0.041538943 = score(doc=4345,freq=2.0), product of:
                0.17893866 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.05109862 = queryNorm
                0.23214069 = fieldWeight in 4345, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=4345)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    22. 1.2011 14:27:06
  20. Thelwall, M.; Maflahi, N.: Guideline references and academic citations as evidence of the clinical value of health research (2016) 0.01
    0.010384736 = product of:
      0.020769471 = sum of:
        0.020769471 = product of:
          0.041538943 = sum of:
            0.041538943 = weight(_text_:22 in 2856) [ClassicSimilarity], result of:
              0.041538943 = score(doc=2856,freq=2.0), product of:
                0.17893866 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.05109862 = queryNorm
                0.23214069 = fieldWeight in 2856, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2856)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    19. 3.2016 12:22:00