Search (8 results, page 1 of 1)

  • author_ss:"Haustein, S."
  • year_i:[2010 TO 2020}
  1. Lewandowski, D.; Haustein, S.: What does the German-language information science community cite? (2015) 0.01
    0.0051520485 = product of:
      0.020608194 = sum of:
        0.020608194 = weight(_text_:information in 2987) [ClassicSimilarity], result of:
          0.020608194 = score(doc=2987,freq=6.0), product of:
            0.06134496 = queryWeight, product of:
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.034944877 = queryNorm
            0.3359395 = fieldWeight in 2987, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.078125 = fieldNorm(doc=2987)
      0.25 = coord(1/4)
    
    Source
    Re:inventing information science in the networked society: Proceedings of the 14th International Symposium on Information Science, Zadar/Croatia, 19th-21st May 2015. Eds.: F. Pehar, C. Schloegl and C. Wolff
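The relevance figure on each hit is Lucene's ClassicSimilarity (TF-IDF) score, and the indented breakdown for result 1 shows its factors. A minimal sketch of that computation, assuming Lucene's standard ClassicSimilarity formulas (tf = sqrt(termFreq), idf = 1 + ln(numDocs / (docFreq + 1))), with every constant read off the explanation above:

```python
import math

def classic_similarity(freq, doc_freq, num_docs, query_norm, field_norm, coord):
    """Reconstruct one term's ClassicSimilarity score from its explanation values."""
    idf = 1 + math.log(num_docs / (doc_freq + 1))  # 1.7554779 for docFreq=20772, maxDocs=44218
    tf = math.sqrt(freq)                           # 2.4494898 for freq=6.0
    query_weight = idf * query_norm                # 0.06134496
    field_weight = tf * idf * field_norm           # 0.3359395
    return coord * query_weight * field_weight

# Values from the explanation of result 1 (_text_:information in doc 2987)
score = classic_similarity(freq=6.0, doc_freq=20772, num_docs=44218,
                           query_norm=0.034944877, field_norm=0.078125,
                           coord=0.25)
print(score)  # matches the displayed 0.0051520485 to float precision
```

The coord(1/4) factor indicates that only one of the four query terms matched this document; the remaining three contribute nothing to the sum.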
  2. Haustein, S.: Scientific interactions and research evaluation : from bibliometrics to Altmetrics (2015) 0.00
    Source
    Re:inventing information science in the networked society: Proceedings of the 14th International Symposium on Information Science, Zadar/Croatia, 19th-21st May 2015. Eds.: F. Pehar, C. Schloegl and C. Wolff
  3. Haustein, S.; Tunger, D.: Sziento- und bibliometrische Verfahren [Sciento- and bibliometric methods] (2013) 0.00
    Source
    Grundlagen der praktischen Information und Dokumentation. Handbuch zur Einführung in die Informationswissenschaft und -praxis. 6th, completely revised edition. Ed. by R. Kuhlen, W. Semar and D. Strauch. Founded by Klaus Laisiepen, Ernst Lutterbeck and Karl-Heinrich Meyer-Uhlenried
  4. Haustein, S.; Bowman, T.D.; Holmberg, K.; Tsou, A.; Sugimoto, C.R.; Larivière, V.: Tweets as impact indicators : Examining the implications of automated "bot" accounts on Twitter (2016) 0.00
    Abstract
    This brief communication presents preliminary findings on automated Twitter accounts that distribute links to scientific articles deposited in the preprint repository arXiv. It discusses the implications of the presence of such bots from the perspective of social media metrics (altmetrics), where mentions of scholarly documents on Twitter have been suggested as a means of measuring impact that is both broader and timelier than citations. Our results show that automated Twitter accounts produce a considerable number of tweets linking to scientific articles and that they behave differently from common social bots, which has critical implications for the use of raw tweet counts in research evaluation and assessment. We discuss some definitions of Twitter cyborgs and bots in scholarly communication and propose distinguishing between different levels of engagement, that is, differentiating between tweeting only bibliographic information and discussing or commenting on the content of a scientific work.
    Source
    Journal of the Association for Information Science and Technology. 67(2016) no.1, S.232-238
  5. Haustein, S.; Peters, I.; Sugimoto, C.R.; Thelwall, M.; Larivière, V.: Tweeting biomedicine : an analysis of tweets and citations in the biomedical literature (2014) 0.00
    Abstract
    Data collected by social media platforms have been introduced as new sources for indicators to help measure the impact of scholarly research in ways that are complementary to traditional citation analysis. Data generated from social media activities can be used to reflect broad types of impact. This article aims to provide systematic evidence about how often Twitter is used to disseminate information about journal articles in the biomedical sciences. The analysis is based on 1.4 million documents covered by both PubMed and Web of Science and published between 2010 and 2012. The number of tweets containing links to these documents was analyzed and compared to citations to evaluate the degree to which certain journals, disciplines, and specialties were represented on Twitter and how far tweets correlate with citation impact. With less than 10% of PubMed articles mentioned on Twitter, its uptake is low in general but differs between journals and specialties. Correlations between tweets and citations are low, implying that impact metrics based on tweets are different from those based on citations. A framework using the coverage of articles and the correlation between Twitter mentions and citations is proposed to facilitate the evaluation of novel social-media-based metrics.
    Source
    Journal of the Association for Information Science and Technology. 65(2014) no.4, S.656-669
  6. Mohammadi, E.; Thelwall, M.; Haustein, S.; Larivière, V.: Who reads research articles? : an altmetrics analysis of Mendeley user categories (2015) 0.00
    Abstract
    Little detailed information is known about who reads research articles and the contexts in which they are read. Using data about people who register in Mendeley as readers of articles, this article explores different types of users of Clinical Medicine, Engineering and Technology, Social Science, Physics, and Chemistry articles inside and outside academia. The majority of readers for all disciplines were PhD students, postgraduates, and postdocs, but other types of academics were also represented. In addition, many Clinical Medicine articles were read by medical professionals. The highest correlations between citations and Mendeley readership counts were found for types of users who often authored academic articles, except for associate professors in some sub-disciplines. This suggests that Mendeley readership can reflect usage similar to traditional citation impact, if the data are restricted to readers who are also authors, without the delay of impact measured by citation counts. At the same time, Mendeley statistics can also reveal the hidden impact of some research articles, such as educational value for nonauthor users inside academia or the impact of research articles on practice for readers outside academia.
    Source
    Journal of the Association for Information Science and Technology. 66(2015) no.9, S.1832-1846
  7. Haustein, S.; Sugimoto, C.; Larivière, V.: Social media in scholarly communication : Guest editorial (2015) 0.00
    Abstract
    This year marks 350 years since the inaugural publications of both the Journal des Sçavans and the Philosophical Transactions, first published in 1665 and considered the birth of the peer-reviewed journal article. This form of scholarly communication has not only remained the dominant model for disseminating new knowledge (particularly for science and medicine), but has also increased substantially in volume. Derek de Solla Price - the "father of scientometrics" (Merton and Garfield, 1986, p. vii) - was the first to document the exponential increase in scientific journals and showed that "scientists have always felt themselves to be awash in a sea of the scientific literature" (Price, 1963, p. 15), as, for example, expressed at the 1948 Royal Society's Scientific Information Conference: "Not for the first time in history, but more acutely than ever before, there was a fear that scientists would be overwhelmed, that they would be no longer able to control the vast amounts of potentially relevant material that were pouring forth from the world's presses, that science itself was under threat" (Bawden and Robinson, 2008, p. 183).
    One of the solutions to help scientists filter the most relevant publications and, thus, to stay current on developments in their fields during the transition from "little science" to "big science", was the introduction of citation indexing as a Wellsian "World Brain" (Garfield, 1964) of scientific information: It is too much to expect a research worker to spend an inordinate amount of time searching for the bibliographic descendants of antecedent papers. It would not be excessive to demand that the thorough scholar check all papers that have cited or criticized such papers, if they could be located quickly. The citation index makes this check practicable (Garfield, 1955, p. 108). In retrospect, citation indexing can be perceived as a pre-social web version of crowdsourcing, as it is based on the concept that the community of citing authors outperforms indexers in highlighting cognitive links between papers, particularly on the level of specific ideas and concepts (Garfield, 1983). Over the last 50 years, citation analysis and, more generally, bibliometric methods have developed from information retrieval tools to research evaluation metrics, where they are presumed to make scientific funding more efficient and effective (Moed, 2006). However, the dominance of bibliometric indicators in research evaluation has also led to significant goal displacement (Merton, 1957) and the oversimplification of notions of "research productivity" and "scientific quality", creating adverse effects such as salami publishing, honorary authorships, citation cartels, and misuse of indicators (Binswanger, 2015; Cronin and Sugimoto, 2014; Frey and Osterloh, 2006; Haustein and Larivière, 2015; Weingart, 2005).
    Furthermore, the rise of the web, and subsequently, the social web, has challenged the quasi-monopolistic status of the journal as the main form of scholarly communication and of citation indices as the primary assessment mechanisms. Scientific communication is becoming more open, transparent, and diverse: publications are increasingly open access; manuscripts, presentations, code, and data are shared online; research ideas and results are discussed and criticized openly on blogs; and new peer review experiments, with open post-publication assessment by anonymous or non-anonymous referees, are underway. The diversification of scholarly production and assessment, paired with the increasing speed of the communication process, leads to an increased information overload (Bawden and Robinson, 2008), demanding new filters. The concept of altmetrics, short for alternative (to citation) metrics, was created out of an attempt to provide a filter (Priem et al., 2010) and to steer against the oversimplification of the measurement of scientific success solely on the basis of the number of journal articles published and citations received, by considering a wider range of research outputs and metrics (Piwowar, 2013). Although the term altmetrics was introduced in a tweet in 2010 (Priem, 2010), the idea of capturing traces - "polymorphous mentioning" (Cronin et al., 1998, p. 1320) - of scholars and their documents on the web to measure the "impact" of science in a broader manner than citations was introduced years before, largely in the context of webometrics (Almind and Ingwersen, 1997; Thelwall et al., 2005):
    Source
    Aslib journal of information management. 67(2015) no.3, S.260-288
  8. Sugimoto, C.R.; Work, S.; Larivière, V.; Haustein, S.: Scholarly use of social media and altmetrics : A review of the literature (2017) 0.00
    Source
    Journal of the Association for Information Science and Technology. 68(2017) no.9, S.2037-2062