Search (11 results, page 1 of 1)

  • author_ss:"Sugimoto, C.R."
  1. Meho, L.I.; Sugimoto, C.R.: Assessing the scholarly impact of information studies : a tale of two citation databases - Scopus and Web of Science (2009) 0.02
    0.01676857 = product of:
      0.06707428 = sum of:
        0.06707428 = product of:
          0.13414855 = sum of:
            0.13414855 = weight(_text_:assessment in 3298) [ClassicSimilarity], result of:
              0.13414855 = score(doc=3298,freq=4.0), product of:
                0.25917634 = queryWeight, product of:
                  5.52102 = idf(docFreq=480, maxDocs=44218)
                  0.04694356 = queryNorm
                0.51759565 = fieldWeight in 3298, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  5.52102 = idf(docFreq=480, maxDocs=44218)
                  0.046875 = fieldNorm(doc=3298)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Abstract
    This study uses citations, from 1996 to 2007, to the work of 80 randomly selected full-time information studies (IS) faculty members from North America to examine differences between Scopus and Web of Science in assessing the scholarly impact of the field, focusing on the most frequently citing journals, conference proceedings, research domains, and institutions, as well as all citing countries. Results show that when assessment is limited to smaller citing entities (e.g., journals, conference proceedings, institutions), the two databases produce considerably different results, whereas when assessment is limited to larger citing entities (e.g., research domains, countries), they produce very similar pictures of scholarly impact. In the former case, the use of Scopus (for journals and institutions) and of both Scopus and Web of Science (for conference proceedings) is necessary to assess or visualize the scholarly impact of IS more accurately, whereas in the latter case, assessing or visualizing the scholarly impact of IS is independent of the database used.
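    The nested score breakdowns in this listing are Lucene "explain" output under ClassicSimilarity (TF-IDF). As a sketch, the figures shown for results 1 and 2 can be reproduced from the quantities in the trees, assuming Lucene's classic formulas tf = sqrt(freq) and idf = 1 + ln(maxDocs/(docFreq+1)); the coord factors are taken directly from the listing.

```python
import math

def classic_similarity_score(freq, doc_freq, max_docs,
                             query_norm, field_norm, coord_factors=()):
    """Reproduce one term's weight from a Lucene ClassicSimilarity explain tree."""
    tf = math.sqrt(freq)                               # tf(freq)
    idf = 1.0 + math.log(max_docs / (doc_freq + 1.0))  # idf(docFreq, maxDocs)
    query_weight = idf * query_norm                    # queryWeight
    field_weight = tf * idf * field_norm               # fieldWeight
    score = query_weight * field_weight                # weight(_text_:term in doc)
    for c in coord_factors:                            # e.g. coord(1/2), coord(1/4)
        score *= c
    return score

# Result 1: weight(_text_:assessment in 3298), freq=4, then coord(1/2) * coord(1/4)
s1 = classic_similarity_score(4.0, 480, 44218, 0.04694356, 0.046875, (0.5, 0.25))
# Result 2: the same term with freq=2
s2 = classic_similarity_score(2.0, 480, 44218, 0.04694356, 0.046875, (0.5, 0.25))
print(round(s1, 7), round(s2, 7))  # ≈ 0.0167686 and 0.0118572, as in the listing
```

    The same function reproduces every tree on this page by swapping in the freq, docFreq, and fieldNorm values each entry reports.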
  2. Sugimoto, C.R.; Weingart, S.: The kaleidoscope of disciplinarity (2015) 0.01
    0.011857167 = product of:
      0.047428668 = sum of:
        0.047428668 = product of:
          0.094857335 = sum of:
            0.094857335 = weight(_text_:assessment in 2141) [ClassicSimilarity], result of:
              0.094857335 = score(doc=2141,freq=2.0), product of:
                0.25917634 = queryWeight, product of:
                  5.52102 = idf(docFreq=480, maxDocs=44218)
                  0.04694356 = queryNorm
                0.36599535 = fieldWeight in 2141, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.52102 = idf(docFreq=480, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2141)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Abstract
    Purpose: The purpose of this paper is to identify criteria for and definitions of disciplinarity, and how they differ between different types of literature. Design/methodology/approach: This synthesis is achieved through a purposive review of three types of literature: explicit conceptualizations of disciplinarity; narrative histories of disciplines; and operationalizations of disciplinarity. Findings: Each angle of discussing disciplinarity presents distinct criteria. However, there are a few common axes upon which conceptualizations, disciplinary narratives, and measurements revolve: communication, social features, topical coherence, and institutions. Originality/value: There is considerable ambiguity in the concept of a discipline. This is of particular concern in a heightened assessment culture, where decisions about funding and resource allocation are often discipline-dependent (or focussed exclusively on interdisciplinary endeavors). This work explores the varied nature of disciplinarity and, through synthesis of the literature, presents a framework of criteria that can be used to guide science policy makers, scientometricians, administrators, and others interested in defining, constructing, and evaluating disciplines.
  3. Haustein, S.; Bowman, T.D.; Holmberg, K.; Tsou, A.; Sugimoto, C.R.; Larivière, V.: Tweets as impact indicators : Examining the implications of automated "bot" accounts on Twitter (2016) 0.01
    0.011857167 = product of:
      0.047428668 = sum of:
        0.047428668 = product of:
          0.094857335 = sum of:
            0.094857335 = weight(_text_:assessment in 2502) [ClassicSimilarity], result of:
              0.094857335 = score(doc=2502,freq=2.0), product of:
                0.25917634 = queryWeight, product of:
                  5.52102 = idf(docFreq=480, maxDocs=44218)
                  0.04694356 = queryNorm
                0.36599535 = fieldWeight in 2502, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.52102 = idf(docFreq=480, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2502)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Abstract
    This brief communication presents preliminary findings on automated Twitter accounts distributing links to scientific articles deposited in the preprint repository arXiv. It discusses the implications of the presence of such bots from the perspective of social media metrics (altmetrics), where mentions of scholarly documents on Twitter have been suggested as a means of measuring impact that is both broader and timelier than citations. Our results show that automated Twitter accounts generate a considerable number of tweets linking to scientific articles and that they behave differently from common social bots, which has critical implications for the use of raw tweet counts in research evaluation and assessment. We discuss some definitions of Twitter cyborgs and bots in scholarly communication and propose distinguishing between different levels of engagement, that is, differentiating between merely tweeting bibliographic information and discussing or commenting on the content of a scientific work.
  4. Haustein, S.; Peters, I.; Sugimoto, C.R.; Thelwall, M.; Larivière, V.: Tweeting biomedicine : an analysis of tweets and citations in the biomedical literature (2014) 0.00
    0.0029427784 = product of:
      0.011771114 = sum of:
        0.011771114 = product of:
          0.047084454 = sum of:
            0.047084454 = weight(_text_:based in 1229) [ClassicSimilarity], result of:
              0.047084454 = score(doc=1229,freq=8.0), product of:
                0.14144066 = queryWeight, product of:
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.04694356 = queryNorm
                0.33289194 = fieldWeight in 1229, product of:
                  2.828427 = tf(freq=8.0), with freq of:
                    8.0 = termFreq=8.0
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1229)
          0.25 = coord(1/4)
      0.25 = coord(1/4)
    
    Abstract
    Data collected by social media platforms have been introduced as new sources for indicators to help measure the impact of scholarly research in ways that are complementary to traditional citation analysis. Data generated from social media activities can be used to reflect broad types of impact. This article aims to provide systematic evidence about how often Twitter is used to disseminate information about journal articles in the biomedical sciences. The analysis is based on 1.4 million documents covered by both PubMed and Web of Science and published between 2010 and 2012. The number of tweets containing links to these documents was analyzed and compared to citations to evaluate the degree to which certain journals, disciplines, and specialties were represented on Twitter and how far tweets correlate with citation impact. With less than 10% of PubMed articles mentioned on Twitter, its uptake is low in general but differs between journals and specialties. Correlations between tweets and citations are low, implying that impact metrics based on tweets are different from those based on citations. A framework using the coverage of articles and the correlation between Twitter mentions and citations is proposed to facilitate the evaluation of novel social-media-based metrics.
  5. Milojevic, S.; Sugimoto, C.R.; Yan, E.; Ding, Y.: ¬The cognitive structure of Library and Information Science : analysis of article title words (2011) 0.00
    0.0020808585 = product of:
      0.008323434 = sum of:
        0.008323434 = product of:
          0.033293735 = sum of:
            0.033293735 = weight(_text_:based in 4608) [ClassicSimilarity], result of:
              0.033293735 = score(doc=4608,freq=4.0), product of:
                0.14144066 = queryWeight, product of:
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.04694356 = queryNorm
                0.23539014 = fieldWeight in 4608, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4608)
          0.25 = coord(1/4)
      0.25 = coord(1/4)
    
    Abstract
    This study comprises a suite of analyses of words in article titles in order to reveal the cognitive structure of Library and Information Science (LIS). The use of title words to elucidate the cognitive structure of LIS has been relatively neglected. The present study addresses this gap by performing (a) co-word analysis and hierarchical clustering, (b) multidimensional scaling, and (c) determination of trends in usage of terms. The study is based on 10,344 articles published between 1988 and 2007 in 16 LIS journals. Methodologically, novel aspects of this study are: (a) its large scale, (b) removal of non-specific title words based on the "word concentration" measure, (c) identification of the most frequent terms that include both single words and phrases, and (d) presentation of the relative frequencies of terms using "heatmaps". Conceptually, our analysis reveals that LIS consists of three main branches: the traditionally recognized library-related and information-related branches, plus an equally distinct bibliometrics/scientometrics branch. The three branches focus on: libraries, information, and science, respectively. In addition, our study identifies substructures within each branch. We also tentatively identify "information seeking behavior" as a branch that is establishing itself separate from the three main branches. Furthermore, we find that cognitive concepts in LIS evolve continuously, with no stasis since 1992. The most rapid development occurred between 1998 and 2001, influenced by the increased focus on the Internet. The change in the cognitive landscape is found to be driven by the emergence of new information technologies, and the retirement of old ones.
  6. Demarest, B.; Sugimoto, C.R.: Argue, observe, assess : measuring disciplinary identities and differences through socio-epistemic discourse (2015) 0.00
    0.0020808585 = product of:
      0.008323434 = sum of:
        0.008323434 = product of:
          0.033293735 = sum of:
            0.033293735 = weight(_text_:based in 2039) [ClassicSimilarity], result of:
              0.033293735 = score(doc=2039,freq=4.0), product of:
                0.14144066 = queryWeight, product of:
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.04694356 = queryNorm
                0.23539014 = fieldWeight in 2039, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2039)
          0.25 = coord(1/4)
      0.25 = coord(1/4)
    
    Abstract
    Calls for interdisciplinary collaboration have become increasingly common in the face of large-scale complex problems (including climate change, economic inequality, and education, among others); however, outcomes of such collaborations have been mixed, due, among other things, to the so-called "translation problem" in interdisciplinary research. This article presents a potential solution: an empirical approach to quantitatively measure both the degree and nature of differences among disciplinary tongues through the social and epistemic terms used (a research area we refer to as discourse epistemetrics), in a case study comparing dissertations in philosophy, psychology, and physics. Using a support-vector model of machine learning to classify disciplines based on relative frequencies of social and epistemic terms, we were able to markedly improve accuracy over a random selection baseline (distinguishing between disciplines with as high as 90% accuracy) as well as acquire sets of most indicative terms for each discipline by their relative presence or absence. These lists were then considered in light of findings of sociological and epistemological studies of disciplines and found to validate the approach's measure of social and epistemic disciplinary identities and contrasts. Based on the findings of our study, we conclude by considering the beneficiaries of research in this area, including bibliometricians, students, and science policy makers, among others, as well as laying out a research program that expands the number of disciplines, considers shifts in socio-epistemic identities over time and applies these methods to nonacademic epistemological communities (e.g., political groups).
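    Demarest and Sugimoto classify disciplines from the relative frequencies of social and epistemic terms using a support-vector model. As a dependency-free stand-in for the SVM, the sketch below uses nearest-centroid classification on the same kind of feature vector; the term list and centroid numbers are hypothetical illustrations, not values from the dissertation corpus.

```python
from collections import Counter

# Hypothetical social/epistemic marker terms (echoing the article's title).
TERMS = ["argue", "observe", "assess"]

# Hypothetical per-discipline centroids of relative term frequencies.
CENTROIDS = {
    "philosophy": [0.70, 0.10, 0.20],
    "physics":    [0.10, 0.70, 0.20],
    "psychology": [0.15, 0.25, 0.60],
}

def relative_frequencies(tokens):
    """Relative frequencies of the marker terms in a tokenized document."""
    counts = Counter(t for t in tokens if t in TERMS)
    total = sum(counts.values()) or 1
    return [counts[t] / total for t in TERMS]

def classify(tokens):
    """Assign the discipline whose centroid is nearest (squared Euclidean)."""
    vec = relative_frequencies(tokens)
    def dist(discipline):
        return sum((v - c) ** 2 for v, c in zip(vec, CENTROIDS[discipline]))
    return min(CENTROIDS, key=dist)

doc = "we argue that the premise fails and argue further that".split()
print(classify(doc))  # → philosophy
```

    A real replication would fit the classifier (SVM or otherwise) on labeled dissertations rather than hand-set centroids; the point here is only the feature representation.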
  7. Yan, E.; Ding, Y.; Sugimoto, C.R.: P-Rank: an indicator measuring prestige in heterogeneous scholarly networks (2011) 0.00
    0.0017656671 = product of:
      0.0070626684 = sum of:
        0.0070626684 = product of:
          0.028250674 = sum of:
            0.028250674 = weight(_text_:based in 4349) [ClassicSimilarity], result of:
              0.028250674 = score(doc=4349,freq=2.0), product of:
                0.14144066 = queryWeight, product of:
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.04694356 = queryNorm
                0.19973516 = fieldWeight in 4349, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.046875 = fieldNorm(doc=4349)
          0.25 = coord(1/4)
      0.25 = coord(1/4)
    
    Abstract
    Rankings of scientific productivity and prestige are often limited to homogeneous networks. These networks are unable to account for the multiple factors that constitute the scholarly communication and reward system. This study proposes a new informetric indicator, P-Rank, for measuring prestige in heterogeneous scholarly networks containing articles, authors, and journals. P-Rank differentiates the weight of each citation based on its citing papers, citing journals, and citing authors. Articles from 16 representative library and information science journals are selected as the dataset. Principal Component Analysis is conducted to examine the relationship between P-Rank and other bibliometric indicators. We also compare the correlation and rank variances between citation counts and P-Rank scores. This work provides a new approach to examining prestige in scholarly communication networks in a more comprehensive and nuanced way.
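    The abstract's key contrast is between raw citation counts and citations weighted by the prestige of the citing paper, journal, and author. The toy sketch below only illustrates that contrast with hypothetical prestige scores; it is not the paper's P-Rank formulation, which is computed iteratively over the full heterogeneous network.

```python
def weighted_citation_score(citations):
    """Sum of per-citation weights, each the mean of three (hypothetical)
    prestige scores: citing paper, citing journal, citing author."""
    return sum((paper + journal + author) / 3.0
               for paper, journal, author in citations)

# Two papers with three citations each: raw counts tie, weighted scores do not.
paper_x = [(0.9, 0.8, 0.7), (0.6, 0.5, 0.4), (0.3, 0.2, 0.1)]  # prestigious citers
paper_y = [(0.2, 0.2, 0.2), (0.3, 0.3, 0.3), (0.1, 0.1, 0.1)]  # obscure citers
print(weighted_citation_score(paper_x), weighted_citation_score(paper_y))
```

    Under this weighting paper_x outranks paper_y despite identical citation counts, which is the kind of distinction a homogeneous count-based indicator cannot make.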
  8. Yan, E.; Sugimoto, C.R.: Institutional interactions : exploring social, cognitive, and geographic relationships between institutions as demonstrated through citation networks (2011) 0.00
    0.0017656671 = product of:
      0.0070626684 = sum of:
        0.0070626684 = product of:
          0.028250674 = sum of:
            0.028250674 = weight(_text_:based in 4627) [ClassicSimilarity], result of:
              0.028250674 = score(doc=4627,freq=2.0), product of:
                0.14144066 = queryWeight, product of:
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.04694356 = queryNorm
                0.19973516 = fieldWeight in 4627, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.046875 = fieldNorm(doc=4627)
          0.25 = coord(1/4)
      0.25 = coord(1/4)
    
    Abstract
    The objective of this research is to examine the interaction of institutions, based on their citation and collaboration networks. The domain of library and information science is examined, using data from 1965-2010. A linear model is formulated to explore the factors that are associated with institutional citation behaviors, using the number of citations as the dependent variable, and the number of collaborations, physical distance, and topical distance as independent variables. It is found that institutional citation behaviors are associated with social, topical, and geographical factors. Dynamically, the number of citations is becoming more associated with collaboration intensity and less dependent on the country boundary and/or physical distance. This research is informative for scientometricians and policy makers.
  9. Sugimoto, C.R.; Work, S.; Larivière, V.; Haustein, S.: Scholarly use of social media and altmetrics : A review of the literature (2017) 0.00
    0.0017656671 = product of:
      0.0070626684 = sum of:
        0.0070626684 = product of:
          0.028250674 = sum of:
            0.028250674 = weight(_text_:based in 3781) [ClassicSimilarity], result of:
              0.028250674 = score(doc=3781,freq=2.0), product of:
                0.14144066 = queryWeight, product of:
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.04694356 = queryNorm
                0.19973516 = fieldWeight in 3781, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.046875 = fieldNorm(doc=3781)
          0.25 = coord(1/4)
      0.25 = coord(1/4)
    
    Abstract
    Social media has become integrated into the fabric of the scholarly communication system in fundamental ways, principally through scholarly use of social media platforms and the promotion of new indicators based on interactions with these platforms. Research and scholarship in this area have accelerated since the coining of, and subsequent advocacy for, altmetrics, that is, research indicators based on social media activity. This review provides an extensive account of the state of the art in both scholarly use of social media and altmetrics. The review consists of two main parts: the first examines the use of social media in academia, reviewing the various functions these platforms serve in the scholarly communication process and the factors that affect this use. The second part reviews empirical studies of altmetrics, discussing the various interpretations of altmetrics, data collection and methodological limitations, and differences according to platform. The review ends with a critical discussion of the implications of this transformation of the scholarly communication system.
  10. Ni, C.; Sugimoto, C.R.; Jiang, J.: Venue-author-coupling : a measure for identifying disciplines through author communities (2013) 0.00
    0.0014713892 = product of:
      0.005885557 = sum of:
        0.005885557 = product of:
          0.023542227 = sum of:
            0.023542227 = weight(_text_:based in 607) [ClassicSimilarity], result of:
              0.023542227 = score(doc=607,freq=2.0), product of:
                0.14144066 = queryWeight, product of:
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.04694356 = queryNorm
                0.16644597 = fieldWeight in 607, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=607)
          0.25 = coord(1/4)
      0.25 = coord(1/4)
    
    Abstract
    Conceptualizations of disciplinarity often focus on the social aspects of disciplines; that is, disciplines are defined by the set of individuals who participate in their activities and communications. However, operationalizations of disciplinarity often demarcate the boundaries of disciplines by standard classification schemes, which may be inflexible to changes in the participation profile of that discipline. To address this limitation, a metric called venue-author-coupling (VAC) is proposed and illustrated using journals from the Journal Citation Reports (JCR) library science and information science category. As JCR categories are among the most frequently used in bibliometric analyses, this allows for an examination of the extent to which the journals in JCR categories can be considered proxies for disciplines. By extending the idea of bibliographic coupling, VAC identifies similarities among journals based on the similarities of their author profiles. Applying this method to information science and library science journals provides evidence of four distinct subfields: management information systems, specialized information and library science, library science-focused research, and information science-focused research. The proposed VAC method provides a novel way to examine disciplinarity from the perspective of author communities.
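    Venue-author-coupling measures journal similarity through overlapping author profiles, extending bibliographic coupling from shared references to shared authors. A minimal sketch of that idea, assuming cosine similarity over per-venue author publication counts (the paper's exact weighting may differ); the journal and author data below are hypothetical.

```python
import math
from collections import Counter

def venue_author_coupling(authors_a, authors_b):
    """Cosine similarity between two venues' author publication profiles.

    Each argument is a list with one entry per paper, naming its author;
    venues are similar to the extent that the same authors publish in both.
    """
    va, vb = Counter(authors_a), Counter(authors_b)
    dot = sum(va[a] * vb[a] for a in va.keys() & vb.keys())
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

# Hypothetical author profiles for two journals.
journal_a = ["smith", "smith", "jones", "lee"]
journal_b = ["smith", "lee", "lee", "chen"]
print(round(venue_author_coupling(journal_a, journal_b), 4))
```

    Pairwise VAC scores over a whole journal set would then feed a clustering step to surface author-community subfields like the four reported in the abstract.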
  11. Kelly, D.; Sugimoto, C.R.: A systematic review of interactive information retrieval evaluation studies, 1967-2006 (2013) 0.00
    0.0014713892 = product of:
      0.005885557 = sum of:
        0.005885557 = product of:
          0.023542227 = sum of:
            0.023542227 = weight(_text_:based in 684) [ClassicSimilarity], result of:
              0.023542227 = score(doc=684,freq=2.0), product of:
                0.14144066 = queryWeight, product of:
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.04694356 = queryNorm
                0.16644597 = fieldWeight in 684, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=684)
          0.25 = coord(1/4)
      0.25 = coord(1/4)
    
    Abstract
    With the increasing number and diversity of search tools available, interest in the evaluation of search systems, particularly from a user perspective, has grown among researchers. More researchers are designing and evaluating interactive information retrieval (IIR) systems and beginning to innovate in evaluation methods. Maturation of a research specialty relies on the ability to replicate research, provide standards for measurement and analysis, and understand past endeavors. This article presents a historical overview of 40 years of IIR evaluation studies using the method of systematic review. A total of 2,791 journal and conference units were manually examined and 127 articles were selected for analysis in this study, based on predefined inclusion and exclusion criteria. These articles were systematically coded using features such as author, publication date, sources and references, and properties of the research method used in the articles, such as number of subjects, tasks, corpora, and measures. Results include data describing the growth of IIR studies over time, the most frequently occurring and cited authors and sources, and the most common types of corpora and measures used. An additional product of this research is a bibliography of IIR evaluation research that can be used by students, teachers, and those new to the area. To the authors' knowledge, this is the first historical, systematic characterization of the IIR evaluation literature, including the documentation of methods and measures used by researchers in this specialty.