Search (3 results, page 1 of 1)

  • author_ss:"Wang, Y."
  • language_ss:"e"
  • year_i:[2020 TO 2030}
  1. Xie, B.; He, D.; Mercer, T.; Wang, Y.; Wu, D.; Fleischmann, K.R.; Zhang, Y.; Yoder, L.H.; Stephens, K.K.; Mackert, M.; Lee, M.K.: Global health crises are also information crises : a call to action (2020) 0.00
    0.0023678814 = product of:
      0.0047357627 = sum of:
        0.0047357627 = product of:
          0.009471525 = sum of:
            0.009471525 = weight(_text_:a in 32) [ClassicSimilarity], result of:
              0.009471525 = score(doc=32,freq=8.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.17835285 = fieldWeight in 32, product of:
                  2.828427 = tf(freq=8.0), with freq of:
                    8.0 = termFreq=8.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=32)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
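     The nested figures above are Lucene's ClassicSimilarity (TF-IDF) explain output for this hit's relevance score. As an editorial illustration, not part of the original result listing, the following minimal Python sketch reproduces the arithmetic using only the values shown in the tree:

     # Sketch of the ClassicSimilarity arithmetic behind hit 1 (doc 32);
     # all constants are copied from the explain tree above.
     import math

     freq = 8.0                    # termFreq of "a" in the matched field
     idf = 1.153047                # idf(docFreq=37942, maxDocs=44218)
     query_norm = 0.046056706      # queryNorm
     field_norm = 0.0546875        # fieldNorm(doc=32)

     tf = math.sqrt(freq)                      # 2.828427 = tf(freq=8.0)
     query_weight = idf * query_norm           # queryWeight (0.053105544 in the tree)
     field_weight = tf * idf * field_norm      # fieldWeight (0.17835285 in the tree)
     term_score = query_weight * field_weight  # 0.009471525

     # coord(1/2) = 0.5, reflecting one matching clause out of two, is
     # applied at both nesting levels of the explain tree.
     score = term_score * 0.5 * 0.5            # ~ 0.0023678814
     print(score)

     The explain trees for hits 2 and 3 decompose the same way, differing only in term frequency and field norm.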
    
    Abstract
    In this opinion paper, we argue that global health crises are also information crises. Using as an example the coronavirus disease 2019 (COVID-19) epidemic, we (a) examine challenges associated with what we term "global information crises"; (b) recommend changes needed for the field of information science to play a leading role in such crises; and (c) propose actionable items for short- and long-term research, education, and practice in information science.
    Type
    a
  2. Cui, Y.; Wang, Y.; Liu, X.; Wang, X.; Zhang, X.: Multidimensional scholarly citations : characterizing and understanding scholars' citation behaviors (2023) 0.00
    0.0018909799 = product of:
      0.0037819599 = sum of:
        0.0037819599 = product of:
          0.0075639198 = sum of:
            0.0075639198 = weight(_text_:a in 847) [ClassicSimilarity], result of:
              0.0075639198 = score(doc=847,freq=10.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.14243183 = fieldWeight in 847, product of:
                  3.1622777 = tf(freq=10.0), with freq of:
                    10.0 = termFreq=10.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=847)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
     This study investigates scholars' citation behaviors from a fine-grained perspective. Specifically, each scholarly citation is considered multidimensional rather than logically unidimensional (i.e., present or absent). Thirty million articles from PubMed were accessed for use in empirical research, in which a total of 15 interpretable features of scholarly citations were constructed and grouped into three main categories. Each category corresponds to one aspect of the reasons and motivations behind scholars' citation decision-making during academic writing. Using about 500,000 pairs of actual and randomly generated scholarly citations, a series of Random Forest-based classification experiments were conducted to quantitatively evaluate the correlation between each constructed citation feature and citation decisions made by scholars. Our experimental results indicate that citation proximity is the category most relevant to scholars' citation decision-making, followed by citation authority and citation inertia. However, big-name scholars whose h-indexes rank among the top 1% exhibit a unique pattern of citation behaviors: their citation decision-making correlates most closely with citation inertia, with the correlation nearly three times as strong as that of their ordinary counterparts. Hopefully, the empirical findings presented in this paper can bring us closer to characterizing and understanding the complex process of generating scholarly citations in academia.
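     The Random Forest setup described above can be pictured with a small, purely hypothetical Python sketch; the feature names, data, and labels below are invented for illustration and are not the paper's actual features or results:

     # Hypothetical sketch: classify actual vs. randomly generated citation
     # pairs with a Random Forest and read off relative feature relevance.
     import numpy as np
     from sklearn.ensemble import RandomForestClassifier

     rng = np.random.default_rng(0)
     n = 10_000                               # stand-in for the ~500,000 pairs
     # Three invented feature groups echoing the abstract's categories.
     X = rng.random((n, 3))                   # proximity, authority, inertia
     # 1 = actual citation, 0 = randomly generated (labels synthesized here).
     y = (X[:, 0] + 0.5 * X[:, 1] + 0.2 * X[:, 2]
          + rng.normal(0, 0.3, n) > 0.85).astype(int)

     clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
     for name, imp in zip(["proximity", "authority", "inertia"],
                          clf.feature_importances_):
         print(f"{name}: {imp:.3f}")          # relative relevance per group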
    Type
    a
  3. Wang, Y.; Shah, C.: Authentic versus synthetic : an investigation of the influences of study settings and task configurations on search behaviors (2022) 0.00
    0.0011959607 = product of:
      0.0023919214 = sum of:
        0.0023919214 = product of:
          0.0047838427 = sum of:
            0.0047838427 = weight(_text_:a in 495) [ClassicSimilarity], result of:
              0.0047838427 = score(doc=495,freq=4.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.090081796 = fieldWeight in 495, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=495)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    In information seeking and retrieval research, researchers often collect data about users' behaviors to predict task characteristics and personalize information for users. The reliability of user behavior may be directly influenced by data collection methods. This article reports on a mixed-methods study examining the impact of study setting (laboratory setting vs. remote setting) and task authenticity (authentic task vs. simulated task) on users' online browsing and searching behaviors. Thirty-six undergraduate participants finished one lab session and one remote session in which they completed one authentic and one simulated task. Using log data collected from 144 task sessions, this study demonstrates that the synthetic lab study setting and simulated tasks had significant influences mostly on behaviors related to content pages (e.g., page dwell time, number of pages visited per task). Meanwhile, first-query behaviors were less affected by study settings or task authenticity than whole-session behaviors, indicating the reliability of using first-query behaviors in task prediction. Qualitative interviews reveal why users were influenced. This study addresses methodological limitations in existing research and provides new insights and implications for researchers who collect online user search behavioral data.
    Type
    a