Search (4 results, page 1 of 1)

  • author_ss:"Liu, X."
  • year_i:[2020 TO 2030}
  1. Liu, X.; Bu, Y.; Li, M.; Li, J.: Monodisciplinary collaboration disrupts science more than multidisciplinary collaboration (2024) 0.00
    0.0020296127 = product of:
      0.0040592253 = sum of:
        0.0040592253 = product of:
          0.008118451 = sum of:
            0.008118451 = weight(_text_:a in 1202) [ClassicSimilarity], result of:
              0.008118451 = score(doc=1202,freq=8.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.15287387 = fieldWeight in 1202, product of:
                  2.828427 = tf(freq=8.0), with freq of:
                    8.0 = termFreq=8.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046875 = fieldNorm(doc=1202)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
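The explain tree above is standard Lucene/Solr ClassicSimilarity (TF-IDF) output. As a minimal sketch of how its numbers compose, the snippet below recomputes the score shown for result 1 from the values printed for the term `a` in doc 1202; the tf and idf formulas in the comments are the usual ClassicSimilarity definitions, not something stated in the output itself.

```python
import math

# values taken directly from the explain output for result 1 (doc 1202)
freq = 8.0
idf = 1.153047           # ClassicSimilarity idf is roughly ln(numDocs / (docFreq + 1)) + 1
query_norm = 0.046056706
field_norm = 0.046875

tf = math.sqrt(freq)                      # 2.828427 (ClassicSimilarity tf = sqrt(freq))
query_weight = idf * query_norm           # ~0.053105544
field_weight = tf * idf * field_norm      # ~0.15287387
term_score = query_weight * field_weight  # ~0.008118451

# two coord(1/2) factors: only one of two query clauses matched at each level
final_score = term_score * 0.5 * 0.5      # ~0.0020296127, the score shown for result 1
print(final_score)
```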
    
    Abstract
    Collaboration across disciplines is a critical form of scientific collaboration for solving complex problems and making innovative contributions. This study focuses on the association between multidisciplinary collaboration, measured by coauthorship in publications, and the disruptiveness of publications, measured by the Disruption (D) index. We used authors' affiliations as a proxy for the disciplines to which they belong and categorized each article as either multidisciplinary or monodisciplinary collaboration. The D index quantifies the extent to which a study disrupts its predecessors. We selected 13 journals that publish articles in six disciplines from the Microsoft Academic Graph (MAG) database, constructed regression models with fixed effects, and estimated the relationship between the variables. The findings show that articles produced by monodisciplinary collaboration are more disruptive than those produced by multidisciplinary collaboration. Furthermore, we examined the mechanism by which monodisciplinary collaboration disrupts science more than multidisciplinary collaboration by analyzing the references of the sampled publications.
    Type
    a
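The Disruption (D) index mentioned in the abstract of result 1 is commonly computed from the citation network around a focal paper. A minimal sketch follows, using the widely cited Wu/Funk-style formulation; the exact variant the authors apply is not specified here, and the example data are invented.

```python
def disruption_index(citers_of_focal, citers_of_refs):
    """D index for one focal paper (Wu-style formulation, assumed here).

    citers_of_focal: set of papers citing the focal paper
    citers_of_refs:  set of papers citing at least one of the focal paper's references
    """
    n_i = len(citers_of_focal - citers_of_refs)  # cite the focal paper only (disruptive signal)
    n_j = len(citers_of_focal & citers_of_refs)  # cite both the focal paper and its references (consolidating)
    n_k = len(citers_of_refs - citers_of_focal)  # cite only the references, bypassing the focal paper
    total = n_i + n_j + n_k
    return (n_i - n_j) / total if total else 0.0

# toy example: 4 papers cite only the focal work, 1 cites both, 2 cite only its references
print(disruption_index({"a", "b", "c", "d", "e"}, {"e", "f", "g"}))  # (4 - 1) / 7 ≈ 0.43
```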
  2. Liu, X.; Chen, X.: Authors' noninstitutional emails and their correlation with retraction (2021) 0.00
    0.001913537 = product of:
      0.003827074 = sum of:
        0.003827074 = product of:
          0.007654148 = sum of:
            0.007654148 = weight(_text_:a in 152) [ClassicSimilarity], result of:
              0.007654148 = score(doc=152,freq=4.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.14413087 = fieldWeight in 152, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0625 = fieldNorm(doc=152)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    We collected research articles from the Retraction Watch database, Scopus, and a major retraction announcement by Springer in order to identify the email addresses used by their authors. Authors' emails can be either institutional or noninstitutional. The data suggest that retracted articles are more likely to use noninstitutional emails, although it is difficult to generalize. The study places particular focus on authors from China.
    Type
    a
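The abstract of result 2 does not say how institutional and noninstitutional emails were distinguished. A purely hypothetical heuristic (not the authors' actual method) is a domain check against free-mail providers and academic domain patterns:

```python
# hypothetical heuristic, not the authors' actual classification method
FREE_MAIL = {"gmail.com", "yahoo.com", "hotmail.com", "163.com", "qq.com", "126.com"}

def looks_institutional(email: str) -> bool:
    """Crude domain-based guess at whether an address is institutional."""
    domain = email.rsplit("@", 1)[-1].lower()
    if domain in FREE_MAIL:
        return False
    # rough pattern for university/institute domains (e.g., .edu, .ac.uk, .edu.cn)
    return domain.endswith(".edu") or ".edu." in domain or ".ac." in domain

print(looks_institutional("alice@tsinghua.edu.cn"))  # True
print(looks_institutional("bob@163.com"))            # False
```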
  3. Liu, X.; Hu, M.; Xiao, B.S.; Shao, J.: Is my doctor around me? : Investigating the impact of doctors' presence on patients' review behaviors on an online health platform (2022) 0.00
    0.0018909799 = product of:
      0.0037819599 = sum of:
        0.0037819599 = product of:
          0.0075639198 = sum of:
            0.0075639198 = weight(_text_:a in 650) [ClassicSimilarity], result of:
              0.0075639198 = score(doc=650,freq=10.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.14243183 = fieldWeight in 650, product of:
                  3.1622777 = tf(freq=10.0), with freq of:
                    10.0 = termFreq=10.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=650)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Patient-generated online reviews are well-established as an important source of information for people to evaluate doctors' quality and improve health outcomes. However, how such reviews are generated in the first place is not well examined. This study examines a hitherto unexplored social driver of online review generation: doctors' presence on online health platforms, which results in the reviewers (i.e., patients) and the reviewees (i.e., doctors) coexisting in the same medium. Drawing on the Stimulus-Organism-Response theory as an overarching framework, we advance hypotheses about the impact of doctors' presence on their patients' review behaviors, including review volume, review effort, and emotional expression. To achieve causal identification, we conduct a quasi-experiment on a large online health platform and employ propensity score matching and difference-in-differences estimation. Our findings show that doctors' presence increases their patients' review volume. Furthermore, doctors' presence motivates their patients to exert greater effort and express more positive emotions in the review text. The results also show that the presence of doctors with higher professional titles has a stronger effect on review volume than the presence of doctors with lower professional titles. Our findings offer important implications both for research and practice.
    Type
    a
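The abstract of result 3 names propensity score matching followed by difference-in-differences estimation. Below is a minimal sketch of the DiD step only, assuming the usual two-way interaction specification; the toy data and variable names are invented, and the matching step is omitted.

```python
import pandas as pd
import statsmodels.formula.api as smf

# toy panel: review counts per doctor, before/after the doctor becomes present on the platform
df = pd.DataFrame({
    "reviews": [3, 4, 8, 9, 3, 3, 4, 4],
    "treated": [1, 1, 1, 1, 0, 0, 0, 0],  # 1 = doctor present on the platform
    "post":    [0, 0, 1, 1, 0, 0, 1, 1],  # 1 = period after the presence event
})

# the coefficient on treated:post is the difference-in-differences estimate
model = smf.ols("reviews ~ treated * post", data=df).fit()
print(model.params["treated:post"])
```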
  4. Cui, Y.; Wang, Y.; Liu, X.; Wang, X.; Zhang, X.: Multidimensional scholarly citations : characterizing and understanding scholars' citation behaviors (2023) 0.00
    0.0018909799 = product of:
      0.0037819599 = sum of:
        0.0037819599 = product of:
          0.0075639198 = sum of:
            0.0075639198 = weight(_text_:a in 847) [ClassicSimilarity], result of:
              0.0075639198 = score(doc=847,freq=10.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.14243183 = fieldWeight in 847, product of:
                  3.1622777 = tf(freq=10.0), with freq of:
                    10.0 = termFreq=10.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=847)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    This study investigates scholars' citation behaviors from a fine-grained perspective. Specifically, each scholarly citation is considered multidimensional rather than logically unidimensional (i.e., present or absent). Thirty million articles from PubMed were accessed for use in empirical research, in which a total of 15 interpretable features of scholarly citations were constructed and grouped into three main categories. Each category corresponds to one aspect of the reasons and motivations behind scholars' citation decision-making during academic writing. Using about 500,000 pairs of actual and randomly generated scholarly citations, a series of Random Forest-based classification experiments were conducted to quantitatively evaluate the correlation between each constructed citation feature and citation decisions made by scholars. Our experimental results indicate that citation proximity is the category most relevant to scholars' citation decision-making, followed by citation authority and citation inertia. However, big-name scholars whose h-indexes rank among the top 1% exhibit a unique pattern of citation behaviors: their citation decision-making correlates most closely with citation inertia, with the correlation nearly three times as strong as that of their ordinary counterparts. Hopefully, the empirical findings presented in this paper can bring us closer to characterizing and understanding the complex process of generating scholarly citations in academia.
    Type
    a
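The abstract of result 4 describes Random Forest classifiers that separate actual from randomly generated citation pairs and rank feature groups by relevance. A minimal sketch with synthetic stand-in features follows; the three group names mirror the abstract's categories, the data and effect sizes are invented, and impurity-based importances stand in for whatever relevance measure the paper actually reports.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 1000

# three illustrative feature groups (stand-ins for the paper's 15 features)
X = np.column_stack([
    rng.normal(size=n),  # "proximity" (e.g., textual/venue closeness)
    rng.normal(size=n),  # "authority" (e.g., citation counts of the cited work)
    rng.normal(size=n),  # "inertia"   (e.g., prior self-/re-citation tendency)
])
# label: 1 = actual citation pair, 0 = randomly generated pair (driven by "proximity" here)
y = (X[:, 0] + 0.3 * rng.normal(size=n) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
for name, imp in zip(["proximity", "authority", "inertia"], clf.feature_importances_):
    print(f"{name}: {imp:.3f}")
```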