Search (2 results, page 1 of 1)

  • × author_ss:"Rubin, V.L."
  • × year_i:[2010 TO 2020}
  1. Rubin, V.L.: Disinformation and misinformation triangle (2019) 0.04
    0.043228656 = product of:
      0.08645731 = sum of:
        0.08645731 = product of:
          0.17291462 = sum of:
            0.17291462 = weight(_text_:news in 5462) [ClassicSimilarity], result of:
              0.17291462 = score(doc=5462,freq=10.0), product of:
                0.26705483 = queryWeight, product of:
                  5.2416887 = idf(docFreq=635, maxDocs=44218)
                  0.05094824 = queryNorm
                0.64748734 = fieldWeight in 5462, product of:
                  3.1622777 = tf(freq=10.0), with freq of:
                    10.0 = termFreq=10.0
                  5.2416887 = idf(docFreq=635, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5462)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
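    The explain tree above is Lucene's ClassicSimilarity (TF-IDF) breakdown for the term "news" in doc 5462. A minimal sketch reproducing the final score from the tree's own constants (the two coord(1/2) factors reflect that only one of the two query clauses matched):

    ```python
    import math

    # Constants copied from the explain tree above.
    max_docs = 44218         # maxDocs in the index
    doc_freq = 635           # docFreq of the term "news"
    freq = 10.0              # termFreq of "news" in doc 5462
    field_norm = 0.0390625   # fieldNorm(doc=5462)
    query_norm = 0.05094824  # queryNorm

    tf = math.sqrt(freq)                           # tf(freq=10.0) = 3.1622777
    idf = 1 + math.log(max_docs / (doc_freq + 1))  # idf = 5.2416887
    query_weight = idf * query_norm                # queryWeight = 0.26705483
    field_weight = tf * idf * field_norm           # fieldWeight = 0.64748734
    weight = query_weight * field_weight           # weight(_text_:news) = 0.17291462
    # coord(1/2) is applied at two nesting levels of the explain tree.
    score = weight * 0.5 * 0.5                     # final score ~ 0.043228656

    print(round(score, 9))
    ```

    The same formula with freq=4.0 yields the 0.0273402 score of result 2; only tf changes (sqrt(4.0) = 2.0).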
    
    Abstract
    Purpose: The purpose of this paper is to treat disinformation and misinformation (intentionally deceptive and unintentionally inaccurate misleading information, respectively) as a socio-cultural, technology-enabled epidemic in digital news, propagated via social media.
    Design/methodology/approach: The proposed disinformation and misinformation triangle is a conceptual model that identifies the three minimal causal factors that must occur simultaneously to facilitate the spread of the epidemic at the societal level.
    Findings: Following the epidemiological disease triangle model, the three interacting causal factors are translated into the digital news context: the virulent pathogens are falsifications, clickbait, satirical "fakes" and other deceptive or misleading news content; the susceptible hosts are information-overloaded, time-pressed news readers lacking media literacy skills; and the conducive environments are polluted, poorly regulated social media platforms that propagate and encourage the spread of various "fakes."
    Originality/value: Three types of interventions - automation, education and regulation - are proposed as a set of holistic measures to reveal, and potentially control, predict and prevent further proliferation of the epidemic. Partial automated solutions with natural language processing, machine learning and various automated detection techniques are currently available, as exemplified here briefly. Automated solutions assist (but do not replace) human judgments about whether news is truthful and credible. Information literacy efforts require further in-depth understanding of the phenomenon and interdisciplinary collaboration beyond traditional library and information science, incorporating media studies, journalism, interpersonal psychology and communication perspectives.
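    A minimal sketch of the kind of partial automated detection the abstract alludes to: a lexical cue scorer for clickbait-style headlines. The cue list and threshold are illustrative assumptions, not taken from the paper.

    ```python
    # Illustrative cue phrases; a real detector would use a learned model.
    CLICKBAIT_CUES = (
        "you won't believe", "shocking", "this one trick",
        "what happened next", "doctors hate", "top 10",
    )

    def clickbait_score(headline: str) -> float:
        """Fraction of known cue phrases found in the headline (0.0-1.0)."""
        text = headline.lower()
        hits = sum(1 for cue in CLICKBAIT_CUES if cue in text)
        return hits / len(CLICKBAIT_CUES)

    def looks_like_clickbait(headline: str, threshold: float = 0.15) -> bool:
        # One matching cue (1/6 ~ 0.167) is enough to flag the headline.
        return clickbait_score(headline) >= threshold
    ```

    As the abstract stresses, such scores assist, but do not replace, human judgments about whether news is truthful and credible.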
  2. Rubin, V.L.: Epistemic modality : from uncertainty to certainty in the context of information seeking as interactions with texts (2010) 0.03
    0.0273402 = product of:
      0.0546804 = sum of:
        0.0546804 = product of:
          0.1093608 = sum of:
            0.1093608 = weight(_text_:news in 4241) [ClassicSimilarity], result of:
              0.1093608 = score(doc=4241,freq=4.0), product of:
                0.26705483 = queryWeight, product of:
                  5.2416887 = idf(docFreq=635, maxDocs=44218)
                  0.05094824 = queryNorm
                0.40950692 = fieldWeight in 4241, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  5.2416887 = idf(docFreq=635, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4241)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    This article introduces a type of uncertainty that resides in textual information and requires epistemic interpretation on the information seeker's part. Epistemic modality, as defined in linguistics and natural language processing, is a writer's estimation of the validity of propositional content in texts: an evaluation of the chances that a certain hypothetical state of affairs is true, e.g., definitely true or possibly true. This research shifts attention from the uncertainty-certainty dichotomy to a gradient epistemic continuum of absolute, high, moderate and low certainty, and uncertainty. An analysis of a New York Times dataset showed that epistemically modalized statements are pervasive in news discourse and occur at a significantly higher rate in editorials than in news reports. Four independent annotators were able to recognize a gradation on the continuum, but individual perceptions of the boundaries between levels were highly subjective; stricter annotation instructions and longer coder training improved intercoder agreement. This paper offers an interdisciplinary bridge between research in linguistics, natural language processing, and information seeking, with potential benefits for the design and implementation of information systems in situations where large amounts of textual information are screened manually on a regular basis, for instance by professional intelligence or business analysts.
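    A minimal sketch of rule-based tagging on the gradient continuum the abstract describes (absolute, high, moderate, low certainty, and uncertainty). The marker lexicon below is an illustrative assumption, not the paper's annotation scheme.

    ```python
    # Hypothetical epistemic-marker lexicon mapping cues to certainty levels.
    MARKERS = {
        "undoubtedly": "absolute", "definitely": "absolute",
        "clearly": "high", "evidently": "high",
        "probably": "moderate", "likely": "moderate",
        "perhaps": "low", "possibly": "low",
        "unclear": "uncertainty", "uncertain": "uncertainty",
    }

    def certainty_level(statement: str) -> str:
        """Return the level of the first epistemic marker found,
        or 'unmarked' when the statement carries no marker."""
        for token in statement.lower().replace(",", " ").split():
            if token in MARKERS:
                return MARKERS[token]
        return "unmarked"
    ```

    As the paper's annotation study suggests, boundaries between adjacent levels are subjective, so any such lexicon would need coder training and refined instructions to yield reliable labels.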