Search (2 results, page 1 of 1)

  • classification_ss:"020"
  • year_i:[2010 TO 2020}
  1. Borgman, C.L.: Big data, little data, no data : scholarship in the networked world (2015) 0.01
    0.011292135 = product of:
      0.04516854 = sum of:
        0.04516854 = weight(_text_:term in 2785) [ClassicSimilarity], result of:
          0.04516854 = score(doc=2785,freq=2.0), product of:
            0.21904005 = queryWeight, product of:
              4.66603 = idf(docFreq=1130, maxDocs=44218)
              0.04694356 = queryNorm
            0.20621133 = fieldWeight in 2785, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.66603 = idf(docFreq=1130, maxDocs=44218)
              0.03125 = fieldNorm(doc=2785)
      0.25 = coord(1/4)
    
    Abstract
    "Big Data" is on the covers of Science, Nature, the Economist, and Wired magazines, on the front pages of the Wall Street Journal and the New York Times. But despite the media hyperbole, as Christine Borgman points out in this examination of data and scholarly research, having the right data is usually better than having more data; little data can be just as valuable as big data. In many cases, there are no data -- because relevant data don't exist, cannot be found, or are not available. Moreover, data sharing is difficult, incentives to do so are minimal, and data practices vary widely across disciplines. Borgman, an often-cited authority on scholarly communication, argues that data have no value or meaning in isolation; they exist within a knowledge infrastructure -- an ecology of people, practices, technologies, institutions, material objects, and relationships. After laying out the premises of her investigation -- six "provocations" meant to inspire discussion about the uses of data in scholarship -- Borgman offers case studies of data practices in the sciences, the social sciences, and the humanities, and then considers the implications of her findings for scholarly practice and research policy. To manage and exploit data over the long term, Borgman argues, requires massive investment in knowledge infrastructures; at stake is the future of scholarship.
  2. Badia, A.: The information manifold : why computers cannot solve algorithmic bias and fake news (2019) 0.00
    0.0016646868 = product of:
      0.0066587473 = sum of:
        0.0066587473 = product of:
          0.02663499 = sum of:
            0.02663499 = weight(_text_:based in 160) [ClassicSimilarity], result of:
              0.02663499 = score(doc=160,freq=4.0), product of:
                0.14144066 = queryWeight, product of:
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.04694356 = queryNorm
                0.18831211 = fieldWeight in 160, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.03125 = fieldNorm(doc=160)
          0.25 = coord(1/4)
      0.25 = coord(1/4)
    
    Abstract
    An argument that information exists at different levels of analysis (syntactic, semantic, and pragmatic) and an exploration of the implications. Although this is the Information Age, there is no universal agreement about what information really is. Different disciplines view information differently; engineers, computer scientists, economists, linguists, and philosophers all take varying and apparently disconnected approaches. In this book, Antonio Badia distinguishes four levels of analysis brought to bear on information: syntactic, semantic, pragmatic, and network-based. Badia explains each of these theoretical approaches in turn, discussing, among other topics, the theories of Claude Shannon and Andrey Kolmogorov, Fred Dretske's description of information flow, and ideas on receiver impact and informational interactions. Badia argues that all these theories describe the same phenomena from different perspectives, each one narrower than the previous one. The syntactic approach is the most general one, but it fails to specify when information is meaningful to an agent, which is the focus of the semantic and pragmatic approaches. The network-based approach, meanwhile, provides a framework to understand information use among agents. Badia then explores the consequences of understanding information as existing at several levels. Humans live at the semantic and pragmatic levels (and at the network level as a society), computers at the syntactic level. This sheds light on some recent issues, including "fake news" (computers cannot tell whether a statement is true or not, because truth is a semantic notion) and "algorithmic bias" (a pragmatic, not syntactic, concern). Humans, not computers, the book argues, have the ability to solve these issues.
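The indented blocks under each hit are Lucene ClassicSimilarity "explain" output: a single-term score is coord x queryWeight x fieldWeight, where queryWeight = idf x queryNorm and fieldWeight = tf x idf x fieldNorm, with tf = sqrt(termFreq) and idf = 1 + ln(maxDocs / (docFreq + 1)). As a minimal sketch (the function name and parameter layout are illustrative, not Lucene's API), the first result's score can be reproduced from the numbers in its explain tree:

```python
import math

def classic_similarity_score(freq, doc_freq, max_docs, query_norm, field_norm, coord):
    """Single-term score under Lucene's ClassicSimilarity (tf-idf)."""
    tf = math.sqrt(freq)                             # 1.4142135 for freq=2.0
    idf = 1.0 + math.log(max_docs / (doc_freq + 1))  # 4.66603 for docFreq=1130, maxDocs=44218
    query_weight = idf * query_norm                  # 0.21904005
    field_weight = tf * idf * field_norm             # 0.20621133
    return coord * query_weight * field_weight

# Values taken from the explain tree of result 1 (doc 2785, term "_text_:term")
score = classic_similarity_score(freq=2.0, doc_freq=1130, max_docs=44218,
                                 query_norm=0.04694356, field_norm=0.03125,
                                 coord=0.25)
print(score)  # close to the 0.011292135 shown for result 1
```

For the second result the same product (0.02663499) is wrapped in two nested coord(1/4) factors, giving 0.02663499 x 0.25 x 0.25 = 0.0016646868, the score shown for that hit.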