Search (4 results, page 1 of 1)

  • author_ss:"Bowker, G.C."
  1. Bowker, G.C.; Star, S.L.: Sorting things out : classification and its consequences (1999) 0.12
    0.12450441 = product of:
      0.26457188 = sum of:
        0.062118907 = weight(_text_:allgemeines in 733) [ClassicSimilarity], result of:
          0.062118907 = score(doc=733,freq=4.0), product of:
            0.17420313 = queryWeight, product of:
              5.705423 = idf(docFreq=399, maxDocs=44218)
              0.030532904 = queryNorm
            0.35658893 = fieldWeight in 733, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              5.705423 = idf(docFreq=399, maxDocs=44218)
              0.03125 = fieldNorm(doc=733)
        0.04125118 = weight(_text_:buch in 733) [ClassicSimilarity], result of:
          0.04125118 = score(doc=733,freq=4.0), product of:
            0.14195877 = queryWeight, product of:
              4.64937 = idf(docFreq=1149, maxDocs=44218)
              0.030532904 = queryNorm
            0.29058564 = fieldWeight in 733, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              4.64937 = idf(docFreq=1149, maxDocs=44218)
              0.03125 = fieldNorm(doc=733)
        0.013257037 = weight(_text_:und in 733) [ClassicSimilarity], result of:
          0.013257037 = score(doc=733,freq=8.0), product of:
            0.06767212 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.030532904 = queryNorm
            0.19590102 = fieldWeight in 733, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.03125 = fieldNorm(doc=733)
        0.018959826 = product of:
          0.03791965 = sum of:
            0.03791965 = weight(_text_:bibliothekswesen in 733) [ClassicSimilarity], result of:
              0.03791965 = score(doc=733,freq=4.0), product of:
                0.13610567 = queryWeight, product of:
                  4.457672 = idf(docFreq=1392, maxDocs=44218)
                  0.030532904 = queryNorm
                0.2786045 = fieldWeight in 733, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  4.457672 = idf(docFreq=1392, maxDocs=44218)
                  0.03125 = fieldNorm(doc=733)
          0.5 = coord(1/2)
        0.05476408 = weight(_text_:informationswissenschaft in 733) [ClassicSimilarity], result of:
          0.05476408 = score(doc=733,freq=8.0), product of:
            0.13754173 = queryWeight, product of:
              4.504705 = idf(docFreq=1328, maxDocs=44218)
              0.030532904 = queryNorm
            0.3981634 = fieldWeight in 733, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              4.504705 = idf(docFreq=1328, maxDocs=44218)
              0.03125 = fieldNorm(doc=733)
        0.03791965 = weight(_text_:bibliothekswesen in 733) [ClassicSimilarity], result of:
          0.03791965 = score(doc=733,freq=4.0), product of:
            0.13610567 = queryWeight, product of:
              4.457672 = idf(docFreq=1392, maxDocs=44218)
              0.030532904 = queryNorm
            0.2786045 = fieldWeight in 733, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              4.457672 = idf(docFreq=1392, maxDocs=44218)
              0.03125 = fieldNorm(doc=733)
        0.0049934816 = weight(_text_:in in 733) [ClassicSimilarity], result of:
          0.0049934816 = score(doc=733,freq=8.0), product of:
            0.04153252 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.030532904 = queryNorm
            0.120230645 = fieldWeight in 733, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.03125 = fieldNorm(doc=733)
        0.031307716 = product of:
          0.06261543 = sum of:
            0.06261543 = weight(_text_:katalogisierung in 733) [ClassicSimilarity], result of:
              0.06261543 = score(doc=733,freq=4.0), product of:
                0.17489795 = queryWeight, product of:
                  5.7281795 = idf(docFreq=390, maxDocs=44218)
                  0.030532904 = queryNorm
                0.35801122 = fieldWeight in 733, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  5.7281795 = idf(docFreq=390, maxDocs=44218)
                  0.03125 = fieldNorm(doc=733)
          0.5 = coord(1/2)
      0.47058824 = coord(8/17)
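    The block above is Lucene's ClassicSimilarity (tf-idf) explain output. As a minimal sketch of how those numbers combine, assuming the standard ClassicSimilarity formulas (tf = sqrt(freq); idf = 1 + ln(maxDocs/(docFreq+1)); fieldWeight = tf * idf * fieldNorm; queryWeight = idf * queryNorm; clause weight = queryWeight * fieldWeight; document score = coord * sum of clause weights), the following standalone Python snippet reproduces the "allgemeines" clause and the total for document 733. The helper names are illustrative, not Lucene's API.

      import math

      # Standalone sketch of Lucene ClassicSimilarity arithmetic (illustrative);
      # reproduces the numbers shown in the explain output above.

      def idf(doc_freq, max_docs):
          # idf = 1 + ln(maxDocs / (docFreq + 1)); e.g. idf(399, 44218) ~ 5.705423
          return 1.0 + math.log(max_docs / (doc_freq + 1))

      def clause_weight(freq, doc_freq, max_docs, field_norm, query_norm):
          # tf = sqrt(freq); fieldWeight = tf * idf * fieldNorm;
          # queryWeight = idf * queryNorm; clause weight = queryWeight * fieldWeight
          tf = math.sqrt(freq)
          i = idf(doc_freq, max_docs)
          return (i * query_norm) * (tf * i * field_norm)

      # weight(_text_:allgemeines in 733): freq=4, docFreq=399, fieldNorm=0.03125
      w_allgemeines = clause_weight(4.0, 399, 44218, 0.03125, 0.030532904)  # ~0.062118907

      # Document score = coord(8/17) * sum of the eight clause weights (0.26457188)
      score_733 = (8.0 / 17.0) * 0.26457188  # ~0.12450441
      print(w_allgemeines, score_733)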
    
    Abstract
    Is this book sociology, anthropology, or taxonomy? Sorting Things Out, by communications theorists Geoffrey C. Bowker and Susan Leigh Star, covers a lot of conceptual ground in its effort to sort out exactly how and why we classify and categorize the things and concepts we encounter day to day. But the analysis doesn't stop there; the authors go on to explore what happens to our thinking as a result of our classifications. With great insight and precise academic language, they pick apart the information systems and language structures that lie deeper than the everyday categories we use. The authors focus first on the International Classification of Diseases (ICD), a scheme widely used by health professionals worldwide, but also look at other health information systems, racial classifications used by South Africa during apartheid, and more. Though it comes off as a bit too academic at times (by the end of the 20th century, most writers should be able to get the spelling of McDonald's restaurant right), the book has a clever charm that thoughtful readers will surely appreciate. A sly sense of humor sneaks into the writing, giving rise to the chapter title "The Kindness of Strangers," for example. The authors argue that categorization is both strongly influenced by and a powerful reinforcer of ideology; it follows that revolutions (political or scientific) must change the way things are sorted in order to overthrow the old system. Who knew that such simple, basic elements of thought could have such far-reaching consequences? Whether you ultimately place it with social science, linguistics, or (as the authors fear) fantasy, make sure you put Sorting Things Out in your reading pile.
    BK
    02.10 / Wissenschaft und Gesellschaft
    06.70 / Katalogisierung / Bestandserschließung
    Footnote
    Reviewed in: Knowledge organization 27(2000) no.3, S.175-177 (B. Kwasnik); College and research libraries 61(2000) no.4, S.380-381 (J. Williams); Library resources and technical services 44(2000) no.4, S.107-108 (H.A. Olson); JASIST 51(2000) no.12, S.1149-1150 (T.A. Brooks)
    RVK
    AN 93400 Allgemeines / Buch- und Bibliothekswesen, Informationswissenschaft / Informationswissenschaft / Grundlagen, Theorie / Klassifikation
  2. Ibekwe-SanJuan, F.; Bowker, G.C.: Implications of big data for knowledge organization (2017) 0.00
    5.1925366E-4 = product of:
      0.008827312 = sum of:
        0.008827312 = weight(_text_:in in 3624) [ClassicSimilarity], result of:
          0.008827312 = score(doc=3624,freq=16.0), product of:
            0.04153252 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.030532904 = queryNorm
            0.21253976 = fieldWeight in 3624, product of:
              4.0 = tf(freq=16.0), with freq of:
                16.0 = termFreq=16.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3624)
      0.05882353 = coord(1/17)
    
    Abstract
    In this paper, we propose a high-level analysis of the implications of big data for knowledge organisation (KO) and knowledge organisation systems (KOSs). We confront the current debates within the KO community about the relevance of universal bibliographic classifications and thesauri on the web with the ongoing discussions about the epistemological and methodological assumptions underlying data-driven inquiry. In essence, big data will not remove the need for humanly-constructed KOSs. However, ongoing transformations in knowledge production processes entailed by big data and Web 2.0 put pressure on the KO community to rethink the standpoint from which KOSs are designed. Essentially, the field of KO needs to move from laying down the apodictic (that which we know for all time) to adapting to the new world of social and natural scientific knowledge by creating maximally flexible schemas: faceted rather than Aristotelian classifications. KO also needs to adapt to the changing nature of output in the social and natural sciences, to the extent that these in turn are being affected by the advent of big data. Theoretically, this entails a shift from purely universalist and normative top-down approaches to more descriptive bottom-up approaches that can be inclusive of diverse viewpoints. Methodologically, this means striking the right balance between two seemingly opposing modalities in designing KOSs: the necessity, on the one hand, to incorporate automated techniques and, on the other, to solicit contributions from amateurs (crowdsourcing) via Web 2.0 platforms.
    Content
    Contribution to a special issue "New Trends for Knowledge Organization", guest editor: Renato Rocha Souza.
  3. Bowker, G.C.: The kindness of strangers : kinds and politics in classification systems (1998) 0.00
    5.140348E-4 = product of:
      0.008738592 = sum of:
        0.008738592 = weight(_text_:in in 853) [ClassicSimilarity], result of:
          0.008738592 = score(doc=853,freq=8.0), product of:
            0.04153252 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.030532904 = queryNorm
            0.21040362 = fieldWeight in 853, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0546875 = fieldNorm(doc=853)
      0.05882353 = coord(1/17)
    
    Abstract
    This article offers a formal reading of a classification scheme of international scope and long duration: the International Classification of Diseases (ICD). The argument is made that this classification scheme retains many traces of its own administrative and organizational past in its current form. Further, it is argued that such traces operate normatively to favor certain kinds of narrative of medical treatment while denying others. It is suggested that the ICD, like other large-scale classification systems, is able to do its work so effectively precisely because these traces permit a coupling of classification scheme and organizational form.
    Footnote
    Article in a special issue "How Classifications Work: Problems and Challenges in an Electronic Age"
  4. Bowker, G.C.: The history of information infrastructures : the case of the International Classification of Diseases (1996) 0.00
    2.9373422E-4 = product of:
      0.0049934816 = sum of:
        0.0049934816 = weight(_text_:in in 1375) [ClassicSimilarity], result of:
          0.0049934816 = score(doc=1375,freq=2.0), product of:
            0.04153252 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.030532904 = queryNorm
            0.120230645 = fieldWeight in 1375, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0625 = fieldNorm(doc=1375)
      0.05882353 = coord(1/17)
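    The single-clause score above follows from the same arithmetic (see the sketch after the first result; clause_weight is the illustrative helper defined there): one "in" clause with freq=2 and fieldNorm=0.0625, scaled by coord(1/17).

      # weight(_text_:in in 1375): freq=2, docFreq=30841, fieldNorm=0.0625
      w_in = clause_weight(2.0, 30841, 44218, 0.0625, 0.030532904)  # ~0.0049934816
      score_1375 = (1.0 / 17.0) * w_in  # ~2.9373422E-4
      print(w_in, score_1375)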
    
    Abstract
    Explores ways of writing the history of information infrastructures. As an example, looks at the construction of the universal medical classification: the International Classification of Diseases. Discusses the case of medical classifications, medical classification and the state, and medical classification and information processing. A fundamental figure/ground problem emerges in the analysis of information infrastructures. The medical classification system is historically contingent with respect to both its political origins and its technological underpinnings.