Search (49 results, page 1 of 3)

  • theme_ss:"Inhaltsanalyse"
  1. From information to knowledge : conceptual and content analysis by computer (1995) 0.05
    0.047906943 = product of:
      0.095813885 = sum of:
        0.005885557 = product of:
          0.023542227 = sum of:
            0.023542227 = weight(_text_:based in 5392) [ClassicSimilarity], result of:
              0.023542227 = score(doc=5392,freq=2.0), product of:
                0.14144066 = queryWeight, product of:
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.04694356 = queryNorm
                0.16644597 = fieldWeight in 5392, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5392)
          0.25 = coord(1/4)
        0.08992833 = weight(_text_:frequency in 5392) [ClassicSimilarity], result of:
          0.08992833 = score(doc=5392,freq=2.0), product of:
            0.27643865 = queryWeight, product of:
              5.888745 = idf(docFreq=332, maxDocs=44218)
              0.04694356 = queryNorm
            0.32531026 = fieldWeight in 5392, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              5.888745 = idf(docFreq=332, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5392)
      0.5 = coord(2/4)
    
    Content
    SCHMIDT, K.M.: Concepts - content - meaning: an introduction; DUCHASTEL, J. et al.: The SACAO project: using computation toward textual data analysis; PAQUIN, L.-C. and L. DUPUY: An approach to expertise transfer: computer-assisted text analysis; HOGENRAAD, R., Y. BESTGEN and J.-L. NYSTEN: Terrorist rhetoric: texture and architecture; MOHLER, P.P.: On the interaction between reading and computing: an interpretative approach to content analysis; LANCASHIRE, I.: Computer tools for cognitive stylistics; MERGENTHALER, E.: An outline of knowledge based text analysis; NAMENWIRTH, J.Z.: Ideography in computer-aided content analysis; WEBER, R.P. and J.Z. NAMENWIRTH: Content-analytic indicators: a self-critique; McKINNON, A.: Optimizing the aberrant frequency word technique; ROSATI, R.: Factor analysis in classical archaeology: export patterns of Attic pottery trade; PETRILLO, P.S.: Old and new worlds: ancient coinage and modern technology; DARANYI, S., S. MARJAI et al.: Caryatids and the measurement of semiosis in architecture; ZARRI, G.P.: Intelligent information retrieval: an application in the field of historical biographical data; BOUCHARD, G., R. ROY et al.: Computers and genealogy: from family reconstitution to population reconstruction; DEMÉLAS-BOHY, M.-D. and M. RENAUD: Instability, networks and political parties: a political history expert system prototype; DARANYI, S., A. ABRANYI and G. KOVACS: Knowledge extraction from ethnopoetic texts by multivariate statistical methods; FRAUTSCHI, R.L.: Measures of narrative voice in French prose fiction applied to textual samples from the enlightenment to the twentieth century; DANNENBERG, R. et al.: A project in computer music: the musician's workbench
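
The relevance figure attached to each record is Lucene's ClassicSimilarity (TF-IDF) "explain" tree: each clause contributes queryWeight × fieldWeight, where queryWeight = idf × queryNorm, fieldWeight = tf × idf × fieldNorm, tf = sqrt(freq), and idf = 1 + ln(maxDocs / (docFreq + 1)); coord factors then scale partial sums by the fraction of matching clauses. As a rough sanity check, the sketch below recomputes the score of result 1 (doc 5392) from the factors printed above. The variable and function names are illustrative only, not part of the search engine's API.

```python
import math

# Minimal sketch recomputing the explain tree for result 1 (doc 5392).
# Constants are copied from the output above; the formulas are those of
# Lucene's ClassicSimilarity: tf = sqrt(freq), idf = 1 + ln(N / (df + 1)).
query_norm = 0.04694356
field_norm = 0.0390625                            # length norm of the matched field
tf = math.sqrt(2.0)                               # freq = 2.0 -> 1.4142135

idf_based     = 1 + math.log(44218 / (5906 + 1))  # -> 3.0129938
idf_frequency = 1 + math.log(44218 / (332 + 1))   # -> 5.888745

def clause_weight(idf):
    query_weight = idf * query_norm               # e.g. 0.27643865 for "frequency"
    field_weight = tf * idf * field_norm          # e.g. 0.32531026 for "frequency"
    return query_weight * field_weight

# "based" sits one level deeper and is scaled by coord(1/4);
# the outer sum of both clauses is scaled by coord(2/4).
score = (clause_weight(idf_based) * 0.25 + clause_weight(idf_frequency)) * 0.5
print(round(score, 9))                            # ~ 0.047906943, as shown above
```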
  2. Morehead, D.R.; Pejtersen, A.M.; Rouse, W.B.: ¬The value of information and computer-aided information seeking : problem formulation and application to fiction retrieval (1984) 0.03
    0.034802582 = product of:
      0.069605164 = sum of:
        0.014271717 = product of:
          0.057086866 = sum of:
            0.057086866 = weight(_text_:based in 5828) [ClassicSimilarity], result of:
              0.057086866 = score(doc=5828,freq=6.0), product of:
                0.14144066 = queryWeight, product of:
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.04694356 = queryNorm
                0.40361002 = fieldWeight in 5828, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5828)
          0.25 = coord(1/4)
        0.055333447 = product of:
          0.11066689 = sum of:
            0.11066689 = weight(_text_:assessment in 5828) [ClassicSimilarity], result of:
              0.11066689 = score(doc=5828,freq=2.0), product of:
                0.25917634 = queryWeight, product of:
                  5.52102 = idf(docFreq=480, maxDocs=44218)
                  0.04694356 = queryNorm
                0.4269946 = fieldWeight in 5828, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.52102 = idf(docFreq=480, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5828)
          0.5 = coord(1/2)
      0.5 = coord(2/4)
    
    Abstract
    Issues concerning the formulation and application of a model of how humans value information are examined. Formulation of a value function is based on research from modelling, value assessment, human information seeking behavior, and human decision making. The proposed function is incorporated into a computer-based fiction retrieval system and evaluated using data from nine searches. Evaluation is based on the ability of an individual's value function to discriminate among novels selected, rejected, and not considered. The results are discussed in terms of both formulation and utilization of a value function as well as the implications for extending the proposed formulation to other information seeking environments.
  3. Solomon, P.: Access to fiction for children : a user-based assessment of options and opportunities (1997) 0.03
    0.031786613 = product of:
      0.06357323 = sum of:
        0.00823978 = product of:
          0.03295912 = sum of:
            0.03295912 = weight(_text_:based in 5845) [ClassicSimilarity], result of:
              0.03295912 = score(doc=5845,freq=2.0), product of:
                0.14144066 = queryWeight, product of:
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.04694356 = queryNorm
                0.23302436 = fieldWeight in 5845, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5845)
          0.25 = coord(1/4)
        0.055333447 = product of:
          0.11066689 = sum of:
            0.11066689 = weight(_text_:assessment in 5845) [ClassicSimilarity], result of:
              0.11066689 = score(doc=5845,freq=2.0), product of:
                0.25917634 = queryWeight, product of:
                  5.52102 = idf(docFreq=480, maxDocs=44218)
                  0.04694356 = queryNorm
                0.4269946 = fieldWeight in 5845, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.52102 = idf(docFreq=480, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5845)
          0.5 = coord(1/2)
      0.5 = coord(2/4)
    
  4. Rosso, M.A.: User-based identification of Web genres (2008) 0.03
    0.031173116 = product of:
      0.06234623 = sum of:
        0.005885557 = product of:
          0.023542227 = sum of:
            0.023542227 = weight(_text_:based in 1863) [ClassicSimilarity], result of:
              0.023542227 = score(doc=1863,freq=2.0), product of:
                0.14144066 = queryWeight, product of:
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.04694356 = queryNorm
                0.16644597 = fieldWeight in 1863, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1863)
          0.25 = coord(1/4)
        0.056460675 = weight(_text_:term in 1863) [ClassicSimilarity], result of:
          0.056460675 = score(doc=1863,freq=2.0), product of:
            0.21904005 = queryWeight, product of:
              4.66603 = idf(docFreq=1130, maxDocs=44218)
              0.04694356 = queryNorm
            0.25776416 = fieldWeight in 1863, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.66603 = idf(docFreq=1130, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1863)
      0.5 = coord(2/4)
    
    Abstract
    This research explores the use of genre as a document descriptor in order to improve the effectiveness of Web searching. A major issue to be resolved is the identification of what document categories should be used as genres. As genre is a kind of folk typology, document categories must enjoy widespread recognition by their intended user groups in order to qualify as genres. Three user studies were conducted to develop a genre palette and show that it is recognizable to users. (Palette is a term used to denote a classification, attributable to Karlgren, Bretan, Dewe, Hallberg, and Wolkert, 1998.) To simplify the users' classification task, it was decided to focus on Web pages from the edu domain. The first study was a survey of user terminology for Web pages. Three participants separated 100 Web page printouts into stacks according to genre, assigning names and definitions to each genre. The second study aimed to refine the resulting set of 48 (often conceptually and lexically similar) genre names and definitions into a smaller palette of user-preferred terminology. Ten participants classified the same 100 Web pages. A set of five principles for creating a genre palette from individuals' sortings was developed, and the list of 48 was trimmed to 18 genres. The third study aimed to show that users would agree on the genres of Web pages when choosing from the genre palette. In an online experiment in which 257 participants categorized a new set of 55 pages using the 18 genres, on average, over 70% agreed on the genre of each page. Suggestions for improving the genre palette and future directions for the work are discussed.
  5. Naves, M.M.L.: Analise de assunto : concepcoes (1996) 0.03
    0.028230337 = product of:
      0.11292135 = sum of:
        0.11292135 = weight(_text_:term in 607) [ClassicSimilarity], result of:
          0.11292135 = score(doc=607,freq=2.0), product of:
            0.21904005 = queryWeight, product of:
              4.66603 = idf(docFreq=1130, maxDocs=44218)
              0.04694356 = queryNorm
            0.5155283 = fieldWeight in 607, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.66603 = idf(docFreq=1130, maxDocs=44218)
              0.078125 = fieldNorm(doc=607)
      0.25 = coord(1/4)
    
    Abstract
    Discusses subject analysis as an important stage in the indexing process and observes the confusion that can arise over the meaning of the term. Considers questions and difficulties about subject analysis and the concept of aboutness.
  6. Hauser, E.; Tennis, J.T.: Episemantics: aboutness as aroundness (2019) 0.02
    0.024448192 = product of:
      0.09779277 = sum of:
        0.09779277 = weight(_text_:term in 5640) [ClassicSimilarity], result of:
          0.09779277 = score(doc=5640,freq=6.0), product of:
            0.21904005 = queryWeight, product of:
              4.66603 = idf(docFreq=1130, maxDocs=44218)
              0.04694356 = queryNorm
            0.44646066 = fieldWeight in 5640, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              4.66603 = idf(docFreq=1130, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5640)
      0.25 = coord(1/4)
    
    Abstract
    Aboutness ranks amongst our field's greatest bugbears. What is a work about? How can this be known? This mirrors debates within the philosophy of language, where the concept of representation has similarly evaded satisfactory definition. This paper proposes that we abandon the strong sense of the word aboutness, which seems to promise some inherent relationship between work and subject, or, in philosophical terms, between word and world. Instead, we seek an etymological reset to the older sense of aboutness as "in the vicinity, nearby; in some place or various places nearby; all over a surface." To distinguish this sense in the context of information studies, we introduce the term episemantics. The authors have each independently applied this term in slightly different contexts and scales (Hauser 2018a; Tennis 2016), and this article presents a unified definition of the term and guidelines for applying it at the scale of both words and works. The resulting weak concept of aboutness is pragmatic, in Star's sense of a focus on consequences over antecedents, while reserving space for the critique and improvement of aboutness determinations within various contexts and research programs. The paper finishes with a discussion of the implication of the concept of episemantics and methodological possibilities it offers for knowledge organization research and practice. We draw inspiration from Melvil Dewey's use of physical aroundness in his first classification system and ask how aroundness might be more effectively operationalized in digital environments.
  7. Pejtersen, A.M.: Design of a classification scheme for fiction based on an analysis of actual user-librarian communication, and use of the scheme for control of librarians' search strategies (1980) 0.02
    0.021786068 = product of:
      0.043572135 = sum of:
        0.011771114 = product of:
          0.047084454 = sum of:
            0.047084454 = weight(_text_:based in 5835) [ClassicSimilarity], result of:
              0.047084454 = score(doc=5835,freq=2.0), product of:
                0.14144066 = queryWeight, product of:
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.04694356 = queryNorm
                0.33289194 = fieldWeight in 5835, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.078125 = fieldNorm(doc=5835)
          0.25 = coord(1/4)
        0.031801023 = product of:
          0.063602045 = sum of:
            0.063602045 = weight(_text_:22 in 5835) [ClassicSimilarity], result of:
              0.063602045 = score(doc=5835,freq=2.0), product of:
                0.16438834 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04694356 = queryNorm
                0.38690117 = fieldWeight in 5835, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.078125 = fieldNorm(doc=5835)
          0.5 = coord(1/2)
      0.5 = coord(2/4)
    
    Date
    5. 8.2006 13:22:44
  8. Raieli, R.: ¬The semantic hole : enthusiasm and caution around multimedia information retrieval (2012) 0.02
    0.016340401 = product of:
      0.032680802 = sum of:
        0.010194084 = product of:
          0.040776335 = sum of:
            0.040776335 = weight(_text_:based in 4888) [ClassicSimilarity], result of:
              0.040776335 = score(doc=4888,freq=6.0), product of:
                0.14144066 = queryWeight, product of:
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.04694356 = queryNorm
                0.28829288 = fieldWeight in 4888, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4888)
          0.25 = coord(1/4)
        0.022486717 = product of:
          0.044973433 = sum of:
            0.044973433 = weight(_text_:22 in 4888) [ClassicSimilarity], result of:
              0.044973433 = score(doc=4888,freq=4.0), product of:
                0.16438834 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04694356 = queryNorm
                0.27358043 = fieldWeight in 4888, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4888)
          0.5 = coord(1/2)
      0.5 = coord(2/4)
    
    Abstract
    This paper centres on the tools for the management of new digital documents, which are not only textual, but also visual-video, audio or multimedia in the full sense. Among the aims is to demonstrate that operating within the terms of generic Information Retrieval through textual language only is limiting, and it is instead necessary to consider ampler criteria, such as those of MultiMedia Information Retrieval, according to which, every type of digital document can be analyzed and searched by the proper elements of language for its proper nature. MMIR is presented as the organic complex of the systems of Text Retrieval, Visual Retrieval, Video Retrieval, and Audio Retrieval, each of which has an approach to information management that handles the concrete textual, visual, audio, or video content of the documents directly, here defined as content-based. In conclusion, the limits of this content-based objective access to documents are underlined. The discrepancy known as the semantic gap is that which occurs between semantic-interpretive access and content-based access. Finally, the integration of these conceptions is explained, gathering and composing the merits and the advantages of each of the approaches and of the systems of access to information.
    Date
    22. 1.2012 13:02:10
    Source
    Knowledge organization. 39(2012) no.1, S.13-22
  9. Yoon, J.W.: Utilizing quantitative users' reactions to represent affective meanings of an image (2010) 0.01
    0.009880973 = product of:
      0.039523892 = sum of:
        0.039523892 = product of:
          0.079047784 = sum of:
            0.079047784 = weight(_text_:assessment in 3584) [ClassicSimilarity], result of:
              0.079047784 = score(doc=3584,freq=2.0), product of:
                0.25917634 = queryWeight, product of:
                  5.52102 = idf(docFreq=480, maxDocs=44218)
                  0.04694356 = queryNorm
                0.30499613 = fieldWeight in 3584, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.52102 = idf(docFreq=480, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=3584)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Abstract
    Emotional meaning is critical for users to retrieve relevant images. However, because emotional meanings are subject to the individual viewer's interpretation, they are considered difficult to implement when designing image retrieval systems. With the intent of making an image's emotional messages more readily accessible, this study aims to test a new approach designed to enhance the accessibility of emotional meanings during the image search process. This approach utilizes image searchers' emotional reactions, which are quantitatively measured. Broadly used quantitative measurements for emotional reactions, Semantic Differential (SD) and Self-Assessment Manikin (SAM), were selected as tools for gathering users' reactions. Emotional representations obtained from these two tools were compared with three image perception tasks: searching, describing, and sorting. A survey questionnaire with a set of 12 images, each tagged with basic emotions, was administered to 58 participants. Results demonstrated that the SAM represents basic emotions on 2-dimensional plots (pleasure and arousal dimensions), and this representation consistently corresponded to the three image perception tasks. This study provided experimental evidence that quantitative users' reactions can be a useful complementary element of current image retrieval/indexing systems. Integrating users' reactions obtained from the SAM into image browsing systems would reduce the efforts of human indexers as well as improve the effectiveness of image retrieval systems.
  10. Sauperl, A.: Subject determination during the cataloging process : the development of a system based on theoretical principles (2002) 0.01
    0.0065358197 = product of:
      0.013071639 = sum of:
        0.0035313342 = product of:
          0.014125337 = sum of:
            0.014125337 = weight(_text_:based in 2293) [ClassicSimilarity], result of:
              0.014125337 = score(doc=2293,freq=2.0), product of:
                0.14144066 = queryWeight, product of:
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.04694356 = queryNorm
                0.09986758 = fieldWeight in 2293, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.0234375 = fieldNorm(doc=2293)
          0.25 = coord(1/4)
        0.0095403055 = product of:
          0.019080611 = sum of:
            0.019080611 = weight(_text_:22 in 2293) [ClassicSimilarity], result of:
              0.019080611 = score(doc=2293,freq=2.0), product of:
                0.16438834 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04694356 = queryNorm
                0.116070345 = fieldWeight in 2293, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0234375 = fieldNorm(doc=2293)
          0.5 = coord(1/2)
      0.5 = coord(2/4)
    
    Date
    27. 9.2005 14:22:19
  11. Beghtol, C.: Toward a theory of fiction analysis for information storage and retrieval (1992) 0.01
    0.006360204 = product of:
      0.025440816 = sum of:
        0.025440816 = product of:
          0.05088163 = sum of:
            0.05088163 = weight(_text_:22 in 5830) [ClassicSimilarity], result of:
              0.05088163 = score(doc=5830,freq=2.0), product of:
                0.16438834 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04694356 = queryNorm
                0.30952093 = fieldWeight in 5830, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=5830)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Date
    5. 8.2006 13:22:08
  12. Hauff-Hartig, S.: Automatische Transkription von Videos : Fernsehen 3.0: Automatisierte Sentimentanalyse und Zusammenstellung von Kurzvideos mit hohem Aufregungslevel KI-generierte Metadaten: Von der Technologiebeobachtung bis zum produktiven Einsatz (2021) 0.01
    0.006360204 = product of:
      0.025440816 = sum of:
        0.025440816 = product of:
          0.05088163 = sum of:
            0.05088163 = weight(_text_:22 in 251) [ClassicSimilarity], result of:
              0.05088163 = score(doc=251,freq=2.0), product of:
                0.16438834 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04694356 = queryNorm
                0.30952093 = fieldWeight in 251, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=251)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Date
    22. 5.2021 12:43:05
  13. Bade, D.: ¬The creation and persistence of misinformation in shared library catalogs : language and subject knowledge in a technological era (2002) 0.00
    0.004844789 = product of:
      0.009689578 = sum of:
        0.0033293737 = product of:
          0.013317495 = sum of:
            0.013317495 = weight(_text_:based in 1858) [ClassicSimilarity], result of:
              0.013317495 = score(doc=1858,freq=4.0), product of:
                0.14144066 = queryWeight, product of:
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.04694356 = queryNorm
                0.09415606 = fieldWeight in 1858, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.015625 = fieldNorm(doc=1858)
          0.25 = coord(1/4)
        0.006360204 = product of:
          0.012720408 = sum of:
            0.012720408 = weight(_text_:22 in 1858) [ClassicSimilarity], result of:
              0.012720408 = score(doc=1858,freq=2.0), product of:
                0.16438834 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04694356 = queryNorm
                0.07738023 = fieldWeight in 1858, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.015625 = fieldNorm(doc=1858)
          0.5 = coord(1/2)
      0.5 = coord(2/4)
    
    Date
    22. 9.1997 19:16:05
    Footnote
    Bade begins his discussion of errors in subject analysis by summarizing the contents of seven records containing what he considers to be egregious errors. The examples were drawn only from items that he has encountered in the course of his work. Five of the seven records were full-level ("I" level) records for Eastern European materials created between 1996 and 2000 in the OCLC WorldCat database. The final two examples were taken from records created by Bade himself over an unspecified period of time. Although he is to be commended for examining the actual items cataloged and for examining mostly items that he claims to have adequate linguistic and subject expertise to evaluate reliably, Bade's methodology has major flaws. First and foremost, the number of examples provided is completely inadequate to draw any conclusions about the extent of the problem. Although an in-depth qualitative analysis of a small number of records might have yielded some valuable insight into factors that contribute to errors in subject analysis, Bade provides no information about the circumstances under which the live OCLC records he critiques were created. Instead, he offers simplistic explanations for the errors based solely on his own assumptions. He supplements his analysis of examples with an extremely brief survey of other studies regarding errors in subject analysis, which consists primarily of criticism of work done by Sheila Intner. In the end, it is impossible to draw any reliable conclusions about the nature or extent of errors in subject analysis found in records in shared bibliographic databases based on Bade's analysis.
    In the final third of the essay, Bade finally reveals his true concern: the deintellectualization of cataloging. It would strengthen the essay tremendously to present this as the primary premise from the very beginning, as this section offers glimpses of a compelling argument. Bade laments, "Many librarians simply do not see cataloging as an intellectual activity requiring an educated mind" (p. 20). Commenting on recent trends in copy cataloging practice, he declares, "The disaster of our time is that this work is being done more and more by people who can neither evaluate nor correct imported errors and often are forbidden from even thinking about it" (p. 26). Bade argues that the most valuable content found in catalog records is the intellectual content contributed by knowledgeable catalogers, and he asserts that to perform intellectually demanding tasks such as subject analysis reliably and effectively, catalogers must have the linguistic and subject knowledge required to gain at least a rudimentary understanding of the materials that they describe. He contends that requiring catalogers to quickly dispense with materials in unfamiliar languages and subjects clearly undermines their ability to perform the intellectual work of cataloging and leads to an increasing number of errors in the bibliographic records contributed to shared databases.
  14. Weimer, K.H.: ¬The nexus of subject analysis and bibliographic description : the case of multipart videos (1996) 0.00
    0.0047701527 = product of:
      0.019080611 = sum of:
        0.019080611 = product of:
          0.038161222 = sum of:
            0.038161222 = weight(_text_:22 in 6525) [ClassicSimilarity], result of:
              0.038161222 = score(doc=6525,freq=2.0), product of:
                0.16438834 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04694356 = queryNorm
                0.23214069 = fieldWeight in 6525, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=6525)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Source
    Cataloging and classification quarterly. 22(1996) no.2, S.5-18
  15. Chen, S.-J.; Lee, H.-L.: Art images and mental associations : a preliminary exploration (2014) 0.00
    0.0047701527 = product of:
      0.019080611 = sum of:
        0.019080611 = product of:
          0.038161222 = sum of:
            0.038161222 = weight(_text_:22 in 1416) [ClassicSimilarity], result of:
              0.038161222 = score(doc=1416,freq=2.0), product of:
                0.16438834 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04694356 = queryNorm
                0.23214069 = fieldWeight in 1416, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=1416)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Source
    Knowledge organization in the 21st century: between historical patterns and future prospects. Proceedings of the Thirteenth International ISKO Conference 19-22 May 2014, Kraków, Poland. Ed.: Wieslaw Babik
  16. White, M.D.; Marsh, E.E.: Content analysis : a flexible methodology (2006) 0.00
    0.0047701527 = product of:
      0.019080611 = sum of:
        0.019080611 = product of:
          0.038161222 = sum of:
            0.038161222 = weight(_text_:22 in 5589) [ClassicSimilarity], result of:
              0.038161222 = score(doc=5589,freq=2.0), product of:
                0.16438834 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04694356 = queryNorm
                0.23214069 = fieldWeight in 5589, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=5589)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Source
    Library trends. 55(2006) no.1, S.22-45
  17. Nahl-Jakobovits, D.; Jakobovits, L.A.: ¬A content analysis method for developing user-based objectives (1992) 0.00
    0.004161717 = product of:
      0.016646868 = sum of:
        0.016646868 = product of:
          0.06658747 = sum of:
            0.06658747 = weight(_text_:based in 3015) [ClassicSimilarity], result of:
              0.06658747 = score(doc=3015,freq=4.0), product of:
                0.14144066 = queryWeight, product of:
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.04694356 = queryNorm
                0.47078028 = fieldWeight in 3015, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.078125 = fieldNorm(doc=3015)
          0.25 = coord(1/4)
      0.25 = coord(1/4)
    
    Abstract
    The article explains content analysis, a method whereby statements taken from oral or written library user comments are labeled as particular speech acts. These speech acts are then categorized into the three behavioral domains: affective, cognitive, and sensorimotor, and used to construct user-based instructional objectives.
  18. Hjoerland, B.: Subject representation and information seeking : contributions to a theory based on the theory of knowledge (1993) 0.00
    0.00411989 = product of:
      0.01647956 = sum of:
        0.01647956 = product of:
          0.06591824 = sum of:
            0.06591824 = weight(_text_:based in 7555) [ClassicSimilarity], result of:
              0.06591824 = score(doc=7555,freq=2.0), product of:
                0.14144066 = queryWeight, product of:
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.04694356 = queryNorm
                0.46604872 = fieldWeight in 7555, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.109375 = fieldNorm(doc=7555)
          0.25 = coord(1/4)
      0.25 = coord(1/4)
    
  19. Austin, J.; Pejtersen, A.M.: Fiction retrieval: experimental design and evaluation of a search system based on user's value criteria. Pt.1 (1983) 0.00
    0.0035313342 = product of:
      0.014125337 = sum of:
        0.014125337 = product of:
          0.056501348 = sum of:
            0.056501348 = weight(_text_:based in 142) [ClassicSimilarity], result of:
              0.056501348 = score(doc=142,freq=2.0), product of:
                0.14144066 = queryWeight, product of:
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.04694356 = queryNorm
                0.39947033 = fieldWeight in 142, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.09375 = fieldNorm(doc=142)
          0.25 = coord(1/4)
      0.25 = coord(1/4)
    
  20. Pejtersen, A.M.: Design of a computer-aided user-system dialogue based on an analysis of users' search behaviour (1984) 0.00
    0.0035313342 = product of:
      0.014125337 = sum of:
        0.014125337 = product of:
          0.056501348 = sum of:
            0.056501348 = weight(_text_:based in 1044) [ClassicSimilarity], result of:
              0.056501348 = score(doc=1044,freq=2.0), product of:
                0.14144066 = queryWeight, product of:
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.04694356 = queryNorm
                0.39947033 = fieldWeight in 1044, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0129938 = idf(docFreq=5906, maxDocs=44218)
                  0.09375 = fieldNorm(doc=1044)
          0.25 = coord(1/4)
      0.25 = coord(1/4)