Search (10 results, page 1 of 1)

  • year_i:[1990 TO 2000}
  • theme_ss:"Inhaltsanalyse"
  1. Gervereau, L.: Voir, comprendre, analyser les images (1994) 0.03
    0.030558715 = product of:
      0.12223486 = sum of:
        0.12223486 = weight(_text_:l in 8758) [ClassicSimilarity], result of:
          0.12223486 = score(doc=8758,freq=2.0), product of:
            0.17396861 = queryWeight, product of:
              3.9746525 = idf(docFreq=2257, maxDocs=44218)
              0.043769516 = queryNorm
            0.70262593 = fieldWeight in 8758, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9746525 = idf(docFreq=2257, maxDocs=44218)
              0.125 = fieldNorm(doc=8758)
      0.25 = coord(1/4)
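    The explain tree above can be verified by hand. A minimal sketch (plain Python, not using Lucene itself) of the ClassicSimilarity arithmetic it shows, with every constant taken directly from the tree: queryWeight = idf × queryNorm, fieldWeight = tf × idf × fieldNorm, and the final score = coord × queryWeight × fieldWeight.

```python
import math

# Recompute the ClassicSimilarity score for result 1 (doc 8758)
# using the constants from the explain tree above.
freq = 2.0
idf = 3.9746525           # idf(docFreq=2257, maxDocs=44218)
query_norm = 0.043769516  # queryNorm
field_norm = 0.125        # fieldNorm(doc=8758)
coord = 0.25              # coord(1/4): 1 of 4 query clauses matched

tf = math.sqrt(freq)                          # 1.4142135 = tf(freq=2.0)
query_weight = idf * query_norm               # 0.17396861
field_weight = tf * idf * field_norm          # 0.70262593
score = coord * query_weight * field_weight   # ~ 0.030558715

print(f"{score:.9f}")
```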
    
  2. Farrow, J.: All in the mind : concept analysis in indexing (1995) 0.03
    0.030077666 = product of:
      0.120310664 = sum of:
        0.120310664 = weight(_text_:van in 2926) [ClassicSimilarity], result of:
          0.120310664 = score(doc=2926,freq=2.0), product of:
            0.24408463 = queryWeight, product of:
              5.5765896 = idf(docFreq=454, maxDocs=44218)
              0.043769516 = queryNorm
            0.49290553 = fieldWeight in 2926, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              5.5765896 = idf(docFreq=454, maxDocs=44218)
              0.0625 = fieldNorm(doc=2926)
      0.25 = coord(1/4)
    
    Abstract
    The indexing process consists of the comprehension of the document to be indexed, followed by the production of a set of index terms. Differences between academic indexing and back-of-the-book indexing are discussed. Text comprehension is a branch of human information processing, and it is argued that the model of text comprehension and production developed by van Dijk and Kintsch can form the basis for a cognitive process model of indexing. Strategies for testing such a model are suggested.
  3. Vieira, L.: Modèle d'analyse pur une classification du document iconographique (1999) 0.02
    0.019099196 = product of:
      0.076396786 = sum of:
        0.076396786 = weight(_text_:l in 6320) [ClassicSimilarity], result of:
          0.076396786 = score(doc=6320,freq=2.0), product of:
            0.17396861 = queryWeight, product of:
              3.9746525 = idf(docFreq=2257, maxDocs=44218)
              0.043769516 = queryNorm
            0.4391412 = fieldWeight in 6320, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9746525 = idf(docFreq=2257, maxDocs=44218)
              0.078125 = fieldNorm(doc=6320)
      0.25 = coord(1/4)
    
  4. From information to knowledge : conceptual and content analysis by computer (1995) 0.02
    0.016540391 = product of:
      0.066161565 = sum of:
        0.066161565 = weight(_text_:l in 5392) [ClassicSimilarity], result of:
          0.066161565 = score(doc=5392,freq=6.0), product of:
            0.17396861 = queryWeight, product of:
              3.9746525 = idf(docFreq=2257, maxDocs=44218)
              0.043769516 = queryNorm
            0.38030747 = fieldWeight in 5392, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.9746525 = idf(docFreq=2257, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5392)
      0.25 = coord(1/4)
    
    Content
    SCHMIDT, K.M.: Concepts - content - meaning: an introduction; DUCHASTEL, J. et al.: The SACAO project: using computation toward textual data analysis; PAQUIN, L.-C. and L. DUPUY: An approach to expertise transfer: computer-assisted text analysis; HOGENRAAD, R., Y. BESTGEN and J.-L. NYSTEN: Terrorist rhetoric: texture and architecture; MOHLER, P.P.: On the interaction between reading and computing: an interpretative approach to content analysis; LANCASHIRE, I.: Computer tools for cognitive stylistics; MERGENTHALER, E.: An outline of knowledge based text analysis; NAMENWIRTH, J.Z.: Ideography in computer-aided content analysis; WEBER, R.P. and J.Z. NAMENWIRTH: Content-analytic indicators: a self-critique; McKINNON, A.: Optimizing the aberrant frequency word technique; ROSATI, R.: Factor analysis in classical archaeology: export patterns of Attic pottery trade; PETRILLO, P.S.: Old and new worlds: ancient coinage and modern technology; DARANYI, S., S. MARJAI et al.: Caryatids and the measurement of semiosis in architecture; ZARRI, G.P.: Intelligent information retrieval: an application in the field of historical biographical data; BOUCHARD, G., R. ROY et al.: Computers and genealogy: from family reconstitution to population reconstruction; DEMÉLAS-BOHY, M.-D. and M. RENAUD: Instability, networks and political parties: a political history expert system prototype; DARANYI, S., A. ABRANYI and G. KOVACS: Knowledge extraction from ethnopoetic texts by multivariate statistical methods; FRAUTSCHI, R.L.: Measures of narrative voice in French prose fiction applied to textual samples from the enlightenment to the twentieth century; DANNENBERG, R. et al.: A project in computer music: the musician's workbench
  5. Beghtol, C.: The classification of fiction : the development of a system based on theoretical principles (1994) 0.01
    0.013369438 = product of:
      0.053477753 = sum of:
        0.053477753 = weight(_text_:l in 3413) [ClassicSimilarity], result of:
          0.053477753 = score(doc=3413,freq=2.0), product of:
            0.17396861 = queryWeight, product of:
              3.9746525 = idf(docFreq=2257, maxDocs=44218)
              0.043769516 = queryNorm
            0.30739886 = fieldWeight in 3413, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9746525 = idf(docFreq=2257, maxDocs=44218)
              0.0546875 = fieldNorm(doc=3413)
      0.25 = coord(1/4)
    
    Footnote
    Reviewed in: Knowledge organization 21(1994) no.3, pp.165-167 (W. Bies); JASIS 46(1995) no.5, pp.389-390 (E.G. Bierbaum); Canadian journal of information and library science 20(1995) nos.3/4, pp.52-53 (L. Rees-Potter)
  6. Nohr, H.: Inhaltsanalyse (1999) 0.01
    0.006824918 = product of:
      0.027299672 = sum of:
        0.027299672 = product of:
          0.054599345 = sum of:
            0.054599345 = weight(_text_:der in 3430) [ClassicSimilarity], result of:
              0.054599345 = score(doc=3430,freq=16.0), product of:
                0.09777089 = queryWeight, product of:
                  2.2337668 = idf(docFreq=12875, maxDocs=44218)
                  0.043769516 = queryNorm
                0.5584417 = fieldWeight in 3430, product of:
                  4.0 = tf(freq=16.0), with freq of:
                    16.0 = termFreq=16.0
                  2.2337668 = idf(docFreq=12875, maxDocs=44218)
                  0.0625 = fieldNorm(doc=3430)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
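    The same arithmetic covers the nested tree above, where an inner coord(1/2) factor appears in addition to the outer coord(1/4), and freq=16 gives tf = √16 = 4. A sketch reusing the constants from this tree:

```python
import math

# Recompute the nested explain tree for result 6 (doc 3430).
# Two coord factors apply: the inner coord(1/2) and the outer coord(1/4).
freq = 16.0
idf = 2.2337668           # idf(docFreq=12875, maxDocs=44218)
query_norm = 0.043769516  # queryNorm
field_norm = 0.0625       # fieldNorm(doc=3430)

tf = math.sqrt(freq)                      # 4.0 = tf(freq=16.0)
query_weight = idf * query_norm           # 0.09777089
field_weight = tf * idf * field_norm      # 0.5584417
weight = query_weight * field_weight      # 0.054599345
score = 0.25 * 0.5 * weight               # ~ 0.006824918

print(f"{score:.9f}")
```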
    
    Abstract
    Content analysis is the elementary sub-process of document indexing. Despite this central position within subject-oriented document description, the process of content analysis still receives too little attention in theory and practice. The reason for this neglect lies in the supposedly subjective character of the comprehension process. To overcome this problem, the precise object of content analysis is first determined. From this, methodologically productive approaches and procedures for content analysis can be derived. Finally, some further tasks of content analysis, such as qualitative evaluation, are discussed.
  7. Beghtol, C.: Toward a theory of fiction analysis for information storage and retrieval (1992) 0.01
    0.0059301653 = product of:
      0.023720661 = sum of:
        0.023720661 = product of:
          0.047441322 = sum of:
            0.047441322 = weight(_text_:22 in 5830) [ClassicSimilarity], result of:
              0.047441322 = score(doc=5830,freq=2.0), product of:
                0.15327339 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043769516 = queryNorm
                0.30952093 = fieldWeight in 5830, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=5830)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Date
    5. 8.2006 13:22:08
  8. Weimer, K.H.: The nexus of subject analysis and bibliographic description : the case of multipart videos (1996) 0.00
    0.0044476236 = product of:
      0.017790494 = sum of:
        0.017790494 = product of:
          0.03558099 = sum of:
            0.03558099 = weight(_text_:22 in 6525) [ClassicSimilarity], result of:
              0.03558099 = score(doc=6525,freq=2.0), product of:
                0.15327339 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043769516 = queryNorm
                0.23214069 = fieldWeight in 6525, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=6525)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Source
    Cataloging and classification quarterly 22(1996) no.2, pp.5-18
  9. Mayring, P.: Qualitative Inhaltsanalyse : Grundlagen und Techniken (1990) 0.00
    0.003016216 = product of:
      0.012064864 = sum of:
        0.012064864 = product of:
          0.024129728 = sum of:
            0.024129728 = weight(_text_:der in 34) [ClassicSimilarity], result of:
              0.024129728 = score(doc=34,freq=2.0), product of:
                0.09777089 = queryWeight, product of:
                  2.2337668 = idf(docFreq=12875, maxDocs=44218)
                  0.043769516 = queryNorm
                0.2467987 = fieldWeight in 34, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.2337668 = idf(docFreq=12875, maxDocs=44218)
                  0.078125 = fieldNorm(doc=34)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Abstract
    "Content analysis aims to: analyse communication, analyse recorded communication, and to proceed systematically in doing so, that is, in a rule-governed and also theory-guided way, with the goal of drawing inferences about specific aspects of the communication" (p.11)
  10. Langridge, D.W.: Inhaltsanalyse: Grundlagen und Methoden (1994) 0.00
    0.0026121198 = product of:
      0.010448479 = sum of:
        0.010448479 = product of:
          0.020896958 = sum of:
            0.020896958 = weight(_text_:der in 3923) [ClassicSimilarity], result of:
              0.020896958 = score(doc=3923,freq=6.0), product of:
                0.09777089 = queryWeight, product of:
                  2.2337668 = idf(docFreq=12875, maxDocs=44218)
                  0.043769516 = queryNorm
                0.21373394 = fieldWeight in 3923, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  2.2337668 = idf(docFreq=12875, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=3923)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Abstract
    One step of subject indexing has remained almost unnoticed in the entire library science literature, although it forms its starting point and is therefore of fundamental importance: the analysis of a document's content. Langridge's work, now available for the first time in German translation, aims to make clear statements about the content and nature of a document possible on the basis of well-founded criteria. Building on a foundation of philosophically derived forms of knowledge, Langridge uses a wealth of concrete examples to show ways, as well as dead ends, of approaching the content of documents and arriving at results that become verifiable thanks to objective criteria. He thus addresses both students of the field, by laying out the fundamental structures of knowledge, and experienced practitioners who want to make their work more effective and place their decisions on a basis that objectifies personal judgements.