-
Langridge, D.W.: Inhaltsanalyse: Grundlagen und Methoden (1994)
- Abstract
- One step in subject indexing has so far remained almost unnoticed in the entire library science literature, although it forms its starting point and is therefore of fundamental importance: the analysis of the content of documents. The aim of Langridge's work, now available for the first time in German translation, is to enable clear statements about the content and nature of a document on the basis of well-founded criteria. Building on a foundation of philosophically derived forms of knowledge, Langridge uses a wealth of concrete examples to show ways, but also wrong turns, of approaching the content of documents and of arriving at results that objective criteria make verifiable. He thus addresses both students of the field, by laying out the fundamental structures of knowledge, and experienced practitioners who want to make their work more effective and to put their decisions on a footing that objectifies personal judgment.
- Classification
- AN 95550 Allgemeines / Buch- und Bibliothekswesen, Informationswissenschaft / Informationswissenschaft / Informationspraxis / Sacherschließung / Verfahren
AN 75000 Allgemeines / Buch- und Bibliothekswesen, Informationswissenschaft / Bibliothekswesen / Sacherschließung in Bibliotheken / Allgemeines
AN 95100 Allgemeines / Buch- und Bibliothekswesen, Informationswissenschaft / Informationswissenschaft / Informationspraxis / Referieren, Klassifizieren, Indexieren
- Content
- Inhalt: Inhaltsanalyse - Ziel und Zweck / Wissensformen / Themen / Dokumentenformen / Inhaltsverdichtung / Inhaltsverdichtung in praktischen Beispielen / Wissensstrukturen in Ordnungssystemen / Tiefenanalyse
- Pages
- 159 S
-
Klüver, J.; Kier, R.: Rekonstruktion und Verstehen : ein Computer-Programm zur Interpretation sozialwissenschaftlicher Texte (1994)
- Source
- Sprache und Datenverarbeitung. 18(1994) H.1, S.3-15
-
Nohr, H.: Inhaltsanalyse (1999)
- Abstract
- Content analysis is the elementary sub-process of the indexing of documents. Despite this central position within subject-based document description, the process of content analysis still receives too little attention in theory and practice. The reason for this neglect is the supposedly subjective character of the comprehension process. To overcome this problem, the precise object of content analysis is first determined. From this, methodologically more advanced approaches and procedures for content analysis can be derived. Finally, some further tasks of content analysis, such as a qualitative assessment, are discussed
- Source
- nfd Information - Wissenschaft und Praxis. 50(1999) H.2, S.69-78
-
Hildebrandt, B.; Moratz, R.; Rickheit, G.; Sagerer, G.: Kognitive Modellierung von Sprach- und Bildverstehen (1996)
- Pages
- S.2-13
-
Chen, H.; Ng, T.: ¬An algorithmic approach to concept exploration in a large knowledge network (automatic thesaurus consultation) : symbolic branch-and-bound search versus connectionist Hopfield Net Activation (1995)
- Abstract
- Presents a framework for knowledge discovery and concept exploration. In order to enhance the concept exploration capability of knowledge-based systems and to alleviate the limitations of the manual browsing approach, develops 2 spreading-activation-based algorithms for concept exploration in large, heterogeneous networks of concepts (e.g. multiple thesauri). One algorithm, which is based on the symbolic AI paradigm, performs a conventional branch-and-bound search on a semantic net representation to identify other highly relevant concepts (a serial, optimal search process). The 2nd algorithm, which is based on the neural network approach, executes the Hopfield net parallel relaxation and convergence process to identify 'convergent' concepts for some initial queries (a parallel, heuristic search process). Tests these 2 algorithms on a large text-based knowledge network of about 13,000 nodes (terms) and 80,000 directed links in the area of computing technologies
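The spreading-activation search summarized in this abstract can be sketched in a few lines. The toy concept network, the link weights, and the `explore` helper below are hypothetical illustrations of the general branch-and-bound idea, not the authors' implementation:

```python
from heapq import heappush, heappop

# Hypothetical toy concept network: directed, weighted links between terms;
# a weight in (0, 1] expresses the strength of the association.
LINKS = {
    "neural networks": {"hopfield net": 0.9, "machine learning": 0.8},
    "hopfield net": {"parallel relaxation": 0.7},
    "machine learning": {"search algorithms": 0.6},
    "search algorithms": {"branch and bound": 0.9},
}

def explore(start, k=3):
    """Best-first (branch-and-bound style) spreading activation:
    always expand the most strongly activated concept next; activation
    decays multiplicatively along each traversed link."""
    best = {start: 1.0}
    heap = [(-1.0, start)]          # max-heap via negated activation
    while heap:
        neg_act, concept = heappop(heap)
        act = -neg_act
        for neighbour, weight in LINKS.get(concept, {}).items():
            a = act * weight
            if a > best.get(neighbour, 0.0):   # bound: prune weaker paths
                best[neighbour] = a
                heappush(heap, (-a, neighbour))
    best.pop(start)                 # the query term itself is not a result
    return sorted(best.items(), key=lambda kv: -kv[1])[:k]
```

Starting from "neural networks", the search surfaces directly linked concepts first and then more distant ones with decayed activation.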
- Source
- Journal of the American Society for Information Science. 46(1995) no.5, S.348-369
- Theme
- Konzeption und Anwendung des Prinzips Thesaurus
-
Mayring, P.: Qualitative Inhaltsanalyse : Grundlagen und Techniken (1990)
- Abstract
- "Content analysis aims to: analyze communication, analyze fixed communication, and in doing so proceed systematically, that is, in a rule-governed and also theory-guided manner, with the goal of drawing inferences about particular aspects of the communication" (S.11)
- Pages
- 118 S
-
From information to knowledge : conceptual and content analysis by computer (1995)
- Content
- SCHMIDT, K.M.: Concepts - content - meaning: an introduction; DUCHASTEL, J. et al.: The SACAO project: using computation toward textual data analysis; PAQUIN, L.-C. and L. DUPUY: An approach to expertise transfer: computer-assisted text analysis; HOGENRAAD, R., Y. BESTGEN and J.-L. NYSTEN: Terrorist rhetoric: texture and architecture; MOHLER, P.P.: On the interaction between reading and computing: an interpretative approach to content analysis; LANCASHIRE, I.: Computer tools for cognitive stylistics; MERGENTHALER, E.: An outline of knowledge based text analysis; NAMENWIRTH, J.Z.: Ideography in computer-aided content analysis; WEBER, R.P. and J.Z. NAMENWIRTH: Content-analytic indicators: a self-critique; McKINNON, A.: Optimizing the aberrant frequency word technique; ROSATI, R.: Factor analysis in classical archaeology: export patterns of Attic pottery trade; PETRILLO, P.S.: Old and new worlds: ancient coinage and modern technology; DARANYI, S., S. MARJAI et al.: Caryatids and the measurement of semiosis in architecture; ZARRI, G.P.: Intelligent information retrieval: an application in the field of historical biographical data; BOUCHARD, G., R. ROY et al.: Computers and genealogy: from family reconstitution to population reconstruction; DEMÉLAS-BOHY, M.-D. and M. RENAUD: Instability, networks and political parties: a political history expert system prototype; DARANYI, S., A. ABRANYI and G. KOVACS: Knowledge extraction from ethnopoetic texts by multivariate statistical methods; FRAUTSCHI, R.L.: Measures of narrative voice in French prose fiction applied to textual samples from the enlightenment to the twentieth century; DANNENBERG, R. et al.: A project in computer music: the musician's workbench
- Footnote
- Rez. in: Knowledge organization 23(1996) no.3, S.181-182 (O. Sechser)
- Pages
- X,319 S
- Type
- s
-
Beghtol, C.: Toward a theory of fiction analysis for information storage and retrieval (1992)
- Abstract
- This paper examines various issues that arise in establishing a theoretical basis for an experimental fiction analysis system. It analyzes the warrants of fiction and of works about fiction. From this analysis, it derives classificatory requirements for a fiction system. Classificatory techniques that may contribute to the specification of data elements in fiction are suggested
- Date
- 5. 8.2006 13:22:08
- Pages
- S.39-48
-
Beghtol, C.: ¬The classification of fiction : the development of a system based on theoretical principles (1994)
- Abstract
- The work is an adaptation of the author's dissertation and has the following chapters: (1) background and introduction; (2) a problem in classification theory; (3) previous fiction analysis theories and systems and 'The left hand of darkness'; (4) fiction warrant and critical warrant; (5) experimental fiction analysis system (EFAS); (6) application and evaluation of EFAS. Appendix 1 gives references to fiction analysis systems and appendix 2 lists EFAS coding sheets
- Footnote
- Rez. in: Knowledge organization 21(1994) no.3, S.165-167 (W. Bies); JASIS 46(1995) no.5, S.389-390 (E.G. Bierbaum); Canadian journal of information and library science 20(1995) nos.3/4, S.52-53 (L. Rees-Potter)
- Pages
- 366 S
-
Gervereau, L.: Voir, comprendre, analyser les images (1994)
- Pages
- 191 S
-
Weimer, K.H.: ¬The nexus of subject analysis and bibliographic description : the case of multipart videos (1996)
- Abstract
- Examines the goals of bibliographic control, subject analysis and their relationship for audiovisual materials in general and multipart videotape recordings in particular. Concludes that intellectual access to multipart works is not adequately provided for when these materials are catalogued in collective set records. An alternative is to catalogue the parts separately. This method increases intellectual access by providing more detailed descriptive notes and subject analysis. As evidenced by the large number of records in the national database for parts of multipart videos, cataloguers have made the intellectual content of multipart videos more accessible by cataloguing the parts separately rather than collectively. This reverses the traditional cataloguing process to begin with subject analysis, resulting in the intellectual content of these materials driving the bibliographic description. Suggests ways of determining when multipart videos are best catalogued as sets or separately
- Source
- Cataloging and classification quarterly. 22(1996) no.2, S.5-18
-
Vieira, L.: Modèle d'analyse pour une classification du document iconographique (1999)
- Pages
- S.275-284
-
Allen, B.; Reser, D.: Content analysis in library and information science research (1990)
- Source
- Library and information science research. 12(1990) no.3, S.251-262
-
Hjoerland, B.: Subject representation and information seeking : contributions to a theory based on the theory of knowledge (1993)
- Footnote
- [Dissertation]. - Summary in: Knowledge organization 21(1994) no.2, S.94-98
-
Rowe, N.C.: Inferring depictions in natural-language captions for efficient access to picture data (1994)
0.00
- Abstract
- Multimedia data can require significant examination time to find desired features ('content analysis'). An alternative is to describe the data with natural-language captions and match those captions to English queries. But it is hard to include everything in the caption of a complicated datum, so significant content analysis may still seem required. We discuss linguistic clues in captions, both syntactic and semantic, that can simplify or eliminate content analysis. We introduce the notion of content depiction and rules for depiction inference. Our approach is implemented in an expert system that demonstrated significant increases in recall in experiments
- Source
- Information processing and management. 30(1994) no.3, S.379-388
-
Nohr, H.: ¬The training of librarians in content analysis : some thoughts on future necessities (1991)
0.00
- Abstract
- The training of librarians in content analysis is shaped both by the realities of the various application fields and by technological innovation. The present contribution attempts to identify the components of such training that a future-oriented instruction requires, and it stresses the importance of furnishing a sound theoretical basis, especially in the light of technological developments. The purpose of the training is to lay the foundation for 'action competence' on the part of the students
- Source
- International classification. 18(1991) no.3, S.153-157
-
Naves, M.M.L.: Analise de assunto : concepcoes (1996)
0.00
- Abstract
- Discusses subject analysis as an important stage in the indexing process and notes the confusion that can arise over the meaning of the term. Considers questions and difficulties surrounding subject analysis and the concept of aboutness
- Source
- Revista de Biblioteconomia de Brasilia. 20(1996) no.2, S.215-226
-
Beghtol, C.: Stories : applications of narrative discourse analysis to issues in information storage and retrieval (1997)
0.00
- Abstract
- The arts, humanities, and social sciences commonly borrow concepts and methods from the sciences, but interdisciplinary borrowing seldom occurs in the opposite direction. Research on narrative discourse is relevant to problems of documentary storage and retrieval, for the arts and humanities in particular, but also for other broad areas of knowledge. This paper views the potential application of narrative discourse analysis to information storage and retrieval problems from 2 perspectives: 1) analysis and comparison of narrative documents in all disciplines may be simplified if fundamental categories that occur in narrative documents can be isolated; and 2) the possibility of subdividing the world of knowledge initially into narrative and non-narrative documents is explored with particular attention to Werlich's work on text types
- Source
- Knowledge organization. 24(1997) no.2, S.64-71
-
Martindale, C.; McKenzie, D.: On the utility of content analysis in author attribution : 'The federalist' (1995)
0.00
- Source
- Computers and the humanities. 29(1995) no.4, S.259-270
-
Smith, P.J.; Normore, L.F.; Denning, R.; Johnson, W.P.: Computerized tools to support document analysis (1994)
0.00
- Pages
- S.221-237
- Source
- Challenges in indexing electronic text and images. Ed.: R. Fidel et al.