-
Bock, H.-H.: Datenanalyse zur Strukturierung und Ordnung von Information (1989)
- Pages
- S.1-22
-
Bock, H.-H.: Automatische Klassifikation : theoretische und praktische Methoden zur Gruppierung und Strukturierung von Daten (Cluster-Analyse) (1974)
-
Reiner, U.: Automatische DDC-Klassifizierung von bibliografischen Titeldatensätzen (2009)
- Date
- 22. 8.2009 12:54:24
-
Kleinoeder, H.H.; Puzicha, J.: Automatische Katalogisierung am Beispiel einer Pilotanwendung (2002)
- Source
- Info 7. 17(2002) H.1, S.19-21
-
Pfeffer, M.: Automatische Vergabe von RVK-Notationen mittels fallbasiertem Schließen (2009)
- Date
- 22. 8.2009 19:51:28
-
Bollmann, P.; Konrad, E.; Schneider, H.-J.; Zuse, H.: Anwendung automatischer Klassifikationsverfahren mit dem System FAKYR (1978)
-
Brückner, T.; Dambeck, H.: Sortierautomaten : Grundlagen der Textklassifizierung (2003)
- Source
- c't. 2003, H.19, S.192-197
-
Wätjen, H.-J.; Diekmann, B.; Möller, G.; Carstensen, K.-U.: Bericht zum DFG-Projekt: GERHARD : German Harvest Automated Retrieval and Directory (1998)
-
Wätjen, H.-J.: Automatisches Sammeln, Klassifizieren und Indexieren von wissenschaftlich relevanten Informationsressourcen im deutschen World Wide Web : das DFG-Projekt GERHARD (1998)
-
Wätjen, H.-J.: GERHARD : Automatisches Sammeln, Klassifizieren und Indexieren von wissenschaftlich relevanten Informationsressourcen im deutschen World Wide Web (1998)
- Source
- B.I.T.online. 1(1998) H.4, S.279-290
-
Panyr, J.: Vektorraum-Modell und Clusteranalyse in Information-Retrieval-Systemen (1987)
- Source
- Nachrichten für Dokumentation. 38(1987) H.1, S.13-20
-
Fangmeyer, H.; Gloden, R.: Bewertung und Vergleich von Klassifikationsergebnissen bei automatischen Verfahren (1978)
-
Koch, T.: Nutzung von Klassifikationssystemen zur verbesserten Beschreibung, Organisation und Suche von Internetressourcen (1998)
- Source
- BuB. 50(1998) H.5, S.326-335
-
Illing, S.: Automatisiertes klinisches Codieren (2021)
- Source
- Information - Wissenschaft und Praxis. 72(2021) H.5/6, S.285-290
-
Reiner, U.: Automatische DDC-Klassifizierung bibliografischer Titeldatensätze der Deutschen Nationalbibliografie (2009)
- Date
- 22. 1.2010 14:41:24
-
Oberhauser, O.: Automatisches Klassifizieren und Bibliothekskataloge (2005)
- Source
- Bibliothek Technik Recht. Festschrift für Peter Kubalek zum 60. Geburtstag. Hrsg.: H. Hrusa
-
Kasprzik, A.: Automatisierte und semiautomatisierte Klassifizierung : eine Analyse aktueller Projekte (2014)
- Source
- Perspektive Bibliothek. 3(2014) H.1, S.85-110
-
Automatische Klassifikation und Extraktion in Documentum (2005)
- Source
- Information - Wissenschaft und Praxis. 56(2005) H.5/6, S.276
-
Reiner, U.: VZG-Projekt Colibri : Bewertung von automatisch DDC-klassifizierten Titeldatensätzen der Deutschen Nationalbibliothek (DNB) (2009)
- Abstract
- Since 2003, the VZG project Colibri/DDC has been concerned with automatic methods for the Dewey Decimal Classification (DDC). The project aims at uniform DDC indexing of bibliographic title records and at supporting both DDC experts and DDC laypersons, e.g. in the analysis and synthesis of DDC notations, their quality control, and DDC-based searching. The present report focuses on the first larger automatic DDC classification run and on the first automatic and intellectual evaluation with the classification component vc_dcl1. It is based on the 25,653 title records (12 weekly/monthly deliveries) of series A, B and H of the Deutsche Nationalbibliografie, made available by the Deutsche Nationalbibliothek (DNB) in November 2007. After the automatic DDC classification and automatic evaluation are explained in chapter 2, chapter 3 discusses the DNB report "Colibri_Auswertung_DDC_Endbericht_Sommer_2008". Facts are clarified and questions are raised whose answers will set the course for the further classification tests. Chapter 4 offers considerations, going beyond chapter 3, on continuing the automatic DDC classification. The report is intended to deepen the understanding of the automatic methods.
-
Groß, T.; Faden, M.: Automatische Indexierung elektronischer Dokumente an der Deutschen Zentralbibliothek für Wirtschaftswissenschaften : Bericht über die Jahrestagung der Internationalen Buchwissenschaftlichen Gesellschaft (2010)
- Source
- Bibliotheksdienst. 44(2010) H.12, S.1120-1135