Search (79 results, page 1 of 4)

  • theme_ss:"Automatisches Klassifizieren"
  1. Panyr, J.: Automatische Klassifikation und Information Retrieval : Anwendung und Entwicklung komplexer Verfahren in Information-Retrieval-Systemen und ihre Evaluierung (1986) 0.04
    0.035832528 = product of:
      0.08958132 = sum of:
        0.032795873 = weight(_text_:m in 32) [ClassicSimilarity], result of:
          0.032795873 = score(doc=32,freq=2.0), product of:
            0.09940409 = queryWeight, product of:
              2.4884486 = idf(docFreq=9980, maxDocs=44218)
              0.03994621 = queryNorm
            0.3299248 = fieldWeight in 32, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.4884486 = idf(docFreq=9980, maxDocs=44218)
              0.09375 = fieldNorm(doc=32)
        0.056785446 = weight(_text_:u in 32) [ClassicSimilarity], result of:
          0.056785446 = score(doc=32,freq=2.0), product of:
            0.13080163 = queryWeight, product of:
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.03994621 = queryNorm
            0.43413407 = fieldWeight in 32, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.09375 = fieldNorm(doc=32)
      0.4 = coord(2/5)
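The explain tree above is standard Lucene ClassicSimilarity output: each matching query term contributes queryWeight × fieldWeight, where queryWeight = idf × queryNorm and fieldWeight = tf × idf × fieldNorm with tf = √freq; the per-term sum is then scaled by the coordination factor coord(matched/total). A minimal Python sketch, using the values copied from the tree for result 1 (the function is illustrative, not Lucene's actual implementation):

```python
import math

def classic_similarity_term(freq, idf, query_norm, field_norm):
    """One term's contribution under Lucene's ClassicSimilarity:
    (idf * queryNorm) * (sqrt(freq) * idf * fieldNorm)."""
    query_weight = idf * query_norm
    field_weight = math.sqrt(freq) * idf * field_norm
    return query_weight * field_weight

# Values taken verbatim from the explain tree of result 1 (doc 32):
query_norm = 0.03994621
m = classic_similarity_term(freq=2.0, idf=2.4884486,
                            query_norm=query_norm, field_norm=0.09375)
u = classic_similarity_term(freq=2.0, idf=3.2744443,
                            query_norm=query_norm, field_norm=0.09375)

coord = 2 / 5            # 2 of 5 query terms matched this document
score = (m + u) * coord
print(round(score, 9))   # ≈ 0.035832528, the score shown above
```

The same recipe reproduces the other entries; nested sums carry their own inner coord(k/m) factor (as in results 2 and 3) and are scaled before being added to the outer sum.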
    
    Footnote
Also a dissertation, U Saarbrücken, 1985
    Type
    m
  2. Reiner, U.: Automatische DDC-Klassifizierung von bibliografischen Titeldatensätzen (2009) 0.03
    0.029752804 = product of:
      0.07438201 = sum of:
        0.047321208 = weight(_text_:u in 611) [ClassicSimilarity], result of:
          0.047321208 = score(doc=611,freq=2.0), product of:
            0.13080163 = queryWeight, product of:
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.03994621 = queryNorm
            0.3617784 = fieldWeight in 611, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.078125 = fieldNorm(doc=611)
        0.0270608 = product of:
          0.0541216 = sum of:
            0.0541216 = weight(_text_:22 in 611) [ClassicSimilarity], result of:
              0.0541216 = score(doc=611,freq=2.0), product of:
                0.13988481 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03994621 = queryNorm
                0.38690117 = fieldWeight in 611, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.078125 = fieldNorm(doc=611)
          0.5 = coord(1/2)
      0.4 = coord(2/5)
    
    Date
    22. 8.2009 12:54:24
  3. Hotho, A.; Bloehdorn, S.: Data Mining 2004 : Text classification by boosting weak learners based on terms and concepts (2004) 0.03
    0.025528142 = product of:
      0.063820355 = sum of:
        0.047583878 = product of:
          0.19033551 = sum of:
            0.19033551 = weight(_text_:3a in 562) [ClassicSimilarity], result of:
              0.19033551 = score(doc=562,freq=2.0), product of:
                0.3386644 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.03994621 = queryNorm
                0.56201804 = fieldWeight in 562, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.046875 = fieldNorm(doc=562)
          0.25 = coord(1/4)
        0.016236478 = product of:
          0.032472957 = sum of:
            0.032472957 = weight(_text_:22 in 562) [ClassicSimilarity], result of:
              0.032472957 = score(doc=562,freq=2.0), product of:
                0.13988481 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03994621 = queryNorm
                0.23214069 = fieldWeight in 562, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=562)
          0.5 = coord(1/2)
      0.4 = coord(2/5)
    
    Content
Vgl.: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.91.4940&rep=rep1&type=pdf.
    Date
    8. 1.2013 10:22:32
  4. Calado, P.; Cristo, M.; Gonçalves, M.A.; Moura, E.S. de; Ribeiro-Neto, B.; Ziviani, N.: Link-based similarity measures for the classification of Web documents (2006) 0.02
    0.02187561 = product of:
      0.054689027 = sum of:
        0.013664948 = weight(_text_:m in 4921) [ClassicSimilarity], result of:
          0.013664948 = score(doc=4921,freq=2.0), product of:
            0.09940409 = queryWeight, product of:
              2.4884486 = idf(docFreq=9980, maxDocs=44218)
              0.03994621 = queryNorm
            0.13746867 = fieldWeight in 4921, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.4884486 = idf(docFreq=9980, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4921)
        0.04102408 = weight(_text_:n in 4921) [ClassicSimilarity], result of:
          0.04102408 = score(doc=4921,freq=2.0), product of:
            0.17223433 = queryWeight, product of:
              4.3116565 = idf(docFreq=1611, maxDocs=44218)
              0.03994621 = queryNorm
            0.23818761 = fieldWeight in 4921, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.3116565 = idf(docFreq=1611, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4921)
      0.4 = coord(2/5)
    
  5. Rooney, N.; Patterson, D.; Galushka, M.; Dobrynin, V.; Smirnova, E.: ¬An investigation into the stability of contextual document clustering (2008) 0.02
    0.02187561 = product of:
      0.054689027 = sum of:
        0.013664948 = weight(_text_:m in 1356) [ClassicSimilarity], result of:
          0.013664948 = score(doc=1356,freq=2.0), product of:
            0.09940409 = queryWeight, product of:
              2.4884486 = idf(docFreq=9980, maxDocs=44218)
              0.03994621 = queryNorm
            0.13746867 = fieldWeight in 1356, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.4884486 = idf(docFreq=9980, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1356)
        0.04102408 = weight(_text_:n in 1356) [ClassicSimilarity], result of:
          0.04102408 = score(doc=1356,freq=2.0), product of:
            0.17223433 = queryWeight, product of:
              4.3116565 = idf(docFreq=1611, maxDocs=44218)
              0.03994621 = queryNorm
            0.23818761 = fieldWeight in 1356, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.3116565 = idf(docFreq=1611, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1356)
      0.4 = coord(2/5)
    
  6. Mengle, S.; Goharian, N.: Passage detection using text classification (2009) 0.02
    0.021821793 = product of:
      0.05455448 = sum of:
        0.04102408 = weight(_text_:n in 2765) [ClassicSimilarity], result of:
          0.04102408 = score(doc=2765,freq=2.0), product of:
            0.17223433 = queryWeight, product of:
              4.3116565 = idf(docFreq=1611, maxDocs=44218)
              0.03994621 = queryNorm
            0.23818761 = fieldWeight in 2765, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.3116565 = idf(docFreq=1611, maxDocs=44218)
              0.0390625 = fieldNorm(doc=2765)
        0.0135304 = product of:
          0.0270608 = sum of:
            0.0270608 = weight(_text_:22 in 2765) [ClassicSimilarity], result of:
              0.0270608 = score(doc=2765,freq=2.0), product of:
                0.13988481 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03994621 = queryNorm
                0.19345059 = fieldWeight in 2765, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2765)
          0.5 = coord(1/2)
      0.4 = coord(2/5)
    
    Date
    22. 3.2009 19:14:43
  7. Reiner, U.: Automatische DDC-Klassifizierung bibliografischer Titeldatensätze der Deutschen Nationalbibliografie (2009) 0.02
    0.017443767 = product of:
      0.043609414 = sum of:
        0.032785095 = weight(_text_:u in 3284) [ClassicSimilarity], result of:
          0.032785095 = score(doc=3284,freq=6.0), product of:
            0.13080163 = queryWeight, product of:
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.03994621 = queryNorm
            0.25064746 = fieldWeight in 3284, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.03125 = fieldNorm(doc=3284)
        0.01082432 = product of:
          0.02164864 = sum of:
            0.02164864 = weight(_text_:22 in 3284) [ClassicSimilarity], result of:
              0.02164864 = score(doc=3284,freq=2.0), product of:
                0.13988481 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03994621 = queryNorm
                0.15476047 = fieldWeight in 3284, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03125 = fieldNorm(doc=3284)
          0.5 = coord(1/2)
      0.4 = coord(2/5)
    
    Abstract
Classifying objects (e.g. fauna, flora, texts) is a process based on human intelligence. Computer science, and the field of artificial intelligence (AI) in particular, investigates, among other things, to what extent processes that require human intelligence can be automated. It has turned out that solving everyday problems poses a greater challenge than solving specialised problems such as building a chess computer: "Rybka", for example, has been the reigning computer chess world champion since June 2007. To what extent everyday problems can be solved with AI methods is, in the general case, still an open question. Processing natural language, for instance understanding it, plays an essential role in solving everyday problems. Realising "common sense" in a machine (in the Cyc knowledge base, in the form of facts and rules) has been Lenat's goal since 1984; regarding this AI flagship project there are Cyc optimists and Cyc pessimists. Understanding natural language (e.g. work titles, abstracts, prefaces, tables of contents) is also required for the intellectual classification of bibliographic title records or online publications, so that these text objects can be classified correctly. Since 2007, the Deutsche Nationalbibliothek has classified nearly all of its publications intellectually with the Dewey Decimal Classification (DDC).
Since the advent of the World Wide Web at the latest, the number of publications to be classified has been growing faster than they can be subject-indexed intellectually. Methods are therefore being sought to automate the classification of text objects, or at least to support intellectual classification. Methods for automatic document classification (information retrieval, IR) have existed since 1968, and for automatic text classification (ATC: Automated Text Categorization) since 1992. As ever more digital objects have become available on the World Wide Web, work on automatic text classification has increased markedly since about 1998; since 1996 this has also included work on automatic DDC and RVK classification of bibliographic title records and full-text documents. To our knowledge, these developments have so far been experimental systems rather than systems in continuous production use. The VZG project Colibri/DDC has also been concerned with, among other things, automatic DDC classification since 2006. The investigations and developments in this area address the research question: "Is it possible to achieve a substantively sound automatic DDC classification of all GVK-PLUS title records?"
    Date
    22. 1.2010 14:41:24
  8. Pfeffer, M.: Automatische Vergabe von RVK-Notationen mittels fallbasiertem Schließen (2009) 0.02
    0.015770664 = product of:
      0.039426662 = sum of:
        0.023190185 = weight(_text_:m in 3051) [ClassicSimilarity], result of:
          0.023190185 = score(doc=3051,freq=4.0), product of:
            0.09940409 = queryWeight, product of:
              2.4884486 = idf(docFreq=9980, maxDocs=44218)
              0.03994621 = queryNorm
            0.23329206 = fieldWeight in 3051, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.4884486 = idf(docFreq=9980, maxDocs=44218)
              0.046875 = fieldNorm(doc=3051)
        0.016236478 = product of:
          0.032472957 = sum of:
            0.032472957 = weight(_text_:22 in 3051) [ClassicSimilarity], result of:
              0.032472957 = score(doc=3051,freq=2.0), product of:
                0.13988481 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03994621 = queryNorm
                0.23214069 = fieldWeight in 3051, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=3051)
          0.5 = coord(1/2)
      0.4 = coord(2/5)
    
    Date
    22. 8.2009 19:51:28
    Imprint
    Frankfurt, M. : Klostermann
  9. Classification, automation, and new media : Proceedings of the 24th Annual Conference of the Gesellschaft für Klassifikation e.V., University of Passau, March 15 - 17, 2000 (2002) 0.01
    0.014930221 = product of:
      0.037325554 = sum of:
        0.013664948 = weight(_text_:m in 5997) [ClassicSimilarity], result of:
          0.013664948 = score(doc=5997,freq=2.0), product of:
            0.09940409 = queryWeight, product of:
              2.4884486 = idf(docFreq=9980, maxDocs=44218)
              0.03994621 = queryNorm
            0.13746867 = fieldWeight in 5997, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.4884486 = idf(docFreq=9980, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5997)
        0.023660604 = weight(_text_:u in 5997) [ClassicSimilarity], result of:
          0.023660604 = score(doc=5997,freq=2.0), product of:
            0.13080163 = queryWeight, product of:
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.03994621 = queryNorm
            0.1808892 = fieldWeight in 5997, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5997)
      0.4 = coord(2/5)
    
    Editor
    Gaul, W. u. G. Ritter
    Type
    m
  10. Cathey, R.J.; Jensen, E.C.; Beitzel, S.M.; Frieder, O.; Grossman, D.: Exploiting parallelism to support scalable hierarchical clustering (2007) 0.01
    0.01421116 = product of:
      0.0710558 = sum of:
        0.0710558 = weight(_text_:n in 448) [ClassicSimilarity], result of:
          0.0710558 = score(doc=448,freq=6.0), product of:
            0.17223433 = queryWeight, product of:
              4.3116565 = idf(docFreq=1611, maxDocs=44218)
              0.03994621 = queryNorm
            0.41255307 = fieldWeight in 448, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              4.3116565 = idf(docFreq=1611, maxDocs=44218)
              0.0390625 = fieldNorm(doc=448)
      0.2 = coord(1/5)
    
    Abstract
    A distributed memory parallel version of the group average hierarchical agglomerative clustering algorithm is proposed to enable scaling the document clustering problem to large collections. Using standard message passing operations reduces interprocess communication while maintaining efficient load balancing. In a series of experiments using a subset of a standard Text REtrieval Conference (TREC) test collection, our parallel hierarchical clustering algorithm is shown to be scalable in terms of processors efficiently used and the collection size. Results show that our algorithm performs close to the expected O(n**2/p) time on p processors rather than the worst-case O(n**3/p) time. Furthermore, the O(n**2/p) memory complexity per node allows larger collections to be clustered as the number of nodes increases. While partitioning algorithms such as k-means are trivially parallelizable, our results confirm those of other studies which showed that hierarchical algorithms produce significantly tighter clusters in the document clustering task. Finally, we show how our parallel hierarchical agglomerative clustering algorithm can be used as the clustering subroutine for a parallel version of the buckshot algorithm to cluster the complete TREC collection at near theoretical runtime expectations.
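The group-average criterion this abstract builds on can be sketched serially. The toy code below is not the paper's distributed algorithm (no message passing, no load balancing); it only illustrates the merge rule, and its naive closest-pair scan is exactly the worst-case O(n³) behaviour the parallel version is shown to avoid in practice. The Euclidean toy data are invented for illustration:

```python
import math

def group_average_hac(points, k):
    """Minimal serial sketch of group-average agglomerative clustering:
    repeatedly merge the two clusters with the smallest average
    pairwise distance until k clusters remain."""
    clusters = [[p] for p in points]

    def group_avg(c1, c2):
        # average pairwise distance between members of the two clusters
        return sum(math.dist(a, b) for a in c1 for b in c2) / (len(c1) * len(c2))

    while len(clusters) > k:
        # O(n^2) scan for the closest pair on each of O(n) merges
        i, j = min(
            ((i, j) for i in range(len(clusters)) for j in range(i + 1, len(clusters))),
            key=lambda ij: group_avg(clusters[ij[0]], clusters[ij[1]]),
        )
        clusters[i] += clusters.pop(j)
    return clusters

clusters = group_average_hac([(0, 0), (0, 1), (5, 5), (5, 6)], k=2)
print(sorted(sorted(c) for c in clusters))  # two tight clusters
```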
  11. Choi, B.; Peng, X.: Dynamic and hierarchical classification of Web pages (2004) 0.01
    0.013924035 = product of:
      0.06962018 = sum of:
        0.06962018 = weight(_text_:n in 2555) [ClassicSimilarity], result of:
          0.06962018 = score(doc=2555,freq=4.0), product of:
            0.17223433 = queryWeight, product of:
              4.3116565 = idf(docFreq=1611, maxDocs=44218)
              0.03994621 = queryNorm
            0.40421778 = fieldWeight in 2555, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              4.3116565 = idf(docFreq=1611, maxDocs=44218)
              0.046875 = fieldNorm(doc=2555)
      0.2 = coord(1/5)
    
    Abstract
Automatic classification of Web pages is an effective way to organise the vast amount of information and to assist in retrieving relevant information from the Internet. Although many automatic classification systems have been proposed, most of them ignore the conflict between the fixed number of categories and the growing number of Web pages being added into the systems. They also require searching through all existing categories to make any classification. This article proposes a dynamic and hierarchical classification system that is capable of adding new categories as required, organising the Web pages into a tree structure, and classifying Web pages by searching through only one path of the tree. The proposed single-path search technique reduces the search complexity from O(n) to O(log(n)). Test results show that the system improves the accuracy of classification by 6 percent in comparison to related systems. The dynamic-category expansion technique also achieves satisfying results for adding new categories into the system as required.
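The single-path idea in this abstract can be sketched as a greedy descent: instead of scoring a page against every leaf category, choose the best-matching child at each level of the category tree, giving logarithmically many comparisons. The tree layout, the `profile` term sets, and the term-overlap score below are invented for illustration, not taken from the paper:

```python
def score(terms, node):
    # toy similarity: term overlap with the category's profile terms
    return len(terms & node["profile"])

def classify(page_terms, node):
    """Descend one root-to-leaf path, picking the best child per level."""
    while node["children"]:
        node = max(node["children"], key=lambda c: score(page_terms, c))
    return node["name"]

# Hypothetical two-level category tree:
tree = {
    "name": "root", "profile": set(),
    "children": [
        {"name": "science", "profile": {"physics", "biology"}, "children": [
            {"name": "physics", "profile": {"physics", "quantum"}, "children": []},
            {"name": "biology", "profile": {"biology", "cell"}, "children": []},
        ]},
        {"name": "arts", "profile": {"music", "painting"}, "children": []},
    ],
}
print(classify({"quantum", "physics"}, tree))  # descends root -> science -> physics
```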
  12. Schek, M.: Automatische Klassifizierung in Erschließung und Recherche eines Pressearchivs (2006) 0.01
    0.013755443 = product of:
      0.034388606 = sum of:
        0.015460123 = weight(_text_:m in 6043) [ClassicSimilarity], result of:
          0.015460123 = score(doc=6043,freq=4.0), product of:
            0.09940409 = queryWeight, product of:
              2.4884486 = idf(docFreq=9980, maxDocs=44218)
              0.03994621 = queryNorm
            0.15552804 = fieldWeight in 6043, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.4884486 = idf(docFreq=9980, maxDocs=44218)
              0.03125 = fieldNorm(doc=6043)
        0.018928483 = weight(_text_:u in 6043) [ClassicSimilarity], result of:
          0.018928483 = score(doc=6043,freq=2.0), product of:
            0.13080163 = queryWeight, product of:
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.03994621 = queryNorm
            0.14471136 = fieldWeight in 6043, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.03125 = fieldNorm(doc=6043)
      0.4 = coord(2/5)
    
    Source
    Spezialbibliotheken zwischen Auftrag und Ressourcen: 6.-9. September 2005 in München, 30. Arbeits- und Fortbildungstagung der ASpB e.V. / Sektion 5 im Deutschen Bibliotheksverband. Red.: M. Brauer
    Theme
    Semantisches Umfeld in Indexierung u. Retrieval
  13. Fuhr, N.: Klassifikationsverfahren bei der automatischen Indexierung (1983) 0.01
    0.013127707 = product of:
      0.065638535 = sum of:
        0.065638535 = weight(_text_:n in 7697) [ClassicSimilarity], result of:
          0.065638535 = score(doc=7697,freq=2.0), product of:
            0.17223433 = queryWeight, product of:
              4.3116565 = idf(docFreq=1611, maxDocs=44218)
              0.03994621 = queryNorm
            0.38110018 = fieldWeight in 7697, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.3116565 = idf(docFreq=1611, maxDocs=44218)
              0.0625 = fieldNorm(doc=7697)
      0.2 = coord(1/5)
    
  14. Egbert, J.; Biber, D.; Davies, M.: Developing a bottom-up, user-based method of web register classification (2015) 0.01
    0.013053766 = product of:
      0.032634415 = sum of:
        0.016397936 = weight(_text_:m in 2158) [ClassicSimilarity], result of:
          0.016397936 = score(doc=2158,freq=2.0), product of:
            0.09940409 = queryWeight, product of:
              2.4884486 = idf(docFreq=9980, maxDocs=44218)
              0.03994621 = queryNorm
            0.1649624 = fieldWeight in 2158, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.4884486 = idf(docFreq=9980, maxDocs=44218)
              0.046875 = fieldNorm(doc=2158)
        0.016236478 = product of:
          0.032472957 = sum of:
            0.032472957 = weight(_text_:22 in 2158) [ClassicSimilarity], result of:
              0.032472957 = score(doc=2158,freq=2.0), product of:
                0.13988481 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03994621 = queryNorm
                0.23214069 = fieldWeight in 2158, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2158)
          0.5 = coord(1/2)
      0.4 = coord(2/5)
    
    Date
    4. 8.2015 19:22:04
  15. Meder, N.: Artificial intelligence as a tool of classification, or: the network of language games as cognitive paradigm (1985) 0.01
    0.011486744 = product of:
      0.057433717 = sum of:
        0.057433717 = weight(_text_:n in 7694) [ClassicSimilarity], result of:
          0.057433717 = score(doc=7694,freq=2.0), product of:
            0.17223433 = queryWeight, product of:
              4.3116565 = idf(docFreq=1611, maxDocs=44218)
              0.03994621 = queryNorm
            0.33346266 = fieldWeight in 7694, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.3116565 = idf(docFreq=1611, maxDocs=44218)
              0.0546875 = fieldNorm(doc=7694)
      0.2 = coord(1/5)
    
  16. Reiner, U.: Automatic analysis of DDC notations (2007) 0.01
    0.0113570895 = product of:
      0.056785446 = sum of:
        0.056785446 = weight(_text_:u in 118) [ClassicSimilarity], result of:
          0.056785446 = score(doc=118,freq=2.0), product of:
            0.13080163 = queryWeight, product of:
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.03994621 = queryNorm
            0.43413407 = fieldWeight in 118, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.09375 = fieldNorm(doc=118)
      0.2 = coord(1/5)
    
  17. Wu, M.; Fuller, M.; Wilkinson, R.: Using clustering and classification approaches in interactive retrieval (2001) 0.01
    0.010822087 = product of:
      0.05411043 = sum of:
        0.05411043 = weight(_text_:m in 2666) [ClassicSimilarity], result of:
          0.05411043 = score(doc=2666,freq=4.0), product of:
            0.09940409 = queryWeight, product of:
              2.4884486 = idf(docFreq=9980, maxDocs=44218)
              0.03994621 = queryNorm
            0.5443481 = fieldWeight in 2666, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.4884486 = idf(docFreq=9980, maxDocs=44218)
              0.109375 = fieldNorm(doc=2666)
      0.2 = coord(1/5)
    
  18. Schek, M.: Automatische Klassifizierung und Visualisierung im Archiv der Süddeutschen Zeitung (2005) 0.01
    0.010451154 = product of:
      0.026127884 = sum of:
        0.009565463 = weight(_text_:m in 4884) [ClassicSimilarity], result of:
          0.009565463 = score(doc=4884,freq=2.0), product of:
            0.09940409 = queryWeight, product of:
              2.4884486 = idf(docFreq=9980, maxDocs=44218)
              0.03994621 = queryNorm
            0.09622806 = fieldWeight in 4884, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.4884486 = idf(docFreq=9980, maxDocs=44218)
              0.02734375 = fieldNorm(doc=4884)
        0.01656242 = weight(_text_:u in 4884) [ClassicSimilarity], result of:
          0.01656242 = score(doc=4884,freq=2.0), product of:
            0.13080163 = queryWeight, product of:
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.03994621 = queryNorm
            0.12662244 = fieldWeight in 4884, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.02734375 = fieldNorm(doc=4884)
      0.4 = coord(2/5)
    
    Theme
    Semantisches Umfeld in Indexierung u. Retrieval
  19. Drori, O.; Alon, N.: Using document classification for displaying search results (2003) 0.01
    0.00984578 = product of:
      0.0492289 = sum of:
        0.0492289 = weight(_text_:n in 1565) [ClassicSimilarity], result of:
          0.0492289 = score(doc=1565,freq=2.0), product of:
            0.17223433 = queryWeight, product of:
              4.3116565 = idf(docFreq=1611, maxDocs=44218)
              0.03994621 = queryNorm
            0.28582513 = fieldWeight in 1565, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.3116565 = idf(docFreq=1611, maxDocs=44218)
              0.046875 = fieldNorm(doc=1565)
      0.2 = coord(1/5)
    
  20. Finn, A.; Kushmerick, N.: Learning to classify documents according to genre (2006) 0.01
    0.00984578 = product of:
      0.0492289 = sum of:
        0.0492289 = weight(_text_:n in 6010) [ClassicSimilarity], result of:
          0.0492289 = score(doc=6010,freq=2.0), product of:
            0.17223433 = queryWeight, product of:
              4.3116565 = idf(docFreq=1611, maxDocs=44218)
              0.03994621 = queryNorm
            0.28582513 = fieldWeight in 6010, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.3116565 = idf(docFreq=1611, maxDocs=44218)
              0.046875 = fieldNorm(doc=6010)
      0.2 = coord(1/5)
    

Languages

  • e 60
  • d 18

Types

  • a 61
  • el 11
  • m 5
  • x 3
  • r 2
  • s 2
  • d 1