Search (19 results, page 1 of 1)

  • theme_ss:"Automatisches Klassifizieren"
  • type_ss:"a"
  • year_i:[2000 TO 2010}
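  The three active filters are Lucene/Solr field queries: theme_ss and type_ss are string fields, year_i is an integer field, and the range [2000 TO 2010} includes 2000 but excludes 2010 (square bracket = inclusive bound, curly brace = exclusive bound). As a minimal sketch of how such a result list could be requested - assuming the index is served by Solr, with a hypothetical host, core name and main query - the filters map onto fq parameters, and debugQuery=true is one way to obtain per-document score explanations like those printed under each hit below. A second sketch after the result list recomputes the ClassicSimilarity score shown for the first hit.

    # Minimal sketch, assuming a Solr backend; host, core name and the main query
    # are hypothetical, only the three fq filters are taken from the page above.
    import requests

    params = {
        "q": "*:*",
        "fq": [
            'theme_ss:"Automatisches Klassifizieren"',  # theme facet
            'type_ss:"a"',                              # publication type facet
            "year_i:[2000 TO 2010}",                    # 2000 inclusive, 2010 exclusive
        ],
        "rows": 20,
        "wt": "json",
        "debugQuery": "true",  # returns the per-document score explanations
    }
    response = requests.get("http://localhost:8983/solr/literature/select", params=params)
    print(response.json()["response"]["numFound"], "results")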
  1. Hotho, A.; Bloehdorn, S.: Data Mining 2004 : Text classification by boosting weak learners based on terms and concepts (2004) 0.05
    0.05368833 = product of:
      0.08053249 = sum of:
        0.068795376 = product of:
          0.20638612 = sum of:
            0.20638612 = weight(_text_:3a in 562) [ClassicSimilarity], result of:
              0.20638612 = score(doc=562,freq=2.0), product of:
                0.3672233 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.043314792 = queryNorm
                0.56201804 = fieldWeight in 562, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.046875 = fieldNorm(doc=562)
          0.33333334 = coord(1/3)
        0.011737112 = product of:
          0.035211336 = sum of:
            0.035211336 = weight(_text_:22 in 562) [ClassicSimilarity], result of:
              0.035211336 = score(doc=562,freq=2.0), product of:
                0.15168102 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043314792 = queryNorm
                0.23214069 = fieldWeight in 562, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=562)
          0.33333334 = coord(1/3)
      0.6666667 = coord(2/3)
    
    Content
    Cf.: http://www.google.de/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cad=rja&ved=0CEAQFjAA&url=http%3A%2F%2Fciteseerx.ist.psu.edu%2Fviewdoc%2Fdownload%3Fdoi%3D10.1.1.91.4940%26rep%3Drep1%26type%3Dpdf&ei=dOXrUMeIDYHDtQahsIGACg&usg=AFQjCNHFWVh6gNPvnOrOS9R3rkrXCNVD-A&sig2=5I2F5evRfMnsttSgFF9g7Q&bvm=bv.1357316858,d.Yms.
    Date
    8. 1.2013 10:22:32
  2. Reiner, U.: Automatische DDC-Klassifizierung bibliografischer Titeldatensätze der Deutschen Nationalbibliografie (2009) 0.02
    0.016291663 = product of:
      0.048874985 = sum of:
        0.048874985 = product of:
          0.07331248 = sum of:
            0.049838252 = weight(_text_:informatik in 3284) [ClassicSimilarity], result of:
              0.049838252 = score(doc=3284,freq=2.0), product of:
                0.22101259 = queryWeight, product of:
                  5.1024737 = idf(docFreq=730, maxDocs=44218)
                  0.043314792 = queryNorm
                0.2254996 = fieldWeight in 3284, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.1024737 = idf(docFreq=730, maxDocs=44218)
                  0.03125 = fieldNorm(doc=3284)
            0.023474226 = weight(_text_:22 in 3284) [ClassicSimilarity], result of:
              0.023474226 = score(doc=3284,freq=2.0), product of:
                0.15168102 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043314792 = queryNorm
                0.15476047 = fieldWeight in 3284, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03125 = fieldNorm(doc=3284)
          0.6666667 = coord(2/3)
      0.33333334 = coord(1/3)
    
    Abstract
    Classifying objects (e.g. fauna, flora, texts) is a process that relies on human intelligence. Computer science - in particular the field of artificial intelligence (AI) - investigates, among other things, to what extent processes that require human intelligence can be automated. It has turned out that solving everyday problems poses a greater challenge than solving specialized problems such as building a chess computer; "Rybka", for example, has been the reigning computer chess world champion since June 2007. To what extent everyday problems can be solved with methods of artificial intelligence is, for the general case, still an open question. Natural language processing, e.g. language understanding, plays an essential role in solving everyday problems. Realizing "common sense" as a machine (in the Cyc knowledge base, in the form of facts and rules) has been Lenat's goal since 1984. Regarding "Cyc", the flagship AI project, there are Cyc optimists and Cyc pessimists. Understanding natural language (e.g. work titles, abstracts, prefaces, tables of contents) is also required for the intellectual classification of bibliographic title records or online publications, in order to classify these text objects correctly. Since 2007, the Deutsche Nationalbibliothek has been intellectually classifying almost all of its publications with the Dewey Decimal Classification (DDC).
    Date
    22. 1.2010 14:41:24
  3. Subramanian, S.; Shafer, K.E.: Clustering (2001) 0.01
    0.007824741 = product of:
      0.023474224 = sum of:
        0.023474224 = product of:
          0.07042267 = sum of:
            0.07042267 = weight(_text_:22 in 1046) [ClassicSimilarity], result of:
              0.07042267 = score(doc=1046,freq=2.0), product of:
                0.15168102 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043314792 = queryNorm
                0.46428138 = fieldWeight in 1046, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.09375 = fieldNorm(doc=1046)
          0.33333334 = coord(1/3)
      0.33333334 = coord(1/3)
    
    Date
    5. 5.2003 14:17:22
  4. Ruiz, M.E.; Srinivasan, P.: Combining machine learning and hierarchical indexing structures for text categorization (2001) 0.00
    0.0046058656 = product of:
      0.013817596 = sum of:
        0.013817596 = product of:
          0.041452788 = sum of:
            0.041452788 = weight(_text_:29 in 1595) [ClassicSimilarity], result of:
              0.041452788 = score(doc=1595,freq=2.0), product of:
                0.15236789 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.043314792 = queryNorm
                0.27205724 = fieldWeight in 1595, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1595)
          0.33333334 = coord(1/3)
      0.33333334 = coord(1/3)
    
    Date
    11. 5.2003 18:29:44
  5. Yoon, Y.; Lee, C.; Lee, G.G.: An effective procedure for constructing a hierarchical text classification system (2006) 0.00
    0.004564433 = product of:
      0.013693298 = sum of:
        0.013693298 = product of:
          0.041079894 = sum of:
            0.041079894 = weight(_text_:22 in 5273) [ClassicSimilarity], result of:
              0.041079894 = score(doc=5273,freq=2.0), product of:
                0.15168102 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043314792 = queryNorm
                0.2708308 = fieldWeight in 5273, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5273)
          0.33333334 = coord(1/3)
      0.33333334 = coord(1/3)
    
    Date
    22. 7.2006 16:24:52
  6. Yi, K.: Automatic text classification using library classification schemes : trends, issues and challenges (2007) 0.00
    0.004564433 = product of:
      0.013693298 = sum of:
        0.013693298 = product of:
          0.041079894 = sum of:
            0.041079894 = weight(_text_:22 in 2560) [ClassicSimilarity], result of:
              0.041079894 = score(doc=2560,freq=2.0), product of:
                0.15168102 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043314792 = queryNorm
                0.2708308 = fieldWeight in 2560, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=2560)
          0.33333334 = coord(1/3)
      0.33333334 = coord(1/3)
    
    Date
    22. 9.2008 18:31:54
  7. Drori, O.; Alon, N.: Using document classification for displaying search results (2003) 0.00
    0.003947885 = product of:
      0.011843654 = sum of:
        0.011843654 = product of:
          0.035530962 = sum of:
            0.035530962 = weight(_text_:29 in 1565) [ClassicSimilarity], result of:
              0.035530962 = score(doc=1565,freq=2.0), product of:
                0.15236789 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.043314792 = queryNorm
                0.23319192 = fieldWeight in 1565, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.046875 = fieldNorm(doc=1565)
          0.33333334 = coord(1/3)
      0.33333334 = coord(1/3)
    
    Source
    Journal of information science. 29(2003) no.2, S.97-106
  8. Chung, Y.-M.; Noh, Y.-H.: Developing a specialized directory system by automatically classifying Web documents (2003) 0.00
    0.003947885 = product of:
      0.011843654 = sum of:
        0.011843654 = product of:
          0.035530962 = sum of:
            0.035530962 = weight(_text_:29 in 1566) [ClassicSimilarity], result of:
              0.035530962 = score(doc=1566,freq=2.0), product of:
                0.15236789 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.043314792 = queryNorm
                0.23319192 = fieldWeight in 1566, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.046875 = fieldNorm(doc=1566)
          0.33333334 = coord(1/3)
      0.33333334 = coord(1/3)
    
    Source
    Journal of information science. 29(2003) no.2, S.117-126
  9. Li, T.; Zhu, S.; Ogihara, M.: Hierarchical document classification using automatically generated hierarchy (2007) 0.00
    0.003947885 = product of:
      0.011843654 = sum of:
        0.011843654 = product of:
          0.035530962 = sum of:
            0.035530962 = weight(_text_:29 in 4797) [ClassicSimilarity], result of:
              0.035530962 = score(doc=4797,freq=2.0), product of:
                0.15236789 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.043314792 = queryNorm
                0.23319192 = fieldWeight in 4797, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.046875 = fieldNorm(doc=4797)
          0.33333334 = coord(1/3)
      0.33333334 = coord(1/3)
    
    Source
    Journal of intelligent information systems. 29(2007) no.2, S.211-230
  10. Liu, R.-L.: Context recognition for hierarchical text classification (2009) 0.00
    0.0039123707 = product of:
      0.011737112 = sum of:
        0.011737112 = product of:
          0.035211336 = sum of:
            0.035211336 = weight(_text_:22 in 2760) [ClassicSimilarity], result of:
              0.035211336 = score(doc=2760,freq=2.0), product of:
                0.15168102 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043314792 = queryNorm
                0.23214069 = fieldWeight in 2760, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2760)
          0.33333334 = coord(1/3)
      0.33333334 = coord(1/3)
    
    Date
    22. 3.2009 19:11:54
  11. Pfeffer, M.: Automatische Vergabe von RVK-Notationen mittels fallbasiertem Schließen (2009) 0.00
    0.0039123707 = product of:
      0.011737112 = sum of:
        0.011737112 = product of:
          0.035211336 = sum of:
            0.035211336 = weight(_text_:22 in 3051) [ClassicSimilarity], result of:
              0.035211336 = score(doc=3051,freq=2.0), product of:
                0.15168102 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043314792 = queryNorm
                0.23214069 = fieldWeight in 3051, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=3051)
          0.33333334 = coord(1/3)
      0.33333334 = coord(1/3)
    
    Date
    22. 8.2009 19:51:28
  12. Ribeiro-Neto, B.; Laender, A.H.F.; Lima, L.R.S. de: An experimental study in automatically categorizing medical documents (2001) 0.00
    0.0032899042 = product of:
      0.009869712 = sum of:
        0.009869712 = product of:
          0.029609136 = sum of:
            0.029609136 = weight(_text_:29 in 5702) [ClassicSimilarity], result of:
              0.029609136 = score(doc=5702,freq=2.0), product of:
                0.15236789 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.043314792 = queryNorm
                0.19432661 = fieldWeight in 5702, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5702)
          0.33333334 = coord(1/3)
      0.33333334 = coord(1/3)
    
    Date
    29. 9.2001 13:59:42
  13. Chung, Y.M.; Lee, J.Y.: A corpus-based approach to comparative evaluation of statistical term association measures (2001) 0.00
    0.0032899042 = product of:
      0.009869712 = sum of:
        0.009869712 = product of:
          0.029609136 = sum of:
            0.029609136 = weight(_text_:29 in 5769) [ClassicSimilarity], result of:
              0.029609136 = score(doc=5769,freq=2.0), product of:
                0.15236789 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.043314792 = queryNorm
                0.19432661 = fieldWeight in 5769, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5769)
          0.33333334 = coord(1/3)
      0.33333334 = coord(1/3)
    
    Date
    29. 9.2001 14:01:18
  14. Ibekwe-SanJuan, F.; SanJuan, E.: From term variants to research topics (2002) 0.00
    0.0032899042 = product of:
      0.009869712 = sum of:
        0.009869712 = product of:
          0.029609136 = sum of:
            0.029609136 = weight(_text_:29 in 1853) [ClassicSimilarity], result of:
              0.029609136 = score(doc=1853,freq=2.0), product of:
                0.15236789 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.043314792 = queryNorm
                0.19432661 = fieldWeight in 1853, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1853)
          0.33333334 = coord(1/3)
      0.33333334 = coord(1/3)
    
    Source
    Knowledge organization. 29(2002) nos.3/4, S.181-197
  15. Automatische Klassifikation und Extraktion in Documentum (2005) 0.00
    0.0032899042 = product of:
      0.009869712 = sum of:
        0.009869712 = product of:
          0.029609136 = sum of:
            0.029609136 = weight(_text_:29 in 3974) [ClassicSimilarity], result of:
              0.029609136 = score(doc=3974,freq=2.0), product of:
                0.15236789 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.043314792 = queryNorm
                0.19432661 = fieldWeight in 3974, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=3974)
          0.33333334 = coord(1/3)
      0.33333334 = coord(1/3)
    
    Footnote
    Contact: LCI GmbH, Freiburger Str. 16, 79199 Kirchzarten, Tel.: (0 76 61) 9 89 96 10, Fax: (01212) 5 37 48 29 36, info@lci-software.com, www.lci-software.com
  16. Giorgetti, D.; Sebastiani, F.: Automating survey coding by multiclass text categorization techniques (2003) 0.00
    0.0032899042 = product of:
      0.009869712 = sum of:
        0.009869712 = product of:
          0.029609136 = sum of:
            0.029609136 = weight(_text_:29 in 5172) [ClassicSimilarity], result of:
              0.029609136 = score(doc=5172,freq=2.0), product of:
                0.15236789 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.043314792 = queryNorm
                0.19432661 = fieldWeight in 5172, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5172)
          0.33333334 = coord(1/3)
      0.33333334 = coord(1/3)
    
    Date
    9. 7.2006 10:29:12
  17. Kwon, O.W.; Lee, J.H.: Text categorization based on k-nearest neighbor approach for web site classification (2003) 0.00
    0.0032899042 = product of:
      0.009869712 = sum of:
        0.009869712 = product of:
          0.029609136 = sum of:
            0.029609136 = weight(_text_:29 in 1070) [ClassicSimilarity], result of:
              0.029609136 = score(doc=1070,freq=2.0), product of:
                0.15236789 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.043314792 = queryNorm
                0.19432661 = fieldWeight in 1070, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1070)
          0.33333334 = coord(1/3)
      0.33333334 = coord(1/3)
    
    Date
    27.12.2007 17:32:29
  18. Mengle, S.; Goharian, N.: Passage detection using text classification (2009) 0.00
    0.0032603093 = product of:
      0.009780928 = sum of:
        0.009780928 = product of:
          0.029342782 = sum of:
            0.029342782 = weight(_text_:22 in 2765) [ClassicSimilarity], result of:
              0.029342782 = score(doc=2765,freq=2.0), product of:
                0.15168102 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043314792 = queryNorm
                0.19345059 = fieldWeight in 2765, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2765)
          0.33333334 = coord(1/3)
      0.33333334 = coord(1/3)
    
    Date
    22. 3.2009 19:14:43
  19. Khoo, C.S.G.; Ng, K.; Ou, S.: An exploratory study of human clustering of Web pages (2003) 0.00
    0.0026082476 = product of:
      0.007824742 = sum of:
        0.007824742 = product of:
          0.023474226 = sum of:
            0.023474226 = weight(_text_:22 in 2741) [ClassicSimilarity], result of:
              0.023474226 = score(doc=2741,freq=2.0), product of:
                0.15168102 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043314792 = queryNorm
                0.15476047 = fieldWeight in 2741, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03125 = fieldNorm(doc=2741)
          0.33333334 = coord(1/3)
      0.33333334 = coord(1/3)
    
    Date
    12. 9.2004 9:56:22
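
  The figure attached to each hit is a Lucene ClassicSimilarity (TF-IDF) score, and the indented tree beneath it is Lucene's explanation of that score: for each matching term, queryWeight = idf * queryNorm and fieldWeight = tf * idf * fieldNorm with tf = sqrt(termFreq); the two are multiplied, and coord(m/n) scales each (sub-)query by the fraction of its clauses that matched. The sketch below recomputes the 0.05368833 shown for the first hit purely from the numbers in its explanation tree; the helper function name is illustrative, all constants come from the listing.

    # Minimal sketch recomputing the ClassicSimilarity score of hit 1; only the
    # numeric factors are taken from the explanation tree above.
    import math

    def term_score(freq, idf, query_norm, field_norm):
        """One term's contribution: queryWeight * fieldWeight."""
        query_weight = idf * query_norm                    # e.g. 0.3672233 for _text_:3a
        field_weight = math.sqrt(freq) * idf * field_norm  # tf(freq) = sqrt(freq)
        return query_weight * field_weight

    query_norm = 0.043314792
    t_3a = term_score(2.0, 8.478011, query_norm, 0.046875)   # ~0.2064 (_text_:3a)
    t_22 = term_score(2.0, 3.5018296, query_norm, 0.046875)  # ~0.0352 (_text_:22)

    # each term matched 1 of 3 clauses in its sub-query -> coord(1/3);
    # at the top level, 2 of 3 clauses matched -> coord(2/3)
    score = (t_3a * (1 / 3) + t_22 * (1 / 3)) * (2 / 3)
    print(round(score, 8))  # ~0.0537, matching the listed 0.05368833 up to float rounding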