Search (56 results, page 3 of 3)

  • theme_ss:"Automatisches Klassifizieren"
  1. Malenica, M.; Smuc, T.; Snajder, J.; Basic, B.D.: Language morphology offset : text classification on a Croatian-English parallel corpus (2008) 0.01
    0.0076358644 = product of:
      0.015271729 = sum of:
        0.015271729 = product of:
          0.030543458 = sum of:
            0.030543458 = weight(_text_:j in 2035) [ClassicSimilarity], result of:
              0.030543458 = score(doc=2035,freq=2.0), product of:
                0.14500295 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.045634337 = queryNorm
                0.21064025 = fieldWeight in 2035, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2035)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  2. Ko, Y.; Seo, J.: Text classification from unlabeled documents with bootstrapping and feature projection techniques (2009) 0.01
    0.0076358644 = product of:
      0.015271729 = sum of:
        0.015271729 = product of:
          0.030543458 = sum of:
            0.030543458 = weight(_text_:j in 2452) [ClassicSimilarity], result of:
              0.030543458 = score(doc=2452,freq=2.0), product of:
                0.14500295 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.045634337 = queryNorm
                0.21064025 = fieldWeight in 2452, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2452)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  3. Puzicha, J.: Informationen finden! : Intelligente Suchmaschinentechnologie & automatische Kategorisierung (2007) 0.01
    0.0076358644 = product of:
      0.015271729 = sum of:
        0.015271729 = product of:
          0.030543458 = sum of:
            0.030543458 = weight(_text_:j in 2817) [ClassicSimilarity], result of:
              0.030543458 = score(doc=2817,freq=2.0), product of:
                0.14500295 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.045634337 = queryNorm
                0.21064025 = fieldWeight in 2817, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2817)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  4. Malo, P.; Sinha, A.; Wallenius, J.; Korhonen, P.: Concept-based document classification using Wikipedia and value function (2011) 0.01
    0.0076358644 = product of:
      0.015271729 = sum of:
        0.015271729 = product of:
          0.030543458 = sum of:
            0.030543458 = weight(_text_:j in 4948) [ClassicSimilarity], result of:
              0.030543458 = score(doc=4948,freq=2.0), product of:
                0.14500295 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.045634337 = queryNorm
                0.21064025 = fieldWeight in 4948, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.046875 = fieldNorm(doc=4948)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  5. Na, J.-C.; Sui, H.; Khoo, C.; Chan, S.; Zhou, Y.: Effectiveness of simple linguistic processing in automatic sentiment classification of product reviews (2004) 0.01
    0.00636322 = product of:
      0.01272644 = sum of:
        0.01272644 = product of:
          0.02545288 = sum of:
            0.02545288 = weight(_text_:j in 2624) [ClassicSimilarity], result of:
              0.02545288 = score(doc=2624,freq=2.0), product of:
                0.14500295 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.045634337 = queryNorm
                0.17553353 = fieldWeight in 2624, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2624)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  6. Pong, J.Y.-H.; Kwok, R.C.-W.; Lau, R.Y.-K.; Hao, J.-X.; Wong, P.C.-C.: A comparative study of two automatic document classification methods in a library setting (2008) 0.01
    0.00636322 = product of:
      0.01272644 = sum of:
        0.01272644 = product of:
          0.02545288 = sum of:
            0.02545288 = weight(_text_:j in 2532) [ClassicSimilarity], result of:
              0.02545288 = score(doc=2532,freq=2.0), product of:
                0.14500295 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.045634337 = queryNorm
                0.17553353 = fieldWeight in 2532, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2532)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  7. Wang, J.: An extensive study on automated Dewey Decimal Classification (2009) 0.01
    0.00636322 = product of:
      0.01272644 = sum of:
        0.01272644 = product of:
          0.02545288 = sum of:
            0.02545288 = weight(_text_:j in 3172) [ClassicSimilarity], result of:
              0.02545288 = score(doc=3172,freq=2.0), product of:
                0.14500295 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.045634337 = queryNorm
                0.17553353 = fieldWeight in 3172, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=3172)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  8. Humphrey, S.M.; Névéol, A.; Browne, A.; Gobeil, J.; Ruch, P.; Darmoni, S.J.: Comparing a rule-based versus statistical system for automatic categorization of MEDLINE documents according to biomedical specialty (2009) 0.01
    0.00636322 = product of:
      0.01272644 = sum of:
        0.01272644 = product of:
          0.02545288 = sum of:
            0.02545288 = weight(_text_:j in 3300) [ClassicSimilarity], result of:
              0.02545288 = score(doc=3300,freq=2.0), product of:
                0.14500295 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.045634337 = queryNorm
                0.17553353 = fieldWeight in 3300, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=3300)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  9. Golub, K.; Hansson, J.; Soergel, D.; Tudhope, D.: Managing classification in libraries : a methodological outline for evaluating automatic subject indexing and classification in Swedish library catalogues (2015) 0.01
    0.00636322 = product of:
      0.01272644 = sum of:
        0.01272644 = product of:
          0.02545288 = sum of:
            0.02545288 = weight(_text_:j in 2300) [ClassicSimilarity], result of:
              0.02545288 = score(doc=2300,freq=2.0), product of:
                0.14500295 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.045634337 = queryNorm
                0.17553353 = fieldWeight in 2300, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2300)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  10. Borodin, Y.; Polishchuk, V.; Mahmud, J.; Ramakrishnan, I.V.; Stent, A.: Live and learn from mistakes : a lightweight system for document classification (2013) 0.01
    0.00636322 = product of:
      0.01272644 = sum of:
        0.01272644 = product of:
          0.02545288 = sum of:
            0.02545288 = weight(_text_:j in 2722) [ClassicSimilarity], result of:
              0.02545288 = score(doc=2722,freq=2.0), product of:
                0.14500295 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.045634337 = queryNorm
                0.17553353 = fieldWeight in 2722, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2722)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  11. Ru, C.; Tang, J.; Li, S.; Xie, S.; Wang, T.: Using semantic similarity to reduce wrong labels in distant supervision for relation extraction (2018) 0.01
    0.00636322 = product of:
      0.01272644 = sum of:
        0.01272644 = product of:
          0.02545288 = sum of:
            0.02545288 = weight(_text_:j in 5055) [ClassicSimilarity], result of:
              0.02545288 = score(doc=5055,freq=2.0), product of:
                0.14500295 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.045634337 = queryNorm
                0.17553353 = fieldWeight in 5055, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5055)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  12. Han, K.; Rezapour, R.; Nakamura, K.; Devkota, D.; Miller, D.C.; Diesner, J.: An expert-in-the-loop method for domain-specific document categorization based on small training data (2023) 0.01
    0.00636322 = product of:
      0.01272644 = sum of:
        0.01272644 = product of:
          0.02545288 = sum of:
            0.02545288 = weight(_text_:j in 967) [ClassicSimilarity], result of:
              0.02545288 = score(doc=967,freq=2.0), product of:
                0.14500295 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.045634337 = queryNorm
                0.17553353 = fieldWeight in 967, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=967)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  13. Khoo, C.S.G.; Ng, K.; Ou, S.: An exploratory study of human clustering of Web pages (2003) 0.01
    0.006182823 = product of:
      0.012365646 = sum of:
        0.012365646 = product of:
          0.024731291 = sum of:
            0.024731291 = weight(_text_:22 in 2741) [ClassicSimilarity], result of:
              0.024731291 = score(doc=2741,freq=2.0), product of:
                0.15980367 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045634337 = queryNorm
                0.15476047 = fieldWeight in 2741, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03125 = fieldNorm(doc=2741)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    12. 9.2004 9:56:22
  14. Reiner, U.: Automatische DDC-Klassifizierung bibliografischer Titeldatensätze der Deutschen Nationalbibliografie (2009) 0.01
    0.006182823 = product of:
      0.012365646 = sum of:
        0.012365646 = product of:
          0.024731291 = sum of:
            0.024731291 = weight(_text_:22 in 3284) [ClassicSimilarity], result of:
              0.024731291 = score(doc=3284,freq=2.0), product of:
                0.15980367 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045634337 = queryNorm
                0.15476047 = fieldWeight in 3284, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03125 = fieldNorm(doc=3284)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    22. 1.2010 14:41:24
  15. Dolin, R.; Agrawal, D.; El Abbadi, A.; Pearlman, J.: Using automated classification for summarizing and selecting heterogeneous information sources (1998) 0.00
    0.0038179322 = product of:
      0.0076358644 = sum of:
        0.0076358644 = product of:
          0.015271729 = sum of:
            0.015271729 = weight(_text_:j in 1253) [ClassicSimilarity], result of:
              0.015271729 = score(doc=1253,freq=2.0), product of:
                0.14500295 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.045634337 = queryNorm
                0.105320126 = fieldWeight in 1253, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.0234375 = fieldNorm(doc=1253)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  16. Oberhauser, O.: Automatisches Klassifizieren : Entwicklungsstand - Methodik - Anwendungsbereiche (2005) 0.00
    0.00318161 = product of:
      0.00636322 = sum of:
        0.00636322 = product of:
          0.01272644 = sum of:
            0.01272644 = weight(_text_:j in 38) [ClassicSimilarity], result of:
              0.01272644 = score(doc=38,freq=2.0), product of:
                0.14500295 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.045634337 = queryNorm
                0.08776677 = fieldWeight in 38, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.01953125 = fieldNorm(doc=38)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Footnote
    The question posed at the beginning of the work, whether "the techniques of automatic classification are today already advanced enough that large volumes of electronic documents [...] can be indexed satisfactorily" (p. 13), is answered by the author with an unequivocal "no", which strongly qualifies Salton and McGill's 1983 claim "that simple automatic indexing procedures work quickly and inexpensively, and that they achieve recall and precision values at least as good as those of manual indexing with a controlled vocabulary" (Gerard Salton and Michael J. McGill: Information Retrieval. Hamburg et al. 1987, pp. 64 f.). Oberhauser does not wish to speculate on the reasons why three of the large projects are no longer being pursued, but names lack of success, a shift of work within the participating institutions, and funding problems as possible causes. He sees the greatest development potential for the automatic indexing of large document collections today in the areas of patent and media documentation. The library field should follow developments there closely, since they are "certainly aiming, in the medium term, at full automation of satisfactory quality" (p. 146). Oberhauser's account is a thoroughly successful work that belongs in the working library of anyone interested in automatic indexing."
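    A note on the relevance figures: the indented blocks under each result are Lucene explain() output for the classic TF-IDF model (ClassicSimilarity). The following minimal Python sketch, assuming the standard ClassicSimilarity formula and using the values listed for result 1 (term "_text_:j", doc 2035), reproduces the arithmetic; the variable names are mine, not Lucene's.

    import math

    # Figures copied from the explain tree for result 1 (term "_text_:j", doc 2035).
    doc_freq   = 5010         # docFreq: documents containing the term
    max_docs   = 44218        # maxDocs: documents in the index
    term_freq  = 2.0          # freq: occurrences of the term in the field
    query_norm = 0.045634337  # queryNorm (taken as given from the tree)
    field_norm = 0.046875     # fieldNorm: stored length normalization

    # ClassicSimilarity components.
    idf = 1.0 + math.log(max_docs / (doc_freq + 1))   # 3.1774964
    tf  = math.sqrt(term_freq)                        # 1.4142135

    query_weight = idf * query_norm                   # 0.14500295
    field_weight = tf * idf * field_norm              # 0.21064025
    term_score   = query_weight * field_weight        # 0.030543458

    # The two coord(1/2) factors in the tree halve the score twice.
    final_score = term_score * 0.5 * 0.5              # ~0.0076358644
    print(final_score)

    The other entries follow the same computation and differ only in the fieldNorm value (0.0390625 instead of 0.046875) and, for results 13 and 14, in the matched term ("_text_:22" with docFreq=3622, hence the different idf of 3.5018296 and queryWeight of 0.15980367).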

Languages

  • e 37
  • d 19

Types

  • a 47
  • el 9
  • m 2
  • r 2
  • d 1
  • x 1