Search (34 results, page 2 of 2)

  • theme_ss:"Automatisches Klassifizieren"
  1. Giorgetti, D.; Sebastiani, F.: Automating survey coding by multiclass text categorization techniques (2003) 0.01
  2. Peng, F.; Huang, X.: Machine learning for Asian language text classification (2007) 0.01
  3. Fagni, T.; Sebastiani, F.: Selecting negative examples for hierarchical text classification: An experimental comparison (2010) 0.01
  4. Yang, P.; Gao, W.; Tan, Q.; Wong, K.-F.: A link-bridged topic model for cross-domain document classification (2013) 0.01
  5. Liu, R.-L.: Context recognition for hierarchical text classification (2009) 0.01
    Date
    22. 3.2009 19:11:54
  6. Pfeffer, M.: Automatische Vergabe von RVK-Notationen mittels fallbasiertem Schließen (2009) 0.01
    Date
    22. 8.2009 19:51:28
  7. Zhu, W.Z.; Allen, R.B.: Document clustering using the LSI subspace signature model (2013) 0.01
    Date
    23. 3.2013 13:22:36
  8. Egbert, J.; Biber, D.; Davies, M.: Developing a bottom-up, user-based method of web register classification (2015) 0.01
    Date
    4. 8.2015 19:22:04
  9. Billal, B.; Fonseca, A.; Sadat, F.; Lounis, H.: Semi-supervised learning and social media text analysis towards multi-labeling categorization (2017) 0.01
  10. Mengle, S.; Goharian, N.: Passage detection using text classification (2009) 0.01
    Date
    22. 3.2009 19:14:43
  11. Liu, R.-L.: A passage extractor for classification of disease aspect information (2013) 0.01
    Date
    28.10.2013 19:22:57
  12. Khoo, C.S.G.; Ng, K.; Ou, S.: ¬An exploratory study of human clustering of Web pages (2003) 0.01
    Date
    12. 9.2004 9:56:22
  13. Reiner, U.: Automatische DDC-Klassifizierung bibliografischer Titeldatensätze der Deutschen Nationalbibliografie (2009) 0.01
    Date
    22. 1.2010 14:41:24
  14. Oberhauser, O.: Automatisches Klassifizieren : Entwicklungsstand - Methodik - Anwendungsbereiche (2005) 0.00
    Footnote
    The question posed at the beginning of the work, whether "the techniques of automatic classification are already so advanced that large quantities of electronic documents [...] can be indexed satisfactorily" (p. 13), is answered by the author with a clear "no", which strongly qualifies Salton and McGill's 1983 claim "that simple automatic indexing methods are fast and inexpensive, and that they achieve recall and precision values at least as good as those of manual indexing with a controlled vocabulary" (Gerard Salton and Michael J. McGill: Information Retrieval. Hamburg et al. 1987, p. 64 f.). Oberhauser declines to speculate on why three of the large projects are no longer being pursued, but names lack of success, relocation of the work within the participating institutions, and funding problems as possible causes. He sees the greatest development potential for the automatic indexing of large document collections today in the fields of patent and media documentation. The library community, he argues, should follow developments there closely, since they "will certainly aim, in the medium term, at a qualitatively satisfactory full automation" (p. 146). Oberhauser's account is a thoroughly successful work that belongs in the reference collection of everyone interested in automatic indexing.