Search (40 results, page 2 of 2)

  • theme_ss:"Automatisches Klassifizieren"
  1. Dolin, R.; Agrawal, D.; El Abbadi, A.; Pearlman, J.: Using automated classification for summarizing and selecting heterogeneous information sources (1998) 0.01
    0.0082798535 = product of:
      0.016559707 = sum of:
        0.016559707 = product of:
          0.033119414 = sum of:
            0.033119414 = weight(_text_:r in 316) [ClassicSimilarity], result of:
              0.033119414 = score(doc=316,freq=2.0), product of:
                0.15092614 = queryWeight, product of:
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.045593463 = queryNorm
                0.2194412 = fieldWeight in 316, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.046875 = fieldNorm(doc=316)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
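    Each relevance score above is a Lucene ClassicSimilarity explain tree. The arithmetic of the first tree can be checked with a short sketch; the constants are copied from the tree itself, and the idf reconstruction assumes Lucene's classic 1 + ln(maxDocs/(docFreq+1)) formula:

    ```python
    import math

    # ClassicSimilarity components for the term "r" in doc 316,
    # copied from the explain tree above (a sketch, not Lucene itself).
    idf = 3.3102584            # idf(docFreq=4387, maxDocs=44218)
    query_norm = 0.045593463   # queryNorm
    freq = 2.0                 # termFreq
    field_norm = 0.046875      # fieldNorm(doc=316)

    # idf should match Lucene's classic formula (assumption: 1 + ln(N/(df+1)))
    idf_check = 1 + math.log(44218 / (4387 + 1))

    tf = math.sqrt(freq)                      # 1.4142135
    query_weight = idf * query_norm           # 0.15092614
    field_weight = tf * idf * field_norm      # 0.2194412 = fieldWeight
    term_score = query_weight * field_weight  # 0.033119414

    # Two nested coord(1/2) factors halve the term score twice,
    # yielding the displayed document score 0.0082798535.
    final_score = term_score * 0.5 * 0.5
    ```

    The same arithmetic, with a different fieldNorm or idf, reproduces every other tree on this page.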
    
  2. Mukhopadhyay, S.; Peng, S.; Raje, R.; Palakal, M.; Mostafa, J.: Multi-agent information classification using dynamic acquaintance lists (2003) 0.01
    0.0082798535 = product of:
      0.016559707 = sum of:
        0.016559707 = product of:
          0.033119414 = sum of:
            0.033119414 = weight(_text_:r in 1755) [ClassicSimilarity], result of:
              0.033119414 = score(doc=1755,freq=2.0), product of:
                0.15092614 = queryWeight, product of:
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.045593463 = queryNorm
                0.2194412 = fieldWeight in 1755, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.046875 = fieldNorm(doc=1755)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  3. Prabowo, R.; Jackson, M.; Burden, P.; Knoell, H.-D.: Ontology-based automatic classification for the Web pages : design, implementation and evaluation (2002) 0.01
    0.0082798535 = product of:
      0.016559707 = sum of:
        0.016559707 = product of:
          0.033119414 = sum of:
            0.033119414 = weight(_text_:r in 3383) [ClassicSimilarity], result of:
              0.033119414 = score(doc=3383,freq=2.0), product of:
                0.15092614 = queryWeight, product of:
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.045593463 = queryNorm
                0.2194412 = fieldWeight in 3383, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.046875 = fieldNorm(doc=3383)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  4. Liu, R.-L.: Dynamic category profiling for text filtering and classification (2007) 0.01
    0.0082798535 = product of:
      0.016559707 = sum of:
        0.016559707 = product of:
          0.033119414 = sum of:
            0.033119414 = weight(_text_:r in 900) [ClassicSimilarity], result of:
              0.033119414 = score(doc=900,freq=2.0), product of:
                0.15092614 = queryWeight, product of:
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.045593463 = queryNorm
                0.2194412 = fieldWeight in 900, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.046875 = fieldNorm(doc=900)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  5. Cosh, K.J.; Burns, R.; Daniel, T.: Content clouds : classifying content in Web 2.0 (2008) 0.01
    0.0082798535 = product of:
      0.016559707 = sum of:
        0.016559707 = product of:
          0.033119414 = sum of:
            0.033119414 = weight(_text_:r in 2013) [ClassicSimilarity], result of:
              0.033119414 = score(doc=2013,freq=2.0), product of:
                0.15092614 = queryWeight, product of:
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.045593463 = queryNorm
                0.2194412 = fieldWeight in 2013, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2013)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  6. Puzicha, J.: Informationen finden! : Intelligente Suchmaschinentechnologie & automatische Kategorisierung (2007) 0.01
    0.0082798535 = product of:
      0.016559707 = sum of:
        0.016559707 = product of:
          0.033119414 = sum of:
            0.033119414 = weight(_text_:r in 2817) [ClassicSimilarity], result of:
              0.033119414 = score(doc=2817,freq=2.0), product of:
                0.15092614 = queryWeight, product of:
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.045593463 = queryNorm
                0.2194412 = fieldWeight in 2817, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2817)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Type
    r
  7. Kanaan, G.; Al-Shalabi, R.; Ghwanmeh, S.; Al-Ma'adeed, H.: A comparison of text-classification techniques applied to Arabic text (2009) 0.01
    0.0082798535 = product of:
      0.016559707 = sum of:
        0.016559707 = product of:
          0.033119414 = sum of:
            0.033119414 = weight(_text_:r in 3096) [ClassicSimilarity], result of:
              0.033119414 = score(doc=3096,freq=2.0), product of:
                0.15092614 = queryWeight, product of:
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.045593463 = queryNorm
                0.2194412 = fieldWeight in 3096, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.046875 = fieldNorm(doc=3096)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  8. Liu, R.-L.: Context-based term frequency assessment for text classification (2010) 0.01
    0.0082798535 = product of:
      0.016559707 = sum of:
        0.016559707 = product of:
          0.033119414 = sum of:
            0.033119414 = weight(_text_:r in 3331) [ClassicSimilarity], result of:
              0.033119414 = score(doc=3331,freq=2.0), product of:
                0.15092614 = queryWeight, product of:
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.045593463 = queryNorm
                0.2194412 = fieldWeight in 3331, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.046875 = fieldNorm(doc=3331)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  9. Desale, S.K.; Kumbhar, R.: Research on automatic classification of documents in library environment : a literature review (2013) 0.01
    0.0082798535 = product of:
      0.016559707 = sum of:
        0.016559707 = product of:
          0.033119414 = sum of:
            0.033119414 = weight(_text_:r in 1071) [ClassicSimilarity], result of:
              0.033119414 = score(doc=1071,freq=2.0), product of:
                0.15092614 = queryWeight, product of:
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.045593463 = queryNorm
                0.2194412 = fieldWeight in 1071, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.046875 = fieldNorm(doc=1071)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  10. Wu, M.; Liu, Y.-H.; Brownlee, R.; Zhang, X.: Evaluating utility and automatic classification of subject metadata from Research Data Australia (2021) 0.01
    0.0082798535 = product of:
      0.016559707 = sum of:
        0.016559707 = product of:
          0.033119414 = sum of:
            0.033119414 = weight(_text_:r in 453) [ClassicSimilarity], result of:
              0.033119414 = score(doc=453,freq=2.0), product of:
                0.15092614 = queryWeight, product of:
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.045593463 = queryNorm
                0.2194412 = fieldWeight in 453, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.046875 = fieldNorm(doc=453)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  11. Mengle, S.; Goharian, N.: Passage detection using text classification (2009) 0.01
    0.0077216057 = product of:
      0.015443211 = sum of:
        0.015443211 = product of:
          0.030886423 = sum of:
            0.030886423 = weight(_text_:22 in 2765) [ClassicSimilarity], result of:
              0.030886423 = score(doc=2765,freq=2.0), product of:
                0.15966053 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045593463 = queryNorm
                0.19345059 = fieldWeight in 2765, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2765)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    22. 3.2009 19:14:43
  12. Reiner, U.: VZG-Projekt Colibri : Bewertung von automatisch DDC-klassifizierten Titeldatensätzen der Deutschen Nationalbibliothek (DNB) (2009) 0.01
    0.006899878 = product of:
      0.013799756 = sum of:
        0.013799756 = product of:
          0.027599512 = sum of:
            0.027599512 = weight(_text_:r in 2675) [ClassicSimilarity], result of:
              0.027599512 = score(doc=2675,freq=2.0), product of:
                0.15092614 = queryWeight, product of:
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.045593463 = queryNorm
                0.18286766 = fieldWeight in 2675, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2675)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Type
    r
  13. Yilmaz, T.; Ozcan, R.; Altingovde, I.S.; Ulusoy, Ö.: Improving educational web search for question-like queries through subject classification (2019) 0.01
    0.006899878 = product of:
      0.013799756 = sum of:
        0.013799756 = product of:
          0.027599512 = sum of:
            0.027599512 = weight(_text_:r in 5041) [ClassicSimilarity], result of:
              0.027599512 = score(doc=5041,freq=2.0), product of:
                0.15092614 = queryWeight, product of:
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.045593463 = queryNorm
                0.18286766 = fieldWeight in 5041, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5041)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  14. Han, K.; Rezapour, R.; Nakamura, K.; Devkota, D.; Miller, D.C.; Diesner, J.: An expert-in-the-loop method for domain-specific document categorization based on small training data (2023) 0.01
    0.006899878 = product of:
      0.013799756 = sum of:
        0.013799756 = product of:
          0.027599512 = sum of:
            0.027599512 = weight(_text_:r in 967) [ClassicSimilarity], result of:
              0.027599512 = score(doc=967,freq=2.0), product of:
                0.15092614 = queryWeight, product of:
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.045593463 = queryNorm
                0.18286766 = fieldWeight in 967, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=967)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  15. Khoo, C.S.G.; Ng, K.; Ou, S.: An exploratory study of human clustering of Web pages (2003) 0.01
    0.0061772848 = product of:
      0.0123545695 = sum of:
        0.0123545695 = product of:
          0.024709139 = sum of:
            0.024709139 = weight(_text_:22 in 2741) [ClassicSimilarity], result of:
              0.024709139 = score(doc=2741,freq=2.0), product of:
                0.15966053 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045593463 = queryNorm
                0.15476047 = fieldWeight in 2741, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03125 = fieldNorm(doc=2741)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    12. 9.2004 9:56:22
  16. Reiner, U.: Automatische DDC-Klassifizierung bibliografischer Titeldatensätze der Deutschen Nationalbibliografie (2009) 0.01
    0.0061772848 = product of:
      0.0123545695 = sum of:
        0.0123545695 = product of:
          0.024709139 = sum of:
            0.024709139 = weight(_text_:22 in 3284) [ClassicSimilarity], result of:
              0.024709139 = score(doc=3284,freq=2.0), product of:
                0.15966053 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045593463 = queryNorm
                0.15476047 = fieldWeight in 3284, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03125 = fieldNorm(doc=3284)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    22. 1.2010 14:41:24
  17. Koch, T.; Ardö, A.; Brümmer, A.: The building and maintenance of robot based internet search services : A review of current indexing and data collection methods. Prepared to meet the requirements of Work Package 3 of EU Telematics for Research, project DESIRE. Version D3.11v0.3 (Draft version 3) (1996) 0.01
    0.0055199023 = product of:
      0.011039805 = sum of:
        0.011039805 = product of:
          0.02207961 = sum of:
            0.02207961 = weight(_text_:r in 1669) [ClassicSimilarity], result of:
              0.02207961 = score(doc=1669,freq=2.0), product of:
                0.15092614 = queryWeight, product of:
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.045593463 = queryNorm
                0.14629413 = fieldWeight in 1669, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.03125 = fieldNorm(doc=1669)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Type
    r
  18. Hoffmann, R.: Entwicklung einer benutzerunterstützten automatisierten Klassifikation von Web - Dokumenten : Untersuchung gegenwärtiger Methoden zur automatisierten Dokumentklassifikation und Implementierung eines Prototyps zum verbesserten Information Retrieval für das xFIND System (2002) 0.01
    0.0055199023 = product of:
      0.011039805 = sum of:
        0.011039805 = product of:
          0.02207961 = sum of:
            0.02207961 = weight(_text_:r in 4197) [ClassicSimilarity], result of:
              0.02207961 = score(doc=4197,freq=2.0), product of:
                0.15092614 = queryWeight, product of:
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.045593463 = queryNorm
                0.14629413 = fieldWeight in 4197, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.03125 = fieldNorm(doc=4197)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  19. Borko, H.: Research in computer based classification systems (1985) 0.00
    0.004829915 = product of:
      0.00965983 = sum of:
        0.00965983 = product of:
          0.01931966 = sum of:
            0.01931966 = weight(_text_:r in 3647) [ClassicSimilarity], result of:
              0.01931966 = score(doc=3647,freq=2.0), product of:
                0.15092614 = queryWeight, product of:
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.045593463 = queryNorm
                0.12800737 = fieldWeight in 3647, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.02734375 = fieldNorm(doc=3647)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    The selection in this reader by R. M. Needham and K. Sparck Jones reports an early approach to automatic classification that was taken in England. The following selection reviews various approaches that were being pursued in the United States at about the same time. It then discusses a particular approach initiated in the early 1960s by Harold Borko, at that time Head of the Language Processing and Retrieval Research Staff at the System Development Corporation, Santa Monica, California and, since 1966, a member of the faculty at the Graduate School of Library and Information Science, University of California, Los Angeles. As was described earlier, there are two steps in automatic classification, the first being to identify pairs of terms that are similar by virtue of co-occurring as index terms in the same documents, and the second being to form equivalence classes of intersubstitutable terms. To compute similarities, Borko and his associates used a standard correlation formula; to derive classification categories, where Needham and Sparck Jones used clumping, the Borko team used the statistical technique of factor analysis. The fact that documents can be classified automatically, and in any number of ways, is worthy of passing notice. Worthy of serious attention would be a demonstration that a computer-based classification system was effective in the organization and retrieval of documents. One reason for the inclusion of the following selection in the reader is that it addresses the question of evaluation. To evaluate the effectiveness of their automatically derived classification, Borko and his team asked three questions. The first was: Is the classification reliable? In other words, could the categories derived from one sample of texts be used to classify other texts? Reliability was assessed by a case-study comparison of the classes derived from three different samples of abstracts.
    The not-so-surprising conclusion reached was that automatically derived classes were reliable only to the extent that the sample from which they were derived was representative of the total document collection. The second evaluation question asked whether the classification was reasonable, in the sense of adequately describing the content of the document collection. The answer was sought by comparing the automatically derived categories with categories in a related classification system that was manually constructed. Here the conclusion was that the automatic method yielded categories that fairly accurately reflected the major areas of interest in the sample collection of texts; however, since there were only eleven such categories and they were quite broad, they could not be regarded as suitable for use in a university or any large general library. The third evaluation question asked whether automatic classification was accurate, in the sense of producing results similar to those obtainable by human classifiers. When using human classification as a criterion, automatic classification was found to be 50 percent accurate.
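    The first of the two steps the abstract describes can be sketched minimally. The tiny term-document matrix and term names below are invented for illustration, and plain Pearson correlation stands in for Borko's "standard correlation formula":

    ```python
    import math

    # Invented term-document occurrence vectors (1 = term indexes that doc).
    term_docs = {
        "retrieval":      [1, 1, 0, 1, 0],
        "classification": [1, 1, 0, 1, 1],
        "poetry":         [0, 0, 1, 0, 0],
    }

    def correlation(x, y):
        """Pearson correlation between two term-occurrence vectors."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # Terms that co-occur in the same documents correlate highly; the second
    # step (Borko used factor analysis, Needham and Sparck Jones clumping)
    # would group such pairs into classification categories.
    r_similar = correlation(term_docs["retrieval"], term_docs["classification"])
    r_distant = correlation(term_docs["retrieval"], term_docs["poetry"])
    ```

    In this toy matrix, "retrieval" and "classification" correlate positively while "retrieval" and "poetry" correlate negatively, which is exactly the signal the equivalence-class step exploits.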
  20. Dolin, R.; Agrawal, D.; El Abbadi, A.; Pearlman, J.: Using automated classification for summarizing and selecting heterogeneous information sources (1998) 0.00
    0.0041399268 = product of:
      0.0082798535 = sum of:
        0.0082798535 = product of:
          0.016559707 = sum of:
            0.016559707 = weight(_text_:r in 1253) [ClassicSimilarity], result of:
              0.016559707 = score(doc=1253,freq=2.0), product of:
                0.15092614 = queryWeight, product of:
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.045593463 = queryNorm
                0.1097206 = fieldWeight in 1253, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.3102584 = idf(docFreq=4387, maxDocs=44218)
                  0.0234375 = fieldNorm(doc=1253)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    

Languages

  • e 30
  • d 10

Types

  • a 30
  • el 10
  • r 4
  • x 2