Search (50 results, page 1 of 3)

  • theme_ss:"Automatisches Klassifizieren"
  1. Hotho, A.; Bloehdorn, S.: Data Mining 2004 : Text classification by boosting weak learners based on terms and concepts (2004) 0.15
    0.15280244 = sum of:
      0.06001481 = product of:
        0.24005924 = sum of:
          0.24005924 = weight(_text_:3a in 562) [ClassicSimilarity], result of:
            0.24005924 = score(doc=562,freq=2.0), product of:
              0.42713797 = queryWeight, product of:
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.05038186 = queryNorm
              0.56201804 = fieldWeight in 562, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.046875 = fieldNorm(doc=562)
        0.25 = coord(1/4)
      0.09278763 = sum of:
        0.051831353 = weight(_text_:t in 562) [ClassicSimilarity], result of:
          0.051831353 = score(doc=562,freq=2.0), product of:
            0.19847474 = queryWeight, product of:
              3.9394085 = idf(docFreq=2338, maxDocs=44218)
              0.05038186 = queryNorm
            0.26114836 = fieldWeight in 562, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9394085 = idf(docFreq=2338, maxDocs=44218)
              0.046875 = fieldNorm(doc=562)
        0.040956277 = weight(_text_:22 in 562) [ClassicSimilarity], result of:
          0.040956277 = score(doc=562,freq=2.0), product of:
            0.17642869 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.05038186 = queryNorm
            0.23214069 = fieldWeight in 562, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.046875 = fieldNorm(doc=562)
    
    Content
    Cf.: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.91.4940&rep=rep1&type=pdf
    Date
    8. 1.2013 10:22:32
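The nested breakdowns in these results are Lucene "explain" trees for the ClassicSimilarity (TF-IDF) model. As a reading aid, the leaf and top-level numbers of result 1 can be reproduced from the standard ClassicSimilarity formulas; queryNorm and fieldNorm are simply copied from the explain output above, since they depend on the whole query and index and cannot be derived from this page alone.

```python
import math

# Reproducing the weight(_text_:t in 562) leaf of result 1's explain tree
# with the standard Lucene ClassicSimilarity (TF-IDF) formulas:
#   tf(freq)    = sqrt(freq)
#   idf(df, N)  = 1 + ln(N / (df + 1))
#   fieldWeight = tf * idf * fieldNorm
#   queryWeight = idf * queryNorm
#   leaf score  = queryWeight * fieldWeight
# queryNorm and fieldNorm are copied from the explain output; they depend
# on the whole query and index, not just on this one document.

def tf(freq):
    return math.sqrt(freq)

def idf(doc_freq, max_docs):
    return 1.0 + math.log(max_docs / (doc_freq + 1))

query_norm = 0.05038186
field_norm = 0.046875  # fieldNorm(doc=562)

idf_t = idf(2338, 44218)                     # ≈ 3.9394085
query_weight = idf_t * query_norm            # ≈ 0.19847474
field_weight = tf(2.0) * idf_t * field_norm  # ≈ 0.26114836
score_t = query_weight * field_weight        # ≈ 0.051831353

# The top-level score combines the sub-sums with a coordination factor
# coord(matching clauses / total clauses) that down-weights partial matches:
# 0.06001481 = 0.24005924 * coord(1/4), then the outer sum adds the rest.
total = 0.24005924 * 0.25 + (score_t + 0.040956277)  # ≈ 0.15280244

print(round(idf_t, 4), round(score_t, 6), round(total, 6))
```

The same formulas account for every leaf in the remaining results; only freq, docFreq, and fieldNorm differ per document and field.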
  2. Ardö, A.; Koch, T.: Automatic classification applied to full-text Internet documents in a robot-generated subject index (1999) 0.03
    0.025915677 = product of:
      0.051831353 = sum of:
        0.051831353 = product of:
          0.10366271 = sum of:
            0.10366271 = weight(_text_:t in 382) [ClassicSimilarity], result of:
              0.10366271 = score(doc=382,freq=2.0), product of:
                0.19847474 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.05038186 = queryNorm
                0.5222967 = fieldWeight in 382, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.09375 = fieldNorm(doc=382)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  3. Braun, T.: Dokumentklassifikation durch Clustering (n.d.) 0.02
    0.021596396 = product of:
      0.043192793 = sum of:
        0.043192793 = product of:
          0.086385585 = sum of:
            0.086385585 = weight(_text_:t in 1671) [ClassicSimilarity], result of:
              0.086385585 = score(doc=1671,freq=2.0), product of:
                0.19847474 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.05038186 = queryNorm
                0.43524727 = fieldWeight in 1671, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.078125 = fieldNorm(doc=1671)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  4. Subramanian, S.; Shafer, K.E.: Clustering (2001) 0.02
    0.020478139 = product of:
      0.040956277 = sum of:
        0.040956277 = product of:
          0.081912555 = sum of:
            0.081912555 = weight(_text_:22 in 1046) [ClassicSimilarity], result of:
              0.081912555 = score(doc=1046,freq=2.0), product of:
                0.17642869 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.05038186 = queryNorm
                0.46428138 = fieldWeight in 1046, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.09375 = fieldNorm(doc=1046)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    5. 5.2003 14:17:22
  5. Liu, R.-L.: Context-based term frequency assessment for text classification (2010) 0.02
    0.01832515 = product of:
      0.0366503 = sum of:
        0.0366503 = product of:
          0.0733006 = sum of:
            0.0733006 = weight(_text_:t in 3331) [ClassicSimilarity], result of:
              0.0733006 = score(doc=3331,freq=4.0), product of:
                0.19847474 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.05038186 = queryNorm
                0.36931956 = fieldWeight in 3331, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.046875 = fieldNorm(doc=3331)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Automatic text classification (TC) is essential for the management of information. To classify a document d properly, one must identify the semantics of each term t in d, and those semantics depend heavily on the context (the neighboring terms) of t in d. We therefore present CTFA (Context-based Term Frequency Assessment), a technique that improves text classifiers by considering term contexts in test documents. The results of term context recognition are used to assess the frequencies of terms, so CTFA can easily work with various kinds of text classifiers that base their TC decisions on term frequencies, without requiring any modification of the classifiers. Moreover, CTFA is efficient and requires neither huge memory nor domain-specific knowledge. Empirical results show that CTFA successfully enhances the performance of several kinds of text classifiers on different experimental data.
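The mechanism sketched in this abstract can be illustrated with a toy example. The window size, the discount factor, and the context profile below are invented for illustration and are not details of the paper's actual CTFA algorithm; the sketch only conveys the general idea that an occurrence of a term should contribute more to its frequency when its neighbors match an expected context.

```python
# Illustrative sketch of context-weighted term frequency (NOT the paper's
# actual CTFA algorithm): an occurrence of a term contributes fully only
# when one of its neighbors appears in a context profile for that term.
# The window size (2) and the 0.5 discount are made-up parameters.

def context_tf(tokens, term, profile, window=2, discount=0.5):
    score = 0.0
    for i, tok in enumerate(tokens):
        if tok != term:
            continue
        neighbors = tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]
        if any(n in profile for n in neighbors):
            score += 1.0          # occurrence in a recognized context
        else:
            score += discount     # out-of-context occurrence counts less
    return score

doc = "the bank raised interest rates while the river bank flooded".split()
print(context_tf(doc, "bank", profile={"interest", "rates", "raised"}))
```

Here the "bank" next to "raised" counts fully (1.0) while the out-of-context "river bank" occurrence counts at the discount (0.5), giving an assessed frequency of 1.5 instead of the raw count of 2 — a frequency any TF-based classifier could consume unchanged.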
  6. Koch, T.: Nutzung von Klassifikationssystemen zur verbesserten Beschreibung, Organisation und Suche von Internetressourcen (1998) 0.02
    0.017277118 = product of:
      0.034554236 = sum of:
        0.034554236 = product of:
          0.06910847 = sum of:
            0.06910847 = weight(_text_:t in 1030) [ClassicSimilarity], result of:
              0.06910847 = score(doc=1030,freq=2.0), product of:
                0.19847474 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.05038186 = queryNorm
                0.34819782 = fieldWeight in 1030, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0625 = fieldNorm(doc=1030)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  7. Koch, T.; Ardö, A.: Automatic classification of full-text HTML-documents from one specific subject area : DESIRE II D3.6a, Working Paper 2 (2000) 0.02
    0.017277118 = product of:
      0.034554236 = sum of:
        0.034554236 = product of:
          0.06910847 = sum of:
            0.06910847 = weight(_text_:t in 1667) [ClassicSimilarity], result of:
              0.06910847 = score(doc=1667,freq=2.0), product of:
                0.19847474 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.05038186 = queryNorm
                0.34819782 = fieldWeight in 1667, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0625 = fieldNorm(doc=1667)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  8. Brückner, T.; Dambeck, H.: Sortierautomaten : Grundlagen der Textklassifizierung (2003) 0.02
    0.017277118 = product of:
      0.034554236 = sum of:
        0.034554236 = product of:
          0.06910847 = sum of:
            0.06910847 = weight(_text_:t in 2398) [ClassicSimilarity], result of:
              0.06910847 = score(doc=2398,freq=2.0), product of:
                0.19847474 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.05038186 = queryNorm
                0.34819782 = fieldWeight in 2398, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0625 = fieldNorm(doc=2398)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  9. Lindholm, J.; Schönthal, T.; Jansson, K.: Experiences of harvesting Web resources in engineering using automatic classification (2003) 0.02
    0.017277118 = product of:
      0.034554236 = sum of:
        0.034554236 = product of:
          0.06910847 = sum of:
            0.06910847 = weight(_text_:t in 4088) [ClassicSimilarity], result of:
              0.06910847 = score(doc=4088,freq=2.0), product of:
                0.19847474 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.05038186 = queryNorm
                0.34819782 = fieldWeight in 4088, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0625 = fieldNorm(doc=4088)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  10. Pfister, J.: Clustering von Patent-Dokumenten am Beispiel der Datenbanken des Fachinformationszentrums Karlsruhe (2006) 0.02
    0.017277118 = product of:
      0.034554236 = sum of:
        0.034554236 = product of:
          0.06910847 = sum of:
            0.06910847 = weight(_text_:t in 5976) [ClassicSimilarity], result of:
              0.06910847 = score(doc=5976,freq=2.0), product of:
                0.19847474 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.05038186 = queryNorm
                0.34819782 = fieldWeight in 5976, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0625 = fieldNorm(doc=5976)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Effektive Information Retrieval Verfahren in Theorie und Praxis: ausgewählte und erweiterte Beiträge des Vierten Hildesheimer Evaluierungs- und Retrievalworkshop (HIER 2005), Hildesheim, 20.7.2005. Hrsg.: T. Mandl u. C. Womser-Hacker
  11. Jersek, T.: Automatische DDC-Klassifizierung mit Lingo : Vorgehensweise und Ergebnisse (2012) 0.02
    0.017277118 = product of:
      0.034554236 = sum of:
        0.034554236 = product of:
          0.06910847 = sum of:
            0.06910847 = weight(_text_:t in 122) [ClassicSimilarity], result of:
              0.06910847 = score(doc=122,freq=2.0), product of:
                0.19847474 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.05038186 = queryNorm
                0.34819782 = fieldWeight in 122, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0625 = fieldNorm(doc=122)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  12. Reiner, U.: Automatische DDC-Klassifizierung von bibliografischen Titeldatensätzen (2009) 0.02
    0.017065117 = product of:
      0.034130234 = sum of:
        0.034130234 = product of:
          0.06826047 = sum of:
            0.06826047 = weight(_text_:22 in 611) [ClassicSimilarity], result of:
              0.06826047 = score(doc=611,freq=2.0), product of:
                0.17642869 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.05038186 = queryNorm
                0.38690117 = fieldWeight in 611, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.078125 = fieldNorm(doc=611)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    22. 8.2009 12:54:24
  13. HaCohen-Kerner, Y. et al.: Classification using various machine learning methods and combinations of key-phrases and visual features (2016) 0.02
    0.017065117 = product of:
      0.034130234 = sum of:
        0.034130234 = product of:
          0.06826047 = sum of:
            0.06826047 = weight(_text_:22 in 2748) [ClassicSimilarity], result of:
              0.06826047 = score(doc=2748,freq=2.0), product of:
                0.17642869 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.05038186 = queryNorm
                0.38690117 = fieldWeight in 2748, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.078125 = fieldNorm(doc=2748)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    1. 2.2016 18:25:22
  14. Golub, K.: Automated subject classification of textual web documents (2006) 0.02
    0.0157115 = product of:
      0.031423 = sum of:
        0.031423 = product of:
          0.125692 = sum of:
            0.125692 = weight(_text_:author's in 5600) [ClassicSimilarity], result of:
              0.125692 = score(doc=5600,freq=2.0), product of:
                0.33857384 = queryWeight, product of:
                  6.7201533 = idf(docFreq=144, maxDocs=44218)
                  0.05038186 = queryNorm
                0.3712395 = fieldWeight in 5600, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  6.7201533 = idf(docFreq=144, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5600)
          0.25 = coord(1/4)
      0.5 = coord(1/2)
    
    Abstract
    Purpose - To provide an integrated perspective on the similarities and differences between approaches to automated classification in different research communities (machine learning, information retrieval and library science), and to point out problems with these approaches and with automated classification as such. Design/methodology/approach - A range of works dealing with automated classification of full-text web documents are discussed. Explorations of individual approaches are given in the following sections: special features (description, differences, evaluation), application, and characteristics of web pages. Findings - Provides the major similarities and differences between the three approaches: document pre-processing and the utilization of web-specific document characteristics are common to all of them; the major differences lie in the applied algorithms and in whether the vector space model and controlled vocabularies are employed. Problems of automated classification are recognized. Research limitations/implications - The paper does not attempt to provide an exhaustive bibliography of related resources. Practical implications - As an integrated overview of approaches from different research communities with application examples, it is very useful for students in library and information science and computer science, as well as for practitioners. Researchers from one community gain information on how similar tasks are conducted in other communities. Originality/value - To the author's knowledge, no previous review paper on automated text classification has attempted to discuss more than one community's approach from an integrated perspective.
  15. Koch, T.: Experiments with automatic classification of WAIS databases and indexing of WWW : some results from the Nordic WAIS/WWW project (1994) 0.02
    0.015117477 = product of:
      0.030234953 = sum of:
        0.030234953 = product of:
          0.060469907 = sum of:
            0.060469907 = weight(_text_:t in 7209) [ClassicSimilarity], result of:
              0.060469907 = score(doc=7209,freq=2.0), product of:
                0.19847474 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.05038186 = queryNorm
                0.30467308 = fieldWeight in 7209, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=7209)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  16. Koch, T.; Vizine-Goetz, D.: Automatic classification and content navigation support for Web services : DESIRE II cooperates with OCLC (1998) 0.02
    0.015117477 = product of:
      0.030234953 = sum of:
        0.030234953 = product of:
          0.060469907 = sum of:
            0.060469907 = weight(_text_:t in 1568) [ClassicSimilarity], result of:
              0.060469907 = score(doc=1568,freq=2.0), product of:
                0.19847474 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.05038186 = queryNorm
                0.30467308 = fieldWeight in 1568, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1568)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  17. Koch, T.; Vizine-Goetz, D.: DDC and knowledge organization in the digital library : Research and development. Demonstration pages (1999) 0.01
    0.012957838 = product of:
      0.025915677 = sum of:
        0.025915677 = product of:
          0.051831353 = sum of:
            0.051831353 = weight(_text_:t in 942) [ClassicSimilarity], result of:
              0.051831353 = score(doc=942,freq=2.0), product of:
                0.19847474 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.05038186 = queryNorm
                0.26114836 = fieldWeight in 942, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.046875 = fieldNorm(doc=942)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  18. Koch, T.; Ardö, A.; Noodén, L.: ¬The construction of a robot-generated subject index : DESIRE II D3.6a, Working Paper 1 (1999) 0.01
    0.012957838 = product of:
      0.025915677 = sum of:
        0.025915677 = product of:
          0.051831353 = sum of:
            0.051831353 = weight(_text_:t in 1668) [ClassicSimilarity], result of:
              0.051831353 = score(doc=1668,freq=2.0), product of:
                0.19847474 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.05038186 = queryNorm
                0.26114836 = fieldWeight in 1668, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.046875 = fieldNorm(doc=1668)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  19. Golub, K.; Hamon, T.; Ardö, A.: Automated classification of textual documents based on a controlled vocabulary in engineering (2007) 0.01
    0.012957838 = product of:
      0.025915677 = sum of:
        0.025915677 = product of:
          0.051831353 = sum of:
            0.051831353 = weight(_text_:t in 1461) [ClassicSimilarity], result of:
              0.051831353 = score(doc=1461,freq=2.0), product of:
                0.19847474 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.05038186 = queryNorm
                0.26114836 = fieldWeight in 1461, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.046875 = fieldNorm(doc=1461)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  20. Leroy, G.; Miller, T.; Rosemblat, G.; Browne, A.: ¬A balanced approach to health information evaluation : a vocabulary-based naïve Bayes classifier and readability formulas (2008) 0.01
    0.012957838 = product of:
      0.025915677 = sum of:
        0.025915677 = product of:
          0.051831353 = sum of:
            0.051831353 = weight(_text_:t in 1998) [ClassicSimilarity], result of:
              0.051831353 = score(doc=1998,freq=2.0), product of:
                0.19847474 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.05038186 = queryNorm
                0.26114836 = fieldWeight in 1998, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.046875 = fieldNorm(doc=1998)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    

Languages

  • e (English) 40
  • d (German) 10

Types

  • a 40
  • el 11
  • r 1
  • x 1