Search (21 results, page 1 of 2)

  • Filter: theme_ss:"Automatisches Klassifizieren"
  1. Hotho, A.; Bloehdorn, S.: Data Mining 2004 : Text classification by boosting weak learners based on terms and concepts (2004) 0.32
    0.31819054 = sum of:
      0.07476433 = product of:
        0.224293 = sum of:
          0.224293 = weight(_text_:3a in 562) [ClassicSimilarity], result of:
            0.224293 = score(doc=562,freq=2.0), product of:
              0.39908504 = queryWeight, product of:
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.047072954 = queryNorm
              0.56201804 = fieldWeight in 562, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.046875 = fieldNorm(doc=562)
        0.33333334 = coord(1/3)
      0.224293 = weight(_text_:2f in 562) [ClassicSimilarity], result of:
        0.224293 = score(doc=562,freq=2.0), product of:
          0.39908504 = queryWeight, product of:
            8.478011 = idf(docFreq=24, maxDocs=44218)
            0.047072954 = queryNorm
          0.56201804 = fieldWeight in 562, product of:
            1.4142135 = tf(freq=2.0), with freq of:
              2.0 = termFreq=2.0
            8.478011 = idf(docFreq=24, maxDocs=44218)
            0.046875 = fieldNorm(doc=562)
      0.019133206 = product of:
        0.038266413 = sum of:
          0.038266413 = weight(_text_:22 in 562) [ClassicSimilarity], result of:
            0.038266413 = score(doc=562,freq=2.0), product of:
              0.16484147 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.047072954 = queryNorm
              0.23214069 = fieldWeight in 562, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046875 = fieldNorm(doc=562)
        0.5 = coord(1/2)
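The explain tree above is Lucene's ClassicSimilarity (classic tf-idf) score breakdown. As a minimal sketch, the per-term weight can be reproduced from the quantities shown: tf = sqrt(freq), idf = 1 + ln(maxDocs / (docFreq + 1)), a query weight of idf × queryNorm, and a field weight of tf × idf × fieldNorm:

```python
import math

# Reconstruct one per-term weight from the Lucene "explain" output above
# (ClassicSimilarity). Inputs are taken from the "_text_:3a in 562" branch.
def classic_term_score(freq, doc_freq, max_docs, query_norm, field_norm):
    tf = math.sqrt(freq)                               # 1.4142135 for freq=2.0
    idf = 1.0 + math.log(max_docs / (doc_freq + 1.0))  # 8.478011 for docFreq=24
    query_weight = idf * query_norm                    # 0.39908504
    field_weight = tf * idf * field_norm               # 0.56201804
    return query_weight * field_weight                 # queryWeight * fieldWeight

score = classic_term_score(freq=2.0, doc_freq=24, max_docs=44218,
                           query_norm=0.047072954, field_norm=0.046875)
print(score)  # ≈ 0.224293, matching the weight(_text_:3a in 562) line
```

The same function reproduces the other branches of the tree (e.g. the `_text_:22` weight with docFreq=3622); the final document score is the coord-weighted sum of these per-term products.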
    
    Content
    Cf.: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.91.4940&rep=rep1&type=pdf.
    Date
    8. 1.2013 10:22:32
  2. Montesi, M.; Navarrete, T.: Classifying web genres in context : A case study documenting the web genres used by a software engineer (2008) 0.02
    
    Abstract
    This case study analyzes the Internet-based resources that a software engineer uses in his daily work. Methodologically, we studied the web browser history of the participant, classifying all the web pages he had seen over a period of 12 days into web genres. We interviewed him before and after the analysis of the web browser history. In the first interview, he spoke about his general information behavior; in the second, he commented on each web genre, explaining why and how he used them. As a result, three approaches allow us to describe the set of 23 web genres obtained: (a) the purposes they serve for the participant; (b) the role they play in the various work and search phases; and (c) the way they are used in combination with each other. Further observations concern the way the participant assesses the quality of web-based resources, and his information behavior as a software engineer.
  3. Subramanian, S.; Shafer, K.E.: Clustering (2001) 0.01
    
    Date
    5. 5.2003 14:17:22
  4. Mukhopadhyay, S.; Peng, S.; Raje, R.; Palakal, M.; Mostafa, J.: Multi-agent information classification using dynamic acquaintance lists (2003) 0.01
    
    Abstract
    There has been considerable interest in recent years in providing automated information services, such as information classification, by means of a society of collaborative agents. These agents augment each other's knowledge structures (e.g., the vocabularies) and assist each other in providing efficient information services to a human user. However, when the number of agents present in the society increases, exhaustive communication and collaboration among agents result in a large communication overhead and increased delays in response time. This paper introduces a method to achieve selective interaction with a relatively small number of potentially useful agents, based on simple agent modeling and acquaintance lists. The key idea presented here is that the acquaintance list of an agent, representing a small number of other agents to be collaborated with, is dynamically adjusted. The best acquaintances are automatically discovered using a learning algorithm, based on the past history of collaboration. Experimental results are presented to demonstrate that such dynamically learned acquaintance lists can lead to high quality of classification, while significantly reducing the delay in response time.
  5. Sebastiani, F.: ¬A tutorial on automated text categorisation (1999) 0.01
    
    Abstract
    The automated categorisation (or classification) of texts into topical categories has a long history, dating back at least to 1960. Until the late '80s, the dominant approach to the problem involved knowledge-engineering automatic categorisers, i.e. manually building a set of rules encoding expert knowledge on how to classify documents. In the '90s, with the booming production and availability of on-line documents, automated text categorisation has witnessed an increased and renewed interest. A newer paradigm based on machine learning has superseded the previous approach. Within this paradigm, a general inductive process automatically builds a classifier by "learning", from a set of previously classified documents, the characteristics of one or more categories; the advantages are very good effectiveness, considerable savings in terms of expert manpower, and domain independence. In this tutorial we look at the main approaches that have been taken towards automatic text categorisation within the general machine learning paradigm. Issues of document indexing, classifier construction, and classifier evaluation will be touched upon.
  6. Reiner, U.: Automatische DDC-Klassifizierung von bibliografischen Titeldatensätzen (2009) 0.01
    
    Date
    22. 8.2009 12:54:24
  7. HaCohen-Kerner, Y. et al.: Classification using various machine learning methods and combinations of key-phrases and visual features (2016) 0.01
    
    Date
    1. 2.2016 18:25:22
  8. Bock, H.-H.: Datenanalyse zur Strukturierung und Ordnung von Information (1989) 0.01
    
    Pages
    S.1-22
  9. Dubin, D.: Dimensions and discriminability (1998) 0.01
    
    Date
    22. 9.1997 19:16:05
  10. Automatic classification research at OCLC (2002) 0.01
    
    Date
    5. 5.2003 9:22:09
  11. Jenkins, C.: Automatic classification of Web resources using Java and Dewey Decimal Classification (1998) 0.01
    
    Date
    1. 8.1996 22:08:06
  12. Yoon, Y.; Lee, C.; Lee, G.G.: ¬An effective procedure for constructing a hierarchical text classification system (2006) 0.01
    
    Date
    22. 7.2006 16:24:52
  13. Yi, K.: Automatic text classification using library classification schemes : trends, issues and challenges (2007) 0.01
    
    Date
    22. 9.2008 18:31:54
  14. Liu, R.-L.: Context recognition for hierarchical text classification (2009) 0.01
    
    Date
    22. 3.2009 19:11:54
  15. Pfeffer, M.: Automatische Vergabe von RVK-Notationen mittels fallbasiertem Schließen (2009) 0.01
    
    Date
    22. 8.2009 19:51:28
  16. Zhu, W.Z.; Allen, R.B.: Document clustering using the LSI subspace signature model (2013) 0.01
    
    Date
    23. 3.2013 13:22:36
  17. Egbert, J.; Biber, D.; Davies, M.: Developing a bottom-up, user-based method of web register classification (2015) 0.01
    
    Date
    4. 8.2015 19:22:04
  18. Mengle, S.; Goharian, N.: Passage detection using text classification (2009) 0.01
    
    Date
    22. 3.2009 19:14:43
  19. Liu, R.-L.: ¬A passage extractor for classification of disease aspect information (2013) 0.01
    
    Date
    28.10.2013 19:22:57
  20. Khoo, C.S.G.; Ng, K.; Ou, S.: ¬An exploratory study of human clustering of Web pages (2003) 0.00
    
    Date
    12. 9.2004 9:56:22