Search (5 results, page 1 of 1)

  • × theme_ss:"Automatisches Klassifizieren"
  • × year_i:[2010 TO 2020}
  1. Cortez, E.; Herrera, M.R.; Silva, A.S. da; Moura, E.S. de; Neubert, M.: Lightweight methods for large-scale product categorization (2011) 0.02
    0.018076904 = product of:
      0.03615381 = sum of:
        0.03615381 = product of:
          0.14461523 = sum of:
            0.14461523 = weight(_text_:editors in 4758) [ClassicSimilarity], result of:
              0.14461523 = score(doc=4758,freq=2.0), product of:
                0.32495478 = queryWeight, product of:
                  6.7132807 = idf(docFreq=145, maxDocs=44218)
                  0.048404764 = queryNorm
                0.44503185 = fieldWeight in 4758, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  6.7132807 = idf(docFreq=145, maxDocs=44218)
                  0.046875 = fieldNorm(doc=4758)
          0.25 = coord(1/4)
      0.5 = coord(1/2)
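The breakdown above is Lucene's ClassicSimilarity (TF-IDF) explain output. As a sketch, the arithmetic for this hit can be reproduced directly; all constants below are copied from the trace (doc 4758, term "editors"):

```python
import math

# Constants copied from the explain trace
freq = 2.0
idf = 6.7132807            # idf(docFreq=145, maxDocs=44218)
query_norm = 0.048404764
field_norm = 0.046875

tf = math.sqrt(freq)                      # 1.4142135
query_weight = idf * query_norm           # 0.32495478
field_weight = tf * idf * field_norm      # 0.44503185
term_score = query_weight * field_weight  # 0.14461523

# coord penalties: 1 of 4 query clauses matched, then 1 of 2
final_score = term_score * 0.25 * 0.5     # ≈ 0.018076904
```

Each intermediate value matches the labeled node in the explain tree, so the trace can be read bottom-up as this multiplication chain.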
    
    Abstract
In this article, we study classification methods for large-scale categorization of product offers on e-shopping web sites. We evaluate the performance of previously proposed approaches and deploy a probabilistic approach to model the classification problem. We also study an alternative way of modeling the descriptions of product offers and investigate the use of price and store as features in the classification process. Our experiments used two collections of over a million product offers, previously categorized by human editors against taxonomies of hundreds of categories from a real e-shopping web site. In these experiments, our method improved categorization quality by up to 9% over the best baseline we found.
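The abstract does not specify the authors' actual model. As a generic, hypothetical illustration (not the paper's method; all names and data below are invented), a probabilistic classifier that folds an offer's store in as an extra feature could be sketched as a multinomial Naive Bayes over title tokens:

```python
import math
from collections import Counter, defaultdict

def tokenize(offer):
    # An offer is (title, store); the store is folded in as one extra token.
    title, store = offer
    return title.lower().split() + [f"store={store.lower()}"]

def train(offers, labels):
    class_counts = Counter(labels)
    token_counts = defaultdict(Counter)
    vocab = set()
    for offer, label in zip(offers, labels):
        for tok in tokenize(offer):
            token_counts[label][tok] += 1
            vocab.add(tok)
    return class_counts, token_counts, vocab

def classify(offer, class_counts, token_counts, vocab):
    total = sum(class_counts.values())
    best, best_lp = None, float("-inf")
    for label, count in class_counts.items():
        lp = math.log(count / total)  # class prior
        denom = sum(token_counts[label].values()) + len(vocab)
        for tok in tokenize(offer):
            # Laplace smoothing for unseen tokens
            lp += math.log((token_counts[label][tok] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# Invented toy data
offers = [("usb cable 2m", "TechShop"), ("hdmi cable gold", "TechShop"),
          ("running shoes size 42", "SportWorld"), ("tennis shoes white", "SportWorld")]
labels = ["electronics", "electronics", "apparel", "apparel"]
model = train(offers, labels)
print(classify(("usb hdmi adapter", "TechShop"), *model))  # → electronics
```

The point of the sketch is only that non-textual attributes such as store (or a discretized price) can enter a probabilistic model as additional features alongside description tokens.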
  2. HaCohen-Kerner, Y. et al.: Classification using various machine learning methods and combinations of key-phrases and visual features (2016) 0.02
    
    Date
    1. 2.2016 18:25:22
  3. Zhu, W.Z.; Allen, R.B.: Document clustering using the LSI subspace signature model (2013) 0.01
    
    Date
    23. 3.2013 13:22:36
  4. Egbert, J.; Biber, D.; Davies, M.: Developing a bottom-up, user-based method of web register classification (2015) 0.01
    
    Date
    4. 8.2015 19:22:04
  5. Liu, R.-L.: A passage extractor for classification of disease aspect information (2013) 0.01
    
    Date
    28.10.2013 19:22:57