Search (42 results, page 2 of 3)

  • theme_ss:"Automatisches Klassifizieren"
  1. Hung, C.-M.; Chien, L.-F.: Web-based text classification in the absence of manually labeled training documents (2007) 0.01
    0.00860596 = product of:
      0.03442384 = sum of:
        0.03442384 = weight(_text_:c in 87) [ClassicSimilarity], result of:
          0.03442384 = score(doc=87,freq=2.0), product of:
            0.1505424 = queryWeight, product of:
              3.4494052 = idf(docFreq=3817, maxDocs=44218)
              0.043643 = queryNorm
            0.22866541 = fieldWeight in 87, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.4494052 = idf(docFreq=3817, maxDocs=44218)
              0.046875 = fieldNorm(doc=87)
      0.25 = coord(1/4)
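
    The indented blocks in this listing are Lucene "explain" trees for the ClassicSimilarity scoring used by the search engine. As a hedged illustration only, the short Python sketch below reproduces the breakdown shown for result 1 (doc=87, term "c"); the constants are copied from that breakdown, and the tf/idf formulas are the standard ClassicSimilarity ones, which this page does not itself state.

      # Minimal sketch, assuming Lucene ClassicSimilarity as named in the explain output.
      # All constants are taken from the breakdown for result 1 (doc=87, term "c").
      import math

      max_docs, doc_freq = 44218, 3817
      idf = 1 + math.log(max_docs / (doc_freq + 1))   # ~3.4494052, as in idf(docFreq=3817, maxDocs=44218)
      query_norm = 0.043643                           # queryNorm, depends on the full query; taken as given
      tf = math.sqrt(2.0)                             # tf(freq=2.0) = 1.4142135
      field_norm = 0.046875                           # fieldNorm(doc=87), index-time length norm; taken as given

      query_weight = idf * query_norm                 # 0.1505424
      field_weight = tf * idf * field_norm            # 0.22866541
      term_score   = query_weight * field_weight      # 0.03442384 = weight(_text_:c in 87)
      final_score  = 0.25 * term_score                # coord(1/4): 1 of 4 query clauses matched

      print(round(final_score, 8))                    # ~0.00860596, displayed above rounded to 0.01

    The same arithmetic, with a different fieldNorm value, accounts for every result in this listing that matches on the "c" clause.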
    
  2. Montesi, M.; Navarrete, T.: Classifying web genres in context : A case study documenting the web genres used by a software engineer (2008) 0.01
    0.00860596 = product of:
      0.03442384 = sum of:
        0.03442384 = weight(_text_:c in 2100) [ClassicSimilarity], result of:
          0.03442384 = score(doc=2100,freq=2.0), product of:
            0.1505424 = queryWeight, product of:
              3.4494052 = idf(docFreq=3817, maxDocs=44218)
              0.043643 = queryNorm
            0.22866541 = fieldWeight in 2100, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.4494052 = idf(docFreq=3817, maxDocs=44218)
              0.046875 = fieldNorm(doc=2100)
      0.25 = coord(1/4)
    
    Abstract
    This case study analyzes the Internet-based resources that a software engineer uses in his daily work. Methodologically, we studied the participant's web browser history, classifying all the web pages he had viewed over a period of 12 days into web genres. We interviewed him before and after the analysis of the browser history: in the first interview, he spoke about his general information behavior; in the second, he commented on each web genre, explaining why and how he used them. As a result, three approaches allow us to describe the set of 23 web genres obtained: (a) the purposes they serve for the participant; (b) the role they play in the various work and search phases; and (c) the way they are used in combination with each other. Further observations concern the way the participant assesses the quality of web-based resources, and his information behavior as a software engineer.
  3. Sojka, P.; Lee, M.; Rehurek, R.; Hatlapatka, R.; Kucbel, M.; Bouche, T.; Goutorbe, C.; Anghelache, R.; Wojciechowski, K.: Toolset for entity and semantic associations : Final Release (2013) 0.01
    0.00860596 = product of:
      0.03442384 = sum of:
        0.03442384 = weight(_text_:c in 1057) [ClassicSimilarity], result of:
          0.03442384 = score(doc=1057,freq=2.0), product of:
            0.1505424 = queryWeight, product of:
              3.4494052 = idf(docFreq=3817, maxDocs=44218)
              0.043643 = queryNorm
            0.22866541 = fieldWeight in 1057, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.4494052 = idf(docFreq=3817, maxDocs=44218)
              0.046875 = fieldNorm(doc=1057)
      0.25 = coord(1/4)
    
  4. Reiner, U.: Automatische DDC-Klassifizierung von bibliografischen Titeldatensätzen (2009) 0.01
    0.007391281 = product of:
      0.029565124 = sum of:
        0.029565124 = product of:
          0.059130248 = sum of:
            0.059130248 = weight(_text_:22 in 611) [ClassicSimilarity], result of:
              0.059130248 = score(doc=611,freq=2.0), product of:
                0.15283036 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043643 = queryNorm
                0.38690117 = fieldWeight in 611, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.078125 = fieldNorm(doc=611)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Date
    22. 8.2009 12:54:24
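
    Result 4 and results 12 through 20 match on the "22" clause instead, and their explain trees carry one extra nesting level, so a coord(1/2) factor (1 of 2 sub-clauses matched) applies in addition to the outer coord(1/4). A hedged sketch of the arithmetic for result 4 (doc=611), again only reproducing the figures printed above:

      # Nested case shown for result 4 (doc=611, term "22"); constants copied from the breakdown above.
      query_weight = 0.15283036              # idf(3.5018296) * queryNorm(0.043643)
      field_weight = 0.38690117              # tf(1.4142135) * idf(3.5018296) * fieldNorm(0.078125)
      weight = query_weight * field_weight   # 0.059130248 = weight(_text_:22 in 611)
      score  = 0.25 * (0.5 * weight)         # coord(1/4) * coord(1/2) * weight ~ 0.007391281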
  5. Krüger, C.: Evaluation des WWW-Suchdienstes GERHARD unter besonderer Beachtung automatischer Indexierung (1999) 0.01
    0.0071716327 = product of:
      0.02868653 = sum of:
        0.02868653 = weight(_text_:c in 1777) [ClassicSimilarity], result of:
          0.02868653 = score(doc=1777,freq=2.0), product of:
            0.1505424 = queryWeight, product of:
              3.4494052 = idf(docFreq=3817, maxDocs=44218)
              0.043643 = queryNorm
            0.1905545 = fieldWeight in 1777, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.4494052 = idf(docFreq=3817, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1777)
      0.25 = coord(1/4)
    
  6. Pong, J.Y.-H.; Kwok, R.C.-W.; Lau, R.Y.-K.; Hao, J.-X.; Wong, P.C.-C.: A comparative study of two automatic document classification methods in a library setting (2008) 0.01
    0.0071716327 = product of:
      0.02868653 = sum of:
        0.02868653 = weight(_text_:c in 2532) [ClassicSimilarity], result of:
          0.02868653 = score(doc=2532,freq=2.0), product of:
            0.1505424 = queryWeight, product of:
              3.4494052 = idf(docFreq=3817, maxDocs=44218)
              0.043643 = queryNorm
            0.1905545 = fieldWeight in 2532, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.4494052 = idf(docFreq=3817, maxDocs=44218)
              0.0390625 = fieldNorm(doc=2532)
      0.25 = coord(1/4)
    
  7. Vilares, D.; Alonso, M.A.; Gómez-Rodríguez, C.: On the usefulness of lexical and syntactic processing in polarity classification of Twitter messages (2015) 0.01
    0.0071716327 = product of:
      0.02868653 = sum of:
        0.02868653 = weight(_text_:c in 2161) [ClassicSimilarity], result of:
          0.02868653 = score(doc=2161,freq=2.0), product of:
            0.1505424 = queryWeight, product of:
              3.4494052 = idf(docFreq=3817, maxDocs=44218)
              0.043643 = queryNorm
            0.1905545 = fieldWeight in 2161, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.4494052 = idf(docFreq=3817, maxDocs=44218)
              0.0390625 = fieldNorm(doc=2161)
      0.25 = coord(1/4)
    
  8. Chae, G.; Park, J.; Park, J.; Yeo, W.S.; Shi, C.: Linking and clustering artworks using social tags : revitalizing crowd-sourced information on cultural collections (2016) 0.01
    0.0071716327 = product of:
      0.02868653 = sum of:
        0.02868653 = weight(_text_:c in 2852) [ClassicSimilarity], result of:
          0.02868653 = score(doc=2852,freq=2.0), product of:
            0.1505424 = queryWeight, product of:
              3.4494052 = idf(docFreq=3817, maxDocs=44218)
              0.043643 = queryNorm
            0.1905545 = fieldWeight in 2852, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.4494052 = idf(docFreq=3817, maxDocs=44218)
              0.0390625 = fieldNorm(doc=2852)
      0.25 = coord(1/4)
    
  9. Ru, C.; Tang, J.; Li, S.; Xie, S.; Wang, T.: Using semantic similarity to reduce wrong labels in distant supervision for relation extraction (2018) 0.01
    0.0071716327 = product of:
      0.02868653 = sum of:
        0.02868653 = weight(_text_:c in 5055) [ClassicSimilarity], result of:
          0.02868653 = score(doc=5055,freq=2.0), product of:
            0.1505424 = queryWeight, product of:
              3.4494052 = idf(docFreq=3817, maxDocs=44218)
              0.043643 = queryNorm
            0.1905545 = fieldWeight in 5055, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.4494052 = idf(docFreq=3817, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5055)
      0.25 = coord(1/4)
    
  10. Pech, G.; Delgado, C.; Sorella, S.P.: Classifying papers into subfields using Abstracts, Titles, Keywords and KeyWords Plus through pattern detection and optimization procedures : an application in Physics (2022) 0.01
    0.0071716327 = product of:
      0.02868653 = sum of:
        0.02868653 = weight(_text_:c in 744) [ClassicSimilarity], result of:
          0.02868653 = score(doc=744,freq=2.0), product of:
            0.1505424 = queryWeight, product of:
              3.4494052 = idf(docFreq=3817, maxDocs=44218)
              0.043643 = queryNorm
            0.1905545 = fieldWeight in 744, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.4494052 = idf(docFreq=3817, maxDocs=44218)
              0.0390625 = fieldNorm(doc=744)
      0.25 = coord(1/4)
    
  11. Han, K.; Rezapour, R.; Nakamura, K.; Devkota, D.; Miller, D.C.; Diesner, J.: An expert-in-the-loop method for domain-specific document categorization based on small training data (2023) 0.01
    0.0071716327 = product of:
      0.02868653 = sum of:
        0.02868653 = weight(_text_:c in 967) [ClassicSimilarity], result of:
          0.02868653 = score(doc=967,freq=2.0), product of:
            0.1505424 = queryWeight, product of:
              3.4494052 = idf(docFreq=3817, maxDocs=44218)
              0.043643 = queryNorm
            0.1905545 = fieldWeight in 967, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.4494052 = idf(docFreq=3817, maxDocs=44218)
              0.0390625 = fieldNorm(doc=967)
      0.25 = coord(1/4)
    
    Abstract
    Automated text categorization methods are of broad relevance for domain experts since they free researchers and practitioners from manual labeling, save resources (e.g., time, labor), and enrich the data with information helpful for studying substantive questions. Despite the variety of newly developed categorization methods, which typically require substantial amounts of annotated data, little is known about how to build models when (a) labeling texts with categories requires substantial domain expertise and/or in-depth reading, (b) only a few annotated documents are available for model training, and (c) no relevant computational resources, such as pretrained models, are available. In a collaboration with environmental scientists who study the socio-ecological impact of funded biodiversity conservation projects, we develop a method that integrates deep domain expertise with computational models to automatically categorize project reports based on a small sample of 93 annotated documents. Our results suggest that domain expertise can improve automated categorization, and that the magnitude of these improvements is influenced by the experts' understanding of the categories and their confidence in their annotations, as well as by data sparsity and additional category characteristics such as the proportion of exclusive keywords that can identify a category.
  12. Bock, H.-H.: Datenanalyse zur Strukturierung und Ordnung von Information (1989) 0.01
    0.0051738964 = product of:
      0.020695586 = sum of:
        0.020695586 = product of:
          0.04139117 = sum of:
            0.04139117 = weight(_text_:22 in 141) [ClassicSimilarity], result of:
              0.04139117 = score(doc=141,freq=2.0), product of:
                0.15283036 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043643 = queryNorm
                0.2708308 = fieldWeight in 141, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=141)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Pages
    S.1-22
  13. Automatic classification research at OCLC (2002) 0.01
    0.0051738964 = product of:
      0.020695586 = sum of:
        0.020695586 = product of:
          0.04139117 = sum of:
            0.04139117 = weight(_text_:22 in 1563) [ClassicSimilarity], result of:
              0.04139117 = score(doc=1563,freq=2.0), product of:
                0.15283036 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043643 = queryNorm
                0.2708308 = fieldWeight in 1563, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1563)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Date
    5. 5.2003 9:22:09
  14. Yi, K.: Automatic text classification using library classification schemes : trends, issues and challenges (2007) 0.01
    0.0051738964 = product of:
      0.020695586 = sum of:
        0.020695586 = product of:
          0.04139117 = sum of:
            0.04139117 = weight(_text_:22 in 2560) [ClassicSimilarity], result of:
              0.04139117 = score(doc=2560,freq=2.0), product of:
                0.15283036 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043643 = queryNorm
                0.2708308 = fieldWeight in 2560, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=2560)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Date
    22. 9.2008 18:31:54
  15. Liu, R.-L.: Context recognition for hierarchical text classification (2009) 0.00
    0.004434768 = product of:
      0.017739072 = sum of:
        0.017739072 = product of:
          0.035478145 = sum of:
            0.035478145 = weight(_text_:22 in 2760) [ClassicSimilarity], result of:
              0.035478145 = score(doc=2760,freq=2.0), product of:
                0.15283036 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043643 = queryNorm
                0.23214069 = fieldWeight in 2760, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2760)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Date
    22. 3.2009 19:11:54
  16. Pfeffer, M.: Automatische Vergabe von RVK-Notationen mittels fallbasiertem Schließen (2009) 0.00
    0.004434768 = product of:
      0.017739072 = sum of:
        0.017739072 = product of:
          0.035478145 = sum of:
            0.035478145 = weight(_text_:22 in 3051) [ClassicSimilarity], result of:
              0.035478145 = score(doc=3051,freq=2.0), product of:
                0.15283036 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043643 = queryNorm
                0.23214069 = fieldWeight in 3051, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=3051)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Date
    22. 8.2009 19:51:28
  17. Zhu, W.Z.; Allen, R.B.: Document clustering using the LSI subspace signature model (2013) 0.00
    0.004434768 = product of:
      0.017739072 = sum of:
        0.017739072 = product of:
          0.035478145 = sum of:
            0.035478145 = weight(_text_:22 in 690) [ClassicSimilarity], result of:
              0.035478145 = score(doc=690,freq=2.0), product of:
                0.15283036 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043643 = queryNorm
                0.23214069 = fieldWeight in 690, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=690)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Date
    23. 3.2013 13:22:36
  18. Egbert, J.; Biber, D.; Davies, M.: Developing a bottom-up, user-based method of web register classification (2015) 0.00
    0.004434768 = product of:
      0.017739072 = sum of:
        0.017739072 = product of:
          0.035478145 = sum of:
            0.035478145 = weight(_text_:22 in 2158) [ClassicSimilarity], result of:
              0.035478145 = score(doc=2158,freq=2.0), product of:
                0.15283036 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043643 = queryNorm
                0.23214069 = fieldWeight in 2158, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2158)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Date
    4. 8.2015 19:22:04
  19. Mengle, S.; Goharian, N.: Passage detection using text classification (2009) 0.00
    0.0036956405 = product of:
      0.014782562 = sum of:
        0.014782562 = product of:
          0.029565124 = sum of:
            0.029565124 = weight(_text_:22 in 2765) [ClassicSimilarity], result of:
              0.029565124 = score(doc=2765,freq=2.0), product of:
                0.15283036 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043643 = queryNorm
                0.19345059 = fieldWeight in 2765, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2765)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Date
    22. 3.2009 19:14:43
  20. Liu, R.-L.: A passage extractor for classification of disease aspect information (2013) 0.00
    0.0036956405 = product of:
      0.014782562 = sum of:
        0.014782562 = product of:
          0.029565124 = sum of:
            0.029565124 = weight(_text_:22 in 1107) [ClassicSimilarity], result of:
              0.029565124 = score(doc=1107,freq=2.0), product of:
                0.15283036 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043643 = queryNorm
                0.19345059 = fieldWeight in 1107, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1107)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Date
    28.10.2013 19:22:57

Languages

  • e 35
  • d 7

Types

  • a 35
  • el 6
  • m 1
  • s 1
  • x 1