Search (84 results, page 1 of 5)

  • theme_ss:"Automatisches Klassifizieren"
  1. Hotho, A.; Bloehdorn, S.: Data Mining 2004 : Text classification by boosting weak learners based on terms and concepts (2004) 0.10
    0.09882007 = sum of:
      0.054013528 = product of:
        0.21605411 = sum of:
          0.21605411 = weight(_text_:3a in 562) [ClassicSimilarity], result of:
            0.21605411 = score(doc=562,freq=2.0), product of:
              0.38442558 = queryWeight, product of:
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.04534384 = queryNorm
              0.56201804 = fieldWeight in 562, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.046875 = fieldNorm(doc=562)
        0.25 = coord(1/4)
      0.04480654 = product of:
        0.06720981 = sum of:
          0.030349022 = weight(_text_:j in 562) [ClassicSimilarity], result of:
            0.030349022 = score(doc=562,freq=2.0), product of:
              0.14407988 = queryWeight, product of:
                3.1774964 = idf(docFreq=5010, maxDocs=44218)
                0.04534384 = queryNorm
              0.21064025 = fieldWeight in 562, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.1774964 = idf(docFreq=5010, maxDocs=44218)
                0.046875 = fieldNorm(doc=562)
          0.036860786 = weight(_text_:22 in 562) [ClassicSimilarity], result of:
            0.036860786 = score(doc=562,freq=2.0), product of:
              0.1587864 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.04534384 = queryNorm
              0.23214069 = fieldWeight in 562, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046875 = fieldNorm(doc=562)
        0.6666667 = coord(2/3)
    
    Content
    Cf.: http://www.google.de/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cad=rja&ved=0CEAQFjAA&url=http%3A%2F%2Fciteseerx.ist.psu.edu%2Fviewdoc%2Fdownload%3Fdoi%3D10.1.1.91.4940%26rep%3Drep1%26type%3Dpdf&ei=dOXrUMeIDYHDtQahsIGACg&usg=AFQjCNHFWVh6gNPvnOrOS9R3rkrXCNVD-A&sig2=5I2F5evRfMnsttSgFF9g7Q&bvm=bv.1357316858,d.Yms.
    Date
    8. 1.2013 10:22:32
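The indented breakdowns shown with each result are Lucene "explain" output for the classic TF-IDF similarity. As a rough check, the following Python sketch recomputes the weight of the _text_:3a term in document 562 from the constants printed above; the variable names are illustrative, and small deviations in the last digits are expected because the printed constants are already rounded.

```python
import math

# Constants copied from the explain tree of result 1 (doc 562), term "_text_:3a".
freq       = 2.0         # termFreq in the field
doc_freq   = 24          # docFreq
max_docs   = 44218       # maxDocs
query_norm = 0.04534384  # queryNorm
field_norm = 0.046875    # fieldNorm(doc=562)

tf  = math.sqrt(freq)                            # 1.4142135 = tf(freq=2.0)
idf = 1.0 + math.log(max_docs / (doc_freq + 1))  # 8.478011 = idf(docFreq=24, maxDocs=44218)

query_weight = idf * query_norm                  # 0.38442558 = queryWeight
field_weight = tf * idf * field_norm             # 0.56201804 = fieldWeight in 562
term_score   = query_weight * field_weight       # 0.21605411 = weight(_text_:3a in 562)

print(tf, idf, query_weight, field_weight, term_score)
```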
  2. Kleinoeder, H.H.; Puzicha, J.: Automatische Katalogisierung am Beispiel einer Pilotanwendung (2002) 0.04
    0.038035594 = product of:
      0.07607119 = sum of:
        0.07607119 = product of:
          0.114106774 = sum of:
            0.043292392 = weight(_text_:h in 1154) [ClassicSimilarity], result of:
              0.043292392 = score(doc=1154,freq=2.0), product of:
                0.11265446 = queryWeight, product of:
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.04534384 = queryNorm
                0.38429362 = fieldWeight in 1154, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.109375 = fieldNorm(doc=1154)
            0.070814386 = weight(_text_:j in 1154) [ClassicSimilarity], result of:
              0.070814386 = score(doc=1154,freq=2.0), product of:
                0.14407988 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.04534384 = queryNorm
                0.4914939 = fieldWeight in 1154, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.109375 = fieldNorm(doc=1154)
          0.6666667 = coord(2/3)
      0.5 = coord(1/2)
    
    Source
    Info 7. 17(2002) H.1, S.19-21
  3. HaCohen-Kerner, Y. et al.: Classification using various machine learning methods and combinations of key-phrases and visual features (2016) 0.04
    0.037338786 = product of:
      0.07467757 = sum of:
        0.07467757 = product of:
          0.11201635 = sum of:
            0.0505817 = weight(_text_:j in 2748) [ClassicSimilarity], result of:
              0.0505817 = score(doc=2748,freq=2.0), product of:
                0.14407988 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.04534384 = queryNorm
                0.35106707 = fieldWeight in 2748, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.078125 = fieldNorm(doc=2748)
            0.061434645 = weight(_text_:22 in 2748) [ClassicSimilarity], result of:
              0.061434645 = score(doc=2748,freq=2.0), product of:
                0.1587864 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04534384 = queryNorm
                0.38690117 = fieldWeight in 2748, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.078125 = fieldNorm(doc=2748)
          0.6666667 = coord(2/3)
      0.5 = coord(1/2)
    
    Date
    1. 2.2016 18:25:22
    Source
    Semantic keyword-based search on structured data sources: First COST Action IC1302 International KEYSTONE Conference, IKC 2015, Coimbra, Portugal, September 8-9, 2015. Revised Selected Papers. Eds.: J. Cardoso et al
  4. Wätjen, H.-J.; Diekmann, B.; Möller, G.; Carstensen, K.-U.: Bericht zum DFG-Projekt: GERHARD : German Harvest Automated Retrieval and Directory (1998) 0.03
    0.02716828 = product of:
      0.05433656 = sum of:
        0.05433656 = product of:
          0.08150484 = sum of:
            0.030923137 = weight(_text_:h in 3065) [ClassicSimilarity], result of:
              0.030923137 = score(doc=3065,freq=2.0), product of:
                0.11265446 = queryWeight, product of:
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.04534384 = queryNorm
                0.27449545 = fieldWeight in 3065, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.078125 = fieldNorm(doc=3065)
            0.0505817 = weight(_text_:j in 3065) [ClassicSimilarity], result of:
              0.0505817 = score(doc=3065,freq=2.0), product of:
                0.14407988 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.04534384 = queryNorm
                0.35106707 = fieldWeight in 3065, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.078125 = fieldNorm(doc=3065)
          0.6666667 = coord(2/3)
      0.5 = coord(1/2)
    
  5. Wätjen, H.-J.: Automatisches Sammeln, Klassifizieren und Indexieren von wissenschaftlich relevanten Informationsressourcen im deutschen World Wide Web : das DFG-Projekt GERHARD (1998) 0.03
    0.02716828 = product of:
      0.05433656 = sum of:
        0.05433656 = product of:
          0.08150484 = sum of:
            0.030923137 = weight(_text_:h in 3066) [ClassicSimilarity], result of:
              0.030923137 = score(doc=3066,freq=2.0), product of:
                0.11265446 = queryWeight, product of:
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.04534384 = queryNorm
                0.27449545 = fieldWeight in 3066, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.078125 = fieldNorm(doc=3066)
            0.0505817 = weight(_text_:j in 3066) [ClassicSimilarity], result of:
              0.0505817 = score(doc=3066,freq=2.0), product of:
                0.14407988 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.04534384 = queryNorm
                0.35106707 = fieldWeight in 3066, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.078125 = fieldNorm(doc=3066)
          0.6666667 = coord(2/3)
      0.5 = coord(1/2)
    
  6. Bollmann, P.; Konrad, E.; Schneider, H.-J.; Zuse, H.: Anwendung automatischer Klassifikationsverfahren mit dem System FAKYR (1978) 0.03
    0.0251503 = product of:
      0.0503006 = sum of:
        0.0503006 = product of:
          0.0754509 = sum of:
            0.034985535 = weight(_text_:h in 82) [ClassicSimilarity], result of:
              0.034985535 = score(doc=82,freq=4.0), product of:
                0.11265446 = queryWeight, product of:
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.04534384 = queryNorm
                0.31055614 = fieldWeight in 82, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.0625 = fieldNorm(doc=82)
            0.040465362 = weight(_text_:j in 82) [ClassicSimilarity], result of:
              0.040465362 = score(doc=82,freq=2.0), product of:
                0.14407988 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.04534384 = queryNorm
                0.28085366 = fieldWeight in 82, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.0625 = fieldNorm(doc=82)
          0.6666667 = coord(2/3)
      0.5 = coord(1/2)
    
  7. Bock, H.-H.: Datenanalyse zur Strukturierung und Ordnung von Information (1989) 0.02
    0.024538865 = product of:
      0.04907773 = sum of:
        0.04907773 = product of:
          0.073616594 = sum of:
            0.030612344 = weight(_text_:h in 141) [ClassicSimilarity], result of:
              0.030612344 = score(doc=141,freq=4.0), product of:
                0.11265446 = queryWeight, product of:
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.04534384 = queryNorm
                0.27173662 = fieldWeight in 141, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=141)
            0.04300425 = weight(_text_:22 in 141) [ClassicSimilarity], result of:
              0.04300425 = score(doc=141,freq=2.0), product of:
                0.1587864 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04534384 = queryNorm
                0.2708308 = fieldWeight in 141, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=141)
          0.6666667 = coord(2/3)
      0.5 = coord(1/2)
    
    Pages
    S.1-22
  8. Egbert, J.; Biber, D.; Davies, M.: Developing a bottom-up, user-based method of web register classification (2015) 0.02
    0.02240327 = product of:
      0.04480654 = sum of:
        0.04480654 = product of:
          0.06720981 = sum of:
            0.030349022 = weight(_text_:j in 2158) [ClassicSimilarity], result of:
              0.030349022 = score(doc=2158,freq=2.0), product of:
                0.14407988 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.04534384 = queryNorm
                0.21064025 = fieldWeight in 2158, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2158)
            0.036860786 = weight(_text_:22 in 2158) [ClassicSimilarity], result of:
              0.036860786 = score(doc=2158,freq=2.0), product of:
                0.1587864 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04534384 = queryNorm
                0.23214069 = fieldWeight in 2158, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2158)
          0.6666667 = coord(2/3)
      0.5 = coord(1/2)
    
    Date
    4. 8.2015 19:22:04
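At the document level, the explain trees combine the per-term weights with Lucene's coordination factors. A minimal sketch for result 8 (doc 2158) follows, reusing the two term weights listed in its breakdown; the clause counts behind the coord factors are simply read off the printed fractions, not known independently.

```python
# Per-term weights from the breakdown of result 8 (doc 2158).
term_weights = [0.030349022, 0.036860786]  # weight(_text_:j), weight(_text_:22)

inner_coord = 2 / 3   # coord(2/3): 2 of 3 clauses of the inner BooleanQuery matched
outer_coord = 1 / 2   # coord(1/2): 1 of 2 clauses of the outer BooleanQuery matched

inner_sum = sum(term_weights)                      # 0.06720981 = sum of the term weights
doc_score = inner_sum * inner_coord * outer_coord  # 0.02240327 = final document score
print(round(doc_score, 8))                         # shown in the result list rounded to 0.02
```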
  9. Wätjen, H.-J.: GERHARD : Automatisches Sammeln, Klassifizieren und Indexieren von wissenschaftlich relevanten Informationsressourcen im deutschen World Wide Web (1998) 0.02
    0.022006512 = product of:
      0.044013023 = sum of:
        0.044013023 = product of:
          0.066019535 = sum of:
            0.030612344 = weight(_text_:h in 3064) [ClassicSimilarity], result of:
              0.030612344 = score(doc=3064,freq=4.0), product of:
                0.11265446 = queryWeight, product of:
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.04534384 = queryNorm
                0.27173662 = fieldWeight in 3064, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=3064)
            0.035407193 = weight(_text_:j in 3064) [ClassicSimilarity], result of:
              0.035407193 = score(doc=3064,freq=2.0), product of:
                0.14407988 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.04534384 = queryNorm
                0.24574696 = fieldWeight in 3064, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=3064)
          0.6666667 = coord(2/3)
      0.5 = coord(1/2)
    
    Source
    B.I.T.online. 1(1998) H.4, S.279-290
  10. Panyr, J.: Vektorraum-Modell und Clusteranalyse in Information-Retrieval-Systemen (1987) 0.02
    0.021734625 = product of:
      0.04346925 = sum of:
        0.04346925 = product of:
          0.065203875 = sum of:
            0.02473851 = weight(_text_:h in 2322) [ClassicSimilarity], result of:
              0.02473851 = score(doc=2322,freq=2.0), product of:
                0.11265446 = queryWeight, product of:
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.04534384 = queryNorm
                0.21959636 = fieldWeight in 2322, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.0625 = fieldNorm(doc=2322)
            0.040465362 = weight(_text_:j in 2322) [ClassicSimilarity], result of:
              0.040465362 = score(doc=2322,freq=2.0), product of:
                0.14407988 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.04534384 = queryNorm
                0.28085366 = fieldWeight in 2322, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.0625 = fieldNorm(doc=2322)
          0.6666667 = coord(2/3)
      0.5 = coord(1/2)
    
    Source
    Nachrichten für Dokumentation. 38(1987) H.1, S.13-20
  11. Na, J.-C.; Sui, H.; Khoo, C.; Chan, S.; Zhou, Y.: Effectiveness of simple linguistic processing in automatic sentiment classification of product reviews (2004) 0.01
    0.01358414 = product of:
      0.02716828 = sum of:
        0.02716828 = product of:
          0.04075242 = sum of:
            0.015461569 = weight(_text_:h in 2624) [ClassicSimilarity], result of:
              0.015461569 = score(doc=2624,freq=2.0), product of:
                0.11265446 = queryWeight, product of:
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.04534384 = queryNorm
                0.13724773 = fieldWeight in 2624, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2624)
            0.02529085 = weight(_text_:j in 2624) [ClassicSimilarity], result of:
              0.02529085 = score(doc=2624,freq=2.0), product of:
                0.14407988 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.04534384 = queryNorm
                0.17553353 = fieldWeight in 2624, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2624)
          0.6666667 = coord(2/3)
      0.5 = coord(1/2)
    
  12. Pong, J.Y.-H.; Kwok, R.C.-W.; Lau, R.Y.-K.; Hao, J.-X.; Wong, P.C.-C.: A comparative study of two automatic document classification methods in a library setting (2008) 0.01
    0.01358414 = product of:
      0.02716828 = sum of:
        0.02716828 = product of:
          0.04075242 = sum of:
            0.015461569 = weight(_text_:h in 2532) [ClassicSimilarity], result of:
              0.015461569 = score(doc=2532,freq=2.0), product of:
                0.11265446 = queryWeight, product of:
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.04534384 = queryNorm
                0.13724773 = fieldWeight in 2532, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2532)
            0.02529085 = weight(_text_:j in 2532) [ClassicSimilarity], result of:
              0.02529085 = score(doc=2532,freq=2.0), product of:
                0.14407988 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.04534384 = queryNorm
                0.17553353 = fieldWeight in 2532, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2532)
          0.6666667 = coord(2/3)
      0.5 = coord(1/2)
    
  13. Panyr, J.: STEINADLER: ein Verfahren zur automatischen Deskribierung und zur automatischen thematischen Klassifikation (1978) 0.01
    0.013488455 = product of:
      0.02697691 = sum of:
        0.02697691 = product of:
          0.080930725 = sum of:
            0.080930725 = weight(_text_:j in 5169) [ClassicSimilarity], result of:
              0.080930725 = score(doc=5169,freq=2.0), product of:
                0.14407988 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.04534384 = queryNorm
                0.5617073 = fieldWeight in 5169, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.125 = fieldNorm(doc=5169)
          0.33333334 = coord(1/3)
      0.5 = coord(1/2)
    
  14. Subramanian, S.; Shafer, K.E.: Clustering (2001) 0.01
    0.012286929 = product of:
      0.024573859 = sum of:
        0.024573859 = product of:
          0.07372157 = sum of:
            0.07372157 = weight(_text_:22 in 1046) [ClassicSimilarity], result of:
              0.07372157 = score(doc=1046,freq=2.0), product of:
                0.1587864 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04534384 = queryNorm
                0.46428138 = fieldWeight in 1046, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.09375 = fieldNorm(doc=1046)
          0.33333334 = coord(1/3)
      0.5 = coord(1/2)
    
    Date
    5. 5.2003 14:17:22
  15. Bock, H.-H.: Automatische Klassifikation : theoretische und praktische Methoden zur Gruppierung und Strukturierung von Daten (Cluster-Analyse) (1974) 0.01
    0.011661845 = product of:
      0.02332369 = sum of:
        0.02332369 = product of:
          0.06997107 = sum of:
            0.06997107 = weight(_text_:h in 7693) [ClassicSimilarity], result of:
              0.06997107 = score(doc=7693,freq=4.0), product of:
                0.11265446 = queryWeight, product of:
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.04534384 = queryNorm
                0.6211123 = fieldWeight in 7693, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.125 = fieldNorm(doc=7693)
          0.33333334 = coord(1/3)
      0.5 = coord(1/2)
    
  16. Reiner, U.: Automatische DDC-Klassifizierung von bibliografischen Titeldatensätzen (2009) 0.01
    0.0102391075 = product of:
      0.020478215 = sum of:
        0.020478215 = product of:
          0.061434645 = sum of:
            0.061434645 = weight(_text_:22 in 611) [ClassicSimilarity], result of:
              0.061434645 = score(doc=611,freq=2.0), product of:
                0.1587864 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04534384 = queryNorm
                0.38690117 = fieldWeight in 611, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.078125 = fieldNorm(doc=611)
          0.33333334 = coord(1/3)
      0.5 = coord(1/2)
    
    Date
    22. 8.2009 12:54:24
  17. Panyr, J.: Automatische Klassifikation und Information Retrieval : Anwendung und Entwicklung komplexer Verfahren in Information-Retrieval-Systemen und ihre Evaluierung (1986) 0.01
    0.010116341 = product of:
      0.020232681 = sum of:
        0.020232681 = product of:
          0.060698044 = sum of:
            0.060698044 = weight(_text_:j in 32) [ClassicSimilarity], result of:
              0.060698044 = score(doc=32,freq=2.0), product of:
                0.14407988 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.04534384 = queryNorm
                0.4212805 = fieldWeight in 32, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.09375 = fieldNorm(doc=32)
          0.33333334 = coord(1/3)
      0.5 = coord(1/2)
    
  18. Godby, C. J.; Stuler, J.: The Library of Congress Classification as a knowledge base for automatic subject categorization (2001) 0.01
    0.009537777 = product of:
      0.019075554 = sum of:
        0.019075554 = product of:
          0.05722666 = sum of:
            0.05722666 = weight(_text_:j in 1567) [ClassicSimilarity], result of:
              0.05722666 = score(doc=1567,freq=4.0), product of:
                0.14407988 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.04534384 = queryNorm
                0.39718705 = fieldWeight in 1567, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.0625 = fieldNorm(doc=1567)
          0.33333334 = coord(1/3)
      0.5 = coord(1/2)
    
  19. Oberhauser, O.: Automatisches Klassifizieren : Entwicklungsstand - Methodik - Anwendungsbereiche (2005) 0.01
    0.007859468 = product of:
      0.015718937 = sum of:
        0.015718937 = product of:
          0.023578405 = sum of:
            0.01093298 = weight(_text_:h in 38) [ClassicSimilarity], result of:
              0.01093298 = score(doc=38,freq=4.0), product of:
                0.11265446 = queryWeight, product of:
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.04534384 = queryNorm
                0.0970488 = fieldWeight in 38, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  2.4844491 = idf(docFreq=10020, maxDocs=44218)
                  0.01953125 = fieldNorm(doc=38)
            0.012645425 = weight(_text_:j in 38) [ClassicSimilarity], result of:
              0.012645425 = score(doc=38,freq=2.0), product of:
                0.14407988 = queryWeight, product of:
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.04534384 = queryNorm
                0.08776677 = fieldWeight in 38, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1774964 = idf(docFreq=5010, maxDocs=44218)
                  0.01953125 = fieldNorm(doc=38)
          0.6666667 = coord(2/3)
      0.5 = coord(1/2)
    
    Footnote
    Rez. in: VÖB-Mitteilungen 58(2005) H.3, S.102-104 (R.F. Müller); ZfBB 53(2006) H.5, S.282-283 (L. Svensson): "Collecting and cataloguing electronic resources has long been part of everyday work in academic libraries. At the same time, a paradigm shift in discovery tools is on the horizon: in order to offer efficient, user-oriented access to these mixed collections, some library service providers, e.g. the hbz (http://suchen.hbz-nrw.de/dreilaender/), the North Carolina State University library (www.lib.ncsu.edu/) and, soon, vascoda (www.vascoda.de/) and the Librarians' Internet Index (www.lii.org/), are increasingly experimenting with search-engine technology. The aim is to offer not only a fully inverted search index but also browsing through a hierarchically ordered classification. Of the data in the German union catalogues, however, only a small portion has so far been classified. Imported records from the Anglo-American sphere are often classified with LCC and/or DDC, although the Library of Congress concentrates its DDC classification on titles that are chiefly of interest to public libraries. From 2007 the Deutsche Nationalbibliothek will classify print media and university theses comprehensively with DDC. It is already obvious, however, that, above all for electronic publications, the volume of incoming documents cannot be indexed intellectually with ever scarcer staff resources; new methods have to be developed instead. Oberhauser's book therefore arrives at just the right time. Since the early 1990s several projects on automatic classification have been carried out. Anyone who wanted to work into this topic, or who was interested in the results of the larger projects, has until now had no overview to fall back on and was dependent on a multitude of individual studies and the project documentation. Oberhauser's account, which rests on a wealth of published and grey literature, closes this gap. The author fulfils with flying colours his self-imposed goal of conveying an intelligible overview of the current state of knowledge and the results of the relevant projects. It should be noted that he presupposes basic library knowledge and at least elementary familiarity with information-science concepts and questions; a few pointers to introductory works would have been welcome here for newcomers.
    The question posed at the beginning of the work, whether »the techniques of automatic classification are already mature enough today that large volumes of electronic documents [-] can be indexed satisfactorily with them« (p. 13), is answered by the author with an unequivocal »no«, which strongly qualifies Salton and McGill's 1983 claim »that simple automatic indexing methods work quickly and inexpensively, and that they achieve recall and precision values at least as good as those of manual indexing with a controlled vocabulary« (Gerard Salton and Michael J. McGill: Information Retrieval. Hamburg et al. 1987, p. 64 f.). Oberhauser does not want to speculate about the reasons why three of the large projects are no longer being pursued, but names lack of success, relocation of the work within the participating institutions and funding problems as possible causes. Today the author sees the greatest development potential for the automatic indexing of large document collections in the fields of patent and media documentation. The library sector should follow this development closely, since it »is certainly aiming, in the medium term, at a qualitatively satisfactory full automation« (p. 146). Oberhauser's account is a thoroughly successful work that belongs on the reference shelf of everyone interested in automatic indexing."
  20. Dubin, D.: Dimensions and discriminability (1998) 0.01
    0.0071673747 = product of:
      0.014334749 = sum of:
        0.014334749 = product of:
          0.04300425 = sum of:
            0.04300425 = weight(_text_:22 in 2338) [ClassicSimilarity], result of:
              0.04300425 = score(doc=2338,freq=2.0), product of:
                0.1587864 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04534384 = queryNorm
                0.2708308 = fieldWeight in 2338, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=2338)
          0.33333334 = coord(1/3)
      0.5 = coord(1/2)
    
    Date
    22. 9.1997 19:16:05

Languages

  • e 54
  • d 30

Types

  • a 72
  • el 11
  • m 3
  • r 3
  • d 1
  • x 1