Search (70 results, page 1 of 4)

  • Filter: theme_ss:"Automatisches Klassifizieren"
  1. Hotho, A.; Bloehdorn, S.: Data Mining 2004 : Text classification by boosting weak learners based on terms and concepts (2004) 0.10
    0.09784776 = sum of:
      0.05348208 = product of:
        0.21392833 = sum of:
          0.21392833 = weight(_text_:3a in 562) [ClassicSimilarity], result of:
            0.21392833 = score(doc=562,freq=2.0), product of:
              0.38064316 = queryWeight, product of:
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.044897694 = queryNorm
              0.56201804 = fieldWeight in 562, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.046875 = fieldNorm(doc=562)
        0.25 = coord(1/4)
      0.04436568 = product of:
        0.06654852 = sum of:
          0.030050414 = weight(_text_:j in 562) [ClassicSimilarity], result of:
            0.030050414 = score(doc=562,freq=2.0), product of:
              0.14266226 = queryWeight, product of:
                3.1774964 = idf(docFreq=5010, maxDocs=44218)
                0.044897694 = queryNorm
              0.21064025 = fieldWeight in 562, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.1774964 = idf(docFreq=5010, maxDocs=44218)
                0.046875 = fieldNorm(doc=562)
          0.036498103 = weight(_text_:22 in 562) [ClassicSimilarity], result of:
            0.036498103 = score(doc=562,freq=2.0), product of:
              0.15722407 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.044897694 = queryNorm
              0.23214069 = fieldWeight in 562, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046875 = fieldNorm(doc=562)
        0.6666667 = coord(2/3)
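    (A short Python sketch placed after the results list reproduces the arithmetic of this breakdown.)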
    
    Content
    Cf.: http://www.google.de/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cad=rja&ved=0CEAQFjAA&url=http%3A%2F%2Fciteseerx.ist.psu.edu%2Fviewdoc%2Fdownload%3Fdoi%3D10.1.1.91.4940%26rep%3Drep1%26type%3Dpdf&ei=dOXrUMeIDYHDtQahsIGACg&usg=AFQjCNHFWVh6gNPvnOrOS9R3rkrXCNVD-A&sig2=5I2F5evRfMnsttSgFF9g7Q&bvm=bv.1357316858,d.Yms.
    Date
    8. 1.2013 10:22:32
  2. HaCohen-Kerner, Y. et al.: Classification using various machine learning methods and combinations of key-phrases and visual features (2016) 0.04
    
    Date
    1. 2.2016 18:25:22
    Source
    Semantic keyword-based search on structured data sources: First COST Action IC1302 International KEYSTONE Conference, IKC 2015, Coimbra, Portugal, September 8-9, 2015. Revised Selected Papers. Eds.: J. Cardoso et al
  3. Wille, J.: Automatisches Klassifizieren bibliographischer Beschreibungsdaten : Vorgehensweise und Ergebnisse (2006) 0.03
    
    Abstract
    This thesis deals with the practical aspects of the automatic classification of bibliographic reference data. The focus is on the concrete procedure, demonstrated with the open-source program COBRA "Classification Of Bibliographic Records, Automatic", which was developed specifically for this purpose. The framework conditions and parameters for deployment in a library setting are clarified. Finally, classification results are evaluated using social-science data from the SOLIS database as an example.
  4. Egbert, J.; Biber, D.; Davies, M.: Developing a bottom-up, user-based method of web register classification (2015) 0.02
    
    Date
    4. 8.2015 19:22:04
  5. Panyr, J.: STEINADLER: ein Verfahren zur automatischen Deskribierung und zur automatischen thematischen Klassifikation (1978) 0.01
    
  6. Subramanian, S.; Shafer, K.E.: Clustering (2001) 0.01
    
    Date
    5. 5.2003 14:17:22
  7. Kleinoeder, H.H.; Puzicha, J.: Automatische Katalogisierung am Beispiel einer Pilotanwendung (2002) 0.01
    
  8. Oberhauser, O.: Automatisches Klassifizieren : Entwicklungsstand - Methodik - Anwendungsbereiche (2005) 0.01
    
    Footnote
    The question posed at the beginning of the work, whether "the techniques of automatic classification are already so far advanced today that large quantities of electronic documents [...] can be indexed satisfactorily with them" (p. 13), is answered by the author with an unambiguous "no", which strongly qualifies Salton and McGill's 1983 claim "that simple automatic indexing procedures work quickly and cheaply, and that they achieve recall and precision values at least as good as those of manual indexing with a controlled vocabulary" (Gerard Salton and Michael J. McGill: Information Retrieval. Hamburg et al. 1987, p. 64 f.). Oberhauser does not wish to speculate about the reasons why three of the large projects are no longer being pursued, but he names lack of success, relocation of the work within the participating institutions, and funding problems as possible causes. The author sees the greatest development potential for the automatic indexing of large document collections today in the fields of patent and media documentation. Here the library community should follow developments closely, since these efforts "certainly aim, in the medium term, at a qualitatively satisfactory full automation" (p. 146). Oberhauser's account is a thoroughly successful work that belongs in the reference collection of everyone interested in automatic indexing.
  9. Autonomy, Inc.: Automatic classification (n.d.) 0.01
    
    Source
    http://www.autonomy.com/Content/Products/IDOL/f/Classification#01
  10. Guerrero-Bote, V.P.; Moya Anegón, F. de; Herrero Solana, V.: Document organization using Kohonen's algorithm (2002) 0.01
    
  11. Reiner, U.: Automatische DDC-Klassifizierung von bibliografischen Titeldatensätzen (2009) 0.01
    
    Date
    22. 8.2009 12:54:24
  12. Panyr, J.: Automatische Klassifikation und Information Retrieval : Anwendung und Entwicklung komplexer Verfahren in Information-Retrieval-Systemen und ihre Evaluierung (1986) 0.01
    
  13. Godby, C. J.; Stuler, J.: ¬The Library of Congress Classification as a knowledge base for automatic subject categorization (2001) 0.01
    
  14. Salles, T.; Rocha, L.; Gonçalves, M.A.; Almeida, J.M.; Mourão, F.; Meira Jr., W.; Viegas, F.: ¬A quantitative analysis of the temporal effects on automatic text classification (2016) 0.01
    
  15. Sebastiani, F.: Classification of text, automatic (2006) 0.01
    
  16. Wätjen, H.-J.; Diekmann, B.; Möller, G.; Carstensen, K.-U.: Bericht zum DFG-Projekt: GERHARD : German Harvest Automated Retrieval and Directory (1998) 0.01
    
  17. Wätjen, H.-J.: Automatisches Sammeln, Klassifizieren und Indexieren von wissenschaftlich relevanten Informationsressourcen im deutschen World Wide Web : das DFG-Projekt GERHARD (1998) 0.01
    
  18. Sebastiani, F.: Machine learning in automated text categorization (2002) 0.01
    
  19. Sebastiani, F.: ¬A tutorial on automated text categorisation (1999) 0.01
    
  20. Hung, C.-M.; Chien, L.-F.: Web-based text classification in the absence of manually labeled training documents (2007) 0.01
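
The relevance value printed after each title is assigned by the underlying Lucene index. The nested breakdown shown under result 1 illustrates how its ClassicSimilarity scoring works: each matching clause contributes queryWeight (idf x queryNorm) times fieldWeight (tf x idf x fieldNorm, with tf equal to the square root of the term frequency), and each group of clauses is scaled by a coordination factor for the fraction of sub-queries that matched. The Python sketch below is an illustration only, not the engine's actual implementation; every constant is taken from that breakdown, and the clauses on the tokens "3a", "j" and "22" are the ones reported as matching in record 562.

    import math

    QUERY_NORM = 0.044897694  # query normalisation factor shared by every clause in the breakdown

    def clause_score(freq, idf, field_norm):
        # ClassicSimilarity per-clause score = queryWeight * fieldWeight
        #   queryWeight = idf * queryNorm
        #   fieldWeight = tf * idf * fieldNorm, with tf = sqrt(term frequency)
        query_weight = idf * QUERY_NORM
        field_weight = math.sqrt(freq) * idf * field_norm
        return query_weight * field_weight

    # Group 1: the "_text_:3a" clause; 1 of 4 sub-clauses matched -> coord(1/4)
    part1 = clause_score(2.0, 8.478011, 0.046875) * (1 / 4)

    # Group 2: the "_text_:j" and "_text_:22" clauses; 2 of 3 matched -> coord(2/3)
    part2 = (clause_score(2.0, 3.1774964, 0.046875)
             + clause_score(2.0, 3.5018296, 0.046875)) * (2 / 3)

    print(part1 + part2)  # ~0.09784776, the total reported for result 1

Running the sketch prints approximately 0.09784776, matching the total reported above; the other results on this page are scored by the same formula, differing only in which clauses match and in their field norms.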

Years

Languages

  • e (English) 51
  • d (German) 19

Types

  • a 59
  • el 11
  • m 2
  • r 2
  • d 1
  • x 1