Search (11 results, page 1 of 1)

  • theme_ss:"Computerlinguistik"
  • year_i:[2010 TO 2020}
  1. Budin, G.: Zum Entwicklungsstand der Terminologiewissenschaft (2019) 0.02
    0.01939604 = product of:
      0.03879208 = sum of:
        0.03879208 = product of:
          0.07758416 = sum of:
            0.07758416 = weight(_text_:book in 5604) [ClassicSimilarity], result of:
              0.07758416 = score(doc=5604,freq=2.0), product of:
                0.2272612 = queryWeight, product of:
                  4.414126 = idf(docFreq=1454, maxDocs=44218)
                  0.051484983 = queryNorm
                0.34138763 = fieldWeight in 5604, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.414126 = idf(docFreq=1454, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5604)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Footnote
Cf.: https://www.springer.com/de/book/9783662589489.
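The indented trees shown with each result are Lucene ClassicSimilarity (TF-IDF) "explain" breakdowns. As a sketch, the score of result 1 can be reproduced from the listed factors; the function and variable names below are illustrative, not part of the Lucene API:

```python
import math

def classic_similarity_score(freq, doc_freq, max_docs, field_norm, query_norm):
    """Recompute one leaf of a ClassicSimilarity explain tree."""
    idf = 1.0 + math.log(max_docs / (doc_freq + 1))  # 4.414126 for "book"
    tf = math.sqrt(freq)                             # 1.4142135 for freq=2.0
    query_weight = idf * query_norm                  # 0.2272612
    field_weight = tf * idf * field_norm             # 0.34138763
    return query_weight * field_weight               # 0.07758416

# Values taken from the explain tree of result 1 (doc 5604):
score = classic_similarity_score(2.0, 1454, 44218, 0.0546875, 0.051484983)

# The two coord(1/2) factors each halve the score:
# 0.07758416 -> 0.03879208 -> 0.01939604, the 0.02 shown in the result list.
final = score * 0.5 * 0.5
```

The same arithmetic, with the per-document fieldNorm and freq swapped in, reproduces the scores of the other results.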
  2. Mengel, T.: Wie viel Terminologiearbeit steckt in der Übersetzung der Dewey-Dezimalklassifikation? (2019) 0.02
    0.016625179 = product of:
      0.033250358 = sum of:
        0.033250358 = product of:
          0.066500716 = sum of:
            0.066500716 = weight(_text_:book in 5603) [ClassicSimilarity], result of:
              0.066500716 = score(doc=5603,freq=2.0), product of:
                0.2272612 = queryWeight, product of:
                  4.414126 = idf(docFreq=1454, maxDocs=44218)
                  0.051484983 = queryNorm
                0.29261798 = fieldWeight in 5603, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.414126 = idf(docFreq=1454, maxDocs=44218)
                  0.046875 = fieldNorm(doc=5603)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Footnote
Cf.: https://www.springer.com/de/book/9783662589489.
  3. Ramisch, C.: Multiword expressions acquisition : a generic and open framework (2015) 0.02
    0.015674368 = product of:
      0.031348735 = sum of:
        0.031348735 = product of:
          0.06269747 = sum of:
            0.06269747 = weight(_text_:book in 1649) [ClassicSimilarity], result of:
              0.06269747 = score(doc=1649,freq=4.0), product of:
                0.2272612 = queryWeight, product of:
                  4.414126 = idf(docFreq=1454, maxDocs=44218)
                  0.051484983 = queryNorm
                0.27588287 = fieldWeight in 1649, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  4.414126 = idf(docFreq=1454, maxDocs=44218)
                  0.03125 = fieldNorm(doc=1649)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
This book is an excellent introduction to multiword expressions. It provides a unique, comprehensive and up-to-date overview of this exciting topic in computational linguistics. The first part describes the diversity and richness of multiword expressions, including many examples in several languages. These constructions are not only complex and arbitrary, but also much more frequent than one would guess, making them a real nightmare for natural language processing applications. The second part introduces a new generic framework for automatic acquisition of multiword expressions from texts. Furthermore, it describes the accompanying free software tool, the mwetoolkit, which comes in handy when looking for expressions in texts (regardless of the language). Evaluation is greatly emphasized, underlining the fact that results depend on parameters like corpus size, language, MWE type, etc. The last part contains solid experimental results and evaluates the mwetoolkit, demonstrating its usefulness for computer-assisted lexicography and machine translation. This is the first book to cover the whole pipeline of multiword expression acquisition in a single volume. It addresses the needs of students and researchers in computational and theoretical linguistics, cognitive sciences, artificial intelligence and computer science. Its good balance between computational and linguistic views makes it the perfect starting point for anyone interested in multiword expressions, language and text processing in general.
  4. Lezius, W.: Morphy - Morphologie und Tagging für das Deutsche (2013) 0.01
    0.013951009 = product of:
      0.027902018 = sum of:
        0.027902018 = product of:
          0.055804037 = sum of:
            0.055804037 = weight(_text_:22 in 1490) [ClassicSimilarity], result of:
              0.055804037 = score(doc=1490,freq=2.0), product of:
                0.18029164 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051484983 = queryNorm
                0.30952093 = fieldWeight in 1490, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=1490)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    22. 3.2015 9:30:24
  5. Helbig, H.: Knowledge representation and the semantics of natural language (2014) 0.01
    0.013854314 = product of:
      0.027708627 = sum of:
        0.027708627 = product of:
          0.055417255 = sum of:
            0.055417255 = weight(_text_:book in 2396) [ClassicSimilarity], result of:
              0.055417255 = score(doc=2396,freq=2.0), product of:
                0.2272612 = queryWeight, product of:
                  4.414126 = idf(docFreq=1454, maxDocs=44218)
                  0.051484983 = queryNorm
                0.2438483 = fieldWeight in 2396, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.414126 = idf(docFreq=1454, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2396)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
Natural Language is not only the most important means of communication between human beings, it is also used over historical periods for the preservation of cultural achievements and their transmission from one generation to the other. During the last few decades, the flood of digitalized information has been growing tremendously. This tendency will continue with the globalisation of information societies and with the growing importance of national and international computer networks. This is one reason why the theoretical understanding and the automated treatment of communication processes based on natural language have such a decisive social and economic impact. In this context, the semantic representation of knowledge originally formulated in natural language plays a central part, because it connects all components of natural language processing systems, be they the automatic understanding of natural language (analysis), the rational reasoning over knowledge bases, or the generation of natural language expressions from formal representations. This book presents a method for the semantic representation of natural language expressions (texts, sentences, phrases, etc.) which can be used as a universal knowledge representation paradigm in the human sciences, like linguistics, cognitive psychology, or philosophy of language, as well as in computational linguistics and in artificial intelligence. It is also an attempt to close the gap between these disciplines, which to a large extent are still working separately.
  6. Terminologie : Epochen - Schwerpunkte - Umsetzungen : zum 25-jährigen Bestehen des Rats für Deutschsprachige Terminologie (2019) 0.01
    0.013854314 = product of:
      0.027708627 = sum of:
        0.027708627 = product of:
          0.055417255 = sum of:
            0.055417255 = weight(_text_:book in 5602) [ClassicSimilarity], result of:
              0.055417255 = score(doc=5602,freq=2.0), product of:
                0.2272612 = queryWeight, product of:
                  4.414126 = idf(docFreq=1454, maxDocs=44218)
                  0.051484983 = queryNorm
                0.2438483 = fieldWeight in 5602, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.414126 = idf(docFreq=1454, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5602)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Footnote
Cf.: https://www.springer.com/de/book/9783662589489.
  7. Huo, W.: Automatic multi-word term extraction and its application to Web-page summarization (2012) 0.01
    0.010463256 = product of:
      0.020926513 = sum of:
        0.020926513 = product of:
          0.041853026 = sum of:
            0.041853026 = weight(_text_:22 in 563) [ClassicSimilarity], result of:
              0.041853026 = score(doc=563,freq=2.0), product of:
                0.18029164 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051484983 = queryNorm
                0.23214069 = fieldWeight in 563, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=563)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    10. 1.2013 19:22:47
  8. Lawrie, D.; Mayfield, J.; McNamee, P.; Oard, P.W.: Cross-language person-entity linking from 20 languages (2015) 0.01
    0.010463256 = product of:
      0.020926513 = sum of:
        0.020926513 = product of:
          0.041853026 = sum of:
            0.041853026 = weight(_text_:22 in 1848) [ClassicSimilarity], result of:
              0.041853026 = score(doc=1848,freq=2.0), product of:
                0.18029164 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051484983 = queryNorm
                0.23214069 = fieldWeight in 1848, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=1848)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    The goal of entity linking is to associate references to an entity that is found in unstructured natural language content to an authoritative inventory of known entities. This article describes the construction of 6 test collections for cross-language person-entity linking that together span 22 languages. Fully automated components were used together with 2 crowdsourced validation stages to affordably generate ground-truth annotations with an accuracy comparable to that of a completely manual process. The resulting test collections each contain between 642 (Arabic) and 2,361 (Romanian) person references in non-English texts for which the correct resolution in English Wikipedia is known, plus a similar number of references for which no correct resolution into English Wikipedia is believed to exist. Fully automated cross-language person-name linking experiments with 20 non-English languages yielded a resolution accuracy of between 0.84 (Serbian) and 0.98 (Romanian), which compares favorably with previously reported cross-language entity linking results for Spanish.
  9. Fóris, A.: Network theory and terminology (2013) 0.01
    0.008719381 = product of:
      0.017438762 = sum of:
        0.017438762 = product of:
          0.034877524 = sum of:
            0.034877524 = weight(_text_:22 in 1365) [ClassicSimilarity], result of:
              0.034877524 = score(doc=1365,freq=2.0), product of:
                0.18029164 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051484983 = queryNorm
                0.19345059 = fieldWeight in 1365, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1365)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    2. 9.2014 21:22:48
  10. Rötzer, F.: KI-Programm besser als Menschen im Verständnis natürlicher Sprache (2018) 0.01
    0.0069755046 = product of:
      0.013951009 = sum of:
        0.013951009 = product of:
          0.027902018 = sum of:
            0.027902018 = weight(_text_:22 in 4217) [ClassicSimilarity], result of:
              0.027902018 = score(doc=4217,freq=2.0), product of:
                0.18029164 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051484983 = queryNorm
                0.15476047 = fieldWeight in 4217, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03125 = fieldNorm(doc=4217)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    22. 1.2018 11:32:44
  11. Deventer, J.P. van; Kruger, C.J.; Johnson, R.D.: Delineating knowledge management through lexical analysis : a retrospective (2015) 0.01
    0.0061035664 = product of:
      0.012207133 = sum of:
        0.012207133 = product of:
          0.024414266 = sum of:
            0.024414266 = weight(_text_:22 in 3807) [ClassicSimilarity], result of:
              0.024414266 = score(doc=3807,freq=2.0), product of:
                0.18029164 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051484983 = queryNorm
                0.1354154 = fieldWeight in 3807, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.02734375 = fieldNorm(doc=3807)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    20. 1.2015 18:30:22