Search (61 results, page 1 of 4)

  • theme_ss:"Computerlinguistik"
  1. Boleda, G.; Evert, S.: Multiword expressions : a pain in the neck of lexical semantics (2009) 0.09
    0.09429854 = product of:
      0.18859708 = sum of:
        0.18859708 = sum of:
          0.122909784 = weight(_text_:literatur in 4888) [ClassicSimilarity], result of:
            0.122909784 = score(doc=4888,freq=2.0), product of:
              0.19353195 = queryWeight, product of:
                4.7901325 = idf(docFreq=998, maxDocs=44218)
                0.04040221 = queryNorm
              0.63508785 = fieldWeight in 4888, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.7901325 = idf(docFreq=998, maxDocs=44218)
                0.09375 = fieldNorm(doc=4888)
          0.0656873 = weight(_text_:22 in 4888) [ClassicSimilarity], result of:
            0.0656873 = score(doc=4888,freq=2.0), product of:
              0.14148165 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.04040221 = queryNorm
              0.46428138 = fieldWeight in 4888, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.09375 = fieldNorm(doc=4888)
      0.5 = coord(1/2)
    
    Abstract
    With an overview of problems, methods, the state of research, and the literature.
    Date
    1. 3.2013 14:56:22
  2. Hotho, A.; Bloehdorn, S.: Data Mining 2004 : Text classification by boosting weak learners based on terms and concepts (2004) 0.05
    0.05492348 = sum of:
      0.038501654 = product of:
        0.19250827 = sum of:
          0.19250827 = weight(_text_:3a in 562) [ClassicSimilarity], result of:
            0.19250827 = score(doc=562,freq=2.0), product of:
              0.3425304 = queryWeight, product of:
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.04040221 = queryNorm
              0.56201804 = fieldWeight in 562, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.046875 = fieldNorm(doc=562)
        0.2 = coord(1/5)
      0.016421825 = product of:
        0.03284365 = sum of:
          0.03284365 = weight(_text_:22 in 562) [ClassicSimilarity], result of:
            0.03284365 = score(doc=562,freq=2.0), product of:
              0.14148165 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.04040221 = queryNorm
              0.23214069 = fieldWeight in 562, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046875 = fieldNorm(doc=562)
        0.5 = coord(1/2)
    
    Content
    Cf.: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.91.4940&rep=rep1&type=pdf
    Date
    8. 1.2013 10:22:32
  3. Stock, M.: Textwortmethode und Übersetzungsrelation : Eine Methode zum Aufbau von kombinierten Literaturnachweis- und Terminologiedatenbanken (1989) 0.03
    0.025606204 = product of:
      0.051212408 = sum of:
        0.051212408 = product of:
          0.102424815 = sum of:
            0.102424815 = weight(_text_:literatur in 3412) [ClassicSimilarity], result of:
              0.102424815 = score(doc=3412,freq=2.0), product of:
                0.19353195 = queryWeight, product of:
                  4.7901325 = idf(docFreq=998, maxDocs=44218)
                  0.04040221 = queryNorm
                0.52923983 = fieldWeight in 3412, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.7901325 = idf(docFreq=998, maxDocs=44218)
                  0.078125 = fieldNorm(doc=3412)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Subject information in the humanities requires close cooperation between bibliographic and terminological information systems. A suitable documentation method for analysing humanities literature is the text-word method (Textwortmethode). The repertoire of concepts, recorded in the original language, must be complemented by unified-language access that on the one hand guarantees complete and precise retrieval and on the other hand advances the building of subject-specific dictionaries.
  4. Scobel, G.: GPT: Eine Software, die die Welt verändert (2023) 0.03
    0.025606204 = product of:
      0.051212408 = sum of:
        0.051212408 = product of:
          0.102424815 = sum of:
            0.102424815 = weight(_text_:literatur in 839) [ClassicSimilarity], result of:
              0.102424815 = score(doc=839,freq=2.0), product of:
                0.19353195 = queryWeight, product of:
                  4.7901325 = idf(docFreq=998, maxDocs=44218)
                  0.04040221 = queryNorm
                0.52923983 = fieldWeight in 839, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.7901325 = idf(docFreq=998, maxDocs=44218)
                  0.078125 = fieldNorm(doc=839)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    https://www.zdf.de/nachrichten/panorama/gpt-ki-literatur-terrax-gert-scobel-kolumne-100.html?utm_source=pocket-newtab-global-de-DE
  5. Warner, A.J.: Natural language processing (1987) 0.02
    0.021895766 = product of:
      0.043791533 = sum of:
        0.043791533 = product of:
          0.087583065 = sum of:
            0.087583065 = weight(_text_:22 in 337) [ClassicSimilarity], result of:
              0.087583065 = score(doc=337,freq=2.0), product of:
                0.14148165 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04040221 = queryNorm
                0.61904186 = fieldWeight in 337, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.125 = fieldNorm(doc=337)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Annual review of information science and technology. 22(1987), S.79-108
  6. Stock, W.G.: Textwortmethode : Norbert Henrichs zum 65. (3) (2000) 0.02
    0.020484963 = product of:
      0.040969927 = sum of:
        0.040969927 = product of:
          0.081939854 = sum of:
            0.081939854 = weight(_text_:literatur in 4891) [ClassicSimilarity], result of:
              0.081939854 = score(doc=4891,freq=2.0), product of:
                0.19353195 = queryWeight, product of:
                  4.7901325 = idf(docFreq=998, maxDocs=44218)
                  0.04040221 = queryNorm
                0.42339188 = fieldWeight in 4891, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.7901325 = idf(docFreq=998, maxDocs=44218)
                  0.0625 = fieldNorm(doc=4891)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Only a few documentation methods are associated with the names of their developers. Exceptions are Melvil Dewey (DDC), S.R. Ranganathan (Colon Classification) - and Norbert Henrichs. His text-word method (Textwortmethode) makes it possible to index and retrieve literature from fields that have no universally accepted terminology, i.e. many of the social sciences and humanities, above all philosophy. Henrichs designed the text-word method in the late 1960s for use in electronic philosophy documentation. This makes him not only one of the pioneers of applying electronic data processing in information practice, but also the pioneer of documenting terminologically non-rigid specialist languages.
  7. Noever, D.; Ciolino, M.: ¬The Turing deception (2022) 0.02
    0.019250827 = product of:
      0.038501654 = sum of:
        0.038501654 = product of:
          0.19250827 = sum of:
            0.19250827 = weight(_text_:3a in 862) [ClassicSimilarity], result of:
              0.19250827 = score(doc=862,freq=2.0), product of:
                0.3425304 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.04040221 = queryNorm
                0.56201804 = fieldWeight in 862, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.046875 = fieldNorm(doc=862)
          0.2 = coord(1/5)
      0.5 = coord(1/2)
    
    Source
    https://arxiv.org/abs/2212.06721
  8. McMahon, J.G.; Smith, F.J.: Improved statistical language model performance with automatic generated word hierarchies (1996) 0.02
    0.019158795 = product of:
      0.03831759 = sum of:
        0.03831759 = product of:
          0.07663518 = sum of:
            0.07663518 = weight(_text_:22 in 3164) [ClassicSimilarity], result of:
              0.07663518 = score(doc=3164,freq=2.0), product of:
                0.14148165 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04040221 = queryNorm
                0.5416616 = fieldWeight in 3164, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.109375 = fieldNorm(doc=3164)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Computational linguistics. 22(1996) no.2, S.217-248
  9. Ruge, G.: ¬A spreading activation network for automatic generation of thesaurus relationships (1991) 0.02
    0.019158795 = product of:
      0.03831759 = sum of:
        0.03831759 = product of:
          0.07663518 = sum of:
            0.07663518 = weight(_text_:22 in 4506) [ClassicSimilarity], result of:
              0.07663518 = score(doc=4506,freq=2.0), product of:
                0.14148165 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04040221 = queryNorm
                0.5416616 = fieldWeight in 4506, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.109375 = fieldNorm(doc=4506)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    8.10.2000 11:52:22
  10. Somers, H.: Example-based machine translation : Review article (1999) 0.02
    0.019158795 = product of:
      0.03831759 = sum of:
        0.03831759 = product of:
          0.07663518 = sum of:
            0.07663518 = weight(_text_:22 in 6672) [ClassicSimilarity], result of:
              0.07663518 = score(doc=6672,freq=2.0), product of:
                0.14148165 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04040221 = queryNorm
                0.5416616 = fieldWeight in 6672, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.109375 = fieldNorm(doc=6672)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    31. 7.1996 9:22:19
  11. New tools for human translators (1997) 0.02
    0.019158795 = product of:
      0.03831759 = sum of:
        0.03831759 = product of:
          0.07663518 = sum of:
            0.07663518 = weight(_text_:22 in 1179) [ClassicSimilarity], result of:
              0.07663518 = score(doc=1179,freq=2.0), product of:
                0.14148165 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04040221 = queryNorm
                0.5416616 = fieldWeight in 1179, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.109375 = fieldNorm(doc=1179)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    31. 7.1996 9:22:19
  12. Baayen, R.H.; Lieber, H.: Word frequency distributions and lexical semantics (1997) 0.02
    0.019158795 = product of:
      0.03831759 = sum of:
        0.03831759 = product of:
          0.07663518 = sum of:
            0.07663518 = weight(_text_:22 in 3117) [ClassicSimilarity], result of:
              0.07663518 = score(doc=3117,freq=2.0), product of:
                0.14148165 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04040221 = queryNorm
                0.5416616 = fieldWeight in 3117, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.109375 = fieldNorm(doc=3117)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    28. 2.1999 10:48:22
  13. ¬Der Student aus dem Computer (2023) 0.02
    0.019158795 = product of:
      0.03831759 = sum of:
        0.03831759 = product of:
          0.07663518 = sum of:
            0.07663518 = weight(_text_:22 in 1079) [ClassicSimilarity], result of:
              0.07663518 = score(doc=1079,freq=2.0), product of:
                0.14148165 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04040221 = queryNorm
                0.5416616 = fieldWeight in 1079, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.109375 = fieldNorm(doc=1079)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    27. 1.2023 16:22:55
  14. Granitzer, M.: Statistische Verfahren der Textanalyse (2006) 0.02
    0.017924344 = product of:
      0.03584869 = sum of:
        0.03584869 = product of:
          0.07169738 = sum of:
            0.07169738 = weight(_text_:literatur in 5809) [ClassicSimilarity], result of:
              0.07169738 = score(doc=5809,freq=2.0), product of:
                0.19353195 = queryWeight, product of:
                  4.7901325 = idf(docFreq=998, maxDocs=44218)
                  0.04040221 = queryNorm
                0.3704679 = fieldWeight in 5809, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.7901325 = idf(docFreq=998, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5809)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    This article gives an overview of statistical methods of text analysis in the context of the Semantic Web. It opens with a discussion of methods and common techniques for preprocessing texts, such as stemming or part-of-speech tagging. The representations introduced in this way serve as the basis for statistical feature analyses as well as for more advanced techniques such as information extraction and machine learning. These specific techniques are surveyed in overview, with the aspects most important for the Semantic Web discussed in detail. The article closes with the application of the presented techniques to building and maintaining ontologies and with pointers to further literature.
  15. Byrne, C.C.; McCracken, S.A.: ¬An adaptive thesaurus employing semantic distance, relational inheritance and nominal compound interpretation for linguistic support of information retrieval (1999) 0.02
    0.016421825 = product of:
      0.03284365 = sum of:
        0.03284365 = product of:
          0.0656873 = sum of:
            0.0656873 = weight(_text_:22 in 4483) [ClassicSimilarity], result of:
              0.0656873 = score(doc=4483,freq=2.0), product of:
                0.14148165 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04040221 = queryNorm
                0.46428138 = fieldWeight in 4483, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.09375 = fieldNorm(doc=4483)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    15. 3.2000 10:22:37
  16. Monnerjahn, P.: Vorsprung ohne Technik : Übersetzen: Computer und Qualität (2000) 0.02
    0.016421825 = product of:
      0.03284365 = sum of:
        0.03284365 = product of:
          0.0656873 = sum of:
            0.0656873 = weight(_text_:22 in 5429) [ClassicSimilarity], result of:
              0.0656873 = score(doc=5429,freq=2.0), product of:
                0.14148165 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04040221 = queryNorm
                0.46428138 = fieldWeight in 5429, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.09375 = fieldNorm(doc=5429)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    c't. 2000, H.22, S.230-231
  17. dpa: 14 Forscher mit viel Geld angelockt : Wolfgang-Paul-Preis (2001) 0.02
    0.015363723 = product of:
      0.030727446 = sum of:
        0.030727446 = product of:
          0.061454892 = sum of:
            0.061454892 = weight(_text_:literatur in 6814) [ClassicSimilarity], result of:
              0.061454892 = score(doc=6814,freq=2.0), product of:
                0.19353195 = queryWeight, product of:
                  4.7901325 = idf(docFreq=998, maxDocs=44218)
                  0.04040221 = queryNorm
                0.31754392 = fieldWeight in 6814, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.7901325 = idf(docFreq=998, maxDocs=44218)
                  0.046875 = fieldNorm(doc=6814)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Content
    Darin. "Die Sprachwissenschaftlerin Christiane Fellbaum (dpa-Bild) wird ihr Preisgeld für das an der Berlin-Brandenburgischen Akademie der Wissenschaften zu erstellende "Digitale Wörterbuch der Deutschen Sprache des 20. Jahrhunderts" einsetzen. Sie setzt mit ihrem Computer dort an, wo konventionelle Wörterbücher nicht mehr mithalten können. Sie stellt per Knopfdruck Wortverbindungen her, die eine Sprache so reich an Bildern und Vorstellungen - und damit einzigartig - machen. Ihr elektronisches Lexikon aus über 500 Millionen Wörtern soll später als Datenbank zugänglich sein. Seine Grundlage ist die deutsche Sprache der vergangenen hundert Jahre - ein repräsentativer Querschnitt, zusammengestellt aus Literatur, Zeitungsdeutsch, Fachbuchsprache, Werbetexten und niedergeschriebener Umgangssprache. Wo ein Wörterbuch heute nur ein Wort mit Synonymen oder wenigen Verwendungsmöglichkeiten präsentiert, spannt die Forscherin ein riesiges Netz von Wortverbindungen. Bei Christiane Fellbaums Systematik heißt es beispielsweise nicht nur "verlieren", sondern auch noch "den Faden" oder "die Geduld" verlieren - samt allen möglichen weiteren Kombinationen, die der Computer wie eine Suchmaschine in seinen gespeicherten Texten findet."
  18. Computerlinguistik und Sprachtechnologie : Eine Einführung (2001) 0.02
    0.015363723 = product of:
      0.030727446 = sum of:
        0.030727446 = product of:
          0.061454892 = sum of:
            0.061454892 = weight(_text_:literatur in 1749) [ClassicSimilarity], result of:
              0.061454892 = score(doc=1749,freq=2.0), product of:
                0.19353195 = queryWeight, product of:
                  4.7901325 = idf(docFreq=998, maxDocs=44218)
                  0.04040221 = queryNorm
                0.31754392 = fieldWeight in 1749, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.7901325 = idf(docFreq=998, maxDocs=44218)
                  0.046875 = fieldNorm(doc=1749)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Theme
    Grundlagen u. Einführungen: Allgemeine Literatur
  19. Hutchins, J.: From first conception to first demonstration : the nascent years of machine translation, 1947-1954. A chronology (1997) 0.01
    0.013684854 = product of:
      0.027369708 = sum of:
        0.027369708 = product of:
          0.054739416 = sum of:
            0.054739416 = weight(_text_:22 in 1463) [ClassicSimilarity], result of:
              0.054739416 = score(doc=1463,freq=2.0), product of:
                0.14148165 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04040221 = queryNorm
                0.38690117 = fieldWeight in 1463, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.078125 = fieldNorm(doc=1463)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    31. 7.1996 9:22:19
  20. Kuhlmann, U.; Monnerjahn, P.: Sprache auf Knopfdruck : Sieben automatische Übersetzungsprogramme im Test (2000) 0.01
    0.013684854 = product of:
      0.027369708 = sum of:
        0.027369708 = product of:
          0.054739416 = sum of:
            0.054739416 = weight(_text_:22 in 5428) [ClassicSimilarity], result of:
              0.054739416 = score(doc=5428,freq=2.0), product of:
                0.14148165 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04040221 = queryNorm
                0.38690117 = fieldWeight in 5428, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.078125 = fieldNorm(doc=5428)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    c't. 2000, H.22, S.220-229
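
The relevance scores shown with each result above are Lucene-style "explain" breakdowns of a ClassicSimilarity (TF-IDF) score. Below is a minimal Python sketch, assuming the standard ClassicSimilarity formulas (tf = sqrt(freq), idf = 1 + ln(maxDocs / (docFreq + 1)), queryWeight = idf * queryNorm, fieldWeight = tf * idf * fieldNorm, and score = coord * sum of queryWeight * fieldWeight over matching clauses), that reproduces the numbers of result no. 1 (doc 4888); the queryNorm value is copied from the explanation, not derived:

    import math

    def tf(freq):
        # term-frequency factor: square root of the within-document frequency
        return math.sqrt(freq)

    def idf(doc_freq, max_docs):
        # inverse document frequency: 1 + ln(maxDocs / (docFreq + 1))
        return 1.0 + math.log(max_docs / (doc_freq + 1))

    QUERY_NORM = 0.04040221  # queryNorm as reported in the explanations above

    def clause_score(freq, doc_freq, max_docs, field_norm):
        i = idf(doc_freq, max_docs)
        query_weight = i * QUERY_NORM             # e.g. 4.7901325 * queryNorm = 0.19353195
        field_weight = tf(freq) * i * field_norm  # e.g. 1.4142135 * 4.7901325 * 0.09375 = 0.63508785
        return query_weight * field_weight

    # Result 1 (doc 4888): two matching clauses, combined with coord(1/2) = 0.5
    w_literatur = clause_score(2.0, 998, 44218, 0.09375)   # ~0.12290978 (_text_:literatur)
    w_22        = clause_score(2.0, 3622, 44218, 0.09375)  # ~0.06568730 (_text_:22)
    print(0.5 * (w_literatur + w_22))                      # ~0.09429854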

Years

Languages

  • e 36
  • d 25

Types

  • a 44
  • m 8
  • el 7
  • s 3
  • x 3
  • p 2
  • d 1