Search (15 results, page 1 of 1)

  • theme_ss:"Computerlinguistik"
  • type_ss:"el"
  1. Dampz, N.: ChatGPT interpretiert jetzt auch Bilder : Neue Version (2023) 0.02
    0.023228312 = product of:
      0.046456624 = sum of:
        0.046456624 = product of:
          0.09291325 = sum of:
            0.09291325 = weight(_text_:n in 874) [ClassicSimilarity], result of:
              0.09291325 = score(doc=874,freq=2.0), product of:
                0.19504215 = queryWeight, product of:
                  4.3116565 = idf(docFreq=1611, maxDocs=44218)
                  0.045236014 = queryNorm
                0.47637522 = fieldWeight in 874, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.3116565 = idf(docFreq=1611, maxDocs=44218)
                  0.078125 = fieldNorm(doc=874)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
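    Each hit is followed by the Lucene "explain" breakdown of its relevance score (ClassicSimilarity, i.e. classic TF-IDF with coordination and length-normalization factors). As a reading aid, the Python sketch below re-computes the numbers of the first tree; only the input values are taken from the explain output above, the variable names are ours.

      import math

      # Inputs copied from the explain tree of result 1 (term "n", doc 874)
      max_docs   = 44218        # documents in the index
      doc_freq   = 1611         # documents containing the term
      freq       = 2.0          # occurrences of the term in the field
      field_norm = 0.078125     # field length norm stored at index time
      query_norm = 0.045236014  # query normalization factor

      idf = 1.0 + math.log(max_docs / (doc_freq + 1))  # 4.3116565
      tf  = math.sqrt(freq)                            # 1.4142135

      query_weight = idf * query_norm                  # 0.19504215
      field_weight = tf * idf * field_norm             # 0.47637522
      term_score   = query_weight * field_weight       # 0.09291325

      # Two coord(1/2) factors: one of two query clauses matched at each
      # of the two nesting levels shown above
      final_score = term_score * 0.5 * 0.5             # 0.023228312
      print(final_score)

    The same arithmetic, with different freq, idf and fieldNorm inputs, accounts for every other score in this result list.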
    
  2. Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, L.; Polosukhin, I.: Attention Is all you need (2017) 0.02
    0.019709876 = product of:
      0.03941975 = sum of:
        0.03941975 = product of:
          0.0788395 = sum of:
            0.0788395 = weight(_text_:n in 970) [ClassicSimilarity], result of:
              0.0788395 = score(doc=970,freq=4.0), product of:
                0.19504215 = queryWeight, product of:
                  4.3116565 = idf(docFreq=1611, maxDocs=44218)
                  0.045236014 = queryNorm
                0.40421778 = fieldWeight in 970, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  4.3116565 = idf(docFreq=1611, maxDocs=44218)
                  0.046875 = fieldNorm(doc=970)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  3. Bubenhofer, N.: Einführung in die Korpuslinguistik : Praktische Grundlagen und Werkzeuge (2006) 0.02
    0.01858265 = product of:
      0.0371653 = sum of:
        0.0371653 = product of:
          0.0743306 = sum of:
            0.0743306 = weight(_text_:n in 3126) [ClassicSimilarity], result of:
              0.0743306 = score(doc=3126,freq=2.0), product of:
                0.19504215 = queryWeight, product of:
                  4.3116565 = idf(docFreq=1611, maxDocs=44218)
                  0.045236014 = queryNorm
                0.38110018 = fieldWeight in 3126, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.3116565 = idf(docFreq=1611, maxDocs=44218)
                  0.0625 = fieldNorm(doc=3126)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  4. Boleda, G.; Evert, S.: Multiword expressions : a pain in the neck of lexical semantics (2009) 0.02
    0.018386567 = product of:
      0.036773134 = sum of:
        0.036773134 = product of:
          0.07354627 = sum of:
            0.07354627 = weight(_text_:22 in 4888) [ClassicSimilarity], result of:
              0.07354627 = score(doc=4888,freq=2.0), product of:
                0.15840882 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045236014 = queryNorm
                0.46428138 = fieldWeight in 4888, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.09375 = fieldNorm(doc=4888)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    1. 3.2013 14:56:22
  5. Wordhoard (n.d.) 0.02
    0.016259817 = product of:
      0.032519635 = sum of:
        0.032519635 = product of:
          0.06503927 = sum of:
            0.06503927 = weight(_text_:n in 3922) [ClassicSimilarity], result of:
              0.06503927 = score(doc=3922,freq=2.0), product of:
                0.19504215 = queryWeight, product of:
                  4.3116565 = idf(docFreq=1611, maxDocs=44218)
                  0.045236014 = queryNorm
                0.33346266 = fieldWeight in 3922, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.3116565 = idf(docFreq=1611, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=3922)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    WordHoard defines a multiword unit as a special type of collocate in which the component words comprise a meaningful phrase. For example, "Knight of the Round Table" is a meaningful multiword unit or phrase. WordHoard uses the notion of a pseudo-bigram to generalize the computation of bigram (two word) statistical measures to phrases (n-grams) longer than two words, and to allow comparisons of these measures for phrases with different word counts. WordHoard applies the localmaxs algorithm of Silva et al. to the pseudo-bigrams to identify potential compositional phrases that "stand out" in a text. WordHoard can also filter two and three word phrases using the word class filters suggested by Justeson and Katz.
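    The "stand out" test mentioned above (the LocalMaxs criterion of Silva et al.) can be read as: a candidate phrase is kept if its association ("glue") score is a local maximum, i.e. not exceeded by any phrase it immediately contains and strictly higher than any phrase that immediately contains it. A minimal, hypothetical Python sketch of that test follows; glue, subphrases and superphrases are placeholder callables, not WordHoard's actual API.

      def is_local_max(phrase, glue, subphrases, superphrases):
          # Simplified LocalMaxs test: keep `phrase` if its glue score is
          # >= that of every immediate sub-phrase and > that of every
          # immediate super-phrase observed in the corpus.
          g = glue(phrase)
          return (all(g >= glue(p) for p in subphrases(phrase)) and
                  all(g > glue(p) for p in superphrases(phrase)))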
  6. WordHoard: finding multiword units (20??) 0.02
    0.016259817 = product of:
      0.032519635 = sum of:
        0.032519635 = product of:
          0.06503927 = sum of:
            0.06503927 = weight(_text_:n in 1123) [ClassicSimilarity], result of:
              0.06503927 = score(doc=1123,freq=2.0), product of:
                0.19504215 = queryWeight, product of:
                  4.3116565 = idf(docFreq=1611, maxDocs=44218)
                  0.045236014 = queryNorm
                0.33346266 = fieldWeight in 1123, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.3116565 = idf(docFreq=1611, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1123)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    WordHoard defines a multiword unit as a special type of collocate in which the component words comprise a meaningful phrase. For example, "Knight of the Round Table" is a meaningful multiword unit or phrase. WordHoard uses the notion of a pseudo-bigram to generalize the computation of bigram (two word) statistical measures to phrases (n-grams) longer than two words, and to allow comparisons of these measures for phrases with different word counts. WordHoard applies the localmaxs algorithm of Silva et al. to the pseudo-bigrams to identify potential compositional phrases that "stand out" in a text. WordHoard can also filter two and three word phrases using the word class filters suggested by Justeson and Katz.
  7. Aizawa, A.; Kohlhase, M.: Mathematical information retrieval (2021) 0.02
    0.016259817 = product of:
      0.032519635 = sum of:
        0.032519635 = product of:
          0.06503927 = sum of:
            0.06503927 = weight(_text_:n in 667) [ClassicSimilarity], result of:
              0.06503927 = score(doc=667,freq=2.0), product of:
                0.19504215 = queryWeight, product of:
                  4.3116565 = idf(docFreq=1611, maxDocs=44218)
                  0.045236014 = queryNorm
                0.33346266 = fieldWeight in 667, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.3116565 = idf(docFreq=1611, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=667)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Evaluating information retrieval and access tasks. Eds.: Sakai, T., Oard, D., Kando, N. [https://doi.org/10.1007/978-981-15-5554-1_12]
  8. Liu, P.J.; Saleh, M.; Pot, E.; Goodrich, B.; Sepassi, R.; Kaiser, L.; Shazeer, N.: Generating Wikipedia by summarizing long sequences (2018) 0.02
    0.016259817 = product of:
      0.032519635 = sum of:
        0.032519635 = product of:
          0.06503927 = sum of:
            0.06503927 = weight(_text_:n in 773) [ClassicSimilarity], result of:
              0.06503927 = score(doc=773,freq=2.0), product of:
                0.19504215 = queryWeight, product of:
                  4.3116565 = idf(docFreq=1611, maxDocs=44218)
                  0.045236014 = queryNorm
                0.33346266 = fieldWeight in 773, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.3116565 = idf(docFreq=1611, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=773)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  9. Perovsek, M.; Kranjc, J.; Erjavec, T.; Cestnik, B.; Lavrac, N.: TextFlows : a visual programming platform for text mining and natural language processing (2016) 0.01
    0.013936987 = product of:
      0.027873974 = sum of:
        0.027873974 = product of:
          0.05574795 = sum of:
            0.05574795 = weight(_text_:n in 2697) [ClassicSimilarity], result of:
              0.05574795 = score(doc=2697,freq=2.0), product of:
                0.19504215 = queryWeight, product of:
                  4.3116565 = idf(docFreq=1611, maxDocs=44218)
                  0.045236014 = queryNorm
                0.28582513 = fieldWeight in 2697, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.3116565 = idf(docFreq=1611, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2697)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  10. Lezius, W.: Morphy - Morphologie und Tagging für das Deutsche (2013) 0.01
    0.012257711 = product of:
      0.024515422 = sum of:
        0.024515422 = product of:
          0.049030844 = sum of:
            0.049030844 = weight(_text_:22 in 1490) [ClassicSimilarity], result of:
              0.049030844 = score(doc=1490,freq=2.0), product of:
                0.15840882 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045236014 = queryNorm
                0.30952093 = fieldWeight in 1490, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=1490)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    22. 3.2015 9:30:24
  11. Bager, J.: Die Text-KI ChatGPT schreibt Fachtexte, Prosa, Gedichte und Programmcode (2023) 0.01
    0.012257711 = product of:
      0.024515422 = sum of:
        0.024515422 = product of:
          0.049030844 = sum of:
            0.049030844 = weight(_text_:22 in 835) [ClassicSimilarity], result of:
              0.049030844 = score(doc=835,freq=2.0), product of:
                0.15840882 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045236014 = queryNorm
                0.30952093 = fieldWeight in 835, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=835)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    29.12.2022 18:22:55
  12. Rieger, F.: Lügende Computer (2023) 0.01
    0.012257711 = product of:
      0.024515422 = sum of:
        0.024515422 = product of:
          0.049030844 = sum of:
            0.049030844 = weight(_text_:22 in 912) [ClassicSimilarity], result of:
              0.049030844 = score(doc=912,freq=2.0), product of:
                0.15840882 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045236014 = queryNorm
                0.30952093 = fieldWeight in 912, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=912)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    16. 3.2023 19:22:55
  13. Brown, T.B.; Mann, B.; Ryder, N.; Subbiah, M.; Kaplan, J.; Dhariwal, P.; Neelakantan, A.; Shyam, P.; Sastry, G.; Askell, A.; Agarwal, S.; Herbert-Voss, A.; Krueger, G.; Henighan, T.; Child, R.; Ramesh, A.; Ziegler, D.M.; Wu, J.; Winter, C.; Hesse, C.; Chen, M.; Sigler, E.; Litwin, M.; Gray, S.; Chess, B.; Clark, J.; Berner, C.; McCandlish, S.; Radford, A.; Sutskever, I.; Amodei, D.: Language models are few-shot learners (2020) 0.01
    0.009291325 = product of:
      0.01858265 = sum of:
        0.01858265 = product of:
          0.0371653 = sum of:
            0.0371653 = weight(_text_:n in 872) [ClassicSimilarity], result of:
              0.0371653 = score(doc=872,freq=2.0), product of:
                0.19504215 = queryWeight, product of:
                  4.3116565 = idf(docFreq=1611, maxDocs=44218)
                  0.045236014 = queryNorm
                0.19055009 = fieldWeight in 872, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.3116565 = idf(docFreq=1611, maxDocs=44218)
                  0.03125 = fieldNorm(doc=872)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  14. Nagy T., I.: Detecting multiword expressions and named entities in natural language texts (2014) 0.01
    0.008129909 = product of:
      0.016259817 = sum of:
        0.016259817 = product of:
          0.032519635 = sum of:
            0.032519635 = weight(_text_:n in 1536) [ClassicSimilarity], result of:
              0.032519635 = score(doc=1536,freq=2.0), product of:
                0.19504215 = queryWeight, product of:
                  4.3116565 = idf(docFreq=1611, maxDocs=44218)
                  0.045236014 = queryNorm
                0.16673133 = fieldWeight in 1536, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.3116565 = idf(docFreq=1611, maxDocs=44218)
                  0.02734375 = fieldNorm(doc=1536)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Multiword expressions (MWEs) are lexical items that can be decomposed into single words and display lexical, syntactic, semantic, pragmatic and/or statistical idiosyncrasy (Sag et al., 2002; Kim, 2008; Calzolari et al., 2002). The proper treatment of multiword expressions such as rock 'n' roll and make a decision is essential for many natural language processing (NLP) applications like information extraction and retrieval, terminology extraction and machine translation, and it is important to identify multiword expressions in context. For example, in machine translation we must know that MWEs form one semantic unit, hence their parts should not be translated separately. For this, multiword expressions should be identified first in the text to be translated. The chief aim of this thesis is to develop machine learning-based approaches for the automatic detection of different types of multiword expressions in English and Hungarian natural language texts. In our investigations, we pay attention to the characteristics of different types of multiword expressions such as nominal compounds, multiword named entities and light verb constructions, and we apply novel methods to identify MWEs in raw texts. In the thesis it will be demonstrated that nominal compounds and multiword named entities may require a similar approach for their automatic detection as they behave in the same way from a linguistic point of view. Furthermore, it will be shown that the automatic detection of light verb constructions can be carried out using two effective machine learning-based approaches.
  15. Rötzer, F.: KI-Programm besser als Menschen im Verständnis natürlicher Sprache (2018) 0.01
    0.0061288555 = product of:
      0.012257711 = sum of:
        0.012257711 = product of:
          0.024515422 = sum of:
            0.024515422 = weight(_text_:22 in 4217) [ClassicSimilarity], result of:
              0.024515422 = score(doc=4217,freq=2.0), product of:
                0.15840882 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045236014 = queryNorm
                0.15476047 = fieldWeight in 4217, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03125 = fieldNorm(doc=4217)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    22. 1.2018 11:32:44