Search (71 results, page 1 of 4)

  • Filter: theme_ss:"Computerlinguistik"
  1. Hotho, A.; Bloehdorn, S.: Data Mining 2004 : Text classification by boosting weak learners based on terms and concepts (2004) 0.43
    Score analysis: 0.4340 = coord(5/7) · [0.0592 (term "3a", inner coord 1/3) + 3 × 0.1777 (term "2f") + 0.0152 (term "22", inner coord 1/2)], per Lucene ClassicSimilarity; the term weights are re-derived in the sketch after this entry.
    
    Content
     Cf.: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.91.4940&rep=rep1&type=pdf.
    Date
    8. 1.2013 10:22:32
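    The figures in the "Score analysis" lines are Lucene ClassicSimilarity values, as the system's own explain output names them. Below is a minimal Python sketch that re-derives entry 1's numbers from the constants reported there (docFreq, maxDocs, queryNorm, fieldNorm); the function and variable names are mine, not the system's:

      import math

      def idf(doc_freq, max_docs):
          # ClassicSimilarity: idf = 1 + ln(maxDocs / (docFreq + 1))
          return 1.0 + math.log(max_docs / (doc_freq + 1))

      def term_weight(freq, doc_freq, max_docs, query_norm, field_norm):
          # weight = queryWeight * fieldWeight
          #        = (idf * queryNorm) * (sqrt(freq) * idf * fieldNorm)
          i = idf(doc_freq, max_docs)
          return (i * query_norm) * (math.sqrt(freq) * i * field_norm)

      QUERY_NORM, MAX_DOCS = 0.03730009, 44218

      w_2f = term_weight(2.0, 24, MAX_DOCS, QUERY_NORM, 0.046875)    # ~0.1777 ("2f", "3a")
      w_22 = term_weight(2.0, 3622, MAX_DOCS, QUERY_NORM, 0.046875)  # ~0.0303 ("22")

      # Entry 1: outer coord(5/7) over five matching clauses; "3a" carries
      # an inner coord(1/3), "22" an inner coord(1/2).
      total = (5 / 7) * (w_2f / 3 + 3 * w_2f + w_22 / 2)
      print(round(total, 4))  # ~0.434, the 0.43 displayed next to entry 1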
  2. Noever, D.; Ciolino, M.: The Turing deception (2022) 0.34
    Score analysis: 0.3385 = coord(4/7) · [0.0592 (term "3a", inner coord 1/3) + 3 × 0.1777 (term "2f")]
    
    Source
     https://arxiv.org/abs/2212.06721
  3. Huo, W.: Automatic multi-word term extraction and its application to Web-page summarization (2012) 0.31
    Score analysis: 0.3133 = coord(4/7) · [3 × 0.1777 (term "2f") + 0.0152 (term "22", inner coord 1/2)]
    
    Content
     A thesis presented to the University of Guelph in partial fulfilment of the requirements for the degree of Master of Science in Computer Science. Cf.: http://www.inf.ufrgs.br/~ceramisch/download_files/publications/2009/p01.pdf.
    Date
    10. 1.2013 19:22:47
  4. Byrne, C.C.; McCracken, S.A.: An adaptive thesaurus employing semantic distance, relational inheritance and nominal compound interpretation for linguistic support of information retrieval (1999) 0.03
    Score analysis: 0.0252 = coord(1/7) · [0.1159 (term "anwendung") + 0.0606 (term "22")]
    
    Date
    15. 3.2000 10:22:37
    Theme
    Konzeption und Anwendung des Prinzips Thesaurus
  5. Owei, V.; Higa, K.: A paradigm for natural language explanation of database queries : a semantic data model approach (1994) 0.02
    Score analysis: 0.0241 = coord(1/7) · 0.1687 (term "interpretations")
    
    Abstract
     Presents an interface that provides the user with automatic feedback in the form of an explanation of how the database management system interprets user-specified queries. Proposes an approach that exploits the rich semantics of graphical semantic data models to construct a restricted natural-language explanation of database queries that are specified in a very high-level declarative form. These interpretations of the specified query represent the system's 'understanding' of the query and are returned to the user for validation.
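    As a toy illustration of the idea (not the authors' system), a declaratively specified query can be paraphrased back into restricted natural language for the user to validate; the query shape and wording below are invented:

      def explain(query):
          # query: {"select": ..., "from": ..., "where": [(field, relation, value), ...]}
          conds = " and ".join(f"{f} {rel} {v}" for f, rel, v in query["where"])
          return f"Find the {query['select']} of every {query['from']} whose {conds}."

      q = {"select": "name", "from": "employee",
           "where": [("salary", "is above", 50000), ("department", "is", "sales")]}
      print(explain(q))
      # -> Find the name of every employee whose salary is above 50000
      #    and department is sales.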
  6. Schneider, J.W.; Borlund, P.: A bibliometric-based semiautomatic approach to identification of candidate thesaurus terms : parsing and filtering of noun phrases from citation contexts (2005) 0.01
    Score analysis: 0.0147 = coord(1/7) · [0.0676 (term "anwendung") + 0.0354 (term "22")]
    
    Date
    8. 3.2007 19:55:22
    Theme
    Konzeption und Anwendung des Prinzips Thesaurus
  7. Semantik, Lexikographie und Computeranwendungen : Workshop ... (Bonn) : 1995.01.27-28 (1996) 0.01
    Score analysis: 0.0134 = coord(1/7) · [0.0683 (term "anwendung", freq=4) + 0.0253 (term "22")]
    
    Date
    14. 4.2007 10:04:22
    RSWK
    Computer / Anwendung / Computerunterstützte Lexikographie / Aufsatzsammlung
    Subject
    Computer / Anwendung / Computerunterstützte Lexikographie / Aufsatzsammlung
  8. Pimenov, E.N.: Normativnost' i nekotorye problem razrabotki tezauruzov i drugikh lingvistiicheskikh sredstv IPS (2000) 0.01
    Score analysis: 0.0069 = coord(1/7) · 0.0483 (term "anwendung", inner coord 1/2)
    
    Theme
    Konzeption und Anwendung des Prinzips Thesaurus
  9. Warner, A.J.: Natural language processing (1987) 0.01
    Score analysis: 0.0058 = coord(1/7) · 0.0404 (term "22", inner coord 1/2)
    
    Source
    Annual review of information science and technology. 22(1987), S.79-108
  10. Sparck Jones, K.; Kay, M.: Linguistik und Informationswissenschaft (1976) 0.01
    Score analysis: 0.0055 = coord(1/7) · 0.0386 (term "anwendung", inner coord 1/2)
    
    Abstract
     This work deals with the linguistic aspects of information science, in particular with the linguistic components of the analysis, description, and retrieval of documents. It explores which linguistic methods and theories information science can make use of. Among the topics examined are the application of language theory to the structure of knowledge and the use of phonology, morphology, syntax, and semantics in the organization, storage, and transmission of information.
  11. Stock, W.G.: Textwortmethode : Norbert Henrichs zum 65. (3) (2000) 0.01
    Score analysis: 0.0055 = coord(1/7) · 0.0386 (term "anwendung", inner coord 1/2)
    
    Abstract
     Only a few documentation methods are associated with the names of their developers. Exceptions are Melvil Dewey (DDC), S.R. Ranganathan (Colon Classification), and Norbert Henrichs. His Textwortmethode (text-word method) enables the indexing and retrieval of literature from fields that have no universally accepted terminology, that is, many of the social sciences and humanities, philosophy first and foremost. Henrichs designed the text-word method in the late 1960s for use in electronic philosophy documentation. He is thus not only one of the pioneers of applying electronic data processing in information practice, but also the pioneer of documenting terminologically non-rigid specialist languages.
  12. Baierer, K.; Zumstein, P.: Verbesserung der OCR in digitalen Sammlungen von Bibliotheken (2016) 0.01
    Score analysis: 0.0055 = coord(1/7) · 0.0386 (term "anwendung", inner coord 1/2)
    
    Abstract
     Ways of improving automatic text recognition (OCR) in digital collections, in particular through computational-linguistic methods, are described, and existing post-OCR procedures are analyzed. In contrast to these possibilities from research or from individual projects, the current use of OCR in library practice differs substantially and exploits the potential only in part.
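    As one very simple instance of the kind of post-OCR correction the abstract refers to (the article itself surveys more sophisticated computational-linguistic methods), out-of-lexicon tokens can be snapped to the closest dictionary entry; the lexicon and similarity cutoff below are illustrative assumptions:

      import difflib

      LEXICON = {"digital", "collections", "improve", "text", "recognition"}

      def correct(token):
          # Keep in-lexicon tokens; otherwise take the closest entry above
          # the (illustrative) similarity cutoff, or leave the token as-is.
          if token in LEXICON:
              return token
          match = difflib.get_close_matches(token, LEXICON, n=1, cutoff=0.8)
          return match[0] if match else token

      ocr_line = "digltal collectlons improve text recognltion"
      print(" ".join(correct(t) for t in ocr_line.split()))
      # -> digital collections improve text recognition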
  13. McMahon, J.G.; Smith, F.J.: Improved statistical language model performance with automatic generated word hierarchies (1996) 0.01
    Score analysis: 0.0051 = coord(1/7) · 0.0354 (term "22", inner coord 1/2)
    
    Source
    Computational linguistics. 22(1996) no.2, S.217-248
  14. Ruge, G.: A spreading activation network for automatic generation of thesaurus relationships (1991) 0.01
    Score analysis: 0.0051 = coord(1/7) · 0.0354 (term "22", inner coord 1/2)
    
    Date
    8.10.2000 11:52:22
  15. Somers, H.: Example-based machine translation : Review article (1999) 0.01
    Score analysis: 0.0051 = coord(1/7) · 0.0354 (term "22", inner coord 1/2)
    
    Date
    31. 7.1996 9:22:19
  16. New tools for human translators (1997) 0.01
    Score analysis: 0.0051 = coord(1/7) · 0.0354 (term "22", inner coord 1/2)
    
    Date
    31. 7.1996 9:22:19
  17. Baayen, R.H.; Lieber, H.: Word frequency distributions and lexical semantics (1997) 0.01
    Score analysis: 0.0051 = coord(1/7) · 0.0354 (term "22", inner coord 1/2)
    
    Date
    28. 2.1999 10:48:22
  18. Der Student aus dem Computer (2023) 0.01
    Score analysis: 0.0051 = coord(1/7) · 0.0354 (term "22", inner coord 1/2)
    
    Date
    27. 1.2023 16:22:55
  19. Granitzer, M.: Statistische Verfahren der Textanalyse (2006) 0.00
    Score analysis: 0.0048 = coord(1/7) · 0.0338 (term "anwendung", inner coord 1/2)
    
    Abstract
     This article offers an overview of statistical methods of text analysis in the context of the Semantic Web. By way of introduction, it discusses methods and common techniques for preprocessing texts, such as stemming or part-of-speech tagging. The representations introduced in this way serve as the basis for statistical feature analyses as well as for more advanced techniques such as information extraction and machine learning. These specific techniques are presented in overview, with detailed attention to the aspects most important for the Semantic Web. The article closes with the application of the presented techniques to the creation and maintenance of ontologies, along with pointers to further literature.
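    A minimal sketch of the preprocessing steps the abstract names (tokenization, part-of-speech tagging, stemming), here with NLTK as an assumed toolkit choice that the article itself does not prescribe:

      import nltk
      from nltk.stem import PorterStemmer

      # Older NLTK versions use "punkt" / "averaged_perceptron_tagger";
      # newer ones may need "punkt_tab" / "averaged_perceptron_tagger_eng".
      nltk.download("punkt", quiet=True)
      nltk.download("averaged_perceptron_tagger", quiet=True)

      text = "Statistical methods turn raw texts into feature representations."
      tokens = nltk.word_tokenize(text)
      print(nltk.pos_tag(tokens))                       # part-of-speech tags
      print([PorterStemmer().stem(t) for t in tokens])  # stems, e.g. "statist"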
  20. Schönbächler, E.; Strasser, T.; Himpsl-Gutermann, K.: Vom Chat zum Check : Informationskompetenz mit ChatGPT steigern (2023) 0.00
    Score analysis: 0.0048 = coord(1/7) · 0.0338 (term "anwendung", inner coord 1/2)
    
    Abstract
     The article takes up the current discourse around the AI application ChatGPT and its significance for schools and universities. An overview of various assistance systems based on artificial intelligence works out their foundations and differences. The field of chatbots is examined more closely: the two basic kinds, the rule-based chatbot and the machine-learning bot, are explained in practical terms with illustrative examples. Finally, the article argues that information literacy, as a key competence of the 21st century, is also the essential basis for dealing constructively with AI systems such as ChatGPT in education and for understanding their basic working mechanisms. A lesson plan on the topic of "bees" concludes this practical contribution.
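    A minimal sketch of the rule-based kind of chatbot the abstract contrasts with machine-learning bots: fixed pattern-to-response rules and no learning; the rules below are invented for illustration:

      # Fixed pattern -> response rules; first matching rule wins, no learning.
      RULES = {
          "hello": "Hi! Ask me something about bees.",
          "bee": "Bees are pollinators; a single hive can hold tens of thousands.",
          "bye": "Goodbye!",
      }

      def reply(message):
          text = message.lower()
          for pattern, response in RULES.items():
              if pattern in text:
                  return response
          return "I only know my fixed rules; try asking about bees."

      print(reply("Hello there"))     # matches the "hello" rule
      print(reply("What is a bee?"))  # matches the "bee" rule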

Years

Languages

  • e 40
  • d 31
  • m 1
  • ru 1

Types

  • a 53
  • m 10
  • el 7
  • s 5
  • x 3
  • p 2
  • d 1

Classifications