Search (200 results, page 1 of 10)

  • theme_ss:"Computerlinguistik"
  • type_ss:"a" (both active filters are illustrated in the request sketch below)
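
The two active filters above are Lucene/Solr-style field queries (the *_ss field suffixes and the ClassicSimilarity score traces attached to each hit suggest a Solr backend). As a rough, hedged sketch only, a request reproducing this result page might look like the following; host, core name, handler and parameter names are assumptions rather than anything documented on this page, and only the two fq values and the inferred page size (200 results over 10 pages) come from the listing itself.

    import urllib.parse

    # Hypothetical Solr request matching the two active facet filters shown above.
    # Host, core and handler are placeholders; only the fq values and the inferred
    # page size (200 hits / 10 pages = 20 per page) are taken from this listing.
    params = {
        "q": "*:*",
        "fq": ['theme_ss:"Computerlinguistik"', 'type_ss:"a"'],
        "rows": 20,
        "start": 0,              # page 1
        "debugQuery": "true",    # debug output; the per-hit score trees below look like Lucene "explain" data
    }
    print("http://localhost:8983/solr/mycore/select?"
          + urllib.parse.urlencode(params, doseq=True))
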
  1. Stock, M.; Stock, W.G.: Literaturnachweis- und Terminologiedatenbank : die Erfassung von Fachliteratur und Fachterminologie eines Fachgebiets in einer kombinierten Datenbank (1991) 0.06
    0.056307487 = product of:
      0.22522995 = sum of:
        0.1341764 = weight(_text_:bibliographien in 3411) [ClassicSimilarity], result of:
          0.1341764 = score(doc=3411,freq=2.0), product of:
            0.17148633 = queryWeight, product of:
              7.0817666 = idf(docFreq=100, maxDocs=44218)
              0.024215192 = queryNorm
            0.782432 = fieldWeight in 3411, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              7.0817666 = idf(docFreq=100, maxDocs=44218)
              0.078125 = fieldNorm(doc=3411)
        0.02276339 = weight(_text_:und in 3411) [ClassicSimilarity], result of:
          0.02276339 = score(doc=3411,freq=6.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.42413816 = fieldWeight in 3411, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=3411)
        0.02276339 = weight(_text_:und in 3411) [ClassicSimilarity], result of:
          0.02276339 = score(doc=3411,freq=6.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.42413816 = fieldWeight in 3411, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=3411)
        0.02276339 = weight(_text_:und in 3411) [ClassicSimilarity], result of:
          0.02276339 = score(doc=3411,freq=6.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.42413816 = fieldWeight in 3411, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=3411)
        0.02276339 = weight(_text_:und in 3411) [ClassicSimilarity], result of:
          0.02276339 = score(doc=3411,freq=6.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.42413816 = fieldWeight in 3411, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=3411)
      0.25 = coord(5/20)
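
    The breakdown above is Lucene ClassicSimilarity "explain" output: each matching clause contributes queryWeight × fieldWeight, with queryWeight = idf × queryNorm and fieldWeight = sqrt(tf) × idf × fieldNorm, and the clause sum is scaled by the coordination factor coord(5/20). The minimal sketch below only re-runs that arithmetic for this first hit with the numbers printed above; the helper name term_weight is illustrative, not any library API.

      import math

      # Re-check of the ClassicSimilarity explanation for hit 1 (doc 3411).
      # Every constant below is copied from the explain tree printed above.
      query_norm = 0.024215192
      field_norm = 0.078125                    # per-field length normalization

      def term_weight(idf, term_freq):
          # weight = queryWeight * fieldWeight
          #        = (idf * queryNorm) * (sqrt(tf) * idf * fieldNorm)
          return (idf * query_norm) * (math.sqrt(term_freq) * idf * field_norm)

      w_biblio = term_weight(idf=7.0817666, term_freq=2.0)  # ~0.1341764
      w_und    = term_weight(idf=2.216367,  term_freq=6.0)  # ~0.02276339, counted in four field clauses

      score = (w_biblio + 4 * w_und) * (5 / 20)             # coord(5/20) = 0.25
      print(score)                                          # ~0.056307487
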
    
    Abstract
    In wissenschaftlichen Spezialgebieten kann über den Aufbau einer Literaturdatenbank gleichzeitig eine Terminologiedatenbank mit erstellt werden. Als Dokumentationsmethode eignet sich die Textwortmethode mit Übersetzungsrelation. Mit dem Softwarepaket LBase aufgebaute Druckbildprogramme gestatten die Ausgabe von Bibliographien und Wörterbüchern
  2. Hotho, A.; Bloehdorn, S.: Data Mining 2004 : Text classification by boosting weak learners based on terms and concepts (2004) 0.06
    0.055812698 = product of:
      0.2790635 = sum of:
        0.038460143 = product of:
          0.11538043 = sum of:
            0.11538043 = weight(_text_:3a in 562) [ClassicSimilarity], result of:
              0.11538043 = score(doc=562,freq=2.0), product of:
                0.20529667 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.024215192 = queryNorm
                0.56201804 = fieldWeight in 562, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.046875 = fieldNorm(doc=562)
          0.33333334 = coord(1/3)
        0.11538043 = weight(_text_:2f in 562) [ClassicSimilarity], result of:
          0.11538043 = score(doc=562,freq=2.0), product of:
            0.20529667 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.024215192 = queryNorm
            0.56201804 = fieldWeight in 562, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=562)
        0.11538043 = weight(_text_:2f in 562) [ClassicSimilarity], result of:
          0.11538043 = score(doc=562,freq=2.0), product of:
            0.20529667 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.024215192 = queryNorm
            0.56201804 = fieldWeight in 562, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=562)
        0.009842472 = product of:
          0.019684944 = sum of:
            0.019684944 = weight(_text_:22 in 562) [ClassicSimilarity], result of:
              0.019684944 = score(doc=562,freq=2.0), product of:
                0.08479747 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.024215192 = queryNorm
                0.23214069 = fieldWeight in 562, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=562)
          0.5 = coord(1/2)
      0.2 = coord(4/20)
    
    Content
    Vgl.: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.91.4940&rep=rep1&type=pdf.
    Date
    8. 1.2013 10:22:32
  3. Gombocz, W.L.: Stichwort oder Schlagwort versus Textwort : Grazer und Düsseldorfer Philosophie-Dokumentation und -Information nach bzw. gemäß Norbert Henrichs (2000) 0.05
    0.051478792 = product of:
      0.17159596 = sum of:
        0.026284898 = weight(_text_:und in 3413) [ClassicSimilarity], result of:
          0.026284898 = score(doc=3413,freq=8.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.48975256 = fieldWeight in 3413, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=3413)
        0.026284898 = weight(_text_:und in 3413) [ClassicSimilarity], result of:
          0.026284898 = score(doc=3413,freq=8.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.48975256 = fieldWeight in 3413, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=3413)
        0.013349611 = weight(_text_:der in 3413) [ClassicSimilarity], result of:
          0.013349611 = score(doc=3413,freq=2.0), product of:
            0.054091092 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.024215192 = queryNorm
            0.2467987 = fieldWeight in 3413, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.078125 = fieldNorm(doc=3413)
        0.026284898 = weight(_text_:und in 3413) [ClassicSimilarity], result of:
          0.026284898 = score(doc=3413,freq=8.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.48975256 = fieldWeight in 3413, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=3413)
        0.053106755 = product of:
          0.10621351 = sum of:
            0.10621351 = weight(_text_:philosophie in 3413) [ClassicSimilarity], result of:
              0.10621351 = score(doc=3413,freq=4.0), product of:
                0.12829916 = queryWeight, product of:
                  5.298292 = idf(docFreq=600, maxDocs=44218)
                  0.024215192 = queryNorm
                0.82785815 = fieldWeight in 3413, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  5.298292 = idf(docFreq=600, maxDocs=44218)
                  0.078125 = fieldNorm(doc=3413)
          0.5 = coord(1/2)
        0.026284898 = weight(_text_:und in 3413) [ClassicSimilarity], result of:
          0.026284898 = score(doc=3413,freq=8.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.48975256 = fieldWeight in 3413, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=3413)
      0.3 = coord(6/20)
    
    Field
    Philosophie
    Imprint
    Düsseldorf : Universitäts- und Landesbibliothek
    Series
    Schriften der Universitäts- und Landesbibliothek Düsseldorf; 32
  4. Stock, W.G.: Textwortmethode : Norbert Henrichs zum 65. (3) (2000) 0.04
    0.043075215 = product of:
      0.14358404 = sum of:
        0.018210711 = weight(_text_:und in 4891) [ClassicSimilarity], result of:
          0.018210711 = score(doc=4891,freq=6.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.33931053 = fieldWeight in 4891, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=4891)
        0.018210711 = weight(_text_:und in 4891) [ClassicSimilarity], result of:
          0.018210711 = score(doc=4891,freq=6.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.33931053 = fieldWeight in 4891, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=4891)
        0.028255802 = weight(_text_:der in 4891) [ClassicSimilarity], result of:
          0.028255802 = score(doc=4891,freq=14.0), product of:
            0.054091092 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.024215192 = queryNorm
            0.5223744 = fieldWeight in 4891, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.0625 = fieldNorm(doc=4891)
        0.018210711 = weight(_text_:und in 4891) [ClassicSimilarity], result of:
          0.018210711 = score(doc=4891,freq=6.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.33931053 = fieldWeight in 4891, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=4891)
        0.0424854 = product of:
          0.0849708 = sum of:
            0.0849708 = weight(_text_:philosophie in 4891) [ClassicSimilarity], result of:
              0.0849708 = score(doc=4891,freq=4.0), product of:
                0.12829916 = queryWeight, product of:
                  5.298292 = idf(docFreq=600, maxDocs=44218)
                  0.024215192 = queryNorm
                0.6622865 = fieldWeight in 4891, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  5.298292 = idf(docFreq=600, maxDocs=44218)
                  0.0625 = fieldNorm(doc=4891)
          0.5 = coord(1/2)
        0.018210711 = weight(_text_:und in 4891) [ClassicSimilarity], result of:
          0.018210711 = score(doc=4891,freq=6.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.33931053 = fieldWeight in 4891, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=4891)
      0.3 = coord(6/20)
    
    Abstract
    Nur wenige Dokumentationsmethoden werden mit dem Namen ihrer Entwickler assoziiert. Ausnahmen sind Melvil Dewey (DDC), S.R. Ranganathan (Colon Classification) - und Norbert Henrichs. Seine Textwortmethode ermöglicht die Indexierung und das Retrieval von Literatur aus Fachgebieten, die keine allseits akzeptierte Fachterminologie vorweisen, also viele Sozial- und Geisteswissenschaften, vorneweg die Philosophie. Für den Einsatz in der elektronischen Philosophie-Dokumentation hat Henrichs in den späten sechziger Jahren die Textwortmethode entworfen. Er ist damit nicht nur einer der Pioniere der Anwendung der elektronischen Datenverarbeitung in der Informationspraxis, sondern auch der Pionier bei der Dokumentation terminologisch nicht starrer Fachsprachen
  5. Schank, R.C.: Computer, elementare Aktionen und linguistische Theorien (1977) 0.04
    0.04147122 = product of:
      0.16588488 = sum of:
        0.036798857 = weight(_text_:und in 6142) [ClassicSimilarity], result of:
          0.036798857 = score(doc=6142,freq=8.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.68565357 = fieldWeight in 6142, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.109375 = fieldNorm(doc=6142)
        0.036798857 = weight(_text_:und in 6142) [ClassicSimilarity], result of:
          0.036798857 = score(doc=6142,freq=8.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.68565357 = fieldWeight in 6142, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.109375 = fieldNorm(doc=6142)
        0.018689455 = weight(_text_:der in 6142) [ClassicSimilarity], result of:
          0.018689455 = score(doc=6142,freq=2.0), product of:
            0.054091092 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.024215192 = queryNorm
            0.34551817 = fieldWeight in 6142, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.109375 = fieldNorm(doc=6142)
        0.036798857 = weight(_text_:und in 6142) [ClassicSimilarity], result of:
          0.036798857 = score(doc=6142,freq=8.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.68565357 = fieldWeight in 6142, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.109375 = fieldNorm(doc=6142)
        0.036798857 = weight(_text_:und in 6142) [ClassicSimilarity], result of:
          0.036798857 = score(doc=6142,freq=8.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.68565357 = fieldWeight in 6142, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.109375 = fieldNorm(doc=6142)
      0.25 = coord(5/20)
    
    Series
    Grundlagen der Kommunikation und Kognition
    Source
    Semantik und künstliche Intelligenz: Beiträge zur automatischen Sprachbearbeitung II. Hrsg. und eingeleitet von P. Eisenberg
  6. Noever, D.; Ciolino, M.: ¬The Turing deception (2022) 0.04
    0.040383153 = product of:
      0.269221 = sum of:
        0.038460143 = product of:
          0.11538043 = sum of:
            0.11538043 = weight(_text_:3a in 862) [ClassicSimilarity], result of:
              0.11538043 = score(doc=862,freq=2.0), product of:
                0.20529667 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.024215192 = queryNorm
                0.56201804 = fieldWeight in 862, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.046875 = fieldNorm(doc=862)
          0.33333334 = coord(1/3)
        0.11538043 = weight(_text_:2f in 862) [ClassicSimilarity], result of:
          0.11538043 = score(doc=862,freq=2.0), product of:
            0.20529667 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.024215192 = queryNorm
            0.56201804 = fieldWeight in 862, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=862)
        0.11538043 = weight(_text_:2f in 862) [ClassicSimilarity], result of:
          0.11538043 = score(doc=862,freq=2.0), product of:
            0.20529667 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.024215192 = queryNorm
            0.56201804 = fieldWeight in 862, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=862)
      0.15 = coord(3/20)
    
    Source
    https://arxiv.org/abs/2212.06721
  7. Engerer, V.: Informationswissenschaft und Linguistik : kurze Geschichte eines fruchtbaren interdisziplinären Verhältnisses in drei Akten (2012) 0.04
    0.03787226 = product of:
      0.15148903 = sum of:
        0.02276339 = weight(_text_:und in 3376) [ClassicSimilarity], result of:
          0.02276339 = score(doc=3376,freq=6.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.42413816 = fieldWeight in 3376, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=3376)
        0.060435485 = weight(_text_:geschichte in 3376) [ClassicSimilarity], result of:
          0.060435485 = score(doc=3376,freq=2.0), product of:
            0.11508996 = queryWeight, product of:
              4.7528 = idf(docFreq=1036, maxDocs=44218)
              0.024215192 = queryNorm
            0.5251152 = fieldWeight in 3376, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.7528 = idf(docFreq=1036, maxDocs=44218)
              0.078125 = fieldNorm(doc=3376)
        0.02276339 = weight(_text_:und in 3376) [ClassicSimilarity], result of:
          0.02276339 = score(doc=3376,freq=6.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.42413816 = fieldWeight in 3376, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=3376)
        0.02276339 = weight(_text_:und in 3376) [ClassicSimilarity], result of:
          0.02276339 = score(doc=3376,freq=6.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.42413816 = fieldWeight in 3376, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=3376)
        0.02276339 = weight(_text_:und in 3376) [ClassicSimilarity], result of:
          0.02276339 = score(doc=3376,freq=6.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.42413816 = fieldWeight in 3376, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=3376)
      0.25 = coord(5/20)
    
    Source
    SDV - Sprache und Datenverarbeitung. International journal for language data processing. 36(2012) H.2, S.71-91 [= E-Books - Fakten, Perspektiven und Szenarien]
  8. Stock, W.G.: Textwortmethode (2000) 0.04
    0.037574004 = product of:
      0.12524667 = sum of:
        0.018586228 = weight(_text_:und in 3408) [ClassicSimilarity], result of:
          0.018586228 = score(doc=3408,freq=4.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.34630734 = fieldWeight in 3408, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=3408)
        0.018586228 = weight(_text_:und in 3408) [ClassicSimilarity], result of:
          0.018586228 = score(doc=3408,freq=4.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.34630734 = fieldWeight in 3408, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=3408)
        0.013349611 = weight(_text_:der in 3408) [ClassicSimilarity], result of:
          0.013349611 = score(doc=3408,freq=2.0), product of:
            0.054091092 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.024215192 = queryNorm
            0.2467987 = fieldWeight in 3408, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.078125 = fieldNorm(doc=3408)
        0.018586228 = weight(_text_:und in 3408) [ClassicSimilarity], result of:
          0.018586228 = score(doc=3408,freq=4.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.34630734 = fieldWeight in 3408, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=3408)
        0.037552148 = product of:
          0.075104296 = sum of:
            0.075104296 = weight(_text_:philosophie in 3408) [ClassicSimilarity], result of:
              0.075104296 = score(doc=3408,freq=2.0), product of:
                0.12829916 = queryWeight, product of:
                  5.298292 = idf(docFreq=600, maxDocs=44218)
                  0.024215192 = queryNorm
                0.58538413 = fieldWeight in 3408, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.298292 = idf(docFreq=600, maxDocs=44218)
                  0.078125 = fieldNorm(doc=3408)
          0.5 = coord(1/2)
        0.018586228 = weight(_text_:und in 3408) [ClassicSimilarity], result of:
          0.018586228 = score(doc=3408,freq=4.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.34630734 = fieldWeight in 3408, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=3408)
      0.3 = coord(6/20)
    
    Field
    Philosophie
    Imprint
    Düsseldorf : Universitäts- und Landesbibliothek
    Series
    Schriften der Universitäts- und Landesbibliothek Düsseldorf; 32
  9. Neumann, H.: Inszenierung und Metabotschaften eines periodisch getakteten Fernsehauftritts : Die Neujahrsansprachen der Bundeskanzler Helmut Kohl und Gerhard Schröder im Vergleich (2003) 0.04
    0.036234457 = product of:
      0.14493783 = sum of:
        0.030512001 = weight(_text_:und in 1632) [ClassicSimilarity], result of:
          0.030512001 = score(doc=1632,freq=22.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.56851393 = fieldWeight in 1632, product of:
              4.690416 = tf(freq=22.0), with freq of:
                22.0 = termFreq=22.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1632)
        0.030512001 = weight(_text_:und in 1632) [ClassicSimilarity], result of:
          0.030512001 = score(doc=1632,freq=22.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.56851393 = fieldWeight in 1632, product of:
              4.690416 = tf(freq=22.0), with freq of:
                22.0 = termFreq=22.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1632)
        0.022889815 = weight(_text_:der in 1632) [ClassicSimilarity], result of:
          0.022889815 = score(doc=1632,freq=12.0), product of:
            0.054091092 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.024215192 = queryNorm
            0.4231716 = fieldWeight in 1632, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1632)
        0.030512001 = weight(_text_:und in 1632) [ClassicSimilarity], result of:
          0.030512001 = score(doc=1632,freq=22.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.56851393 = fieldWeight in 1632, product of:
              4.690416 = tf(freq=22.0), with freq of:
                22.0 = termFreq=22.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1632)
        0.030512001 = weight(_text_:und in 1632) [ClassicSimilarity], result of:
          0.030512001 = score(doc=1632,freq=22.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.56851393 = fieldWeight in 1632, product of:
              4.690416 = tf(freq=22.0), with freq of:
                22.0 = termFreq=22.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1632)
      0.25 = coord(5/20)
    
    Abstract
    Das Herstellen der gleichen Wellenlänge zwischen Sender und Empfänger entscheidet über den kommunikativen Erfolg - gerade auch im politischen Bereich. Sowohl unter politikwissenschaftlicher als auch unter kommunikationswissenschaftlicher Fragestellung werden in der vorliegenden Arbeit acht Neujahrsansprachen von 1994 bis 2001 der Bundeskanzler Helmut Kohl und Gerhard Schröder einer systematischen Analyse unterzogen. Es findet eine Untersuchung der Sach- und Beziehungsebene statt. Verbale und visuelle Rhetorik beider Bundeskanzler werden miteinander verglichen und decodiert. Die Arbeit gibt zum einen Aufschluss über die Metabotschaften und das Corporate Design beider Bundeskanzler und diskutiert zum anderen Vor- und Nachteile der Kommunikationsstrategien zweier Kommunikationstypen, die unterschiedlicher nicht sein können.
    Source
    Information - Wissenschaft und Praxis. 54(2003) H.5, S.261-272
  10. Schneider, R.: Web 3.0 ante portas? : Integration von Social Web und Semantic Web (2008) 0.03
    0.034451026 = product of:
      0.114836745 = sum of:
        0.022534605 = weight(_text_:und in 4184) [ClassicSimilarity], result of:
          0.022534605 = score(doc=4184,freq=12.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.41987535 = fieldWeight in 4184, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0546875 = fieldNorm(doc=4184)
        0.022534605 = weight(_text_:und in 4184) [ClassicSimilarity], result of:
          0.022534605 = score(doc=4184,freq=12.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.41987535 = fieldWeight in 4184, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0546875 = fieldNorm(doc=4184)
        0.013215441 = weight(_text_:der in 4184) [ClassicSimilarity], result of:
          0.013215441 = score(doc=4184,freq=4.0), product of:
            0.054091092 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.024215192 = queryNorm
            0.24431825 = fieldWeight in 4184, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.0546875 = fieldNorm(doc=4184)
        0.022534605 = weight(_text_:und in 4184) [ClassicSimilarity], result of:
          0.022534605 = score(doc=4184,freq=12.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.41987535 = fieldWeight in 4184, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0546875 = fieldNorm(doc=4184)
        0.022534605 = weight(_text_:und in 4184) [ClassicSimilarity], result of:
          0.022534605 = score(doc=4184,freq=12.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.41987535 = fieldWeight in 4184, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0546875 = fieldNorm(doc=4184)
        0.011482884 = product of:
          0.022965768 = sum of:
            0.022965768 = weight(_text_:22 in 4184) [ClassicSimilarity], result of:
              0.022965768 = score(doc=4184,freq=2.0), product of:
                0.08479747 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.024215192 = queryNorm
                0.2708308 = fieldWeight in 4184, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=4184)
          0.5 = coord(1/2)
      0.3 = coord(6/20)
    
    Abstract
    Das Medium Internet ist im Wandel, und mit ihm ändern sich seine Publikations- und Rezeptionsbedingungen. Welche Chancen bieten die momentan parallel diskutierten Zukunftsentwürfe von Social Web und Semantic Web? Zur Beantwortung dieser Frage beschäftigt sich der Beitrag mit den Grundlagen beider Modelle unter den Aspekten Anwendungsbezug und Technologie, beleuchtet darüber hinaus jedoch auch deren Unzulänglichkeiten sowie den Mehrwert einer mediengerechten Kombination. Am Beispiel des grammatischen Online-Informationssystems grammis wird eine Strategie zur integrativen Nutzung der jeweiligen Stärken skizziert.
    Date
    22. 1.2011 10:38:28
    Source
    Kommunikation, Partizipation und Wirkungen im Social Web, Band 1. Hrsg.: A. Zerfaß u.a.
  11. Stock, M.: Textwortmethode und Übersetzungsrelation : Eine Methode zum Aufbau von kombinierten Literaturnachweis- und Terminologiedatenbanken (1989) 0.03
    0.03272481 = product of:
      0.13089924 = sum of:
        0.029387407 = weight(_text_:und in 3412) [ClassicSimilarity], result of:
          0.029387407 = score(doc=3412,freq=10.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.54756 = fieldWeight in 3412, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=3412)
        0.029387407 = weight(_text_:und in 3412) [ClassicSimilarity], result of:
          0.029387407 = score(doc=3412,freq=10.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.54756 = fieldWeight in 3412, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=3412)
        0.013349611 = weight(_text_:der in 3412) [ClassicSimilarity], result of:
          0.013349611 = score(doc=3412,freq=2.0), product of:
            0.054091092 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.024215192 = queryNorm
            0.2467987 = fieldWeight in 3412, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.078125 = fieldNorm(doc=3412)
        0.029387407 = weight(_text_:und in 3412) [ClassicSimilarity], result of:
          0.029387407 = score(doc=3412,freq=10.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.54756 = fieldWeight in 3412, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=3412)
        0.029387407 = weight(_text_:und in 3412) [ClassicSimilarity], result of:
          0.029387407 = score(doc=3412,freq=10.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.54756 = fieldWeight in 3412, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=3412)
      0.25 = coord(5/20)
    
    Abstract
    Geisteswissenschaftliche Fachinformation erfordert eine enge Kooperation zwischen Literaturnachweis- und Terminologieinformationssystemen. Eine geeignete Dokumentationsmethode für die Auswertung geisteswissenschaftlicher Literatur ist die Textwortmethode. Dem originalsprachig aufgenommenen Begriffsrepertoire ist ein einheitssprachiger Zugriff beizuordnen, der einerseits ein vollständiges und genaues Retrieval garantiert und andererseits den Aufbau fachspezifischer Wörterbücher vorantreibt
  12. Schmitz, K.-D.: Projektforschung und Infrastrukturen im Bereich der Terminologie : Wie kann die Wirtschaft davon profitieren? (2000) 0.03
    0.03248542 = product of:
      0.12994169 = sum of:
        0.026153143 = weight(_text_:und in 5568) [ClassicSimilarity], result of:
          0.026153143 = score(doc=5568,freq=22.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.48729765 = fieldWeight in 5568, product of:
              4.690416 = tf(freq=22.0), with freq of:
                22.0 = termFreq=22.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.046875 = fieldNorm(doc=5568)
        0.026153143 = weight(_text_:und in 5568) [ClassicSimilarity], result of:
          0.026153143 = score(doc=5568,freq=22.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.48729765 = fieldWeight in 5568, product of:
              4.690416 = tf(freq=22.0), with freq of:
                22.0 = termFreq=22.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.046875 = fieldNorm(doc=5568)
        0.025329107 = weight(_text_:der in 5568) [ClassicSimilarity], result of:
          0.025329107 = score(doc=5568,freq=20.0), product of:
            0.054091092 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.024215192 = queryNorm
            0.46826762 = fieldWeight in 5568, product of:
              4.472136 = tf(freq=20.0), with freq of:
                20.0 = termFreq=20.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.046875 = fieldNorm(doc=5568)
        0.026153143 = weight(_text_:und in 5568) [ClassicSimilarity], result of:
          0.026153143 = score(doc=5568,freq=22.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.48729765 = fieldWeight in 5568, product of:
              4.690416 = tf(freq=22.0), with freq of:
                22.0 = termFreq=22.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.046875 = fieldNorm(doc=5568)
        0.026153143 = weight(_text_:und in 5568) [ClassicSimilarity], result of:
          0.026153143 = score(doc=5568,freq=22.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.48729765 = fieldWeight in 5568, product of:
              4.690416 = tf(freq=22.0), with freq of:
                22.0 = termFreq=22.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.046875 = fieldNorm(doc=5568)
      0.25 = coord(5/20)
    
    Abstract
    In der heutigen Informationsgesellschaft bieten sich der Industrie neue Perspektiven für Kommunikation und Handel auf dem europäischen und internationalen Markt; beide Märkte sind von einer großen sprachlichen, kulturellen und gesellschaftlichen Vielfalt geprägt. Um Nutzen aus diesen neuen Möglichkeiten zu ziehen und um weiterhin konkurrenzfähig zu bleiben, muß die Industrie spezifische und adäquate Lösungen zur Überwindung der Sprachbarrieren finden. Voraussetzung hierfür ist die genaue Definition, systematische Ordnung und exakte Benennung der Begriffe innerhalb der jeweiligen Fachgebiete, in der eigenen Sprache ebenso wie in den Fremdsprachen. Genau dies sind die Themenbereiche, mit denen sich die Terminologiewissenschaft und die praktische Terminologiearbeit beschäftigen. Die Ergebnisse der Terminologiearbeit im Unternehmen beeinflussen Konstruktion, Produktion, Einkauf, Marketing und Verkauf, Vertragswesen, technische Dokumentation und Übersetzung
    Source
    Sprachtechnologie für eine dynamische Wirtschaft im Medienzeitalter - Language technologies for dynamic business in the age of the media - L'ingénierie linguistique au service de la dynamisation économique à l'ère du multimédia: Tagungsakten der XXVI. Jahrestagung der Internationalen Vereinigung Sprache und Wirtschaft e.V., 23.-25.11.2000, Fachhochschule Köln. Hrsg.: K.-D. Schmitz
  13. Rahmstorf, G.: Rückkehr von Ordnung in die Informationstechnik? (2000) 0.03
    0.03216047 = product of:
      0.12864187 = sum of:
        0.026153143 = weight(_text_:und in 5504) [ClassicSimilarity], result of:
          0.026153143 = score(doc=5504,freq=22.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.48729765 = fieldWeight in 5504, product of:
              4.690416 = tf(freq=22.0), with freq of:
                22.0 = termFreq=22.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.046875 = fieldNorm(doc=5504)
        0.026153143 = weight(_text_:und in 5504) [ClassicSimilarity], result of:
          0.026153143 = score(doc=5504,freq=22.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.48729765 = fieldWeight in 5504, product of:
              4.690416 = tf(freq=22.0), with freq of:
                22.0 = termFreq=22.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.046875 = fieldNorm(doc=5504)
        0.0240293 = weight(_text_:der in 5504) [ClassicSimilarity], result of:
          0.0240293 = score(doc=5504,freq=18.0), product of:
            0.054091092 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.024215192 = queryNorm
            0.44423765 = fieldWeight in 5504, product of:
              4.2426405 = tf(freq=18.0), with freq of:
                18.0 = termFreq=18.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.046875 = fieldNorm(doc=5504)
        0.026153143 = weight(_text_:und in 5504) [ClassicSimilarity], result of:
          0.026153143 = score(doc=5504,freq=22.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.48729765 = fieldWeight in 5504, product of:
              4.690416 = tf(freq=22.0), with freq of:
                22.0 = termFreq=22.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.046875 = fieldNorm(doc=5504)
        0.026153143 = weight(_text_:und in 5504) [ClassicSimilarity], result of:
          0.026153143 = score(doc=5504,freq=22.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.48729765 = fieldWeight in 5504, product of:
              4.690416 = tf(freq=22.0), with freq of:
                22.0 = termFreq=22.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.046875 = fieldNorm(doc=5504)
      0.25 = coord(5/20)
    
    Abstract
    Im Zuge der aktuellen Informationstechnik, der weltweiten Kommunikation und des elektronischen Publizierens scheinen die herkömmlichen Instrumente der Ordnungsstiftung - bibliothekarische Klassifikationssysteme und Thesauren - an den Rand gedrängt zu werden oder sogar ganz zu verschwinden. Andererseits sind die Endbenutzer oft unzufrieden mit dem Ergebnis des Recherchierens im Bestand des unabsehbar wachsenden Informationsangebotes. Ist eine präzise und vollständige Recherche bei den gegebenen technischen und ökonomischen Verhältnissen überhaupt noch realisierbar?
    Series
    Gemeinsamer Kongress der Bundesvereinigung Deutscher Bibliotheksverbände e.V. (BDB) und der Deutschen Gesellschaft für Informationswissenschaft und Informationspraxis e.V. (DGI); Bd.1 (Tagungen der Deutschen Gesellschaft für Informationswissenschaft und Informationspraxis e.V.; Bd.3)
    Source
    Information und Öffentlichkeit: 1. Gemeinsamer Kongress der Bundesvereinigung Deutscher Bibliotheksverbände e.V. (BDB) und der Deutschen Gesellschaft für Informationswissenschaft und Informationspraxis e.V. (DGI), Leipzig, 20.-23.3.2000. Zugleich 90. Deutscher Bibliothekartag, 52. Jahrestagung der Deutschen Gesellschaft für Informationswissenschaft und Informationspraxis e.V. (DGI). Hrsg.: G. Ruppelt u. H. Neißer
  14. Schneider, R.: Question answering : das Retrieval der Zukunft? (2007) 0.03
    0.03172396 = product of:
      0.12689584 = sum of:
        0.025753833 = weight(_text_:und in 5953) [ClassicSimilarity], result of:
          0.025753833 = score(doc=5953,freq=12.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.47985753 = fieldWeight in 5953, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=5953)
        0.025753833 = weight(_text_:und in 5953) [ClassicSimilarity], result of:
          0.025753833 = score(doc=5953,freq=12.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.47985753 = fieldWeight in 5953, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=5953)
        0.02388051 = weight(_text_:der in 5953) [ClassicSimilarity], result of:
          0.02388051 = score(doc=5953,freq=10.0), product of:
            0.054091092 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.024215192 = queryNorm
            0.44148692 = fieldWeight in 5953, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.0625 = fieldNorm(doc=5953)
        0.025753833 = weight(_text_:und in 5953) [ClassicSimilarity], result of:
          0.025753833 = score(doc=5953,freq=12.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.47985753 = fieldWeight in 5953, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=5953)
        0.025753833 = weight(_text_:und in 5953) [ClassicSimilarity], result of:
          0.025753833 = score(doc=5953,freq=12.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.47985753 = fieldWeight in 5953, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=5953)
      0.25 = coord(5/20)
    
    Abstract
    Der Artikel geht der Frage nach, ob und inwieweit Informations- und Recherchesysteme von der Technologie natürlichsprachlicher Frage-Antwortsysteme, so genannter Question Answering-Systeme, profitieren können. Nach einer allgemeinen Einführung in die Zielsetzung und die historische Entwicklung dieses Sonderzweigs der maschinellen Sprachverarbeitung werden dessen Abgrenzung von herkömmlichen Retrieval- und Extraktionsverfahren erläutert und die besondere Struktur von Question Answering-Systemen sowie einzelne Evaluierungsinitiativen aufgezeichnet. Zudem werden konkrete Anwendungsfelder im Bibliothekswesen vorgestellt.
    Source
    Zeitschrift für Bibliothekswesen und Bibliographie. 54(2007) H.1, S.3-11
  15. Kunze, C.: Lexikalisch-semantische Wortnetze in Sprachwissenschaft und Sprachtechnologie (2006) 0.03
    0.031593163 = product of:
      0.12637265 = sum of:
        0.02781732 = weight(_text_:und in 6023) [ClassicSimilarity], result of:
          0.02781732 = score(doc=6023,freq=14.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.51830536 = fieldWeight in 6023, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=6023)
        0.02781732 = weight(_text_:und in 6023) [ClassicSimilarity], result of:
          0.02781732 = score(doc=6023,freq=14.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.51830536 = fieldWeight in 6023, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=6023)
        0.015103361 = weight(_text_:der in 6023) [ClassicSimilarity], result of:
          0.015103361 = score(doc=6023,freq=4.0), product of:
            0.054091092 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.024215192 = queryNorm
            0.27922085 = fieldWeight in 6023, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.0625 = fieldNorm(doc=6023)
        0.02781732 = weight(_text_:und in 6023) [ClassicSimilarity], result of:
          0.02781732 = score(doc=6023,freq=14.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.51830536 = fieldWeight in 6023, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=6023)
        0.02781732 = weight(_text_:und in 6023) [ClassicSimilarity], result of:
          0.02781732 = score(doc=6023,freq=14.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.51830536 = fieldWeight in 6023, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=6023)
      0.25 = coord(5/20)
    
    Abstract
    Dieser Beitrag beschreibt die Strukturierungsprinzipien und Anwendungskontexte lexikalisch-semantischer Wortnetze, insbesondere des deutschen Wortnetzes GermaNet. Wortnetze sind zurzeit besonders populäre elektronische Lexikonressourcen, die große Abdeckungen semantisch strukturierter Daten für verschiedene Sprachen und Sprachverbünde enthalten. In Wortnetzen sind die häufigsten und wichtigsten Konzepte einer Sprache mit ihren elementaren Bedeutungsrelationen repräsentiert. Zentrale Anwendungen für Wortnetze sind u.a. die Lesartendisambiguierung und die Informationserschließung. Der Artikel skizziert die neusten Szenarien, in denen GermaNet eingesetzt wird: die semantische Informationserschließung und die Integration allgemeinsprachlicher Wortnetze mit terminologischen Ressourcen vor dem Hintergrund der Datenkonvertierung in OWL.
    Source
    Information - Wissenschaft und Praxis. 57(2006) H.6/7, S.309-314
  16. Heid, U.: Computerlinguistik zwischen Informationswissenschaft und multilingualer Kommunikation (2010) 0.03
    0.031593163 = product of:
      0.12637265 = sum of:
        0.02781732 = weight(_text_:und in 4018) [ClassicSimilarity], result of:
          0.02781732 = score(doc=4018,freq=14.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.51830536 = fieldWeight in 4018, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=4018)
        0.02781732 = weight(_text_:und in 4018) [ClassicSimilarity], result of:
          0.02781732 = score(doc=4018,freq=14.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.51830536 = fieldWeight in 4018, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=4018)
        0.015103361 = weight(_text_:der in 4018) [ClassicSimilarity], result of:
          0.015103361 = score(doc=4018,freq=4.0), product of:
            0.054091092 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.024215192 = queryNorm
            0.27922085 = fieldWeight in 4018, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.0625 = fieldNorm(doc=4018)
        0.02781732 = weight(_text_:und in 4018) [ClassicSimilarity], result of:
          0.02781732 = score(doc=4018,freq=14.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.51830536 = fieldWeight in 4018, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=4018)
        0.02781732 = weight(_text_:und in 4018) [ClassicSimilarity], result of:
          0.02781732 = score(doc=4018,freq=14.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.51830536 = fieldWeight in 4018, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=4018)
      0.25 = coord(5/20)
    
    Abstract
    This article addresses the possibilities for interaction between information science and computational linguistics. Relevant aspects of computational linguistics research are presented, and their potential for interaction with information science questions and products is explained. The third part uses the specific situation in Hildesheim to discuss proposals for such an interaction, within the triangle of information science, computational linguistics, and multilingual communication.
    Source
    Information - Wissenschaft und Praxis. 61(2010) H.6/7, S.361-366
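    The relevance figures attached to every record on this page come from Lucene's ClassicSimilarity explain output, and the tree above spells the arithmetic out: fieldWeight = sqrt(tf) * idf * fieldNorm, queryWeight = idf * queryNorm, each term score is queryWeight * fieldWeight, and coord() scales the summed clause scores by the fraction of matching query clauses. The short Python check below reproduces the numbers reported for doc 4018; every constant is copied from the listing, only the arithmetic is added.

      import math

      # Values printed in the explain tree for weight(_text_:und in 4018)
      tf_raw    = 14.0          # termFreq of "und" in the field
      idf       = 2.216367      # idf(docFreq=13101, maxDocs=44218)
      querynorm = 0.024215192
      fieldnorm = 0.0625

      query_weight = idf * querynorm                        # 0.05366975  (queryWeight)
      field_weight = math.sqrt(tf_raw) * idf * fieldnorm    # 0.51830536  (fieldWeight)
      term_score   = query_weight * field_weight            # 0.02781732

      # Four identical "und" clauses plus the "der" clause are summed, then coord(5/20)
      # scales the sum; this reproduces the listed 0.031593163 up to rounding of the
      # constants printed in the explain tree.
      summed = 4 * term_score + 0.015103361
      total  = summed * (5 / 20)
      print(round(term_score, 8), round(total, 9))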
  17. Schürmann, H.: Software scannt Radio- und Fernsehsendungen : Recherche in Nachrichtenarchiven erleichtert (2001) 0.03
    0.030836739 = product of:
      0.10278913 = sum of:
        0.020050311 = weight(_text_:und in 5759) [ClassicSimilarity], result of:
          0.020050311 = score(doc=5759,freq=38.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.3735868 = fieldWeight in 5759, product of:
              6.164414 = tf(freq=38.0), with freq of:
                38.0 = termFreq=38.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02734375 = fieldNorm(doc=5759)
        0.020050311 = weight(_text_:und in 5759) [ClassicSimilarity], result of:
          0.020050311 = score(doc=5759,freq=38.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.3735868 = fieldWeight in 5759, product of:
              6.164414 = tf(freq=38.0), with freq of:
                38.0 = termFreq=38.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02734375 = fieldNorm(doc=5759)
        0.016846446 = weight(_text_:der in 5759) [ClassicSimilarity], result of:
          0.016846446 = score(doc=5759,freq=26.0), product of:
            0.054091092 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.024215192 = queryNorm
            0.31144586 = fieldWeight in 5759, product of:
              5.0990195 = tf(freq=26.0), with freq of:
                26.0 = termFreq=26.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02734375 = fieldNorm(doc=5759)
        0.020050311 = weight(_text_:und in 5759) [ClassicSimilarity], result of:
          0.020050311 = score(doc=5759,freq=38.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.3735868 = fieldWeight in 5759, product of:
              6.164414 = tf(freq=38.0), with freq of:
                38.0 = termFreq=38.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02734375 = fieldNorm(doc=5759)
        0.020050311 = weight(_text_:und in 5759) [ClassicSimilarity], result of:
          0.020050311 = score(doc=5759,freq=38.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.3735868 = fieldWeight in 5759, product of:
              6.164414 = tf(freq=38.0), with freq of:
                38.0 = termFreq=38.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02734375 = fieldNorm(doc=5759)
        0.005741442 = product of:
          0.011482884 = sum of:
            0.011482884 = weight(_text_:22 in 5759) [ClassicSimilarity], result of:
              0.011482884 = score(doc=5759,freq=2.0), product of:
                0.08479747 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.024215192 = queryNorm
                0.1354154 = fieldWeight in 5759, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.02734375 = fieldNorm(doc=5759)
          0.5 = coord(1/2)
      0.3 = coord(6/20)
    
    Abstract
    Computers must learn to understand human language. Researchers at the University of Duisburg have developed a method that lets a computer extract information from radio broadcasts.
    Content
    To make media monitoring easier for companies and agencies, researchers at the University of Duisburg are currently developing a system for automatic topic detection in radio and television. The so-called Alert system is intended to help users extract and further process the spoken information from news broadcasts that is relevant to them. Because the analysis is carried out automatically by the computer, several channels can be monitored around the clock. Until now, information has been gathered from TV and radio broadcasts in the traditional way: a person watches, listens, reads and evaluates. That is enormously time-consuming and, for a company that wants, for example, to monitor its competitors or document its media presence, also very expensive. The Duisburg researchers reasoned that this work could be automated with a speech recognizer. They are now working with partners from Germany, France and Portugal in a Europe-wide project to develop such a technology (http://alert.uni-duisburg.de). Two media monitoring companies are also involved in the project, Oberserver Argus Media GmbH from Baden-Baden and the French company Secodip. "Our work would already be easier if the information that appears about our clients in the media were pre-selected," says Simone Holderbach, head of product development at Oberserver, describing her interest in the technology. And how does Alert work? The speech recognition system is tuned to monitor news broadcasts on radio and television: everything that is said - whether by the newsreader, a reporter or an interviewee - is converted into text by automatic speech recognition. Topics and keywords are detected and stored, and these are compared with the user's search terms. Matches are displayed and reported to the user automatically. Conventional speech recognition technology cannot be used for media monitoring, stresses Prof. Gerhard Rigoll, head of the Technical Computer Science group at the University of Duisburg, because it was developed for a different purpose. To convert speech into text, the Alert software has been trained thoroughly: around 350 million words from newspaper texts, audio and video material have been processed so far. The system works in three languages. Still, Rigoll concedes, the automatically generated text is not entirely error-free. "At present the recognition rate lies between 40 and 70 percent, and that is not going to change in the foreseeable future." Music overlays or strong background noise in reports lead to inaccuracies in the text conversion. The Duisburg researchers have therefore developed methods that go beyond conventional keyword search and allow a content-oriented assignment. "As a result, the user also receives news items that fit the topic even though the keyword itself never appears," says Rigoll, summing up the advantage of the technology. If, for example, "oil price" is entered as a search term, news items in which oil companies and energy agencies play a role are also displayed. Rigoll: "The Alert system reads between the lines, so to speak." The research project was started a year ago and runs until mid-2002. Anyone who wants to learn about the state of the technology can do so this week at the industrial trade fair in Hannover: the Alert system is being presented at the joint stand "Forschungsland NRW" in hall 18, stand M12.
    Source
    Handelsblatt. Nr.79 vom 24.4.2001, S.22
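    The Content field above lays out the Alert pipeline: automatic speech recognition turns everything said in a broadcast into text, topics and keywords are detected and stored, these are compared with each user's search terms, and matches are reported automatically; the content-oriented assignment is meant to catch items where the literal keyword never occurs. A small hypothetical Python sketch of that matching step follows; the transcript, the profile terms and the related-term table are invented for illustration and are not taken from the Alert system.

      # Hypothetical sketch of Alert-style profile matching on a recognized transcript.
      # The RELATED table stands in for the project's content-oriented assignment.
      RELATED = {
          "ölpreis": {"ölkonzern", "energieagentur", "rohöl"},   # invented expansion
      }

      def matches(transcript: str, profile_terms: list[str]) -> list[str]:
          """Return the profile terms that hit the transcript, literally or via expansion."""
          tokens = {t.strip(".,!?\"'").lower() for t in transcript.split()}
          hits = []
          for term in profile_terms:
              expanded = {term.lower()} | RELATED.get(term.lower(), set())
              if tokens & expanded:
                  hits.append(term)
          return hits

      transcript = "Die Energieagentur rechnet mit steigenden Kosten für Rohöl."
      print(matches(transcript, ["Ölpreis", "Zinsen"]))   # -> ['Ölpreis']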
  18. Minsky, M.L.: Materie, Geist, Modell (1977) 0.03
    0.030693084 = product of:
      0.122772336 = sum of:
        0.02602072 = weight(_text_:und in 5547) [ClassicSimilarity], result of:
          0.02602072 = score(doc=5547,freq=4.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.4848303 = fieldWeight in 5547, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.109375 = fieldNorm(doc=5547)
        0.02602072 = weight(_text_:und in 5547) [ClassicSimilarity], result of:
          0.02602072 = score(doc=5547,freq=4.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.4848303 = fieldWeight in 5547, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.109375 = fieldNorm(doc=5547)
        0.018689455 = weight(_text_:der in 5547) [ClassicSimilarity], result of:
          0.018689455 = score(doc=5547,freq=2.0), product of:
            0.054091092 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.024215192 = queryNorm
            0.34551817 = fieldWeight in 5547, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.109375 = fieldNorm(doc=5547)
        0.02602072 = weight(_text_:und in 5547) [ClassicSimilarity], result of:
          0.02602072 = score(doc=5547,freq=4.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.4848303 = fieldWeight in 5547, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.109375 = fieldNorm(doc=5547)
        0.02602072 = weight(_text_:und in 5547) [ClassicSimilarity], result of:
          0.02602072 = score(doc=5547,freq=4.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.4848303 = fieldWeight in 5547, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.109375 = fieldNorm(doc=5547)
      0.25 = coord(5/20)
    
    Series
    Grundlagen der Kommunikation und Kognition
    Source
    Semantik und künstliche Intelligenz: Beiträge zur automatischen Sprachbearbeitung II. Hrsg. u. eingeleitet von P. Eisenberg
  19. Winograd, T.: ¬Ein prozedurales Modell des Sprachverstehens (1977) 0.03
    0.030693084 = product of:
      0.122772336 = sum of:
        0.02602072 = weight(_text_:und in 4707) [ClassicSimilarity], result of:
          0.02602072 = score(doc=4707,freq=4.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.4848303 = fieldWeight in 4707, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.109375 = fieldNorm(doc=4707)
        0.02602072 = weight(_text_:und in 4707) [ClassicSimilarity], result of:
          0.02602072 = score(doc=4707,freq=4.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.4848303 = fieldWeight in 4707, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.109375 = fieldNorm(doc=4707)
        0.018689455 = weight(_text_:der in 4707) [ClassicSimilarity], result of:
          0.018689455 = score(doc=4707,freq=2.0), product of:
            0.054091092 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.024215192 = queryNorm
            0.34551817 = fieldWeight in 4707, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.109375 = fieldNorm(doc=4707)
        0.02602072 = weight(_text_:und in 4707) [ClassicSimilarity], result of:
          0.02602072 = score(doc=4707,freq=4.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.4848303 = fieldWeight in 4707, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.109375 = fieldNorm(doc=4707)
        0.02602072 = weight(_text_:und in 4707) [ClassicSimilarity], result of:
          0.02602072 = score(doc=4707,freq=4.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.4848303 = fieldWeight in 4707, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.109375 = fieldNorm(doc=4707)
      0.25 = coord(5/20)
    
    Series
    Grundlagen der Kommunikation und Kognition
    Source
    Semantik und künstliche Intelligenz: Beiträge zur automatischen Sprachbearbeitung II. Hrsg. u. eingeleitet von P. Eisenberg
  20. Budin, G.: Zum Entwicklungsstand der Terminologiewissenschaft (2019) 0.03
    0.03052111 = product of:
      0.12208444 = sum of:
        0.024340155 = weight(_text_:und in 5604) [ClassicSimilarity], result of:
          0.024340155 = score(doc=5604,freq=14.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.4535172 = fieldWeight in 5604, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5604)
        0.024340155 = weight(_text_:und in 5604) [ClassicSimilarity], result of:
          0.024340155 = score(doc=5604,freq=14.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.4535172 = fieldWeight in 5604, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5604)
        0.024723826 = weight(_text_:der in 5604) [ClassicSimilarity], result of:
          0.024723826 = score(doc=5604,freq=14.0), product of:
            0.054091092 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.024215192 = queryNorm
            0.4570776 = fieldWeight in 5604, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5604)
        0.024340155 = weight(_text_:und in 5604) [ClassicSimilarity], result of:
          0.024340155 = score(doc=5604,freq=14.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.4535172 = fieldWeight in 5604, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5604)
        0.024340155 = weight(_text_:und in 5604) [ClassicSimilarity], result of:
          0.024340155 = score(doc=5604,freq=14.0), product of:
            0.05366975 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.024215192 = queryNorm
            0.4535172 = fieldWeight in 5604, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5604)
      0.25 = coord(5/20)
    
    Abstract
    This essay analyses the state of development of terminology science on three levels: (1) with respect to the research questions that are posed in research projects, university theses and other research contexts and answered on the basis of empirical analyses; building on this, (2) with respect to the methods used, the theories underlying such work, and the paradigms within which those theories and methods can be situated; and (3) with respect to the overarching level of terminology science as a discipline. On all three levels it can be observed that the interdisciplinary and multi-perspective character of terminology science has increased over recent decades and continues to grow.
    Series
    Kommunikation und Medienmanagement - Springer eBooks. Computer Science and Engineering

Languages

  • d 159
  • e 40
  • ru 1

Types

  • el 23
  • p 1