Search (59 results, page 1 of 3)

  • Filter: theme_ss:"Computerlinguistik"
  1. Hotho, A.; Bloehdorn, S.: Data Mining 2004 : Text classification by boosting weak learners based on terms and concepts (2004) 0.27
    0.2748816 = product of:
      0.5497632 = sum of:
        0.053604506 = product of:
          0.16081351 = sum of:
            0.16081351 = weight(_text_:3a in 562) [ClassicSimilarity], result of:
              0.16081351 = score(doc=562,freq=2.0), product of:
                0.28613585 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.03375035 = queryNorm
                0.56201804 = fieldWeight in 562, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.046875 = fieldNorm(doc=562)
          0.33333334 = coord(1/3)
        0.16081351 = weight(_text_:2f in 562) [ClassicSimilarity], result of:
          0.16081351 = score(doc=562,freq=2.0), product of:
            0.28613585 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.03375035 = queryNorm
            0.56201804 = fieldWeight in 562, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=562)
        0.16081351 = weight(_text_:2f in 562) [ClassicSimilarity], result of:
          0.16081351 = score(doc=562,freq=2.0), product of:
            0.28613585 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.03375035 = queryNorm
            0.56201804 = fieldWeight in 562, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=562)
        0.16081351 = weight(_text_:2f in 562) [ClassicSimilarity], result of:
          0.16081351 = score(doc=562,freq=2.0), product of:
            0.28613585 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.03375035 = queryNorm
            0.56201804 = fieldWeight in 562, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=562)
        0.01371812 = product of:
          0.02743624 = sum of:
            0.02743624 = weight(_text_:22 in 562) [ClassicSimilarity], result of:
              0.02743624 = score(doc=562,freq=2.0), product of:
                0.11818798 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03375035 = queryNorm
                0.23214069 = fieldWeight in 562, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=562)
          0.5 = coord(1/2)
      0.5 = coord(5/10)
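    For reference, the ClassicSimilarity breakdown above can be reproduced with a few lines of arithmetic. A minimal sketch (all constants are copied verbatim from the explain tree for doc 562; tf = sqrt(freq) and the coord(5/10) factor follow Lucene's classic TF-IDF scoring):

    ```python
    import math

    # Constants copied from the explain tree for doc 562, term "_text_:3a".
    FREQ = 2.0              # termFreq of the term in the field
    IDF = 8.478011          # idf(docFreq=24, maxDocs=44218)
    QUERY_NORM = 0.03375035
    FIELD_NORM = 0.046875   # fieldNorm(doc=562)

    tf = math.sqrt(FREQ)                      # ClassicSimilarity: tf = sqrt(freq)
    query_weight = IDF * QUERY_NORM           # 0.28613585 in the tree
    field_weight = tf * IDF * FIELD_NORM      # 0.56201804 in the tree
    term_score = query_weight * field_weight  # 0.16081351 = weight(_text_:3a in 562)

    # Top-level score: the five matching clause scores are summed, then scaled
    # by coord(5/10) because only 5 of the 10 query clauses matched doc 562.
    clause_sum = 0.053604506 + 3 * 0.16081351 + 0.01371812  # = 0.5497632
    total = clause_sum * 0.5                                # = 0.2748816

    print(f"{term_score:.8f} {total:.8f}")
    ```

    The small deviations in the last decimal place against the tree come from Lucene computing in single precision.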
    
    Content
    Cf.: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.91.4940&rep=rep1&type=pdf
    Date
    8. 1.2013 10:22:32
  2. Noever, D.; Ciolino, M.: The Turing deception (2022) 0.21
    
    Source
    https://arxiv.org/abs/2212.06721
  3. Huo, W.: Automatic multi-word term extraction and its application to Web-page summarization (2012) 0.20
    
    Content
    A thesis presented to The University of Guelph in partial fulfilment of the requirements for the degree of Master of Science in Computer Science. Cf.: http://www.inf.ufrgs.br/~ceramisch/download_files/publications/2009/p01.pdf
    Date
    10. 1.2013 19:22:47
  4. Gombocz, W.L.: Stichwort oder Schlagwort versus Textwort : Grazer und Düsseldorfer Philosophie-Dokumentation und -Information nach bzw. gemäß Norbert Henrichs (2000) 0.06
    
    Field
    Philosophie
  5. Stock, W.G.: Textwortmethode : Norbert Henrichs zum 65. (3) (2000) 0.05
    
    Abstract
    Few documentation methods are associated with the names of their developers. Exceptions are Melvil Dewey (DDC), S.R. Ranganathan (Colon Classification) - and Norbert Henrichs. His text-word method (Textwortmethode) enables the indexing and retrieval of literature from fields that lack a universally accepted terminology, i.e. many of the social sciences and humanities, philosophy first among them. Henrichs designed the text-word method in the late 1960s for use in electronic philosophy documentation. He is thus not only one of the pioneers of applying electronic data processing in information practice, but also the pioneer of documenting terminologically non-rigid specialist languages.
  6. Stock, W.G.: Textwortmethode (2000) 0.04
    
    Field
    Philosophie
  7. Karlova-Bourbonus, N.: Automatic detection of contradictions in texts (2018) 0.01
    
    Content
    Inaugural dissertation for the doctoral degree in philosophy at Department 05 - Language, Literature, Culture of Justus-Liebig-Universität Gießen. Cf.: https://core.ac.uk/download/pdf/196294796.pdf
  8. Rahmstorf, G.: Wortmodell und Begriffssprache als Basis des semantischen Retrievals (2000) 0.01
    
    Abstract
    Today's retrieval technology is contrasted with the project of a semantically based search system. The proposed system is intended to work more precisely and completely and to support systematic relationships between topics. This approach requires a comprehensive dictionary with a simple conceptual representation of word meanings. The word model maps word, word features, lemma, word meanings (readings), reading features and concepts. Concepts are formal expressions of a concept language. Corresponding to this differentiation, lemma indexing, reading indexing and concept indexing are distinguished. Concepts are graphically constructed and recorded with the program Concepto.
  9. Schmitz, K.-D.: Projektforschung und Infrastrukturen im Bereich der Terminologie : Wie kann die Wirtschaft davon profitieren? (2000) 0.01
    
    Abstract
    In today's information society, industry is offered new prospects for communication and trade on the European and international markets; both markets are characterized by great linguistic, cultural and social diversity. To profit from these new opportunities and to remain competitive, industry must find specific and adequate solutions for overcoming language barriers. A prerequisite for this is the precise definition, systematic ordering and exact naming of the concepts within the respective subject fields, in one's own language as well as in foreign languages. These are precisely the topics with which terminology science and practical terminology work are concerned. The results of terminology work in a company influence design, production, purchasing, marketing and sales, contract management, technical documentation and translation.
  10. Helbig, H.: Wissensverarbeitung und die Semantik der natürlichen Sprache : Wissensrepräsentation mit MultiNet (2008) 0.01
    
    Abstract
    The book gives a comprehensive presentation of a methodology for interpreting and representing the meaning of natural-language expressions. This methodology of "Multilayered Extended Semantic Networks", the so-called MultiNet paradigm, is suited both for theoretical investigations and for the automatic processing of natural language on a computer. The first part of the two-part book treats fundamental problems of the semantic representation of knowledge and of the semantic interpretation of natural-language phenomena. The second part contains a systematic compilation of the entire repertoire of representational means, each described according to a uniform scheme; it serves as a compendium of the formal descriptive means of MultiNet used in the book. The results presented are embedded in a system of software tools that support the practical use of the MultiNet representational means as a formalism for meaning representation in automatic language processing. These include a workbench for the knowledge engineer, a translation system for automatically deriving meaning representations of natural-language sentences, and a workbench for the computer lexicographer. The content of the book rests on decades of research in automatic language processing and has repeatedly been used in university teaching in lectures on artificial intelligence and knowledge processing at TU Dresden and FernUniversität Hagen. As prerequisites the reader needs only the basics of traditional grammar and elementary knowledge of predicate logic.
  11. Warner, A.J.: Natural language processing (1987) 0.00
    
    Source
    Annual review of information science and technology. 22(1987), S.79-108
  12. McMahon, J.G.; Smith, F.J.: Improved statistical language model performance with automatic generated word hierarchies (1996) 0.00
    
    Source
    Computational linguistics. 22(1996) no.2, S.217-248
  13. Ruge, G.: A spreading activation network for automatic generation of thesaurus relationships (1991) 0.00
    
    Date
    8.10.2000 11:52:22
  14. Somers, H.: Example-based machine translation : Review article (1999) 0.00
    
    Date
    31. 7.1996 9:22:19
  15. New tools for human translators (1997) 0.00
    
    Date
    31. 7.1996 9:22:19
  16. Baayen, R.H.; Lieber, H.: Word frequency distributions and lexical semantics (1997) 0.00
    
    Date
    28. 2.1999 10:48:22
  17. Der Student aus dem Computer (2023) 0.00
    
    Date
    27. 1.2023 16:22:55
  18. Byrne, C.C.; McCracken, S.A.: ¬An adaptive thesaurus employing semantic distance, relational inheritance and nominal compound interpretation for linguistic support of information retrieval (1999) 0.00
    
    Date
    15. 3.2000 10:22:37
  19. Boleda, G.; Evert, S.: Multiword expressions : a pain in the neck of lexical semantics (2009) 0.00
    
    Date
    1. 3.2013 14:56:22
  20. Monnerjahn, P.: Vorsprung ohne Technik : Übersetzen: Computer und Qualität (2000) 0.00
    
    Source
    c't. 2000, H.22, S.230-231

Languages

  • e 36
  • d 23

Types

  • a 45
  • m 6
  • el 5
  • s 3
  • x 3
  • p 2
  • d 1