Search (20 results, page 1 of 1)

  • × language_ss:"e"
  • × theme_ss:"Computerlinguistik"
  • × type_ss:"m"
  1. ¬The language engineering directory (1993) 0.07
    0.071794406 = product of:
      0.14358881 = sum of:
        0.12867 = weight(_text_:interfaces in 8408) [ClassicSimilarity], result of:
          0.12867 = score(doc=8408,freq=2.0), product of:
            0.22349821 = queryWeight, product of:
              5.2107263 = idf(docFreq=655, maxDocs=44218)
              0.04289195 = queryNorm
            0.57570934 = fieldWeight in 8408, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              5.2107263 = idf(docFreq=655, maxDocs=44218)
              0.078125 = fieldNorm(doc=8408)
        0.014918802 = product of:
          0.044756405 = sum of:
            0.044756405 = weight(_text_:systems in 8408) [ClassicSimilarity], result of:
              0.044756405 = score(doc=8408,freq=2.0), product of:
                0.13181444 = queryWeight, product of:
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.04289195 = queryNorm
                0.339541 = fieldWeight in 8408, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.078125 = fieldNorm(doc=8408)
          0.33333334 = coord(1/3)
      0.5 = coord(2/4)
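The tree above is Lucene's ClassicSimilarity explain output. As a rough sketch (constants copied verbatim from the output; the formulas are Lucene's classic TF-IDF scoring, reconstructed here rather than taken from any official API), the top-level score 0.071794406 can be reproduced as:

```python
import math

def term_weight(freq, idf, query_norm, field_norm):
    """score(term, doc) = queryWeight * fieldWeight in ClassicSimilarity."""
    query_weight = idf * query_norm            # queryWeight
    tf = math.sqrt(freq)                       # tf(freq) = sqrt(termFreq)
    field_weight = tf * idf * field_norm       # fieldWeight
    return query_weight * field_weight

# weight(_text_:interfaces in 8408): freq=2.0, idf=5.2107263
w_interfaces = term_weight(2.0, 5.2107263, 0.04289195, 0.078125)

# weight(_text_:systems in 8408), scaled by coord(1/3) of its sub-query
w_systems = term_weight(2.0, 3.0731742, 0.04289195, 0.078125) * (1 / 3)

# top level: sum of the clause scores, scaled by coord(2/4)
score = (w_interfaces + w_systems) * (2 / 4)
print(round(score, 6))
```

The same arithmetic applies to every tree below; only freq, idf, fieldNorm and the coord fractions change per document.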
    
    Abstract
    This is a reference guide to language technology organizations and products around the world. Areas covered in the directory include: Artificial intelligence, Document storage and retrieval, Electronic dictionaries (mono- and multilingual), Expert language systems, Multilingual word processors, Natural language database interfaces, Term databanks, Terminology management, Text content analysis, Thesauri
  2. Lehman, J.F.: Adaptive parsing : self-extending natural language interfaces (19??) 0.04
    0.038601004 = product of:
      0.15440401 = sum of:
        0.15440401 = weight(_text_:interfaces in 5296) [ClassicSimilarity], result of:
          0.15440401 = score(doc=5296,freq=2.0), product of:
            0.22349821 = queryWeight, product of:
              5.2107263 = idf(docFreq=655, maxDocs=44218)
              0.04289195 = queryNorm
            0.6908512 = fieldWeight in 5296, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              5.2107263 = idf(docFreq=655, maxDocs=44218)
              0.09375 = fieldNorm(doc=5296)
      0.25 = coord(1/4)
    
  3. Computational linguistics for the new millennium : divergence or synergy? Proceedings of the International Symposium held at the Ruprecht-Karls Universität Heidelberg, 21-22 July 2000. Festschrift in honour of Peter Hellwig on the occasion of his 60th birthday (2002) 0.01
    0.008572424 = product of:
      0.034289695 = sum of:
        0.034289695 = product of:
          0.05143454 = sum of:
            0.022378203 = weight(_text_:systems in 4900) [ClassicSimilarity], result of:
              0.022378203 = score(doc=4900,freq=2.0), product of:
                0.13181444 = queryWeight, product of:
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.04289195 = queryNorm
                0.1697705 = fieldWeight in 4900, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4900)
            0.029056335 = weight(_text_:22 in 4900) [ClassicSimilarity], result of:
              0.029056335 = score(doc=4900,freq=2.0), product of:
                0.15020029 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04289195 = queryNorm
                0.19345059 = fieldWeight in 4900, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4900)
          0.6666667 = coord(2/3)
      0.25 = coord(1/4)
    
    Content
    Contents: Manfred Klenner / Henriette Visser: Introduction - Khurshid Ahmad: Writing Linguistics: When I use a word it means what I choose it to mean - Jürgen Handke: 2000 and Beyond: The Potential of New Technologies in Linguistics - Jurij Apresjan / Igor Boguslavsky / Leonid Iomdin / Leonid Tsinman: Lexical Functions in NU: Possible Uses - Hubert Lehmann: Practical Machine Translation and Linguistic Theory - Karin Haenelt: A Contextbased Approach towards Content Processing of Electronic Documents - Petr Sgall / Eva Hajicová: Are Linguistic Frameworks Comparable? - Wolfgang Menzel: Theory and Applications in Computational Linguistics - Is there Common Ground? - Robert Porzel / Michael Strube: Towards Context-adaptive Natural Language Processing Systems - Nicoletta Calzolari: Language Resources in a Multilingual Setting: The European Perspective - Piek Vossen: Computational Linguistics for Theory and Practice.
  4. Goshawke, W.; Kelly, D.K.; Wigg, J.D.: Computer translation of natural language (1987) 0.01
    0.0063295113 = product of:
      0.025318045 = sum of:
        0.025318045 = product of:
          0.07595413 = sum of:
            0.07595413 = weight(_text_:systems in 4819) [ClassicSimilarity], result of:
              0.07595413 = score(doc=4819,freq=4.0), product of:
                0.13181444 = queryWeight, product of:
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.04289195 = queryNorm
                0.57622015 = fieldWeight in 4819, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.09375 = fieldNorm(doc=4819)
          0.33333334 = coord(1/3)
      0.25 = coord(1/4)
    
    PRECIS
    Languages / Translation / Applications of computer systems
    Subject
    Languages / Translation / Applications of computer systems
  5. Sparck Jones, K.: Synonymy and semantic classification (1986) 0.01
    0.005274593 = product of:
      0.021098372 = sum of:
        0.021098372 = product of:
          0.06329511 = sum of:
            0.06329511 = weight(_text_:systems in 1304) [ClassicSimilarity], result of:
              0.06329511 = score(doc=1304,freq=4.0), product of:
                0.13181444 = queryWeight, product of:
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.04289195 = queryNorm
                0.48018348 = fieldWeight in 1304, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.078125 = fieldNorm(doc=1304)
          0.33333334 = coord(1/3)
      0.25 = coord(1/4)
    
    PRECIS
    Computer systems / Programming languages / Grammar
    Subject
    Computer systems / Programming languages / Grammar
  6. Whitelock, P.; Kilby, K.: Linguistic and computational techniques in machine translation system design : 2nd ed (1995) 0.00
    0.004886682 = product of:
      0.019546729 = sum of:
        0.019546729 = product of:
          0.058640182 = sum of:
            0.058640182 = weight(_text_:29 in 1750) [ClassicSimilarity], result of:
              0.058640182 = score(doc=1750,freq=2.0), product of:
                0.15088047 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.04289195 = queryNorm
                0.38865322 = fieldWeight in 1750, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.078125 = fieldNorm(doc=1750)
          0.33333334 = coord(1/3)
      0.25 = coord(1/4)
    
    Date
    29. 3.1996 18:28:09
  7. Tomita, M.: Efficient parsing for natural language : a fast algorithm for practical systems (19??) 0.00
    0.0044756406 = product of:
      0.017902562 = sum of:
        0.017902562 = product of:
          0.053707685 = sum of:
            0.053707685 = weight(_text_:systems in 5305) [ClassicSimilarity], result of:
              0.053707685 = score(doc=5305,freq=2.0), product of:
                0.13181444 = queryWeight, product of:
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.04289195 = queryNorm
                0.4074492 = fieldWeight in 5305, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.09375 = fieldNorm(doc=5305)
          0.33333334 = coord(1/3)
      0.25 = coord(1/4)
    
  8. Sparck Jones, K.; Galliers, J.R.: Evaluating natural language processing systems : an analysis and review (1996) 0.00
    0.0044756406 = product of:
      0.017902562 = sum of:
        0.017902562 = product of:
          0.053707685 = sum of:
            0.053707685 = weight(_text_:systems in 2934) [ClassicSimilarity], result of:
              0.053707685 = score(doc=2934,freq=8.0), product of:
                0.13181444 = queryWeight, product of:
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.04289195 = queryNorm
                0.4074492 = fieldWeight in 2934, product of:
                  2.828427 = tf(freq=8.0), with freq of:
                    8.0 = termFreq=8.0
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2934)
          0.33333334 = coord(1/3)
      0.25 = coord(1/4)
    
    Abstract
    This comprehensive state-of-the-art book is the first devoted to the important and timely issue of evaluating NLP systems. It addresses the whole area of NLP system evaluation, including aims and scope, problems and methodology. The authors provide a wide-ranging and careful analysis of evaluation concepts, reinforced with extensive illustrations; they relate systems to their environments and develop a framework for proper evaluation. The discussion of principles is completed by a detailed review of practice and strategies in the field, covering both systems for specific tasks, like translation, and core language processors. The methodology lessons drawn from the analysis and review are applied in a series of example cases. A comprehensive bibliography, a subject index, and a term glossary are included.
  9. Way, E.C.: Knowledge representation and metaphor (oder: meaning) (1994) 0.00
    0.0038741778 = product of:
      0.015496711 = sum of:
        0.015496711 = product of:
          0.046490133 = sum of:
            0.046490133 = weight(_text_:22 in 771) [ClassicSimilarity], result of:
              0.046490133 = score(doc=771,freq=2.0), product of:
                0.15020029 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04289195 = queryNorm
                0.30952093 = fieldWeight in 771, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=771)
          0.33333334 = coord(1/3)
      0.25 = coord(1/4)
    
    Footnote
    Already published by Kluwer in 1991 // Rev. in: Knowledge organization 22(1995) no.1, p.48-49 (O. Sechser)
  10. Brenner, E.H.: Beyond Boolean : new approaches in information retrieval; the quest for intuitive online search systems past, present & future (1995) 0.00
    0.003692215 = product of:
      0.01476886 = sum of:
        0.01476886 = product of:
          0.04430658 = sum of:
            0.04430658 = weight(_text_:systems in 2547) [ClassicSimilarity], result of:
              0.04430658 = score(doc=2547,freq=4.0), product of:
                0.13181444 = queryWeight, product of:
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.04289195 = queryNorm
                0.33612844 = fieldWeight in 2547, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=2547)
          0.33333334 = coord(1/3)
      0.25 = coord(1/4)
    
    Abstract
    The challenge of effectively bringing specific, relevant information from the global sea of data to our fingertips has become an increasingly difficult one. Discusses how the online information industry, founded on Boolean search systems, may be evolving to take advantage of other methods, such as 'term weighting', 'relevance ranking' and 'query by example'.
  11. WordNet : an electronic lexical database (language, speech and communication) (1998) 0.00
    0.003420677 = product of:
      0.013682708 = sum of:
        0.013682708 = product of:
          0.041048124 = sum of:
            0.041048124 = weight(_text_:29 in 2434) [ClassicSimilarity], result of:
              0.041048124 = score(doc=2434,freq=2.0), product of:
                0.15088047 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.04289195 = queryNorm
                0.27205724 = fieldWeight in 2434, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=2434)
          0.33333334 = coord(1/3)
      0.25 = coord(1/4)
    
    Date
    29. 3.1996 18:16:49
  12. Kracht, M.: Mathematical linguistics (2002) 0.00
    0.0031647556 = product of:
      0.012659023 = sum of:
        0.012659023 = product of:
          0.037977066 = sum of:
            0.037977066 = weight(_text_:systems in 3572) [ClassicSimilarity], result of:
              0.037977066 = score(doc=3572,freq=4.0), product of:
                0.13181444 = queryWeight, product of:
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.04289195 = queryNorm
                0.28811008 = fieldWeight in 3572, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.046875 = fieldNorm(doc=3572)
          0.33333334 = coord(1/3)
      0.25 = coord(1/4)
    
    Abstract
    This book studies language(s) and linguistic theories from a mathematical point of view. Starting with ideas already contained in Montague's work, it develops the mathematical foundations of present-day linguistics. It equips the reader with all the background necessary to understand and evaluate theories as diverse as Montague Grammar, Categorial Grammar, HPSG and GB. The mathematical tools are mainly from universal algebra and logic, but no particular knowledge is presupposed beyond a certain mathematical sophistication that is in any case needed in order to fruitfully work within these theories. The presentation focuses on abstract mathematical structures and their computational properties, but plenty of examples from different natural languages are provided to illustrate the main concepts and results. In contrast to books devoted to so-called formal language theory, languages are seen here as semiotic systems, that is, as systems of signs. A language sign correlates form with meaning. Using the principle of compositionality it is possible to gain substantial insight into the interaction between form and meaning in natural languages.
  13. Hodgson, J.P.E.: Knowledge representation and language in AI (1991) 0.00
    0.0026372964 = product of:
      0.010549186 = sum of:
        0.010549186 = product of:
          0.031647556 = sum of:
            0.031647556 = weight(_text_:systems in 1529) [ClassicSimilarity], result of:
              0.031647556 = score(doc=1529,freq=4.0), product of:
                0.13181444 = queryWeight, product of:
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.04289195 = queryNorm
                0.24009174 = fieldWeight in 1529, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1529)
          0.33333334 = coord(1/3)
      0.25 = coord(1/4)
    
    Abstract
    The aim of this book is to highlight the relationship between knowledge representation and language in artificial intelligence, and in particular the way in which the choice of representation influences the language used to discuss a problem - and vice versa. Opening with a discussion of knowledge representation methods, and following this with a look at reasoning methods, the author begins to make his case for the intimate relationship between language and representation. He shows how each representation method fits particularly well with some reasoning methods and less so with others, using specific languages as examples. The question of representation change, an important and complex issue about which very little is known, is addressed. Dr Hodgson gathers together recent work on problem solving, showing how, in some cases, it has been possible to use representation changes to recast problems into a language that makes them easier to solve. The author maintains throughout that the relationships that this book explores lie at the heart of the construction of large systems, examining a number of the current large AI systems from the viewpoint of representation and language to prove his point.
  14. Conceptual structures : logical, linguistic, and computational issues. 8th International Conference on Conceptual Structures, ICCS 2000, Darmstadt, Germany, August 14-18, 2000 (2000) 0.00
    0.0019380092 = product of:
      0.0077520367 = sum of:
        0.0077520367 = product of:
          0.02325611 = sum of:
            0.02325611 = weight(_text_:systems in 691) [ClassicSimilarity], result of:
              0.02325611 = score(doc=691,freq=6.0), product of:
                0.13181444 = queryWeight, product of:
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.04289195 = queryNorm
                0.17643067 = fieldWeight in 691, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.0234375 = fieldNorm(doc=691)
          0.33333334 = coord(1/3)
      0.25 = coord(1/4)
    
    Abstract
    Computer scientists create models of a perceived reality. Through AI techniques, these models aim at providing the basic support for emulating cognitive behavior such as reasoning and learning, which is one of the main goals of the AI research effort. Such computer models are formed through the interaction of various acquisition and inference mechanisms: perception, concept learning, conceptual clustering, hypothesis testing, probabilistic inference, etc., and are represented using different paradigms tightly linked to the processes that use them. Among these paradigms let us cite: biological models (neural nets, genetic programming), logic-based models (first-order logic, modal logic, rule-based systems), virtual reality models (object systems, agent systems), probabilistic models (Bayesian nets, fuzzy logic), linguistic models (conceptual dependency graphs, language-based representations), etc. One of the strengths of the Conceptual Graph (CG) theory is its versatility in terms of the representation paradigms under which it falls. It can be viewed, and therefore used, under different representation paradigms, which makes it a popular choice for a wealth of applications. Its full coupling with different cognitive processes led to the opening of the field toward related research communities such as the Description Logic, Formal Concept Analysis, and Computational Linguistics communities. We now see more and more research results from one community enrich the other, laying the foundations of common philosophical grounds from which a successful synergy can emerge. ICCS 2000 embodies this spirit of research collaboration. It presents a set of papers that we believe, by their exposure, will benefit the whole community.
For instance, the technical program proposes tracks on Conceptual Ontologies, Language, Formal Concept Analysis, Computational Aspects of Conceptual Structures, and Formal Semantics, with some papers on pragmatism and human related aspects of computing. Never before was the program of ICCS formed by so heterogeneously rooted theories of knowledge representation and use. We hope that this swirl of ideas will benefit you as much as it already has benefited us while putting together this program
  15. Sprachtechnologie, mobile Kommunikation und linguistische Ressourcen : Beiträge zur GLDV Tagung 2005 in Bonn (2005) 0.00
    0.0019380092 = product of:
      0.0077520367 = sum of:
        0.0077520367 = product of:
          0.02325611 = sum of:
            0.02325611 = weight(_text_:systems in 3578) [ClassicSimilarity], result of:
              0.02325611 = score(doc=3578,freq=6.0), product of:
                0.13181444 = queryWeight, product of:
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.04289195 = queryNorm
                0.17643067 = fieldWeight in 3578, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.0234375 = fieldNorm(doc=3578)
          0.33333334 = coord(1/3)
      0.25 = coord(1/4)
    
    Content
    CONTENTS: Chris Biemann/Rainer Osswald: Automatische Erweiterung eines semantikbasierten Lexikons durch Bootstrapping auf großen Korpora - Ernesto William De Luca/Andreas Nürnberger: Supporting Mobile Web Search by Ontology-based Categorization - Rüdiger Gleim: HyGraph - Ein Framework zur Extraktion, Repräsentation und Analyse webbasierter Hypertextstrukturen - Felicitas Haas/Bernhard Schröder: Freges Grundgesetze der Arithmetik: Dokumentbaum und Formelwald - Ulrich Held/Andre Blessing/Bettina Säuberlich/Jürgen Sienel/Horst Rößler/Dieter Kopp: A personalized multimodal news service - Jürgen Hermes/Christoph Benden: Fusion von Annotation und Präprozessierung als Vorschlag zur Behebung des Rohtextproblems - Sonja Hüwel/Britta Wrede/Gerhard Sagerer: Semantisches Parsing mit Frames für robuste multimodale Mensch-Maschine-Kommunikation - Brigitte Krenn/Stefan Evert: Separating the wheat from the chaff - Corpus-driven evaluation of statistical association measures for collocation extraction - Jörn Kreutel: An application-centered Perspective on Multimodal Dialogue Systems - Jonas Kuhn: An Architecture for Parallel Corpus-based Grammar Learning - Thomas Mandl/Rene Schneider/Pia Schnetzler/Christa Womser-Hacker: Evaluierung von Systemen für die Eigennamenerkennung im crosslingualen Information Retrieval - Alexander Mehler/Matthias Dehmer/Rüdiger Gleim: Zur Automatischen Klassifikation von Webgenres - Charlotte Merz/Martin Volk: Requirements for a Parallel Treebank Search Tool - Sally YK. Mok: Multilingual Text Retrieval on the Web: The Case of a Cantonese-Dagaare-English Trilingual e-Lexicon -
    Karel Pala: The Balkanet Experience - Peter M. Kruse/Andre Nauloks/Dietmar Rösner/Manuela Kunze: Clever Search: A WordNet Based Wrapper for Internet Search Engines - Rosmary Stegmann/Wolfgang Woerndl: Using GermaNet to Generate Individual Customer Profiles - Ingo Glöckner/Sven Hartrumpf/Rainer Osswald: From GermaNet Glosses to Formal Meaning Postulates -Aljoscha Burchardt/ Katrin Erk/Anette Frank: A WordNet Detour to FrameNet - Daniel Naber: OpenThesaurus: ein offenes deutsches Wortnetz - Anke Holler/Wolfgang Grund/Heinrich Petith: Maschinelle Generierung assoziativer Termnetze für die Dokumentensuche - Stefan Bordag/Hans Friedrich Witschel/Thomas Wittig: Evaluation of Lexical Acquisition Algorithms - Iryna Gurevych/Hendrik Niederlich: Computing Semantic Relatedness of GermaNet Concepts - Roland Hausser: Turn-taking als kognitive Grundmechanik der Datenbanksemantik - Rodolfo Delmonte: Parsing Overlaps - Melanie Twiggs: Behandlung des Passivs im Rahmen der Datenbanksemantik- Sandra Hohmann: Intention und Interaktion - Anmerkungen zur Relevanz der Benutzerabsicht - Doris Helfenbein: Verwendung von Pronomina im Sprecher- und Hörmodus - Bayan Abu Shawar/Eric Atwell: Modelling turn-taking in a corpus-trained chatbot - Barbara März: Die Koordination in der Datenbanksemantik - Jens Edlund/Mattias Heldner/Joakim Gustafsson: Utterance segmentation and turn-taking in spoken dialogue systems - Ekaterina Buyko: Numerische Repräsentation von Textkorpora für Wissensextraktion - Bernhard Fisseni: ProofML - eine Annotationssprache für natürlichsprachliche mathematische Beweise - Iryna Schenk: Auflösung der Pronomen mit Nicht-NP-Antezedenten in spontansprachlichen Dialogen - Stephan Schwiebert: Entwurf eines agentengestützten Systems zur Paradigmenbildung - Ingmar Steiner: On the analysis of speech rhythm through acoustic parameters - Hans Friedrich Witschel: Text, Wörter, Morpheme - Möglichkeiten einer automatischen Terminologie-Extraktion.
  16. Hutchins, W.J.; Somers, H.L.: ¬An introduction to machine translation (1992) 0.00
    0.0018648503 = product of:
      0.007459401 = sum of:
        0.007459401 = product of:
          0.022378203 = sum of:
            0.022378203 = weight(_text_:systems in 4512) [ClassicSimilarity], result of:
              0.022378203 = score(doc=4512,freq=2.0), product of:
                0.13181444 = queryWeight, product of:
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.04289195 = queryNorm
                0.1697705 = fieldWeight in 4512, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4512)
          0.33333334 = coord(1/3)
      0.25 = coord(1/4)
    
    Abstract
    The translation of foreign language texts by computers was one of the first tasks that the pioneers of computing and artificial intelligence set themselves. Machine translation is again becoming an important field of research and development as the need for translations of technical and commercial documentation grows well beyond the capacity of the translation profession. This is the first textbook of machine translation, providing a full course on both general machine translation system characteristics and the computational linguistic foundations of the field. The book assumes no previous knowledge of machine translation and provides the basic background in linguistics, computational linguistics, artificial intelligence, natural language processing and information science.
  17. Jurafsky, D.; Martin, J.H.: Speech and language processing : an introduction to natural language processing, computational linguistics and speech recognition (2009) 0.00
    0.0018648503 = product of:
      0.007459401 = sum of:
        0.007459401 = product of:
          0.022378203 = sum of:
            0.022378203 = weight(_text_:systems in 1081) [ClassicSimilarity], result of:
              0.022378203 = score(doc=1081,freq=2.0), product of:
                0.13181444 = queryWeight, product of:
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.04289195 = queryNorm
                0.1697705 = fieldWeight in 1081, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1081)
          0.33333334 = coord(1/3)
      0.25 = coord(1/4)
    
    Abstract
    For undergraduate or advanced undergraduate courses in Classical Natural Language Processing, Statistical Natural Language Processing, Speech Recognition, Computational Linguistics, and Human Language Processing. An explosion of Web-based language techniques, the merging of distinct fields, the availability of phone-based dialogue systems, and much more make this an exciting time in speech and language processing. The first of its kind to thoroughly cover language technology at all levels and with all modern technologies, this text takes an empirical approach to the subject, based on applying statistical and other machine-learning algorithms to large corpora. The authors cover areas that are traditionally taught in different courses, to describe a unified vision of speech and language processing. Emphasis is on practical applications and scientific evaluation. An accompanying Website contains teaching materials for instructors, with pointers to language processing resources on the Web. The Second Edition offers a significant amount of new and extended material.
  18. Helbig, H.: Knowledge representation and the semantics of natural language (2014) 0.00
    0.0018648503 = product of:
      0.007459401 = sum of:
        0.007459401 = product of:
          0.022378203 = sum of:
            0.022378203 = weight(_text_:systems in 2396) [ClassicSimilarity], result of:
              0.022378203 = score(doc=2396,freq=2.0), product of:
                0.13181444 = queryWeight, product of:
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.04289195 = queryNorm
                0.1697705 = fieldWeight in 2396, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2396)
          0.33333334 = coord(1/3)
      0.25 = coord(1/4)
    
    Abstract
    Natural language is not only the most important means of communication between human beings, it has also been used over historical periods for the preservation of cultural achievements and their transmission from one generation to the other. During the last few decades, the flood of digitized information has been growing tremendously. This tendency will continue with the globalisation of information societies and with the growing importance of national and international computer networks. This is one reason why the theoretical understanding and the automated treatment of communication processes based on natural language have such a decisive social and economic impact. In this context, the semantic representation of knowledge originally formulated in natural language plays a central part, because it connects all components of natural language processing systems, be they the automatic understanding of natural language (analysis), the rational reasoning over knowledge bases, or the generation of natural language expressions from formal representations. This book presents a method for the semantic representation of natural language expressions (texts, sentences, phrases, etc.) which can be used as a universal knowledge representation paradigm in the human sciences, like linguistics, cognitive psychology, or philosophy of language, as well as in computational linguistics and in artificial intelligence. It is also an attempt to close the gap between these disciplines, which to a large extent are still working separately.
  19. Bowker, L.; Ciro, J.B.: Machine translation and global research : towards improved machine translation literacy in the scholarly community (2019) 0.00
    
    Abstract
    In the global research community, English has become the main language of scholarly publishing in many disciplines. At the same time, online machine translation systems have become increasingly easy to access and use. Is this a researcher's match made in heaven, or the road to publication perdition? Here Lynne Bowker and Jairo Buitrago Ciro introduce the concept of machine translation literacy, a new kind of literacy for scholars and librarians in the digital age. For scholars, they explain how machine translation works, how it is (or could be) used for scholarly communication, and how both native and non-native English-speakers can write in a translation-friendly way in order to harness its potential. Native English speakers can continue to write in English, but expand the global reach of their research by making it easier for their peers around the world to access and understand their works, while non-native English speakers can write in their mother tongues, but leverage machine translation technology to help them produce draft publications in English. For academic librarians, the authors provide a framework for supporting researchers in all disciplines as they grapple with producing translation-friendly texts and using machine translation for scholarly communication - a form of support that will only become more important as campuses become increasingly international and as universities continue to strive to excel on the global stage. Machine Translation and Global Research is a must-read for scientists, researchers, students, and librarians eager to maximize the global reach and impact of any form of scholarly work.
  20. Working with conceptual structures : contributions to ICCS 2000. 8th International Conference on Conceptual Structures: Logical, Linguistic, and Computational Issues. Darmstadt, August 14-18, 2000 (2000) 0.00
    
    Content
    Concepts & Language: Knowledge organization by procedures of natural language processing. A case study using the method GABEK (J. Zelger, J. Gadner) - Computer aided narrative analysis using conceptual graphs (H. Schärfe, P. Øhrstrøm) - Pragmatic representation of argumentative text: a challenge for the conceptual graph approach (H. Irandoust, B. Moulin) - Conceptual graphs as a knowledge representation core in a complex language learning environment (G. Angelova, A. Nenkova, S. Boycheva, T. Nikolov) - Conceptual Modeling and Ontologies: Relationships and actions in conceptual categories (Ch. Landauer, K.L. Bellman) - Concept approximations for formal concept analysis (J. Saquer, J.S. Deogun) - Faceted information representation (U. Priß) - Simple concept graphs with universal quantifiers (J. Tappe) - A framework for comparing methods for using or reusing multiple ontologies in an application (J. van Zyl, D. Corbett) - Designing task/method knowledge-based systems with conceptual graphs (M. Leclère, F. Trichet, Ch. Choquet) - A logical ontology (J. Farkas, J. Sarbo) - Algorithms and Tools: Fast concept analysis (Ch. Lindig) - A framework for conceptual graph unification (D. Corbett) - Visual CP representation of knowledge (H.D. Pfeiffer, R.T. Hartley) - Maximal isojoin for representing software textual specifications and detecting semantic anomalies (Th. Charnois) - Troika: using grids, lattices and graphs in knowledge acquisition (H.S. Delugach, B.E. Lampkin) - Open world theorem prover for conceptual graphs (J.E. Heaton, P. Kocura) - NetCare: a practical conceptual graphs software tool (S. Polovina, D. Strang) - CGWorld - a web based workbench for conceptual graphs management and applications (P. Dobrev, K. Toutanova) - Position papers: The edition project: Peirce's existential graphs (R. Müller) - Mining association rules using formal concept analysis (N. Pasquier) - Contextual logic summary (R. Wille) - Information channels and conceptual scaling (K.E. Wolff) - Spatial concepts - a rule exploration (S. Rudolph) - The TEXT-TO-ONTO learning environment (A. Mädche, St. Staab) - Controlling the semantics of metadata on audio-visual documents using ontologies (Th. Dechilly, B. Bachimont) - Building the ontological foundations of a terminology from natural language to conceptual graphs with Ribosome, a knowledge extraction system (Ch. Jacquelinet, A. Burgun) - CharGer: some lessons learned and new directions (H.S. Delugach) - Knowledge management using conceptual graphs (W.K. Pun)
