Search (42 results, page 1 of 3)

  • theme_ss:"Linguistik"
  1. Warner, A.J.: Quantitative and qualitative assessments of the impact of linguistic theory on information science (1991) 0.06
    0.05944937 = product of:
      0.17834811 = sum of:
        0.17834811 = sum of:
          0.07927656 = weight(_text_:science in 29) [ClassicSimilarity], result of:
            0.07927656 = score(doc=29,freq=4.0), product of:
              0.1375819 = queryWeight, product of:
                2.6341193 = idf(docFreq=8627, maxDocs=44218)
                0.052230705 = queryNorm
              0.5762136 = fieldWeight in 29, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                2.6341193 = idf(docFreq=8627, maxDocs=44218)
                0.109375 = fieldNorm(doc=29)
          0.09907155 = weight(_text_:22 in 29) [ClassicSimilarity], result of:
            0.09907155 = score(doc=29,freq=2.0), product of:
              0.18290302 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.052230705 = queryNorm
              0.5416616 = fieldWeight in 29, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.109375 = fieldNorm(doc=29)
      0.33333334 = coord(1/3)
    
    Date
    6. 1.1999 10:22:45
    Source
    Journal of the American Society for Information Science. 42(1991) no.1, S.64-71
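    The nested breakdowns above are Lucene "explain" trees for its ClassicSimilarity (TF-IDF) ranking. As a minimal sketch of how the top score is assembled (assuming the standard ClassicSimilarity formulas tf = sqrt(freq) and idf = 1 + ln(maxDocs/(docFreq+1)), with the queryNorm, docFreq, and fieldNorm values taken verbatim from the tree above):

    ```python
    import math

    # Reproduce the ClassicSimilarity score shown for result 1 (doc 29).
    MAX_DOCS = 44218
    QUERY_NORM = 0.052230705  # queryNorm from the explain output above

    def idf(doc_freq: int, max_docs: int = MAX_DOCS) -> float:
        # idf(t) = 1 + ln(maxDocs / (docFreq + 1))
        return 1.0 + math.log(max_docs / (doc_freq + 1))

    def term_weight(freq: float, doc_freq: int, field_norm: float) -> float:
        # weight = queryWeight * fieldWeight
        #        = (idf * queryNorm) * (sqrt(freq) * idf * fieldNorm)
        i = idf(doc_freq)
        return (i * QUERY_NORM) * (math.sqrt(freq) * i * field_norm)

    science = term_weight(freq=4.0, doc_freq=8627, field_norm=0.109375)
    t22 = term_weight(freq=2.0, doc_freq=3622, field_norm=0.109375)

    # coord(1/3): one of three top-level query clauses matched in this document
    total = (science + t22) * (1.0 / 3.0)
    # science ≈ 0.0792766, t22 ≈ 0.0990715, total ≈ 0.0594494 (cf. 0.05944937 above)
    ```

    Each matched term contributes queryWeight × fieldWeight, the contributions are summed, and the coordination factor down-weights documents that match only some of the query's clauses.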
  2. O'Donnell, R.; Smeaton, A.F.: ¬A linguistic approach to information retrieval (1996) 0.02
    0.020193208 = product of:
      0.030289812 = sum of:
        0.009060195 = weight(_text_:in in 2575) [ClassicSimilarity], result of:
          0.009060195 = score(doc=2575,freq=4.0), product of:
            0.07104705 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.052230705 = queryNorm
            0.12752387 = fieldWeight in 2575, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.046875 = fieldNorm(doc=2575)
        0.021229617 = product of:
          0.042459235 = sum of:
            0.042459235 = weight(_text_:22 in 2575) [ClassicSimilarity], result of:
              0.042459235 = score(doc=2575,freq=2.0), product of:
                0.18290302 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.052230705 = queryNorm
                0.23214069 = fieldWeight in 2575, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2575)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Abstract
    An important aspect of information retrieval systems is domain independence, where the subject of the information is not restricted to certain domains of knowledge. Such a system should be able to represent any topic, and although the text representation does not involve any semantic knowledge, lexical and syntactic analysis of the text allows the representation to remain domain independent. Reports research at Dublin City University, Ireland, which concentrates on the lexical and syntactic levels of natural language analysis, and describes a domain-independent automatic information retrieval system which accesses a very large database of newspaper text from the Wall Street Journal. The system represents the text in the form of syntax trees, and these trees are used in the matching process. Reports early results from the study
    Source
    Information retrieval: new systems and current research. Proceedings of the 16th Research Colloquium of the British Computer Society Information Retrieval Specialist Group, Drymen, Scotland, 22-23 Mar 94. Ed.: R. Leon
  3. Harras, G.: Concepts in linguistics : concepts in natural language (2000) 0.02
    0.019308537 = product of:
      0.028962806 = sum of:
        0.014948557 = weight(_text_:in in 5068) [ClassicSimilarity], result of:
          0.014948557 = score(doc=5068,freq=8.0), product of:
            0.07104705 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.052230705 = queryNorm
            0.21040362 = fieldWeight in 5068, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5068)
        0.014014249 = product of:
          0.028028497 = sum of:
            0.028028497 = weight(_text_:science in 5068) [ClassicSimilarity], result of:
              0.028028497 = score(doc=5068,freq=2.0), product of:
                0.1375819 = queryWeight, product of:
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.052230705 = queryNorm
                0.20372227 = fieldWeight in 5068, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5068)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Abstract
    This paper deals with different views of lexical semantics. The focus is on the relationship between lexical expressions and conceptual components. First, the assumptions about lexicalization and the decompositionality of concepts shared by most semanticists are presented, followed by a discussion of the differences between two-level semantics and one-level semantics. The final part concentrates on the interpretation of conceptual components in situations of communication
    Series
    Lecture notes in computer science; vol.1867: Lecture notes on artificial intelligence
  4. Chomsky, N.: Aspects of the theory of syntax (1965) 0.02
    0.01887077 = product of:
      0.056612313 = sum of:
        0.056612313 = product of:
          0.113224626 = sum of:
            0.113224626 = weight(_text_:22 in 3829) [ClassicSimilarity], result of:
              0.113224626 = score(doc=3829,freq=2.0), product of:
                0.18290302 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.052230705 = queryNorm
                0.61904186 = fieldWeight in 3829, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.125 = fieldNorm(doc=3829)
          0.5 = coord(1/2)
      0.33333334 = coord(1/3)
    
    Date
    6. 1.1999 10:29:22
  5. Pilch, H.: Empirical linguistics (1976) 0.02
    0.01873103 = product of:
      0.028096544 = sum of:
        0.01208026 = weight(_text_:in in 7860) [ClassicSimilarity], result of:
          0.01208026 = score(doc=7860,freq=4.0), product of:
            0.07104705 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.052230705 = queryNorm
            0.17003182 = fieldWeight in 7860, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0625 = fieldNorm(doc=7860)
        0.016016284 = product of:
          0.032032568 = sum of:
            0.032032568 = weight(_text_:science in 7860) [ClassicSimilarity], result of:
              0.032032568 = score(doc=7860,freq=2.0), product of:
                0.1375819 = queryWeight, product of:
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.052230705 = queryNorm
                0.23282544 = fieldWeight in 7860, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.0625 = fieldNorm(doc=7860)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Abstract
    The book is about real languages and the way to study them, as they are spoken and written by real people in different social situations. The linguist listens to them, using scientific methodology to analyse what he hears. This book explains a wide range of linguistic phenomena - including polyglot societies, linguistic pathology, and the reconstruction of unrecorded history - their relevance to other branches of science, and their practical applicability. It is couched in straightforward terms intelligible to a general audience, including students of modern language departments
  6. Miller, G.A.: Wörter : Streifzüge durch die Psycholinguistik (1993) 0.01
    0.012313303 = product of:
      0.018469954 = sum of:
        0.010461812 = weight(_text_:in in 1458) [ClassicSimilarity], result of:
          0.010461812 = score(doc=1458,freq=12.0), product of:
            0.07104705 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.052230705 = queryNorm
            0.14725187 = fieldWeight in 1458, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.03125 = fieldNorm(doc=1458)
        0.008008142 = product of:
          0.016016284 = sum of:
            0.016016284 = weight(_text_:science in 1458) [ClassicSimilarity], result of:
              0.016016284 = score(doc=1458,freq=2.0), product of:
                0.1375819 = queryWeight, product of:
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.052230705 = queryNorm
                0.11641272 = fieldWeight in 1458, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.03125 = fieldNorm(doc=1458)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Abstract
    Words are the linguistic expression of our thinking, created by ourselves, and yet something we rarely subject to closer examination. Yet precisely this examination can tell us something about what goes on in our brains. Linguistic research has gained new momentum in recent decades from the approaches of cognitive psychology - and George A. Miller, as one of the founders of modern psycholinguistics, has played no small part in this. In this book he recounts, often spiced with his very particular humour, what linguistics has discovered in the realm of words. Miller presents the reader with the different facets of words; each one is the interplay of an utterance - in its phonetic pronunciation - a meaning - in semantics - and a role in the sentence - in syntax. Miller sees these three facets as a unity, and he vividly introduces the reader to the theories and methods with which research tackles words
    Footnote
    Original title: The science of words
  7. Conceptual structures : logical, linguistic, and computational issues. 8th International Conference on Conceptual Structures, ICCS 2000, Darmstadt, Germany, August 14-18, 2000 (2000) 0.01
    0.010757141 = product of:
      0.016135711 = sum of:
        0.010129605 = weight(_text_:in in 691) [ClassicSimilarity], result of:
          0.010129605 = score(doc=691,freq=20.0), product of:
            0.07104705 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.052230705 = queryNorm
            0.14257601 = fieldWeight in 691, product of:
              4.472136 = tf(freq=20.0), with freq of:
                20.0 = termFreq=20.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0234375 = fieldNorm(doc=691)
        0.0060061063 = product of:
          0.012012213 = sum of:
            0.012012213 = weight(_text_:science in 691) [ClassicSimilarity], result of:
              0.012012213 = score(doc=691,freq=2.0), product of:
                0.1375819 = queryWeight, product of:
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.052230705 = queryNorm
                0.08730954 = fieldWeight in 691, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.0234375 = fieldNorm(doc=691)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Abstract
    Computer scientists create models of a perceived reality. Through AI techniques, these models aim at providing the basic support for emulating cognitive behavior such as reasoning and learning, which is one of the main goals of the AI research effort. Such computer models are formed through the interaction of various acquisition and inference mechanisms: perception, concept learning, conceptual clustering, hypothesis testing, probabilistic inference, etc., and are represented using different paradigms tightly linked to the processes that use them. Among these paradigms let us cite: biological models (neural nets, genetic programming), logic-based models (first-order logic, modal logic, rule-based systems), virtual reality models (object systems, agent systems), probabilistic models (Bayesian nets, fuzzy logic), linguistic models (conceptual dependency graphs, language-based representations), etc. One of the strengths of the Conceptual Graph (CG) theory is its versatility in terms of the representation paradigms under which it falls. It can be viewed, and therefore used, under different representation paradigms, which makes it a popular choice for a wealth of applications. Its full coupling with different cognitive processes has led to the opening of the field toward related research communities such as the Description Logic, Formal Concept Analysis, and Computational Linguistics communities. We now see more and more research results from one community enriching the other, laying the foundations of common philosophical grounds from which a successful synergy can emerge. ICCS 2000 embodies this spirit of research collaboration. It presents a set of papers that we believe, by their exposure, will benefit the whole community.
For instance, the technical program proposes tracks on Conceptual Ontologies, Language, Formal Concept Analysis, Computational Aspects of Conceptual Structures, and Formal Semantics, with some papers on pragmatism and human-related aspects of computing. Never before has the program of ICCS been formed from such heterogeneously rooted theories of knowledge representation and use. We hope that this swirl of ideas will benefit you as much as it has already benefited us while putting together this program
    Content
    Concepts and Language: The Role of Conceptual Structure in Human Evolution (Keith Devlin) - Concepts in Linguistics - Concepts in Natural Language (Gisela Harras) - Patterns, Schemata, and Types: Author Support through Formalized Experience (Felix H. Gatzemeier) - Conventions and Notations for Knowledge Representation and Retrieval (Philippe Martin) - Conceptual Ontology: Ontology, Metadata, and Semiotics (John F. Sowa) - Pragmatically Yours (Mary Keeler) - Conceptual Modeling for Distributed Ontology Environments (Deborah L. McGuinness) - Discovery of Class Relations in Exception Structured Knowledge Bases (Hendra Suryanto, Paul Compton) - Conceptual Graphs: Perspectives: CGs Applications: Where Are We 7 Years after the First ICCS? (Michel Chein, David Genest) - The Engineering of a CG-Based System: Fundamental Issues (Guy W. Mineau) - Conceptual Graphs, Metamodeling, and Notation of Concepts (Olivier Gerbé, Guy W. Mineau, Rudolf K. Keller) - Knowledge Representation and Reasonings Based on Graph Homomorphism (Marie-Laure Mugnier) - User Modeling Using Conceptual Graphs for Intelligent Agents (James F. Baldwin, Trevor P. Martin, Aimilia Tzanavari) - Towards a Unified Querying System of Both Structured and Semi-structured Imprecise Data Using Fuzzy View (Patrice Buche, Ollivier Haemmerlé) - Formal Semantics of Conceptual Structures: The Extensional Semantics of the Conceptual Graph Formalism (Guy W. Mineau) - Semantics of Attribute Relations in Conceptual Graphs (Pavel Kocura) - Nested Concept Graphs and Triadic Power Context Families (Susanne Prediger) - Negations in Simple Concept Graphs (Frithjof Dau) - Extending the CG Model by Simulations (Jean-François Baget) - Contextual Logic and Formal Concept Analysis: Building and Structuring Description Logic Knowledge Bases Using Least Common Subsumers and Concept Analysis (Franz Baader, Ralf Molitor) - On the Contextual Logic of Ordinal Data (Silke Pollandt, Rudolf Wille) - Boolean Concept Logic (Rudolf Wille) - Lattices of Triadic Concept Graphs (Bernd Groh, Rudolf Wille) - Formalizing Hypotheses with Concepts (Bernhard Ganter, Sergei O. Kuznetsov) - Generalized Formal Concept Analysis (Laurent Chaudron, Nicolas Maille) - A Logical Generalization of Formal Concept Analysis (Sébastien Ferré, Olivier Ridoux) - On the Treatment of Incomplete Knowledge in Formal Concept Analysis (Peter Burmeister, Richard Holzer) - Conceptual Structures in Practice: Logic-Based Networks: Concept Graphs and Conceptual Structures (Peter W. Eklund) - Conceptual Knowledge Discovery and Data Analysis (Joachim Hereth, Gerd Stumme, Rudolf Wille, Uta Wille) - CEM - A Conceptual Email Manager (Richard Cole, Gerd Stumme) - A Contextual-Logic Extension of TOSCANA (Peter Eklund, Bernd Groh, Gerd Stumme, Rudolf Wille) - A Conceptual Graph Model for W3C Resource Description Framework (Olivier Corby, Rose Dieng, Cédric Hébert) - Computational Aspects of Conceptual Structures: Computing with Conceptual Structures (Bernhard Ganter) - Symmetry and the Computation of Conceptual Structures (Robert Levinson) - An Introduction to SNePS 3 (Stuart C. Shapiro) - Composition Norm Dynamics Calculation with Conceptual Graphs (Aldo de Moor) - From PROLOG++ to PROLOG+CG: A CG Object-Oriented Logic Programming Language (Adil Kabbaj, Martin Janta-Polczynski) - A Cost-Bounded Algorithm to Control Events Generalization (Gaël de Chalendar, Brigitte Grau, Olivier Ferret)
    Series
    Lecture notes in computer science; vol.1867: Lecture notes on artificial intelligence
  8. Storms, G.; VanMechelen, I.; DeBoeck, P.: Structural-analysis of the intension and extension of semantic concepts (1994) 0.01
    0.008255962 = product of:
      0.024767887 = sum of:
        0.024767887 = product of:
          0.049535774 = sum of:
            0.049535774 = weight(_text_:22 in 2574) [ClassicSimilarity], result of:
              0.049535774 = score(doc=2574,freq=2.0), product of:
                0.18290302 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.052230705 = queryNorm
                0.2708308 = fieldWeight in 2574, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=2574)
          0.5 = coord(1/2)
      0.33333334 = coord(1/3)
    
    Date
    22. 7.2000 19:17:40
  9. Chafe, W.L.: Meaning and the structure of language (1980) 0.01
    0.008255962 = product of:
      0.024767887 = sum of:
        0.024767887 = product of:
          0.049535774 = sum of:
            0.049535774 = weight(_text_:22 in 220) [ClassicSimilarity], result of:
              0.049535774 = score(doc=220,freq=2.0), product of:
                0.18290302 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.052230705 = queryNorm
                0.2708308 = fieldWeight in 220, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=220)
          0.5 = coord(1/2)
      0.33333334 = coord(1/3)
    
    Date
    22. 4.2007 12:21:29
  10. Sharada, B.A.: Infolinguistics : an interdisciplinary study (1995) 0.01
    0.007550149 = product of:
      0.022650447 = sum of:
        0.022650447 = product of:
          0.045300893 = sum of:
            0.045300893 = weight(_text_:science in 4225) [ClassicSimilarity], result of:
              0.045300893 = score(doc=4225,freq=4.0), product of:
                0.1375819 = queryWeight, product of:
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.052230705 = queryNorm
                0.3292649 = fieldWeight in 4225, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.0625 = fieldNorm(doc=4225)
          0.5 = coord(1/2)
      0.33333334 = coord(1/3)
    
    Abstract
    Discusses the importance of the theories and principles of linguistics to the organization of information. Infolinguistics studies information or knowledge using linguistics as a representation mechanism. Theoretical studies of search languages and index languages require a theoretical framework provided by the interdisciplinary approach of infolinguistics which is a blend of cognitive psychology, linguistics and information science. Discusses how infolinguistics can contribute to information retrieval using both computer and manual application of grammatical theories
    Source
    Library science with a slant to documentation and information studies. 32(1995) no.3, S.113-121
  11. Rasmussen, L.: Selected linguistic problems in indexing within the Canadian context (1992) 0.01
    0.006164682 = product of:
      0.018494045 = sum of:
        0.018494045 = weight(_text_:in in 1609) [ClassicSimilarity], result of:
          0.018494045 = score(doc=1609,freq=6.0), product of:
            0.07104705 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.052230705 = queryNorm
            0.260307 = fieldWeight in 1609, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.078125 = fieldNorm(doc=1609)
      0.33333334 = coord(1/3)
    
    Abstract
    Takes into account the linguistic characteristics of Canadian English and of Canadian French, as well as the problems involved in bilingual indexing arising from the trend in the English language towards nominalization
  12. Hutchins, W.J.: Linguistic processes in the indexing and retrieval of documents (1970) 0.01
    0.0056946888 = product of:
      0.017084066 = sum of:
        0.017084066 = weight(_text_:in in 1306) [ClassicSimilarity], result of:
          0.017084066 = score(doc=1306,freq=2.0), product of:
            0.07104705 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.052230705 = queryNorm
            0.24046129 = fieldWeight in 1306, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.125 = fieldNorm(doc=1306)
      0.33333334 = coord(1/3)
    
  13. Dahl, V.: What the study of language can contribute to AI (1993) 0.01
    0.0056946888 = product of:
      0.017084066 = sum of:
        0.017084066 = weight(_text_:in in 5098) [ClassicSimilarity], result of:
          0.017084066 = score(doc=5098,freq=8.0), product of:
            0.07104705 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.052230705 = queryNorm
            0.24046129 = fieldWeight in 5098, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0625 = fieldNorm(doc=5098)
      0.33333334 = coord(1/3)
    
    Abstract
    Examines areas in which linguistics in particular, and the study of language in general, have provided useful insight into specific artificial intelligence areas besides that of computational linguistics. Discusses studies in these areas and argues that these links should be further and more deliberately explored, since they have a chance of leading the way towards a fruitful integration of the formal and humanistic sciences
  14. Chomsky, N.: Logical structure in language (1957) 0.01
    0.0056946888 = product of:
      0.017084066 = sum of:
        0.017084066 = weight(_text_:in in 99) [ClassicSimilarity], result of:
          0.017084066 = score(doc=99,freq=2.0), product of:
            0.07104705 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.052230705 = queryNorm
            0.24046129 = fieldWeight in 99, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.125 = fieldNorm(doc=99)
      0.33333334 = coord(1/3)
    
  15. Fischer, W.L.: Äquivalenz- und Toleranzstrukturen in der Linguistik : zur Theorie der Synonyma (1973) 0.01
    0.0055709993 = product of:
      0.016712997 = sum of:
        0.016712997 = weight(_text_:in in 66) [ClassicSimilarity], result of:
          0.016712997 = score(doc=66,freq=40.0), product of:
            0.07104705 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.052230705 = queryNorm
            0.23523843 = fieldWeight in 66, product of:
              6.3245554 = tf(freq=40.0), with freq of:
                40.0 = termFreq=40.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.02734375 = fieldNorm(doc=66)
      0.33333334 = coord(1/3)
    
    Abstract
    In various areas of linguistics, in structuralism, and in mathematical text and literary theory, equivalence relations play a central role as mathematical models of empirical relations. Without seeking detailed proof in each individual case, it is generally assumed that the laws that characterize an equivalence relation in the abstract hold in the real phenomenon modelled by the equivalence structure. On closer analysis of the particular situation, however, it frequently turns out that the empirical relations in question do not satisfy all the characteristic properties of equivalence relations, or that it is very difficult to prove the validity of, for example, the property of transitivity. In all these cases we therefore propose basing the abstract model on a weaker notion, that of the "tolerance relation" (or "tolerance space"). In parts 1 and 2 we contrast the formal theories of equivalence relations and tolerance relations and show how both fit into the methodology of science. In part 3 we use the problem area of the concept of synonymy to show to what extent the weaker structure of the tolerance space represents the concrete situation more adequately in many cases than an equivalence structure. In this context we develop the elements of the topology induced by a tolerance relation and present the basic relations of a corresponding topologization of the word space of a given language. The latter model structure, which gives rise to a topologization of word spaces, can also be motivated from the models of content-oriented grammar. Of central importance there is the concept of the word field. We will deliberately not attach any physical notions to this concept. 
For "force" and "lines of force" in the physical model, only very meagre counterparts can be found in the (Indo-European) language domain in the sense of content-oriented grammar, counterparts which would justify, in at least some of their properties, the reference to the physical concepts "force" and "line of force". In connection with the "word field", physical analogies seem to have only the metaphorical character of hermeneutic babble.
    We therefore restrict ourselves to representing the linguistic field concept by a mathematical model ... or rather: we define the linguistic field via a mathematical model. The field properties in a system of linguistic entities are, first of all, properties of mutual adjacency. We can speak of syntactic and/or semantic neighbourhoods, in each of which lie all the words (linguistic units) that are synonymous with, related in meaning to, or related in structure to a given word (unit). From this point of view, the properties of word fields are thus primarily of a topological nature. It does not matter which author one follows in trying to capture the field concept in language: it is always neighbourhood relations and underlying tolerance relations that are inherent as the formal, that is, mathematical core. Remarks on the possibility of an algebraic-topological characterization of word field systems conclude part 3. In part 4 we introduce the concept of the k-tolerance relation. Relations of this kind stand, in a certain sense, between equivalence relations and tolerance relations. Model building on the basis of k-tolerance relations perhaps comes closest to the reality of language. For the problem area of synonymy, their introduction means that one now demands neither the validity of strict equivalence nor the weak condition of indistinguishability, but only a k-indistinguishability. In relation-theoretic terms this corresponds to the requirement of a weakened transitivity. In part 5 we indicate how the definition schemata for synonyms given in part 3 can be sharpened on the basis of k-tolerance spaces.
  16. Fillmore, C.J.: ¬The case for case (1968) 0.01
    0.0050334414 = product of:
      0.015100324 = sum of:
        0.015100324 = weight(_text_:in in 35) [ClassicSimilarity], result of:
          0.015100324 = score(doc=35,freq=4.0), product of:
            0.07104705 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.052230705 = queryNorm
            0.21253976 = fieldWeight in 35, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.078125 = fieldNorm(doc=35)
      0.33333334 = coord(1/3)
    
    Footnote
    German translation ('Plädoyer für Kasus') in: Kasustheorie. Ed.: W. Abraham. Frankfurt: Athenäum 1971 (Schwerpunkte Linguistik und Kommunikationswissenschaft; Bd.2) S.1-118.
    Source
    Universals in language. Ed.: E. Bach u. R.T. Harms
  17. Glasersfeld, E. von: ¬Die semantische Analyse von Verben auf der Grundlage begrifflicher Situationen (1987) 0.01
    0.0050334414 = product of:
      0.015100324 = sum of:
        0.015100324 = weight(_text_:in in 37) [ClassicSimilarity], result of:
          0.015100324 = score(doc=37,freq=4.0), product of:
            0.07104705 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.052230705 = queryNorm
            0.21253976 = fieldWeight in 37, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.078125 = fieldNorm(doc=37)
      0.33333334 = coord(1/3)
    
    Footnote
    Original: 'Semantic analysis of verbs in terms of conceptual situations' in: Linguistics 94(1972) S.90-107
  18. Johnson, F.C.; Paice, C.D.; Black, W.J.; Neal, A.P.: ¬The application of linguistic processing to automatic abstract generation (1993) 0.01
    0.0050334414 = product of:
      0.015100324 = sum of:
        0.015100324 = weight(_text_:in in 2290) [ClassicSimilarity], result of:
          0.015100324 = score(doc=2290,freq=4.0), product of:
            0.07104705 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.052230705 = queryNorm
            0.21253976 = fieldWeight in 2290, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.078125 = fieldNorm(doc=2290)
      0.33333334 = coord(1/3)
    
    Footnote
    Reprinted in: Readings in information retrieval. Ed.: K. Sparck Jones u. P. Willett. San Francisco: Morgan Kaufmann 1997. S.538-552.
  19. Krömmelbein, U.: Schlagwort-Syntax : linguistische und fachwissenschaftliche Gesichtspunkte. Eine vergleichende Untersuchung der Regeln für die Schlagwortvergabe der Deutschen Bibliothek, RSWK, Voll-PRECIS und Kurz-PRECIS (1983) 0.01
    0.0050334414 = product of:
      0.015100324 = sum of:
        0.015100324 = weight(_text_:in in 2566) [ClassicSimilarity], result of:
          0.015100324 = score(doc=2566,freq=4.0), product of:
            0.07104705 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.052230705 = queryNorm
            0.21253976 = fieldWeight in 2566, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.078125 = fieldNorm(doc=2566)
      0.33333334 = coord(1/3)
    
    Footnote
    Thesis for the Höherer Dienst at the FHBD in Köln. - Also published in: Bibliothek: Forschung und Praxis 8(1984) S.159-203
  20. Beaugrande, R.A. de; Dressler, W.U.: Einführung in die Textlinguistik (1981) 0.00
    0.0049828524 = product of:
      0.014948557 = sum of:
        0.014948557 = weight(_text_:in in 6226) [ClassicSimilarity], result of:
          0.014948557 = score(doc=6226,freq=2.0), product of:
            0.07104705 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.052230705 = queryNorm
            0.21040362 = fieldWeight in 6226, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.109375 = fieldNorm(doc=6226)
      0.33333334 = coord(1/3)
    

Languages

  • e 22
  • d 19
  • f 1

Types

  • m 22
  • a 19
  • s 3
  • x 1

Classifications