Search (72 results, page 1 of 4)

  • theme_ss:"Formale Begriffsanalyse"
  1. Eklund. P.W.: Logic-based networks : concept graphs and conceptual structures (2000) 0.04
    0.039221488 = product of:
      0.17257454 = sum of:
        0.08073681 = weight(_text_:lecture in 5084) [ClassicSimilarity], result of:
          0.08073681 = score(doc=5084,freq=4.0), product of:
            0.13308205 = queryWeight, product of:
              6.4711404 = idf(docFreq=185, maxDocs=44218)
              0.02056547 = queryNorm
            0.6066694 = fieldWeight in 5084, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              6.4711404 = idf(docFreq=185, maxDocs=44218)
              0.046875 = fieldNorm(doc=5084)
        0.05503029 = weight(_text_:notes in 5084) [ClassicSimilarity], result of:
          0.05503029 = score(doc=5084,freq=4.0), product of:
            0.10987139 = queryWeight, product of:
              5.3425174 = idf(docFreq=574, maxDocs=44218)
              0.02056547 = queryNorm
            0.500861 = fieldWeight in 5084, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              5.3425174 = idf(docFreq=574, maxDocs=44218)
              0.046875 = fieldNorm(doc=5084)
        0.0043691397 = weight(_text_:in in 5084) [ClassicSimilarity], result of:
          0.0043691397 = score(doc=5084,freq=6.0), product of:
            0.027974274 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.02056547 = queryNorm
            0.1561842 = fieldWeight in 5084, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.046875 = fieldNorm(doc=5084)
        0.025749456 = weight(_text_:computer in 5084) [ClassicSimilarity], result of:
          0.025749456 = score(doc=5084,freq=4.0), product of:
            0.0751567 = queryWeight, product of:
              3.6545093 = idf(docFreq=3109, maxDocs=44218)
              0.02056547 = queryNorm
            0.34261024 = fieldWeight in 5084, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.6545093 = idf(docFreq=3109, maxDocs=44218)
              0.046875 = fieldNorm(doc=5084)
        0.00668884 = product of:
          0.01337768 = sum of:
            0.01337768 = weight(_text_:science in 5084) [ClassicSimilarity], result of:
              0.01337768 = score(doc=5084,freq=4.0), product of:
                0.0541719 = queryWeight, product of:
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.02056547 = queryNorm
                0.24694869 = fieldWeight in 5084, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.046875 = fieldNorm(doc=5084)
          0.5 = coord(1/2)
      0.22727273 = coord(5/22)
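The explain tree above is standard Lucene ClassicSimilarity output: each term leaf multiplies queryWeight (idf × queryNorm) by fieldWeight (√tf × idf × fieldNorm), and the final coord(5/22) factor scales the sum because 5 of 22 query clauses matched. A minimal sketch reproducing the `_text_:lecture` leaf for doc 5084 from the numbers shown:

```python
import math

def classic_similarity(freq, idf, query_norm, field_norm):
    """Lucene ClassicSimilarity leaf score: queryWeight * fieldWeight."""
    tf = math.sqrt(freq)                  # tf(freq) = sqrt(freq)
    query_weight = idf * query_norm       # idf * queryNorm
    field_weight = tf * idf * field_norm  # tf * idf * fieldNorm
    return query_weight * field_weight

# numbers taken from the explain output for _text_:lecture in doc 5084
score = classic_similarity(freq=4.0, idf=6.4711404,
                           query_norm=0.02056547, field_norm=0.046875)
print(score)  # matches the 0.08073681 leaf above
```

The other leaves (notes, in, computer, science) follow the same pattern with their own idf and freq values.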
    
    Abstract
    Logic-based networks are semantic networks that support reasoning capabilities. In this paper, knowledge processing within logic-based networks is viewed as three stages. The first stage involves the formation of concepts and relations: the basic primitives with which we wish to formulate knowledge. The second stage involves the formation of well-formed formulas (wffs) that express knowledge about the primitive concepts and relations once they are isolated. The final stage involves efficiently processing the wffs to the desired end. Our research involves each of these steps as they relate to Sowa's conceptual structures and Wille's concept lattices. Formal Concept Analysis gives us the capability to perform concept formation via symbolic machine learning. Concept(ual) Graphs provide a means to describe relational properties between primitive concept and relation types. Finally, techniques from other areas of computer science are required to compute logic-based networks efficiently. This paper illustrates the three stages of knowledge processing in practical terms using examples from our research.
    Series
    Lecture notes in computer science; vol.1867: Lecture notes in artificial intelligence
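The abstract's claim that Formal Concept Analysis performs concept formation can be made concrete on a toy formal context (the objects and attributes below are our own illustration, not from the paper): a formal concept is a pair (A, B) with extent(B) = A and intent(A) = B.

```python
from itertools import combinations

# a toy formal context: objects mapped to their attribute sets
context = {
    "sparrow": {"flies", "bird"},
    "penguin": {"bird", "swims"},
    "trout":   {"swims"},
}
attributes = set().union(*context.values())

def extent(intent_set):
    """Objects possessing every attribute in intent_set."""
    return {g for g, attrs in context.items() if intent_set <= attrs}

def intent(objects):
    """Attributes shared by every object in objects."""
    return set.intersection(*(context[g] for g in objects)) if objects else set(attributes)

def concepts():
    """Brute force: (A, B) is a formal concept iff extent(B) = A and intent(A) = B."""
    found = []
    for r in range(len(attributes) + 1):
        for b in combinations(sorted(attributes), r):
            B = set(b)
            A = extent(B)
            if intent(A) == B:
                found.append((frozenset(A), frozenset(B)))
    return found

for A, B in concepts():
    print(sorted(A), sorted(B))
```

Brute force is only viable for tiny contexts; real FCA tools use dedicated enumeration algorithms instead.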
  2. Burmeister, P.; Holzer, R.: On the treatment of incomplete knowledge in formal concept analysis (2000) 0.04
    0.03769073 = product of:
      0.16583921 = sum of:
        0.08073681 = weight(_text_:lecture in 5085) [ClassicSimilarity], result of:
          0.08073681 = score(doc=5085,freq=4.0), product of:
            0.13308205 = queryWeight, product of:
              6.4711404 = idf(docFreq=185, maxDocs=44218)
              0.02056547 = queryNorm
            0.6066694 = fieldWeight in 5085, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              6.4711404 = idf(docFreq=185, maxDocs=44218)
              0.046875 = fieldNorm(doc=5085)
        0.05503029 = weight(_text_:notes in 5085) [ClassicSimilarity], result of:
          0.05503029 = score(doc=5085,freq=4.0), product of:
            0.10987139 = queryWeight, product of:
              5.3425174 = idf(docFreq=574, maxDocs=44218)
              0.02056547 = queryNorm
            0.500861 = fieldWeight in 5085, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              5.3425174 = idf(docFreq=574, maxDocs=44218)
              0.046875 = fieldNorm(doc=5085)
        0.007134775 = weight(_text_:in in 5085) [ClassicSimilarity], result of:
          0.007134775 = score(doc=5085,freq=16.0), product of:
            0.027974274 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.02056547 = queryNorm
            0.25504774 = fieldWeight in 5085, product of:
              4.0 = tf(freq=16.0), with freq of:
                16.0 = termFreq=16.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.046875 = fieldNorm(doc=5085)
        0.018207615 = weight(_text_:computer in 5085) [ClassicSimilarity], result of:
          0.018207615 = score(doc=5085,freq=2.0), product of:
            0.0751567 = queryWeight, product of:
              3.6545093 = idf(docFreq=3109, maxDocs=44218)
              0.02056547 = queryNorm
            0.24226204 = fieldWeight in 5085, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.6545093 = idf(docFreq=3109, maxDocs=44218)
              0.046875 = fieldNorm(doc=5085)
        0.0047297236 = product of:
          0.009459447 = sum of:
            0.009459447 = weight(_text_:science in 5085) [ClassicSimilarity], result of:
              0.009459447 = score(doc=5085,freq=2.0), product of:
                0.0541719 = queryWeight, product of:
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.02056547 = queryNorm
                0.17461908 = fieldWeight in 5085, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.046875 = fieldNorm(doc=5085)
          0.5 = coord(1/2)
      0.22727273 = coord(5/22)
    
    Abstract
    Some possible treatments of incomplete knowledge in conceptual data representation, data analysis and knowledge acquisition are presented. In particular, some ways of conceptual scaling as well as the role of the three-valued Kleene logic are briefly investigated. This logic also forms part of the background of attribute exploration, a conceptual tool for knowledge acquisition. For this method, a strategy is given to obtain as much (attribute) implicational knowledge about a given "universe" as possible; and we show how to represent incomplete knowledge so as to pin down the questions that still have to be answered in order to obtain complete knowledge in this situation.
    Series
    Lecture notes in computer science; vol.1867: Lecture notes in artificial intelligence
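The three-valued Kleene logic the abstract mentions can be sketched directly, using `None` for the third truth value "unknown" (a hedged illustration, not the paper's formalism):

```python
# strong Kleene connectives over {True, False, None}, with None = "unknown"
def k_not(a):
    return None if a is None else not a

def k_and(a, b):
    if a is False or b is False:  # one definite False decides the conjunction
        return False
    if a is None or b is None:
        return None
    return True

def k_or(a, b):
    return k_not(k_and(k_not(a), k_not(b)))  # derived via De Morgan

print(k_and(True, None))  # still unknown
print(k_or(True, None))   # True regardless of the unknown operand
```

This is why the logic suits incomplete contexts: a definite answer can sometimes be derived even while some attribute values are unknown.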
  3. Ganter, B.: Computing with conceptual structures (2000) 0.04
    0.03735113 = product of:
      0.16434497 = sum of:
        0.08073681 = weight(_text_:lecture in 5088) [ClassicSimilarity], result of:
          0.08073681 = score(doc=5088,freq=4.0), product of:
            0.13308205 = queryWeight, product of:
              6.4711404 = idf(docFreq=185, maxDocs=44218)
              0.02056547 = queryNorm
            0.6066694 = fieldWeight in 5088, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              6.4711404 = idf(docFreq=185, maxDocs=44218)
              0.046875 = fieldNorm(doc=5088)
        0.05503029 = weight(_text_:notes in 5088) [ClassicSimilarity], result of:
          0.05503029 = score(doc=5088,freq=4.0), product of:
            0.10987139 = queryWeight, product of:
              5.3425174 = idf(docFreq=574, maxDocs=44218)
              0.02056547 = queryNorm
            0.500861 = fieldWeight in 5088, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              5.3425174 = idf(docFreq=574, maxDocs=44218)
              0.046875 = fieldNorm(doc=5088)
        0.005640535 = weight(_text_:in in 5088) [ClassicSimilarity], result of:
          0.005640535 = score(doc=5088,freq=10.0), product of:
            0.027974274 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.02056547 = queryNorm
            0.20163295 = fieldWeight in 5088, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.046875 = fieldNorm(doc=5088)
        0.018207615 = weight(_text_:computer in 5088) [ClassicSimilarity], result of:
          0.018207615 = score(doc=5088,freq=2.0), product of:
            0.0751567 = queryWeight, product of:
              3.6545093 = idf(docFreq=3109, maxDocs=44218)
              0.02056547 = queryNorm
            0.24226204 = fieldWeight in 5088, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.6545093 = idf(docFreq=3109, maxDocs=44218)
              0.046875 = fieldNorm(doc=5088)
        0.0047297236 = product of:
          0.009459447 = sum of:
            0.009459447 = weight(_text_:science in 5088) [ClassicSimilarity], result of:
              0.009459447 = score(doc=5088,freq=2.0), product of:
                0.0541719 = queryWeight, product of:
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.02056547 = queryNorm
                0.17461908 = fieldWeight in 5088, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.046875 = fieldNorm(doc=5088)
          0.5 = coord(1/2)
      0.22727273 = coord(5/22)
    
    Abstract
    We give an overview of the computational tools for conceptual structures that have emerged from the theory of Formal Concept Analysis, with emphasis on basic ideas rather than technical details. We describe what we mean by conceptual computations, and try to convince the reader that an elaborate formalization is a necessary precondition. Claiming that Formal Concept Analysis provides such a formal background, we present as examples two well-known algorithms in very simple pseudo code. These can be used for navigating in a lattice, thereby supporting some prototypical tasks of conceptual computation. We refer to some of the many more advanced methods, discuss how to compute with limited precision, and explain why in the case of incomplete knowledge the conceptual approach is more efficient than a combinatorial one. Utilizing this efficiency requires skillful use of the formalism. We present two results that lead in this direction.
    Series
    Lecture notes in computer science; vol.1867: Lecture notes in artificial intelligence
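One of the "well-known algorithms in very simple pseudo code" is presumably Ganter's own NextClosure algorithm, which enumerates all closed attribute sets (concept intents) in lectic order. A sketch under that assumption, on a toy context of our own:

```python
# toy context: object id -> attribute set (our own example, not the paper's)
context = {
    1: {"a", "b"},
    2: {"b", "c"},
    3: {"c"},
}
M = sorted(set().union(*context.values()))  # linearly ordered attribute set

def closure(B):
    """B'': attributes shared by all objects that have every attribute in B."""
    objs = [g for g, attrs in context.items() if B <= attrs]
    return set.intersection(*(context[g] for g in objs)) if objs else set(M)

def next_closure(A):
    """Lectically smallest closed set greater than A, or None if A is the last."""
    for i in range(len(M) - 1, -1, -1):
        m = M[i]
        if m in A:
            continue
        B = closure({x for x in A if M.index(x) < i} | {m})
        if all(M.index(x) >= i for x in B - A):  # no "illegal" smaller attribute added
            return B
    return None

# enumerate all concept intents in lectic order, starting from closure of the empty set
intents, A = [], closure(set())
while A is not None:
    intents.append(frozenset(A))
    A = next_closure(A)
print(len(intents))
```

The same loop works for any closure operator, which is what makes NextClosure reusable across FCA tasks such as attribute exploration.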
  4. Eklund, P.; Groh, B.; Stumme, G.; Wille, R.: ¬A conceptual-logic extension of TOSCANA (2000) 0.04
    0.037062176 = product of:
      0.16307357 = sum of:
        0.08073681 = weight(_text_:lecture in 5082) [ClassicSimilarity], result of:
          0.08073681 = score(doc=5082,freq=4.0), product of:
            0.13308205 = queryWeight, product of:
              6.4711404 = idf(docFreq=185, maxDocs=44218)
              0.02056547 = queryNorm
            0.6066694 = fieldWeight in 5082, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              6.4711404 = idf(docFreq=185, maxDocs=44218)
              0.046875 = fieldNorm(doc=5082)
        0.05503029 = weight(_text_:notes in 5082) [ClassicSimilarity], result of:
          0.05503029 = score(doc=5082,freq=4.0), product of:
            0.10987139 = queryWeight, product of:
              5.3425174 = idf(docFreq=574, maxDocs=44218)
              0.02056547 = queryNorm
            0.500861 = fieldWeight in 5082, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              5.3425174 = idf(docFreq=574, maxDocs=44218)
              0.046875 = fieldNorm(doc=5082)
        0.0043691397 = weight(_text_:in in 5082) [ClassicSimilarity], result of:
          0.0043691397 = score(doc=5082,freq=6.0), product of:
            0.027974274 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.02056547 = queryNorm
            0.1561842 = fieldWeight in 5082, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.046875 = fieldNorm(doc=5082)
        0.018207615 = weight(_text_:computer in 5082) [ClassicSimilarity], result of:
          0.018207615 = score(doc=5082,freq=2.0), product of:
            0.0751567 = queryWeight, product of:
              3.6545093 = idf(docFreq=3109, maxDocs=44218)
              0.02056547 = queryNorm
            0.24226204 = fieldWeight in 5082, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.6545093 = idf(docFreq=3109, maxDocs=44218)
              0.046875 = fieldNorm(doc=5082)
        0.0047297236 = product of:
          0.009459447 = sum of:
            0.009459447 = weight(_text_:science in 5082) [ClassicSimilarity], result of:
              0.009459447 = score(doc=5082,freq=2.0), product of:
                0.0541719 = queryWeight, product of:
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.02056547 = queryNorm
                0.17461908 = fieldWeight in 5082, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.046875 = fieldNorm(doc=5082)
          0.5 = coord(1/2)
      0.22727273 = coord(5/22)
    
    Abstract
    The aim of this paper is to indicate how TOSCANA may be extended to allow graphical representations not only of concept lattices but also of concept graphs in the sense of Contextual Logic. The contextual-logic extension of TOSCANA requires the logical scaling of conceptual and relational scales, for which we propose the Peircean Algebraic Logic as reconstructed by R. W. Burch. As graphical representations we recommend, besides labelled line diagrams of concept lattices and Sowa's diagrams of conceptual graphs, particular information maps for utilizing background knowledge as much as possible. Our considerations are illustrated by a small information system about the domestic flights in Austria.
    Series
    Lecture notes in computer science; vol.1867: Lecture notes in artificial intelligence
  5. Hereth, J.; Stumme, G.; Wille, R.; Wille, U.: Conceptual knowledge discovery and data analysis (2000) 0.03
    0.031490915 = product of:
      0.13856001 = sum of:
        0.06728067 = weight(_text_:lecture in 5083) [ClassicSimilarity], result of:
          0.06728067 = score(doc=5083,freq=4.0), product of:
            0.13308205 = queryWeight, product of:
              6.4711404 = idf(docFreq=185, maxDocs=44218)
              0.02056547 = queryNorm
            0.50555784 = fieldWeight in 5083, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              6.4711404 = idf(docFreq=185, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5083)
        0.045858577 = weight(_text_:notes in 5083) [ClassicSimilarity], result of:
          0.045858577 = score(doc=5083,freq=4.0), product of:
            0.10987139 = queryWeight, product of:
              5.3425174 = idf(docFreq=574, maxDocs=44218)
              0.02056547 = queryNorm
            0.41738418 = fieldWeight in 5083, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              5.3425174 = idf(docFreq=574, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5083)
        0.0063063093 = weight(_text_:in in 5083) [ClassicSimilarity], result of:
          0.0063063093 = score(doc=5083,freq=18.0), product of:
            0.027974274 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.02056547 = queryNorm
            0.22543246 = fieldWeight in 5083, product of:
              4.2426405 = tf(freq=18.0), with freq of:
                18.0 = termFreq=18.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5083)
        0.015173013 = weight(_text_:computer in 5083) [ClassicSimilarity], result of:
          0.015173013 = score(doc=5083,freq=2.0), product of:
            0.0751567 = queryWeight, product of:
              3.6545093 = idf(docFreq=3109, maxDocs=44218)
              0.02056547 = queryNorm
            0.20188503 = fieldWeight in 5083, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.6545093 = idf(docFreq=3109, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5083)
        0.0039414368 = product of:
          0.0078828735 = sum of:
            0.0078828735 = weight(_text_:science in 5083) [ClassicSimilarity], result of:
              0.0078828735 = score(doc=5083,freq=2.0), product of:
                0.0541719 = queryWeight, product of:
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.02056547 = queryNorm
                0.1455159 = fieldWeight in 5083, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5083)
          0.5 = coord(1/2)
      0.22727273 = coord(5/22)
    
    Abstract
    In this paper, we discuss Conceptual Knowledge Discovery in Databases (CKDD) in its connection with Data Analysis. Our approach is based on Formal Concept Analysis, a mathematical theory which has been developed and has proven useful during the last 20 years. Formal Concept Analysis has led to a theory of conceptual information systems which has been applied, using the management system TOSCANA, in a wide range of domains. In this paper, we use such an application in database marketing to demonstrate how methods and procedures of CKDD can be applied in Data Analysis. In particular, we show the interplay and integration of data mining and data analysis techniques based on Formal Concept Analysis. The main concern of this paper is to explain how the transition from data to knowledge can be supported by a TOSCANA system. To clarify the transition steps, we discuss their correspondence to the five levels of knowledge representation established by R. Brachman and to the steps of empirically grounded theory building proposed by A. Strauss and J. Corbin.
    Series
    Lecture notes in computer science; vol.1867: Lecture notes in artificial intelligence
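A TOSCANA conceptual information system rests on conceptual scaling: many-valued data attributes are translated into binary scale attributes before concepts are computed. A hypothetical ordinal-scaling step (attribute names and cut-points are invented for illustration, not taken from the paper):

```python
# many-valued data: customer -> age (invented sample values)
customers = {"c1": 23, "c2": 41, "c3": 67}
thresholds = [30, 50]  # illustrative cut-points of an ordinal scale

def ordinal_scale(value):
    """Replace a numeric value by the binary scale attributes it satisfies."""
    return {f"age<={t}" for t in thresholds if value <= t}

# the scaled (one-valued) context that FCA algorithms then operate on
scaled = {g: ordinal_scale(v) for g, v in customers.items()}
print(scaled)
```

Nominal and interordinal scales work the same way with different predicates; the choice of scale is exactly where domain knowledge enters the analysis.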
  6. Groh, B.; Strahringer, S.; Wille, R.: TOSCANA-systems based on thesauri (1998) 0.02
    0.022391902 = product of:
      0.16420728 = sum of:
        0.095149234 = weight(_text_:lecture in 3084) [ClassicSimilarity], result of:
          0.095149234 = score(doc=3084,freq=2.0), product of:
            0.13308205 = queryWeight, product of:
              6.4711404 = idf(docFreq=185, maxDocs=44218)
              0.02056547 = queryNorm
            0.7149667 = fieldWeight in 3084, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              6.4711404 = idf(docFreq=185, maxDocs=44218)
              0.078125 = fieldNorm(doc=3084)
        0.064853825 = weight(_text_:notes in 3084) [ClassicSimilarity], result of:
          0.064853825 = score(doc=3084,freq=2.0), product of:
            0.10987139 = queryWeight, product of:
              5.3425174 = idf(docFreq=574, maxDocs=44218)
              0.02056547 = queryNorm
            0.59027034 = fieldWeight in 3084, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              5.3425174 = idf(docFreq=574, maxDocs=44218)
              0.078125 = fieldNorm(doc=3084)
        0.0042042066 = weight(_text_:in in 3084) [ClassicSimilarity], result of:
          0.0042042066 = score(doc=3084,freq=2.0), product of:
            0.027974274 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.02056547 = queryNorm
            0.15028831 = fieldWeight in 3084, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.078125 = fieldNorm(doc=3084)
      0.13636364 = coord(3/22)
    
    Series
    Lecture notes in artificial intelligence; vol.1453
  7. Conceptual structures : logical, linguistic, and computational issues. 8th International Conference on Conceptual Structures, ICCS 2000, Darmstadt, Germany, August 14-18, 2000 (2000) 0.02
    0.020455712 = product of:
      0.09000513 = sum of:
        0.040368404 = weight(_text_:lecture in 691) [ClassicSimilarity], result of:
          0.040368404 = score(doc=691,freq=4.0), product of:
            0.13308205 = queryWeight, product of:
              6.4711404 = idf(docFreq=185, maxDocs=44218)
              0.02056547 = queryNorm
            0.3033347 = fieldWeight in 691, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              6.4711404 = idf(docFreq=185, maxDocs=44218)
              0.0234375 = fieldNorm(doc=691)
        0.027515145 = weight(_text_:notes in 691) [ClassicSimilarity], result of:
          0.027515145 = score(doc=691,freq=4.0), product of:
            0.10987139 = queryWeight, product of:
              5.3425174 = idf(docFreq=574, maxDocs=44218)
              0.02056547 = queryNorm
            0.2504305 = fieldWeight in 691, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              5.3425174 = idf(docFreq=574, maxDocs=44218)
              0.0234375 = fieldNorm(doc=691)
        0.00398846 = weight(_text_:in in 691) [ClassicSimilarity], result of:
          0.00398846 = score(doc=691,freq=20.0), product of:
            0.027974274 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.02056547 = queryNorm
            0.14257601 = fieldWeight in 691, product of:
              4.472136 = tf(freq=20.0), with freq of:
                20.0 = termFreq=20.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0234375 = fieldNorm(doc=691)
        0.015768258 = weight(_text_:computer in 691) [ClassicSimilarity], result of:
          0.015768258 = score(doc=691,freq=6.0), product of:
            0.0751567 = queryWeight, product of:
              3.6545093 = idf(docFreq=3109, maxDocs=44218)
              0.02056547 = queryNorm
            0.20980507 = fieldWeight in 691, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.6545093 = idf(docFreq=3109, maxDocs=44218)
              0.0234375 = fieldNorm(doc=691)
        0.0023648618 = product of:
          0.0047297236 = sum of:
            0.0047297236 = weight(_text_:science in 691) [ClassicSimilarity], result of:
              0.0047297236 = score(doc=691,freq=2.0), product of:
                0.0541719 = queryWeight, product of:
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.02056547 = queryNorm
                0.08730954 = fieldWeight in 691, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.0234375 = fieldNorm(doc=691)
          0.5 = coord(1/2)
      0.22727273 = coord(5/22)
    
    Abstract
    Computer scientists create models of a perceived reality. Through AI techniques, these models aim at providing the basic support for emulating cognitive behavior such as reasoning and learning, which is one of the main goals of the AI research effort. Such computer models are formed through the interaction of various acquisition and inference mechanisms: perception, concept learning, conceptual clustering, hypothesis testing, probabilistic inference, etc., and are represented using different paradigms tightly linked to the processes that use them. Among these paradigms let us cite: biological models (neural nets, genetic programming), logic-based models (first-order logic, modal logic, rule-based systems), virtual reality models (object systems, agent systems), probabilistic models (Bayesian nets, fuzzy logic), linguistic models (conceptual dependency graphs, language-based representations), etc. One of the strengths of the Conceptual Graph (CG) theory is its versatility in terms of the representation paradigms under which it falls. It can be viewed, and therefore used, under different representation paradigms, which makes it a popular choice for a wealth of applications. Its full coupling with different cognitive processes has led to the opening of the field toward related research communities such as the Description Logic, Formal Concept Analysis, and Computational Linguistics communities. We now see more and more research results from one community enriching the others, laying the foundations of common philosophical grounds from which a successful synergy can emerge. ICCS 2000 embodies this spirit of research collaboration. It presents a set of papers that we believe, by their exposure, will benefit the whole community.
    For instance, the technical program proposes tracks on Conceptual Ontologies, Language, Formal Concept Analysis, Computational Aspects of Conceptual Structures, and Formal Semantics, with some papers on pragmatism and human-related aspects of computing. Never before was the program of ICCS formed by so heterogeneously rooted theories of knowledge representation and use. We hope that this swirl of ideas will benefit you as much as it already has benefited us while putting together this program.
    Content
    Concepts and Language: The Role of Conceptual Structure in Human Evolution (Keith Devlin) - Concepts in Linguistics - Concepts in Natural Language (Gisela Harras) - Patterns, Schemata, and Types: Author Support through Formalized Experience (Felix H. Gatzemeier) - Conventions and Notations for Knowledge Representation and Retrieval (Philippe Martin) - Conceptual Ontology: Ontology, Metadata, and Semiotics (John F. Sowa) - Pragmatically Yours (Mary Keeler) - Conceptual Modeling for Distributed Ontology Environments (Deborah L. McGuinness) - Discovery of Class Relations in Exception Structured Knowledge Bases (Hendra Suryanto, Paul Compton) - Conceptual Graphs: Perspectives: CGs Applications: Where Are We 7 Years after the First ICCS? (Michel Chein, David Genest) - The Engineering of a CG-Based System: Fundamental Issues (Guy W. Mineau) - Conceptual Graphs, Metamodeling, and Notation of Concepts (Olivier Gerbé, Guy W. Mineau, Rudolf K. Keller) - Knowledge Representation and Reasonings Based on Graph Homomorphism (Marie-Laure Mugnier) - User Modeling Using Conceptual Graphs for Intelligent Agents (James F. Baldwin, Trevor P. Martin, Aimilia Tzanavari) - Towards a Unified Querying System of Both Structured and Semi-structured Imprecise Data Using Fuzzy View (Patrice Buche, Ollivier Haemmerlé) - Formal Semantics of Conceptual Structures: The Extensional Semantics of the Conceptual Graph Formalism (Guy W. Mineau) - Semantics of Attribute Relations in Conceptual Graphs (Pavel Kocura) - Nested Concept Graphs and Triadic Power Context Families (Susanne Prediger) - Negations in Simple Concept Graphs (Frithjof Dau) - Extending the CG Model by Simulations (Jean-François Baget) - Contextual Logic and Formal Concept Analysis: Building and Structuring Description Logic Knowledge Bases Using Least Common Subsumers and Concept Analysis (Franz Baader, Ralf Molitor) - On the Contextual Logic of Ordinal Data (Silke Pollandt, Rudolf Wille) - Boolean Concept Logic (Rudolf Wille) - Lattices of Triadic Concept Graphs (Bernd Groh, Rudolf Wille) - Formalizing Hypotheses with Concepts (Bernhard Ganter, Sergei O. Kuznetsov) - Generalized Formal Concept Analysis (Laurent Chaudron, Nicolas Maille) - A Logical Generalization of Formal Concept Analysis (Sébastien Ferré, Olivier Ridoux) - On the Treatment of Incomplete Knowledge in Formal Concept Analysis (Peter Burmeister, Richard Holzer) - Conceptual Structures in Practice: Logic-Based Networks: Concept Graphs and Conceptual Structures (Peter W. Eklund) - Conceptual Knowledge Discovery and Data Analysis (Joachim Hereth, Gerd Stumme, Rudolf Wille, Uta Wille) - CEM - A Conceptual Email Manager (Richard Cole, Gerd Stumme) - A Contextual-Logic Extension of TOSCANA (Peter Eklund, Bernd Groh, Gerd Stumme, Rudolf Wille) - A Conceptual Graph Model for W3C Resource Description Framework (Olivier Corby, Rose Dieng, Cédric Hébert) - Computational Aspects of Conceptual Structures: Computing with Conceptual Structures (Bernhard Ganter) - Symmetry and the Computation of Conceptual Structures (Robert Levinson) - An Introduction to SNePS 3 (Stuart C. Shapiro) - Composition Norm Dynamics Calculation with Conceptual Graphs (Aldo de Moor) - From PROLOG++ to PROLOG+CG: A CG Object-Oriented Logic Programming Language (Adil Kabbaj, Martin Janta-Polczynski) - A Cost-Bounded Algorithm to Control Events Generalization (Gaël de Chalendar, Brigitte Grau, Olivier Ferret)
    Series
    Lecture notes in computer science; vol.1867: Lecture notes in artificial intelligence
  8. Priss, U.: Faceted information representation (2000) 0.01
    
    Abstract
    This paper presents an abstract formalization of the notion of "facets". Facets are relational structures of units, relations and other facets selected for a certain purpose. Facets can be used to structure large knowledge representation systems into a hierarchical arrangement of consistent and independent subsystems (facets) that facilitate flexibility and combinations of different viewpoints or aspects. This paper describes the basic notions, facet characteristics and construction mechanisms. It then explicates the theory in an example of a faceted information retrieval system (FaIR)
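The facet combination described in the abstract can be sketched in a few lines of Python. This is a minimal illustration under our own assumptions: the sample data, facet names, and the `search` helper are invented for the example and are not part of FaIR.

```python
# Minimal sketch of faceted retrieval: each facet maps a value to the set of
# documents carrying it; a query selects one value per facet and intersects
# the corresponding document sets. All names and data here are illustrative.

facets = {
    "topic":    {"fca": {1, 2, 4}, "retrieval": {2, 3}},
    "language": {"de": {1, 4}, "en": {2, 3}},
}

def search(selection):
    """Intersect the document sets for the chosen value of each selected facet."""
    results = None
    for facet, value in selection.items():
        docs = facets[facet].get(value, set())
        results = docs if results is None else results & docs
    return results or set()

print(search({"topic": "fca", "language": "en"}))  # -> {2}
```

Because each facet is an independent subsystem, viewpoints can be combined freely: adding or dropping a facet from the selection only adds or removes one intersection step.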
    Date
    22. 1.2016 17:47:06
    Series
    Berichte aus der Informatik
  9. Vogt, C.; Wille, R.: Formale Begriffsanalyse : Darstellung und Analyse von bibliographischen Daten (1994) 0.01
    
    Source
    Informations- und Wissensverarbeitung in den Sozialwissenschaften: Beiträge zur Umsetzung neuer Informationstechnologien. Hrsg.: H. Best u.a
  10. Priss, U.: Formal concept analysis in information science (2006) 0.01
    
    Date
    13. 7.2008 19:29:59
    Source
    Annual review of information science and technology. 40(2006), S.xxx-xxx
  11. Wille, R.: Begriffliche Wissensverarbeitung in der Wirtschaft (2002) 0.01
    
    Abstract
    Conceptual knowledge processing is committed to a pragmatic understanding of knowledge, according to which human knowledge arises and lives on in an open process of human thinking, arguing, and communicating. It is founded on a mathematical theory of concepts oriented toward the mutual interplay of the formal and the contentual. How this theoretical conception takes effect in business practice is explained by reference to the core processes of organizational knowledge management, i.e., following G. Probst et al., knowledge identification, knowledge acquisition, knowledge development, knowledge distribution, knowledge use, and knowledge preservation; for each, the application of specific methods of conceptual knowledge processing is demonstrated with an example. Finally, the processual interplay of knowledge goals and knowledge assessments with the core processes is discussed from the perspective of conceptual knowledge processing.
    Source
    Information - Wissenschaft und Praxis. 53(2002) H.3, S.149-160
  12. Kohler-Koch, B.; Vogt, F.: Normen- und regelgeleitete internationale Kooperationen : Formale Begriffsanalyse in der Politikwissenschaft (2000) 0.01
    
    Source
    Begriffliche Wissensverarbeitung: Methoden und Anwendungen, mit Beiträgen zahlreicher Fachwissenschaftler. Hrsg.: G. Stumme u. R. Wille
  13. Sander, C.; Schmiede, R.; Wille, R.: ¬Ein begriffliches Datensystem zur Literatur der interdisziplinären Technikforschung (1993) 0.01
    
    Abstract
    Conceptual data systems emerged within formal concept analysis and are based on mathematical formalizations of concept, concept system, and conceptual file. They make the knowledge contained in a database conceptually accessible and interpretable. To this end, conceptual relationships are displayed in nested line diagrams according to chosen query aspects. By refining, coarsening, and switching between concept structures, one can "navigate" without limit through the knowledge stored in the database. In a research project funded by the Center for Interdisciplinary Technology Research at TH Darmstadt, a prototype of a conceptual data system has been built whose data context is a selected, conceptually prepared collection of books on interdisciplinary technology research. This prototype is intended to demonstrate the flexible and versatile use of conceptual data systems in the literature domain.
    Source
    Vortrag, 17. Jahrestagung der Gesellschaft für Klassifikation, 3.-5.3.1993 in Kaiserslautern
  14. Begriffliche Wissensverarbeitung : Methoden und Anwendungen. Mit Beiträgen zahlreicher Fachwissenschaftler (2000) 0.01
    
    Abstract
    This book presents methods of conceptual knowledge processing and applications from a variety of fields of practice. The methods part introduces modern techniques of conceptual data analysis and knowledge processing. The second part is aimed primarily at potential users: selected applications illustrate the procedure for data analysis and information retrieval with the methods of conceptual knowledge processing and demonstrate their potential.
    Content
    Contains the contributions: GANTER, B.: Begriffe und Implikationen; BURMEISTER, P.: ConImp: Ein Programm zur Formalen Begriffsanalyse; LENGNINK, K.: Ähnlichkeit als Distanz in Begriffsverbänden; POLLANDT, S.: Datenanalyse mit Fuzzy-Begriffen; PREDIGER, S.: Terminologische Merkmalslogik in der Formalen Begriffsanalyse; WILLE, R. u. M. ZICKWOLFF: Grundlagen einer Triadischen Begriffsanalyse; LINDIG, C. u. G. SNELTING: Formale Begriffsanalyse im Software Engineering; STRACK, H. u. M. SKORSKY: Zugriffskontrolle bei Programmsystemen und im Datenschutz mittels Formaler Begriffsanalyse; ANDELFINGER, U.: Inhaltliche Erschließung des Bereichs 'Sozialorientierte Gestaltung von Informationstechnik': Ein begriffsanalytischer Ansatz; GÖDERT, W.: Wissensdarstellung in Informationssystemen, Fragetypen und Anforderungen an Retrievalkomponenten; ROCK, T. u. R. WILLE: Ein TOSCANA-Erkundungssystem zur Literatursuche; ESCHENFELDER, D. u.a.: Ein Erkundungssystem zum Baurecht: Methoden der Entwicklung eines TOSCANA-Systems; GROßKOPF, A. u. G. HARRAS: Begriffliche Erkundung semantischer Strukturen von Sprechaktverben; ZELGER, J.: Grundwerte, Ziele und Maßnahmen in einem regionalen Krankenhaus: Eine Anwendung des Verfahrens GABEK; KOHLER-KOCH, B. u. F. VOGT: Normen- und regelgeleitete internationale Kooperationen: Formale Begriffsanalyse in der Politikwissenschaft; HENNING, H.J. u. W. KEMMNITZ: Entwicklung eines kontextuellen Methodenkonzeptes mit Hilfe der Formalen Begriffsanalyse an Beispielen zum Risikoverständnis; BARTEL, H.-G.: Über Möglichkeiten der Formalen Begriffsanalyse in der Mathematischen Archäochemie
  15. Vogt, F.; Wille, R.: TOSCANA - a graphical tool for analyzing and exploring data (1995) 0.01
    
    Abstract
    TOSCANA is a computer program which allows an online interaction with larger data bases to analyse and explore data conceptually. It uses labelled line diagrams of concept lattices to communicate knowledge coded in given data. The basic problem to create online presentations of concept lattices is solved by composing prepared diagrams to nested line diagrams. A larger number of applications in different areas have already shown that TOSCANA is a useful tool for many purposes
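The concept lattices TOSCANA visualizes can be computed from a binary context by naive closure enumeration. The following Python sketch uses a toy context of our own; the object and attribute names are illustrative assumptions, not data from the paper.

```python
from itertools import combinations

# Toy formal context: a binary incidence relation between objects and attributes.
objects = ["pond", "river", "sea"]
attributes = ["natural", "flowing", "salty"]
incidence = {
    ("pond", "natural"),
    ("river", "natural"), ("river", "flowing"),
    ("sea", "natural"), ("sea", "salty"),
}

def intent(objs):
    """Attributes shared by every object in objs."""
    return frozenset(a for a in attributes
                     if all((o, a) in incidence for o in objs))

def extent(attrs):
    """Objects carrying every attribute in attrs."""
    return frozenset(o for o in objects
                     if all((o, a) in incidence for a in attrs))

def concepts():
    """Enumerate all formal concepts (extent, intent) by brute-force closure."""
    found = set()
    for r in range(len(objects) + 1):
        for objs in combinations(objects, r):
            a = extent(intent(objs))  # close the object set
            found.add((a, intent(a)))
    return found

for a, b in sorted(concepts(), key=lambda c: len(c[0])):
    print(sorted(a), "->", sorted(b))
```

Each resulting pair is one node of a line diagram; ordering the extents by inclusion yields the lattice that TOSCANA renders as nested diagrams.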
    Source
    Knowledge organization. 22(1995) no.2, S.78-81
  16. Helmerich, M.: Liniendiagramme in der Wissenskommunikation : eine mathematisch-didaktische Untersuchung (2011) 0.01
    
    Abstract
    The communication of knowledge occupies a decisive place in the modern knowledge society. Communication in Habermas's sense of an intersubjective process of reaching understanding is, however, more than a mere exchange of signs: it is about sense and meaning and about negotiating how we, as a community of communicators, interpret signs and encode information in them. Line diagrams from the theory of formal concept analysis are particularly well suited as a medium for such communication processes. These line diagrams are suitable not only for supporting knowledge communication but also for initiating communication processes in the first place. They support knowledge communication well because their simplicity, order, concision, and complementary stimulus foster understanding of the knowledge-generating information. Moreover, line diagrams provide a means of communication that can act across and between disciplines, opening up fields of knowledge to different disciplines, since the diagrams succeed in working out the general, underlying logical structure by means of a mathematically grounded procedure. Line diagrams not only present fields of knowledge in an ordered, structured form but also employ formal concepts, thereby connecting to concepts as objects of human thought. In a concept, a selection of the objects under consideration (in the example, the various types of bodies of water) merges with their shared attributes into new units of thought, giving knowledge a form in which communication about these units of thought and the information concentrated in them becomes possible.
  17. Zickwolff, M.: Zur Rolle der Formalen Begriffsanalyse in der Wissensakquisition (1994) 0.01
    
    Source
    Begriffliche Wissensverarbeitung: Grundfragen und Aufgaben. Hrsg.: R. Wille u. M. Zickwolff
  18. Lengnink, K.: Ähnlichkeit als Distanz in Begriffsverbänden (2000) 0.01
    
    Source
    Begriffliche Wissensverarbeitung: Methoden und Anwendungen, mit Beiträgen zahlreicher Fachwissenschaftler. Hrsg.: G. Stumme u. R. Wille
  19. Prediger, S.: Terminologische Merkmalslogik in der Formalen Begriffsanalyse (2000) 0.01
    
    Source
    Begriffliche Wissensverarbeitung: Methoden und Anwendungen, mit Beiträgen zahlreicher Fachwissenschaftler. Hrsg.: G. Stumme u. R. Wille
  20. Wille, R.: Lattices in data analysis : how to draw them with a computer (1989) 0.01
    

Years

Languages

  • d 36
  • e 35

Types

  • a 58
  • m 8
  • p 3
  • s 3
  • el 1
  • r 1
  • x 1