Search (34 results, page 1 of 2)

  • × theme_ss:"Formale Begriffsanalyse"
  • × language_ss:"e"
  1. Prediger, S.: Kontextuelle Urteilslogik mit Begriffsgraphen : Ein Beitrag zur Restrukturierung der mathematischen Logik (1998) 0.01
    0.012832299 = product of:
      0.038496897 = sum of:
        0.008924231 = weight(_text_:in in 3142) [ClassicSimilarity], result of:
          0.008924231 = score(doc=3142,freq=2.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.15028831 = fieldWeight in 3142, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.078125 = fieldNorm(doc=3142)
        0.029572664 = product of:
          0.059145328 = sum of:
            0.059145328 = weight(_text_:22 in 3142) [ClassicSimilarity], result of:
              0.059145328 = score(doc=3142,freq=2.0), product of:
                0.15286934 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043654136 = queryNorm
                0.38690117 = fieldWeight in 3142, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.078125 = fieldNorm(doc=3142)
          0.5 = coord(1/2)
      0.33333334 = coord(2/6)
    
    Date
    26. 2.2008 15:58:22
    Footnote
Review in: KO 26(1999) no.3, S.175-176 (R. Wille)
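    The relevance figures in this listing are Lucene ClassicSimilarity (TF-IDF) explain output. As a minimal sketch, the Python snippet below recombines the factors shown for entry 1 (doc 3142); the numeric constants are copied from the breakdown above, while the function and variable names are purely illustrative and not the search engine's actual code.
      import math

      def classic_term_weight(freq, idf, query_norm, field_norm):
          """Recombine one term's weight as ClassicSimilarity reports it:
          weight = queryWeight * fieldWeight
                 = (idf * queryNorm) * (sqrt(freq) * idf * fieldNorm)."""
          query_weight = idf * query_norm
          field_weight = math.sqrt(freq) * idf * field_norm
          return query_weight * field_weight

      QUERY_NORM = 0.043654136      # queryNorm, shared by all query terms
      FIELD_NORM = 0.078125         # fieldNorm(doc=3142)

      w_in = classic_term_weight(2.0, 1.3602545, QUERY_NORM, FIELD_NORM)  # ~0.00892
      w_22 = classic_term_weight(2.0, 3.5018296, QUERY_NORM, FIELD_NORM)  # ~0.05915

      # The "22" clause matched 1 of 2 sub-clauses (coord 1/2); the whole
      # query matched 2 of 6 clauses (coord 2/6).
      score = (w_in + w_22 * 0.5) * (2.0 / 6.0)
      print(score)   # ~0.0128323, the 0.012832299 shown above, up to rounding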
  2. Vogt, F.; Wille, R.: TOSCANA - a graphical tool for analyzing and exploring data (1995) 0.01
    0.011251582 = product of:
      0.033754744 = sum of:
        0.010096614 = weight(_text_:in in 1901) [ClassicSimilarity], result of:
          0.010096614 = score(doc=1901,freq=4.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.17003182 = fieldWeight in 1901, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0625 = fieldNorm(doc=1901)
        0.02365813 = product of:
          0.04731626 = sum of:
            0.04731626 = weight(_text_:22 in 1901) [ClassicSimilarity], result of:
              0.04731626 = score(doc=1901,freq=2.0), product of:
                0.15286934 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043654136 = queryNorm
                0.30952093 = fieldWeight in 1901, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=1901)
          0.5 = coord(1/2)
      0.33333334 = coord(2/6)
    
    Abstract
TOSCANA is a computer program which allows online interaction with larger databases to analyse and explore data conceptually. It uses labelled line diagrams of concept lattices to communicate knowledge coded in given data. The basic problem of creating online presentations of concept lattices is solved by composing prepared diagrams into nested line diagrams. A large number of applications in different areas has already shown that TOSCANA is a useful tool for many purposes.
    Source
    Knowledge organization. 22(1995) no.2, S.78-81
  3. Kent, R.E.: Implications and rules in thesauri (1994) 0.01
    0.011077631 = product of:
      0.033232894 = sum of:
        0.014278769 = weight(_text_:in in 3457) [ClassicSimilarity], result of:
          0.014278769 = score(doc=3457,freq=8.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.24046129 = fieldWeight in 3457, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0625 = fieldNorm(doc=3457)
        0.018954126 = weight(_text_:und in 3457) [ClassicSimilarity], result of:
          0.018954126 = score(doc=3457,freq=2.0), product of:
            0.09675359 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.043654136 = queryNorm
            0.19590102 = fieldWeight in 3457, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=3457)
      0.33333334 = coord(2/6)
    
    Abstract
A central consideration in the study of whole language semantic space as encoded in thesauri is word sense comparability. Shows how word sense comparability can be adequately expressed by the logical implications and rules from Formal Concept Analysis. Formal Concept Analysis, a new approach to formal logic initiated by Rudolf Wille, has been used for data modelling, analysis and interpretation, and also for knowledge representation and knowledge discovery.
    Series
    Advances in knowledge organization; vol.4
    Theme
    Konzeption und Anwendung des Prinzips Thesaurus
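    Entry 3 rests on attribute implications from Formal Concept Analysis. As a rough sketch (the toy context and the term names are invented here, not taken from Kent's paper), an implication A -> B holds in a formal context exactly when every object that has all attributes of A also has all attributes of B:
      # Toy formal context: thesaurus terms (objects) and their features (attributes).
      context = {
          "bank_river":   {"landform", "water-related"},
          "bank_finance": {"institution", "money-related"},
          "stream":       {"landform", "water-related"},
      }

      def implication_holds(context, premise, conclusion):
          """A -> B holds iff every object carrying all attributes of the
          premise also carries all attributes of the conclusion."""
          return all(conclusion <= attrs
                     for attrs in context.values() if premise <= attrs)

      print(implication_holds(context, {"landform"}, {"water-related"}))    # True
      print(implication_holds(context, {"water-related"}, {"institution"})) # False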
  4. Priss, U.: Faceted knowledge representation (1999) 0.01
    0.009845134 = product of:
      0.029535402 = sum of:
        0.008834538 = weight(_text_:in in 2654) [ClassicSimilarity], result of:
          0.008834538 = score(doc=2654,freq=4.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.14877784 = fieldWeight in 2654, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0546875 = fieldNorm(doc=2654)
        0.020700864 = product of:
          0.04140173 = sum of:
            0.04140173 = weight(_text_:22 in 2654) [ClassicSimilarity], result of:
              0.04140173 = score(doc=2654,freq=2.0), product of:
                0.15286934 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043654136 = queryNorm
                0.2708308 = fieldWeight in 2654, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=2654)
          0.5 = coord(1/2)
      0.33333334 = coord(2/6)
    
    Abstract
Faceted Knowledge Representation provides a formalism for implementing knowledge systems. The basic notions of faceted knowledge representation are "unit", "relation", "facet" and "interpretation". Units are atomic elements and can be abstract elements or refer to external objects in an application. Relations are sequences or matrices of 0s and 1s (binary matrices). Facets are relational structures that combine units and relations. Each facet represents an aspect or viewpoint of a knowledge system. Interpretations are mappings that can be used to translate between different representations. This paper introduces the basic notions of faceted knowledge representation. The formalism is applied here to an abstract modeling of a faceted thesaurus as used in information retrieval.
    Date
    22. 1.2016 17:30:31
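    A minimal sketch of the four notions named in the abstract of entry 4 (unit, relation, facet, interpretation) follows; the class name, the broader-term example and the German translations are assumptions chosen only for illustration:
      from dataclasses import dataclass

      @dataclass
      class Facet:
          """A facet combines units with a binary relation over them
          (a 0/1 matrix indexed by unit positions)."""
          name: str
          units: list
          relation: list            # relation[i][j] == 1 links units[i] to units[j]

          def related(self, a, b):
              i, j = self.units.index(a), self.units.index(b)
              return self.relation[i][j] == 1

      # One facet of a faceted thesaurus: a broader-term relation over three units.
      bt = Facet(name="broader-term",
                 units=["poodle", "dog", "animal"],
                 relation=[[0, 1, 0],
                           [0, 0, 1],
                           [0, 0, 0]])

      # An interpretation maps the units of one representation onto another.
      interpretation = {"poodle": "Pudel", "dog": "Hund", "animal": "Tier"}

      print(bt.related("poodle", "dog"))   # True
      print(interpretation["dog"])         # Hund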
  5. Priss, U.: Faceted information representation (2000) 0.01
    0.008982609 = product of:
      0.026947826 = sum of:
        0.006246961 = weight(_text_:in in 5095) [ClassicSimilarity], result of:
          0.006246961 = score(doc=5095,freq=2.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.10520181 = fieldWeight in 5095, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5095)
        0.020700864 = product of:
          0.04140173 = sum of:
            0.04140173 = weight(_text_:22 in 5095) [ClassicSimilarity], result of:
              0.04140173 = score(doc=5095,freq=2.0), product of:
                0.15286934 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043654136 = queryNorm
                0.2708308 = fieldWeight in 5095, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5095)
          0.5 = coord(1/2)
      0.33333334 = coord(2/6)
    
    Abstract
    This paper presents an abstract formalization of the notion of "facets". Facets are relational structures of units, relations and other facets selected for a certain purpose. Facets can be used to structure large knowledge representation systems into a hierarchical arrangement of consistent and independent subsystems (facets) that facilitate flexibility and combinations of different viewpoints or aspects. This paper describes the basic notions, facet characteristics and construction mechanisms. It then explicates the theory in an example of a faceted information retrieval system (FaIR)
    Date
    22. 1.2016 17:47:06
  6. Burmeister, P.; Holzer, R.: On the treatment of incomplete knowledge in formal concept analysis (2000) 0.00
    0.0025241538 = product of:
      0.015144923 = sum of:
        0.015144923 = weight(_text_:in in 5085) [ClassicSimilarity], result of:
          0.015144923 = score(doc=5085,freq=16.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.25504774 = fieldWeight in 5085, product of:
              4.0 = tf(freq=16.0), with freq of:
                16.0 = termFreq=16.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.046875 = fieldNorm(doc=5085)
      0.16666667 = coord(1/6)
    
    Abstract
Some possible treatments of incomplete knowledge in conceptual data representation, data analysis and knowledge acquisition are presented. In particular, some ways of conceptual scaling as well as the role of the three-valued Kleene logic are briefly investigated. This logic is also one background of attribute exploration, a conceptual tool for knowledge acquisition. For this method, a strategy is given to obtain as much (attribute) implicational knowledge about a given "universe" as possible; we also show how to represent incomplete knowledge so that the questions that still have to be answered to reach complete knowledge can be pinned down.
    Series
    Lecture notes in computer science; vol.1867: Lecture notes on artificial intelligence
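    The abstract of entry 6 refers to the three-valued Kleene logic for incompletely known attribute values. The sketch below shows Kleene's strong connectives; this is textbook material rather than code from the paper, and the example cell values are invented:
      # Kleene's strong three-valued logic: order FALSE < UNKNOWN < TRUE,
      # conjunction is the minimum and disjunction the maximum of that order.
      FALSE, UNKNOWN, TRUE = 0, 1, 2

      def k_and(a, b): return min(a, b)
      def k_or(a, b):  return max(a, b)
      def k_not(a):    return 2 - a     # swaps TRUE and FALSE, keeps UNKNOWN

      # An incompletely filled context cell stays UNKNOWN until it is answered.
      cell = {("object_1", "m1"): TRUE, ("object_1", "m2"): UNKNOWN}
      print(k_and(cell[("object_1", "m1")], cell[("object_1", "m2")]) == UNKNOWN)  # True
      print(k_or(cell[("object_1", "m2")], FALSE) == UNKNOWN)                      # True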
  7. Wille, R.: Lattices in data analysis : how to draw them with a computer (1989) 0.00
    0.0023797948 = product of:
      0.014278769 = sum of:
        0.014278769 = weight(_text_:in in 3043) [ClassicSimilarity], result of:
          0.014278769 = score(doc=3043,freq=2.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.24046129 = fieldWeight in 3043, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.125 = fieldNorm(doc=3043)
      0.16666667 = coord(1/6)
    
  8. Priss, U.: Formal concept analysis in information science (2006) 0.00
    0.0023797948 = product of:
      0.014278769 = sum of:
        0.014278769 = weight(_text_:in in 4305) [ClassicSimilarity], result of:
          0.014278769 = score(doc=4305,freq=2.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.24046129 = fieldWeight in 4305, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.125 = fieldNorm(doc=4305)
      0.16666667 = coord(1/6)
    
  9. Priss, U.; Old, L.J.: Concept neighbourhoods in knowledge organisation systems (2010) 0.00
    0.0023797948 = product of:
      0.014278769 = sum of:
        0.014278769 = weight(_text_:in in 3527) [ClassicSimilarity], result of:
          0.014278769 = score(doc=3527,freq=8.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.24046129 = fieldWeight in 3527, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0625 = fieldNorm(doc=3527)
      0.16666667 = coord(1/6)
    
    Abstract
    This paper discusses the application of concept neighbourhoods (in the sense of formal concept analysis) to knowledge organisation systems. Examples are provided using Roget's Thesaurus, WordNet and Wikipedia categories.
    Series
    Advances in knowledge organization; vol.12
    Source
    Paradigms and conceptual systems in knowledge organization: Proceedings of the Eleventh International ISKO Conference, 23-26 February 2010 Rome, Italy. Edited by Claudio Gnoli and Fulvio Mazzocchi
  10. Hereth, J.; Stumme, G.; Wille, R.; Wille, U.: Conceptual knowledge discovery and data analysis (2000) 0.00
    0.0022310577 = product of:
      0.0133863455 = sum of:
        0.0133863455 = weight(_text_:in in 5083) [ClassicSimilarity], result of:
          0.0133863455 = score(doc=5083,freq=18.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.22543246 = fieldWeight in 5083, product of:
              4.2426405 = tf(freq=18.0), with freq of:
                18.0 = termFreq=18.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5083)
      0.16666667 = coord(1/6)
    
    Abstract
    In this paper, we discuss Conceptual Knowledge Discovery in Databases (CKDD) in its connection with Data Analysis. Our approach is based on Formal Concept Analysis, a mathematical theory which has been developed and proven useful during the last 20 years. Formal Concept Analysis has led to a theory of conceptual information systems which has been applied by using the management system TOSCANA in a wide range of domains. In this paper, we use such an application in database marketing to demonstrate how methods and procedures of CKDD can be applied in Data Analysis. In particular, we show the interplay and integration of data mining and data analysis techniques based on Formal Concept Analysis. The main concern of this paper is to explain how the transition from data to knowledge can be supported by a TOSCANA system. To clarify the transition steps we discuss their correspondence to the five levels of knowledge representation established by R. Brachman and to the steps of empirically grounded theory building proposed by A. Strauss and J. Corbin
    Series
    Lecture notes in computer science; vol.1867: Lecture notes on artificial intelligence
  11. De Maio, C.; Fenza, G.; Loia, V.; Senatore, S.: Hierarchical web resources retrieval by exploiting Fuzzy Formal Concept Analysis (2012) 0.00
    0.0021859813 = product of:
      0.013115887 = sum of:
        0.013115887 = weight(_text_:in in 2737) [ClassicSimilarity], result of:
          0.013115887 = score(doc=2737,freq=12.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.22087781 = fieldWeight in 2737, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.046875 = fieldNorm(doc=2737)
      0.16666667 = coord(1/6)
    
    Abstract
In recent years, knowledge structuring has assumed important roles in several real-world applications such as decision support, cooperative problem solving, e-commerce, the Semantic Web and even planning systems. Ontologies play an important role in supporting automated processes to access information and are at the core of new strategies for the development of knowledge-based systems. Yet developing an ontology is a time-consuming task which often requires accurate domain expertise to tackle structural and logical difficulties in the definition of concepts as well as conceivable relationships. This work presents an ontology-based retrieval approach that supports data organization and visualization and provides a friendly navigation model. It exploits the fuzzy extension of the Formal Concept Analysis theory to elicit conceptualizations from datasets and generate a hierarchy-based representation of extracted knowledge. An intuitive graphical interface provides a multi-faceted view of the built ontology. Through transparent query-based retrieval, end users navigate across concepts, relations and the population.
    Content
Contribution to a special issue "Soft Approaches to IA on the Web". See: doi:10.1016/j.ipm.2011.04.003.
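    Entry 11 builds on a fuzzy extension of Formal Concept Analysis. One common formulation, used here only as an assumed illustration and not necessarily the operators chosen by the authors, grades the object-attribute incidence in [0,1] and derives the intent of a crisp object set by taking minimum membership degrees:
      # Fuzzy formal context: degree to which a web resource covers a topic.
      incidence = {
          ("page_a", "retrieval"): 0.9, ("page_a", "ontology"): 0.4,
          ("page_b", "retrieval"): 0.7, ("page_b", "ontology"): 0.8,
      }
      attributes = ["retrieval", "ontology"]

      def fuzzy_intent(object_set):
          """Membership degree of each attribute in the intent of a crisp
          object set: the minimum incidence degree over the chosen objects."""
          return {m: min(incidence[(g, m)] for g in object_set) for m in attributes}

      print(fuzzy_intent({"page_a", "page_b"}))   # {'retrieval': 0.7, 'ontology': 0.4}
      print(fuzzy_intent({"page_a"}))             # {'retrieval': 0.9, 'ontology': 0.4}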
  12. Scheich, P.; Skorsky, M.; Vogt, F.; Wachter, C.; Wille, R.: Conceptual data systems (1992) 0.00
    0.0020823204 = product of:
      0.012493922 = sum of:
        0.012493922 = weight(_text_:in in 3147) [ClassicSimilarity], result of:
          0.012493922 = score(doc=3147,freq=2.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.21040362 = fieldWeight in 3147, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.109375 = fieldNorm(doc=3147)
      0.16666667 = coord(1/6)
    
    Footnote
Appears in the proceedings of the 16th annual conference of the Gesellschaft für Klassifikation, 1992, Dortmund
  13. Sedelow, W.A.: ¬The formal analysis of concepts (1993) 0.00
    0.0020609628 = product of:
      0.012365777 = sum of:
        0.012365777 = weight(_text_:in in 620) [ClassicSimilarity], result of:
          0.012365777 = score(doc=620,freq=6.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.2082456 = fieldWeight in 620, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0625 = fieldNorm(doc=620)
      0.16666667 = coord(1/6)
    
    Abstract
The present paper focuses on the extraction, by means of a formal logical/mathematical methodology (i.e. automatically, exclusively by rule), of concept content, as in, for example, continuous discourse. The approach to a fully formal definition of concept content ultimately goes back to a German government initiative to establish 'standards' regarding concepts, in conjunction with efforts to stipulate precisely (and then, derivatively, through computer programs) data and information needs according to work role in certain government offices
  14. Priss, U.: Comparing classification systems using facets (2000) 0.00
    0.0020609628 = product of:
      0.012365777 = sum of:
        0.012365777 = weight(_text_:in in 6485) [ClassicSimilarity], result of:
          0.012365777 = score(doc=6485,freq=6.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.2082456 = fieldWeight in 6485, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0625 = fieldNorm(doc=6485)
      0.16666667 = coord(1/6)
    
    Abstract
    This paper describes a qualitative methodology for comparing and analyzing classification schemes. Theoretical facets are modeled as concept lattices in the sense of formal concept analysis and are used as 'ground' on which the underlying conceptual facets of a classification scheme are visually represented as 'figures'.
    Series
    Advances in knowledge organization; vol.7
    Source
    Dynamism and stability in knowledge organization: Proceedings of the 6th International ISKO-Conference, 10-13 July 2000, Toronto, Canada. Ed.: C. Beghtol et al
  15. Ganter, B.; Wille, R.: Formal concept analysis : mathematical foundations (1998) 0.00
    0.0019955188 = product of:
      0.011973113 = sum of:
        0.011973113 = weight(_text_:in in 5061) [ClassicSimilarity], result of:
          0.011973113 = score(doc=5061,freq=10.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.20163295 = fieldWeight in 5061, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.046875 = fieldNorm(doc=5061)
      0.16666667 = coord(1/6)
    
    Abstract
This is the first textbook on formal concept analysis. It gives a systematic presentation of the mathematical foundations and their relation to applications in computer science, especially data analysis and knowledge processing. Above all, it presents graphical methods for representing conceptual systems that have proved themselves in communicating knowledge. Theory and graphical representation are thus closely coupled together. The mathematical foundations are treated thoroughly and illuminated by means of numerous examples. Since computers are being used ever more widely for knowledge processing, formal methods for conceptual analysis are gaining in importance. This book makes the basic theory for such methods accessible in a compact form
    Footnote
Review in: KO 26(1999) no.3, S.172-173 (U. Priss)
  16. Ganter, B.: Computing with conceptual structures (2000) 0.00
    0.0019955188 = product of:
      0.011973113 = sum of:
        0.011973113 = weight(_text_:in in 5088) [ClassicSimilarity], result of:
          0.011973113 = score(doc=5088,freq=10.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.20163295 = fieldWeight in 5088, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.046875 = fieldNorm(doc=5088)
      0.16666667 = coord(1/6)
    
    Abstract
We give an overview of the computational tools for conceptual structures that have emerged from the theory of Formal Concept Analysis, with emphasis on basic ideas rather than technical details. We describe what we mean by conceptual computations, and try to convince the reader that an elaborate formalization is a necessary precondition. Claiming that Formal Concept Analysis provides such a formal background, we present as examples two well-known algorithms in very simple pseudo code. These can be used for navigating in a lattice, thereby supporting some prototypical tasks of conceptual computation. We refer to some of the many more advanced methods, discuss how to compute with limited precision and explain why, in the case of incomplete knowledge, the conceptual approach is more efficient than a combinatorial one. Utilizing this efficiency requires skillful use of the formalism. We present two results that lead in this direction.
    Series
    Lecture notes in computer science; vol.1867: Lecture notes on artificial intelligence
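    Entry 16 is about computing with concept lattices. The sketch below is not Ganter's Next Closure algorithm; it only uses the elementary fact that the intents of a context are the full attribute set plus all intersections of object intents, which is enough to enumerate the concepts of a small toy context:
      # Small formal context: objects mapped to their attribute sets.
      context = {
          "g1": frozenset({"a", "b"}),
          "g2": frozenset({"b", "c"}),
          "g3": frozenset({"a", "b", "c"}),
      }
      all_attributes = frozenset().union(*context.values())

      # Close {M} under intersection with object intents to obtain every intent.
      intents = {all_attributes}
      changed = True
      while changed:
          changed = False
          for new in {i & row for i in intents for row in context.values()}:
              if new not in intents:
                  intents.add(new)
                  changed = True

      # Each intent determines its extent: all objects having every attribute in it.
      for intent in sorted(intents, key=lambda s: (len(s), sorted(s))):
          extent = sorted(g for g, attrs in context.items() if intent <= attrs)
          print(extent, sorted(intent))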
  17. Kaytoue, M.; Kuznetsov, S.O.; Assaghir, Z.; Napoli, A.: Embedding tolerance relations in concept lattices : an application in information fusion (2010) 0.00
    0.001821651 = product of:
      0.010929906 = sum of:
        0.010929906 = weight(_text_:in in 4843) [ClassicSimilarity], result of:
          0.010929906 = score(doc=4843,freq=12.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.18406484 = fieldWeight in 4843, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4843)
      0.16666667 = coord(1/6)
    
    Abstract
Formal Concept Analysis (FCA) is a well-founded mathematical framework used for conceptual classification and knowledge management. Given a binary table describing a relation between objects and attributes, FCA consists in building a set of concepts organized by a subsumption relation within a concept lattice. Accordingly, FCA requires transforming complex data, e.g. numbers, intervals or graphs, into binary data, which leads to loss of information and poor interpretability of object classes. In this paper, we propose a pre-processing method producing binary data from complex data by taking advantage of similarity between objects. As a result, the concept lattice is composed of classes that are maximal sets of pairwise similar objects. This method is based on FCA and on a formalization of similarity as a tolerance relation (reflexive and symmetric). It applies to complex object descriptions and especially here to interval data. Moreover, it can be applied to any kind of structured data for which a similarity can be defined (sequences, graphs, etc.). Finally, an application highlights that the resulting concept lattice plays an important role in information fusion problems, as illustrated with a real-world example in agronomy.
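    Entry 17 formalizes similarity as a tolerance relation (reflexive and symmetric, not necessarily transitive) and groups objects into maximal sets of pairwise similar objects. A toy sketch with interval data follows; the overlap criterion and the sample intervals are assumptions for illustration, not the authors' dataset:
      from itertools import combinations

      # One interval-valued attribute per object (e.g. measured ranges).
      intervals = {"o1": (1.0, 3.0), "o2": (2.5, 4.0), "o3": (3.8, 5.0), "o4": (2.8, 3.5)}

      def tolerant(a, b):
          """Tolerance relation: two intervals are similar iff they overlap.
          Reflexive and symmetric, but not transitive."""
          (a1, a2), (b1, b2) = intervals[a], intervals[b]
          return a1 <= b2 and b1 <= a2

      def maximal_blocks(objects):
          """Brute-force the maximal sets of pairwise similar objects."""
          objs, blocks = list(objects), []
          for r in range(len(objs), 0, -1):
              for cand in combinations(objs, r):
                  if all(tolerant(x, y) for x, y in combinations(cand, 2)) \
                          and not any(set(cand) < b for b in blocks):
                      blocks.append(set(cand))
          return blocks

      print(maximal_blocks(intervals))   # the blocks {o1, o2, o4} and {o2, o3}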
  18. Luksch, P.; Wille, R.: ¬A mathematical model for conceptual knowledge systems (1991) 0.00
    0.0018033426 = product of:
      0.010820055 = sum of:
        0.010820055 = weight(_text_:in in 3033) [ClassicSimilarity], result of:
          0.010820055 = score(doc=3033,freq=6.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.1822149 = fieldWeight in 3033, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0546875 = fieldNorm(doc=3033)
      0.16666667 = coord(1/6)
    
    Abstract
Objects, attributes, and concepts are basic notions of conceptual knowledge; they are linked by the following four basic relations: an object has an attribute, an object belongs to a concept, an attribute abstracts from a concept, and a concept is a subconcept of another concept. These structural elements are well mathematized in formal concept analysis. Therefore, conceptual knowledge systems can be mathematically modelled in the frame of formal concept analysis. How such modelling may be performed is indicated by an example of a conceptual knowledge system. The formal definition of the model finally clarifies in which ways representation, inference, acquisition, and communication of conceptual knowledge can be mathematically treated
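    The abstract of entry 18 names the four basic relations of conceptual knowledge. A minimal sketch expressing them over a toy formal context follows; only the four relations come from the abstract, while the bird example is invented:
      # Toy formal context and two of its formal concepts (extent, intent).
      incidence = {("sparrow", "can_fly"), ("sparrow", "has_feathers"),
                   ("penguin", "has_feathers")}

      concept_birds  = ({"sparrow", "penguin"}, {"has_feathers"})
      concept_flyers = ({"sparrow"}, {"can_fly", "has_feathers"})

      def has_attribute(g, m):         # an object has an attribute
          return (g, m) in incidence

      def belongs_to(g, concept):      # an object belongs to a concept
          return g in concept[0]

      def abstracts_from(m, concept):  # an attribute abstracts from a concept
          return m in concept[1]

      def subconcept(c1, c2):          # c1 is a subconcept of c2 (smaller extent)
          return c1[0] <= c2[0]

      print(has_attribute("penguin", "can_fly"))             # False
      print(belongs_to("penguin", concept_birds))            # True
      print(abstracts_from("has_feathers", concept_flyers))  # True
      print(subconcept(concept_flyers, concept_birds))       # True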
  19. Kollewe, W.: Data representation by nested line diagrams illustrated by a survey of pensioners (1991) 0.00
    0.0018033426 = product of:
      0.010820055 = sum of:
        0.010820055 = weight(_text_:in in 5230) [ClassicSimilarity], result of:
          0.010820055 = score(doc=5230,freq=6.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.1822149 = fieldWeight in 5230, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5230)
      0.16666667 = coord(1/6)
    
    Abstract
With formal concept analysis, surveys can be analyzed in such a way that a meaningful picture of the respondents' answers becomes available. Line diagrams of large concept lattices may become so hard to read that it is impossible to follow the line segments by eye. Nested line diagrams offer a way to overcome these difficulties. The main idea of nested line diagrams is to partition the line diagram into boxes so that line segments between two boxes are all parallel and may be replaced by one line segment. The possibility of drawing line diagrams with more than two factors makes it possible to describe concept lattices with many hundreds or thousands of concepts in a clear structure. In practice it has often proven useful to use standardized scales for the individual levels
  20. Priss, U.: ¬A graphical interface for conceptually navigating faceted thesauri (1998) 0.00
    0.0018033426 = product of:
      0.010820055 = sum of:
        0.010820055 = weight(_text_:in in 6658) [ClassicSimilarity], result of:
          0.010820055 = score(doc=6658,freq=6.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.1822149 = fieldWeight in 6658, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0546875 = fieldNorm(doc=6658)
      0.16666667 = coord(1/6)
    
    Abstract
This paper describes a graphical interface for the navigation and construction of faceted thesauri that is based on formal concept analysis. Each facet of a thesaurus is represented as a mathematical lattice that is further subdivided into components. Users can graphically navigate through the Java implementation of the interface by clicking on terms that connect facets and components. Since there are many applications for thesauri in the knowledge representation field, such a graphical interface has the potential to be very useful.
    Series
    Advances in knowledge organization; vol.6
    Source
    Structures and relations in knowledge organization: Proceedings of the 5th International ISKO-Conference, Lille, 25.-29.8.1998. Ed.: W. Mustafa el Hadi et al