Search (34 results, page 1 of 2)

  • Filter: language_ss:"e"
  • Filter: theme_ss:"Formale Begriffsanalyse"
  1. Prediger, S.: Kontextuelle Urteilslogik mit Begriffsgraphen : Ein Beitrag zur Restrukturierung der mathematischen Logik (1998) 0.01
    Date
    26. 2.2008 15:58:22
    Footnote
    Review in: KO 26(1999) no.3, pp.175-176 (R. Wille)
  2. Priss, U.: Faceted information representation (2000) 0.01
    Abstract
    This paper presents an abstract formalization of the notion of "facets". Facets are relational structures of units, relations and other facets selected for a certain purpose. Facets can be used to structure large knowledge representation systems into a hierarchical arrangement of consistent and independent subsystems (facets) that facilitate flexibility and combinations of different viewpoints or aspects. This paper describes the basic notions, facet characteristics and construction mechanisms. It then explicates the theory in an example of a faceted information retrieval system (FaIR)
    Date
    22. 1.2016 17:47:06
    Series
    Berichte aus der Informatik
  3. Scheich, P.; Skorsky, M.; Vogt, F.; Wachter, C.; Wille, R.: Conceptual data systems (1992) 0.00
    Footnote
    Appears in the proceedings of the 16th annual conference of the Gesellschaft für Klassifikation, 1992, in Dortmund.
  4. Vogt, F.; Wille, R.: TOSCANA - a graphical tool for analyzing and exploring data (1995) 0.00
    Abstract
    TOSCANA is a computer program which allows online interaction with larger databases to analyse and explore data conceptually. It uses labelled line diagrams of concept lattices to communicate knowledge coded in given data. The basic problem of creating online presentations of concept lattices is solved by composing prepared diagrams into nested line diagrams. A large number of applications in different areas have already shown that TOSCANA is a useful tool for many purposes.
    Source
    Knowledge organization. 22(1995) no.2, S.78-81
  5. Kent, R.E.: Implications and rules in thesauri (1994) 0.00
    Abstract
    A central consideration in the study of whole language semantic space as encoded in thesauri is word sense comparability. Shows how word sense comparability can be adequately expressed by the logical implications and rules from Formal Concept Analysis. Formal concept analysis, a new approach to formal logic initiated by Rudolf Wille, has been used for data modelling, analysis and interpretation, and also for knowledge representation and knowledge discovery
    Series
    Advances in knowledge organization; vol.4
    Theme
    Konzeption und Anwendung des Prinzips Thesaurus
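Kent's implications can be made concrete with a small example: in a formal context, an attribute implication A -> B holds if every object that has all attributes in A also has all attributes in B. A minimal Python sketch follows, with a made-up toy context (the data and names are illustrative, not taken from the paper):

# Minimal sketch of attribute implications over a formal context.
# The context (objects -> attribute sets) is a made-up toy example.
context = {
    "bream": {"aquatic", "animal"},
    "frog":  {"aquatic", "terrestrial", "animal"},
    "reed":  {"aquatic", "plant"},
}

def implication_holds(premise, conclusion, ctx):
    """A -> B holds iff every object with all attributes in A also has all of B."""
    return all(conclusion <= attrs for attrs in ctx.values() if premise <= attrs)

print(implication_holds({"animal"}, {"aquatic"}, context))   # True in this toy context
print(implication_holds({"aquatic"}, {"animal"}, context))   # False: 'reed' is aquatic but a plant
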
  6. Priss, U.: Faceted knowledge representation (1999) 0.00
    Abstract
    Faceted Knowledge Representation provides a formalism for implementing knowledge systems. The basic notions of faceted knowledge representation are "unit", "relation", "facet" and "interpretation". Units are atomic elements and can be abstract elements or refer to external objects in an application. Relations are sequences or matrices of 0 and 1's (binary matrices). Facets are relational structures that combine units and relations. Each facet represents an aspect or viewpoint of a knowledge system. Interpretations are mappings that can be used to translate between different representations. This paper introduces the basic notions of faceted knowledge representation. The formalism is applied here to an abstract modeling of a faceted thesaurus as used in information retrieval.
    Date
    22. 1.2016 17:30:31
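The abstract above names the basic notions: units as atomic elements, relations as binary (0/1) matrices, facets combining units and relations, and interpretations as mappings between representations. A minimal Python sketch under those assumptions (the class layout and the thesaurus example are hypothetical, not Priss's formalism verbatim):

# Sketch of the faceted-representation notions named in the abstract:
# units (atomic elements), a relation as a 0/1 matrix, a facet combining both.
from dataclasses import dataclass
from typing import List

@dataclass
class Facet:
    units: List[str]           # atomic elements of this facet
    relation: List[List[int]]  # binary matrix: relation[i][j] = 1 if unit i relates to unit j

    def related(self, a: str, b: str) -> bool:
        i, j = self.units.index(a), self.units.index(b)
        return self.relation[i][j] == 1

# A hypothetical 'broader term' facet over three thesaurus units.
bt = Facet(units=["vehicle", "car", "bicycle"],
           relation=[[0, 0, 0],
                     [1, 0, 0],   # car -> vehicle
                     [1, 0, 0]])  # bicycle -> vehicle

# An interpretation as a mapping between the unit names of two representations.
interpretation = {"car": "automobile", "bicycle": "cycle", "vehicle": "vehicle"}

print(bt.related("car", "vehicle"))   # True
print(interpretation["car"])          # 'automobile'
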
  7. Working with conceptual structures : contributions to ICCS 2000. 8th International Conference on Conceptual Structures: Logical, Linguistic, and Computational Issues. Darmstadt, August 14-18, 2000 (2000) 0.00
    Abstract
    The 8th International Conference on Conceptual Structures - Logical, Linguistic, and Computational Issues (ICCS 2000) brings together a wide range of researchers and practitioners working with conceptual structures. During the last few years, the ICCS conference series has considerably widened its scope on different kinds of conceptual structures, stimulating research across domain boundaries. We hope that this stimulation is further enhanced by ICCS 2000 joining the long tradition of conferences in Darmstadt with extensive, lively discussions. This volume consists of contributions presented at ICCS 2000, complementing the volume "Conceptual Structures: Logical, Linguistic, and Computational Issues" (B. Ganter, G.W. Mineau (Eds.), LNAI 1867, Springer, Berlin-Heidelberg 2000). It contains submissions reviewed by the program committee, and position papers. We wish to express our appreciation to all the authors of submitted papers, to the general chair, the program chair, the editorial board, the program committee, and to the additional reviewers for making ICCS 2000 a valuable contribution in the knowledge processing research field. Special thanks go to the local organizers for making the conference an enjoyable and inspiring event. We are grateful to Darmstadt University of Technology, the Ernst Schröder Center for Conceptual Knowledge Processing, the Center for Interdisciplinary Studies in Technology, the Deutsche Forschungsgemeinschaft, Land Hessen, and NaviCon GmbH for their generous support
    Content
    Concepts & Language: Knowledge organization by procedures of natural language processing. A case study using the method GABEK (J. Zelger, J. Gadner) - Computer aided narrative analysis using conceptual graphs (H. Schärfe, P. 0hrstrom) - Pragmatic representation of argumentative text: a challenge for the conceptual graph approach (H. Irandoust, B. Moulin) - Conceptual graphs as a knowledge representation core in a complex language learning environment (G. Angelova, A. Nenkova, S. Boycheva, T. Nikolov) - Conceptual Modeling and Ontologies: Relationships and actions in conceptual categories (Ch. Landauer, K.L. Bellman) - Concept approximations for formal concept analysis (J. Saquer, J.S. Deogun) - Faceted information representation (U. Priß) - Simple concept graphs with universal quantifiers (J. Tappe) - A framework for comparing methods for using or reusing multiple ontologies in an application (J. van ZyI, D. Corbett) - Designing task/method knowledge-based systems with conceptual graphs (M. Leclère, F.Trichet, Ch. Choquet) - A logical ontology (J. Farkas, J. Sarbo) - Algorithms and Tools: Fast concept analysis (Ch. Lindig) - A framework for conceptual graph unification (D. Corbett) - Visual CP representation of knowledge (H.D. Pfeiffer, R.T. Hartley) - Maximal isojoin for representing software textual specifications and detecting semantic anomalies (Th. Charnois) - Troika: using grids, lattices and graphs in knowledge acquisition (H.S. Delugach, B.E. Lampkin) - Open world theorem prover for conceptual graphs (J.E. Heaton, P. Kocura) - NetCare: a practical conceptual graphs software tool (S. Polovina, D. Strang) - CGWorld - a web based workbench for conceptual graphs management and applications (P. Dobrev, K. Toutanova) - Position papers: The edition project: Peirce's existential graphs (R. Mülller) - Mining association rules using formal concept analysis (N. Pasquier) - Contextual logic summary (R Wille) - Information channels and conceptual scaling (K.E. Wolff) - Spatial concepts - a rule exploration (S. Rudolph) - The TEXT-TO-ONTO learning environment (A. Mädche, St. Staab) - Controlling the semantics of metadata on audio-visual documents using ontologies (Th. Dechilly, B. Bachimont) - Building the ontological foundations of a terminology from natural language to conceptual graphs with Ribosome, a knowledge extraction system (Ch. Jacquelinet, A. Burgun) - CharGer: some lessons learned and new directions (H.S. Delugach) - Knowledge management using conceptual graphs (W.K. Pun)
    Series
    Berichte aus der Informatik
  8. Burmeister, P.; Holzer, R.: On the treatment of incomplete knowledge in formal concept analysis (2000) 0.00
    Abstract
    Some possible treatments of incomplete knowledge in conceptual data representation, data analysis and knowledge acquisition are presented. In particular, some ways of conceptual scaling as well as the role of the three-valued Kleene logic are briefly investigated. This logic also forms part of the background of attribute exploration, a conceptual tool for knowledge acquisition. For this method a strategy is given to obtain as much (attribute) implicational knowledge about a given "universe" as possible, and we show how to represent incomplete knowledge in order to pin down the questions that still have to be answered to obtain complete knowledge in this situation.
    Series
    Lecture notes in computer science; vol.1867: Lecture notes on artificial intelligence
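The three-valued Kleene logic mentioned in the abstract handles attribute values that are not (yet) known. A small sketch of the strong Kleene connectives, with None standing for "unknown" (the Python encoding is an illustrative assumption):

# Strong Kleene three-valued connectives; None encodes "unknown".
def k_not(a):
    return None if a is None else not a

def k_and(a, b):
    if a is False or b is False:
        return False            # one known False decides the conjunction
    if a is None or b is None:
        return None             # otherwise any unknown keeps it unknown
    return True

def k_or(a, b):
    if a is True or b is True:
        return True             # one known True decides the disjunction
    if a is None or b is None:
        return None
    return False

print(k_and(True, None))   # None: cannot decide without the missing value
print(k_or(True, None))    # True: one known True suffices
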
  9. Wille, R.: Lattices in data analysis : how to draw them with a computer (1989) 0.00
  10. Priss, U.: Formal concept analysis in information science (2006) 0.00
  11. Priss, U.; Old, L.J.: Concept neighbourhoods in knowledge organisation systems (2010) 0.00
    Abstract
    This paper discusses the application of concept neighbourhoods (in the sense of formal concept analysis) to knowledge organisation systems. Examples are provided using Roget's Thesaurus, WordNet and Wikipedia categories.
    Series
    Advances in knowledge organization; vol.12
    Source
    Paradigms and conceptual systems in knowledge organization: Proceedings of the Eleventh International ISKO Conference, 23-26 February 2010 Rome, Italy. Edited by Claudio Gnoli and Fulvio Mazzocchi
  12. Hereth, J.; Stumme, G.; Wille, R.; Wille, U.: Conceptual knowledge discovery and data analysis (2000) 0.00
    Abstract
    In this paper, we discuss Conceptual Knowledge Discovery in Databases (CKDD) in its connection with Data Analysis. Our approach is based on Formal Concept Analysis, a mathematical theory which has been developed and proven useful during the last 20 years. Formal Concept Analysis has led to a theory of conceptual information systems which has been applied by using the management system TOSCANA in a wide range of domains. In this paper, we use such an application in database marketing to demonstrate how methods and procedures of CKDD can be applied in Data Analysis. In particular, we show the interplay and integration of data mining and data analysis techniques based on Formal Concept Analysis. The main concern of this paper is to explain how the transition from data to knowledge can be supported by a TOSCANA system. To clarify the transition steps we discuss their correspondence to the five levels of knowledge representation established by R. Brachman and to the steps of empirically grounded theory building proposed by A. Strauss and J. Corbin
    Series
    Lecture notes in computer science; vol.1867: Lecture notes on artificial intelligence
  13. De Maio, C.; Fenza, G.; Loia, V.; Senatore, S.: Hierarchical web resources retrieval by exploiting Fuzzy Formal Concept Analysis (2012) 0.00
    Abstract
    In recent years, knowledge structuring has been assuming important roles in several real-world applications such as decision support, cooperative problem solving, e-commerce, the Semantic Web and even planning systems. Ontologies play an important role in supporting automated processes to access information and are at the core of new strategies for the development of knowledge-based systems. Yet, developing an ontology is a time-consuming task which often needs accurate domain expertise to tackle structural and logical difficulties in the definition of concepts as well as conceivable relationships. This work presents an ontology-based retrieval approach that supports data organization and visualization and provides a friendly navigation model. It exploits the fuzzy extension of the Formal Concept Analysis theory to elicit conceptualizations from datasets and generate a hierarchy-based representation of the extracted knowledge. An intuitive graphical interface provides a multi-faceted view of the built ontology. Through transparent query-based retrieval, end users navigate across concepts, relations and population.
    Content
    Contribution to a special issue "Soft Approaches to IA on the Web". See: doi:10.1016/j.ipm.2011.04.003.
  14. Sedelow, W.A.: ¬The formal analysis of concepts (1993) 0.00
    Abstract
    The present paper focuses on the extraction, by means of a formal logical/mathematical methodology (i.e. automatically, exclusively by rule), of concept content, as in, for example, continuous discourse. The approach to a fully formal definition of concept content ultimately is owing to a German government initiative to establish 'standards' regarding concepts, in conjunction with efforts to stipulate precisely (and then, derivatively, through computer programs) data and information needs according to work role in certain government offices.
  15. Priss, U.: Comparing classification systems using facets (2000) 0.00
    Abstract
    This paper describes a qualitative methodology for comparing and analyzing classification schemes. Theoretical facets are modeled as concept lattices in the sense of formal concept analysis and are used as 'ground' on which the underlying conceptual facets of a classification scheme are visually represented as 'figures'.
    Series
    Advances in knowledge organization; vol.7
    Source
    Dynamism and stability in knowledge organization: Proceedings of the 6th International ISKO-Conference, 10-13 July 2000, Toronto, Canada. Ed.: C. Beghtol et al
  16. Ganter, B.; Wille, R.: Formal concept analysis : mathematical foundations (1998) 0.00
    Abstract
    This is the first textbook on formal concept analysis. It gives a systematic presentation of the mathematical foundations and their relation to applications in computer science, especially data analysis and knowledge processing. Above all, it presents graphical methods for representing conceptual systems that have proved themselves in communicating knowledge. Theory and graphical representation are thus closely coupled together. The mathematical foundations are treated thoroughly and illuminated by means of numerous examples. Since computers are being used ever more widely for knowledge processing, formal methods for conceptual analysis are gaining in importance. This book makes the basic theory for such methods accessible in a compact form.
    Footnote
    Review in: KO 26(1999) no.3, pp.172-173 (U. Priss)
  17. Ganter, B.: Computing with conceptual structures (2000) 0.00
    Abstract
    We give an overview of the computational tools for conceptual structures that have emerged from the theory of Formal Concept Analysis, with emphasis on basic ideas rather than technical details. We describe what we mean by conceptual computations, and try to convince the reader that an elaborate formalization is a necessary precondition. Claiming that Formal Concept Analysis provides such a formal background, we present as examples two well-known algorithms in very simple pseudo code. These can be used for navigating in a lattice, thereby supporting some prototypical tasks of conceptual computation. We refer to some of the many more advanced methods, discuss how to compute with limited precision and explain why in the case of incomplete knowledge the conceptual approach is more efficient than a combinatorial one. Utilizing this efficiency requires skillful use of the formalism. We present two results that lead in this direction.
    Series
    Lecture notes in computer science; vol.1867: Lecture notes on artificial intelligence
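The abstract refers to simple FCA algorithms given in pseudo code. As a stand-in illustration of the underlying conceptual computation, here is a brute-force Python sketch that enumerates the formal concepts of a tiny context via the two derivation operators; the context is invented, and practical implementations use more efficient algorithms such as Ganter's NextClosure rather than this exhaustive search:

from itertools import combinations

# Toy formal context: objects -> sets of attributes (invented example).
context = {"duck": {"flies", "swims"},
           "eagle": {"flies"},
           "carp": {"swims"}}
attributes = {"flies", "swims"}

def intent(objects):
    """Attributes shared by all given objects (derivation on object sets)."""
    return set.intersection(*(context[o] for o in objects)) if objects else set(attributes)

def extent(attrs):
    """Objects having all given attributes (derivation on attribute sets)."""
    return {o for o, a in context.items() if attrs <= a}

# A pair (A, B) is a formal concept iff A = extent(B) and B = intent(A).
concepts = set()
for r in range(len(attributes) + 1):
    for attrs in combinations(sorted(attributes), r):
        b = intent(extent(set(attrs)))            # close the attribute set
        concepts.add((frozenset(extent(b)), frozenset(b)))

for ext, inte in concepts:
    print(sorted(ext), "|", sorted(inte))         # prints the four concepts of this context
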
  18. Kaytoue, M.; Kuznetsov, S.O.; Assaghir, Z.; Napoli, A.: Embedding tolerance relations in concept lattices : an application in information fusion (2010) 0.00
    Abstract
    Formal Concept Analysis (FCA) is a well-founded mathematical framework used for conceptual classification and knowledge management. Given a binary table describing a relation between objects and attributes, FCA consists in building a set of concepts organized by a subsumption relation within a concept lattice. Accordingly, FCA requires transforming complex data, e.g. numbers, intervals, graphs, into binary data, leading to loss of information and poor interpretability of object classes. In this paper, we propose a pre-processing method producing binary data from complex data by taking advantage of similarity between objects. As a result, the concept lattice is composed of classes that are maximal sets of pairwise similar objects. This method is based on FCA and on a formalization of similarity as a tolerance relation (reflexive and symmetric). It applies to complex object descriptions and especially here to interval data. Moreover, it can be applied to any kind of structured data for which a similarity can be defined (sequences, graphs, etc.). Finally, an application highlights that the resulting concept lattice plays an important role in an information fusion problem, as illustrated with a real-world example in agronomy.
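The pre-processing step described in the abstract turns complex (here interval-valued) descriptions into a binary context by way of a tolerance relation, i.e. a reflexive and symmetric similarity. A minimal Python sketch, where the interval data and the overlap criterion are illustrative assumptions rather than the paper's exact construction:

# Tolerance relation on interval-valued data: reflexive and symmetric, not
# necessarily transitive. Two intervals count as "similar" here if they overlap.
objects = {"o1": (1.0, 3.0), "o2": (2.5, 5.0), "o3": (6.0, 8.0)}

def tolerant(i, j):
    (a1, b1), (a2, b2) = i, j
    return a1 <= b2 and a2 <= b1       # overlap test: reflexive and symmetric

# Binarize: object o gets attribute "sim_p" iff o is tolerant to object p.
# The resulting 0/1 table is an ordinary formal context for FCA.
binary_context = {o: {f"sim_{p}" for p, jv in objects.items() if tolerant(iv, jv)}
                  for o, iv in objects.items()}

for o, attrs in binary_context.items():
    print(o, sorted(attrs))
# o1 and o2 overlap and share attributes; o3 is only tolerant to itself.
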
  19. Luksch, P.; Wille, R.: ¬A mathematical model for conceptual knowledge systems (1991) 0.00
    Abstract
    Objects, attributes, and concepts are basic notions of conceptual knowledge; they are linked by the following four basic relations: an object has an attribute, an object belongs to a concept, an attribute abstracts from a concept, and a concept is a subconcept of another concept. These structural elements are well mathematized in formal concept analysis. Therefore, conceptual knowledge systems can be mathematically modelled within the framework of formal concept analysis. How such modelling may be performed is indicated by an example of a conceptual knowledge system. The formal definition of the model finally clarifies in which ways representation, inference, acquisition, and communication of conceptual knowledge can be mathematically treated.
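The four basic relations listed in the abstract can be written down directly once a concept is represented, say, as an (extent, intent) pair; the following Python sketch uses a toy context and that representation as an assumption:

# The four basic relations of conceptual knowledge, over a toy formal context.
context = {"sparrow": {"bird", "flies"}, "penguin": {"bird"}}

def has_attribute(obj, attr):          # an object has an attribute
    return attr in context[obj]

def belongs_to(obj, concept):          # an object belongs to a concept
    extent, _ = concept
    return obj in extent

def abstracts_from(attr, concept):     # an attribute abstracts from a concept
    _, intent = concept
    return attr in intent

def subconcept(c1, c2):                # c1 is a subconcept of c2
    return c1[0] <= c2[0]              # extent inclusion

birds  = ({"sparrow", "penguin"}, {"bird"})       # concepts as (extent, intent) pairs
fliers = ({"sparrow"}, {"bird", "flies"})

print(has_attribute("penguin", "flies"))   # False
print(belongs_to("sparrow", fliers))       # True
print(abstracts_from("bird", fliers))      # True
print(subconcept(fliers, birds))           # True: every flier here is a bird
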
  20. Kollewe, W.: Data representation by nested line diagrams illustrated by a survey of pensioners (1991) 0.00
    Abstract
    With formal concept analysis, surveys can be analyzed in such a way that a meaningful picture of the respondents' answers becomes available. Line diagrams of large concept lattices may become so hard to read that it is impossible to follow the line segments by eye. Nested line diagrams offer a way to overcome these difficulties. The main idea of nested line diagrams is to partition the line diagram into boxes so that the line segments between two boxes are all parallel and may be replaced by a single line segment. The possibility of drawing line diagrams with more than two factors makes it possible to describe concept lattices with many hundreds or thousands of concepts in a clear structure. In practice it has often proved useful to use standardized scales for the individual levels.