Search (33 results, page 2 of 2)

  • Filter: theme_ss:"Formale Begriffsanalyse"
  • Filter: type_ss:"a"
  1. De Maio, C.; Fenza, G.; Loia, V.; Senatore, S.: Hierarchical web resources retrieval by exploiting Fuzzy Formal Concept Analysis (2012)
    Abstract
    In recent years, knowledge structuring has assumed an important role in several real-world applications such as decision support, cooperative problem solving, e-commerce, the Semantic Web, and even planning systems. Ontologies play an important role in supporting automated processes to access information and are at the core of new strategies for the development of knowledge-based systems. Yet developing an ontology is a time-consuming task that often requires accurate domain expertise to tackle the structural and logical difficulties in defining concepts and their conceivable relationships. This work presents an ontology-based retrieval approach that supports data organization and visualization and provides a friendly navigation model. It exploits the fuzzy extension of Formal Concept Analysis to elicit conceptualizations from datasets and to generate a hierarchy-based representation of the extracted knowledge. An intuitive graphical interface provides a multi-faceted view of the built ontology. Through transparent query-based retrieval, end users navigate across concepts, relations, and the population.
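    As a point of reference for this entry, the following is a minimal sketch of the crisp Formal Concept Analysis core that the paper's fuzzy extension generalizes: the two derivation operators over a toy object-attribute context and a brute-force enumeration of all formal concepts. The context and all names are invented for illustration.

```python
# Minimal crisp FCA sketch (illustrative only); the paper extends this
# machinery with fuzzy membership degrees. Toy "web resource" context.
from itertools import combinations

objects = ["page1", "page2", "page3"]
attributes = ["semantic-web", "ontology", "e-commerce"]
incidence = {
    ("page1", "semantic-web"), ("page1", "ontology"),
    ("page2", "ontology"), ("page2", "e-commerce"),
    ("page3", "semantic-web"),
}

def intent(objs):
    """Attributes shared by all objects in objs (the ' operator)."""
    return {a for a in attributes if all((o, a) in incidence for o in objs)}

def extent(attrs):
    """Objects having all attributes in attrs (the ' operator)."""
    return {o for o in objects if all((o, a) in incidence for a in attrs)}

def concepts():
    """Enumerate all formal concepts (extent, intent) by brute force."""
    seen = set()
    for r in range(len(objects) + 1):
        for objs in combinations(objects, r):
            ext = extent(intent(set(objs)))   # closure of the object set
            if frozenset(ext) not in seen:
                seen.add(frozenset(ext))
                yield sorted(ext), sorted(intent(ext))

for ext, att in concepts():
    print(ext, att)
```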
  2. Ganter, B.; Stahl, J.; Wille, R.: Conceptual measurement and many-valued contexts (1986)
    Source
    Classification as a tool of research. Ed.: W. Gaul and M. Schader
  3. Ganter, B.; Wille, R.: Conceptual scaling (1989)
    Source
    Applications of combinatorics and graph theory to the biological and social sciences. Ed.: F. Roberts
  4. Wille, R.: Knowledge acquisition by methods of formal concept analysis (1989)
  5. Priss, U.: Comparing classification systems using facets (2000)
    Abstract
    This paper describes a qualitative methodology for comparing and analyzing classification schemes. Theoretical facets are modeled as concept lattices in the sense of formal concept analysis and are used as 'ground' on which the underlying conceptual facets of a classification scheme are visually represented as 'figures'.
    Source
    Dynamism and stability in knowledge organization: Proceedings of the 6th International ISKO-Conference, 10-13 July 2000, Toronto, Canada. Ed.: C. Beghtol et al
  6. Priss, U.; Old, L.J.: Concept neighbourhoods in knowledge organisation systems (2010)
    Abstract
    This paper discusses the application of concept neighbourhoods (in the sense of formal concept analysis) to knowledge organisation systems. Examples are provided using Roget's Thesaurus, WordNet and Wikipedia categories.
    Source
    Paradigms and conceptual systems in knowledge organization: Proceedings of the Eleventh International ISKO Conference, 23-26 February 2010, Rome, Italy. Ed.: Claudio Gnoli and Fulvio Mazzocchi
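    As a hedged illustration of this entry, the snippet below sketches one simple reading of a concept neighbourhood, not the authors' exact construction: starting from a term in a Roget-style thesaurus, alternate between the categories containing it and the terms those categories contain. The tiny thesaurus and all names are invented.

```python
# Sketch of an n-step term neighbourhood in a toy Roget-style thesaurus.
thesaurus = {                 # category -> words filed under it
    "truth":     {"fact", "reality", "verity"},
    "evidence":  {"fact", "proof", "testimony"},
    "certainty": {"proof", "assurance"},
}

def neighbourhood(word, steps=1):
    """Words reachable from `word` within `steps` category hops."""
    words = {word}
    for _ in range(steps):
        cats = {c for c, ws in thesaurus.items() if words & ws}
        if cats:
            words = set().union(*(thesaurus[c] for c in cats))
    return words

print(neighbourhood("fact", steps=1))   # words one category away from "fact"
```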
  7. Eklund, P.W.: Logic-based networks : concept graphs and conceptual structures (2000)
    Abstract
    Logic-based networks are semantic networks that support reasoning capabilities. In this paper, knowledge processing within logic-based networks is viewed as three stages. The first stage involves the formation of concepts and relations: the basic primitives with which we wish to formulate knowledge. The second stage involves the formation of well-formed formulas that express knowledge about the primitive concepts and relations once isolated. The final stage involves efficiently processing the wffs to the desired end. Our research involves each of these steps as they relate to Sowa's conceptual structures and Wille's concept lattices. Formal Concept Analysis gives us the capability to perform concept formation via symbolic machine learning. Concept(ual) Graphs provide a means to describe relational properties between primitive concept and relation types. Finally, techniques from other areas of computer science are required to compute logic-based networks efficiently. This paper illustrates the three stages of knowledge processing in practical terms using examples from our research.
  8. Burmeister, P.; Holzer, R.: On the treatment of incomplete knowledge in formal concept analysis (2000)
    Abstract
    Some possible treatments of incomplete knowledge in conceptual data representation, data analysis and knowledge acquisition are presented. In particular, some ways of conceptual scaling as well as the role of the three-valued Kleene logic are briefly investigated. This logic also forms one background of attribute exploration, a conceptual tool for knowledge acquisition. For this method, a strategy is given for obtaining as much (attribute) implicational knowledge about a given "universe" as possible, and we show how to represent incomplete knowledge so as to pin down the questions that still need to be answered in order to obtain complete knowledge in this situation.
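    For concreteness, here is a small sketch of the strong Kleene three-valued connectives the abstract refers to, with Python's None standing in for the "unknown" truth value; this encoding is my own choice for illustration, not the authors' notation.

```python
# Strong Kleene three-valued logic: True, False, and None ("unknown").
def k_and(a, b):
    if a is False or b is False:
        return False            # a definite False dominates conjunction
    if a is True and b is True:
        return True
    return None                 # otherwise the result is unknown

def k_or(a, b):
    if a is True or b is True:
        return True             # a definite True dominates disjunction
    if a is False and b is False:
        return False
    return None

def k_not(a):
    return None if a is None else (not a)

for a in (True, None, False):   # print the truth tables
    for b in (True, None, False):
        print(a, b, "AND:", k_and(a, b), "OR:", k_or(a, b))
```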
  9. Ganter, B.: Computing with conceptual structures (2000)
    Abstract
    We give an overview of the computational tools for conceptual structures that have emerged from the theory of Formal Concept Analysis, with emphasis on basic ideas rather than technical details. We describe what we mean by conceptual computations, and try to convince the reader that an elaborate formalization is a necessary precondition. Claiming that Formal Concept Analysis provides such a formal background, we present as examples two well-known algorithms in very simple pseudo code. These can be used for navigating in a lattice, thereby supporting some prototypical tasks of conceptual computation. We refer to some of the many more advanced methods, discuss how to compute with limited precision, and explain why in the case of incomplete knowledge the conceptual approach is more efficient than a combinatorial one. Utilizing this efficiency requires skillful use of the formalism. We present two results that lead in this direction.
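    The abstract does not say which two algorithms are shown; as a hedged illustration, here is Ganter's NextClosure procedure, probably the best-known algorithm to come out of Formal Concept Analysis, enumerating all concept intents of a toy context in lectic order. The context is invented.

```python
# NextClosure: enumerate all closed attribute sets (concept intents).
objects = ["o1", "o2", "o3"]
attrs = ["a", "b", "c"]          # fixed linear order of the attributes
incidence = {("o1", "a"), ("o1", "b"), ("o2", "b"), ("o2", "c"), ("o3", "c")}

def closure(A):
    """A'': attributes common to all objects having every attribute in A."""
    ext = [o for o in objects if all((o, m) in incidence for m in A)]
    return {m for m in attrs if all((o, m) in incidence for o in ext)}

def next_closure(A):
    """Lectically smallest closed set after A, or None if A is the last."""
    for i in range(len(attrs) - 1, -1, -1):
        m = attrs[i]
        if m in A:
            continue
        B = closure({x for x in A if attrs.index(x) < i} | {m})
        # Accept B only if its smallest new element is attrs[i].
        if all(x in A for x in B if attrs.index(x) < i):
            return B
    return None

A = closure(set())
while A is not None:
    print(sorted(A))             # all concept intents, in lectic order
    A = next_closure(A)
```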
  10. Hereth, J.; Stumme, G.; Wille, R.; Wille, U.: Conceptual knowledge discovery and data analysis (2000)
    Abstract
    In this paper, we discuss Conceptual Knowledge Discovery in Databases (CKDD) in its connection with Data Analysis. Our approach is based on Formal Concept Analysis, a mathematical theory which has been developed and proven useful during the last 20 years. Formal Concept Analysis has led to a theory of conceptual information systems which has been applied, using the management system TOSCANA, in a wide range of domains. In this paper, we use such an application in database marketing to demonstrate how methods and procedures of CKDD can be applied in Data Analysis. In particular, we show the interplay and integration of data mining and data analysis techniques based on Formal Concept Analysis. The main concern of this paper is to explain how the transition from data to knowledge can be supported by a TOSCANA system. To clarify the transition steps, we discuss their correspondence to the five levels of knowledge representation established by R. Brachman and to the steps of empirically grounded theory building proposed by A. Strauss and J. Corbin.
  11. Kent, R.E.: Implications and rules in thesauri (1994)
    Abstract
    A central consideration in the study of whole-language semantic space as encoded in thesauri is word sense comparability. Shows how word sense comparability can be adequately expressed by the logical implications and rules of Formal Concept Analysis. Formal Concept Analysis, a new approach to formal logic initiated by Rudolf Wille, has been used for data modelling, analysis and interpretation, and also for knowledge representation and knowledge discovery.
    Source
    Knowledge organization and quality management: Proc. of the 3rd International ISKO Conference, 20-24 June 1994, Copenhagen, Denmark. Ed.: H. Albrechtsen et al
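    As a minimal illustration of the basic device this entry builds on, the following checks whether an attribute implication A -> B holds in a toy context of word senses; the data and names are invented, and this is a generic FCA implication check rather than Kent's own formalization.

```python
# An implication A -> B holds in a formal context iff every object
# that has all attributes in A also has all attributes in B.
objects = {"sense1": {"broad", "abstract"},
           "sense2": {"broad", "abstract", "technical"},
           "sense3": {"narrow"}}

def holds(premise, conclusion):
    """True iff the implication premise -> conclusion holds in the context."""
    return all(conclusion <= attrs
               for attrs in objects.values() if premise <= attrs)

print(holds({"broad"}, {"abstract"}))      # True: both broad senses are abstract
print(holds({"abstract"}, {"technical"}))  # False: sense1 is not technical
```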
  12. Carpineto, C.; Romano, G.: Order-theoretical ranking (2000)
    Abstract
    Current best-match ranking (BMR) systems perform well but cannot handle word mismatch between a query and a document. The best-known alternative ranking method, hierarchical clustering-based ranking (HCR), seems to be more robust than BMR with respect to this problem, but it is hampered by theoretical and practical limitations. We present an approach to document ranking that explicitly addresses the word mismatch problem by exploiting interdocument similarity information in a novel way. Document ranking is seen as a query-document transformation driven by a conceptual representation of the whole document collection, into which the query is merged. Our approach is based on the theory of concept (or Galois) lattices, which, we argue, provides a powerful, well-founded, and computationally tractable framework to model the space in which documents and query are represented and to compute such a transformation. We compared information retrieval using concept lattice-based ranking (CLR) to BMR and HCR. The results showed that HCR was outperformed by both CLR and BMR, and suggested that, of the two best methods, BMR achieved better performance than CLR on the whole document set, whereas CLR compared more favorably when only the first retrieved documents were used for evaluation. We also evaluated the three methods' specific ability to rank documents that did not match the query, in which case the superiority of CLR over BMR and HCR was apparent.
    Source
    Journal of the American Society for Information Science. 51(2000) no.7, S.587-601
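    Below is a compact, hedged sketch of the general idea of concept lattice-based ranking as described in this abstract, not necessarily the authors' exact procedure: the query is merged into the document-term context as a pseudo-document, the lattice's covering graph is built, and documents are ranked by the shortest-path distance between their concepts and the query concept. The toy collection is invented.

```python
# Concept lattice-based ranking (CLR) sketch on a toy document collection.
from collections import deque
from itertools import combinations

docs = {"d1": {"lattice", "ranking", "retrieval"},
        "d2": {"ranking", "retrieval"},
        "d3": {"clustering"}}
query = {"lattice", "retrieval"}

ctx = dict(docs, q=query)                    # merge query as pseudo-document "q"
all_terms = frozenset(t for s in ctx.values() for t in s)

def intent(objs):
    """Terms shared by all objects in objs."""
    return (frozenset.intersection(*(frozenset(ctx[o]) for o in objs))
            if objs else all_terms)

def extent(terms):
    """Objects containing every term in terms."""
    return frozenset(o for o in ctx if terms <= ctx[o])

# Every concept extent is the closure of some object subset.
exts = {extent(intent(frozenset(c))) for r in range(len(ctx) + 1)
        for c in combinations(ctx, r)}

def neighbours(e):
    """Upper and lower covers of extent e in the concept lattice."""
    return [f for f in exts
            if (f < e and not any(f < g < e for g in exts))
            or (e < f and not any(e < g < f for g in exts))]

def concept_of(obj):
    return extent(intent(frozenset([obj])))

dist = {concept_of("q"): 0}                  # BFS from the query concept
queue = deque(dist)
while queue:
    e = queue.popleft()
    for f in neighbours(e):
        if f not in dist:
            dist[f] = dist[e] + 1
            queue.append(f)

print(sorted(docs, key=lambda o: dist[concept_of(o)]))  # ['d1', 'd2', 'd3']
```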
  13. Negm, E.; AbdelRahman, S.; Bahgat, R.: PREFCA: a portal retrieval engine based on formal concept analysis (2017)
    Abstract
    The web is a network of linked sites, each of which is either a physical portal or a standalone page. In the former case, the portal presents an access point to its embedded web pages, which coherently present a specific topic. In the latter case, there are millions of standalone web pages scattered throughout the web that share the same topic and could be conceptually linked together to form virtual portals. Search engines have been developed to help users reach the adequate pages in an efficient and effective manner. All current search engine techniques rely on the web page as the basic atomic search unit, and they ignore the conceptual links among the retrieved pages that reveal implicit web-related meanings. However, a semantic model of a whole portal may contain more semantic information than a model of scattered individual pages. In addition, user queries can be poor and contain imprecise terms that do not reflect the real user intention. Consequently, retrieving the standalone individual pages that are directly related to the query may not satisfy the user's need. In this paper, we propose PREFCA, a Portal Retrieval Engine based on Formal Concept Analysis that relies on the portal as the main search unit. PREFCA consists of three phases: first, the information extraction phase, concerned with extracting the portal's semantic data; second, the formal concept analysis phase, which utilizes formal concept analysis to discover the conceptual links among portals and attributes; and finally, the information retrieval phase, in which we propose a portal ranking method to retrieve ranked pairs of portals and embedded pages. Additionally, we apply network analysis rules to derive some portal characteristics. We evaluated PREFCA using two data sets, namely the Forum for Information Retrieval Evaluation 2010 and ClueWeb09 (category B) test data, for physical and virtual portals respectively. PREFCA achieves higher F-measure accuracy, better Mean Average Precision ranking, and comparable network analysis and efficiency results compared with other search engine approaches, namely Term Frequency Inverse Document Frequency (TF-IDF), Latent Semantic Analysis (LSA), and BM25, and it also gains high Mean Average Precision in comparison with learning-to-rank techniques. Moreover, PREFCA achieves better response time than Carrot, a well-known topic-based search engine.