Search (52 results, page 2 of 3)

  • theme_ss:"Formale Begriffsanalyse"
  1. Vogt, F.: Formale Begriffsanalyse mit C++ : Datenstrukturen und Algorithmen (1996)
    Abstract
    This book is intended to enable readers who are interested in Formal Concept Analysis as a method of data analysis and knowledge structuring to write their own C++ programs for Formal Concept Analysis. The approach of Formal Concept Analysis is explained by means of an application example, so the book can serve both as a guide for the interested newcomer and as a handbook for the experienced application programmer and project manager
    Content
    The book comprises 16 chapters and an appendix in which the software CONSCRIPT is explained
    Footnote
    Review in: Knowledge organization 25(1998) nos.1/2, p.47 (S. Düwel and W. Hesse)
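    Example
    A minimal Python sketch (invented here, not taken from the book) of the two derivation operators that any FCA program is built on; the toy context is made up for illustration:

      # Toy formal context: objects -> sets of attributes (invented data).
      objects = {
          "pond":  {"standing", "natural"},
          "canal": {"running", "artificial"},
          "river": {"running", "natural"},
      }

      def intent(objs):
          """Attributes shared by all given objects (the ' operator on object sets)."""
          sets = [objects[g] for g in objs]
          return set.intersection(*sets) if sets else set().union(*objects.values())

      def extent(attrs):
          """Objects that have all given attributes (the ' operator on attribute sets)."""
          return {g for g, s in objects.items() if attrs <= s}

      # (extent(intent(A)), intent(A)) is the formal concept generated by A:
      A = {"river"}
      print(extent(intent(A)), intent(A))   # {'river'} {'running', 'natural'} (set order may vary)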
  2. Priss, U.: Comparing classification systems using facets (2000)
    Abstract
    This paper describes a qualitative methodology for comparing and analyzing classification schemes. Theoretical facets are modeled as concept lattices in the sense of formal concept analysis and are used as 'ground' on which the underlying conceptual facets of a classification scheme are visually represented as 'figures'.
    Series
    Advances in knowledge organization; vol.7
    Source
    Dynamism and stability in knowledge organization: Proceedings of the 6th International ISKO-Conference, 10-13 July 2000, Toronto, Canada. Ed.: C. Beghtol et al
  3. Ganter, B.; Wille, R.: Formal concept analysis : mathematical foundations (1998)
    Abstract
    This is the first textbook on formal concept analysis. It gives a systematic presentation of the mathematical foundations and their relation to applications in computer science, especially data analysis and knowledge processing. Above all, it presents graphical methods for representing conceptual systems that have proved their worth in communicating knowledge. Theory and graphical representation are thus closely coupled. The mathematical foundations are treated thoroughly and illuminated by means of numerous examples. Since computers are being used ever more widely for knowledge processing, formal methods for conceptual analysis are gaining in importance. This book makes the basic theory for such methods accessible in a compact form
    Footnote
    Review in: KO 26(1999) no.3, p.172-173 (U. Priss)
  4. Ganter, B.: Computing with conceptual structures (2000)
    Abstract
    We give an overview of the computational tools for conceptual structures that have emerged from the theory of Formal Concept Analysis, with emphasis on basic ideas rather than technical details. We describe what we mean by conceptual computations and try to convince the reader that an elaborate formalization is a necessary precondition. Claiming that Formal Concept Analysis provides such a formal background, we present as examples two well-known algorithms in very simple pseudo code. These can be used for navigating in a lattice, thereby supporting some prototypical tasks of conceptual computation. We refer to some of the many more advanced methods, discuss how to compute with limited precision, and explain why in the case of incomplete knowledge the conceptual approach is more efficient than a combinatorial one. Utilizing this efficiency requires skillful use of the formalism. We present two results that lead in this direction
    Series
    Lecture notes in computer science; vol.1867: Lecture notes on artificial intelligence
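    Example
    The paper gives its algorithms in very simple pseudo code; the following Python sketch is a reconstruction of the standard Next Closure algorithm (not copied from the paper) that enumerates all concept intents of a small invented context:

      # Next Closure: enumerate all concept intents in lectic order.
      context = {"a": {0, 1}, "b": {1, 2}, "c": {0, 2}}   # invented data
      n = 3                                               # attributes 0..n-1

      def closure(B):
          """B'' : attributes common to every object that has all of B."""
          ext = [atts for atts in context.values() if B <= atts]
          return set.intersection(*ext) if ext else set(range(n))

      def next_closure(B):
          """Lexicographically next closed set after B, or None when finished."""
          B = set(B)
          for m in reversed(range(n)):
              if m in B:
                  B.discard(m)
              else:
                  C = closure(B | {m})
                  if min(C - B) == m:   # no attribute smaller than m was added
                      return C
          return None

      B = closure(set())
      while B is not None:
          print(sorted(B))
          B = next_closure(B)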
  5. Begriffliche Wissensverarbeitung : Methoden und Anwendungen. Mit Beiträgen zahlreicher Fachwissenschaftler (2000)
    Abstract
    This book presents methods of conceptual knowledge processing and applications from a variety of practical fields. The methods part introduces modern techniques of conceptual data analysis and knowledge processing. The second part is aimed more strongly at potential users: selected applications illustrate how data analysis and information retrieval proceed with the methods of conceptual knowledge processing and demonstrate their potential
    Content
    Contains the contributions: GANTER, B.: Begriffe und Implikationen; BURMEISTER, P.: ConImp: Ein Programm zur Formalen Begriffsanalyse; LENGNINK, K.: Ähnlichkeit als Distanz in Begriffsverbänden; POLLANDT, S.: Datenanalyse mit Fuzzy-Begriffen; PREDIGER, S.: Terminologische Merkmalslogik in der Formalen Begriffsanalyse; WILLE, R. u. M. ZICKWOLFF: Grundlagen einer Triadischen Begriffsanalyse; LINDIG, C. u. G. SNELTING: Formale Begriffsanalyse im Software Engineering; STRACK, H. u. M. SKORSKY: Zugriffskontrolle bei Programmsystemen und im Datenschutz mittels Formaler Begriffsanalyse; ANDELFINGER, U.: Inhaltliche Erschließung des Bereichs 'Sozialorientierte Gestaltung von Informationstechnik': Ein begriffsanalytischer Ansatz; GÖDERT, W.: Wissensdarstellung in Informationssystemen, Fragetypen und Anforderungen an Retrievalkomponenten; ROCK, T. u. R. WILLE: Ein TOSCANA-Erkundungssystem zur Literatursuche; ESCHENFELDER, D. u.a.: Ein Erkundungssystem zum Baurecht: Methoden der Entwicklung eines TOSCANA-Systems; GROßKOPF, A. u. G. HARRAS: Begriffliche Erkundung semantischer Strukturen von Sprechaktverben; ZELGER, J.: Grundwerte, Ziele und Maßnahmen in einem regionalen Krankenhaus: Eine Anwendung des Verfahrens GABEK; KOHLER-KOCH, B. u. F. VOGT: Normen- und regelgeleitete internationale Kooperationen: Formale Begriffsanalyse in der Politikwissenschaft; HENNING, H.J. u. W. KEMMNITZ: Entwicklung eines kontextuellen Methodenkonzeptes mit Hilfe der Formalen Begriffsanalyse an Beispielen zum Risikoverständnis; BARTEL, H.-G.: Über Möglichkeiten der Formalen Begriffsanalyse in der Mathematischen Archäochemie
  6. Kaytoue, M.; Kuznetsov, S.O.; Assaghir, Z.; Napoli, A.: Embedding tolerance relations in concept lattices : an application in information fusion (2010)
    Abstract
    Formal Concept Analysis (FCA) is a well-founded mathematical framework used for conceptual classification and knowledge management. Given a binary table describing a relation between objects and attributes, FCA consists of building a set of concepts organized by a subsumption relation within a concept lattice. Accordingly, FCA requires transforming complex data, e.g. numbers, intervals, graphs, into binary data, leading to loss of information and poor interpretability of object classes. In this paper, we propose a pre-processing method that produces binary data from complex data by taking advantage of similarity between objects. As a result, the concept lattice is composed of classes that are maximal sets of pairwise similar objects. This method is based on FCA and on a formalization of similarity as a tolerance relation (reflexive and symmetric). It applies to complex object descriptions, and especially here to interval data. Moreover, it can be applied to any kind of structured data for which a similarity can be defined (sequences, graphs, etc.). Finally, an application highlights that the resulting concept lattice plays an important role in information fusion problems, as illustrated with a real-world example in agronomy.
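    Example
    A small Python sketch of this kind of pre-processing, with an invented tolerance on intervals (overlap up to a gap theta); neither the data nor the threshold comes from the paper:

      # Turn interval data into a binary table via a tolerance relation
      # (reflexive and symmetric, not necessarily transitive). Invented data.
      intervals = {"o1": (2.0, 5.0), "o2": (4.0, 8.0), "o3": (9.0, 12.0)}

      def similar(x, y, theta=0.0):
          """Tolerance: two intervals are similar if they overlap (up to a gap theta)."""
          (a1, b1), (a2, b2) = intervals[x], intervals[y]
          return max(a1, a2) - min(b1, b2) <= theta

      # Binary context: object g gets 'attribute' h iff g and h are similar.
      names = sorted(intervals)
      table = {g: {h for h in names if similar(g, h)} for g in names}
      for g in names:
          print(g, sorted(table[g]))   # o1 ['o1', 'o2'], o2 ['o1', 'o2'], o3 ['o3']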
  7. Luksch, P.; Wille, R.: A mathematical model for conceptual knowledge systems (1991)
    Abstract
    Objects, attributes, and concepts are basic notions of conceptual knowledge; they are linked by the following four basic relations: an object has an attribute, an object belongs to a concept, an attribute abstracts from a concept, and a concept is a subconcept of another concept. These structural elements are well mathematized in formal concept analysis. Therefore, conceptual knowledge systems can be mathematically modelled within the framework of formal concept analysis. How such modelling may be performed is indicated by an example of a conceptual knowledge system. The formal definition of the model finally clarifies in which ways representation, inference, acquisition, and communication of conceptual knowledge can be treated mathematically
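    Example
    In formal concept analysis the four basic relations take the following standard form (a reconstruction from the textbook definitions, not a quotation from this paper); in LaTeX notation, for a formal context (G, M, I):

      % Derivation operators over a context (G, M, I), where gIm reads
      % "object g has attribute m" (the first basic relation):
      A' = \{\, m \in M \mid g\,I\,m \ \text{for all}\ g \in A \,\}, \qquad
      B' = \{\, g \in G \mid g\,I\,m \ \text{for all}\ m \in B \,\}
      % (A, B) is a formal concept iff A' = B and B' = A; then g \in A is
      % "g belongs to the concept", m \in B is "m abstracts from the concept",
      % and (A_1, B_1) \le (A_2, B_2) \iff A_1 \subseteq A_2 is the
      % subconcept relation.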
  8. Kollewe, W.: Data representation by nested line diagrams illustrated by a survey of pensioners (1991)
    Abstract
    With formal concept analysis, surveys can be analyzed in such a way that a meaningful picture of the respondents' answers becomes available. Line diagrams of large concept lattices may become less readable, up to the point that it is impossible to follow the line segments by eye. Nested line diagrams offer a way to overcome these difficulties. The main idea of nested line diagrams is to partition the line diagram into boxes so that the line segments between two boxes are all parallel and may be replaced by a single line segment. The possibility of drawing line diagrams with more than two factors makes it possible to describe concept lattices with many hundreds or thousands of concepts in a clear structure. In practice it has often proved useful to use standardized scales for the individual levels
  9. Priss, U.: A graphical interface for conceptually navigating faceted thesauri (1998)
    Abstract
    This paper describes a graphical interface, based on formal concept analysis, for the navigation and construction of faceted thesauri. Each facet of a thesaurus is represented as a mathematical lattice that is further subdivided into components. Users can graphically navigate through the Java implementation of the interface by clicking on terms that connect facets and components. Since there are many applications for thesauri in the knowledge representation field, such a graphical interface has the potential to be very useful
    Series
    Advances in knowledge organization; vol.6
    Source
    Structures and relations in knowledge organization: Proceedings of the 5th International ISKO-Conference, Lille, 25.-29.8.1998. Ed.: W. Mustafa el Hadi et al
  10. Wille, R.: Begriffliche Wissensverarbeitung in der Wirtschaft (2002)
    Abstract
    Conceptual knowledge processing is committed to a pragmatic understanding of knowledge, according to which human knowledge arises and lives on in an open process of human thinking, reasoning, and communicating. It is founded on a mathematical theory of concepts oriented toward the mutual interplay of the formal and the contentual. How this theoretical conception takes effect in business practice is explained by means of the core processes of organizational knowledge management, i.e., following G. Probst et al., knowledge identification, knowledge acquisition, knowledge development, knowledge distribution, knowledge use, and knowledge preservation; the application of specific methods of conceptual knowledge processing is demonstrated with one example for each. Finally, the process-like interdependence of knowledge goals and knowledge assessment with the core processes is discussed from the perspective of conceptual knowledge processing.
  11. Wille, R.: Begriffliche Datensysteme als Werkzeuge der Wissenskommunikation (1992)
    Source
    Mensch und Maschine: Informationelle Schnittstellen der Kommunikation. Proc. des 3. Int. Symposiums für Informationswissenschaft (ISI'92), 5.-7.11.1992 in Saarbrücken. Ed.: H.H. Zimmermann, H.-D. Luckhardt and A. Schulz
  12. Wille, R.; Wachter, C.: Begriffsanalyse von Dokumenten (1992)
    Source
    Information und Dokumentation in den 90er Jahren: neue Herausforderung, neue Technologien. Deutscher Dokumentartag 1991, Universität Ulm, 30.9.-2.10.1991. Ed.: W. Neubauer and K.-H. Meier
  13. Kohler-Koch, B.; Vogt, F.: Normen- und regelgeleitete internationale Kooperationen : Formale Begriffsanalyse in der Politikwissenschaft (2000)
  14. Bartel, H.-G.: Über Möglichkeiten der Formalen Begriffsanalyse in der Mathematischen Archäochemie (2000)
  15. Sedelow, S.Y.; Sedelow, W.A.: Thesauri and concept-lattice semantic nets (1994)
    Abstract
    Formal concept lattices are a promising vehicle for the construction of rigorous and empirically accurate semantic nets. Presented here are results of initial experiments with concept lattices as representations of semantic relationships in the implicit structure of a large database (e.g. Roget's thesaurus)
    Series
    Advances in knowledge organization; vol.4
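    Example
    A toy Python sketch of the kind of context such experiments start from, words against thesaurus entries; the data are invented and far smaller than Roget's:

      # Word-by-entry incidence extracted from a thesaurus (invented data).
      incidence = {
          "bright": {"light", "intelligent"},
          "smart":  {"intelligent", "fashionable"},
          "clever": {"intelligent"},
      }

      def shared_entries(words):
          """Thesaurus entries common to all given words."""
          return set.intersection(*(incidence[w] for w in words))

      def words_for(entries):
          """Words listed under all given entries."""
          return {w for w, e in incidence.items() if entries <= e}

      # The concept generated by {'bright', 'smart'} is a node of the
      # semantic net linking the two words through their shared sense:
      B = shared_entries({"bright", "smart"})   # {'intelligent'}
      print(words_for(B), B)                    # {'bright', 'smart', 'clever'} {'intelligent'}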
  16. Neuss, C.; Kent, R.E.: Conceptual analysis of resource meta-information (1995)
    Abstract
    With the continuously growing amount of Internet-accessible information resources, locating relevant information on the WWW becomes increasingly difficult. Recent developments provide scalable mechanisms for maintaining indexes of network-accessible information. In order to implement sophisticated retrieval engines, a means of automatic analysis and classification of document meta-information has to be found. Proposes the use of methods from the mathematical theory of concept analysis to analyze and interactively explore the information space defined by wide-area resource discovery services
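    Example
    A brief Python sketch of exploring such an information space: every subset of resources is closed into a formal concept, yielding the classes and their shared meta-attributes (metadata invented; brute force is suitable for toy sizes only):

      from itertools import chain, combinations

      meta = {   # resource -> meta-information (invented data)
          "doc1": {"format:html", "lang:en", "topic:fca"},
          "doc2": {"format:pdf",  "lang:en", "topic:fca"},
          "doc3": {"format:pdf",  "lang:de", "topic:retrieval"},
      }

      def intent(docs):
          sets = [meta[d] for d in docs]
          return set.intersection(*sets) if sets else set().union(*meta.values())

      def extent(attrs):
          return frozenset(d for d, a in meta.items() if attrs <= a)

      # Close every subset of resources to enumerate all formal concepts.
      concepts = {(extent(intent(s)), frozenset(intent(s)))
                  for s in chain.from_iterable(
                      combinations(meta, r) for r in range(len(meta) + 1))}
      for ext, itt in sorted(concepts, key=lambda c: (len(c[0]), sorted(c[0]))):
          print(sorted(ext), sorted(itt))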
  17. Ganter, B.; Wille, R.: Formale Begriffsanalyse : Mathematische Grundlagen (1996)
    Abstract
    This first textbook in the field of formal concept analysis provides a systematic presentation of the mathematical foundations and their relation to applications in informatics, especially data analysis and knowledge processing
  18. Pollandt, S.: Fuzzy-Begriffe : Formale Begriffsanalyse unscharfer Daten (1997)
    Abstract
    Starting from the theory of fuzzy sets and fuzzy logic, new methods for the analysis of vague data are developed. To this end, the theory of Formal Concept Analysis is extended by a series of methods and procedures, thereby meeting users' demand for ways of capturing vague data concept-analytically. The required theoretical foundations are provided by way of introduction, and the mathematical presentation is illustrated with easily comprehensible practical examples
    Footnote
    Review in: Knowledge organization 25(1998) no.3, p.123-124 (K.E. Wolff)
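    Example
    A compact Python sketch in the spirit of fuzzy concept analysis, using the Lukasiewicz residuum as the implication; the membership degrees are invented, not taken from the book:

      # Fuzzy derivation over a [0,1]-valued context with the Lukasiewicz
      # residuum a -> b = min(1, 1 - a + b). All degrees are invented.
      I = {   # I[g][m] = degree to which object g has attribute m
          "g1": {"m1": 1.0, "m2": 0.4},
          "g2": {"m1": 0.7, "m2": 0.9},
      }

      def implies(a, b):
          return min(1.0, 1.0 - a + b)

      def fuzzy_intent(A):
          """A'(m) = min over g of (A(g) -> I(g, m)) for a fuzzy object set A."""
          return {m: min(implies(A.get(g, 0.0), I[g][m]) for g in I)
                  for m in ("m1", "m2")}

      print(fuzzy_intent({"g1": 1.0, "g2": 1.0}))   # {'m1': 0.7, 'm2': 0.4}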
  19. Helmerich, M.: Liniendiagramme in der Wissenskommunikation : eine mathematisch-didaktische Untersuchung (2011)
    Abstract
    The communication of knowledge occupies a decisive place in the modern knowledge society. Communication in Habermas's sense of an intersubjective process of reaching understanding is, however, more than the mere exchange of signs: it is about sense and meaning, and about the processes of negotiating how we, as a communication community, interpret signs and encode information in them. Line diagrams from the theory of Formal Concept Analysis are particularly well suited as a medium for such communication processes. These line diagrams are suitable not only for supporting knowledge communication but also for initiating communication processes in the first place. They support knowledge communication well because, through their simplicity, order, concision, and complementary stimulus, they foster agreement about the knowledge-generating information. Moreover, line diagrams provide a means of communication that can work inter- and transdisciplinarily, opening up fields of knowledge to different disciplines, since the diagrams succeed in bringing out the general underlying logical structure by means of a mathematically founded procedure. Line diagrams not only present fields of knowledge in an ordered, structured form; they also employ formal concepts and thus connect to concepts as objects of human thought. In a concept, a set of the objects under consideration (in the example, the various types of water bodies) merges with their common attributes into a new unit of thought, thereby giving knowledge a form in which communication about these units of thought and the information concentrated in them becomes possible.
  20. Negm, E.; AbdelRahman, S.; Bahgat, R.: PREFCA: a portal retrieval engine based on formal concept analysis (2017)
    Abstract
    The web is a network of linked sites, where each site is either a physical portal or a standalone page. In the former case, the portal presents an access point to its embedded web pages, which coherently present a specific topic. In the latter case, millions of standalone web pages scattered throughout the web share the same topic and could be conceptually linked together to form virtual portals. Search engines have been developed to help users reach the appropriate pages efficiently and effectively. All current search engine techniques rely on the web page as the basic atomic search unit; they ignore the conceptual links, which reveal the implicit web-related meanings, among the retrieved pages. However, a semantic model of a whole portal may contain more semantic information than a model of scattered individual pages. In addition, user queries can be poor and contain imprecise terms that do not reflect the user's real intention. Consequently, retrieving the standalone individual pages that are directly related to the query may not satisfy the user's need. In this paper, we propose PREFCA, a Portal Retrieval Engine based on Formal Concept Analysis that relies on the portal as the main search unit. PREFCA consists of three phases: first, the information extraction phase, concerned with extracting a portal's semantic data; second, the formal concept analysis phase, which uses formal concept analysis to discover the conceptual links among portals and attributes; finally, the information retrieval phase, in which we propose a portal ranking method to retrieve ranked pairs of portals and embedded pages. Additionally, we apply network analysis rules to derive some portal characteristics. We evaluated PREFCA on two data sets, the Forum for Information Retrieval Evaluation 2010 and the ClueWeb09 (category B) test data, for physical and virtual portals respectively. PREFCA achieves higher F-measure accuracy, better Mean Average Precision ranking, and comparable network analysis and efficiency results compared with other search engine approaches, namely Term Frequency Inverse Document Frequency (TF-IDF), Latent Semantic Analysis (LSA), and BM25. It also attains high Mean Average Precision in comparison with learning-to-rank techniques. Moreover, PREFCA achieves a better reach time than Carrot, a well-known topic-based search engine.
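    Example
    A toy Python sketch of the portal-as-search-unit idea only, not of PREFCA's actual ranking method; portals, pages, and terms are all invented:

      # Rank portals (not single pages) against a query, returning each
      # portal together with its best-matching embedded page.
      portals = {   # portal -> page -> terms (invented data)
          "p1": {"p1/a": {"fca", "lattice"}, "p1/b": {"retrieval", "fca"}},
          "p2": {"p2/a": {"music", "jazz"},  "p2/b": {"retrieval"}},
      }

      def rank(query):
          scored = []
          for portal, pages in portals.items():
              portal_terms = set().union(*pages.values())
              score = len(query & portal_terms) / len(query)
              best = max(pages, key=lambda p: len(query & pages[p]))
              scored.append((score, portal, best))
          return sorted(scored, reverse=True)

      print(rank({"fca", "retrieval"}))
      # [(1.0, 'p1', 'p1/b'), (0.5, 'p2', 'p2/b')]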