Search (78 results, page 1 of 4)

  • theme_ss:"Formale Begriffsanalyse"
  1. Conceptual structures : logical, linguistic, and computational issues. 8th International Conference on Conceptual Structures, ICCS 2000, Darmstadt, Germany, August 14-18, 2000 (2000) 0.04
    Abstract
    Computer scientists create models of a perceived reality. Through AI techniques, these models aim at providing the basic support for emulating cognitive behavior such as reasoning and learning, which is one of the main goals of the AI research effort. Such computer models are formed through the interaction of various acquisition and inference mechanisms: perception, concept learning, conceptual clustering, hypothesis testing, probabilistic inference, etc., and are represented using different paradigms tightly linked to the processes that use them. Among these paradigms are: biological models (neural nets, genetic programming), logic-based models (first-order logic, modal logic, rule-based systems), virtual reality models (object systems, agent systems), probabilistic models (Bayesian nets, fuzzy logic), linguistic models (conceptual dependency graphs, language-based representations), etc. One of the strengths of the Conceptual Graph (CG) theory is its versatility in terms of the representation paradigms under which it falls. It can be viewed, and therefore used, under different representation paradigms, which makes it a popular choice for a wealth of applications. Its full coupling with different cognitive processes has opened the field toward related research communities such as the Description Logic, Formal Concept Analysis, and Computational Linguistics communities. We now see more and more research results from one community enriching the others, laying the foundations of common philosophical ground from which a successful synergy can emerge. ICCS 2000 embodies this spirit of research collaboration. It presents a set of papers that, we believe, by their exposure will benefit the whole community.
    For instance, the technical program proposes tracks on Conceptual Ontologies, Language, Formal Concept Analysis, Computational Aspects of Conceptual Structures, and Formal Semantics, along with some papers on pragmatism and human-related aspects of computing. Never before has the ICCS program been formed by so heterogeneously rooted theories of knowledge representation and use. We hope that this swirl of ideas will benefit you as much as it already has benefited us while putting together this program.
    Content
    Concepts and Language: The Role of Conceptual Structure in Human Evolution (Keith Devlin) - Concepts in Linguistics - Concepts in Natural Language (Gisela Harras) - Patterns, Schemata, and Types: Author Support through Formalized Experience (Felix H. Gatzemeier) - Conventions and Notations for Knowledge Representation and Retrieval (Philippe Martin) - Conceptual Ontology: Ontology, Metadata, and Semiotics (John F. Sowa) - Pragmatically Yours (Mary Keeler) - Conceptual Modeling for Distributed Ontology Environments (Deborah L. McGuinness) - Discovery of Class Relations in Exception Structured Knowledge Bases (Hendra Suryanto, Paul Compton) - Conceptual Graphs: Perspectives: CGs Applications: Where Are We 7 Years after the First ICCS? (Michel Chein, David Genest) - The Engineering of a CG-Based System: Fundamental Issues (Guy W. Mineau) - Conceptual Graphs, Metamodeling, and Notation of Concepts (Olivier Gerbé, Guy W. Mineau, Rudolf K. Keller) - Knowledge Representation and Reasonings Based on Graph Homomorphism (Marie-Laure Mugnier) - User Modeling Using Conceptual Graphs for Intelligent Agents (James F. Baldwin, Trevor P. Martin, Aimilia Tzanavari) - Towards a Unified Querying System of Both Structured and Semi-structured Imprecise Data Using Fuzzy View (Patrice Buche, Ollivier Haemmerlé) - Formal Semantics of Conceptual Structures: The Extensional Semantics of the Conceptual Graph Formalism (Guy W. Mineau) - Semantics of Attribute Relations in Conceptual Graphs (Pavel Kocura) - Nested Concept Graphs and Triadic Power Context Families (Susanne Prediger) - Negations in Simple Concept Graphs (Frithjof Dau) - Extending the CG Model by Simulations (Jean-François Baget) - Contextual Logic and Formal Concept Analysis: Building and Structuring Description Logic Knowledge Bases Using Least Common Subsumers and Concept Analysis (Franz Baader, Ralf Molitor) - On the Contextual Logic of Ordinal Data (Silke Pollandt, Rudolf Wille) - Boolean Concept Logic (Rudolf Wille) - Lattices of Triadic Concept Graphs (Bernd Groh, Rudolf Wille) - Formalizing Hypotheses with Concepts (Bernhard Ganter, Sergei O. Kuznetsov) - Generalized Formal Concept Analysis (Laurent Chaudron, Nicolas Maille) - A Logical Generalization of Formal Concept Analysis (Sébastien Ferré, Olivier Ridoux) - On the Treatment of Incomplete Knowledge in Formal Concept Analysis (Peter Burmeister, Richard Holzer) - Conceptual Structures in Practice: Logic-Based Networks: Concept Graphs and Conceptual Structures (Peter W. Eklund) - Conceptual Knowledge Discovery and Data Analysis (Joachim Hereth, Gerd Stumme, Rudolf Wille, Uta Wille) - CEM - A Conceptual Email Manager (Richard Cole, Gerd Stumme) - A Contextual-Logic Extension of TOSCANA (Peter Eklund, Bernd Groh, Gerd Stumme, Rudolf Wille) - A Conceptual Graph Model for W3C Resource Description Framework (Olivier Corby, Rose Dieng, Cédric Hébert) - Computational Aspects of Conceptual Structures: Computing with Conceptual Structures (Bernhard Ganter) - Symmetry and the Computation of Conceptual Structures (Robert Levinson) - An Introduction to SNePS 3 (Stuart C. Shapiro) - Composition Norm Dynamics Calculation with Conceptual Graphs (Aldo de Moor) - From PROLOG++ to PROLOG+CG: A CG Object-Oriented Logic Programming Language (Adil Kabbaj, Martin Janta-Polczynski) - A Cost-Bounded Algorithm to Control Events Generalization (Gaël de Chalendar, Brigitte Grau, Olivier Ferret)
  2. Vogt, F.; Wille, R.: TOSCANA - a graphical tool for analyzing and exploring data (1995) 0.04
    Abstract
    TOSCANA is a computer program which allows online interaction with large databases to analyse and explore data conceptually. It uses labelled line diagrams of concept lattices to communicate knowledge coded in given data. The basic problem of creating online presentations of concept lattices is solved by composing prepared diagrams into nested line diagrams. A large number of applications in different areas have already shown that TOSCANA is a useful tool for many purposes
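    The concept lattices that TOSCANA draws as line diagrams arise from a formal context (a binary relation between objects and attributes) via the two derivation operators of Formal Concept Analysis. A minimal sketch, using hypothetical toy data and a naive enumeration that is only adequate for small contexts:

```python
from itertools import combinations

# Toy formal context (hypothetical data): objects mapped to attributes.
context = {
    "doc1": {"lattice", "diagram"},
    "doc2": {"lattice", "thesaurus"},
    "doc3": {"diagram", "thesaurus"},
}
attributes = set().union(*context.values())

def extent(attrs):
    """Derivation operator: objects possessing every attribute in attrs."""
    return {o for o, a in context.items() if attrs <= a}

def intent(objects):
    """Derivation operator: attributes shared by every given object."""
    sets = [context[o] for o in objects]
    return set.intersection(*sets) if sets else set(attributes)

def concepts():
    """All formal concepts (extent, intent), found by closing every
    attribute subset -- naive, but fine for small contexts like this."""
    seen, result = set(), []
    for r in range(len(attributes) + 1):
        for attrs in combinations(sorted(attributes), r):
            ext = frozenset(extent(set(attrs)))
            if ext not in seen:
                seen.add(ext)
                result.append((set(ext), intent(ext)))
    return result

for ext, itt in sorted(concepts(), key=lambda c: (len(c[0]), sorted(c[0]))):
    print(sorted(ext), "->", sorted(itt))
```

    Ordering these concepts by extent inclusion yields the lattice whose labelled line diagram a tool like TOSCANA would display.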
    Source
    Knowledge organization. 22(1995) no.2, S.78-81
  3. Skorsky, M.: Graphische Darstellung eines Thesaurus (1997) 0.04
    Source
    Information und Dokumentation: Qualität und Qualifikation. Deutscher Dokumentartag 1997, Universität Regensburg, 24.-26.9.1997. Hrsg.: M. Ockenfeld u. G.J. Mantwill
    Theme
    Konzeption und Anwendung des Prinzips Thesaurus
  4. Priss, U.: Faceted information representation (2000) 0.04
    Abstract
    This paper presents an abstract formalization of the notion of "facets". Facets are relational structures of units, relations and other facets selected for a certain purpose. Facets can be used to structure large knowledge representation systems into a hierarchical arrangement of consistent and independent subsystems (facets) that facilitate flexibility and combinations of different viewpoints or aspects. This paper describes the basic notions, facet characteristics and construction mechanisms. It then explicates the theory in an example of a faceted information retrieval system (FaIR)
    Date
    22. 1.2016 17:47:06
  5. Negm, E.; AbdelRahman, S.; Bahgat, R.: PREFCA: a portal retrieval engine based on formal concept analysis (2017) 0.04
    Abstract
    The web is a network of linked sites, whereby each site either forms a physical portal or a standalone page. In the former case, the portal presents an access point to its embedded web pages, which coherently present a specific topic. In the latter case, there are millions of standalone web pages, scattered across the web, that share the same topic and could be conceptually linked together to form virtual portals. Search engines have been developed to help users reach the relevant pages in an efficient and effective manner. All current search engine techniques rely on the web page as the basic atomic search unit. They ignore the conceptual links that reveal the implicit web-related meanings among the retrieved pages. However, building a semantic model for a whole portal may contain more semantic information than a model of scattered individual pages. In addition, user queries can be poor and contain imprecise terms that do not reflect the real user intention. Consequently, retrieving the standalone individual pages that are directly related to the query may not satisfy the user's need. In this paper, we propose PREFCA, a Portal Retrieval Engine based on Formal Concept Analysis that relies on the portal as the main search unit. PREFCA consists of three phases: first, the information extraction phase, concerned with extracting the portal's semantic data; second, the formal concept analysis phase, which utilizes formal concept analysis to discover the conceptual links among portals and attributes; finally, the information retrieval phase, where we propose a portal ranking method to retrieve ranked pairs of portals and embedded pages. Additionally, we apply network analysis rules to output some portal characteristics. We evaluated PREFCA using two data sets, namely the Forum for Information Retrieval Evaluation 2010 and ClueWeb09 (category B) test data, for physical and virtual portals respectively. 
PREFCA achieves higher F-measure accuracy, better Mean Average Precision ranking, and comparable network-analysis and efficiency results than other search engine approaches, namely Term Frequency Inverse Document Frequency (TF-IDF), Latent Semantic Analysis (LSA), and BM25. It also attains high Mean Average Precision in comparison with learning-to-rank techniques. Moreover, PREFCA achieves better reach time than Carrot, a well-known topic-based search engine.
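    The "conceptual link" idea behind the virtual portals can be illustrated with the FCA derivation operators: closing a set of topic terms yields both the standalone pages that share them and the full term set those pages have in common. A hedged sketch with purely illustrative page data and names (not PREFCA's actual extraction or ranking):

```python
# Hypothetical page -> extracted topic terms; the paper's own
# information extraction phase is far more elaborate.
pages = {
    "p1": {"retrieval", "fca"},
    "p2": {"retrieval", "fca", "ranking"},
    "p3": {"thesaurus"},
}

def extent(terms):
    """Pages carrying every term in terms."""
    return {p for p, t in pages.items() if terms <= t}

def intent(ps):
    """Terms shared by every page in ps."""
    return set.intersection(*(pages[p] for p in ps)) if ps else set()

def virtual_portal(seed_terms):
    """Close the seed terms into a formal concept: the pages sharing
    all of them, plus the full term set those pages have in common.
    The result groups scattered pages into one conceptual unit."""
    ext = extent(set(seed_terms))
    return ext, intent(ext)

ext, itt = virtual_portal({"fca"})
print(sorted(ext), sorted(itt))
```

    Here the pages p1 and p2 form a small virtual portal linked by the shared terms, while p3 stays outside it.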
    Source
    Information processing and management. 53(2017) no.1, S.203-222
  6. Scheich, P.; Skorsky, M.; Vogt, F.; Wachter, C.; Wille, R.: Conceptual data systems (1993) 0.04
    Source
    Information and classification: concepts, methods and applications. Proceedings of the 16th Annual Conference of the Gesellschaft für Klassifikation, University of Dortmund, April 1-3, 1992. Ed.: O. Opitz u.a
  7. Neuss, C.; Kent, R.E.: Conceptual analysis of resource meta-information (1995) 0.04
    Abstract
    With the continuously growing amount of Internet-accessible information resources, locating relevant information in the WWW becomes increasingly difficult. Recent developments provide scalable mechanisms for maintaining indexes of network-accessible information. In order to implement sophisticated retrieval engines, a means of automatic analysis and classification of document meta-information has to be found. Proposes the use of methods from the mathematical theory of concept analysis to analyze and interactively explore the information space defined by wide-area resource discovery services.
  8. Priss, U.: Faceted knowledge representation (1999)
    Abstract
    Faceted Knowledge Representation provides a formalism for implementing knowledge systems. The basic notions of faceted knowledge representation are "unit", "relation", "facet" and "interpretation". Units are atomic elements and can be abstract elements or refer to external objects in an application. Relations are sequences or matrices of 0s and 1s (binary matrices). Facets are relational structures that combine units and relations. Each facet represents an aspect or viewpoint of a knowledge system. Interpretations are mappings that can be used to translate between different representations. This paper introduces the basic notions of faceted knowledge representation. The formalism is applied here to an abstract modeling of a faceted thesaurus as used in information retrieval.
    Date
    22. 1.2016 17:30:31
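The notions in Priss's abstract above (units, binary relations as 0/1 matrices, facets combining the two) can be sketched in a few lines. The following Python fragment is illustrative only; the names and data (`units`, `facet`, "containment") are invented for this sketch, not taken from the paper:

```python
# A tiny illustration of the paper's notions: units, a binary relation,
# and a facet that combines them. Names and data are invented.

units = ["thesaurus", "term", "document"]

# Binary relation as a 0/1 matrix: relation[i][j] = 1 means
# units[i] stands in the relation to units[j] (here: "contains").
relation = [
    [0, 1, 0],  # a thesaurus contains terms
    [0, 0, 1],  # a term indexes documents
    [0, 0, 0],
]

# A facet bundles units with a relation, i.e. one viewpoint on the system.
facet = {"units": units, "relation": relation, "name": "containment"}

def related(facet, a, b):
    """True if unit a stands in the facet's relation to unit b."""
    i = facet["units"].index(a)
    j = facet["units"].index(b)
    return facet["relation"][i][j] == 1

print(related(facet, "thesaurus", "term"))   # → True
print(related(facet, "term", "thesaurus"))   # → False
```

An "interpretation" in the paper's sense would then be a mapping translating one such facet into another representation of the same units.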
  9. Kumar, C.A.; Radvansky, M.; Annapurna, J.: Analysis of Vector Space Model, Latent Semantic Indexing and Formal Concept Analysis for information retrieval (2012)
    Abstract
    Latent Semantic Indexing (LSI), a variant of the classical Vector Space Model (VSM), is an Information Retrieval (IR) model that attempts to capture the latent semantic relationships between data items. Mathematical lattices, under the framework of Formal Concept Analysis (FCA), represent conceptual hierarchies in data and retrieve the information. However, both LSI and FCA use data represented in the form of matrices. The objective of this paper is to systematically analyze VSM, LSI and FCA for the task of IR using standard and real-life datasets.
    Source
    Cybernetics and information technologies. 12(2012) no.1, S.34-48
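As the abstract above notes, both LSI and FCA start from a binary term-document matrix. A hedged sketch of FCA's two derivation operators (the Galois connection that generates the concept lattice) on such data; the context below is invented for illustration and is not from the paper:

```python
# Sketch of FCA's derivation operators on a small term-document
# incidence relation, the kind of binary matrix that, per the abstract,
# both LSI and FCA consume. The context below is invented.

docs = {
    "d1": {"lattice", "retrieval"},
    "d2": {"lattice", "thesaurus"},
    "d3": {"retrieval", "thesaurus"},
}
all_terms = set().union(*docs.values())

def intent(doc_set):
    """Terms shared by every document in doc_set (the ' operator on extents)."""
    result = set(all_terms)
    for d in doc_set:
        result &= docs[d]
    return result

def extent(term_set):
    """Documents containing every term in term_set (the ' operator on intents)."""
    return {d for d, terms in docs.items() if term_set <= terms}

# A formal concept is a pair (A, B) with extent(B) == A and intent(A) == B;
# the concepts of a context, ordered by extent inclusion, form the lattice.
A = extent({"lattice"})   # documents indexed by "lattice"
B = intent(A)             # terms common to exactly those documents
print(sorted(A), sorted(B))    # → ['d1', 'd2'] ['lattice']
print(extent(B) == A)          # → True: (A, B) is a formal concept
```

LSI, by contrast, would factor the same incidence matrix numerically (via a truncated SVD) rather than derive a discrete lattice from it.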
  10. Priss, U.; Jacob, E.: Utilizing faceted structures for information systems design (1999)
    Abstract
    Even for the experienced information professional, designing an efficient multi-purpose information access structure can be a very difficult task. This paper argues for the use of a faceted thesaurus as the basis for organizing a small-scale institutional website. We contend that a faceted approach to knowledge organization can make the process of organization less random and more manageable. We begin by reporting on an informal survey of three institutional websites. This study underscores the problems of organization that can impact access to information. We then formalize the terminology of faceted thesauri and demonstrate its application with several examples.
    The writers show that a faceted navigation structure makes web sites easier to use. They begin by analyzing the web sites of three library and information science faculties, and seeing if the sites easily provide the answers to five specific questions, e.g., how the school ranks in national evaluations. (It is worth noting that the web site of the Faculty of Information Studies at the University of Toronto, where this bibliography is being written, would fail on four of the five questions.) Using examples from LIS web site content, they show how facets can be related and constructed, and use concept diagrams for illustration. They briefly discuss constraints necessary when joining facets: for example, enrolled students can be full- or part-time, but prospective and alumni students cannot. It should not be possible to construct terms such as "part-time alumni" (see Yannis Tzitzikas et al., below in Background). They conclude that a faceted approach is best for web site navigation, because it can clearly show where the user is in the site, what the related pages are, and how to get to them. There is a short discussion of user interfaces, and the diagrams in the paper will be of interest to anyone making a facet-based web site. This paper is clearly written, informative, and thought-provoking. Uta Priss's web site lists her other publications, many of which are related and some of which are online: http://www.upriss.org.uk/top/research.html.
    Imprint
    Medford, NJ : Information Today
    Series
    Proceedings of the American Society for Information Science; vol.36
    Source
    Knowledge: creation, organization and use. Proceedings of the 62nd Annual Meeting of the American Society for Information Science, 31.10.-4.11.1999. Ed.: L. Woods
  11. Wille, R.; Wachter, C.: Begriffsanalyse von Dokumenten (1992)
    Source
    Information und Dokumentation in den 90er Jahren: neue Herausforderung, neue Technologien. Deutscher Dokumentartag 1991, Universität Ulm, 30.9.-2.10.1991. Hrsg.: W. Neubauer u. K.-H. Meier
  12. Priss, U.: Lattice-based information retrieval (2000)
    Abstract
    A lattice-based model for information retrieval was suggested in the 1960s but has ever since been seen as a theoretical possibility that is hard to apply in practice. This paper attempts to revive the lattice model and demonstrate its applicability in an information retrieval system, FaIR, that incorporates a graphical representation of a faceted thesaurus. It shows how Boolean queries can be lattice-theoretically related to the concepts of the thesaurus and visualized within the thesaurus display. An advantage of FaIR is that it allows for a high level of transparency of the system, which can be controlled by the user.
  13. Kent, R.E.: Implications and rules in thesauri (1994)
    Abstract
    A central consideration in the study of whole language semantic space as encoded in thesauri is word sense comparability. Shows how word sense comparability can be adequately expressed by the logical implications and rules from Formal Concept Analysis. Formal concept analysis, a new approach to formal logic initiated by Rudolf Wille, has been used for data modelling, analysis and interpretation, and also for knowledge representation and knowledge discovery
    Source
    Knowledge organization and quality management: Proc. of the 3rd International ISKO Conference, 20-24 June 1994, Copenhagen, Denmark. Ed.: H. Albrechtsen et al
    Theme
    Konzeption und Anwendung des Prinzips Thesaurus
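Kent's abstract above appeals to the logical implications of Formal Concept Analysis. As a hedged illustration (the context and attribute names are invented, not Kent's), an attribute implication A → B holds in a formal context exactly when B is contained in the closure A'':

```python
# Sketch of an attribute implication check in a formal context, the
# FCA machinery the abstract refers to. Objects and attributes are
# invented word senses for illustration.

objects = {
    "bank_river": {"landform", "water-adjacent"},
    "bank_money": {"institution", "building"},
    "museum":     {"institution", "building"},
}

def closure(attrs):
    """Return attrs'' : the attributes shared by all objects having attrs."""
    having = [o for o, s in objects.items() if attrs <= s]
    if not having:
        # No object has all of attrs: the closure is the full attribute set.
        return set().union(*objects.values())
    return set.intersection(*(objects[o] for o in having))

def implies(premise, conclusion):
    """Implication premise -> conclusion holds iff conclusion ⊆ premise''."""
    return conclusion <= closure(premise)

print(implies({"institution"}, {"building"}))   # → True
print(implies({"building"}, {"landform"}))      # → False
```

In Kent's setting, such implications between attribute sets are what make word senses formally comparable across a thesaurus.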
  14. Working with conceptual structures : contributions to ICCS 2000. 8th International Conference on Conceptual Structures: Logical, Linguistic, and Computational Issues. Darmstadt, August 14-18, 2000 (2000)
    Abstract
    The 8th International Conference on Conceptual Structures - Logical, Linguistic, and Computational Issues (ICCS 2000) brings together a wide range of researchers and practitioners working with conceptual structures. During the last few years, the ICCS conference series has considerably widened its scope on different kinds of conceptual structures, stimulating research across domain boundaries. We hope that this stimulation is further enhanced by ICCS 2000 joining the long tradition of conferences in Darmstadt with extensive, lively discussions. This volume consists of contributions presented at ICCS 2000, complementing the volume "Conceptual Structures: Logical, Linguistic, and Computational Issues" (B. Ganter, G.W. Mineau (Eds.), LNAI 1867, Springer, Berlin-Heidelberg 2000). It contains submissions reviewed by the program committee, and position papers. We wish to express our appreciation to all the authors of submitted papers, to the general chair, the program chair, the editorial board, the program committee, and to the additional reviewers for making ICCS 2000 a valuable contribution in the knowledge processing research field. Special thanks go to the local organizers for making the conference an enjoyable and inspiring event. We are grateful to Darmstadt University of Technology, the Ernst Schröder Center for Conceptual Knowledge Processing, the Center for Interdisciplinary Studies in Technology, the Deutsche Forschungsgemeinschaft, Land Hessen, and NaviCon GmbH for their generous support
    Content
    Concepts & Language: Knowledge organization by procedures of natural language processing. A case study using the method GABEK (J. Zelger, J. Gadner) - Computer aided narrative analysis using conceptual graphs (H. Schärfe, P. Øhrstrøm) - Pragmatic representation of argumentative text: a challenge for the conceptual graph approach (H. Irandoust, B. Moulin) - Conceptual graphs as a knowledge representation core in a complex language learning environment (G. Angelova, A. Nenkova, S. Boycheva, T. Nikolov) - Conceptual Modeling and Ontologies: Relationships and actions in conceptual categories (Ch. Landauer, K.L. Bellman) - Concept approximations for formal concept analysis (J. Saquer, J.S. Deogun) - Faceted information representation (U. Priß) - Simple concept graphs with universal quantifiers (J. Tappe) - A framework for comparing methods for using or reusing multiple ontologies in an application (J. van Zyl, D. Corbett) - Designing task/method knowledge-based systems with conceptual graphs (M. Leclère, F. Trichet, Ch. Choquet) - A logical ontology (J. Farkas, J. Sarbo) - Algorithms and Tools: Fast concept analysis (Ch. Lindig) - A framework for conceptual graph unification (D. Corbett) - Visual CP representation of knowledge (H.D. Pfeiffer, R.T. Hartley) - Maximal isojoin for representing software textual specifications and detecting semantic anomalies (Th. Charnois) - Troika: using grids, lattices and graphs in knowledge acquisition (H.S. Delugach, B.E. Lampkin) - Open world theorem prover for conceptual graphs (J.E. Heaton, P. Kocura) - NetCare: a practical conceptual graphs software tool (S. Polovina, D. Strang) - CGWorld - a web based workbench for conceptual graphs management and applications (P. Dobrev, K. Toutanova) - Position papers: The edition project: Peirce's existential graphs (R. Müller) - Mining association rules using formal concept analysis (N. Pasquier) - Contextual logic summary (R. Wille) - Information channels and conceptual scaling (K.E. Wolff) - Spatial concepts - a rule exploration (S. Rudolph) - The TEXT-TO-ONTO learning environment (A. Mädche, St. Staab) - Controlling the semantics of metadata on audio-visual documents using ontologies (Th. Dechilly, B. Bachimont) - Building the ontological foundations of a terminology from natural language to conceptual graphs with Ribosome, a knowledge extraction system (Ch. Jacquelinet, A. Burgun) - CharGer: some lessons learned and new directions (H.S. Delugach) - Knowledge management using conceptual graphs (W.K. Pun)
  15. Eklund, P.; Groh, B.; Stumme, G.; Wille, R.: A conceptual-logic extension of TOSCANA (2000)
    Abstract
    The aim of this paper is to indicate how TOSCANA may be extended to allow graphical representations not only of concept lattices but also of concept graphs in the sense of Contextual Logic. The contextual-logic extension of TOSCANA requires the logical scaling of conceptual and relational scales, for which we propose the Peircean Algebraic Logic as reconstructed by R. W. Burch. As graphical representations we recommend, besides labelled line diagrams of concept lattices and Sowa's diagrams of conceptual graphs, particular information maps for utilizing background knowledge as much as possible. Our considerations are illustrated by a small information system about domestic flights in Austria.
  16. Carpineto, C.; Romano, G.: Order-theoretical ranking (2000) 0.03
    
    Abstract
    Current best-match ranking (BMR) systems perform well but cannot handle word mismatch between a query and a document. The best-known alternative ranking method, hierarchical clustering-based ranking (HCR), seems to be more robust than BMR with respect to this problem, but it is hampered by theoretical and practical limitations. We present an approach to document ranking that explicitly addresses the word mismatch problem by exploiting interdocument similarity information in a novel way. Document ranking is seen as a query-document transformation driven by a conceptual representation of the whole document collection, into which the query is merged. Our approach is based on the theory of concept (or Galois) lattices, which, we argue, provides a powerful, well-founded, and computationally tractable framework to model the space in which documents and query are represented and to compute such a transformation. We compared information retrieval using concept lattice-based ranking (CLR) to BMR and HCR. The results showed that HCR was outperformed by both CLR and BMR, and suggested that, of the two best methods, BMR achieved better performance than CLR on the whole document set, whereas CLR compared more favorably when only the first retrieved documents were used for evaluation. We also evaluated the three methods' specific ability to rank documents that did not match the query, in which case the superiority of CLR over BMR and HCR was apparent.
    Source
    Journal of the American Society for Information Science. 51(2000) no.7, S.587-601
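    The Galois (concept) lattice behind CLR can be derived from a binary document-term context: each formal concept pairs a set of documents with exactly the terms they share. A minimal sketch with invented toy data and a naive exponential enumeration, not the authors' implementation:

    ```python
    from itertools import combinations

    # Toy formal context: document -> set of index terms (hypothetical data).
    context = {
        "doc1": {"lattice", "ranking"},
        "doc2": {"lattice", "retrieval"},
        "doc3": {"ranking", "retrieval"},
    }

    def intent(objs):
        """Attributes common to every object in objs."""
        sets = [context[o] for o in objs]
        return set.intersection(*sets) if sets else set.union(*context.values())

    def extent(attrs):
        """Objects possessing every attribute in attrs."""
        return {o for o, a in context.items() if attrs <= a}

    # Enumerate all formal concepts (extent, intent) by closing every
    # subset of objects -- fine for toy contexts, exponential in general.
    concepts = set()
    objs = list(context)
    for r in range(len(objs) + 1):
        for combo in combinations(objs, r):
            B = intent(set(combo))
            A = extent(B)
            concepts.add((frozenset(A), frozenset(B)))

    for A, B in sorted(concepts, key=lambda c: len(c[0])):
        print(sorted(A), "<->", sorted(B))
    ```

    For this 3x3 context the result is the full Boolean lattice of 8 concepts, ordered by extent inclusion; a query would be merged in as one more "document" and ranked by its distance in this order.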
  17. Sedelow, W.A.: ¬The formal analysis of concepts (1993) 0.03
    
    Abstract
    The present paper focuses on the extraction, by means of a formal logical/mathematical methodology (i.e. automatically, exclusively by rule), of concept content, as in, for example, continuous discourse. The approach to a fully formal definition of concept content is ultimately owing to a German government initiative to establish 'standards' regarding concepts, in conjunction with efforts to stipulate precisely (and then, derivatively, through computer programs) data and information needs according to work role in certain government offices.
  18. De Maio, C.; Fenza, G.; Loia, V.; Senatore, S.: Hierarchical web resources retrieval by exploiting Fuzzy Formal Concept Analysis (2012) 0.03
    
    Abstract
    In recent years, knowledge structuring has assumed important roles in several real-world applications such as decision support, cooperative problem solving, e-commerce, the Semantic Web and even planning systems. Ontologies play an important role in supporting automated processes to access information and are at the core of new strategies for the development of knowledge-based systems. Yet developing an ontology is a time-consuming task which often requires accurate domain expertise to tackle structural and logical difficulties in the definition of concepts as well as conceivable relationships. This work presents an ontology-based retrieval approach that supports data organization and visualization and provides a friendly navigation model. It exploits the fuzzy extension of the Formal Concept Analysis theory to elicit conceptualizations from datasets and generate a hierarchy-based representation of extracted knowledge. An intuitive graphical interface provides a multi-facet view of the built ontology. Through transparent query-based retrieval, end users navigate across concepts, relations and population.
    Content
    Contribution to a special issue "Soft Approaches to IA on the Web". See: doi:10.1016/j.ipm.2011.04.003.
    Source
    Information processing and management. 48(2012) no.3, S.399-418
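    One common bridge between graded memberships and ordinary concept analysis is to alpha-cut a fuzzy context into a crisp one and then apply standard FCA. This is a simplification of the one-sided fuzzy concepts the paper builds on; the data and names below are invented for illustration:

    ```python
    # Fuzzy formal context: each web resource maps attributes to
    # membership degrees in [0, 1] (hypothetical toy data).
    fuzzy_context = {
        "page1": {"semantic_web": 0.9, "ecommerce": 0.2},
        "page2": {"semantic_web": 0.6, "ecommerce": 0.8},
        "page3": {"semantic_web": 0.1, "ecommerce": 0.7},
    }

    def alpha_cut(ctx, alpha):
        """Crisp context keeping only attributes with degree >= alpha."""
        return {obj: {a for a, mu in attrs.items() if mu >= alpha}
                for obj, attrs in ctx.items()}

    crisp = alpha_cut(fuzzy_context, 0.5)
    print(crisp)
    ```

    Raising alpha yields coarser crisp contexts and hence smaller concept hierarchies, which is one way such systems trade recall for precision in the navigation lattice.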
  19. Lex, W.: ¬A representation of concepts for their computerization (1987) 0.02
    
    Abstract
    A lattice-theoretical description of concept hierarchies is developed, using for attributes the terms "given", "negated", "open" and "impossible" as the truth-values of a four-valued logic. Similar to the theory of B. Ganter and R. Wille, this framework permits a precise representation of the usual interdependences in a field of related concepts - such as superconcepts, subconcepts, contrary concepts, etc. - whenever the concepts under consideration can be sufficiently described by the presence or absence of certain attributes ...
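    The four truth-values can be read as an information order in the style of Belnap's four-valued logic: "open" carries no information, "given" and "negated" are incomparable refinements, and "impossible" subsumes both. This is a hypothetical reading, not necessarily Lex's exact order; a minimal sketch:

    ```python
    # Reflexive-transitive information order on the four truth values
    # (assumed Belnap-style; pairs (a, b) mean a <= b).
    ORDER = {
        ("OPEN", "OPEN"), ("OPEN", "GIVEN"), ("OPEN", "NEGATED"),
        ("OPEN", "IMPOSSIBLE"),
        ("GIVEN", "GIVEN"), ("GIVEN", "IMPOSSIBLE"),
        ("NEGATED", "NEGATED"), ("NEGATED", "IMPOSSIBLE"),
        ("IMPOSSIBLE", "IMPOSSIBLE"),
    }

    def leq(a, b):
        return (a, b) in ORDER

    def is_subconcept(sub, sup):
        """sub refines sup: at least as informative on every attribute.
        Attributes not mentioned default to OPEN (unknown)."""
        return all(leq(sup.get(attr, "OPEN"), sub.get(attr, "OPEN"))
                   for attr in set(sub) | set(sup))

    # Hypothetical concepts described by attribute truth-values.
    animal = {"animal": "GIVEN"}
    dog = {"animal": "GIVEN", "barks": "GIVEN"}
    print(is_subconcept(dog, animal))  # True: dog refines animal
    ```

    Contrary concepts then show up as concepts whose join forces some attribute to "impossible".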
  20. Kollewe, W.: Data representation by nested line diagrams illustrated by a survey of pensioners (1991) 0.02
    
    Abstract
    With formal concept analysis, surveys can be analyzed in such a way that a meaningful picture of the respondents' answers becomes available. Line diagrams of large concept lattices may become unreadable, to the point that it is impossible to follow the line segments by eye. Nested line diagrams offer a way to overcome these difficulties. The main idea of nested line diagrams is to partition the line diagram into boxes so that line segments between two boxes are all parallel and may be replaced by a single line segment. The possibility of drawing line diagrams with more than two factors makes it possible to describe concept lattices with many hundreds or thousands of concepts in a clear structure. In practice it has often proven useful to use standardized scales for the individual levels.

Years

Languages

  • e 42
  • d 35

Types

  • a 64
  • m 9
  • p 3
  • s 3
  • el 1
  • r 1