Search (30 results, page 1 of 2)

  • theme_ss:"Formale Begriffsanalyse"
  1. Working with conceptual structures : contributions to ICCS 2000. 8th International Conference on Conceptual Structures: Logical, Linguistic, and Computational Issues. Darmstadt, August 14-18, 2000 (2000) 0.08
    Content
    Concepts & Language: Knowledge organization by procedures of natural language processing. A case study using the method GABEK (J. Zelger, J. Gadner) - Computer aided narrative analysis using conceptual graphs (H. Schärfe, P. Øhrstrøm) - Pragmatic representation of argumentative text: a challenge for the conceptual graph approach (H. Irandoust, B. Moulin) - Conceptual graphs as a knowledge representation core in a complex language learning environment (G. Angelova, A. Nenkova, S. Boycheva, T. Nikolov) - Conceptual Modeling and Ontologies: Relationships and actions in conceptual categories (Ch. Landauer, K.L. Bellman) - Concept approximations for formal concept analysis (J. Saquer, J.S. Deogun) - Faceted information representation (U. Priß) - Simple concept graphs with universal quantifiers (J. Tappe) - A framework for comparing methods for using or reusing multiple ontologies in an application (J. van Zyl, D. Corbett) - Designing task/method knowledge-based systems with conceptual graphs (M. Leclère, F. Trichet, Ch. Choquet) - A logical ontology (J. Farkas, J. Sarbo) - Algorithms and Tools: Fast concept analysis (Ch. Lindig) - A framework for conceptual graph unification (D. Corbett) - Visual CP representation of knowledge (H.D. Pfeiffer, R.T. Hartley) - Maximal isojoin for representing software textual specifications and detecting semantic anomalies (Th. Charnois) - Troika: using grids, lattices and graphs in knowledge acquisition (H.S. Delugach, B.E. Lampkin) - Open world theorem prover for conceptual graphs (J.E. Heaton, P. Kocura) - NetCare: a practical conceptual graphs software tool (S. Polovina, D. Strang) - CGWorld - a web based workbench for conceptual graphs management and applications (P. Dobrev, K. Toutanova) - Position papers: The edition project: Peirce's existential graphs (R. Müller) - Mining association rules using formal concept analysis (N. Pasquier) - Contextual logic summary (R. Wille) - Information channels and conceptual scaling (K.E. Wolff) - Spatial concepts - a rule exploration (S. Rudolph) - The TEXT-TO-ONTO learning environment (A. Mädche, St. Staab) - Controlling the semantics of metadata on audio-visual documents using ontologies (Th. Dechilly, B. Bachimont) - Building the ontological foundations of a terminology from natural language to conceptual graphs with Ribosome, a knowledge extraction system (Ch. Jacquelinet, A. Burgun) - CharGer: some lessons learned and new directions (H.S. Delugach) - Knowledge management using conceptual graphs (W.K. Pun)
    Series
    Berichte aus der Informatik
  2. Kollewe, W.; Sander, C.; Schmiede, R.; Wille, R.: TOSCANA als Instrument der bibliothekarischen Sacherschließung (1995) 0.04
    Source
    Aufbau und Erschließung begrifflicher Datenbanken: Beiträge zur bibliothekarischen Klassifikation. Eine Auswahl von Vorträgen der Jahrestagungen 1993 (Kaiserslautern) und 1994 (Oldenburg) der Gesellschaft für Klassifikation. Hrsg.: H. Havekost u. H.-J. Wätjen
  3. Kipke, U.; Wille, R.: Begriffsverbände als Ablaufschemata zur Gegenstandsbestimmung (1986) 0.03
    Abstract
    The paper shows how concept lattices can be used as flow schemata for determining objects. In contrast to tree-based querying, the described method allows the user a maximum of freedom and transparency. The method is demonstrated on the determination of the symmetry principle of plane patterns.
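The lattice-style determination procedure summarized in this abstract can be sketched minimally: unlike a fixed decision tree, the user may pick attributes in any order, and each choice refines the remaining object set. The pattern names and symmetry attributes below are invented for illustration; this is not the authors' actual procedure.

```python
# Hypothetical mini-context: plane-pattern objects x symmetry attributes.
CONTEXT = {
    "pattern_p1":  {"rotation", "translation"},
    "pattern_pm":  {"reflection", "translation"},
    "pattern_pmm": {"reflection", "rotation", "translation"},
}

def refine(objects, attribute):
    """Keep only the objects that carry the chosen attribute."""
    return {o for o in objects if attribute in CONTEXT[o]}

# Attributes may be chosen in any order; the object set narrows each step.
objects = set(CONTEXT)
for chosen in ("reflection", "rotation"):
    objects = refine(objects, chosen)
print(sorted(objects))  # → ['pattern_pmm']
```

Choosing the same attributes in the reverse order reaches the same result, which is exactly the freedom a tree-shaped questionnaire does not offer.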
    Source
    Die Klassifikation und ihr Umfeld. Proc. 10. Jahrestagung der Gesellschaft für Klassifikation, Münster, 18.-21.6.1986. Hrsg.: P. Degens, H.-J. Hermes, O. Opitz
  4. Begriffliche Wissensverarbeitung : Methoden und Anwendungen. Mit Beiträgen zahlreicher Fachwissenschaftler (2000) 0.02
    Abstract
    This book presents methods of Conceptual Knowledge Processing (Begriffliche Wissensverarbeitung) and showcases applications from a variety of practical fields. The methods part introduces modern techniques of conceptual data analysis and knowledge processing. The second part is aimed more at potential users: selected applications illustrate the approach to data analysis and information retrieval with the methods of Conceptual Knowledge Processing and demonstrate their potential.
    Content
    Enthält die Beiträge: GANTER, B.: Begriffe und Implikationen; BURMEISTER, P.: ConImp: Ein Programm zur Formalen Begriffsanalyse; LENGNINK, K.: Ähnlichkeit als Distanz in Begriffsverbänden; POLLANDT, S.: Datenanalyse mit Fuzzy-Begriffen; PREDIGER, S.: Terminologische Merkmalslogik in der Formalen Begriffsanalyse; WILLE, R. u. M. ZICKWOLFF: Grundlagen einer Triadischen Begriffsanalyse; LINDIG, C. u. G. SNELTING: Formale Begriffsanalyse im Software Engineering; STRACK, H. u. M. SKORSKY: Zugriffskontrolle bei Programmsystemen und im Datenschutz mittels Formaler Begriffsanalyse; ANDELFINGER, U.: Inhaltliche Erschließung des Bereichs 'Sozialorientierte Gestaltung von Informationstechnik': Ein begriffsanalytischer Ansatz; GÖDERT, W.: Wissensdarstellung in Informationssystemen, Fragetypen und Anforderungen an Retrievalkomponenten; ROCK, T. u. R. WILLE: Ein TOSCANA-Erkundungssystem zur Literatursuche; ESCHENFELDER, D. u.a.: Ein Erkundungssystem zum Baurecht: Methoden der Entwicklung eines TOSCANA-Systems; GROßKOPF, A. u. G. HARRAS: Begriffliche Erkundung semantischer Strukturen von Sprechaktverben; ZELGER, J.: Grundwerte, Ziele und Maßnahmen in einem regionalen Krankenhaus: Eine Anwendung des Verfahrens GABEK; KOHLER-KOCH, B. u. F. VOGT: Normen- und regelgeleitete internationale Kooperationen: Formale Begriffsanalyse in der Politikwissenschaft; HENNING, H.J. u. W. KEMMNITZ: Entwicklung eines kontextuellen Methodenkonzeptes mit Hilfe der Formalen Begriffsanalyse an Beispielen zum Risikoverständnis; BARTEL, H.-G.: Über Möglichkeiten der Formalen Begriffsanalyse in der Mathematischen Archäochemie
  5. Prediger, S.: Kontextuelle Urteilslogik mit Begriffsgraphen : Ein Beitrag zur Restrukturierung der mathematischen Logik (1998) 0.02
    Date
    26. 2.2008 15:58:22
  6. Ganter, B.; Stahl, J.; Wille, R.: Conceptual measurement and many-valued contexts (1986) 0.02
  7. Priss, U.: Faceted information representation (2000) 0.01
    Date
    22. 1.2016 17:47:06
    Series
    Berichte aus der Informatik
  8. Hereth, J.; Stumme, G.; Wille, R.; Wille, U.: Conceptual knowledge discovery and data analysis (2000) 0.01
    Abstract
    In this paper, we discuss Conceptual Knowledge Discovery in Databases (CKDD) in its connection with Data Analysis. Our approach is based on Formal Concept Analysis, a mathematical theory which has been developed and proven useful during the last 20 years. Formal Concept Analysis has led to a theory of conceptual information systems which has been applied by using the management system TOSCANA in a wide range of domains. In this paper, we use such an application in database marketing to demonstrate how methods and procedures of CKDD can be applied in Data Analysis. In particular, we show the interplay and integration of data mining and data analysis techniques based on Formal Concept Analysis. The main concern of this paper is to explain how the transition from data to knowledge can be supported by a TOSCANA system. To clarify the transition steps we discuss their correspondence to the five levels of knowledge representation established by R. Brachman and to the steps of empirically grounded theory building proposed by A. Strauss and J. Corbin
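The Formal Concept Analysis machinery this abstract builds on can be sketched minimally: a formal concept of a binary context is a pair (A, B) where A is exactly the set of objects sharing all attributes in B, and B is exactly the set of attributes shared by all objects in A. The document/keyword context below is invented, and the naive enumeration is illustrative only, not the TOSCANA system.

```python
from itertools import combinations

OBJECTS = ["doc1", "doc2", "doc3"]
INCIDENCE = {                       # hypothetical document -> keyword context
    "doc1": {"fca", "lattice"},
    "doc2": {"fca", "retrieval"},
    "doc3": {"fca", "lattice", "retrieval"},
}

def intent(objs):
    """Attributes shared by every object in objs (all attributes if objs is empty)."""
    attrs = set().union(*INCIDENCE.values())
    for o in objs:
        attrs &= INCIDENCE[o]
    return attrs

def extent(attrs):
    """Objects carrying every attribute in attrs."""
    return {o for o in OBJECTS if attrs <= INCIDENCE[o]}

def concepts():
    """Enumerate all formal concepts by closing every object subset."""
    found = set()
    for r in range(len(OBJECTS) + 1):
        for objs in combinations(OBJECTS, r):
            b = intent(set(objs))
            a = extent(b)            # closure of the chosen object set
            found.add((frozenset(a), frozenset(b)))
    return found

for a, b in sorted(concepts(), key=lambda c: (len(c[0]), sorted(c[0]))):
    print(sorted(a), sorted(b))
```

Ordered by extent size, the four concepts of this toy context form the concept lattice that a system like TOSCANA would render as a line diagram.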
  9. Kumar, C.A.; Radvansky, M.; Annapurna, J.: Analysis of Vector Space Model, Latent Semantic Indexing and Formal Concept Analysis for information retrieval (2012) 0.01
  10. Scheich, P.; Skorsky, M.; Vogt, F.; Wachter, C.; Wille, R.: Conceptual data systems (1992) 0.01
    Footnote
    Appears in the proceedings of the 16th annual conference of the Gesellschaft für Klassifikation, 1992, Dortmund
  11. Zickwolff, M.: Zur Rolle der Formalen Begriffsanalyse in der Wissensakquisition (1994) 0.01
  12. Vogt, F.; Wille, R.: TOSCANA - a graphical tool for analyzing and exploring data (1995) 0.01
    Source
    Knowledge organization. 22(1995) no.2, S.78-81
  13. Kollewe, W.; Sander, C.; Schmiede, R.; Wille, R.: TOSCANA als Instrument der bibliothekarischen Sacherschließung (1995) 0.01
    Abstract
    TOSCANA is a computer program with which conceptual exploration systems can be built on the basis of Formal Concept Analysis. This paper discusses how TOSCANA can be used for library subject indexing and thematic literature search. It reports on the research project 'Anwendung eines Modells begrifflicher Wissenssysteme im Bereich der Literatur zur interdisziplinären Technikforschung', funded by the Darmstadt Zentrum für interdisziplinäre Technikforschung.
  14. Priss, U.: Faceted knowledge representation (1999) 0.01
    Date
    22. 1.2016 17:30:31
  15. Wille, R.: Begriffliche Wissensverarbeitung in der Wirtschaft (2002) 0.01
    Abstract
    Conceptual Knowledge Processing is committed to a pragmatic understanding of knowledge, according to which human knowledge arises and lives on in an open process of human thinking, arguing, and communicating. It is founded on a mathematical theory of concepts oriented toward the mutual interplay of the formal and the contentual. How this theoretical conception takes effect in business practice is explained by way of the core processes of organizational knowledge management, i.e., following G. Probst et al., knowledge identification, knowledge acquisition, knowledge development, knowledge distribution, knowledge use, and knowledge preservation; for each, an example demonstrates the use of specific methods of Conceptual Knowledge Processing. Finally, the processual interaction of knowledge goals and knowledge assessment with the core processes is considered from the perspective of Conceptual Knowledge Processing.
  16. Wille, R.: Begriffliche Datensysteme als Werkzeuge der Wissenskommunikation (1992) 0.01
    Source
    Mensch und Maschine: Informationelle Schnittstellen der Kommunikation. Proc. des 3. Int. Symposiums für Informationswissenschaft (ISI'92), 5.-7.11.1992 in Saarbrücken. Hrsg.: H.H. Zimmermann, H.-D. Luckhardt u. A. Schulz
  17. Wille, R.: Liniendiagramme hierarchischer Begriffssysteme (1984) 0.01
    Source
    Anwendungen in der Klassifikation. II: Datenanalyse und numerische Klassifikation. Proc. 8. Jahrestagung der Gesellschaft für Klassifikation, Hofgeismar, 10.-13.4.1984. Hrsg.: H.H. Bock
  18. Bartel, H.-G.: Über Möglichkeiten der Formalen Begriffsanalyse in der Mathematischen Archäochemie (2000) 0.01
  19. Kollewe, W.: Instrumente der Literaturverwaltung : Inhaltliche Analyse von Datenbeständen durch 'Begriffliche Wissensverarbeitung' (1996) 0.00
    Abstract
    A fundamental problem of literature management is that many users of retrieval systems cannot say precisely what they are looking for. Only in the process of exploratory searching do they learn to specify more exactly what they want to find. Individual search terms (chains of search terms) support this learning process only inadequately, which is why users are often dissatisfied with the result of such a search process. What is needed are richer concept networks that represent thematically ordered relationships and can be flexibly refined, coarsened, or modified in order to provide the desired orientation to a suitable extent. The computer program TOSCANA could help here.
  20. Pollandt, S.: Fuzzy-Begriffe : Formale Begriffsanalyse unscharfer Daten (1997) 0.00
    Abstract
    Starting from the theory of fuzzy sets and fuzzy logic, new methods for analyzing unsharp data are developed. To this end, the theory of Formal Concept Analysis is extended by a series of methods and procedures, thereby meeting users' demand for ways to capture unsharp data with concept-analytic means. The required theoretical foundations are provided in an introduction, and the mathematical presentation is illustrated with easily comprehensible practical examples.
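The move from crisp to unsharp contexts that this abstract describes can be sketched in a few lines: incidence degrees in [0, 1] replace crisp 0/1 membership, and a fuzzy attribute derivation takes the minimum degree over the chosen objects (one common choice of semantics; the names and data below are invented for illustration and do not reproduce Pollandt's constructions).

```python
# Hypothetical unsharp context: object -> {attribute: membership degree in [0, 1]}.
FUZZY_CONTEXT = {
    "sample1": {"acidic": 0.9, "porous": 0.4},
    "sample2": {"acidic": 0.7, "porous": 0.8},
}

def fuzzy_intent(objects):
    """Degree to which each attribute is shared by all given objects (min semantics)."""
    attrs = {a for o in objects for a in FUZZY_CONTEXT[o]}
    return {a: min(FUZZY_CONTEXT[o].get(a, 0.0) for o in objects)
            for a in attrs}

print(fuzzy_intent(["sample1", "sample2"]))  # acidic -> 0.7, porous -> 0.4
```

With crisp degrees (only 0.0 and 1.0) this operator collapses back to the ordinary FCA derivation, which is the sense in which the fuzzy theory extends the classical one.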