Search (30 results, page 1 of 2)

  • × theme_ss:"Formale Begriffsanalyse"
  • × year_i:[1990 TO 2000}
  1. Vogt, C.; Wille, R.: Formale Begriffsanalyse : Darstellung und Analyse von bibliographischen Daten (1994) 0.02
    0.019800998 = product of:
      0.059402995 = sum of:
        0.012493922 = weight(_text_:in in 7603) [ClassicSimilarity], result of:
          0.012493922 = score(doc=7603,freq=2.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.21040362 = fieldWeight in 7603, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.109375 = fieldNorm(doc=7603)
        0.04690907 = weight(_text_:und in 7603) [ClassicSimilarity], result of:
          0.04690907 = score(doc=7603,freq=4.0), product of:
            0.09675359 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.043654136 = queryNorm
            0.4848303 = fieldWeight in 7603, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.109375 = fieldNorm(doc=7603)
      0.33333334 = coord(2/6)
    
    Source
Informations- und Wissensverarbeitung in den Sozialwissenschaften: Beiträge zur Umsetzung neuer Informationstechnologien. Hrsg.: H. Best u.a.
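The nested breakdown above is Lucene's ClassicSimilarity explain output. As a check, a minimal Python sketch (using only the factors printed in the tree for record 1, doc 7603) reproduces that record's score:

```python
import math

# Factors copied from the explain tree of record 1 (doc 7603).
QUERY_NORM = 0.043654136
FIELD_NORM = 0.109375

def term_weight(freq, idf):
    # ClassicSimilarity: weight = queryWeight * fieldWeight, where
    #   queryWeight = idf * queryNorm
    #   fieldWeight = tf * idf * fieldNorm, with tf = sqrt(freq)
    query_weight = idf * QUERY_NORM
    field_weight = math.sqrt(freq) * idf * FIELD_NORM
    return query_weight * field_weight

w_in = term_weight(freq=2.0, idf=1.3602545)    # weight(_text_:in)
w_und = term_weight(freq=4.0, idf=2.216367)    # weight(_text_:und)

# coord(2/6): only 2 of the 6 query clauses matched this document.
score = (w_in + w_und) * (2 / 6)
print(round(score, 6))  # → 0.019801
```

The tiny discrepancies in the last digits of the printed tree come from Lucene's single-precision floats; the structure of the computation is exactly the one shown above.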
  2. Sander, C.; Schmiede, R.; Wille, R.: Ein begriffliches Datensystem zur Literatur der interdisziplinären Technikforschung (1993) 0.02
    0.017017838 = product of:
      0.05105351 = sum of:
        0.013968632 = weight(_text_:in in 5255) [ClassicSimilarity], result of:
          0.013968632 = score(doc=5255,freq=10.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.23523843 = fieldWeight in 5255, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5255)
        0.037084877 = weight(_text_:und in 5255) [ClassicSimilarity], result of:
          0.037084877 = score(doc=5255,freq=10.0), product of:
            0.09675359 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.043654136 = queryNorm
            0.38329202 = fieldWeight in 5255, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5255)
      0.33333334 = coord(2/6)
    
    Abstract
    Conceptual data systems emerged within Formal Concept Analysis and rest on mathematical formalizations of concept, concept system, and conceptual file. They make the knowledge held in a database conceptually accessible and interpretable. To this end, conceptual relationships are displayed in nested line diagrams according to chosen query aspects. By refining, coarsening, and switching between concept structures, one can "navigate" without limit through the knowledge stored in the database. In a research project funded by the Zentrum für interdisziplinäre Technikforschung at the TH Darmstadt, a prototype of a conceptual data system has been built whose data context is a selected, conceptually prepared collection of books on interdisciplinary technology research. This prototype is intended to demonstrate the flexible and versatile use of conceptual data systems in the literature domain
    Source
    Vortrag, 17. Jahrestagung der Gesellschaft für Klassifikation, 3.-5.3.1993 in Kaiserslautern
  3. Zickwolff, M.: Zur Rolle der Formalen Begriffsanalyse in der Wissensakquisition (1994) 0.02
    0.015221215 = product of:
      0.045663644 = sum of:
        0.012493922 = weight(_text_:in in 8938) [ClassicSimilarity], result of:
          0.012493922 = score(doc=8938,freq=2.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.21040362 = fieldWeight in 8938, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.109375 = fieldNorm(doc=8938)
        0.03316972 = weight(_text_:und in 8938) [ClassicSimilarity], result of:
          0.03316972 = score(doc=8938,freq=2.0), product of:
            0.09675359 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.043654136 = queryNorm
            0.34282678 = fieldWeight in 8938, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.109375 = fieldNorm(doc=8938)
      0.33333334 = coord(2/6)
    
    Source
    Begriffliche Wissensverarbeitung: Grundfragen und Aufgaben. Hrsg.: R. Wille u. M. Zickwolff
  4. Vogt, F.: Formale Begriffsanalyse mit C++ : Datenstrukturen und Algorithmen (1996) 0.02
    0.015065096 = product of:
      0.04519529 = sum of:
        0.012365777 = weight(_text_:in in 2037) [ClassicSimilarity], result of:
          0.012365777 = score(doc=2037,freq=6.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.2082456 = fieldWeight in 2037, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0625 = fieldNorm(doc=2037)
        0.032829512 = weight(_text_:und in 2037) [ClassicSimilarity], result of:
          0.032829512 = score(doc=2037,freq=6.0), product of:
            0.09675359 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.043654136 = queryNorm
            0.33931053 = fieldWeight in 2037, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=2037)
      0.33333334 = coord(2/6)
    
    Abstract
    The book aims to enable readers interested in Formal Concept Analysis as a method of data analysis and knowledge structuring to write their own C++ programs for Formal Concept Analysis. The procedures of Formal Concept Analysis are explained with an application example, so the book can serve both as a guide for the interested newcomer and as a handbook for the experienced application programmer and project manager
    Content
    The book comprises 16 chapters and an appendix in which the software CONSCRIPT is explained
    Footnote
    Rez. in: Knowledge organization 25(1998) nos.1/2, S.47 (S. Düwel u. W. Hesse)
  5. Pollandt, S.: Fuzzy-Begriffe : Formale Begriffsanalyse unscharfer Daten (1997) 0.01
    0.014308709 = product of:
      0.042926125 = sum of:
        0.010096614 = weight(_text_:in in 2086) [ClassicSimilarity], result of:
          0.010096614 = score(doc=2086,freq=4.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.17003182 = fieldWeight in 2086, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0625 = fieldNorm(doc=2086)
        0.032829512 = weight(_text_:und in 2086) [ClassicSimilarity], result of:
          0.032829512 = score(doc=2086,freq=6.0), product of:
            0.09675359 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.043654136 = queryNorm
            0.33931053 = fieldWeight in 2086, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=2086)
      0.33333334 = coord(2/6)
    
    Abstract
    Starting from the theory of fuzzy sets and fuzzy logic, new methods for the analysis of vague data are developed. To this end, the theory of Formal Concept Analysis is extended by a series of methods and procedures, meeting users' demand for ways to capture vague data concept-analytically. The necessary theoretical foundations are provided in an introductory fashion, and the mathematical presentation is illustrated with easily reproducible practical examples
    Footnote
    Rez. in: Knowledge organization 25(1998) no.3, S.123-124 (K.E. Wolff)
  6. Wille, R.: Begriffliche Datensysteme als Werkzeuge der Wissenskommunikation (1992) 0.01
    0.013046755 = product of:
      0.039140265 = sum of:
        0.010709076 = weight(_text_:in in 8826) [ClassicSimilarity], result of:
          0.010709076 = score(doc=8826,freq=2.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.18034597 = fieldWeight in 8826, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.09375 = fieldNorm(doc=8826)
        0.02843119 = weight(_text_:und in 8826) [ClassicSimilarity], result of:
          0.02843119 = score(doc=8826,freq=2.0), product of:
            0.09675359 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.043654136 = queryNorm
            0.29385152 = fieldWeight in 8826, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.09375 = fieldNorm(doc=8826)
      0.33333334 = coord(2/6)
    
    Source
    Mensch und Maschine: Informationelle Schnittstellen der Kommunikation. Proc. des 3. Int. Symposiums für Informationswissenschaft (ISI'92), 5.-7.11.1992 in Saarbrücken. Hrsg.: H.H. Zimmermann, H.-D. Luckhardt u. A. Schulz
  7. Wille, R.; Wachter, C.: Begriffsanalyse von Dokumenten (1992) 0.01
    0.013046755 = product of:
      0.039140265 = sum of:
        0.010709076 = weight(_text_:in in 341) [ClassicSimilarity], result of:
          0.010709076 = score(doc=341,freq=2.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.18034597 = fieldWeight in 341, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.09375 = fieldNorm(doc=341)
        0.02843119 = weight(_text_:und in 341) [ClassicSimilarity], result of:
          0.02843119 = score(doc=341,freq=2.0), product of:
            0.09675359 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.043654136 = queryNorm
            0.29385152 = fieldWeight in 341, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.09375 = fieldNorm(doc=341)
      0.33333334 = coord(2/6)
    
    Source
    Information und Dokumentation in den 90er Jahren: neue Herausforderung, neue Technologien. Deutscher Dokumentartag 1991, Universität Ulm, 30.9.-2.10.1991. Hrsg.: W. Neubauer u. K.-H. Meier
  8. Prediger, S.: Kontextuelle Urteilslogik mit Begriffsgraphen : Ein Beitrag zur Restrukturierung der mathematischen Logik (1998) 0.01
    0.012832299 = product of:
      0.038496897 = sum of:
        0.008924231 = weight(_text_:in in 3142) [ClassicSimilarity], result of:
          0.008924231 = score(doc=3142,freq=2.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.15028831 = fieldWeight in 3142, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.078125 = fieldNorm(doc=3142)
        0.029572664 = product of:
          0.059145328 = sum of:
            0.059145328 = weight(_text_:22 in 3142) [ClassicSimilarity], result of:
              0.059145328 = score(doc=3142,freq=2.0), product of:
                0.15286934 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043654136 = queryNorm
                0.38690117 = fieldWeight in 3142, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.078125 = fieldNorm(doc=3142)
          0.5 = coord(1/2)
      0.33333334 = coord(2/6)
    
    Date
    26. 2.2008 15:58:22
    Footnote
    Rez. in: KO 26(1999) no.3, S.175-176 (R. Wille)
  9. Vogt, F.; Wille, R.: TOSCANA - a graphical tool for analyzing and exploring data (1995) 0.01
    0.011251582 = product of:
      0.033754744 = sum of:
        0.010096614 = weight(_text_:in in 1901) [ClassicSimilarity], result of:
          0.010096614 = score(doc=1901,freq=4.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.17003182 = fieldWeight in 1901, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0625 = fieldNorm(doc=1901)
        0.02365813 = product of:
          0.04731626 = sum of:
            0.04731626 = weight(_text_:22 in 1901) [ClassicSimilarity], result of:
              0.04731626 = score(doc=1901,freq=2.0), product of:
                0.15286934 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043654136 = queryNorm
                0.30952093 = fieldWeight in 1901, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=1901)
          0.5 = coord(1/2)
      0.33333334 = coord(2/6)
    
    Abstract
    TOSCANA is a computer program which allows online interaction with larger databases to analyse and explore data conceptually. It uses labelled line diagrams of concept lattices to communicate knowledge coded in the given data. The basic problem of creating online presentations of concept lattices is solved by composing prepared diagrams into nested line diagrams. A large number of applications in different areas has already shown that TOSCANA is a useful tool for many purposes
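    The concept lattices TOSCANA draws are built from a formal context (a binary objects-by-attributes relation). As a hedged illustration of that underlying Formal Concept Analysis step (the data below is invented; TOSCANA itself works on much larger databases and does not use this naive enumeration), a small Python sketch derives all formal concepts of a toy context:

```python
from itertools import combinations

# Hypothetical toy context: which document carries which attribute.
objects = ["doc1", "doc2", "doc3"]
attributes = ["lattice", "diagram", "software"]
incidence = {
    ("doc1", "lattice"), ("doc1", "diagram"),
    ("doc2", "lattice"), ("doc2", "software"),
    ("doc3", "lattice"), ("doc3", "diagram"), ("doc3", "software"),
}

def intent(objs):
    # Attributes shared by every object in objs.
    return {a for a in attributes if all((o, a) in incidence for o in objs)}

def extent(attrs):
    # Objects that have every attribute in attrs.
    return {o for o in objects if all((o, a) in incidence for a in attrs)}

# A formal concept is a pair (A, B) with extent(B) == A and intent(A) == B.
# Naive enumeration: close every subset of objects.
concepts = set()
for r in range(len(objects) + 1):
    for objs in combinations(objects, r):
        ext = extent(intent(set(objs)))
        concepts.add((frozenset(ext), frozenset(intent(ext))))

for ext, intn in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(ext), "->", sorted(intn))
```

    Ordered by extent size, the resulting concepts form the lattice that a line diagram would display: each node is one (extent, intent) pair, and edges follow extent inclusion.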
    Source
    Knowledge organization. 22(1995) no.2, S.78-81
  10. Kent, R.E.: Implications and rules in thesauri (1994) 0.01
    0.011077631 = product of:
      0.033232894 = sum of:
        0.014278769 = weight(_text_:in in 3457) [ClassicSimilarity], result of:
          0.014278769 = score(doc=3457,freq=8.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.24046129 = fieldWeight in 3457, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0625 = fieldNorm(doc=3457)
        0.018954126 = weight(_text_:und in 3457) [ClassicSimilarity], result of:
          0.018954126 = score(doc=3457,freq=2.0), product of:
            0.09675359 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.043654136 = queryNorm
            0.19590102 = fieldWeight in 3457, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=3457)
      0.33333334 = coord(2/6)
    
    Abstract
    A central consideration in the study of whole language semantic space as encoded in thesauri is word sense comparability. Shows how word sense comparability can be adequately expressed by the logical implications and rules from Formal Concept Analysis. Formal concept analysis, a new approach to formal logic initiated by Rudolf Wille, has been used for data modelling, analysis and interpretation, and also for knowledge representation and knowledge discovery
    Series
    Advances in knowledge organization; vol.4
    Theme
    Konzeption und Anwendung des Prinzips Thesaurus
  11. Priss, U.: Faceted knowledge representation (1999) 0.01
    0.009845134 = product of:
      0.029535402 = sum of:
        0.008834538 = weight(_text_:in in 2654) [ClassicSimilarity], result of:
          0.008834538 = score(doc=2654,freq=4.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.14877784 = fieldWeight in 2654, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0546875 = fieldNorm(doc=2654)
        0.020700864 = product of:
          0.04140173 = sum of:
            0.04140173 = weight(_text_:22 in 2654) [ClassicSimilarity], result of:
              0.04140173 = score(doc=2654,freq=2.0), product of:
                0.15286934 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043654136 = queryNorm
                0.2708308 = fieldWeight in 2654, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=2654)
          0.5 = coord(1/2)
      0.33333334 = coord(2/6)
    
    Abstract
    Faceted Knowledge Representation provides a formalism for implementing knowledge systems. The basic notions of faceted knowledge representation are "unit", "relation", "facet" and "interpretation". Units are atomic elements and can be abstract elements or refer to external objects in an application. Relations are sequences or matrices of 0s and 1s (binary matrices). Facets are relational structures that combine units and relations. Each facet represents an aspect or viewpoint of a knowledge system. Interpretations are mappings that can be used to translate between different representations. This paper introduces the basic notions of faceted knowledge representation. The formalism is applied here to an abstract modeling of a faceted thesaurus as used in information retrieval.
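    The abstract's relations-as-binary-matrices idea can be sketched concretely. This is only an illustration of the stated formalism, not code from the paper; the unit names and the "broader" relation are hypothetical:

```python
# Units as atomic elements; relations as 0/1 matrices over the units;
# a facet as a relational structure bundling both.
units = ["u0", "u1", "u2"]

# A toy "broader term" relation: u0 -> u1 -> u2.
broader = [
    [0, 1, 0],
    [0, 0, 1],
    [0, 0, 0],
]

def compose(r, s):
    # Boolean matrix product: (r;s)[i][j] = 1 iff r[i][k] and s[k][j] for some k.
    n = len(r)
    return [[int(any(r[i][k] and s[k][j] for k in range(n)))
             for j in range(n)] for i in range(n)]

facet = {"units": units, "relations": {"broader": broader}}
# Composing relations derives new viewpoints, e.g. "two steps broader".
facet["relations"]["broader_twice"] = compose(broader, broader)
print(facet["relations"]["broader_twice"])  # → [[0, 0, 1], [0, 0, 0], [0, 0, 0]]
```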
    Date
    22. 1.2016 17:30:31
  12. Kollewe, W.: Instrumente der Literaturverwaltung : Inhaltliche Analyse von Datenbeständen durch 'Begriffliche Wissensverarbeitung' (1996) 0.01
    0.008697838 = product of:
      0.02609351 = sum of:
        0.0071393843 = weight(_text_:in in 4376) [ClassicSimilarity], result of:
          0.0071393843 = score(doc=4376,freq=2.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.120230645 = fieldWeight in 4376, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0625 = fieldNorm(doc=4376)
        0.018954126 = weight(_text_:und in 4376) [ClassicSimilarity], result of:
          0.018954126 = score(doc=4376,freq=2.0), product of:
            0.09675359 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.043654136 = queryNorm
            0.19590102 = fieldWeight in 4376, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=4376)
      0.33333334 = coord(2/6)
    
    Abstract
    A fundamental problem of literature management is that many users of retrieval systems cannot say precisely what they are looking for. Only in the process of exploratory searching do they learn to specify more exactly what they want to find. Individual search terms (chains of search terms) support this learning process only inadequately, which is why users are often dissatisfied with the outcome of such a search. What is needed are richer concept networks that display thematically ordered relationships and can be flexibly refined, coarsened, or modified, so as to provide the desired degree of orientation. The computer program TOSCANA could help here
  13. Skorsky, M.: Graphische Darstellung eines Thesaurus (1997) 0.01
    0.008207378 = product of:
      0.049244266 = sum of:
        0.049244266 = weight(_text_:und in 1051) [ClassicSimilarity], result of:
          0.049244266 = score(doc=1051,freq=6.0), product of:
            0.09675359 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.043654136 = queryNorm
            0.5089658 = fieldWeight in 1051, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.09375 = fieldNorm(doc=1051)
      0.16666667 = coord(1/6)
    
    Source
    Information und Dokumentation: Qualität und Qualifikation. Deutscher Dokumentartag 1997, Universität Regensburg, 24.-26.9.1997. Hrsg.: M. Ockenfeld u. G.J. Mantwill
    Theme
    Konzeption und Anwendung des Prinzips Thesaurus
  14. Kollewe, W.; Sander, C.; Schmiede, R.; Wille, R.: TOSCANA als Instrument der bibliothekarischen Sacherschließung (1995) 0.01
    0.0068394816 = product of:
      0.04103689 = sum of:
        0.04103689 = weight(_text_:und in 927) [ClassicSimilarity], result of:
          0.04103689 = score(doc=927,freq=6.0), product of:
            0.09675359 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.043654136 = queryNorm
            0.42413816 = fieldWeight in 927, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=927)
      0.16666667 = coord(1/6)
    
    Imprint
    Oldenburg : Bibliotheks- und Informationssystem
    Source
    Aufbau und Erschließung begrifflicher Datenbanken: Beiträge zur bibliothekarischen Klassifikation. Eine Auswahl von Vorträgen der Jahrestagungen 1993 (Kaiserslautern) und 1994 (Oldenburg) der Gesellschaft für Klassifikation. Hrsg.: H. Havekost u. H.-J. Wätjen
  15. Kollewe, W.; Skorsky, M.; Vogt, F.; Wille, R.: TOSCANA - ein Werkzeug zur begrifflichen Analyse und Erkundung von Daten (1994) 0.01
    0.006701296 = product of:
      0.040207777 = sum of:
        0.040207777 = weight(_text_:und in 8942) [ClassicSimilarity], result of:
          0.040207777 = score(doc=8942,freq=4.0), product of:
            0.09675359 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.043654136 = queryNorm
            0.41556883 = fieldWeight in 8942, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.09375 = fieldNorm(doc=8942)
      0.16666667 = coord(1/6)
    
    Source
    Begriffliche Wissensverarbeitung: Grundfragen und Aufgaben. Hrsg.: R. Wille u. M. Zickwolff
  16. Skorsky, M.: Dokumentensammlungen : Strukturiert und recherchiert mit TOSKANA (1996) 0.00
    0.004738532 = product of:
      0.02843119 = sum of:
        0.02843119 = weight(_text_:und in 5261) [ClassicSimilarity], result of:
          0.02843119 = score(doc=5261,freq=2.0), product of:
            0.09675359 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.043654136 = queryNorm
            0.29385152 = fieldWeight in 5261, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.09375 = fieldNorm(doc=5261)
      0.16666667 = coord(1/6)
    
  17. Kollewe, W.; Sander, C.; Schmiede, R.; Wille, R.: TOSCANA als Instrument der bibliothekarischen Sacherschließung (1995) 0.00
    0.003159021 = product of:
      0.018954126 = sum of:
        0.018954126 = weight(_text_:und in 585) [ClassicSimilarity], result of:
          0.018954126 = score(doc=585,freq=2.0), product of:
            0.09675359 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.043654136 = queryNorm
            0.19590102 = fieldWeight in 585, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=585)
      0.16666667 = coord(1/6)
    
    Abstract
    TOSCANA is a computer program for building conceptual exploration systems on the basis of Formal Concept Analysis. The present paper discusses how TOSCANA can be used for subject indexing in libraries and for thematic literature searching. It reports on the research project 'Anwendung eines Modells begrifflicher Wissenssysteme im Bereich der Literatur zur interdisziplinären Technikforschung', funded by the Darmstadt Zentrum für interdisziplinäre Technikforschung
  18. Viehmann, V.: Formale Begriffsanalyse in der bibliothekarischen Sacherschließung (1996) 0.00
    0.0023797948 = product of:
      0.014278769 = sum of:
        0.014278769 = weight(_text_:in in 4570) [ClassicSimilarity], result of:
          0.014278769 = score(doc=4570,freq=2.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.24046129 = fieldWeight in 4570, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.125 = fieldNorm(doc=4570)
      0.16666667 = coord(1/6)
    
  19. Scheich, P.; Skorsky, M.; Vogt, F.; Wachter, C.; Wille, R.: Conceptual data systems (1992) 0.00
    0.0020823204 = product of:
      0.012493922 = sum of:
        0.012493922 = weight(_text_:in in 3147) [ClassicSimilarity], result of:
          0.012493922 = score(doc=3147,freq=2.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.21040362 = fieldWeight in 3147, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.109375 = fieldNorm(doc=3147)
      0.16666667 = coord(1/6)
    
    Footnote
    Appears in the proceedings of the 16th Annual Conference of the Gesellschaft für Klassifikation, 1992, Dortmund
  20. Sedelow, W.A.: ¬The formal analysis of concepts (1993) 0.00
    0.0020609628 = product of:
      0.012365777 = sum of:
        0.012365777 = weight(_text_:in in 620) [ClassicSimilarity], result of:
          0.012365777 = score(doc=620,freq=6.0), product of:
            0.059380736 = queryWeight, product of:
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.043654136 = queryNorm
            0.2082456 = fieldWeight in 620, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.3602545 = idf(docFreq=30841, maxDocs=44218)
              0.0625 = fieldNorm(doc=620)
      0.16666667 = coord(1/6)
    
    Abstract
    The present paper focuses on the extraction, by means of a formal logical/mathematical methodology (i.e. automatically, exclusively by rule), of concept content, as found, for example, in continuous discourse. The approach to a fully formal definition of concept content ultimately goes back to a German government initiative to establish 'standards' regarding concepts, in conjunction with efforts to stipulate precisely (and then, derivatively, through computer programs) data and information needs according to work role in certain government offices