Search (34 results, page 1 of 2)

  • Filter: type_ss:"p"
  1. Schöneberg, U.; Gödert, W.: Erschließung mathematischer Publikationen mittels linguistischer Verfahren (2012) 0.05
    0.05403028 = product of:
      0.15437223 = sum of:
        0.021010485 = weight(_text_:software in 1055) [ClassicSimilarity], result of:
          0.021010485 = score(doc=1055,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.2629875 = fieldWeight in 1055, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.046875 = fieldNorm(doc=1055)
        0.016063396 = weight(_text_:und in 1055) [ClassicSimilarity], result of:
          0.016063396 = score(doc=1055,freq=12.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.35989314 = fieldWeight in 1055, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.046875 = fieldNorm(doc=1055)
        0.021010485 = weight(_text_:software in 1055) [ClassicSimilarity], result of:
          0.021010485 = score(doc=1055,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.2629875 = fieldWeight in 1055, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.046875 = fieldNorm(doc=1055)
        0.050701115 = weight(_text_:methoden in 1055) [ClassicSimilarity], result of:
          0.050701115 = score(doc=1055,freq=4.0), product of:
            0.10436003 = queryWeight, product of:
              5.1821747 = idf(docFreq=674, maxDocs=44218)
              0.02013827 = queryNorm
            0.48582888 = fieldWeight in 1055, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              5.1821747 = idf(docFreq=674, maxDocs=44218)
              0.046875 = fieldNorm(doc=1055)
        0.0163166 = weight(_text_:der in 1055) [ClassicSimilarity], result of:
          0.0163166 = score(doc=1055,freq=12.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.36271852 = fieldWeight in 1055, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.046875 = fieldNorm(doc=1055)
        0.008259674 = product of:
          0.016519347 = sum of:
            0.016519347 = weight(_text_:29 in 1055) [ClassicSimilarity], result of:
              0.016519347 = score(doc=1055,freq=2.0), product of:
                0.070840135 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.02013827 = queryNorm
                0.23319192 = fieldWeight in 1055, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.046875 = fieldNorm(doc=1055)
          0.5 = coord(1/2)
        0.021010485 = weight(_text_:software in 1055) [ClassicSimilarity], result of:
          0.021010485 = score(doc=1055,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.2629875 = fieldWeight in 1055, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.046875 = fieldNorm(doc=1055)
      0.35 = coord(7/20)
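     The tree above (and the analogous trees for the results below) is Lucene's ClassicSimilarity explain output: per matching clause, fieldWeight = tf × idf × fieldNorm, queryWeight = idf × queryNorm, the clause score is their product, and the document score is the clause sum scaled by the coordination factor. As a reading aid, here is a minimal Python sketch that reproduces the numbers of the 'software' clause and the final score of result 1; the helper name is illustrative, not part of the Lucene API.

```python
from math import sqrt

# Reproduce the per-clause arithmetic displayed in the explain tree above
# (illustrative helper, not the Lucene API itself).
def clause_score(freq, idf, query_norm, field_norm):
    tf = sqrt(freq)                        # 1.4142135 for freq=2.0
    query_weight = idf * query_norm        # 0.07989157 for the 'software' clause
    field_weight = tf * idf * field_norm   # 0.2629875
    return query_weight * field_weight     # 0.021010485

software_clause = clause_score(2.0, 3.9671519, 0.02013827, 0.046875)

# The document score is the sum over all matching clauses, scaled by the
# coordination factor coord(7/20) = 0.35 shown at the bottom of the tree:
doc_score = 0.15437223 * (7 / 20)          # ~0.05403028, the 0.05 shown for result 1

print(round(software_clause, 9), round(doc_score, 8))
```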
    
    Abstract
     The number of mathematics-related publications grows from year to year. Reviewing services such as Zentralblatt MATH and Mathematical Reviews record the bibliographic data, index the works by subject and make them searchable for users - today via databases, formerly in printed form. Keywords are an essential part of the subject indexing of these publications. They are usually not single words but multi-word phrases, which suggests the use of linguistic methods and procedures. The software 'Lingo', developed at FH Köln, was adapted to the specific requirements of mathematical texts and used both to build a controlled vocabulary and to extract keywords from mathematical publications. It is planned to link the controlled vocabulary with the Mathematical Subject Classification in order to develop and test methods for automatic classification for the reviewing service Zentralblatt MATH.
    Date
    12. 9.2013 12:29:05
    Footnote
     Lecture given at the DMV conference in Saarbrücken, 17-20 September 2012.
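     To make the keyword-extraction approach described in the abstract of this entry more concrete, here is a minimal, hypothetical sketch of dictionary-based extraction of multi-word keyword phrases against a controlled vocabulary. The vocabulary entries and function names are invented for illustration; they do not reflect the actual Lingo implementation or the Zentralblatt MATH vocabulary.

```python
import re

# Toy controlled vocabulary of (multi-word) keyword phrases -- invented examples.
CONTROLLED_VOCABULARY = {
    "partial differential equation",
    "finite element method",
    "graph theory",
}

def extract_keywords(text, vocabulary, max_len=3):
    """Return all vocabulary phrases of up to max_len tokens occurring in text."""
    tokens = re.findall(r"[a-zäöüß-]+", text.lower())
    found = set()
    for size in range(max_len, 0, -1):
        for i in range(len(tokens) - size + 1):
            phrase = " ".join(tokens[i:i + size])
            if phrase in vocabulary:
                found.add(phrase)
    return found

print(extract_keywords(
    "We solve a partial differential equation with the finite element method.",
    CONTROLLED_VOCABULARY,
))
```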
  2. Hobohm, H.-C.: Zensur in der Digitalität - eine Überwindung der Moderne? : Die Rolle der Bibliotheken (2020) 0.05
    0.049381297 = product of:
      0.16460432 = sum of:
        0.034297217 = weight(_text_:23 in 5371) [ClassicSimilarity], result of:
          0.034297217 = score(doc=5371,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.47518367 = fieldWeight in 5371, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.09375 = fieldNorm(doc=5371)
        0.034297217 = weight(_text_:23 in 5371) [ClassicSimilarity], result of:
          0.034297217 = score(doc=5371,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.47518367 = fieldWeight in 5371, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.09375 = fieldNorm(doc=5371)
        0.018548414 = weight(_text_:und in 5371) [ClassicSimilarity], result of:
          0.018548414 = score(doc=5371,freq=4.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.41556883 = fieldWeight in 5371, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.09375 = fieldNorm(doc=5371)
        0.034297217 = weight(_text_:23 in 5371) [ClassicSimilarity], result of:
          0.034297217 = score(doc=5371,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.47518367 = fieldWeight in 5371, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.09375 = fieldNorm(doc=5371)
        0.026644897 = weight(_text_:der in 5371) [ClassicSimilarity], result of:
          0.026644897 = score(doc=5371,freq=8.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.59231687 = fieldWeight in 5371, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.09375 = fieldNorm(doc=5371)
        0.016519347 = product of:
          0.033038694 = sum of:
            0.033038694 = weight(_text_:29 in 5371) [ClassicSimilarity], result of:
              0.033038694 = score(doc=5371,freq=2.0), product of:
                0.070840135 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.02013827 = queryNorm
                0.46638384 = fieldWeight in 5371, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.09375 = fieldNorm(doc=5371)
          0.5 = coord(1/2)
      0.3 = coord(6/20)
    
    Content
     Contribution to the conference "Nationalsozialismus Digital. Die Verantwortung von Bibliotheken, Archiven und Museen sowie Forschungseinrichtungen und Medien im Umgang mit der NS-Zeit im Netz", Österreichische Nationalbibliothek, Universität Wien, 27-29 November 2019
    Date
    8. 8.2020 9:02:23
  3. Rahmstorf, G.: Methoden und Formate für mehrsprachige Begriffssysteme (1996) 0.02
    0.02207917 = product of:
      0.14719446 = sum of:
        0.022717075 = weight(_text_:und in 7110) [ClassicSimilarity], result of:
          0.022717075 = score(doc=7110,freq=6.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.5089658 = fieldWeight in 7110, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.09375 = fieldNorm(doc=7110)
        0.10140223 = weight(_text_:methoden in 7110) [ClassicSimilarity], result of:
          0.10140223 = score(doc=7110,freq=4.0), product of:
            0.10436003 = queryWeight, product of:
              5.1821747 = idf(docFreq=674, maxDocs=44218)
              0.02013827 = queryNorm
            0.97165775 = fieldWeight in 7110, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              5.1821747 = idf(docFreq=674, maxDocs=44218)
              0.09375 = fieldNorm(doc=7110)
        0.023075162 = weight(_text_:der in 7110) [ClassicSimilarity], result of:
          0.023075162 = score(doc=7110,freq=6.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.5129615 = fieldWeight in 7110, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.09375 = fieldNorm(doc=7110)
      0.15 = coord(3/20)
    
    Footnote
     Paper handed out during the session 'Methoden und Formate für sprachbezogene Begriffssysteme' at the 20th annual conference of the Gesellschaft für Klassifikation on 5 March 1995 in Freiburg
    Theme
    Konzeption und Anwendung des Prinzips Thesaurus
  4. Yitzhaki, M.: ¬A draft version of a consolidated thesaurus for the rapidly growing field of alternative medicine (2000) 0.02
    0.015433748 = product of:
      0.102891654 = sum of:
        0.034297217 = weight(_text_:23 in 5417) [ClassicSimilarity], result of:
          0.034297217 = score(doc=5417,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.47518367 = fieldWeight in 5417, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.09375 = fieldNorm(doc=5417)
        0.034297217 = weight(_text_:23 in 5417) [ClassicSimilarity], result of:
          0.034297217 = score(doc=5417,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.47518367 = fieldWeight in 5417, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.09375 = fieldNorm(doc=5417)
        0.034297217 = weight(_text_:23 in 5417) [ClassicSimilarity], result of:
          0.034297217 = score(doc=5417,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.47518367 = fieldWeight in 5417, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.09375 = fieldNorm(doc=5417)
      0.15 = coord(3/20)
    
    Date
    1.11.2000 17:24:23
  5. Breuer, T.; Tavakolpoursaleh, N.; Schaer, P.; Hienert, D.; Schaible, J.; Castro, L.J.: Online Information Retrieval Evaluation using the STELLA Framework (2022) 0.01
    0.013370992 = product of:
      0.08913994 = sum of:
        0.029713312 = weight(_text_:software in 640) [ClassicSimilarity], result of:
          0.029713312 = score(doc=640,freq=4.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.3719205 = fieldWeight in 640, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.046875 = fieldNorm(doc=640)
        0.029713312 = weight(_text_:software in 640) [ClassicSimilarity], result of:
          0.029713312 = score(doc=640,freq=4.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.3719205 = fieldWeight in 640, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.046875 = fieldNorm(doc=640)
        0.029713312 = weight(_text_:software in 640) [ClassicSimilarity], result of:
          0.029713312 = score(doc=640,freq=4.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.3719205 = fieldWeight in 640, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.046875 = fieldNorm(doc=640)
      0.15 = coord(3/20)
    
    Abstract
     Involving users in early phases of software development has become a common strategy as it enables developers to consider user needs from the beginning. Once a system is in production, new opportunities to observe, evaluate and learn from users emerge as more information becomes available. Gathering information from users to continuously evaluate their behavior is a common practice for commercial software, while the Cranfield paradigm remains the preferred option for Information Retrieval (IR) and recommendation systems in the academic world. Here we introduce the Infrastructures for Living Labs STELLA project, which aims to create an evaluation infrastructure allowing experimental systems to run alongside production web-based academic search systems with real users. STELLA combines user interactions and log file analysis to enable large-scale A/B experiments for academic search.
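     As a rough illustration of the kind of log-file analysis such A/B experiments rely on, here is a small, hypothetical sketch that compares click-through rates of two systems from an interaction log. The CSV column names and the metric are assumptions made for illustration; they are not the STELLA log format or API.

```python
import csv
from collections import Counter

def click_through_rates(log_path):
    """Compute per-system CTR from a log with columns: system, event."""
    impressions, clicks = Counter(), Counter()
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["event"] == "impression":
                impressions[row["system"]] += 1
            elif row["event"] == "click":
                clicks[row["system"]] += 1
    return {s: clicks[s] / impressions[s] for s in impressions if impressions[s]}

# Example use: compare the production ranker against an experimental system.
# rates = click_through_rates("interactions.csv")
# print(rates)   # e.g. {'baseline': 0.12, 'experimental': 0.15}
```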
  6. Wilk, D.: Problems in the use of Library of Congress Subject Headings as the basis for Hebrew subject headings in the Bar-Ilan University Library (2000) 0.01
    0.012861458 = product of:
      0.08574305 = sum of:
        0.028581016 = weight(_text_:23 in 5416) [ClassicSimilarity], result of:
          0.028581016 = score(doc=5416,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.3959864 = fieldWeight in 5416, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.078125 = fieldNorm(doc=5416)
        0.028581016 = weight(_text_:23 in 5416) [ClassicSimilarity], result of:
          0.028581016 = score(doc=5416,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.3959864 = fieldWeight in 5416, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.078125 = fieldNorm(doc=5416)
        0.028581016 = weight(_text_:23 in 5416) [ClassicSimilarity], result of:
          0.028581016 = score(doc=5416,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.3959864 = fieldWeight in 5416, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.078125 = fieldNorm(doc=5416)
      0.15 = coord(3/20)
    
    Date
    1.11.2000 17:23:05
  7. Wätjen, H.-J.: Mensch oder Maschine? : Auswahl und Erschließung von Informationsressourcen im Internet (1996) 0.01
    0.0060378797 = product of:
      0.04025253 = sum of:
        0.015457011 = weight(_text_:und in 3161) [ClassicSimilarity], result of:
          0.015457011 = score(doc=3161,freq=4.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.34630734 = fieldWeight in 3161, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=3161)
        0.015700657 = weight(_text_:der in 3161) [ClassicSimilarity], result of:
          0.015700657 = score(doc=3161,freq=4.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.34902605 = fieldWeight in 3161, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.078125 = fieldNorm(doc=3161)
        0.009094859 = product of:
          0.027284576 = sum of:
            0.027284576 = weight(_text_:22 in 3161) [ClassicSimilarity], result of:
              0.027284576 = score(doc=3161,freq=2.0), product of:
                0.07052079 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.02013827 = queryNorm
                0.38690117 = fieldWeight in 3161, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.078125 = fieldNorm(doc=3161)
          0.33333334 = coord(1/3)
      0.15 = coord(3/20)
    
    Abstract
     Description of the various tools for subject-based searching of resources on the Internet
    Date
    2. 2.1996 15:40:22
    Footnote
     To appear in: Zeitschrift für Bibliothekswesen und Bibliographie; available from the author's server: http://waetjen.bis.uni-oldenburg.de
    Source
     Lecture given at the workshop 'Internet-basierte Informationssysteme der Bibliotheken', 15-17 January 1996 in Bielefeld
  8. Frühwald, W.: ¬Das Forscherwissen und die Öffentlichkeit : Überlegungen zur Laisierung wissenschaftlicher Erkenntnis (1992) 0.00
    0.004985227 = product of:
      0.049852267 = sum of:
        0.024731217 = weight(_text_:und in 3045) [ClassicSimilarity], result of:
          0.024731217 = score(doc=3045,freq=4.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.55409175 = fieldWeight in 3045, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.125 = fieldNorm(doc=3045)
        0.02512105 = weight(_text_:der in 3045) [ClassicSimilarity], result of:
          0.02512105 = score(doc=3045,freq=4.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.5584417 = fieldWeight in 3045, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.125 = fieldNorm(doc=3045)
      0.1 = coord(2/20)
    
    Source
     Opening lecture of the 117th meeting of the Gesellschaft Deutscher Naturforscher und Ärzte, 19 September 1992 in Aachen
  9. Grunst, G.; Thomas, Christoph; Oppermann, R.: Intelligente Benutzerschnittstellen : kontext-sensitive Hilfen und Adaptivität (1991) 0.00
    0.0042494484 = product of:
      0.042494483 = sum of:
        0.024731217 = weight(_text_:und in 570) [ClassicSimilarity], result of:
          0.024731217 = score(doc=570,freq=4.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.55409175 = fieldWeight in 570, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.125 = fieldNorm(doc=570)
        0.017763264 = weight(_text_:der in 570) [ClassicSimilarity], result of:
          0.017763264 = score(doc=570,freq=2.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.3948779 = fieldWeight in 570, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.125 = fieldNorm(doc=570)
      0.1 = coord(2/20)
    
    Imprint
    Sankt Augustin : Gesellschaft für Mathematik und Datenverarbeitung
    Series
     Arbeitspapiere der GMD; 553
  10. Grötschel, M.; Lügger, J.; Sperber, W.: Wissenschaftliches Publizieren und elektronische Fachinformation im Umbruch : ein Situationsbericht aus der Sicht der Mathematik (1993) 0.00
    0.003728258 = product of:
      0.03728258 = sum of:
        0.015301661 = weight(_text_:und in 1946) [ClassicSimilarity], result of:
          0.015301661 = score(doc=1946,freq=2.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.34282678 = fieldWeight in 1946, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.109375 = fieldNorm(doc=1946)
        0.021980919 = weight(_text_:der in 1946) [ClassicSimilarity], result of:
          0.021980919 = score(doc=1946,freq=4.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.4886365 = fieldWeight in 1946, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.109375 = fieldNorm(doc=1946)
      0.1 = coord(2/20)
    
  11. Gödert, W.: Navigation und Retrieval in Datenbanken und Informationsnetzen (1995) 0.00
    0.0037182674 = product of:
      0.037182674 = sum of:
        0.021639816 = weight(_text_:und in 2113) [ClassicSimilarity], result of:
          0.021639816 = score(doc=2113,freq=4.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.4848303 = fieldWeight in 2113, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.109375 = fieldNorm(doc=2113)
        0.015542857 = weight(_text_:der in 2113) [ClassicSimilarity], result of:
          0.015542857 = score(doc=2113,freq=2.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.34551817 = fieldWeight in 2113, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.109375 = fieldNorm(doc=2113)
      0.1 = coord(2/20)
    
    Footnote
     Written version of a lecture given at the SWD colloquium at the Deutsche Bibliothek in Frankfurt on 23 November 1994; the text is available from the author
  12. Sander, C.; Schmiede, R.; Wille, R.: ¬Ein begriffliches Datensystem zur Literatur der interdisziplinären Technikforschung (1993) 0.00
    0.0034485222 = product of:
      0.03448522 = sum of:
        0.017107777 = weight(_text_:und in 5255) [ClassicSimilarity], result of:
          0.017107777 = score(doc=5255,freq=10.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.38329202 = fieldWeight in 5255, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5255)
        0.017377444 = weight(_text_:der in 5255) [ClassicSimilarity], result of:
          0.017377444 = score(doc=5255,freq=10.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.38630107 = fieldWeight in 5255, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5255)
      0.1 = coord(2/20)
    
    Abstract
     Conceptual data systems emerged within the framework of Formal Concept Analysis and are based on mathematical formalisations of concept, concept system and conceptual file. They make the knowledge contained in a database conceptually accessible and interpretable. For this purpose, conceptual relationships are displayed in nested line diagrams according to selected query aspects. By refining, coarsening and switching between concept structures, one can 'navigate' without limit through the knowledge stored in the database. In a research project funded by the Zentrum für interdisziplinäre Technikforschung at TH Darmstadt, a prototype of a conceptual data system was built whose data context is a selected, conceptually prepared set of books on interdisciplinary technology research. This prototype is intended to demonstrate the flexible and variable use of conceptual data systems in the literature domain.
    Source
     Lecture, 17th annual conference of the Gesellschaft für Klassifikation, 3-5 March 1993 in Kaiserslautern
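     To give a flavour of the formal machinery behind conceptual data systems, here is a minimal sketch of the two derivation operators of Formal Concept Analysis on a toy context; the objects and attributes are invented and are not taken from the Darmstadt prototype described above.

```python
# Toy formal context: objects (books) and the attributes they carry -- illustrative only.
CONTEXT = {
    "Buch A": {"Technikforschung", "Ethik"},
    "Buch B": {"Technikforschung", "Informatik"},
    "Buch C": {"Informatik"},
}

def common_attributes(objects):
    """Derivation A': attributes shared by all given objects."""
    sets = [CONTEXT[o] for o in objects]
    return set.intersection(*sets) if sets else {a for attrs in CONTEXT.values() for a in attrs}

def matching_objects(attributes):
    """Derivation B': objects that carry all given attributes."""
    return {o for o, attrs in CONTEXT.items() if set(attributes) <= attrs}

# A formal concept is a pair (extent, intent) that is closed under both operators:
extent = matching_objects({"Technikforschung"})
intent = common_attributes(extent)
print(sorted(extent), sorted(intent))   # ['Buch A', 'Buch B'] ['Technikforschung']
```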
  13. Wille, R.: Denken in Begriffen : von der griechischen Philosophie bis zur Künstlichen Intelligenz heute (1993) 0.00
    0.003062907 = product of:
      0.030629069 = sum of:
        0.013251626 = weight(_text_:und in 3145) [ClassicSimilarity], result of:
          0.013251626 = score(doc=3145,freq=6.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.2968967 = fieldWeight in 3145, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0546875 = fieldNorm(doc=3145)
        0.017377444 = weight(_text_:der in 3145) [ClassicSimilarity], result of:
          0.017377444 = score(doc=3145,freq=10.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.38630107 = fieldWeight in 3145, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.0546875 = fieldNorm(doc=3145)
      0.1 = coord(2/20)
    
    Abstract
     Mechanistic thinking and its implementation in machines (especially in complex computer systems) increasingly endangers the cognitive autonomy of human beings. This kind of thinking finds its particular expression in the goals of Artificial Intelligence, which rest on the metaphor of the artificial human. To make the limitations of mechanistic thinking apparent, the history of the notion of concept is traced through its most important stages, from Greek antiquity to the present. This makes visible, in particular, the loss of content that restrictive formalisations of conceptual thinking entail. The paper argues for reactivating the close connection between content and form in conceptual thinking; to this end, the mechanistic world view is contrasted with the world view of the human communication community, for which communicative thinking and acting are constitutive.
    Source
     Lecture, 17th annual conference of the Gesellschaft für Klassifikation, 3-5 March 1993 in Kaiserslautern
  14. Bauckhage, C.: Moderne Textanalyse : neues Wissen für intelligente Lösungen (2016) 0.00
    0.0030128874 = product of:
      0.030128874 = sum of:
        0.0123656085 = weight(_text_:und in 2568) [ClassicSimilarity], result of:
          0.0123656085 = score(doc=2568,freq=4.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.27704588 = fieldWeight in 2568, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=2568)
        0.017763264 = weight(_text_:der in 2568) [ClassicSimilarity], result of:
          0.017763264 = score(doc=2568,freq=8.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.3948779 = fieldWeight in 2568, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.0625 = fieldNorm(doc=2568)
      0.1 = coord(2/20)
    
    Abstract
     With the ever-growing availability of data (big data) and rapid advances in data-driven machine learning, recent years have seen breakthroughs in artificial intelligence. This talk examines these developments with a particular focus on the automatic analysis of text data. Using simple examples, we illustrate how modern text analysis works and show, again by example, which practical applications arise today in industries such as publishing, the financial sector and consulting.
    Content
     Slides of the presentation given at the GENIOS Datenbankfrühstück 2016, 19 October 2016.
  15. Kollewe, W.; Sander, C.; Schmiede, R.; Wille, R.: TOSCANA als Instrument der bibliothekarischen Sacherschließung (1995) 0.00
    0.002860374 = product of:
      0.02860374 = sum of:
        0.008743806 = weight(_text_:und in 585) [ClassicSimilarity], result of:
          0.008743806 = score(doc=585,freq=2.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.19590102 = fieldWeight in 585, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=585)
        0.019859934 = weight(_text_:der in 585) [ClassicSimilarity], result of:
          0.019859934 = score(doc=585,freq=10.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.44148692 = fieldWeight in 585, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.0625 = fieldNorm(doc=585)
      0.1 = coord(2/20)
    
    Abstract
     TOSCANA is a computer program for building conceptual exploration systems based on Formal Concept Analysis. This paper discusses how TOSCANA can be used for subject indexing in libraries and for thematic literature searching. It reports on the research project 'Anwendung eines Modells begrifflicher Wissenssysteme im Bereich der Literatur zur interdisziplinären Technikforschung', funded by the Zentrum für interdisziplinäre Technikforschung in Darmstadt.
  16. Dietze, J.: Sachkatalogisierung in einem OPAC (1993) 0.00
    0.0028098237 = product of:
      0.028098237 = sum of:
        0.017107777 = weight(_text_:und in 7388) [ClassicSimilarity], result of:
          0.017107777 = score(doc=7388,freq=10.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.38329202 = fieldWeight in 7388, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0546875 = fieldNorm(doc=7388)
        0.0109904595 = weight(_text_:der in 7388) [ClassicSimilarity], result of:
          0.0109904595 = score(doc=7388,freq=4.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.24431825 = fieldWeight in 7388, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.0546875 = fieldNorm(doc=7388)
      0.1 = coord(2/20)
    
    Abstract
     Computer-based cataloguing always also means building an OPAC that allows searching by different criteria - formal and subject-related - and by combinations of them. Free-text searching is readily used by patrons, although it does not exploit the achievable recall (completeness). The use of subject headings should presuppose their standardisation in an authority file in order to eliminate the subjectivity of individual cataloguers. Subject heading strings are in principle superfluous in an OPAC. If a hierarchical, i.e. systematic, classification is used, its notation should be system-coherent, flexible and synthetic (facets or keys). Subject headings and classification notations should be linked to each other in both directions by means of indexes. If subject cataloguing is done in a union catalogue, terminology control and a unified classification as coarse-grained systems are important desiderata.
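     The abstract's suggestion to link subject headings and classification notations via indexes can be pictured with a small, hypothetical sketch of a bidirectional index; the sample headings and notations are invented.

```python
# Hypothetical mapping from subject headings to classification notations -- invented data.
HEADING_TO_NOTATIONS = {
    "Katalogisierung": {"AN 74000"},
    "OPAC": {"AN 74190"},
}

def invert(mapping):
    """Build the reverse index (notation -> subject headings)."""
    reverse = {}
    for heading, notations in mapping.items():
        for notation in notations:
            reverse.setdefault(notation, set()).add(heading)
    return reverse

NOTATION_TO_HEADINGS = invert(HEADING_TO_NOTATIONS)
print(NOTATION_TO_HEADINGS["AN 74190"])   # {'OPAC'}
```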
  17. Jaenecke, P.: Knowledge organization due to theory formation (1995) 0.00
    0.0019894529 = product of:
      0.01989453 = sum of:
        0.008881632 = weight(_text_:der in 3751) [ClassicSimilarity], result of:
          0.008881632 = score(doc=3751,freq=2.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.19743896 = fieldWeight in 3751, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.0625 = fieldNorm(doc=3751)
        0.011012898 = product of:
          0.022025796 = sum of:
            0.022025796 = weight(_text_:29 in 3751) [ClassicSimilarity], result of:
              0.022025796 = score(doc=3751,freq=2.0), product of:
                0.070840135 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.02013827 = queryNorm
                0.31092256 = fieldWeight in 3751, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0625 = fieldNorm(doc=3751)
          0.5 = coord(1/2)
      0.1 = coord(2/20)
    
    Date
    29. 3.1996 17:26:47
    Source
     Lecture given at the conference 'EOCONSID '95: 2nd Meeting on Knowledge Organization in Information and Documentation Systems', Madrid, Nov. 16-17, 1995
  18. Noever, D.; Ciolino, M.: ¬The Turing deception (2022) 0.00
    0.0015992456 = product of:
      0.03198491 = sum of:
        0.03198491 = product of:
          0.09595472 = sum of:
            0.09595472 = weight(_text_:3a in 862) [ClassicSimilarity], result of:
              0.09595472 = score(doc=862,freq=2.0), product of:
                0.17073247 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.02013827 = queryNorm
                0.56201804 = fieldWeight in 862, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.046875 = fieldNorm(doc=862)
          0.33333334 = coord(1/3)
      0.05 = coord(1/20)
    
    Source
     https://arxiv.org/abs/2212.06721
  19. Schierl, T.: ¬Die Relevanz der Kommunikationswissenschaft für das öffentliche Bibliothekswesen (1995) 0.00
    0.0012560525 = product of:
      0.02512105 = sum of:
        0.02512105 = weight(_text_:der in 1436) [ClassicSimilarity], result of:
          0.02512105 = score(doc=1436,freq=4.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.5584417 = fieldWeight in 1436, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.125 = fieldNorm(doc=1436)
      0.05 = coord(1/20)
    
    Footnote
     Written version of the lecture given on 17 January 1995 at the FHBD Köln
  20. Scheich, P.; Skorsky, M.; Vogt, F.; Wachter, C.; Wille, R.: Conceptual data systems (1992) 0.00
    0.001099046 = product of:
      0.021980919 = sum of:
        0.021980919 = weight(_text_:der in 3147) [ClassicSimilarity], result of:
          0.021980919 = score(doc=3147,freq=4.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.4886365 = fieldWeight in 3147, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.109375 = fieldNorm(doc=3147)
      0.05 = coord(1/20)
    
    Footnote
     To appear in the proceedings of the 16th annual conference of the Gesellschaft für Klassifikation, 1992 in Dortmund

Languages

  • d (German) 23
  • e (English) 11
