Search (1739 results, page 1 of 87)

  • Filter: language_ss:"d"
  1. Eversberg, B.: ADV und Zetteldruck : ein Widerspruch? (1975) 0.14
    0.1381995 = product of:
      0.20729923 = sum of:
        0.07809752 = weight(_text_:data in 4431) [ClassicSimilarity], result of:
          0.07809752 = score(doc=4431,freq=6.0), product of:
            0.16132914 = queryWeight, product of:
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.051020417 = queryNorm
            0.48408815 = fieldWeight in 4431, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.0625 = fieldNorm(doc=4431)
        0.12920171 = sum of:
          0.07390122 = weight(_text_:processing in 4431) [ClassicSimilarity], result of:
            0.07390122 = score(doc=4431,freq=2.0), product of:
              0.20653816 = queryWeight, product of:
                4.048147 = idf(docFreq=2097, maxDocs=44218)
                0.051020417 = queryNorm
              0.35780904 = fieldWeight in 4431, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.048147 = idf(docFreq=2097, maxDocs=44218)
                0.0625 = fieldNorm(doc=4431)
          0.055300497 = weight(_text_:22 in 4431) [ClassicSimilarity], result of:
            0.055300497 = score(doc=4431,freq=2.0), product of:
              0.1786648 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.051020417 = queryNorm
              0.30952093 = fieldWeight in 4431, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0625 = fieldNorm(doc=4431)
      0.6666667 = coord(2/3)
    
    Abstract
    A method is outlined which would permit a large number of libraries of all types to use centralised cataloguing facilities without the need for their own automatic data processing equipment and outlay. The method is seen as an alternative to the OCLC on-line data bank, and permits the ordering of printed catalogue cards by machine-readable but hand-prepared data cards, such as the loan cards which readers at the Münster library are at present required to complete. The proposed sequence of ordering is set out in 11 stages
    Source
    Zeitschrift für Bibliothekswesen und Bibliographie. 22(1975) H.5, S.387-390
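    The indented tree under each result is Lucene's "explain" output for ClassicSimilarity (tf-idf) scoring. As an illustration, the numbers for result 1 can be reproduced with the standard formulas in a few lines. This is a sketch, not the search engine's code; queryNorm is copied from the output above, since it depends on the whole query, which this page does not show.

```python
import math

# ClassicSimilarity arithmetic as shown in the explain tree for result 1.
QUERY_NORM = 0.051020417  # copied from the output; depends on the full query
MAX_DOCS = 44218

def tf(freq: float) -> float:
    """Term-frequency factor: sqrt(freq)."""
    return math.sqrt(freq)

def idf(doc_freq: int, max_docs: int = MAX_DOCS) -> float:
    """Inverse document frequency: 1 + ln(maxDocs / (docFreq + 1))."""
    return 1.0 + math.log(max_docs / (doc_freq + 1))

def clause_score(freq: float, doc_freq: int, field_norm: float) -> float:
    """One term clause: queryWeight * fieldWeight."""
    term_idf = idf(doc_freq)
    query_weight = term_idf * QUERY_NORM             # e.g. 0.16132914 for 'data'
    field_weight = tf(freq) * term_idf * field_norm  # e.g. 0.48408815 for 'data'
    return query_weight * field_weight

# The three clauses matched in doc 4431 (fieldNorm = 0.0625):
data_w = clause_score(6.0, 5088, 0.0625)  # ≈ 0.07809752 ('data')
proc_w = clause_score(2.0, 2097, 0.0625)  # ≈ 0.07390122 ('processing')
t22_w  = clause_score(2.0, 3622, 0.0625)  # ≈ 0.05530050 ('22')

# coord(2/3): two of the three top-level query clauses matched the document.
score = (data_w + proc_w + t22_w) * (2.0 / 3.0)  # ≈ 0.1381995
```

    The same arithmetic accounts for every tree below: each clause score is queryWeight × fieldWeight, matching clauses are summed, and coord(matched/total) scales the sum into the final document score.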
  2. Semantik, Lexikographie und Computeranwendungen : Workshop ... (Bonn) : 1995.01.27-28 (1996) 0.12
    0.12220091 = product of:
      0.18330136 = sum of:
        0.056362033 = weight(_text_:data in 190) [ClassicSimilarity], result of:
          0.056362033 = score(doc=190,freq=8.0), product of:
            0.16132914 = queryWeight, product of:
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.051020417 = queryNorm
            0.34936053 = fieldWeight in 190, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.0390625 = fieldNorm(doc=190)
        0.12693933 = sum of:
          0.09237652 = weight(_text_:processing in 190) [ClassicSimilarity], result of:
            0.09237652 = score(doc=190,freq=8.0), product of:
              0.20653816 = queryWeight, product of:
                4.048147 = idf(docFreq=2097, maxDocs=44218)
                0.051020417 = queryNorm
              0.4472613 = fieldWeight in 190, product of:
                2.828427 = tf(freq=8.0), with freq of:
                  8.0 = termFreq=8.0
                4.048147 = idf(docFreq=2097, maxDocs=44218)
                0.0390625 = fieldNorm(doc=190)
          0.03456281 = weight(_text_:22 in 190) [ClassicSimilarity], result of:
            0.03456281 = score(doc=190,freq=2.0), product of:
              0.1786648 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.051020417 = queryNorm
              0.19345059 = fieldWeight in 190, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0390625 = fieldNorm(doc=190)
      0.6666667 = coord(2/3)
    
    Date
    14. 4.2007 10:04:22
    LCSH
    Semantics / Data processing ; Lexicography / Data processing ; Computational linguistics
    Subject
    Semantics / Data processing ; Lexicography / Data processing ; Computational linguistics
  3. Polatscheck, K.: Elektronische Versuchung : Test des Sony Data Discman: eine digitale Konkurrenz für Taschenbücher? (1992) 0.12
    0.12188882 = product of:
      0.18283322 = sum of:
        0.12753272 = weight(_text_:data in 6381) [ClassicSimilarity], result of:
          0.12753272 = score(doc=6381,freq=4.0), product of:
            0.16132914 = queryWeight, product of:
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.051020417 = queryNorm
            0.7905126 = fieldWeight in 6381, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.125 = fieldNorm(doc=6381)
        0.055300497 = product of:
          0.11060099 = sum of:
            0.11060099 = weight(_text_:22 in 6381) [ClassicSimilarity], result of:
              0.11060099 = score(doc=6381,freq=2.0), product of:
                0.1786648 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051020417 = queryNorm
                0.61904186 = fieldWeight in 6381, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.125 = fieldNorm(doc=6381)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Object
    Data Discman
    Source
    Zeit. Nr.xx vom ???, S.22
  4. Analytische Informationssysteme : Data Warehouse, On-Line Analytical Processing, Data Mining (1999) 0.11
    0.10692283 = product of:
      0.16038424 = sum of:
        0.104383945 = weight(_text_:data in 1381) [ClassicSimilarity], result of:
          0.104383945 = score(doc=1381,freq=14.0), product of:
            0.16132914 = queryWeight, product of:
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.051020417 = queryNorm
            0.64702475 = fieldWeight in 1381, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1381)
        0.056000296 = product of:
          0.11200059 = sum of:
            0.11200059 = weight(_text_:processing in 1381) [ClassicSimilarity], result of:
              0.11200059 = score(doc=1381,freq=6.0), product of:
                0.20653816 = queryWeight, product of:
                  4.048147 = idf(docFreq=2097, maxDocs=44218)
                  0.051020417 = queryNorm
                0.54227555 = fieldWeight in 1381, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  4.048147 = idf(docFreq=2097, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1381)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Abstract
    Alongside the operational information systems that support a company's day-to-day business, information systems for the analytical tasks of specialists and managers are increasingly coming to the fore. In almost all companies, concepts such as data warehouse, on-line analytical processing and data mining are currently under discussion and the associated products are being evaluated. Against this background, this collected volume aims to offer an up-to-date overview of technologies, products and trends. The contributions from industry and academia can provide practitioners with valuable guidance when deciding how to build and deploy such analytical information systems.
    Content
    Fundamentals.- Data Warehouse.- On-line Analytical Processing.- Data Mining.- Business and strategic aspects.
    Theme
    Data Mining
  5. Analytische Informationssysteme : Data Warehouse, On-Line Analytical Processing, Data Mining (1998) 0.10
    0.10205302 = product of:
      0.15307952 = sum of:
        0.10082347 = weight(_text_:data in 1380) [ClassicSimilarity], result of:
          0.10082347 = score(doc=1380,freq=10.0), product of:
            0.16132914 = queryWeight, product of:
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.051020417 = queryNorm
            0.6249551 = fieldWeight in 1380, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.0625 = fieldNorm(doc=1380)
        0.052256055 = product of:
          0.10451211 = sum of:
            0.10451211 = weight(_text_:processing in 1380) [ClassicSimilarity], result of:
              0.10451211 = score(doc=1380,freq=4.0), product of:
                0.20653816 = queryWeight, product of:
                  4.048147 = idf(docFreq=2097, maxDocs=44218)
                  0.051020417 = queryNorm
                0.5060184 = fieldWeight in 1380, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  4.048147 = idf(docFreq=2097, maxDocs=44218)
                  0.0625 = fieldNorm(doc=1380)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Abstract
    Alongside the operational information systems, information systems for the analytical tasks of specialists and managers are increasingly coming to the fore. In almost all companies, concepts such as data warehouse, on-line analytical processing and data mining are currently under discussion and the associated products are being evaluated. Against this background, this collected volume aims to offer an up-to-date overview of technologies, products and trends. The contributions from industry and academia can provide practitioners with valuable guidance when deciding how to build and deploy such analytical information systems.
    Theme
    Data Mining
  6. Lusti, M.: Data Warehousing and Data Mining : Eine Einführung in entscheidungsunterstützende Systeme (1999) 0.10
    0.09796413 = product of:
      0.14694619 = sum of:
        0.11929594 = weight(_text_:data in 4261) [ClassicSimilarity], result of:
          0.11929594 = score(doc=4261,freq=14.0), product of:
            0.16132914 = queryWeight, product of:
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.051020417 = queryNorm
            0.7394569 = fieldWeight in 4261, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.0625 = fieldNorm(doc=4261)
        0.027650248 = product of:
          0.055300497 = sum of:
            0.055300497 = weight(_text_:22 in 4261) [ClassicSimilarity], result of:
              0.055300497 = score(doc=4261,freq=2.0), product of:
                0.1786648 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051020417 = queryNorm
                0.30952093 = fieldWeight in 4261, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=4261)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Date
    17. 7.2002 19:22:06
    RSWK
    Data-warehouse-Konzept / Lehrbuch
    Data mining / Lehrbuch
    Subject
    Data-warehouse-Konzept / Lehrbuch
    Data mining / Lehrbuch
    Theme
    Data Mining
  7. Gödert, W.; Lepsky, K.: Informationelle Kompetenz : ein humanistischer Entwurf (2019) 0.09
    0.08932869 = product of:
      0.13399303 = sum of:
        0.094539605 = product of:
          0.2836188 = sum of:
            0.2836188 = weight(_text_:3a in 5955) [ClassicSimilarity], result of:
              0.2836188 = score(doc=5955,freq=2.0), product of:
                0.43255165 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.051020417 = queryNorm
                0.65568775 = fieldWeight in 5955, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5955)
          0.33333334 = coord(1/3)
        0.03945342 = weight(_text_:data in 5955) [ClassicSimilarity], result of:
          0.03945342 = score(doc=5955,freq=2.0), product of:
            0.16132914 = queryWeight, product of:
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.051020417 = queryNorm
            0.24455236 = fieldWeight in 5955, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5955)
      0.6666667 = coord(2/3)
    
    Footnote
    Review in: Philosophisch-ethische Rezensionen of 09.11.2019 (Jürgen Czogalla), at: https://philosophisch-ethische-rezensionen.de/rezension/Goedert1.html. In: B.I.T. online 23(2020) H.3, S.345-347 (W. Sühl-Strohmenger) [at: https%3A%2F%2Fwww.b-i-t-online.de%2Fheft%2F2020-03-rezensionen.pdf&usg=AOvVaw0iY3f_zNcvEjeZ6inHVnOK]. In: Open Password Nr. 805 of 14.08.2020 (H.-C. Hobohm) [at: https://www.password-online.de/?mailpoet_router&endpoint=view_in_browser&action=view&data=WzE0MywiOGI3NjZkZmNkZjQ1IiwwLDAsMTMxLDFd].
  8. Fabian, C.: Altbestandskatalogisierung als Gemeinschaftsaufgabe Europas (1991) 0.08
    0.08393081 = product of:
      0.12589622 = sum of:
        0.07970795 = weight(_text_:data in 3624) [ClassicSimilarity], result of:
          0.07970795 = score(doc=3624,freq=4.0), product of:
            0.16132914 = queryWeight, product of:
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.051020417 = queryNorm
            0.49407038 = fieldWeight in 3624, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.078125 = fieldNorm(doc=3624)
        0.04618826 = product of:
          0.09237652 = sum of:
            0.09237652 = weight(_text_:processing in 3624) [ClassicSimilarity], result of:
              0.09237652 = score(doc=3624,freq=2.0), product of:
                0.20653816 = queryWeight, product of:
                  4.048147 = idf(docFreq=2097, maxDocs=44218)
                  0.051020417 = queryNorm
                0.4472613 = fieldWeight in 3624, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.048147 = idf(docFreq=2097, maxDocs=44218)
                  0.078125 = fieldNorm(doc=3624)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Abstract
    Report of an international conference on retrospective cataloguing in Europe of 15th-19th century material, Nov. 90. It is quicker to convert an old catalogue for electronic data processing than to compile a new catalogue, although problems with converted data can occur in union catalogues because of inadequate standardisation.
  9. Pohl, A.; Danowski, P.: Linked Open Data in der Bibliothekswelt : Überblick und Herausforderungen (2015) 0.07
    0.07273987 = product of:
      0.109109804 = sum of:
        0.06763443 = weight(_text_:data in 2057) [ClassicSimilarity], result of:
          0.06763443 = score(doc=2057,freq=2.0), product of:
            0.16132914 = queryWeight, product of:
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.051020417 = queryNorm
            0.4192326 = fieldWeight in 2057, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.09375 = fieldNorm(doc=2057)
        0.04147537 = product of:
          0.08295074 = sum of:
            0.08295074 = weight(_text_:22 in 2057) [ClassicSimilarity], result of:
              0.08295074 = score(doc=2057,freq=2.0), product of:
                0.1786648 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051020417 = queryNorm
                0.46428138 = fieldWeight in 2057, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.09375 = fieldNorm(doc=2057)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Date
    26. 8.2015 10:22:00
  10. Münnich, M.: Katalogisieren auf dem PC : ein Pflichtenheft für die Formalkatalogisierung (1988) 0.07
    0.07049852 = product of:
      0.105747774 = sum of:
        0.07809752 = weight(_text_:data in 2502) [ClassicSimilarity], result of:
          0.07809752 = score(doc=2502,freq=6.0), product of:
            0.16132914 = queryWeight, product of:
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.051020417 = queryNorm
            0.48408815 = fieldWeight in 2502, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.0625 = fieldNorm(doc=2502)
        0.027650248 = product of:
          0.055300497 = sum of:
            0.055300497 = weight(_text_:22 in 2502) [ClassicSimilarity], result of:
              0.055300497 = score(doc=2502,freq=2.0), product of:
                0.1786648 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051020417 = queryNorm
                0.30952093 = fieldWeight in 2502, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=2502)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Abstract
    Examines a simpler cataloguing format offered by PCs, without disturbing compatibility, using A-Z cataloguing rules for data input, category codes for tagging and computer-supported data input through windows. Gives numerous examples of catalogue entries, basing techniques on certain category schemes set out by Klaus Haller and Hans Popst. Examines catalogue entries in respect of categories of data bases for authors and corporate names, titles, single volume works, serial issues of collected works, and limited editions of works in several volumes.
    Source
    Bibliotheksdienst. 22(1988) H.9, S.841-856
  11. Lackes, R.; Tillmanns, C.: Data Mining für die Unternehmenspraxis : Entscheidungshilfen und Fallstudien mit führenden Softwarelösungen (2006) 0.07
    0.06904841 = product of:
      0.103572614 = sum of:
        0.08283493 = weight(_text_:data in 1383) [ClassicSimilarity], result of:
          0.08283493 = score(doc=1383,freq=12.0), product of:
            0.16132914 = queryWeight, product of:
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.051020417 = queryNorm
            0.513453 = fieldWeight in 1383, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.046875 = fieldNorm(doc=1383)
        0.020737685 = product of:
          0.04147537 = sum of:
            0.04147537 = weight(_text_:22 in 1383) [ClassicSimilarity], result of:
              0.04147537 = score(doc=1383,freq=2.0), product of:
                0.1786648 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051020417 = queryNorm
                0.23214069 = fieldWeight in 1383, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=1383)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Abstract
    The book addresses practitioners in companies who work on the analysis of large data sets. After a short theoretical part, four case studies from the customer relationship management of a mail-order retailer are worked through. Eight leading software solutions were used: Intelligent Miner from IBM, Enterprise Miner from SAS, Clementine from SPSS, Knowledge Studio from Angoss, Delta Miner from Bissantz, Business Miner from Business Object, and the Data Engine from MIT. The case studies make the strengths and weaknesses of the individual solutions apparent and demonstrate a methodologically sound data mining procedure. Both aspects provide valuable support when selecting standard data mining software and in practical data analysis.
    Content
    Models, methods and tools: Aims and structure of the study.- Fundamentals.- Planning and decision-making with data mining support.- Methods.- Functionality and handling of the software solutions. Case studies: Initial situation and data set in mail-order retail.- Customer segmentation.- Explaining regional marketing successes in new-customer acquisition.- Predicting the customer lifetime value.- Selecting customers for a direct marketing campaign.- Which software solution for which decision?- Conclusion and market developments.
    Date
    22. 3.2008 14:46:06
    Theme
    Data Mining
  12. Engerer, V.: Informationswissenschaft und Linguistik : kurze Geschichte eines fruchtbaren interdisziplinären Verhältnisses in drei Akten (2012) 0.07
    0.06836687 = product of:
      0.1025503 = sum of:
        0.056362033 = weight(_text_:data in 3376) [ClassicSimilarity], result of:
          0.056362033 = score(doc=3376,freq=2.0), product of:
            0.16132914 = queryWeight, product of:
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.051020417 = queryNorm
            0.34936053 = fieldWeight in 3376, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.078125 = fieldNorm(doc=3376)
        0.04618826 = product of:
          0.09237652 = sum of:
            0.09237652 = weight(_text_:processing in 3376) [ClassicSimilarity], result of:
              0.09237652 = score(doc=3376,freq=2.0), product of:
                0.20653816 = queryWeight, product of:
                  4.048147 = idf(docFreq=2097, maxDocs=44218)
                  0.051020417 = queryNorm
                0.4472613 = fieldWeight in 3376, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.048147 = idf(docFreq=2097, maxDocs=44218)
                  0.078125 = fieldNorm(doc=3376)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Source
    SDV - Sprache und Datenverarbeitung. International journal for language data processing. 36(2012) H.2, S.71-91 [= E-Books - Fakten, Perspektiven und Szenarien]
  13. Weingarten, R.: ¬Die Verkabelung der Sprache : Grenzen der Technisierung von Kommunikation (1989) 0.07
    0.06767975 = product of:
      0.101519614 = sum of:
        0.055795565 = weight(_text_:data in 7156) [ClassicSimilarity], result of:
          0.055795565 = score(doc=7156,freq=4.0), product of:
            0.16132914 = queryWeight, product of:
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.051020417 = queryNorm
            0.34584928 = fieldWeight in 7156, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.0546875 = fieldNorm(doc=7156)
        0.045724045 = product of:
          0.09144809 = sum of:
            0.09144809 = weight(_text_:processing in 7156) [ClassicSimilarity], result of:
              0.09144809 = score(doc=7156,freq=4.0), product of:
                0.20653816 = queryWeight, product of:
                  4.048147 = idf(docFreq=2097, maxDocs=44218)
                  0.051020417 = queryNorm
                0.4427661 = fieldWeight in 7156, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  4.048147 = idf(docFreq=2097, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=7156)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    LCSH
    Communication / Data processing
    Subject
    Communication / Data processing
  14. Wettengel, M.: Zur Rekonstruktion digitaler Datenbestände aus der DDR nach der Wiedervereinigung : die Erfahrungen im Bundesarchiv (1997) 0.07
    0.06711142 = product of:
      0.10066712 = sum of:
        0.06833533 = weight(_text_:data in 1092) [ClassicSimilarity], result of:
          0.06833533 = score(doc=1092,freq=6.0), product of:
            0.16132914 = queryWeight, product of:
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.051020417 = queryNorm
            0.42357713 = fieldWeight in 1092, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1092)
        0.032331783 = product of:
          0.06466357 = sum of:
            0.06466357 = weight(_text_:processing in 1092) [ClassicSimilarity], result of:
              0.06466357 = score(doc=1092,freq=2.0), product of:
                0.20653816 = queryWeight, product of:
                  4.048147 = idf(docFreq=2097, maxDocs=44218)
                  0.051020417 = queryNorm
                0.3130829 = fieldWeight in 1092, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.048147 = idf(docFreq=2097, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1092)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Abstract
    After German unification, many former East German government agencies and institutions were closed. Archivists had to secure not only their paper records, but also a considerable number of digital data holdings. East German data processing systems proved not entirely different from those in the Western world, so there were no serious technical problems in terms of hardware and software. However, the conditions of acquisition were very different. In many instances, documentation of these electronic records proved to be incomplete or even totally missing. In such cases, different approaches were taken to identify and verify data file structures and to reconstruct missing documentation
  15. Gabler, S.: Vergabe von DDC-Sachgruppen mittels eines Schlagwort-Thesaurus (2021) 0.06
    0.063806206 = product of:
      0.09570931 = sum of:
        0.06752829 = product of:
          0.20258486 = sum of:
            0.20258486 = weight(_text_:3a in 1000) [ClassicSimilarity], result of:
              0.20258486 = score(doc=1000,freq=2.0), product of:
                0.43255165 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.051020417 = queryNorm
                0.46834838 = fieldWeight in 1000, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1000)
          0.33333334 = coord(1/3)
        0.028181016 = weight(_text_:data in 1000) [ClassicSimilarity], result of:
          0.028181016 = score(doc=1000,freq=2.0), product of:
            0.16132914 = queryWeight, product of:
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.051020417 = queryNorm
            0.17468026 = fieldWeight in 1000, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1000)
      0.6666667 = coord(2/3)
    
    Abstract
    The construction of a thematically ordered thesaurus is presented, built from the subject headings of the Gemeinsame Normdatei (GND) and using the DDC notations they contain. The top level of this thesaurus is formed by the DDC subject groups of the Deutsche Nationalbibliothek. The thesaurus is constructed in a rule-based way, applying Linked Data principles in a SPARQL processor. It serves the automated extraction of metadata from scholarly publications by means of a computational-linguistic extractor that processes digital full texts. The extractor identifies keywords by comparing character strings against the labels in the thesaurus, ranks the hits by their relevance in the text, and returns the assigned subject groups in rank order. The underlying assumption is that the sought subject group appears among the top ranks. The performance of the procedure is validated in three stages. First, a gold standard is compiled, on the basis of metadata and the findings of a brief inspection, from documents retrievable in the DNB online catalogue. The documents are distributed over 14 of the subject groups, with a lot size of 50 documents each. All documents are indexed with the extractor and the categorisation results are recorded. Finally, the resulting retrieval performance is assessed both for a hard (binary) categorisation and for a ranked return of the subject groups.
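    The matching-and-ranking step described in the abstract can be sketched as follows. This is a minimal illustration, not the thesis's actual extractor; the labels, the subject-group assignments and the frequency-based relevance measure below are invented for the example (the thesis derives them from the GND and the DNB subject groups).

```python
from collections import defaultdict

# Invented thesaurus fragment: label -> DDC subject group.
THESAURUS = {
    "Data Mining": "28 Informatik",
    "Thesaurus": "24 Buch- und Bibliothekswesen",
    "Semantik": "17 Sprachwissenschaft",
}

def rank_subject_groups(text: str) -> list[str]:
    """Match labels against the full text by string comparison and
    return the DDC subject groups ordered by summed hit count."""
    scores: dict[str, int] = defaultdict(int)
    lowered = text.lower()
    for label, group in THESAURUS.items():
        hits = lowered.count(label.lower())
        if hits:
            scores[group] += hits
    return sorted(scores, key=scores.get, reverse=True)

ranked = rank_subject_groups(
    "Der Beitrag behandelt Data Mining; Data Mining nutzt einen Thesaurus."
)
# "28 Informatik" ranks first (two hits), then "24 Buch- und Bibliothekswesen"
```

    The hard (binary) categorisation mentioned in the abstract would then take only `ranked[0]`, while the ranked evaluation checks whether the sought group appears among the top positions.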
    Content
    Master thesis, Master of Science (Library and Information Studies) (MSc), Universität Wien. Advisor: Christoph Steiner. Cf.: https://www.researchgate.net/publication/371680244_Vergabe_von_DDC-Sachgruppen_mittels_eines_Schlagwort-Thesaurus. DOI: 10.25365/thesis.70030. Cf. the presentation at: https://www.google.com/url?sa=i&rct=j&q=&esrc=s&source=web&cd=&ved=0CAIQw7AJahcKEwjwoZzzytz_AhUAAAAAHQAAAAAQAg&url=https%3A%2F%2Fwiki.dnb.de%2Fdownload%2Fattachments%2F252121510%2FDA3%2520Workshop-Gabler.pdf%3Fversion%3D1%26modificationDate%3D1671093170000%26api%3Dv2&psig=AOvVaw0szwENK1or3HevgvIDOfjx&ust=1687719410889597&opi=89978449.
  16. Hartmann, S.; Haffner, A.: Linked-RDA-Data in der Praxis (2010) 0.06
    Abstract
    The new cataloguing standard "Resource Description and Access" (RDA) makes it possible to represent bibliographic data as well as authority data in a Semantic-Web-conformant way. The talk aims to show what effects RDA has on cataloguing in libraries and on access to the catalogued resources in the Semantic Web. Drawing on first experiences from practical implementations, it explains how bibliographic data can be made more accessible and, above all, reused by means of RDA and Linked Data technologies.
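    A minimal sketch of the kind of Linked Data representation the abstract refers to: one bibliographic record serialized as RDF N-Triples. The vocabulary choice (Dublin Core terms), the subject URI, and the record itself are assumptions made for illustration, not the representation used in the talk.

    ```python
    # Hypothetical sketch: serialize a bibliographic record as N-Triples,
    # a line-based RDF syntax suitable for Linked Data publication.
    def to_ntriples(subject_uri: str, record: dict) -> str:
        dct = "http://purl.org/dc/terms/"  # Dublin Core terms namespace
        lines = [
            f'<{subject_uri}> <{dct}title> "{record["title"]}" .',
            f'<{subject_uri}> <{dct}date> "{record["year"]}" .',
        ]
        return "\n".join(lines)

    record = {"title": "Linked-RDA-Data in der Praxis", "year": "2010"}
    triples = to_ntriples("http://example.org/record/1", record)
    ```

    In practice an RDF library would handle escaping and datatypes; the point here is only the triple structure that makes such records reusable on the Semantic Web.
    
    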
    Date
    13.02.2011 20:22:23
  17. Kaiser, R.; Ockenfeld, M.; Skurcz, N.: Wann versteht mich mein Computer endlich? : 1. DGI-Konferenz: Semantic Web & Linked Data - Elemente zukünftiger Informationsinfrastrukturen (2011) 0.06
    Abstract
    "When will my computer finally understand me?" This could sum up the quintessence of the first DGI conference, organized by the Deutsche Gesellschaft für Informationswissenschaft und Informationspraxis (DGI) on the occasion of this year's Frankfurt Book Fair, within whose framework the 62nd DGI annual meeting also took place. Under the motto "Semantic Web & Linked Data - Elemente zukünftiger Informationsinfrastrukturen", more than 400 information professionals from academia, education, public administration, industry, and libraries came together from 7 to 9 October 2010 to present their work and findings on the next generation of web technologies and to discuss them with one another.
    Source
    BuB. 63(2011) H.1, S.22-23
  18. Drewer, P.; Massion, F.; Pulitano, D.: Was haben Wissensmodellierung, Wissensstrukturierung, künstliche Intelligenz und Terminologie miteinander zu tun? (2017) 0.06
    Abstract
    This publication describes the relationships between knowledge-rich, concept-oriented terminologies, ontologies, Big Data, and artificial intelligence.
    Date
    13.12.2017 14:17:22
  19. Mönnich, M.W.: Personalcomputer in Bibliotheken : Arbeitsplätze für Benutzer (1991) 0.06
    LCSH
    Libraries / Data processing
    Subject
    Libraries / Data processing
  20. Seelbach, D.: Computerlinguistik und Dokumentation : keyphrases in Dokumentationsprozessen (1975) 0.06
    LCSH
    Documentation / Data processing
    Subject
    Documentation / Data processing
