Search (99 results, page 1 of 5)

  • year_i:[1980 TO 1990}
  • type_ss:"a"
  1. Bock, H.-H.: Datenanalyse zur Strukturierung und Ordnung von Information (1989) 0.10
    0.102104165 = product of:
      0.20420833 = sum of:
        0.20420833 = sum of:
          0.1555836 = weight(_text_:daten in 141) [ClassicSimilarity], result of:
            0.1555836 = score(doc=141,freq=6.0), product of:
              0.24402376 = queryWeight, product of:
                4.759573 = idf(docFreq=1029, maxDocs=44218)
                0.051270094 = queryNorm
              0.6375756 = fieldWeight in 141, product of:
                2.4494898 = tf(freq=6.0), with freq of:
                  6.0 = termFreq=6.0
                4.759573 = idf(docFreq=1029, maxDocs=44218)
                0.0546875 = fieldNorm(doc=141)
          0.048624728 = weight(_text_:22 in 141) [ClassicSimilarity], result of:
            0.048624728 = score(doc=141,freq=2.0), product of:
              0.17953913 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.051270094 = queryNorm
              0.2708308 = fieldWeight in 141, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0546875 = fieldNorm(doc=141)
      0.5 = coord(1/2)
    
    Abstract
    The task of data analysis is to order data, to present them clearly, to discover hidden and natural structures, to distill the properties essential in this respect, and to construct suitable models for describing data. The paper provides an insight into the methods and principles of data analysis. Using typical examples, it shows which data are analysed, which structures are considered, which methods of presentation and ordering are used, which objectives are pursued, and which evaluation criteria can be applied. The appropriate use of the different methods is also discussed, with attention drawn to the risk and nature of misinterpretations.
    Pages
    S.1-22
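    The explain tree above follows Lucene's ClassicSimilarity formula: each leaf score is queryWeight × fieldWeight, where queryWeight = idf × queryNorm and fieldWeight = √tf × idf × fieldNorm; the matching clause scores are summed and then scaled by the coord factor. As a minimal sketch (plain Python, with the leaf values copied from result 1, doc 141), the reported score 0.102104165 can be recomputed:

    ```python
    import math

    def leaf_score(freq, idf, query_norm, field_norm):
        """One leaf of a Lucene ClassicSimilarity explain tree:
        score = queryWeight * fieldWeight
              = (idf * queryNorm) * (sqrt(freq) * idf * fieldNorm)."""
        query_weight = idf * query_norm
        field_weight = math.sqrt(freq) * idf * field_norm
        return query_weight * field_weight

    # Leaf values copied from the explain tree of result 1 (doc 141):
    daten = leaf_score(freq=6.0, idf=4.759573,
                       query_norm=0.051270094, field_norm=0.0546875)
    term22 = leaf_score(freq=2.0, idf=3.5018296,
                        query_norm=0.051270094, field_norm=0.0546875)

    # Clause scores are summed, then multiplied by coord(1/2) = 0.5:
    total = (daten + term22) * 0.5
    # daten ≈ 0.1555836, term22 ≈ 0.0486247, total ≈ 0.102104165
    ```

    The same recipe reproduces every other explain tree on this page; only the freq, idf, fieldNorm, and coord values change per document.
    
    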
  2. Herfurth, M.: Daten- versus Literaturdokumentation in den Sozialwissenschaften (1985) 0.04
    0.044913113 = product of:
      0.089826226 = sum of:
        0.089826226 = product of:
          0.17965245 = sum of:
            0.17965245 = weight(_text_:daten in 7591) [ClassicSimilarity], result of:
              0.17965245 = score(doc=7591,freq=2.0), product of:
                0.24402376 = queryWeight, product of:
                  4.759573 = idf(docFreq=1029, maxDocs=44218)
                  0.051270094 = queryNorm
                0.73620886 = fieldWeight in 7591, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.759573 = idf(docFreq=1029, maxDocs=44218)
                  0.109375 = fieldNorm(doc=7591)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  3. Ihm, P.: Numerische Taxonomie und Datenbanken (1982) 0.04
    0.044452455 = product of:
      0.08890491 = sum of:
        0.08890491 = product of:
          0.17780982 = sum of:
            0.17780982 = weight(_text_:daten in 101) [ClassicSimilarity], result of:
              0.17780982 = score(doc=101,freq=6.0), product of:
                0.24402376 = queryWeight, product of:
                  4.759573 = idf(docFreq=1029, maxDocs=44218)
                  0.051270094 = queryNorm
                0.72865784 = fieldWeight in 101, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  4.759573 = idf(docFreq=1029, maxDocs=44218)
                  0.0625 = fieldNorm(doc=101)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    A database consists of a data basis and a database management system. It is a tool for storing and providing data, which are frequently the subject of statistical (exploratory) data analysis. A frequently applied technique is cluster analysis. In the cases typical of databases, data have a complex structure, and the question arises whether cluster-analysis algorithms can be applied directly to these differently structured data rather than via the detour of rectangular files. Corresponding methodologies for the various application cases are discussed.
  4. Brannemann, M.: Weiterverarbeiten von Daten nach Downloading (1988) 0.04
    0.038496953 = product of:
      0.076993905 = sum of:
        0.076993905 = product of:
          0.15398781 = sum of:
            0.15398781 = weight(_text_:daten in 3988) [ClassicSimilarity], result of:
              0.15398781 = score(doc=3988,freq=2.0), product of:
                0.24402376 = queryWeight, product of:
                  4.759573 = idf(docFreq=1029, maxDocs=44218)
                  0.051270094 = queryNorm
                0.63103616 = fieldWeight in 3988, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.759573 = idf(docFreq=1029, maxDocs=44218)
                  0.09375 = fieldNorm(doc=3988)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  5. Lobeck, M.A.: Zur Standardisierung von Daten-Feldern und -Inhalten : Überlegungen und Aktivitäten von Normenorganisationen (1987) 0.04
    0.038496953 = product of:
      0.076993905 = sum of:
        0.076993905 = product of:
          0.15398781 = sum of:
            0.15398781 = weight(_text_:daten in 256) [ClassicSimilarity], result of:
              0.15398781 = score(doc=256,freq=2.0), product of:
                0.24402376 = queryWeight, product of:
                  4.759573 = idf(docFreq=1029, maxDocs=44218)
                  0.051270094 = queryNorm
                0.63103616 = fieldWeight in 256, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.759573 = idf(docFreq=1029, maxDocs=44218)
                  0.09375 = fieldNorm(doc=256)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  6. Fischer, H.G.: CONDOR: Modell eines integrierten DB-/IR-Systems für strukturierte und unstrukturierte Daten (1982) 0.04
    0.036295276 = product of:
      0.07259055 = sum of:
        0.07259055 = product of:
          0.1451811 = sum of:
            0.1451811 = weight(_text_:daten in 5197) [ClassicSimilarity], result of:
              0.1451811 = score(doc=5197,freq=4.0), product of:
                0.24402376 = queryWeight, product of:
                  4.759573 = idf(docFreq=1029, maxDocs=44218)
                  0.051270094 = queryNorm
                0.5949466 = fieldWeight in 5197, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  4.759573 = idf(docFreq=1029, maxDocs=44218)
                  0.0625 = fieldNorm(doc=5197)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    CONDOR is a model of a modular, integrated DB/IR system with which both structured and unstructured data (text data) can be processed. The information to be stored is indexed largely automatically. Since a broad range of users is to have access to the system, several dialogue modes (command, natural language, form, menu) are implemented. An attempt is made to unify them in a systematic user-interface design, so as to make operation as simple as possible for the individual user while keeping the system highly flexible in use.
  7. Dahlberg, I.: Conceptual definitions for INTERCONCEPT (1981) 0.03
    0.03473195 = product of:
      0.0694639 = sum of:
        0.0694639 = product of:
          0.1389278 = sum of:
            0.1389278 = weight(_text_:22 in 1630) [ClassicSimilarity], result of:
              0.1389278 = score(doc=1630,freq=2.0), product of:
                0.17953913 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051270094 = queryNorm
                0.77380234 = fieldWeight in 1630, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.15625 = fieldNorm(doc=1630)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    International classification. 8(1981), S.16-22
  8. Pietris, M.K.D.: LCSH update (1988) 0.03
    0.03473195 = product of:
      0.0694639 = sum of:
        0.0694639 = product of:
          0.1389278 = sum of:
            0.1389278 = weight(_text_:22 in 2798) [ClassicSimilarity], result of:
              0.1389278 = score(doc=2798,freq=2.0), product of:
                0.17953913 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051270094 = queryNorm
                0.77380234 = fieldWeight in 2798, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.15625 = fieldNorm(doc=2798)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Cataloguing Australia. 13(1988), S.19-22
  9. Woods, W.A.: What's important about knowledge representation? (1983) 0.03
    0.03473195 = product of:
      0.0694639 = sum of:
        0.0694639 = product of:
          0.1389278 = sum of:
            0.1389278 = weight(_text_:22 in 6143) [ClassicSimilarity], result of:
              0.1389278 = score(doc=6143,freq=2.0), product of:
                0.17953913 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051270094 = queryNorm
                0.77380234 = fieldWeight in 6143, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.15625 = fieldNorm(doc=6143)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Computer. 16(1983) no.10, S.22-27
  10. Malsburg, C. von der: The correlation theory of brain function (1981) 0.03
    0.033929378 = product of:
      0.067858756 = sum of:
        0.067858756 = product of:
          0.20357625 = sum of:
            0.20357625 = weight(_text_:3a in 76) [ClassicSimilarity], result of:
              0.20357625 = score(doc=76,freq=2.0), product of:
                0.43466842 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.051270094 = queryNorm
                0.46834838 = fieldWeight in 76, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=76)
          0.33333334 = coord(1/3)
      0.5 = coord(1/2)
    
    Source
    http://cogprints.org/1380/1/vdM_correlation.pdf
  11. Gebhardt, F.: Semantisches Wissen in Datenbanken : ein Literaturbericht (1987) 0.03
    0.032080796 = product of:
      0.06416159 = sum of:
        0.06416159 = product of:
          0.12832318 = sum of:
            0.12832318 = weight(_text_:daten in 2423) [ClassicSimilarity], result of:
              0.12832318 = score(doc=2423,freq=2.0), product of:
                0.24402376 = queryWeight, product of:
                  4.759573 = idf(docFreq=1029, maxDocs=44218)
                  0.051270094 = queryNorm
                0.52586347 = fieldWeight in 2423, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.759573 = idf(docFreq=1029, maxDocs=44218)
                  0.078125 = fieldNorm(doc=2423)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Die "Bedeutung" der Daten schlägt sich darin nieder, wie sie verarbeitet werden oder überhaupt nur verarbeitet werden dürfen. Dieser semantische Aspekt steckt vorwiegend in den Verarbeitungsprogrammen. In mancherlei Situationen ist es jedoch sinnvoll, wenigstens einen Teil davon in die Datenbank zu übernehmen. Hierfür gibt es vielfältige Methoden mit recht unterschiedlichen Voraussetzungen, Auswirkungen und Leistungen ...
  12. Krommer-Benz, M.; Nedobity, W.: Klassifikationssysteme und die terminologischen Datenbanken (1985) 0.03
    0.031758364 = product of:
      0.06351673 = sum of:
        0.06351673 = product of:
          0.12703346 = sum of:
            0.12703346 = weight(_text_:daten in 116) [ClassicSimilarity], result of:
              0.12703346 = score(doc=116,freq=4.0), product of:
                0.24402376 = queryWeight, product of:
                  4.759573 = idf(docFreq=1029, maxDocs=44218)
                  0.051270094 = queryNorm
                0.52057827 = fieldWeight in 116, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  4.759573 = idf(docFreq=1029, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=116)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Problems of classification systems that must meet the special needs of terminological databases are gaining increasing importance. The paper first presents an overview of classification systems currently in use. The description of these systems is based on a survey conducted by INFOTERM in the course of compiling the 'World Guide to Terminological Activities'. Data from INFOTERM's documentation are also drawn upon. After an analysis of the classification systems by comparing the notations, structures, etc. employed, similarities and divergences are shown and questions of compatibility are discussed. For the intended exchange of terminological data, an umbrella classification is indispensable; its prerequisites, principles, and possibilities are discussed.
  13. Junginger, F.: Regeln für den Schlagwortkatalog: RSWK : Ergänzungen und Berichtigungen Nr.1 (1988) 0.03
    0.027785558 = product of:
      0.055571117 = sum of:
        0.055571117 = product of:
          0.11114223 = sum of:
            0.11114223 = weight(_text_:22 in 195) [ClassicSimilarity], result of:
              0.11114223 = score(doc=195,freq=2.0), product of:
                0.17953913 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051270094 = queryNorm
                0.61904186 = fieldWeight in 195, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.125 = fieldNorm(doc=195)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Bibliotheksdienst. 22(1988), S.552-563
  14. Voorhees, E.M.: Implementing agglomerative hierarchic clustering algorithms for use in document retrieval (1986) 0.03
    0.027785558 = product of:
      0.055571117 = sum of:
        0.055571117 = product of:
          0.11114223 = sum of:
            0.11114223 = weight(_text_:22 in 402) [ClassicSimilarity], result of:
              0.11114223 = score(doc=402,freq=2.0), product of:
                0.17953913 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051270094 = queryNorm
                0.61904186 = fieldWeight in 402, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.125 = fieldNorm(doc=402)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Information processing and management. 22(1986) no.6, S.465-476
  15. Tell, B.V.: Cataloging rules and database production : implications for manpower training in a developing country (1989) 0.03
    0.027785558 = product of:
      0.055571117 = sum of:
        0.055571117 = product of:
          0.11114223 = sum of:
            0.11114223 = weight(_text_:22 in 435) [ClassicSimilarity], result of:
              0.11114223 = score(doc=435,freq=2.0), product of:
                0.17953913 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051270094 = queryNorm
                0.61904186 = fieldWeight in 435, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.125 = fieldNorm(doc=435)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    International forum on information and documentation. 14(1989), S.22-27
  16. Grundsätze der Universellen Dezimalklassifikation (DK) und Regeln für ihre Revision und Veröffentlichung (1981) 0.03
    0.027785558 = product of:
      0.055571117 = sum of:
        0.055571117 = product of:
          0.11114223 = sum of:
            0.11114223 = weight(_text_:22 in 1175) [ClassicSimilarity], result of:
              0.11114223 = score(doc=1175,freq=2.0), product of:
                0.17953913 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051270094 = queryNorm
                0.61904186 = fieldWeight in 1175, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.125 = fieldNorm(doc=1175)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    DK-Mitteilungen. 25(1981) Nr.4, S.15-22
  17. Hermes, H.J.: ¬Die DK: eine todkranke Klassifikation? (1983) 0.03
    0.027785558 = product of:
      0.055571117 = sum of:
        0.055571117 = product of:
          0.11114223 = sum of:
            0.11114223 = weight(_text_:22 in 1176) [ClassicSimilarity], result of:
              0.11114223 = score(doc=1176,freq=2.0), product of:
                0.17953913 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051270094 = queryNorm
                0.61904186 = fieldWeight in 1176, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.125 = fieldNorm(doc=1176)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    DK-Mitteilungen. 27(1983) Nr.6, S.19-22
  18. Kashyap, M.M.: Algorithms for analysis and representation of subject contents in a documentary language (1983) 0.03
    0.027785558 = product of:
      0.055571117 = sum of:
        0.055571117 = product of:
          0.11114223 = sum of:
            0.11114223 = weight(_text_:22 in 1955) [ClassicSimilarity], result of:
              0.11114223 = score(doc=1955,freq=2.0), product of:
                0.17953913 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051270094 = queryNorm
                0.61904186 = fieldWeight in 1955, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.125 = fieldNorm(doc=1955)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Library herald. 22(1983), S.1-29
  19. Stock, W.G.: Wissenschaftsinformatik : Fundierung, Gegenstand und Methoden (1980) 0.03
    0.027785558 = product of:
      0.055571117 = sum of:
        0.055571117 = product of:
          0.11114223 = sum of:
            0.11114223 = weight(_text_:22 in 2808) [ClassicSimilarity], result of:
              0.11114223 = score(doc=2808,freq=2.0), product of:
                0.17953913 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051270094 = queryNorm
                0.61904186 = fieldWeight in 2808, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.125 = fieldNorm(doc=2808)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Ratio. 22(1980), S.155-164
  20. Dorfmüller, K.: Klassifikationsprobleme bei Musikalien : Sachstandsbericht (1) AIBM-Subkommission für Klassifikation (1980) 0.03
    0.027785558 = product of:
      0.055571117 = sum of:
        0.055571117 = product of:
          0.11114223 = sum of:
            0.11114223 = weight(_text_:22 in 3076) [ClassicSimilarity], result of:
              0.11114223 = score(doc=3076,freq=2.0), product of:
                0.17953913 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.051270094 = queryNorm
                0.61904186 = fieldWeight in 3076, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.125 = fieldNorm(doc=3076)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Forum Musikbibliothek. 1980, H.4, S.18-22

Languages

  • e 62
  • d 35
  • f 1