Search (13 results, page 1 of 1)

  • theme_ss:"Datenformate"
  • theme_ss:"Formalerschließung"
  • year_i:[1990 TO 2000}
  1. Hädrich, G.: Unreglementierte Gedanken zur Weiterentwicklung des Regelwerks für die alphabetische Katalogisierung (1996) 0.03
    0.031123882 = product of:
      0.15561941 = sum of:
        0.022018395 = weight(_text_:und in 5426) [ClassicSimilarity], result of:
          0.022018395 = score(doc=5426,freq=2.0), product of:
            0.06422601 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.028978055 = queryNorm
            0.34282678 = fieldWeight in 5426, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.109375 = fieldNorm(doc=5426)
        0.044533674 = product of:
          0.08906735 = sum of:
            0.08906735 = weight(_text_:bibliothekswesen in 5426) [ClassicSimilarity], result of:
              0.08906735 = score(doc=5426,freq=2.0), product of:
                0.12917466 = queryWeight, product of:
                  4.457672 = idf(docFreq=1392, maxDocs=44218)
                  0.028978055 = queryNorm
                0.68951094 = fieldWeight in 5426, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.457672 = idf(docFreq=1392, maxDocs=44218)
                  0.109375 = fieldNorm(doc=5426)
          0.5 = coord(1/2)
        0.08906735 = weight(_text_:bibliothekswesen in 5426) [ClassicSimilarity], result of:
          0.08906735 = score(doc=5426,freq=2.0), product of:
            0.12917466 = queryWeight, product of:
              4.457672 = idf(docFreq=1392, maxDocs=44218)
              0.028978055 = queryNorm
            0.68951094 = fieldWeight in 5426, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.457672 = idf(docFreq=1392, maxDocs=44218)
              0.109375 = fieldNorm(doc=5426)
      0.2 = coord(3/15)
    
    Source
    Zeitschrift für Bibliothekswesen und Bibliographie. 43(1996) H.5, S.471-486
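The score breakdown above is a Lucene ClassicSimilarity explain tree. As a minimal sketch (assuming Lucene's classic TF-IDF formula, where each matching clause contributes queryWeight × fieldWeight, with queryWeight = idf × queryNorm and fieldWeight = √tf × idf × fieldNorm, and the sum is scaled by the coordination factor), the top result's score can be reproduced from the printed constants:

```python
import math

def term_score(tf, idf, query_norm, field_norm):
    """Classic Lucene TF-IDF clause score:
    (idf * queryNorm) * (sqrt(tf) * idf * fieldNorm)."""
    query_weight = idf * query_norm
    field_weight = math.sqrt(tf) * idf * field_norm
    return query_weight * field_weight

QUERY_NORM = 0.028978055  # queryNorm from the explain tree
FIELD_NORM = 0.109375     # fieldNorm(doc=5426)

und  = term_score(2.0, 2.216367, QUERY_NORM, FIELD_NORM)   # _text_:und
bibl = term_score(2.0, 4.457672, QUERY_NORM, FIELD_NORM)   # _text_:bibliothekswesen

# "bibliothekswesen" appears as two clauses, one of them scaled by coord(1/2);
# only 3 of the 15 query clauses match, hence the outer coord(3/15) = 0.2.
score = (3 / 15) * (und + 0.5 * bibl + bibl)
print(f"{score:.9f}")  # ≈ 0.031123882
```

The same arithmetic, with each entry's own fieldNorm, accounts for every explain tree in this result list.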
  2. Lehmann, K.-D.: Die Mühen der Ebenen : Regelwerke - Datenformate - Kommunikationsschnittstellen (1997) 0.03
    0.026677614 = product of:
      0.13338807 = sum of:
        0.018872911 = weight(_text_:und in 7715) [ClassicSimilarity], result of:
          0.018872911 = score(doc=7715,freq=2.0), product of:
            0.06422601 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.028978055 = queryNorm
            0.29385152 = fieldWeight in 7715, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.09375 = fieldNorm(doc=7715)
        0.03817172 = product of:
          0.07634344 = sum of:
            0.07634344 = weight(_text_:bibliothekswesen in 7715) [ClassicSimilarity], result of:
              0.07634344 = score(doc=7715,freq=2.0), product of:
                0.12917466 = queryWeight, product of:
                  4.457672 = idf(docFreq=1392, maxDocs=44218)
                  0.028978055 = queryNorm
                0.5910094 = fieldWeight in 7715, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.457672 = idf(docFreq=1392, maxDocs=44218)
                  0.09375 = fieldNorm(doc=7715)
          0.5 = coord(1/2)
        0.07634344 = weight(_text_:bibliothekswesen in 7715) [ClassicSimilarity], result of:
          0.07634344 = score(doc=7715,freq=2.0), product of:
            0.12917466 = queryWeight, product of:
              4.457672 = idf(docFreq=1392, maxDocs=44218)
              0.028978055 = queryNorm
            0.5910094 = fieldWeight in 7715, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.457672 = idf(docFreq=1392, maxDocs=44218)
              0.09375 = fieldNorm(doc=7715)
      0.2 = coord(3/15)
    
    Source
    Zeitschrift für Bibliothekswesen und Bibliographie. 44(1997) H.3, S.229-240
  3. Krischker, U.: Formale Analyse von Dokumenten (1997) 0.00
    0.003953372 = product of:
      0.02965029 = sum of:
        0.026690327 = weight(_text_:und in 3925) [ClassicSimilarity], result of:
          0.026690327 = score(doc=3925,freq=16.0), product of:
            0.06422601 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.028978055 = queryNorm
            0.41556883 = fieldWeight in 3925, product of:
              4.0 = tf(freq=16.0), with freq of:
                16.0 = termFreq=16.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.046875 = fieldNorm(doc=3925)
        0.002959963 = product of:
          0.005919926 = sum of:
            0.005919926 = weight(_text_:information in 3925) [ClassicSimilarity], result of:
              0.005919926 = score(doc=3925,freq=2.0), product of:
                0.050870337 = queryWeight, product of:
                  1.7554779 = idf(docFreq=20772, maxDocs=44218)
                  0.028978055 = queryNorm
                0.116372846 = fieldWeight in 3925, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  1.7554779 = idf(docFreq=20772, maxDocs=44218)
                  0.046875 = fieldNorm(doc=3925)
          0.5 = coord(1/2)
      0.13333334 = coord(2/15)
    
    Abstract
    The formal analysis of documents has long been one of the principal tasks of libraries and archives, and more recently of documentation centres as well. The documents that are collected and analysed must be identifiable and retrievable. Formal analysis makes it possible to describe the formal characteristics of documents, so that they can be recorded in catalogues, databases, bibliographies and the like under several search and filing criteria that are always formally identical. To this end, the various formal characteristics are grouped, according to fixed rules, into "descriptive elements", each of which forms a class of related characteristics that cannot sensibly be subdivided further (e.g. a date consisting of day, month and year cannot be subdivided further and therefore forms one descriptive element). All descriptive elements determined in the formal analysis of a document are recorded in a prescribed form and order. From the complete set of descriptive elements, only a certain subset is selected as filing or search criteria. The result of the formal analysis is commonly called the "catalogue record" ("Titelaufnahme"), "reference" or "citation".
    Footnote
    A general treatment, i.e. one not oriented exclusively towards library applications, of the formal recording and bibliographic description of documents of the most varied types.
    Source
    Grundlagen der praktischen Information und Dokumentation: ein Handbuch zur Einführung in die fachliche Informationsarbeit. 4. Aufl. Hrsg.: M. Buder u.a
  4. Krischker, U.: Formale Analyse (Erfassung) von Dokumenten (1990) 0.00
    0.0038814 = product of:
      0.029110499 = sum of:
        0.025163881 = weight(_text_:und in 1535) [ClassicSimilarity], result of:
          0.025163881 = score(doc=1535,freq=8.0), product of:
            0.06422601 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.028978055 = queryNorm
            0.39180204 = fieldWeight in 1535, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=1535)
        0.0039466172 = product of:
          0.0078932345 = sum of:
            0.0078932345 = weight(_text_:information in 1535) [ClassicSimilarity], result of:
              0.0078932345 = score(doc=1535,freq=2.0), product of:
                0.050870337 = queryWeight, product of:
                  1.7554779 = idf(docFreq=20772, maxDocs=44218)
                  0.028978055 = queryNorm
                0.1551638 = fieldWeight in 1535, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  1.7554779 = idf(docFreq=20772, maxDocs=44218)
                  0.0625 = fieldNorm(doc=1535)
          0.5 = coord(1/2)
      0.13333334 = coord(2/15)
    
    Abstract
    The formal analysis of documents is complex and many-layered. To describe it, it must be considered from several points of view: what purpose the formal analysis is meant to serve, who benefits from it and who carries it out, which methods are available, which technical aids can be employed, and which procedures exist when data processing systems are used.
    Footnote
    A general treatment, i.e. one not oriented exclusively towards library applications, of the formal recording and bibliographic description of documents of the most varied types.
    Source
    Grundlagen der praktischen Information und Dokumentation: ein Handbuch zur Einführung in die fachliche Informationsarbeit. 3. Aufl. Hrsg.: M. Buder u.a. Bd.1
  5. Ranta, J.A.: Queens Borough Public Library's Guidelines for cataloging community information (1996) 0.00
    0.0030503988 = product of:
      0.04575598 = sum of:
        0.04575598 = sum of:
          0.018273093 = weight(_text_:information in 6523) [ClassicSimilarity], result of:
            0.018273093 = score(doc=6523,freq=14.0), product of:
              0.050870337 = queryWeight, product of:
                1.7554779 = idf(docFreq=20772, maxDocs=44218)
                0.028978055 = queryNorm
              0.3592092 = fieldWeight in 6523, product of:
                3.7416575 = tf(freq=14.0), with freq of:
                  14.0 = termFreq=14.0
                1.7554779 = idf(docFreq=20772, maxDocs=44218)
                0.0546875 = fieldNorm(doc=6523)
          0.027482886 = weight(_text_:22 in 6523) [ClassicSimilarity], result of:
            0.027482886 = score(doc=6523,freq=2.0), product of:
              0.101476215 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.028978055 = queryNorm
              0.2708308 = fieldWeight in 6523, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0546875 = fieldNorm(doc=6523)
      0.06666667 = coord(1/15)
    
    Abstract
    Currently, few resources exist to guide libraries in the cataloguing of community information using the new USMARC Format for Community Information (1993). In developing a community information database, Queens Borough Public Library, New York City, formulated its own cataloguing procedures for applying AACR2, LoC File Interpretations, and the USMARC Format for Community Information to community information. Its practices include entering corporate names directly whenever possible and assigning LC subject headings for classes of persons and topics, adding neighbourhood-level geographic subdivisions. The guidelines were specially designed to aid non-cataloguers in cataloguing community information and have enabled the library to maintain consistency in handling corporate names and in assigning subject headings, while creating a database that is highly accessible to library staff and users
    Source
    Cataloging and classification quarterly. 22(1996) no.2, S.51-69
  6. Crook, M.: Barbara Tillett discusses cataloging rules and conceptual models (1996) 0.00
    0.0026296957 = product of:
      0.039445434 = sum of:
        0.039445434 = sum of:
          0.011962548 = weight(_text_:information in 7683) [ClassicSimilarity], result of:
            0.011962548 = score(doc=7683,freq=6.0), product of:
              0.050870337 = queryWeight, product of:
                1.7554779 = idf(docFreq=20772, maxDocs=44218)
                0.028978055 = queryNorm
              0.23515764 = fieldWeight in 7683, product of:
                2.4494898 = tf(freq=6.0), with freq of:
                  6.0 = termFreq=6.0
                1.7554779 = idf(docFreq=20772, maxDocs=44218)
                0.0546875 = fieldNorm(doc=7683)
          0.027482886 = weight(_text_:22 in 7683) [ClassicSimilarity], result of:
            0.027482886 = score(doc=7683,freq=2.0), product of:
              0.101476215 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.028978055 = queryNorm
              0.2708308 = fieldWeight in 7683, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0546875 = fieldNorm(doc=7683)
      0.06666667 = coord(1/15)
    
    Abstract
    The chief of the cataloguing policy and support office at the LoC presents her views on the usefulness of conceptual modelling in determining future directions for cataloguing and the MARC format. After describing the evolution of bibliographic processes, suggests using the entity-relationship conceptual model to step back from how we record information today and start thinking about what information really means and why we provide it. Argues that now is the time to reexamine the basic principles which underpin Anglo-American cataloguing codes and that MARC formats should be looked at to see how they can evolve towards a future, improved structure for communicating bibliographic and authority information
    Source
    OCLC newsletter. 1996, no.220, S.20-22
  7. Meßmer, G.: Brauchen wir noch Regelwerke und Datenformate? : Thesen zu einer Reform des Regelwerks für die Alphabetische Katalogisierung (1999) 0.00
    0.0017977946 = product of:
      0.026966918 = sum of:
        0.026966918 = weight(_text_:und in 2524) [ClassicSimilarity], result of:
          0.026966918 = score(doc=2524,freq=12.0), product of:
            0.06422601 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.028978055 = queryNorm
            0.41987535 = fieldWeight in 2524, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0546875 = fieldNorm(doc=2524)
      0.06666667 = coord(1/15)
    
    Abstract
    The provocative title was both the name and the programme of a well-attended panel discussion at the Bayerische Staatsbibliothek, prompted by the efforts to reform the cataloguing rules in Germany. With their contributions, the panel participants quickly made clear that the question could only have been meant rhetorically. The pressure for change, driven by technical progress on the one hand and cost pressure on the other, is undisputed. On the question of which changes are necessary, however, opinions diverged considerably. From the audience, which took a lively part in the discussion, came criticism of the authority files, especially the Personennamendatei (PND) and above all the Gemeinsame Körperschaftsdatei (GKD), as the most labour-intensive and expensive cataloguing instrument of all.
  8. Kartus, E.: Beyond MARC : is it really possible? (1995) 0.00
    3.7209064E-4 = product of:
      0.0055813594 = sum of:
        0.0055813594 = product of:
          0.011162719 = sum of:
            0.011162719 = weight(_text_:information in 5753) [ClassicSimilarity], result of:
              0.011162719 = score(doc=5753,freq=4.0), product of:
                0.050870337 = queryWeight, product of:
                  1.7554779 = idf(docFreq=20772, maxDocs=44218)
                  0.028978055 = queryNorm
                0.21943474 = fieldWeight in 5753, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  1.7554779 = idf(docFreq=20772, maxDocs=44218)
                  0.0625 = fieldNorm(doc=5753)
          0.5 = coord(1/2)
      0.06666667 = coord(1/15)
    
    Abstract
    In their attempts to make materials more accessible, librarians are making catalogue entries unnecessarily complicated. Discusses the current scenario where catalogues appear to contain much information that is irrelevant to users. Provides an example of a future scenario where one composite catalogue record replaces a number of unitary records. Asks why the information that publishers have in machine readable form cannot be used with current technology to help simplify the entry
  9. Leazer, G.H.: A conceptual schema for the control of bibliographic works (1994) 0.00
    2.848226E-4 = product of:
      0.004272339 = sum of:
        0.004272339 = product of:
          0.008544678 = sum of:
            0.008544678 = weight(_text_:information in 3033) [ClassicSimilarity], result of:
              0.008544678 = score(doc=3033,freq=6.0), product of:
                0.050870337 = queryWeight, product of:
                  1.7554779 = idf(docFreq=20772, maxDocs=44218)
                  0.028978055 = queryNorm
                0.16796975 = fieldWeight in 3033, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  1.7554779 = idf(docFreq=20772, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=3033)
          0.5 = coord(1/2)
      0.06666667 = coord(1/15)
    
    Abstract
    In this paper I describe a conceptual design of a bibliographic retrieval system that enables more thorough control of bibliographic entities. A bibliographic entity has 2 components: the intellectual work and the physical item. Users searching bibliographic retrieval systems generally do not search for a specific item, but are willing to retrieve one of several alternative manifestations of a work. However, contemporary bibliographic retrieval systems are based solely on the descriptions of items. Works are described only implicitly by collocating descriptions of items. This method has resulted in a tool that does not include important descriptive attributes of the work, e.g. information regarding its history, its genre, or its bibliographic relationships. A bibliographic relationship is an association between 2 bibliographic entities. A system evaluation methodology was used to create a conceptual schema for a bibliographic retrieval system. The model is based upon an analysis of data elements in the USMARC Formats for Bibliographic Data. The conceptual schema describes a database comprising 2 separate files of bibliographic descriptions, one of works and the other of items. Each file consists of individual descriptive surrogates of their respective entities. The specific data content of each file is defined by a data dictionary. Data elements used in the description of bibliographic works reflect the nature of works as intellectual and linguistic objects. The descriptive elements of bibliographic items describe the physical properties of bibliographic entities. Bibliographic relationships constitute the logical structure of the database
    Imprint
    Oxford : Learned Information
    Source
    Navigating the networks: Proceedings of the 1994 Mid-year Meeting of the American Society for Information Science, Portland, Oregon, May 21-25, 1994. Ed.: D.L. Andersen et al
  10. Wool, G.J.; Austhof, B.: Cataloguing standards and machine translation : a study of reformatted ISBD records in an online catalog (1993) 0.00
    2.6310782E-4 = product of:
      0.0039466172 = sum of:
        0.0039466172 = product of:
          0.0078932345 = sum of:
            0.0078932345 = weight(_text_:information in 7321) [ClassicSimilarity], result of:
              0.0078932345 = score(doc=7321,freq=2.0), product of:
                0.050870337 = queryWeight, product of:
                  1.7554779 = idf(docFreq=20772, maxDocs=44218)
                  0.028978055 = queryNorm
                0.1551638 = fieldWeight in 7321, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  1.7554779 = idf(docFreq=20772, maxDocs=44218)
                  0.0625 = fieldNorm(doc=7321)
          0.5 = coord(1/2)
      0.06666667 = coord(1/15)
    
    Source
    Information technology and libraries. 12(1993) no.4, S.383-403
  11. Witt, M.; Leresche, F.: IFLA study on functional requirements for bibliographic records : cataloguing practice in France (1995) 0.00
    2.6310782E-4 = product of:
      0.0039466172 = sum of:
        0.0039466172 = product of:
          0.0078932345 = sum of:
            0.0078932345 = weight(_text_:information in 3081) [ClassicSimilarity], result of:
              0.0078932345 = score(doc=3081,freq=2.0), product of:
                0.050870337 = queryWeight, product of:
                  1.7554779 = idf(docFreq=20772, maxDocs=44218)
                  0.028978055 = queryNorm
                0.1551638 = fieldWeight in 3081, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  1.7554779 = idf(docFreq=20772, maxDocs=44218)
                  0.0625 = fieldNorm(doc=3081)
          0.5 = coord(1/2)
      0.06666667 = coord(1/15)
    
    Abstract
    Discusses the French reaction. Covers the entities considered for cataloguing; elements for identifying a document; access points; and authority records. Considers whether it is possible to reduce redundancies among the elements contained in bibliographic records caused by overlapping between the ISBD description, the access points and the coded information; and whether OPACs can be developed to present clearly to users various entities from the most general level to the most specific level
  12. Heaney, M.: Object-oriented cataloging (1995) 0.00
    2.3021935E-4 = product of:
      0.00345329 = sum of:
        0.00345329 = product of:
          0.00690658 = sum of:
            0.00690658 = weight(_text_:information in 3339) [ClassicSimilarity], result of:
              0.00690658 = score(doc=3339,freq=2.0), product of:
                0.050870337 = queryWeight, product of:
                  1.7554779 = idf(docFreq=20772, maxDocs=44218)
                  0.028978055 = queryNorm
                0.13576832 = fieldWeight in 3339, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  1.7554779 = idf(docFreq=20772, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=3339)
          0.5 = coord(1/2)
      0.06666667 = coord(1/15)
    
    Source
    Information technology and libraries. 14(1995) no.3, S.135-153
  13. Fattahi, R.: A uniform approach to the indexing of cataloguing data in online library systems (1997) 0.00
    1.9733087E-4 = product of:
      0.002959963 = sum of:
        0.002959963 = product of:
          0.005919926 = sum of:
            0.005919926 = weight(_text_:information in 131) [ClassicSimilarity], result of:
              0.005919926 = score(doc=131,freq=2.0), product of:
                0.050870337 = queryWeight, product of:
                  1.7554779 = idf(docFreq=20772, maxDocs=44218)
                  0.028978055 = queryNorm
                0.116372846 = fieldWeight in 131, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  1.7554779 = idf(docFreq=20772, maxDocs=44218)
                  0.046875 = fieldNorm(doc=131)
          0.5 = coord(1/2)
      0.06666667 = coord(1/15)
    
    Abstract
    Argues that in library cataloguing, and for the optimal functionality of bibliographic records, the indexing of fields and subfields should follow a uniform approach. This would maintain effectiveness in searching, retrieval and display of bibliographic information, both within systems and between systems. However, a review of different postings to the AUTOCAT and USMARC discussion lists indicates that the indexing and tagging of cataloguing data do not, at present, follow a consistent approach in online library systems. If the rationale of cataloguing principles is to bring uniformity to bibliographic description and effectiveness to access, they should also address the question of uniform approaches to the indexing of cataloguing data. In this context, and in terms of the identification and handling of data elements, cataloguing standards (codes, MARC formats and the Z39.50 standard) should be brought closer together, in that they should provide guidelines for the designation of data elements for machine-readable records