Search (53 results, page 1 of 3)

  • language_ss:"e"
  • theme_ss:"Datenformate"
  • theme_ss:"Formalerschließung"
  1. Tennant, R.: ¬A bibliographic metadata infrastructure for the twenty-first century (2004) 0.04
    0.041927725 = product of:
      0.08385545 = sum of:
        0.08385545 = sum of:
          0.013257373 = weight(_text_:a in 2845) [ClassicSimilarity], result of:
            0.013257373 = score(doc=2845,freq=12.0), product of:
              0.053105544 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.046056706 = queryNorm
              0.24964198 = fieldWeight in 2845, product of:
                3.4641016 = tf(freq=12.0), with freq of:
                  12.0 = termFreq=12.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.0625 = fieldNorm(doc=2845)
          0.07059808 = weight(_text_:22 in 2845) [ClassicSimilarity], result of:
            0.07059808 = score(doc=2845,freq=4.0), product of:
              0.16128273 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046056706 = queryNorm
              0.4377287 = fieldWeight in 2845, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0625 = fieldNorm(doc=2845)
      0.5 = coord(1/2)
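     These explain trees are Lucene ClassicSimilarity breakdowns of each displayed relevance score. As a reading aid, the sketch below (an illustration, not the search engine's own code) recomputes the score of result 1; the document id 2845, the matched terms "a" and "22", and every numeric constant are taken directly from the explanation above.

        # Minimal sketch: reproduce the ClassicSimilarity score for doc 2845
        # from the values in the explain tree above (classic TF-IDF with coord).
        import math

        QUERY_NORM = 0.046056706
        COORD = 0.5  # the coord(1/2) factor shown in the explanation

        def term_score(freq, idf, field_norm):
            tf = math.sqrt(freq)             # tf(freq) = sqrt(termFreq)
            query_weight = idf * QUERY_NORM  # queryWeight = idf * queryNorm
            field_weight = tf * idf * field_norm
            return query_weight * field_weight

        score_a = term_score(freq=12.0, idf=1.153047, field_norm=0.0625)    # ~0.0132574
        score_22 = term_score(freq=4.0, idf=3.5018296, field_norm=0.0625)   # ~0.0705981
        print(COORD * (score_a + score_22))  # ~0.041928, matching the displayed 0.041927725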
    
    Abstract
    The current library bibliographic infrastructure was constructed in the early days of computers - before the Web, XML, and a variety of other technological advances that now offer new opportunities. General requirements of a modern metadata infrastructure for libraries are identified, including such qualities as versatility, extensibility, granularity, and openness. A new kind of metadata infrastructure is then proposed that exhibits at least some of those qualities. Some key challenges that must be overcome to implement a change of this magnitude are identified.
    Date
    9.12.2005 19:22:38
    Source
    Library hi tech. 22(2004) no.2, S.175-181
    Type
    a
  2. Coyle, K.: Future considerations : the functional library systems record (2004) 0.03
    0.031588875 = product of:
      0.06317775 = sum of:
        0.06317775 = sum of:
          0.013257373 = weight(_text_:a in 562) [ClassicSimilarity], result of:
            0.013257373 = score(doc=562,freq=12.0), product of:
              0.053105544 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.046056706 = queryNorm
              0.24964198 = fieldWeight in 562, product of:
                3.4641016 = tf(freq=12.0), with freq of:
                  12.0 = termFreq=12.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.0625 = fieldNorm(doc=562)
          0.04992038 = weight(_text_:22 in 562) [ClassicSimilarity], result of:
            0.04992038 = score(doc=562,freq=2.0), product of:
              0.16128273 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046056706 = queryNorm
              0.30952093 = fieldWeight in 562, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0625 = fieldNorm(doc=562)
      0.5 = coord(1/2)
    
    Abstract
     The paper performs a thought experiment on the concept of a record, based on the Functional Requirements for Bibliographic Records and on library system functions, and concludes that if we want to develop a functional bibliographic record we need to do it within the context of a flexible, functional library systems record structure. The article suggests a new way to look at the library systems record that would allow libraries to move forward not only in terms of technology but also in terms of serving library users.
    Source
    Library hi tech. 22(2004) no.2, S.166-174
    Type
    a
  3. Yee, M.M.: New perspectives on the shared cataloging environment and a MARC 21 shopping list (2004) 0.03
    0.028787265 = product of:
      0.05757453 = sum of:
        0.05757453 = sum of:
          0.007654148 = weight(_text_:a in 132) [ClassicSimilarity], result of:
            0.007654148 = score(doc=132,freq=4.0), product of:
              0.053105544 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.046056706 = queryNorm
              0.14413087 = fieldWeight in 132, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.0625 = fieldNorm(doc=132)
          0.04992038 = weight(_text_:22 in 132) [ClassicSimilarity], result of:
            0.04992038 = score(doc=132,freq=2.0), product of:
              0.16128273 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046056706 = queryNorm
              0.30952093 = fieldWeight in 132, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0625 = fieldNorm(doc=132)
      0.5 = coord(1/2)
    
    Date
    10. 9.2000 17:38:22
    Type
    a
  4. Ranta, J.A.: Queens Borough Public Library's Guidelines for cataloging community information (1996) 0.03
    0.025188856 = product of:
      0.05037771 = sum of:
        0.05037771 = sum of:
          0.00669738 = weight(_text_:a in 6523) [ClassicSimilarity], result of:
            0.00669738 = score(doc=6523,freq=4.0), product of:
              0.053105544 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.046056706 = queryNorm
              0.12611452 = fieldWeight in 6523, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.0546875 = fieldNorm(doc=6523)
          0.043680333 = weight(_text_:22 in 6523) [ClassicSimilarity], result of:
            0.043680333 = score(doc=6523,freq=2.0), product of:
              0.16128273 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046056706 = queryNorm
              0.2708308 = fieldWeight in 6523, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0546875 = fieldNorm(doc=6523)
      0.5 = coord(1/2)
    
    Abstract
     Currently, few resources exist to guide libraries in the cataloguing of community information using the new USMARC Format for Community Information (1993). In developing a community information database, Queens Borough Public Library, New York City, formulated their own cataloguing procedures for applying AACR2, LoC File Interpretations, and the USMARC Format for Community Information to community information. Their practices include entering corporate names directly whenever possible and assigning LC subject headings for classes of persons and topics, adding neighbourhood-level geographic subdivisions. The guidelines were specially designed to aid non-cataloguers in cataloguing community information and have enabled the library to maintain consistency in handling corporate names and in assigning subject headings, while creating a database that is highly accessible to library staff and users.
    Source
    Cataloging and classification quarterly. 22(1996) no.2, S.51-69
    Type
    a
  5. Crook, M.: Barbara Tillett discusses cataloging rules and conceptual models (1996) 0.03
    0.025188856 = product of:
      0.05037771 = sum of:
        0.05037771 = sum of:
          0.00669738 = weight(_text_:a in 7683) [ClassicSimilarity], result of:
            0.00669738 = score(doc=7683,freq=4.0), product of:
              0.053105544 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.046056706 = queryNorm
              0.12611452 = fieldWeight in 7683, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.0546875 = fieldNorm(doc=7683)
          0.043680333 = weight(_text_:22 in 7683) [ClassicSimilarity], result of:
            0.043680333 = score(doc=7683,freq=2.0), product of:
              0.16128273 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046056706 = queryNorm
              0.2708308 = fieldWeight in 7683, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0546875 = fieldNorm(doc=7683)
      0.5 = coord(1/2)
    
    Abstract
     The chief of the cataloguing policy and support office at the LoC presents her views on the usefulness of conceptual modelling in determining future directions for cataloguing and the MARC format. After describing the evolution of bibliographic processes, suggests using the entity-relationship conceptual model to step back from how we record information today and start thinking about what information really means and why we provide it. Argues that now is the time to reexamine the basic principles which underpin Anglo-American cataloguing codes and that the MARC formats should be looked at to see how they can evolve towards a future, improved structure for communicating bibliographic and authority information.
    Source
    OCLC newsletter. 1996, no.220, S.20-22
    Type
    a
  6. Lee, S.; Jacob, E.K.: ¬An integrated approach to metadata interoperability : construction of a conceptual structure between MARC and FRBR (2011) 0.02
    0.023691658 = product of:
      0.047383316 = sum of:
        0.047383316 = sum of:
          0.00994303 = weight(_text_:a in 302) [ClassicSimilarity], result of:
            0.00994303 = score(doc=302,freq=12.0), product of:
              0.053105544 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.046056706 = queryNorm
              0.18723148 = fieldWeight in 302, product of:
                3.4641016 = tf(freq=12.0), with freq of:
                  12.0 = termFreq=12.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.046875 = fieldNorm(doc=302)
          0.037440285 = weight(_text_:22 in 302) [ClassicSimilarity], result of:
            0.037440285 = score(doc=302,freq=2.0), product of:
              0.16128273 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046056706 = queryNorm
              0.23214069 = fieldWeight in 302, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046875 = fieldNorm(doc=302)
      0.5 = coord(1/2)
    
    Abstract
    Machine-Readable Cataloging (MARC) is currently the most broadly used bibliographic standard for encoding and exchanging bibliographic data. However, MARC may not fully support representation of the dynamic nature and semantics of digital resources because of its rigid and single-layered linear structure. The Functional Requirements for Bibliographic Records (FRBR) model, which is designed to overcome the problems of MARC, does not provide sufficient data elements and adopts a predetermined hierarchy. A flexible structure for bibliographic data with detailed data elements is needed. Integrating MARC format with the hierarchical structure of FRBR is one approach to meet this need. The purpose of this research is to propose an approach that can facilitate interoperability between MARC and FRBR by providing a conceptual structure that can function as a mediator between MARC data elements and FRBR attributes.
    Date
    10. 9.2000 17:38:22
    Type
    a
  7. Riva, P.: Mapping MARC 21 linking entry fields to FRBR and Tillett's taxonomy of bibliographic relationships (2004) 0.02
    0.022235535 = product of:
      0.04447107 = sum of:
        0.04447107 = sum of:
          0.007030784 = weight(_text_:a in 136) [ClassicSimilarity], result of:
            0.007030784 = score(doc=136,freq=6.0), product of:
              0.053105544 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.046056706 = queryNorm
              0.13239266 = fieldWeight in 136, product of:
                2.4494898 = tf(freq=6.0), with freq of:
                  6.0 = termFreq=6.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.046875 = fieldNorm(doc=136)
          0.037440285 = weight(_text_:22 in 136) [ClassicSimilarity], result of:
            0.037440285 = score(doc=136,freq=2.0), product of:
              0.16128273 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046056706 = queryNorm
              0.23214069 = fieldWeight in 136, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046875 = fieldNorm(doc=136)
      0.5 = coord(1/2)
    
    Abstract
    Bibliographic relationships have taken on even greater importance in the context of ongoing efforts to integrate concepts from the Functional Requirements for Bibliographic Records (FRBR) into cataloging codes and database structures. In MARC 21, the linking entry fields are a major mechanism for expressing relationships between bibliographic records. Taxonomies of bibliographic relationships have been proposed by Tillett, with an extension by Smiraglia, and in FRBR itself. The present exercise is to provide a detailed bidirectional mapping of the MARC 21 linking fields to these two schemes. The correspondence of the Tillett taxonomic divisions to the MARC categorization of the linking fields as chronological, horizontal, or vertical is examined as well. Application of the findings to MARC format development and system functionality is discussed.
    Date
    10. 9.2000 17:38:22
    Type
    a
  8. Wisser, K.M.; O'Brien Roper, J.: Maximizing metadata : exploring the EAD-MARC relationship (2003) 0.02
    0.020383961 = product of:
      0.040767923 = sum of:
        0.040767923 = sum of:
          0.009567685 = weight(_text_:a in 154) [ClassicSimilarity], result of:
            0.009567685 = score(doc=154,freq=16.0), product of:
              0.053105544 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.046056706 = queryNorm
              0.18016359 = fieldWeight in 154, product of:
                4.0 = tf(freq=16.0), with freq of:
                  16.0 = termFreq=16.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.0390625 = fieldNorm(doc=154)
          0.03120024 = weight(_text_:22 in 154) [ClassicSimilarity], result of:
            0.03120024 = score(doc=154,freq=2.0), product of:
              0.16128273 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046056706 = queryNorm
              0.19345059 = fieldWeight in 154, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0390625 = fieldNorm(doc=154)
      0.5 = coord(1/2)
    
    Abstract
     Encoded Archival Description (EAD) has provided a new way to approach manuscript and archival collection representation. A review of previous representational practices and problems highlights the benefits of using EAD. This new approach should be considered a partner rather than an adversary in the access-providing process. Technological capabilities now allow for multiple metadata schemas to be employed in the creation of the finding aid. Crosswalks allow MARC records to be generated from the detailed encoding of an EAD finding aid. In the process of creating these crosswalks and detailed encoding, EAD has generated more changes in traditional processes and procedures than originally imagined. The North Carolina State University (NCSU) Libraries sought to test the process of crosswalking EAD to MARC, investigating how this process used technology as well as changed physical procedures. By creating a complex and in-depth EAD template for finding aids, with accompanying related encoding analogs embedded within the element structure, MARC records were generated that required minor editing and revision for inclusion in the NCSU Libraries OPAC. The creation of this bridge between EAD and MARC has stimulated theoretical discussions about the role of collaboration, technology, and expertise in the ongoing struggle to maximize access to our collections. While this study is only a first attempt at harnessing this potential, a presentation of the tensions, struggles, and successes provides illumination of some of the larger issues facing special collections today.
    Date
    10. 9.2000 17:38:22
    Type
    a
  9. Leazer, G.H.: ¬A conceptual schema for the control of bibliographic works (1994) 0.00
    0.0031642143 = product of:
      0.0063284286 = sum of:
        0.0063284286 = product of:
          0.012656857 = sum of:
            0.012656857 = weight(_text_:a in 3033) [ClassicSimilarity], result of:
              0.012656857 = score(doc=3033,freq=28.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.23833402 = fieldWeight in 3033, product of:
                  5.2915025 = tf(freq=28.0), with freq of:
                    28.0 = termFreq=28.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=3033)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
     In this paper I describe a conceptual design of a bibliographic retrieval system that enables more thorough control of bibliographic entities. A bibliographic entity has 2 components: the intellectual work and the physical item. Users searching bibliographic retrieval systems generally do not search for a specific item, but are willing to retrieve one of several alternative manifestations of a work. However, contemporary bibliographic retrieval systems are based solely on the descriptions of items. Works are described only implicitly by collocating descriptions of items. This method has resulted in a tool that does not include important descriptive attributes of the work, e.g. information regarding its history, its genre, or its bibliographic relationships. A bibliographic relationship is an association between 2 bibliographic entities. A system evaluation methodology was used to create a conceptual schema for a bibliographic retrieval system. The model is based upon an analysis of data elements in the USMARC Formats for Bibliographic Data. The conceptual schema describes a database comprising 2 separate files of bibliographic descriptions, one of works and the other of items. Each file consists of individual descriptive surrogates of their respective entities. The specific data content of each file is defined by a data dictionary. Data elements used in the description of bibliographic works reflect the nature of works as intellectual and linguistic objects. The descriptive elements of bibliographic items describe the physical properties of bibliographic entities. Bibliographic relationships constitute the logical structure of the database.
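     A compact way to picture the two-file design described above is as two record types linked by an explicit relationship type. The sketch below is only an illustration of that structure; the attribute names are invented for the example and are not taken from the paper's data dictionary.

        # Illustration only: a minimal structure for the work/item split and the
        # bibliographic relationships described in the abstract. Attribute names
        # are invented; the paper defines its own data dictionary.
        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class Work:                # the intellectual/linguistic entity
            work_id: str
            title: str
            genre: Optional[str] = None
            history_note: Optional[str] = None

        @dataclass
        class Item:                # a physical manifestation of a work
            item_id: str
            work_id: str           # collocates the item under its work
            carrier: str           # e.g. "print", "microform"

        @dataclass
        class Relationship:        # an association between two bibliographic entities
            source_id: str
            target_id: str
            kind: str              # e.g. "translation-of", "edition-of"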
    Type
    a
  10. Stephens, O.: Introduction to OpenRefine (2014) 0.00
    0.0030444188 = product of:
      0.0060888375 = sum of:
        0.0060888375 = product of:
          0.012177675 = sum of:
            0.012177675 = weight(_text_:a in 2884) [ClassicSimilarity], result of:
              0.012177675 = score(doc=2884,freq=18.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.22931081 = fieldWeight in 2884, product of:
                  4.2426405 = tf(freq=18.0), with freq of:
                    18.0 = termFreq=18.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2884)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
     OpenRefine is described as a tool for working with 'messy' data - but what does this mean? It is probably easiest to describe the kinds of data OpenRefine is good at working with and the sorts of problems it can help you solve. OpenRefine is most useful where you have data in a simple tabular format but with internal inconsistencies either in data formats, or where data appears, or in terminology used. It can help you:
     • Get an overview of a data set
     • Resolve inconsistencies in a data set
     • Help you split data up into more granular parts
     • Match local data up to other data sets
     • Enhance a data set with data from other sources
     Some common scenarios might be:
     1. Where you want to know how many times a particular value appears in a column in your data.
     2. Where you want to know how values are distributed across your whole data set.
     3. Where you have a list of dates which are formatted in different ways, and want to change all the dates in the list to a single common date format.
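     As a rough analogy (not OpenRefine itself), the pandas sketch below works through the first and third scenarios above on a tiny invented table: counting how often each value appears in a column, and rewriting dates held in mixed formats into one common format.

        # Rough pandas analogue of two of the OpenRefine scenarios above.
        # The table and its column names are invented for illustration;
        # OpenRefine performs these steps interactively rather than in code.
        import pandas as pd

        df = pd.DataFrame({
            "publisher": ["ALA", "A.L.A.", "ALA", "Springer"],
            "date": ["2004-05-01", "01/05/2004", "May 1, 2004", "2004"],
        })

        # Scenario 1: how many times each value appears in a column.
        print(df["publisher"].value_counts())

        # Scenario 3: parse each differently formatted date and write it
        # back out in a single common format.
        df["date_normalised"] = df["date"].map(
            lambda s: pd.to_datetime(s).strftime("%Y-%m-%d")
        )
        print(df[["date", "date_normalised"]])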
  11. Nichols introduces MARCit (1998) 0.00
    0.0028703054 = product of:
      0.005740611 = sum of:
        0.005740611 = product of:
          0.011481222 = sum of:
            0.011481222 = weight(_text_:a in 1438) [ClassicSimilarity], result of:
              0.011481222 = score(doc=1438,freq=4.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.2161963 = fieldWeight in 1438, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.09375 = fieldNorm(doc=1438)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Reports the release of MARCit, a software package that enables the cataloguing of Internet resources into MARC format bibliographic records
    Type
    a
  12. Wool, G.J.; Austhof, B.: Cataloguing standards and machine translation : a study of reformatted ISBD records in an online catalog (1993) 0.00
    0.00270615 = product of:
      0.0054123 = sum of:
        0.0054123 = product of:
          0.0108246 = sum of:
            0.0108246 = weight(_text_:a in 7321) [ClassicSimilarity], result of:
              0.0108246 = score(doc=7321,freq=8.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.20383182 = fieldWeight in 7321, product of:
                  2.828427 = tf(freq=8.0), with freq of:
                    8.0 = termFreq=8.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0625 = fieldNorm(doc=7321)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
     Labelled bibliographic display screens in online catalogues can repackage records created for card catalogues in ways that restructure the records, redefine data categories and contexts, and add or omit selected categories of data. Reports on a study of the impact of automated display on catalogue records in a medium-sized research library by comparing the card and online versions of 1,005 records created according to the ISBD conventions.
    Type
    a
  13. Eliot, J.: MARC and OPAC systems : discussion document (1994) 0.00
    0.00270615 = product of:
      0.0054123 = sum of:
        0.0054123 = product of:
          0.0108246 = sum of:
            0.0108246 = weight(_text_:a in 10) [ClassicSimilarity], result of:
              0.0108246 = score(doc=10,freq=8.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.20383182 = fieldWeight in 10, product of:
                  2.828427 = tf(freq=8.0), with freq of:
                    8.0 = termFreq=8.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0625 = fieldNorm(doc=10)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
     A discussion document produced following a meeting of the Users of Book Industry Standards (UBIS) Bibliographic Standards Working Group at the University of London, as part of a project to consider the Survey on the use of UK-MARC by Russell Sweeney, published in 1991 by the British Library National Bibliographic Service. Considers the suitability, or otherwise, of the UKMARC format for use in OPACs. Summarizes the issues involved, discussing: the UKMARC exchange format, tagging and coding structure (record complexity, analytical entries, non-filing indicators), data content (statements of responsibility, main versus added entry) and records standards.
    Type
    a
  14. Parent, I.: IFLA study on functional requirements for bibliographic records : an Anglo-American perspective (1995) 0.00
    0.00270615 = product of:
      0.0054123 = sum of:
        0.0054123 = product of:
          0.0108246 = sum of:
            0.0108246 = weight(_text_:a in 3080) [ClassicSimilarity], result of:
              0.0108246 = score(doc=3080,freq=8.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.20383182 = fieldWeight in 3080, product of:
                  2.828427 = tf(freq=8.0), with freq of:
                    8.0 = termFreq=8.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0625 = fieldNorm(doc=3080)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
     Presents a view of the work of the Functional Requirements for Bibliographic Records Study Group from the perspective of the Anglo-American cataloguing tradition. The study is examining the fundamental aspects of record design, using the entity-attribute-relationship model to link data elements to the functions that a user can perform while accessing a bibliographic record. The data and functions are being linked by UNIMARC fields.
    Type
    a
  15. ¬The core bibliographic record for music and sound recordings (1998) 0.00
    0.00270615 = product of:
      0.0054123 = sum of:
        0.0054123 = product of:
          0.0108246 = sum of:
            0.0108246 = weight(_text_:a in 3801) [ClassicSimilarity], result of:
              0.0108246 = score(doc=3801,freq=8.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.20383182 = fieldWeight in 3801, product of:
                  2.828427 = tf(freq=8.0), with freq of:
                    8.0 = termFreq=8.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0625 = fieldNorm(doc=3801)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Describes the background to the creation of a core bibliographic record for music and sound recordings, provides a definition of a core bibliographic record and presents the core record for printed and manuscript music and the core record for sound recordings which were prepared by the International Association of Music Libraries, Archives and Documentation Centres Working Group in Perugia, 1-6 Sep 1996
    Type
    a
  16. Miller, E.; Ogbuji, U.: Linked data design for the visible library (2015) 0.00
    0.0026849252 = product of:
      0.0053698504 = sum of:
        0.0053698504 = product of:
          0.010739701 = sum of:
            0.010739701 = weight(_text_:a in 2773) [ClassicSimilarity], result of:
              0.010739701 = score(doc=2773,freq=14.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.20223314 = fieldWeight in 2773, product of:
                  3.7416575 = tf(freq=14.0), with freq of:
                    14.0 = termFreq=14.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2773)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    In response to libraries' frustration over their rich resources being invisible on the web, Zepheira, at the request of the Library of Congress, created BIBFRAME, a bibliographic metadata framework for cataloging. The model replaces MARC records with linked data, promoting resource visibility through a rich network of links. In place of formal taxonomies, a small but extensible vocabulary streamlines metadata efforts. Rather than using a unique bibliographic record to describe one item, BIBFRAME draws on the Dublin Core and the Functional Requirements for Bibliographic Records (FRBR) to generate formalized descriptions of Work, Instance, Authority and Annotation as well as associations between items. Zepheira trains librarians to transform MARC records to BIBFRAME resources and adapt the vocabulary for specialized needs, while subject matter experts and technical experts manage content, site design and usability. With a different approach toward data modeling and metadata, previously invisible resources gain visibility through linking.
    Footnote
    Contribution to a special section "Linked data and the charm of weak semantics".
    Type
    a
  17. Chandrakar, R.: Mapping CCF to MARC21 : an experimental approach (2001) 0.00
    0.0024857575 = product of:
      0.004971515 = sum of:
        0.004971515 = product of:
          0.00994303 = sum of:
            0.00994303 = weight(_text_:a in 5437) [ClassicSimilarity], result of:
              0.00994303 = score(doc=5437,freq=12.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.18723148 = fieldWeight in 5437, product of:
                  3.4641016 = tf(freq=12.0), with freq of:
                    12.0 = termFreq=12.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046875 = fieldNorm(doc=5437)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
     The purpose of this article is to raise and address a number of issues pertaining to the conversion of the Common Communication Format (CCF) into MARC21. In this era of global resource sharing, the exchange of bibliographic records from one system to another is imperative for library communities. Instead of a single standard being used to create machine-readable catalogue records, more than 20 standards have emerged and are in use at different institutions. Because of these variations in standards, sharing resources and transferring data from one system to another, locally and globally, has become a significant problem. Addressing this problem requires keeping in mind that countries such as India and others in southeast Asia use the CCF as the standard for creating bibliographic cataloguing records. This paper describes a way to map bibliographic catalogue records from CCF to MARC21, although 100% mapping is not possible. In addition, the paper describes an experimental approach that enumerates the problems that may occur during the mapping and exchange of records and how these problems can be overcome.
    Type
    a
  18. Fattahi, R.: Anglo American Cataloguing Rules in an online environment : a literature review (1995) 0.00
    0.0024857575 = product of:
      0.004971515 = sum of:
        0.004971515 = product of:
          0.00994303 = sum of:
            0.00994303 = weight(_text_:a in 596) [ClassicSimilarity], result of:
              0.00994303 = score(doc=596,freq=12.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.18723148 = fieldWeight in 596, product of:
                  3.4641016 = tf(freq=12.0), with freq of:
                    12.0 = termFreq=12.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046875 = fieldNorm(doc=596)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
     As a standard set of rules, AACR2 has received much attention in the literature of descriptive cataloguing. Considers that despite this extensive literature, an important aspect of the code, namely its relevance to the online environment, has not received much attention, particularly in terms of empirical research. Notes, however, that there is a general criticism that AACR2, being based on manual systems, does not correspond effectively to the online environment. From a review of the literature, concludes that while the advent of online catalogues has changed both the internal structure and external appearance of library catalogues, a majority of writers consider that radical changes in the code are impossible and undesirable in the near future, owing to various factors such as the belief that the MARC format is not conducive to radical change and the large size of existing catalogues created according to the current rules.
    Type
    a
  19. Spangen, I.C.: IFLA study on functional requirements for bibliographic records : Nordic and German reactions to the functional requirements study (1995) 0.00
    0.0023919214 = product of:
      0.0047838427 = sum of:
        0.0047838427 = product of:
          0.009567685 = sum of:
            0.009567685 = weight(_text_:a in 3083) [ClassicSimilarity], result of:
              0.009567685 = score(doc=3083,freq=4.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.18016359 = fieldWeight in 3083, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.078125 = fieldNorm(doc=3083)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Presents the German and Nordic reactions. Provides a background to the Northern European cataloguing tradition. Offers views from Denmark, Sweden and Norway
    Type
    a
  20. Parker, V.: MARC tags for cataloging cartographic materials (1999) 0.00
    0.0023919214 = product of:
      0.0047838427 = sum of:
        0.0047838427 = product of:
          0.009567685 = sum of:
            0.009567685 = weight(_text_:a in 5317) [ClassicSimilarity], result of:
              0.009567685 = score(doc=5317,freq=4.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.18016359 = fieldWeight in 5317, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.078125 = fieldNorm(doc=5317)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    This is a table of those MARC fields most frequently used when cataloging cartographic materials. The table gives fields both for monographs and for serials.
    Type
    a