Search (39 results, page 1 of 2)

  • language_ss:"e"
  • theme_ss:"Datenformate"
  • theme_ss:"Formalerschließung"
  1. Lee, S.; Jacob, E.K.: An integrated approach to metadata interoperability : construction of a conceptual structure between MARC and FRBR (2011) 0.04
    Abstract
    Machine-Readable Cataloging (MARC) is currently the most broadly used bibliographic standard for encoding and exchanging bibliographic data. However, MARC may not fully support representation of the dynamic nature and semantics of digital resources because of its rigid and single-layered linear structure. The Functional Requirements for Bibliographic Records (FRBR) model, which is designed to overcome the problems of MARC, does not provide sufficient data elements and adopts a predetermined hierarchy. A flexible structure for bibliographic data with detailed data elements is needed. Integrating MARC format with the hierarchical structure of FRBR is one approach to meet this need. The purpose of this research is to propose an approach that can facilitate interoperability between MARC and FRBR by providing a conceptual structure that can function as a mediator between MARC data elements and FRBR attributes.
    Date
    10. 9.2000 17:38:22
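The mediating conceptual structure this abstract describes is not reproduced in the search result, but the general idea of bridging MARC data elements to FRBR attributes can be sketched as a lookup table. The field/attribute pairings below are illustrative assumptions, not the mapping actually proposed by Lee and Jacob:

```python
# Illustrative sketch only: routing flat MARC data elements to FRBR entity
# attributes via a lookup table. The field/attribute pairings are
# hypothetical examples, not the conceptual structure from the paper.

# MARC field/subfield -> (FRBR entity, attribute)
MARC_TO_FRBR = {
    "245$a": ("Work", "title"),            # title proper
    "100$a": ("Work", "creator"),          # main entry, personal name
    "250$a": ("Expression", "edition"),    # edition statement
    "300$a": ("Manifestation", "extent"),  # physical description
}

def frbrize(marc_elements):
    """Group flat MARC element values under FRBR entities; skip unmapped fields."""
    entities = {}
    for element, value in marc_elements.items():
        if element in MARC_TO_FRBR:
            entity, attribute = MARC_TO_FRBR[element]
            entities.setdefault(entity, {})[attribute] = value
    return entities
```

A flat record such as `{"245$a": "...", "300$a": "..."}` comes back grouped under `Work` and `Manifestation`; a real mediator would also need indicator, subfield, and repeatability rules.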
  2. Zapounidou, S.; Sfakakis, M.; Papatheodorou, C.: Library data integration : towards BIBFRAME mapping to EDM (2014) 0.04
    Abstract
    Integration of library data into the Linked Data environment is a key issue in libraries and is approached on the basis of interoperability between library data conceptual models. Achieving interoperability between different representations of the same or related entities in the library and cultural heritage domains would enhance the reusability of rich bibliographic data and support the development of new data-driven information services. This paper aims to contribute to the desired interoperability by attempting to map core semantic paths between the BIBFRAME and EDM conceptual models. BIBFRAME is developed by the Library of Congress to support transformation of legacy library data in MARC format into linked data. EDM is the model developed for and used in the Europeana cultural heritage aggregation portal.
    Source
    Metadata and semantics research: 8th Research Conference, MTSR 2014, Karlsruhe, Germany, November 27-29, 2014, Proceedings. Eds.: S. Closs et al
  3. Samples, J.; Bigelow, I.: MARC to BIBFRAME : converting the PCC to Linked Data (2020) 0.03
    Abstract
    The Program for Cooperative Cataloging (PCC) has formal relationships with the Library of Congress (LC), Share-VDE, and Linked Data for Production Phase 2 (LD4P2) for work on Bibliographic Framework (BIBFRAME), and PCC institutions have been very active in the exploration of MARC to BIBFRAME conversion processes. This article will review the involvement of PCC in the development of BIBFRAME and examine the work of LC, Share-VDE, and LD4P2 on MARC to BIBFRAME conversion. It will conclude with a discussion of areas for further exploration by the PCC leading up to the creation of PCC conversion specifications and PCC BIBFRAME data.
    Footnote
    Contribution to a thematic issue: 'Program for Cooperative Cataloging (PCC): 25 Years Strong and Growing!'.
    Source
    Cataloging and classification quarterly. 58(2020) no.3/4, S.403-417
  4. Maxwell, R.L.: Bibliographic control (2009) 0.02
    Abstract
    Bibliographic control is the process of creation, exchange, preservation, and use of data about information resources. Formal bibliographic control has been practiced for millennia, but modern techniques began to be developed and implemented in the nineteenth and twentieth centuries. A series of cataloging codes characterized this period. These codes governed the creation of library catalogs, first in book form, then on cards, and finally in electronic formats, including MAchine-Readable Cataloging (MARC). The period was also characterized by the rise of shared cataloging programs, allowing the development of resource-saving copy cataloging procedures. Such programs were assisted by the development of cataloging networks such as OCLC and RLG. The twentieth century saw progress in the theory of bibliographic control, including the 1961 Paris Principles, culminating with the early twenty-first century Statement of International Cataloguing Principles and IFLA's Functional Requirements for Bibliographic Records (FRBR). Toward the end of the period bibliographic control began to be applied to newly invented electronic media, as "metadata." Trends point toward continued development of collaborative and international approaches to bibliographic control.
  5. Yee, M.M.: New perspectives on the shared cataloging environment and a MARC 21 shopping list (2004) 0.02
    Abstract
    This paper surveys the cataloging literature to collect problems that have been identified with the MARC 21 format. The problems are sorted into (1) problems that are not the fault of MARC 21; (2) problems that perhaps are not problems at all; (3) problems that are connected with the current shared cataloging environment; and (4) other problems with MARC 21 and vendor implementation of it. The author makes recommendations to deal with the true MARC 21 problems that remain after this analysis.
    Date
    10. 9.2000 17:38:22
  6. Chapman, L.: How to catalogue : a practical manual using AACR2 and Library of Congress (1990) 0.02
    Abstract
    A practical manual describing standard procedures in cataloguing using AACR2 1988 revision and LoC cataloguing data
    LCSH
    Cataloging
    Subject
    Cataloging
  7. Fiander, D. J.: Applying XML to the bibliographic description (2001) 0.02
    Abstract
    Over the past few years there has been a significant amount of work in the area of cataloging Internet resources, primarily using new metadata standards like the Dublin Core, but there has been little work on applying new data description formats like SGML and XML to traditional cataloging practices. What little work has been done in the area of using SGML and XML for traditional bibliographic description has primarily been based on the concept of converting MARC tagging into XML tagging. I suggest that, rather than attempting to convert existing MARC tagging into a new syntax based on SGML or XML, a more fruitful approach is to return to the cataloging standards and describe their inherent structure, learning from how MARC has been used successfully in modern OPACs while attempting to avoid MARC's rigid field-based restrictions.
    Source
    Cataloging and classification quarterly. 33(2001) no.2, S.17-28
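Fiander's suggestion, modelling the inherent structure of the cataloging standards in XML rather than re-tagging MARC field by field, can be illustrated with a minimal sketch. The element names here are invented for illustration and are not drawn from any published schema:

```python
import xml.etree.ElementTree as ET

# Minimal, structure-oriented XML description (hypothetical element names,
# not a published schema), as opposed to a field-by-field MARC conversion.
desc = ET.Element("bibliographicDescription")
title = ET.SubElement(desc, "titleStatement")
ET.SubElement(title, "titleProper").text = "Applying XML to the bibliographic description"
ET.SubElement(title, "statementOfResponsibility").text = "D.J. Fiander"
ET.SubElement(desc, "publication", year="2001")

xml_text = ET.tostring(desc, encoding="unicode")
```

The nesting (a title statement containing both the title proper and the statement of responsibility) carries the structure that flat MARC tagging leaves implicit.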
  8. Kushwoh, S.S.; Gautam, J.N.; Singh, R.: Migration from CDS/ISIS to KOHA : a case study of data conversion from CCF to MARC 21 (2009) 0.02
    Abstract
    Standards are important for quality and interoperability in any system. Bibliographic record creation standards such as MARC 21 (Machine-Readable Cataloging), CCF (Common Communication Format), UNIMARC (Universal MARC) and their local variations are in practice all across the library community. ILMSs (Integrated Library Management Systems) use these standards for the design of databases and the creation of bibliographic records. Their use is important for uniformity of the system and bibliographic data, but problems arise when a library wants to switch over from one system to another that uses a different standard. This paper discusses migration from one record standard to another, the mapping of data, and related issues. Data exported from CDS/ISIS CCF-based records to KOHA MARC 21-based records is discussed as a case study. This methodology, with few modifications, can be applied to the migration of data in other bibliographic formats too. Freeware tools can be utilized for migration.
    Source
    International cataloging & bibliographic control. 38(2009) no.1, S.6-12
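The kind of tag-level mapping the case study describes can be sketched as follows; the CCF/MARC 21 tag pairings are placeholders, since a real crosswalk needs indicator and subfield rules and, as the authors note, cannot be 100% complete:

```python
# Sketch of a tag-level CCF -> MARC 21 crosswalk for record migration.
# Tag pairings are placeholders, not an authoritative mapping.
CCF_TO_MARC21 = {
    "200": "245",  # title (hypothetical pairing)
    "300": "100",  # personal name (hypothetical pairing)
}

def convert(ccf_record):
    """Convert what can be mapped; collect unmapped tags for manual review."""
    converted, unmapped = {}, []
    for tag, value in ccf_record.items():
        if tag in CCF_TO_MARC21:
            converted[CCF_TO_MARC21[tag]] = value
        else:
            unmapped.append(tag)
    return converted, unmapped
```

The unmapped list is the practical acknowledgement that 100% mapping is not possible: those fields need local decisions rather than automatic conversion.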
  9. Xu, A.; Hess, K.; Akerman, L.: From MARC to BIBFRAME 2.0 : Crosswalks (2018) 0.02
    Abstract
    One of the big challenges facing academic libraries today is to increase their relevance to their user communities. If libraries can increase the visibility of their resources on the open web, they improve their chances of reaching their user communities via the user's first search experience. BIBFRAME and library Linked Data will enable libraries to publish their resources in a way that the Web understands, to consume Linked Data to enrich their resources in ways relevant to their user communities, and to visualize networks across collections. However, one of the important steps in transitioning to BIBFRAME and library Linked Data involves crosswalks: mapping MARC fields and subfields across data models and performing the data reformatting necessary to comply with the specifications of the new model, currently BIBFRAME 2.0. This article looks into how the Library of Congress has mapped library bibliographic data from the MARC format to the BIBFRAME 2.0 model and vocabulary, published and updated since April 2016 and available from http://www.loc.gov/bibframe/docs/index.html, based on the recently released conversion specifications and converter developed by the Library of Congress with input from many community members. The BIBFRAME 2.0 standard and conversion tools will enable libraries to transform bibliographic data from MARC into BIBFRAME 2.0, which introduces a Linked Data model as the improved method of bibliographic control for the future, and to make bibliographic information more useful within and beyond library communities.
    Source
    Cataloging and classification quarterly. 56(2018) no.2/3, S.224-250
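As a toy illustration of what such a crosswalk produces, a single MARC title field can be expressed as BIBFRAME 2.0-style triples. The class and property names (bf:Instance, bf:title, bf:Title, bf:mainTitle) are from the published BIBFRAME vocabulary; the resource URIs are invented for the example:

```python
# Toy crosswalk: one MARC 245 value -> BIBFRAME 2.0-style triples.
# bf:Instance / bf:title / bf:Title / bf:mainTitle come from the BIBFRAME
# vocabulary; the subject URIs below are hypothetical.
BF = "http://id.loc.gov/ontologies/bibframe/"

def marc_245_to_bibframe(value, instance_uri="http://example.org/instance/1"):
    """Return (subject, predicate, object) triples for one title value."""
    title_node = instance_uri + "#title"
    return [
        (instance_uri, "rdf:type", BF + "Instance"),
        (instance_uri, BF + "title", title_node),
        (title_node, "rdf:type", BF + "Title"),
        (title_node, BF + "mainTitle", value),
    ]
```

The point of the linked-data shape is that the title becomes an addressable node of its own rather than a subfield buried in a record.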
  10. Miller, E.; Ogbuji, U.: Linked data design for the visible library (2015) 0.02
    Abstract
    In response to libraries' frustration over their rich resources being invisible on the web, Zepheira, at the request of the Library of Congress, created BIBFRAME, a bibliographic metadata framework for cataloging. The model replaces MARC records with linked data, promoting resource visibility through a rich network of links. In place of formal taxonomies, a small but extensible vocabulary streamlines metadata efforts. Rather than using a unique bibliographic record to describe one item, BIBFRAME draws on the Dublin Core and the Functional Requirements for Bibliographic Records (FRBR) to generate formalized descriptions of Work, Instance, Authority and Annotation as well as associations between items. Zepheira trains librarians to transform MARC records to BIBFRAME resources and adapt the vocabulary for specialized needs, while subject matter experts and technical experts manage content, site design and usability. With a different approach toward data modeling and metadata, previously invisible resources gain visibility through linking.
    Footnote
    Contribution to a special section "Linked data and the charm of weak semantics".
  11. Hopkins, J.: USMARC as metadata shell (1999) 0.02
    Abstract
    This paper introduces the two concepts of Content and Coding which together define Metadata. The encoding scheme used to hold the data content is referred to as a shell. One such shell is the MARC format. In this paper I describe the MARC format and its application to Internet resources, primarily through the OCLC-sponsored Intercat Project
    Source
    Journal of Internet cataloging. 2(1999) no.1, S.55-68
  12. Ranta, J.A.: Queens Borough Public Library's Guidelines for cataloging community information (1996) 0.02
    Source
    Cataloging and classification quarterly. 22(1996) no.2, S.51-69
  13. Heaney, M.: Object-oriented cataloging (1995) 0.02
    Abstract
    Catalogues have evolved from lists of physical items present in particular libraries into computerized access and retrieval tools for works dispersed across local and national boundaries. Works themselves are no longer constrained by physical form yet cataloguing rules have not evolved in parallel with these developments. Reanalyzes the nature of works and their publication in an approach based on object oriented modelling and demonstrates the advantages to be gained thereby. Suggests a strategic plan to enable an organic transformation to be made from current MARC based cataloguing to object oriented cataloguing. Proposes major revisions of MARC in order to allow records to maximize the benefits of both computerized databases and high speed data networks. This will involve a fundamental shift away from the AACR philosophy of description of, plus access to, physical items
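The object-oriented view Heaney argues for can be hinted at with a small sketch; the class names and attributes are illustrative assumptions, not his actual model:

```python
from dataclasses import dataclass

# Illustrative only: separating the abstract work from its publications,
# instead of describing each physical item in full. Not Heaney's actual model.
@dataclass
class Work:
    title: str
    creator: str

@dataclass
class Publication:
    work: Work    # the abstract work being published
    year: int
    medium: str   # e.g. "print", "online"

w = Work("Object-oriented cataloging", "Heaney, M.")
print_ed = Publication(w, 1995, "print")
online_ed = Publication(w, 1995, "online")  # same Work object, shared, not re-described
```

Both publications reference one Work object, which is the gain over AACR-style description of each physical item from scratch.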
  14. Riemer, J.J.: Adding 856 Fields to authority records : rationale and implications (1998) 0.02
    Abstract
    Discusses ways of applying MARC Field 856 (Electronic Location and Access) to authority records in online union catalogues. In principle, each catalogue site location can be treated as the electronic record of the work concerned, and MARC Field 856 can then refer to this location as if it were referring to the location of a primary record. Although URLs may become outdated, the fact that they are located in specifically defined MARC Fields makes the data contained amenable to the same link maintenance software as is used for the electronic records themselves. Includes practical examples of typical union catalogue records incorporating MARC Field 856.
    Source
    Cataloging and classification quarterly. 26(1998) no.2, S.5-9
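A minimal sketch of what such a field looks like when serialized as a human-readable MARC text line; the first-indicator value follows common 856 practice (4 = HTTP), and the URL is a placeholder, not a real authority-record location:

```python
# Sketch: format an 856 (Electronic Location and Access) field as a
# human-readable MARC line. First indicator 4 = HTTP access; the URL is
# a placeholder, not a real authority-record location.
def make_856(url, ind1="4", ind2=" "):
    return f"856 {ind1}{ind2} $u {url}"

field = make_856("http://example.org/records/12345")
```

Because the URL sits in a defined field and subfield ($u), the same link-checking tooling that maintains 856 fields in bibliographic records can maintain it here.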
  15. Parker, V.: MARC tags for cataloging cartographic materials (1999) 0.01
    Abstract
    This is a table of those MARC fields most frequently used when cataloging cartographic materials. The table gives fields both for monographs and for serials.
    Footnote
    Part of a thematic issue on: "Maps and related cartographic materials: cataloging, classification, and bibliographic control"
    Source
    Cataloging and classification quarterly. 27(1999) nos.1/2, S.5-9
  16. Chandrakar, R.: Mapping CCF to MARC21 : an experimental approach (2001) 0.01
    Abstract
    The purpose of this article is to raise and address a number of issues in converting the Common Communication Format (CCF) into MARC21. In this era of global resource sharing, the exchange of bibliographic records between systems is imperative for library communities. Instead of a single standard for creating machine-readable catalogue records, more than 20 standards have emerged and are in use at different institutions. Because of these variations in standards, sharing resources and transferring data between systems, both locally and globally, have become a significant problem. Addressing this problem requires keeping in mind that countries such as India and others in southeast Asia use the CCF as their standard for creating bibliographic catalogue records. This paper describes a way to map bibliographic catalogue records from CCF to MARC21, although 100% mapping is not possible. In addition, it describes an experimental approach that enumerates the problems that may occur when mapping or exchanging records, and how these problems can be overcome.
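The shape of such a crosswalk can be sketched as a partial tag-to-tag table with an explicit bucket for fields that cannot be mapped, reflecting the paper's point that 100% mapping is not possible. The CCF tag numbers below are placeholders for illustration only; they are not taken from the CCF specification or from the article.

```python
# A hedged sketch of a CCF-to-MARC21 crosswalk: a partial mapping table
# plus a bucket for unmappable fields. Tag numbers are placeholders.

CCF_TO_MARC21 = {
    "300": "245",  # placeholder: title field   -> MARC21 245 (Title Statement)
    "310": "250",  # placeholder: edition field -> MARC21 250 (Edition Statement)
}

def convert_record(ccf_record):
    """Map a dict of {ccf_tag: value} to MARC21 tags; collect unmapped tags
    separately so lossy conversions are visible rather than silent."""
    marc, unmapped = {}, {}
    for tag, value in ccf_record.items():
        if tag in CCF_TO_MARC21:
            marc[CCF_TO_MARC21[tag]] = value
        else:
            unmapped[tag] = value
    return marc, unmapped
```

Returning the unmapped residue alongside the converted record is one way to surface the conversion problems the paper enumerates.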
    Source
    Cataloging and classification quarterly. 33(2001) no.1, S.33-49
  17. Crook, M.: Barbara Tillett discusses cataloging rules and conceptual models (1996)
    
    Source
    OCLC newsletter. 1996, no.220, S.20-22
  18. Sandberg-Fox, A.M.: ¬The microcomputer revolution (2001)
    
    Abstract
    With the introduction of the microcomputer in the 1980s, a revolution of sorts was initiated. In libraries this was evidenced by the acquisition of personal computers and the software to run on them. All that catalogers needed were cataloging rules and a MARC format to ensure bibliographic control. Little did catalogers realize, however, that they were dealing with an industry whose rapid technological changes would force continual revision of existing rules and the formulation of special guidelines for its innovative products. This article focuses on the attempts of libraries and organized cataloging groups to develop the Chapter 9 descriptive cataloging rules in AACR2; it highlights selected events and includes cataloging examples that illustrate the evolution of the chapter.
    Footnote
    Contribution to a theme issue "The audiovisual cataloging current; Part I"
    Source
    Cataloging and classification quarterly. 31(2001) no.2, S.85-100
  19. Riva, P.: Mapping MARC 21 linking entry fields to FRBR and Tillett's taxonomy of bibliographic relationships (2004)
    
    Abstract
    Bibliographic relationships have taken on even greater importance in the context of ongoing efforts to integrate concepts from the Functional Requirements for Bibliographic Records (FRBR) into cataloging codes and database structures. In MARC 21, the linking entry fields are a major mechanism for expressing relationships between bibliographic records. Taxonomies of bibliographic relationships have been proposed by Tillett, with an extension by Smiraglia, and in FRBR itself. The present exercise is to provide a detailed bidirectional mapping of the MARC 21 linking fields to these two schemes. The correspondence of the Tillett taxonomic divisions to the MARC categorization of the linking fields as chronological, horizontal, or vertical is examined as well. Application of the findings to MARC format development and system functionality is discussed.
    Date
    10. 9.2000 17:38:22
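The chronological/horizontal/vertical grouping of the MARC 21 linking entry fields that the paper takes as its starting point can be sketched as a lookup table. The field names follow the MARC 21 bibliographic format; the category assignments below are a plausible partial rendering for illustration, not the paper's full mapping.

```python
# A partial, illustrative grouping of MARC 21 linking entry fields
# (760-787 range) into the chronological/horizontal/vertical categories
# discussed in the paper. Not the paper's complete mapping.

LINKING_FIELD_CATEGORY = {
    "760": "vertical",       # Main Series Entry
    "762": "vertical",       # Subseries Entry
    "770": "vertical",       # Supplement/Special Issue Entry
    "773": "vertical",       # Host Item Entry
    "765": "horizontal",     # Original Language Entry
    "775": "horizontal",     # Other Edition Entry
    "776": "horizontal",     # Additional Physical Form Entry
    "780": "chronological",  # Preceding Entry
    "785": "chronological",  # Succeeding Entry
}

def category(tag):
    """Return the relationship category for a linking field tag."""
    return LINKING_FIELD_CATEGORY.get(tag, "unclassified")
```

A bidirectional mapping of the kind the paper constructs would extend this with the corresponding Tillett and FRBR relationship types on the other side of each entry.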
  20. Stephens, O.: Introduction to OpenRefine (2014)
    
    Abstract
    OpenRefine is described as a tool for working with 'messy' data - but what does this mean? It is probably easiest to describe the kinds of data OpenRefine is good at working with and the sorts of problems it can help you solve. OpenRefine is most useful where you have data in a simple tabular format but with internal inconsistencies, either in data formats, in where data appears, or in the terminology used. It can help you:
    - Get an overview of a data set
    - Resolve inconsistencies in a data set
    - Split data up into more granular parts
    - Match local data up to other data sets
    - Enhance a data set with data from other sources
    Some common scenarios might be:
    1. Where you want to know how many times a particular value appears in a column in your data.
    2. Where you want to know how values are distributed across your whole data set.
    3. Where you have a list of dates formatted in different ways, and want to change all the dates in the list to a single common date format.
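The three scenarios above can be illustrated with a plain-Python sketch (OpenRefine itself is a GUI tool; this just mirrors the operations). The column data and the candidate date formats are invented for the example.

```python
# A plain-Python illustration of the three scenarios: counting values,
# seeing their distribution, and normalising inconsistent date formats.

from collections import Counter
from datetime import datetime

# An invented 'messy' column: the same date written three different ways.
dates = ["2014-01-05", "05/01/2014", "Jan 5, 2014", "2014-01-05"]

# Scenarios 1 and 2: how often each value appears / how values are distributed
# (roughly what a text facet shows in OpenRefine).
counts = Counter(dates)

# Scenario 3: normalise differently formatted dates to one common format.
FORMATS = ["%Y-%m-%d", "%d/%m/%Y", "%b %d, %Y"]

def normalise(raw):
    """Try each candidate format; return ISO form, or the input unchanged."""
    for fmt in FORMATS:
        try:
            return datetime.strptime(raw, fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    return raw  # leave unparseable values untouched

normalised = [normalise(d) for d in dates]
```

After normalisation all four cells hold "2014-01-05", so a facet on the cleaned column would show a single consistent value.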