Search (17 results, page 1 of 1)

  • language_ss:"e"
  • theme_ss:"Datenformate"
  • theme_ss:"Formalerschließung"
  1. Yee, M.M.: New perspectives on the shared cataloging environment and a MARC 21 shopping list (2004) 0.04
    0.038114388 = product of:
      0.09528597 = sum of:
        0.038640905 = weight(_text_:it in 132) [ClassicSimilarity], result of:
          0.038640905 = score(doc=132,freq=2.0), product of:
            0.15115225 = queryWeight, product of:
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.052260913 = queryNorm
            0.25564227 = fieldWeight in 132, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.0625 = fieldNorm(doc=132)
        0.05664506 = weight(_text_:22 in 132) [ClassicSimilarity], result of:
          0.05664506 = score(doc=132,freq=2.0), product of:
            0.18300882 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.052260913 = queryNorm
            0.30952093 = fieldWeight in 132, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0625 = fieldNorm(doc=132)
      0.4 = coord(2/5)
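The explain tree above is Lucene's ClassicSimilarity (TF-IDF) breakdown. As a rough sketch of how its factors compose (the function name is mine; the figures are copied from the tree for doc 132):

```python
import math

def term_score(freq, idf, query_norm, field_norm):
    """One term's contribution under Lucene ClassicSimilarity:
    queryWeight = idf * queryNorm, fieldWeight = tf * idf * fieldNorm,
    and the term score is their product."""
    tf = math.sqrt(freq)                  # tf(freq=2.0) = 1.4142135
    query_weight = idf * query_norm       # e.g. 2.892262 * 0.052260913
    field_weight = tf * idf * field_norm
    return query_weight * field_weight

# Figures taken from the explain tree for doc 132 above.
score_it = term_score(2.0, 2.892262, 0.052260913, 0.0625)   # weight(_text_:it)
score_22 = term_score(2.0, 3.5018296, 0.052260913, 0.0625)  # weight(_text_:22)

# coord(2/5): only 2 of 5 query terms matched, so the sum is scaled by 0.4.
total = (score_it + score_22) * 2 / 5
```

Multiplying the reported queryWeight and fieldWeight values back together reproduces each term score, and the coord-scaled sum matches the document score of 0.038114388 shown above.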
    
    Abstract
     This paper surveys the cataloging literature to collect problems that have been identified with the MARC 21 format. The problems are sorted into (1) problems that are not the fault of MARC 21; (2) problems that perhaps are not problems at all; (3) problems that are connected with the current shared cataloging environment; and (4) other problems with MARC 21 and vendor implementation of it. The author makes recommendations to deal with the true MARC 21 problems that remain after this analysis.
    Date
    10. 9.2000 17:38:22
  2. Coyle, K.: Future considerations : the functional library systems record (2004) 0.04
    0.038114388 = product of:
      0.09528597 = sum of:
        0.038640905 = weight(_text_:it in 562) [ClassicSimilarity], result of:
          0.038640905 = score(doc=562,freq=2.0), product of:
            0.15115225 = queryWeight, product of:
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.052260913 = queryNorm
            0.25564227 = fieldWeight in 562, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.0625 = fieldNorm(doc=562)
        0.05664506 = weight(_text_:22 in 562) [ClassicSimilarity], result of:
          0.05664506 = score(doc=562,freq=2.0), product of:
            0.18300882 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.052260913 = queryNorm
            0.30952093 = fieldWeight in 562, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0625 = fieldNorm(doc=562)
      0.4 = coord(2/5)
    
    Abstract
     The paper performs a thought experiment on the concept of a record based on the Functional Requirements for Bibliographic Records and library system functions, and concludes that if we want to develop a functional bibliographic record we need to do it within the context of a flexible, functional library systems record structure. The article suggests a new way to look at the library systems record that would allow libraries to move forward not only in terms of technology but also in terms of serving library users.
    Source
    Library hi tech. 22(2004) no.2, S.166-174
  3. Crook, M.: Barbara Tillett discusses cataloging rules and conceptual models (1996) 0.03
    0.033350088 = product of:
      0.083375216 = sum of:
        0.03381079 = weight(_text_:it in 7683) [ClassicSimilarity], result of:
          0.03381079 = score(doc=7683,freq=2.0), product of:
            0.15115225 = queryWeight, product of:
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.052260913 = queryNorm
            0.22368698 = fieldWeight in 7683, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.0546875 = fieldNorm(doc=7683)
        0.04956443 = weight(_text_:22 in 7683) [ClassicSimilarity], result of:
          0.04956443 = score(doc=7683,freq=2.0), product of:
            0.18300882 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.052260913 = queryNorm
            0.2708308 = fieldWeight in 7683, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0546875 = fieldNorm(doc=7683)
      0.4 = coord(2/5)
    
    Abstract
     The chief of cataloguing policy and support office at the LoC presents her views on the usefulness of conceptual modelling in determining future directions for cataloguing and the MARC format. After describing the evolution of bibliographic processes, suggests using the entity-relationship conceptual model to step back from how we record information today and start thinking about what information really means and why we provide it. Argues that now is the time to reexamine the basic principles which underpin Anglo-American cataloguing codes and that MARC formats should be looked at to see how they can evolve towards a future, improved structure for communicating bibliographic and authority information.
    Source
    OCLC newsletter. 1996, no.220, S.20-22
  4. Tennant, R.: ¬A bibliographic metadata infrastructure for the twenty-first century (2004) 0.02
    0.016021643 = product of:
      0.08010821 = sum of:
        0.08010821 = weight(_text_:22 in 2845) [ClassicSimilarity], result of:
          0.08010821 = score(doc=2845,freq=4.0), product of:
            0.18300882 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.052260913 = queryNorm
            0.4377287 = fieldWeight in 2845, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0625 = fieldNorm(doc=2845)
      0.2 = coord(1/5)
    
    Date
    9.12.2005 19:22:38
    Source
    Library hi tech. 22(2004) no.2, S.175-181
  5. Stephens, O.: Introduction to OpenRefine (2014) 0.01
    0.010039202 = product of:
      0.050196007 = sum of:
        0.050196007 = weight(_text_:it in 2884) [ClassicSimilarity], result of:
          0.050196007 = score(doc=2884,freq=6.0), product of:
            0.15115225 = queryWeight, product of:
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.052260913 = queryNorm
            0.33208904 = fieldWeight in 2884, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.046875 = fieldNorm(doc=2884)
      0.2 = coord(1/5)
    
    Abstract
     OpenRefine is described as a tool for working with 'messy' data - but what does this mean? It is probably easiest to describe the kinds of data OpenRefine is good at working with and the sorts of problems it can help you solve. OpenRefine is most useful where you have data in a simple tabular format but with internal inconsistencies - in data formats, in where data appears, or in the terminology used. It can help you:
     - get an overview of a data set
     - resolve inconsistencies in a data set
     - split data up into more granular parts
     - match local data up to other data sets
     - enhance a data set with data from other sources
     Some common scenarios might be:
     1. You want to know how many times a particular value appears in a column in your data.
     2. You want to know how values are distributed across your whole data set.
     3. You have a list of dates which are formatted in different ways, and want to change all the dates in the list to a single common date format.
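The third scenario - dates in mixed formats - is a classic OpenRefine cleanup. As a rough sketch of the same idea in plain Python (the candidate format list is hypothetical; real data would need its own):

```python
from datetime import datetime

# Hypothetical sample of formats one might meet in a messy column.
CANDIDATE_FORMATS = ["%d.%m.%Y", "%Y-%m-%d", "%d %B %Y", "%m/%d/%Y"]

def to_iso(date_string):
    """Try each known format in turn; return the date in ISO 8601,
    or None when no format matches."""
    for fmt in CANDIDATE_FORMATS:
        try:
            return datetime.strptime(date_string.strip(), fmt).date().isoformat()
        except ValueError:
            continue
    return None
```

Applied over a column, every parseable value collapses to one common format while unparseable values are flagged as None for manual review.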
  6. Ranta, J.A.: Queens Borough Public Library's Guidelines for cataloging community information (1996) 0.01
    0.009912886 = product of:
      0.04956443 = sum of:
        0.04956443 = weight(_text_:22 in 6523) [ClassicSimilarity], result of:
          0.04956443 = score(doc=6523,freq=2.0), product of:
            0.18300882 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.052260913 = queryNorm
            0.2708308 = fieldWeight in 6523, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0546875 = fieldNorm(doc=6523)
      0.2 = coord(1/5)
    
    Source
    Cataloging and classification quarterly. 22(1996) no.2, S.51-69
  7. Riva, P.: Mapping MARC 21 linking entry fields to FRBR and Tillett's taxonomy of bibliographic relationships (2004) 0.01
    0.008496759 = product of:
      0.042483795 = sum of:
        0.042483795 = weight(_text_:22 in 136) [ClassicSimilarity], result of:
          0.042483795 = score(doc=136,freq=2.0), product of:
            0.18300882 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.052260913 = queryNorm
            0.23214069 = fieldWeight in 136, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.046875 = fieldNorm(doc=136)
      0.2 = coord(1/5)
    
    Date
    10. 9.2000 17:38:22
  8. Lee, S.; Jacob, E.K.: ¬An integrated approach to metadata interoperability : construction of a conceptual structure between MARC and FRBR (2011) 0.01
    0.008496759 = product of:
      0.042483795 = sum of:
        0.042483795 = weight(_text_:22 in 302) [ClassicSimilarity], result of:
          0.042483795 = score(doc=302,freq=2.0), product of:
            0.18300882 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.052260913 = queryNorm
            0.23214069 = fieldWeight in 302, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.046875 = fieldNorm(doc=302)
      0.2 = coord(1/5)
    
    Date
    10. 9.2000 17:38:22
  9. Witt, M.; Leresche, F.: IFLA study on functional requirements for bibliographic records : cataloguing practice in France (1995) 0.01
    0.0077281813 = product of:
      0.038640905 = sum of:
        0.038640905 = weight(_text_:it in 3081) [ClassicSimilarity], result of:
          0.038640905 = score(doc=3081,freq=2.0), product of:
            0.15115225 = queryWeight, product of:
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.052260913 = queryNorm
            0.25564227 = fieldWeight in 3081, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.0625 = fieldNorm(doc=3081)
      0.2 = coord(1/5)
    
    Abstract
    Discusses the French reaction. Covers the entities considered for cataloguing; elements for identifying a document; access points; and authority records. Considers whether it is possible to reduce redundancies among the elements contained in bibliographic records caused by overlapping between the ISBD description, the access points and the coded information; and whether OPACs can be developed to present clearly to users various entities from the most general level to the most specific level
  10. Kartus, E.: Beyond MARC : is it really possible? (1995) 0.01
    0.0077281813 = product of:
      0.038640905 = sum of:
        0.038640905 = weight(_text_:it in 5753) [ClassicSimilarity], result of:
          0.038640905 = score(doc=5753,freq=2.0), product of:
            0.15115225 = queryWeight, product of:
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.052260913 = queryNorm
            0.25564227 = fieldWeight in 5753, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.0625 = fieldNorm(doc=5753)
      0.2 = coord(1/5)
    
  11. Wisser, K.M.; O'Brien Roper, J.: Maximizing metadata : exploring the EAD-MARC relationship (2003) 0.01
    0.0070806327 = product of:
      0.035403162 = sum of:
        0.035403162 = weight(_text_:22 in 154) [ClassicSimilarity], result of:
          0.035403162 = score(doc=154,freq=2.0), product of:
            0.18300882 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.052260913 = queryNorm
            0.19345059 = fieldWeight in 154, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0390625 = fieldNorm(doc=154)
      0.2 = coord(1/5)
    
    Date
    10. 9.2000 17:38:22
  12. Riemer, J.J.: Adding 856 Fields to authority records : rationale and implications (1998) 0.01
    0.006762158 = product of:
      0.03381079 = sum of:
        0.03381079 = weight(_text_:it in 3715) [ClassicSimilarity], result of:
          0.03381079 = score(doc=3715,freq=2.0), product of:
            0.15115225 = queryWeight, product of:
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.052260913 = queryNorm
            0.22368698 = fieldWeight in 3715, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.0546875 = fieldNorm(doc=3715)
      0.2 = coord(1/5)
    
    Abstract
     Discusses ways of applying MARC Field 856 (Electronic Location and Access) to authority records in online union catalogues. In principle, each catalogue site location can be treated as the electronic record of the work concerned and the MARC Field 856 can then refer to this location as if it were referring to the location of a primary record. Although URLs may become outdated, the fact that they are located in specifically defined MARC Fields makes the data contained amenable to the same link maintenance software as is used for the electronic records themselves. Includes practical examples of typical union catalogue records incorporating MARC Field 856.
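As a minimal illustration of the field's shape (a plain-Python sketch, not a MARC library, and the text rendering is my own), an 856 field carries the URI in subfield $u, an optional link text in $y, with first indicator 4 for HTTP access:

```python
def make_856(url, link_text=None):
    """Render an 856 (Electronic Location and Access) field as text.
    Indicators '40' = HTTP access / resource; $u holds the URI."""
    subfields = [("u", url)]
    if link_text:
        subfields.append(("y", link_text))  # $y: optional link text
    body = " ".join(f"${code} {value}" for code, value in subfields)
    return f"856 40 {body}"
```

Attached to an authority record, such a field would let link-maintenance software check and repair the URL exactly as it does for bibliographic records.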
  13. Sandberg-Fox, A.M.: ¬The microcomputer revolution (2001) 0.01
    0.006762158 = product of:
      0.03381079 = sum of:
        0.03381079 = weight(_text_:it in 5409) [ClassicSimilarity], result of:
          0.03381079 = score(doc=5409,freq=2.0), product of:
            0.15115225 = queryWeight, product of:
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.052260913 = queryNorm
            0.22368698 = fieldWeight in 5409, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5409)
      0.2 = coord(1/5)
    
    Abstract
    With the introduction of the microcomputer in the 1980s, a revolution of sorts was initiated. In libraries this was evidenced by the acquisition of personal computers and the software to run on them. All that catalogers needed were cataloging rules and a MARC format to ensure their bibliographic control. However, little did catalogers realize they were dealing with an industry that introduced rapid technological changes, which effected continual revision of existing rules and the formulation of special guidelines to deal with the industry's innovative products. This article focuses on the attempts of libraries and organized cataloging groups to develop the Chapter 9 descriptive cataloging rules in AACR2; it highlights selected events and includes cataloging examples that illustrate the evolution of the chapter.
  14. Samples, J.; Bigelow, I.: MARC to BIBFRAME : converting the PCC to Linked Data (2020) 0.01
    0.006762158 = product of:
      0.03381079 = sum of:
        0.03381079 = weight(_text_:it in 119) [ClassicSimilarity], result of:
          0.03381079 = score(doc=119,freq=2.0), product of:
            0.15115225 = queryWeight, product of:
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.052260913 = queryNorm
            0.22368698 = fieldWeight in 119, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.0546875 = fieldNorm(doc=119)
      0.2 = coord(1/5)
    
    Abstract
    The Program for Cooperative Cataloging (PCC) has formal relationships with the Library of Congress (LC), Share-VDE, and Linked Data for Production Phase 2 (LD4P2) for work on Bibliographic Framework (BIBFRAME), and PCC institutions have been very active in the exploration of MARC to BIBFRAME conversion processes. This article will review the involvement of PCC in the development of BIBFRAME and examine the work of LC, Share-VDE, and LD4P2 on MARC to BIBFRAME conversion. It will conclude with a discussion of areas for further exploration by the PCC leading up to the creation of PCC conversion specifications and PCC BIBFRAME data.
  15. Giordano, R.: ¬The documentation of electronic texts : using Text Encoding Initiative headers: an introduction (1994) 0.01
    0.005796136 = product of:
      0.028980678 = sum of:
        0.028980678 = weight(_text_:it in 866) [ClassicSimilarity], result of:
          0.028980678 = score(doc=866,freq=2.0), product of:
            0.15115225 = queryWeight, product of:
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.052260913 = queryNorm
            0.19173169 = fieldWeight in 866, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.046875 = fieldNorm(doc=866)
      0.2 = coord(1/5)
    
    Abstract
     Presents a general introduction to the form and functions of the Text Encoding Initiative (TEI) headers and explains their relationship to the MARC record. The TEI header's main strength is that it documents electronic texts in a standard exchange format that should be understandable to both librarian cataloguers and text encoders outside of librarianship. TEI gives encoders the ability to document the electronic text itself, its source, its encoding principles, and revisions, as well as non-bibliographic characteristics of the text that can support both scholarly analysis and retrieval. Its bibliographic descriptions can be loaded into standard remote bibliographic databases, which should make electronic texts as easy to find for researchers as texts in other media. Presents a brief overview of the TEI header, the file description and ways in which the TEI headers have counterparts in MARC, the Encoding Description, the Profile Description, the Revision Description, the size and complexity of the TEI header, and the use of the TEI header to support document retrieval and analysis, with notes on some of the prospects and problems.
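A sketch of the skeleton the abstract describes, built with Python's standard XML tooling (a deliberately minimal header; a real TEI header carries many more elements than the fileDesc shown here):

```python
import xml.etree.ElementTree as ET

def minimal_tei_header(title, source_note):
    """Build a bare-bones teiHeader: a fileDesc (the rough MARC
    counterpart) holding a title statement and a source description."""
    header = ET.Element("teiHeader")
    file_desc = ET.SubElement(header, "fileDesc")
    title_stmt = ET.SubElement(file_desc, "titleStmt")
    ET.SubElement(title_stmt, "title").text = title
    source_desc = ET.SubElement(file_desc, "sourceDesc")
    ET.SubElement(source_desc, "p").text = source_note
    return ET.tostring(header, encoding="unicode")
```

The fileDesc is the part most directly loadable into a bibliographic database; encoding, profile, and revision descriptions would sit alongside it in a full header.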
  16. McBride, J.L.: Faceted subject access for music through USMARC : a case for linked fields (2000) 0.01
    0.005796136 = product of:
      0.028980678 = sum of:
        0.028980678 = weight(_text_:it in 5403) [ClassicSimilarity], result of:
          0.028980678 = score(doc=5403,freq=2.0), product of:
            0.15115225 = queryWeight, product of:
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.052260913 = queryNorm
            0.19173169 = fieldWeight in 5403, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.046875 = fieldNorm(doc=5403)
      0.2 = coord(1/5)
    
    Abstract
    The USMARC Format for Bibliographic Description contains three fields (045, 047, and 048) designed to facilitate subject access to music materials. The fields cover three of the main aspects of subject description for music: date of composition, form or genre, and number of instruments or voices, respectively. The codes are rarely used for subject access, because of the difficulty of coding them and because false drops would result in retrieval of bibliographic records where more than one musical work is present, a situation that occurs frequently with sound recordings. It is proposed that the values of the fields be converted to natural language and that subfield 8 be used to link all access fields in a bibliographic record for greater precision in retrieval. This proposal has implications beyond music cataloging, especially for metadata and any bibliographic records describing materials containing many works and subjects.
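The conversion to natural language that the proposal calls for is mechanically simple. A sketch under stated assumptions - the code-to-term pairs below are a hypothetical sample, not the authoritative USMARC 047/048 lists:

```python
# Hypothetical sample mapping; the real code lists live in the
# USMARC format documentation for fields 047 and 048.
FORM_OF_COMPOSITION = {
    "sy": "symphonies",
    "op": "operas",
    "co": "concertos",
}

def expand_codes(codes, table):
    """Convert coded values to natural-language terms for indexing,
    passing unknown codes through unchanged for review."""
    return [table.get(code, code) for code in codes]
```

Indexing the expanded terms, each linked by subfield 8 to the access fields for one specific work, is what would prevent the false drops the abstract describes on multi-work records.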
  17. Xu, A.; Hess, K.; Akerman, L.: From MARC to BIBFRAME 2.0 : Crosswalks (2018) 0.00
    0.004830113 = product of:
      0.024150565 = sum of:
        0.024150565 = weight(_text_:it in 5172) [ClassicSimilarity], result of:
          0.024150565 = score(doc=5172,freq=2.0), product of:
            0.15115225 = queryWeight, product of:
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.052260913 = queryNorm
            0.15977642 = fieldWeight in 5172, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.892262 = idf(docFreq=6664, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5172)
      0.2 = coord(1/5)
    
    Abstract
     One of the big challenges facing academic libraries today is to increase the relevance of the libraries to their user communities. If libraries can increase the visibility of their resources on the open web, they improve their chances of reaching their user communities at the user's first search experience. BIBFRAME and library Linked Data will enable libraries to publish their resources in a way that the Web understands, consume Linked Data to enrich their resources with data relevant to their user communities, and visualize networks across collections. However, one of the important steps for transitioning to BIBFRAME and library Linked Data involves crosswalks, mapping MARC fields and subfields across data models and performing necessary data reformatting to be in compliance with the specifications of the new model, which is currently BIBFRAME 2.0. This article looks into how the Library of Congress has mapped library bibliographic data from the MARC format to the BIBFRAME 2.0 model and vocabulary published and updated since April 2016, available from http://www.loc.gov/bibframe/docs/index.html based on the recently released conversion specifications and converter, developed by the Library of Congress with input from many community members. The BIBFRAME 2.0 standard and conversion tools will enable libraries to transform bibliographic data from MARC into BIBFRAME 2.0, which introduces a Linked Data model as the improved method of bibliographic control for the future, and make bibliographic information more useful within and beyond library communities.
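In miniature, a crosswalk is a lookup from MARC tags to BIBFRAME properties and target entities. An illustrative fragment only - the authoritative mappings are LC's published conversion specifications, and the tag selection here is a small hypothetical sample:

```python
# Hypothetical sample crosswalk; property names follow the bf: vocabulary,
# but consult LC's BIBFRAME 2.0 conversion specifications for real mappings.
MARC_TO_BIBFRAME = {
    "100": ("bf:contribution", "bf:Work"),          # main entry, personal name
    "245": ("bf:title", "bf:Work / bf:Instance"),   # title statement
    "264": ("bf:provisionActivity", "bf:Instance"), # production/publication
    "020": ("bf:identifiedBy", "bf:Instance"),      # ISBN
}

def crosswalk(tag):
    """Return the (property, target entity) pair for a MARC tag,
    or None when the tag is not covered by this fragment."""
    return MARC_TO_BIBFRAME.get(tag)
```

A full converter layers data reformatting on top of such lookups - splitting one flat MARC record into linked Work, Instance, and Item descriptions - which is why the conversion specifications, not the tag table, carry most of the complexity.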