Search (19 results, page 1 of 1)

  • × theme_ss:"Formalerschließung"
  • × type_ss:"a"
  • × year_i:[1980 TO 1990}
  1. Hustand, S.: Problems of duplicate records (1986) 0.10
    0.103493385 = product of:
      0.15524007 = sum of:
        0.10130662 = weight(_text_:title in 266) [ClassicSimilarity], result of:
          0.10130662 = score(doc=266,freq=2.0), product of:
            0.27436262 = queryWeight, product of:
              5.570018 = idf(docFreq=457, maxDocs=44218)
              0.049257044 = queryNorm
            0.3692435 = fieldWeight in 266, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              5.570018 = idf(docFreq=457, maxDocs=44218)
              0.046875 = fieldNorm(doc=266)
        0.053933453 = product of:
          0.107866906 = sum of:
            0.107866906 = weight(_text_:catalogue in 266) [ClassicSimilarity], result of:
              0.107866906 = score(doc=266,freq=4.0), product of:
                0.23806341 = queryWeight, product of:
                  4.8330836 = idf(docFreq=956, maxDocs=44218)
                  0.049257044 = queryNorm
                0.45310158 = fieldWeight in 266, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  4.8330836 = idf(docFreq=956, maxDocs=44218)
                  0.046875 = fieldNorm(doc=266)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
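The explanation tree above is plain TF-IDF arithmetic from Lucene's ClassicSimilarity: for each matching term, score = queryWeight × fieldWeight, where queryWeight = idf × queryNorm and fieldWeight = tf × idf × fieldNorm with tf = sqrt(termFreq); clause scores are then summed and scaled by the coord factors. A minimal Python sketch, reusing the constants from the tree above (field and query boosts assumed to be 1), reproduces the numbers:

```python
import math

# Reconstructs the ClassicSimilarity arithmetic for doc 266 above,
# with constants copied from the explanation tree.
QUERY_NORM = 0.049257044

def term_score(freq, idf, field_norm):
    """score = queryWeight * fieldWeight, with tf = sqrt(freq)."""
    tf = math.sqrt(freq)
    query_weight = idf * QUERY_NORM        # e.g. 5.570018 * queryNorm
    field_weight = tf * idf * field_norm   # tf * idf * fieldNorm
    return query_weight * field_weight

# _text_:title   (freq=2.0, idf=5.570018, fieldNorm=0.046875)
title = term_score(2.0, 5.570018, 0.046875)       # listing: 0.10130662

# _text_:catalogue (freq=4.0, idf=4.8330836, fieldNorm=0.046875),
# halved by the inner coord(1/2)
catalogue = term_score(4.0, 4.8330836, 0.046875) * 0.5   # listing: 0.053933453

# Top level: sum of the clauses, scaled by coord(2/3)
total = (title + catalogue) * (2.0 / 3.0)         # listing: 0.103493385
```

The same recipe reproduces every score tree in this listing; only freq, idf, fieldNorm and the coord factors change from entry to entry.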
    
    Abstract
    Duplicate records are a familiar problem in bibliographic databases. The problem is obvious when a union catalogue is established by automatically merging two or more separate and independent sources of catalogue information. However, even in systems with on-line cataloguing and access to previous records, duplication is a problem: an author/title search prior to cataloguing does not cut duplication to zero. A great deal of effort has been put into developing methods of duplicate detection. A major problem in this work has been efficiency, which is particularly important in the on-line setting. Most studies have dealt with book and article material; the Research Libraries Group Inc. has also described matching algorithms for films, maps, recordings, scores and serials. Various methods of detecting duplicates are discussed.
  2. Jeng, L.H.: ¬An expert system for determining title proper in descriptive cataloging : a conceptual model (1986) 0.08
    0.07879403 = product of:
      0.2363821 = sum of:
        0.2363821 = weight(_text_:title in 375) [ClassicSimilarity], result of:
          0.2363821 = score(doc=375,freq=8.0), product of:
            0.27436262 = queryWeight, product of:
              5.570018 = idf(docFreq=457, maxDocs=44218)
              0.049257044 = queryNorm
            0.86156815 = fieldWeight in 375, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              5.570018 = idf(docFreq=457, maxDocs=44218)
              0.0546875 = fieldNorm(doc=375)
      0.33333334 = coord(1/3)
    
    Abstract
    The human process of determining bibliographic data from the title pages of monographs is complex, yet systematic. This paper investigates the intellectual process involved, at the conceptual and logical levels, by proposing a model of an expert system for determining the title proper as the first element of the first area in ISBD. It assumes that the title page of a monograph consists of more than one block of character or graphic representation; each block has its own physical and content characteristics and can be separated from other blocks by separators. Three categories of expert knowledge are identified, and the system model is discussed along with its individual components. It applies the "list" concept for the system data structure and addresses the potential of this conceptual model.
  3. Abrera, J.B.; Lin, J.C.: Parallel title problems of interpretation (1981) 0.07
    0.06823764 = product of:
      0.20471291 = sum of:
        0.20471291 = weight(_text_:title in 279) [ClassicSimilarity], result of:
          0.20471291 = score(doc=279,freq=6.0), product of:
            0.27436262 = queryWeight, product of:
              5.570018 = idf(docFreq=457, maxDocs=44218)
              0.049257044 = queryNorm
            0.74613994 = fieldWeight in 279, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              5.570018 = idf(docFreq=457, maxDocs=44218)
              0.0546875 = fieldNorm(doc=279)
      0.33333334 = coord(1/3)
    
    Abstract
    This study examines the interrelationship of the rules on parallel title in AACR2. What emerges from the analysis are two principles that were utilized in the codification of the rules: (1) a principle of interrelationship (i.e. transcribing a data element in its relationship to other data elements); and (2) a principle of structured format (i.e. transcribing a data element in a prescribed order). A graphic representation of the possible combinations and positions of the data elements of descriptions in the title and statement of responsibility area (Area 1) shows that the present rules do not ensure consistency in bibliographic recording.
  4. Münnich, M.: Katalogisieren auf dem PC : ein Pflichtenheft für die Formalkatalogisierung (1988) 0.07
    0.06573725 = product of:
      0.19721174 = sum of:
        0.19721174 = sum of:
          0.14382255 = weight(_text_:catalogue in 2502) [ClassicSimilarity], result of:
            0.14382255 = score(doc=2502,freq=4.0), product of:
              0.23806341 = queryWeight, product of:
                4.8330836 = idf(docFreq=956, maxDocs=44218)
                0.049257044 = queryNorm
              0.60413545 = fieldWeight in 2502, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                4.8330836 = idf(docFreq=956, maxDocs=44218)
                0.0625 = fieldNorm(doc=2502)
          0.053389195 = weight(_text_:22 in 2502) [ClassicSimilarity], result of:
            0.053389195 = score(doc=2502,freq=2.0), product of:
              0.17248978 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.049257044 = queryNorm
              0.30952093 = fieldWeight in 2502, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0625 = fieldNorm(doc=2502)
      0.33333334 = coord(1/3)
    
    Abstract
    Examines a simpler cataloguing format offered by PCs, without disturbing compatibility, using A-Z cataloguing rules for data input, category codes for tagging, and computer-supported data input through windows. Gives numerous examples of catalogue entries, basing techniques on category schemes set out by Klaus Haller and Hans Popst. Examines catalogue entries with respect to database categories for authors and corporate names, titles, single-volume works, serial issues of collected works, and limited editions of works in several volumes.
    Source
    Bibliotheksdienst. 22(1988) H.9, S.841-856
  5. André, P.W.; Janakiev, E.; Case, M.M.; Randall, K.M.: Serials control in an online integrated system : can latest entry cataloging help? (1986) 0.06
    0.055715796 = product of:
      0.16714738 = sum of:
        0.16714738 = weight(_text_:title in 381) [ClassicSimilarity], result of:
          0.16714738 = score(doc=381,freq=4.0), product of:
            0.27436262 = queryWeight, product of:
              5.570018 = idf(docFreq=457, maxDocs=44218)
              0.049257044 = queryNorm
            0.6092207 = fieldWeight in 381, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              5.570018 = idf(docFreq=457, maxDocs=44218)
              0.0546875 = fieldNorm(doc=381)
      0.33333334 = coord(1/3)
    
    Abstract
    An experiment in latest entry cataloging of selected serial title changes is currently being conducted at Northwestern University Library. The integrated structure of Northwestern's automated system NOTIS and its applications to serials processing were primary factors leading to the investigation of latest entry cataloging as an alternative to the current standard of successive entry cataloging for serial title changes. A systematic investigation was conducted through most of 1985 and allowed project staff to identify and evaluate a number of concerns and problems. The study team is encouraged with the results, but will continue gathering cataloging statistics and will conduct user studies before adopting latest entry cataloging as a permanent option.
  6. Bratcher, P.: Music OCLC recon : the practical approach (1988) 0.06
    0.055715796 = product of:
      0.16714738 = sum of:
        0.16714738 = weight(_text_:title in 407) [ClassicSimilarity], result of:
          0.16714738 = score(doc=407,freq=4.0), product of:
            0.27436262 = queryWeight, product of:
              5.570018 = idf(docFreq=457, maxDocs=44218)
              0.049257044 = queryNorm
            0.6092207 = fieldWeight in 407, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              5.570018 = idf(docFreq=457, maxDocs=44218)
              0.0546875 = fieldNorm(doc=407)
      0.33333334 = coord(1/3)
    
    Abstract
    An in-house retrospective OCLC conversion process for music titles presents problems infrequently encountered with book titles. The proliferation of works by composers and the manipulation of title elements by publishers make matching shelflist cards to online records difficult. Placing shelflist cards in alphabetical order by composer and performing presearch authority work saves online search time. Recording plate and/or publisher numbers on the shelflist is also helpful. Limiting composer/title searches by 1900 or ???? will help narrow searches done for shelflist cards which carry no date. If all else fails, the item may need to be retrieved from the stacks.
  7. Svenonius, E.; Baughman, B.; Molto, M.: Title page sanctity? : the distribution of access points in a sample of English language monographs (1986) 0.05
    0.047756396 = product of:
      0.14326918 = sum of:
        0.14326918 = weight(_text_:title in 361) [ClassicSimilarity], result of:
          0.14326918 = score(doc=361,freq=4.0), product of:
            0.27436262 = queryWeight, product of:
              5.570018 = idf(docFreq=457, maxDocs=44218)
              0.049257044 = queryNorm
            0.52218914 = fieldWeight in 361, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              5.570018 = idf(docFreq=457, maxDocs=44218)
              0.046875 = fieldNorm(doc=361)
      0.33333334 = coord(1/3)
    
    Abstract
    The problem addressed in this paper is that of simplifying access point determination. A critique is made of the simple, mechanical rule whereby every name appearing in certain designatable locations within a publication qualifies as an access point. Then a more acceptable version of the every-name-an-access-point rule is tested empirically against a sample of 400 English language monographs. Conclusions reached concern (1) the responsibility profiles of these monographs, i.e., how many authors, editors, illustrators and emanators are typically associated with them and in what combinations, and (2) the relative productivity of different locations within them, e.g., title pages and tables of contents, as sources of access points. The study was conceived to be exploratory in nature and its findings suggest further research that could be done to provide empirical validation for rules for access point determination.
  8. Smiraglia, R.P.: Uniform titles for music : an exercise in collocating works (1989) 0.05
    0.047756396 = product of:
      0.14326918 = sum of:
        0.14326918 = weight(_text_:title in 441) [ClassicSimilarity], result of:
          0.14326918 = score(doc=441,freq=4.0), product of:
            0.27436262 = queryWeight, product of:
              5.570018 = idf(docFreq=457, maxDocs=44218)
              0.049257044 = queryNorm
            0.52218914 = fieldWeight in 441, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              5.570018 = idf(docFreq=457, maxDocs=44218)
              0.046875 = fieldNorm(doc=441)
      0.33333334 = coord(1/3)
    
    Abstract
    The uniform title is viewed historically as an artificial device to collocate works. In music cataloging, problems of multiple manifestations with variant title pages lead to the development of uniform titles that would both collocate and distinguish, and ultimately serve as identifiers for musical works. A principal problem in the authority control of works is recognition of multiple manifestations and the concomitant syndetic depth. Research suggests a low incidence of multiple manifestations among textual works, but hints that a greater incidence might be found among musical works. An empirical study is conducted using a sample of musical works and locating for each all physical manifestations in OCLC and the NUC. Virtually the entire sample of musical works yielded multiple manifestations. A majority of the manifestations had titles proper different from that of the first edition of the work. It is concluded that an authority-controlled collocating device is necessary for musical works, that more references are required, and that links among authority records for works could provide increased syndetic depth.
  9. Gertz, J.; Stout, L.J.: ¬The MARC Archival and Manuscripts Control (AMC) Format : a new direction in cataloging (1989) 0.05
    0.045025162 = product of:
      0.13507548 = sum of:
        0.13507548 = weight(_text_:title in 426) [ClassicSimilarity], result of:
          0.13507548 = score(doc=426,freq=2.0), product of:
            0.27436262 = queryWeight, product of:
              5.570018 = idf(docFreq=457, maxDocs=44218)
              0.049257044 = queryNorm
            0.49232465 = fieldWeight in 426, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              5.570018 = idf(docFreq=457, maxDocs=44218)
              0.0625 = fieldNorm(doc=426)
      0.33333334 = coord(1/3)
    
    Abstract
    The development and use of the MARC Archival and Manuscripts Control (AMC) format is described. The format's unique characteristics stem largely from the needs of the archival community for both a uniform means of control of materials at the collection level and a powerful tool for exchanging information about archival holdings. Unique aspects of AMC cataloguing are described, including differing approaches to access points, form of entry, title, physical description, notes, subject headings, and call numbers.
  10. Leung, S.W.: MARC CIP records and MARC LC records : an evaluative study of their discrepancies (1983) 0.04
    0.039397016 = product of:
      0.11819105 = sum of:
        0.11819105 = weight(_text_:title in 327) [ClassicSimilarity], result of:
          0.11819105 = score(doc=327,freq=2.0), product of:
            0.27436262 = queryWeight, product of:
              5.570018 = idf(docFreq=457, maxDocs=44218)
              0.049257044 = queryNorm
            0.43078408 = fieldWeight in 327, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              5.570018 = idf(docFreq=457, maxDocs=44218)
              0.0546875 = fieldNorm(doc=327)
      0.33333334 = coord(1/3)
    
    Abstract
    In the last ten years, Cataloging in Publication (CIP) records have gained increasing acceptance and use in libraries, especially for cataloging purposes. Nevertheless, there is a general perception that the accuracy of CIP records can be further improved. Because improvement is only possible with more concrete information identifying specific problem areas, this study is designed to provide catalogers and cataloging managers more empirical data on the frequency and types of discrepancy between MARC CIP records and subsequent MARC LC records. This study differs from an earlier study which involved CIP records that appeared on the verso of the title page of publications. In addition, this study will make some observations regarding more effective use of the CIP records, primarily for cataloging purposes.
  11. Takawashi, T.; Shihota, T.; Oshiro, Z.: ¬The no-main-entry principle : the historical background of the Nippon Cataloging Rules (1989) 0.04
    0.039397016 = product of:
      0.11819105 = sum of:
        0.11819105 = weight(_text_:title in 446) [ClassicSimilarity], result of:
          0.11819105 = score(doc=446,freq=2.0), product of:
            0.27436262 = queryWeight, product of:
              5.570018 = idf(docFreq=457, maxDocs=44218)
              0.049257044 = queryNorm
            0.43078408 = fieldWeight in 446, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              5.570018 = idf(docFreq=457, maxDocs=44218)
              0.0546875 = fieldNorm(doc=446)
      0.33333334 = coord(1/3)
    
    Abstract
    Once unit cards became easy to reproduce, and each entry has equal value for users in a multiple-entry catalog, the main-entry principle is no longer useful. The historical development of the no-main-entry principle in Japan is examined. It is generally considered that title entry is the Oriental tradition. However, Japanese catalogers had used codes based upon the main-entry principle, under the influence of both AACR and the Paris Principles, until 1977, when a new code of Nippon Cataloging Rules (NCR) was compiled. The "Description Unit-Card System" of NCR1977, which adopted the no-main-entry principle, seems to be the one needed in the age of the computer catalog.
  12. Schmidt, G.: Panizzis Regeln für den alphabetischen Katalog : zur Entstehungsgeschichte der 91 'Rules for the compilation of the catalogue' (1982) 0.02
    0.01694965 = product of:
      0.05084895 = sum of:
        0.05084895 = product of:
          0.1016979 = sum of:
            0.1016979 = weight(_text_:catalogue in 5491) [ClassicSimilarity], result of:
              0.1016979 = score(doc=5491,freq=2.0), product of:
                0.23806341 = queryWeight, product of:
                  4.8330836 = idf(docFreq=956, maxDocs=44218)
                  0.049257044 = queryNorm
                0.42718828 = fieldWeight in 5491, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.8330836 = idf(docFreq=956, maxDocs=44218)
                  0.0625 = fieldNorm(doc=5491)
          0.5 = coord(1/2)
      0.33333334 = coord(1/3)
    
  13. Süle, G.: Problems of duplicate records, standards and quality control (1986) 0.02
    0.01694965 = product of:
      0.05084895 = sum of:
        0.05084895 = product of:
          0.1016979 = sum of:
            0.1016979 = weight(_text_:catalogue in 2060) [ClassicSimilarity], result of:
              0.1016979 = score(doc=2060,freq=2.0), product of:
                0.23806341 = queryWeight, product of:
                  4.8330836 = idf(docFreq=956, maxDocs=44218)
                  0.049257044 = queryNorm
                0.42718828 = fieldWeight in 2060, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.8330836 = idf(docFreq=956, maxDocs=44218)
                  0.0625 = fieldNorm(doc=2060)
          0.5 = coord(1/2)
      0.33333334 = coord(1/3)
    
    Abstract
    The reasons why duplicate records should be avoided in an on-line catalogue are discussed. A mechanism has to be developed either to warn the cataloguer when he is about to enter a duplicate record or to prohibit duplicate entries entirely. Since different interpretations of cataloguing rules may lead to different and thus duplicate records, an institution, committee, or the like is needed to decide on the application of the rules and eliminate records, if necessary.
  14. Havens, C.: Cataloging a special art collection (1989) 0.01
    0.014830943 = product of:
      0.04449283 = sum of:
        0.04449283 = product of:
          0.08898566 = sum of:
            0.08898566 = weight(_text_:catalogue in 444) [ClassicSimilarity], result of:
              0.08898566 = score(doc=444,freq=2.0), product of:
                0.23806341 = queryWeight, product of:
                  4.8330836 = idf(docFreq=956, maxDocs=44218)
                  0.049257044 = queryNorm
                0.37378973 = fieldWeight in 444, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.8330836 = idf(docFreq=956, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=444)
          0.5 = coord(1/2)
      0.33333334 = coord(1/3)
    
    Abstract
    Since 1986, a project has been underway to catalogue a private collection of the massive body of work of the artist Anne Ward Huey. The project has progressed from a handwritten card file to a computer file using AACR2 and OCLC tags and format; a compact disc or videotape file is also presently projected. In addition to taking traditional library cataloguing out of the library setting, the project has addressed a number of problems involved in cataloguing silkscreen prints, and the records created by the project also contain information that AACR2 does not specifically address but that artists consider important. Recommends that the cataloguing rules be revised to encompass these specific aspects of cataloguing art work.
  15. Treichler, W.: Katalogisierungsregeln, Kataloge und Benützer in schweizerischen Bibliotheken (1986) 0.01
    0.0133473 = product of:
      0.040041897 = sum of:
        0.040041897 = product of:
          0.080083795 = sum of:
            0.080083795 = weight(_text_:22 in 5352) [ClassicSimilarity], result of:
              0.080083795 = score(doc=5352,freq=2.0), product of:
                0.17248978 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.049257044 = queryNorm
                0.46428138 = fieldWeight in 5352, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.09375 = fieldNorm(doc=5352)
          0.5 = coord(1/2)
      0.33333334 = coord(1/3)
    
    Date
    8.10.2000 14:22:27
  16. Rolland-Thomas, P.: AACR2: one step towards an international code (1983) 0.01
    0.0088982 = product of:
      0.026694598 = sum of:
        0.026694598 = product of:
          0.053389195 = sum of:
            0.053389195 = weight(_text_:22 in 310) [ClassicSimilarity], result of:
              0.053389195 = score(doc=310,freq=2.0), product of:
                0.17248978 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.049257044 = queryNorm
                0.30952093 = fieldWeight in 310, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=310)
          0.5 = coord(1/2)
      0.33333334 = coord(1/3)
    
    Date
    6. 1.2007 19:12:22
  17. Struble, C.A.; Kohberger, P.B.: Statistical survey to determine availability of cataloging copy on OCLC (1987) 0.01
    0.0088982 = product of:
      0.026694598 = sum of:
        0.026694598 = product of:
          0.053389195 = sum of:
            0.053389195 = weight(_text_:22 in 358) [ClassicSimilarity], result of:
              0.053389195 = score(doc=358,freq=2.0), product of:
                0.17248978 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.049257044 = queryNorm
                0.30952093 = fieldWeight in 358, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=358)
          0.5 = coord(1/2)
      0.33333334 = coord(1/3)
    
    Source
    Cataloging and classification quarterly. 7(1987) no.3, S.13-22
  18. Roughton, K.G.: Educating the dinosaur : the evolution of catalog management at the Iowa State University Library (1985) 0.01
    0.0088982 = product of:
      0.026694598 = sum of:
        0.026694598 = product of:
          0.053389195 = sum of:
            0.053389195 = weight(_text_:22 in 364) [ClassicSimilarity], result of:
              0.053389195 = score(doc=364,freq=2.0), product of:
                0.17248978 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.049257044 = queryNorm
                0.30952093 = fieldWeight in 364, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=364)
          0.5 = coord(1/2)
      0.33333334 = coord(1/3)
    
    Date
    7. 1.2007 13:22:11
  19. Striedieck, S.: Online catalog maintenance : the OOPS command in LIAS (1985) 0.01
    0.0077859247 = product of:
      0.023357773 = sum of:
        0.023357773 = product of:
          0.046715546 = sum of:
            0.046715546 = weight(_text_:22 in 366) [ClassicSimilarity], result of:
              0.046715546 = score(doc=366,freq=2.0), product of:
                0.17248978 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.049257044 = queryNorm
                0.2708308 = fieldWeight in 366, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=366)
          0.5 = coord(1/2)
      0.33333334 = coord(1/3)
    
    Date
    7. 1.2007 13:22:30