Search (19 results, page 1 of 1)

  • author_ss:"Taniguchi, S."
  1. Taniguchi, S.: Recording evidence in bibliographic records and descriptive metadata (2005) 0.05
    0.047683053 = product of:
      0.095366105 = sum of:
        0.095366105 = sum of:
          0.053292118 = weight(_text_:cataloging in 3565) [ClassicSimilarity], result of:
            0.053292118 = score(doc=3565,freq=2.0), product of:
              0.20397975 = queryWeight, product of:
                3.9411201 = idf(docFreq=2334, maxDocs=44218)
                0.051756795 = queryNorm
              0.26126182 = fieldWeight in 3565, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.9411201 = idf(docFreq=2334, maxDocs=44218)
                0.046875 = fieldNorm(doc=3565)
          0.042073987 = weight(_text_:22 in 3565) [ClassicSimilarity], result of:
            0.042073987 = score(doc=3565,freq=2.0), product of:
              0.18124348 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.051756795 = queryNorm
              0.23214069 = fieldWeight in 3565, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046875 = fieldNorm(doc=3565)
      0.5 = coord(1/2)
    
    Abstract
    This article proposes recording evidence for data values, in addition to the values themselves, in bibliographic records and descriptive metadata, with the aim of improving the expressiveness and reliability of those records and metadata. Recorded evidence indicates why and how data values are recorded for elements. Recording the history of changes in data values is also proposed, with the aim of reinforcing recorded evidence. First, the evidence that can be recorded is categorized into classes: identifiers of rules or tasks, descriptions of the actions they involve, and their input and output data; the dates on which values and evidence are recorded form an additional class. The relative usefulness of these evidence classes, and of the levels (record, data element, or data value) at which an individual class is applied, is then examined. Second, examples that can be viewed as recorded evidence in existing bibliographic records and current cataloging rules are shown. Third, examples of bibliographic records and descriptive metadata carrying notes of evidence are demonstrated. Fourth, ways of using recorded evidence are addressed.
    Date
    18. 6.2005 13:16:22
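    The indented figures under each result are Lucene "explain" output for the ClassicSimilarity (TF-IDF) ranking used here: per matching term, queryWeight = idf * queryNorm and fieldWeight = tf * idf * fieldNorm, with tf = sqrt(termFreq) and idf = 1 + ln(maxDocs / (docFreq + 1)); the term scores are summed and multiplied by the coord factor. As a rough sketch (the helper names are ours, and queryNorm and fieldNorm are simply copied from the output rather than derived), the following reproduces the 0.047683053 score of the first result:

      import math

      # ClassicSimilarity building blocks as they appear in the explain tree above.
      def idf(doc_freq, max_docs):
          return 1.0 + math.log(max_docs / (doc_freq + 1))

      def term_score(freq, doc_freq, max_docs, query_norm, field_norm):
          query_weight = idf(doc_freq, max_docs) * query_norm                    # idf * queryNorm
          field_weight = math.sqrt(freq) * idf(doc_freq, max_docs) * field_norm  # tf * idf * fieldNorm
          return query_weight * field_weight

      # Figures copied from the explain tree for doc 3565.
      cataloging = term_score(freq=2.0, doc_freq=2334, max_docs=44218,
                              query_norm=0.051756795, field_norm=0.046875)
      term_22 = term_score(freq=2.0, doc_freq=3622, max_docs=44218,
                           query_norm=0.051756795, field_norm=0.046875)
      score = (cataloging + term_22) * 0.5   # coord(1/2): one of two top-level clauses matched
      print(cataloging, term_22, score)      # ~0.053292, ~0.042074, ~0.047683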
  2. Taniguchi, S.: An analysis of orientedness in cataloging rules (1999) 0.03
    0.03108707 = product of:
      0.06217414 = sum of:
        0.06217414 = product of:
          0.12434828 = sum of:
            0.12434828 = weight(_text_:cataloging in 4305) [ClassicSimilarity], result of:
              0.12434828 = score(doc=4305,freq=2.0), product of:
                0.20397975 = queryWeight, product of:
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.051756795 = queryNorm
                0.6096109 = fieldWeight in 4305, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.109375 = fieldNorm(doc=4305)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  3. Taniguchi, S.: Current status of cataloging and classification education in Japan (2005) 0.03
    0.03108707 = product of:
      0.06217414 = sum of:
        0.06217414 = product of:
          0.12434828 = sum of:
            0.12434828 = weight(_text_:cataloging in 5752) [ClassicSimilarity], result of:
              0.12434828 = score(doc=5752,freq=8.0), product of:
                0.20397975 = queryWeight, product of:
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.051756795 = queryNorm
                0.6096109 = fieldWeight in 5752, product of:
                  2.828427 = tf(freq=8.0), with freq of:
                    8.0 = termFreq=8.0
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5752)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    This paper provides an overview of the current status of cataloging and classification (C&C) education in Japan and looks ahead to developments in the near future. First, the current status of library and information science (LIS) education and its major issues are briefly reviewed. Second, the situation of C&C practice in Japanese libraries is briefly reviewed, since it affects C&C education. Third, the present situation and issues in C&C education are examined and described under two categories: education in LIS schools and education in LIS programs offered by other colleges and universities. Finally, on-the-job training and continuing education in the C&C domain are discussed.
    Footnote
    Part of a special issue "Education for cataloging: international perspectives. Part I"
    Source
    Cataloging and classification quarterly. 41(2005) no.2, S.121-133
  4. Taniguchi, S.: Design of cataloging rules using conceptual modeling of cataloging process (2004) 0.03
    0.02937452 = product of:
      0.05874904 = sum of:
        0.05874904 = product of:
          0.11749808 = sum of:
            0.11749808 = weight(_text_:cataloging in 2264) [ClassicSimilarity], result of:
              0.11749808 = score(doc=2264,freq=14.0), product of:
                0.20397975 = queryWeight, product of:
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.051756795 = queryNorm
                0.57602817 = fieldWeight in 2264, product of:
                  3.7416575 = tf(freq=14.0), with freq of:
                    14.0 = termFreq=14.0
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2264)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    This article proposes a method to design cataloging rules by utilizing conceptual modeling of the cataloging process and also by applying the concept "orientedness." It also proposes a general model for the cataloging process at the conceptual level, which is independent of any situation/system or cataloging code. The design method is made up of the following phases, including the development of a general model. Functional and non-functional requirements are first specified by use of orientedness. Also, cataloger tasks are defined, which are constituents of the cataloging process. Second, a core model is built, which consists of (1) basic event patterns under each task, (2) action patterns applicable to each event, and (3) orientedness involved in an event-action pair. Third, the core model is propagated to reflect the characteristics of an individual data element and also a certain class of materials. Finally, the propagated model is defined by choosing pairs of event and action patterns in the model while referring to the orientedness indicated in each event-action pair, in order to match a particular situation. As a result, a set of event-action pairs reflecting specific requirements through categories of orientedness is obtained, and consistent and scalable design can, therefore, be attained.
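    The core model described above lends itself to a simple data-structure reading: cataloger tasks group event patterns, each event pattern is paired with applicable action patterns, and each event-action pair carries the orientedness involved. The sketch below only illustrates that reading; the class and attribute names are our assumptions, not the paper's.

      from dataclasses import dataclass, field

      # Illustrative only: names mirror the abstract's vocabulary, not the paper's own schema.
      @dataclass
      class EventActionPair:
          event: str                 # basic event pattern under a cataloger task
          action: str                # action pattern applicable to that event
          orientedness: frozenset    # categories of orientedness involved in the pair

      @dataclass
      class CoreModel:
          task: str                                      # cataloger task
          pairs: list = field(default_factory=list)      # its event-action pairs

          def propagate(self, element, extra_pairs):
              """Derive a model variant reflecting an individual data element
              or a certain class of materials, as in the third phase above."""
              return CoreModel(task=f"{self.task} / {element}",
                               pairs=self.pairs + list(extra_pairs))

          def select(self, wanted):
              """Choose the pairs whose orientedness matches a particular situation
              (the final phase), yielding the requirement-specific rule design."""
              return [p for p in self.pairs if p.orientedness & set(wanted)]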
  5. Taniguchi, S.: A conceptual model giving primacy to expression-level bibliographic entity in cataloging (2002) 0.02
    0.023076165 = product of:
      0.04615233 = sum of:
        0.04615233 = product of:
          0.09230466 = sum of:
            0.09230466 = weight(_text_:cataloging in 4463) [ClassicSimilarity], result of:
              0.09230466 = score(doc=4463,freq=6.0), product of:
                0.20397975 = queryWeight, product of:
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.051756795 = queryNorm
                0.45251876 = fieldWeight in 4463, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.046875 = fieldNorm(doc=4463)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    This paper proposes a conceptual model for cataloging which gives primacy to expression-level bibliographic entity, with the aim of approaching critical issues in cataloging, such as the so-called "format variations" and "content versus carrier" issues. The term "expression" is defined as "the intellectual or artistic realization of a work in the form of alpha-numeric, musical, or choreographic notation, etc." In this paper, the model by the IFLA Study Group on Functional Requirements for Bibliographic Records (FRBR) is first re-examined and at the same time the outline of a new model giving primacy to expression-level entity is illustrated by indicating differences from the FRBR model. Second, by applying the concept "user tasks," found in the FRBR model, to the new model outlined in this paper, a scenario on how entities are used by users is created. Third, some examples of bibliographic record equivalents in line with the new model are shown.
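    A minimal object sketch, assuming plain FRBR Group 1 entities (work, expression, manifestation, item), may make the shift of primacy concrete: the bibliographic record is anchored at the expression, so carrier-level "format variations" become sibling manifestations under one expression. The attribute names below are illustrative assumptions, not the paper's schema.

      from dataclasses import dataclass, field

      # Illustrative FRBR-style entities; only the anchoring of the record at
      # Expression paraphrases the abstract, the rest is assumed for the example.
      @dataclass
      class Manifestation:
          carrier: str                                   # "content versus carrier": carrier sits here
          items: list = field(default_factory=list)

      @dataclass
      class Expression:
          realization_form: str                          # e.g. alpha-numeric or musical notation
          manifestations: list = field(default_factory=list)

      @dataclass
      class Work:
          title: str
          expressions: list = field(default_factory=list)

      # Under expression-level primacy the record describes this node, and the
      # print and PDF versions are merely alternative carriers of it.
      anchor = Expression(realization_form="text (alpha-numeric notation)",
                          manifestations=[Manifestation(carrier="print volume"),
                                          Manifestation(carrier="PDF file")])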
  6. Taniguchi, S.: A system for analyzing cataloguing rules : a feasibility study (1996) 0.02
    0.02220505 = product of:
      0.0444101 = sum of:
        0.0444101 = product of:
          0.0888202 = sum of:
            0.0888202 = weight(_text_:cataloging in 4198) [ClassicSimilarity], result of:
              0.0888202 = score(doc=4198,freq=8.0), product of:
                0.20397975 = queryWeight, product of:
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.051756795 = queryNorm
                0.43543637 = fieldWeight in 4198, product of:
                  2.828427 = tf(freq=8.0), with freq of:
                    8.0 = termFreq=8.0
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4198)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    The quality control of cataloging standards is as important as the quality control of bibliographic records. In order to aid the quality control of cataloging standards, a prototype system to analyze the ambiguity and complexity of cataloging rules was developed. Before developing the system, a standard rule unit was defined and a simple, function-like format was devised to indicate the syntactic structure of each unit rule. The AACR2 chapter 1 rules were then manually transformed into this function-like, unit-rule format. The system reads the manually transformed unit rules and puts them into their basic forms based on their syntactic components. The system then applies rule-templates, which are skeletal schemata for specific types of cataloging rules, to the converted rules. As a result of this rule-template application, the internal structure of each unit rule is determined. The system is also used to explore inter-rule relationships. That is, the system determines whether two rules have an exclusive, parallel, complementary, or non-relationship. These relationships are based on the analysis of the structural parts described above in terms of the given rule-template. To assist in this process, the system applies external knowledge represented in the same fashion as the rule units themselves. Although the prototype system can handle only a restricted range of rules, the proposed approach is validated and shown to be useful. However, it is possibly impractical to build a complete rule-analyzing system of this type at this stage.
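    Read as data, the abstract describes unit rules in a function-like format, rule-templates as skeletal schemata matched against them, and a four-way classification of inter-rule relationships. The sketch below only illustrates those shapes; the field names and, in particular, the concrete tests used to decide each relationship are placeholder assumptions, not the prototype's logic.

      from dataclasses import dataclass

      # Placeholder shapes for unit rules; the relationship tests are invented
      # stand-ins, not the paper's rule-template analysis.
      @dataclass(frozen=True)
      class UnitRule:
          rule_id: str           # e.g. an AACR2 chapter 1 rule number
          condition: frozenset   # syntactic components stating when the rule applies
          action: str            # what the rule prescribes

      def relate(a, b):
          """Classify two unit rules as exclusive, parallel, complementary,
          or non-relationship (the categories named in the abstract)."""
          overlap = bool(a.condition & b.condition)
          if a.action == b.action:
              return "parallel" if overlap else "complementary"
          return "exclusive" if overlap else "non-relationship"

      r1 = UnitRule("1.1B1", frozenset({"title proper", "chief source"}), "transcribe as found")
      r2 = UnitRule("1.1B2", frozenset({"title proper", "no chief source"}), "supply a title")
      print(relate(r1, r2))   # "exclusive" under these placeholder tests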
  7. Taniguchi, S.: Mapping and merging of IFLA Library Reference Model and BIBFRAME 2.0 (2018) 0.02
    0.02220505 = product of:
      0.0444101 = sum of:
        0.0444101 = product of:
          0.0888202 = sum of:
            0.0888202 = weight(_text_:cataloging in 5162) [ClassicSimilarity], result of:
              0.0888202 = score(doc=5162,freq=2.0), product of:
                0.20397975 = queryWeight, product of:
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.051756795 = queryNorm
                0.43543637 = fieldWeight in 5162, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.078125 = fieldNorm(doc=5162)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Cataloging and classification quarterly. 56(2018) no.5/6, S.354-373
  8. Taniguchi, S.: Conceptual modeling of component parts of bibliographic resources in cataloging (2003) 0.02
    0.01884161 = product of:
      0.03768322 = sum of:
        0.03768322 = product of:
          0.07536644 = sum of:
            0.07536644 = weight(_text_:cataloging in 4442) [ClassicSimilarity], result of:
              0.07536644 = score(doc=4442,freq=4.0), product of:
                0.20397975 = queryWeight, product of:
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.051756795 = queryNorm
                0.36948 = fieldWeight in 4442, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.046875 = fieldNorm(doc=4442)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    This paper examines differences in modeling component parts of bibliographic resources between two conceptual models in cataloging, as a continuation of the previous study that proposed a model giving primacy to expression-level bibliographic entity. First, the model by the IFLA Study Group on Functional Requirements for Bibliographic Records (FRBR) was examined from the viewpoint of modeling component parts when each part in itself is a resource to be described. The examination is done on two types of component parts, a content part and a document part, which differ in whether they are physically independent. This results in different structures for these two component types. Second, by applying the same viewpoint to the model that the author proposed earlier, it becomes clear that both component types can be modeled basically in the same manner, indicating that, in this respect, the proposed model is more consistent than the FRBR model.
  9. Taniguchi, S.: Understanding RDA as a DC application profile (2013) 0.02
    0.01776404 = product of:
      0.03552808 = sum of:
        0.03552808 = product of:
          0.07105616 = sum of:
            0.07105616 = weight(_text_:cataloging in 1906) [ClassicSimilarity], result of:
              0.07105616 = score(doc=1906,freq=2.0), product of:
                0.20397975 = queryWeight, product of:
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.051756795 = queryNorm
                0.3483491 = fieldWeight in 1906, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.0625 = fieldNorm(doc=1906)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Cataloging and classification quarterly. 51(2013) no.6, S.601-623
  10. Taniguchi, S.: Expression-level bibliographic entity records : a trial on creation from pre-existing MARC records (2004) 0.02
    0.015543535 = product of:
      0.03108707 = sum of:
        0.03108707 = product of:
          0.06217414 = sum of:
            0.06217414 = weight(_text_:cataloging in 5658) [ClassicSimilarity], result of:
              0.06217414 = score(doc=5658,freq=2.0), product of:
                0.20397975 = queryWeight, product of:
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.051756795 = queryNorm
                0.30480546 = fieldWeight in 5658, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5658)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Cataloging and classification quarterly. 38(2004) no.2, S.xx-xx
  11. Tokita, T.; Koto, M.; Miyata, Y.; Yokoyama, Y.; Taniguchi, S.; Ueda, S.: Identifying works for Japanese classics toward construction of FRBRized OPACs (2012) 0.02
    0.015543535 = product of:
      0.03108707 = sum of:
        0.03108707 = product of:
          0.06217414 = sum of:
            0.06217414 = weight(_text_:cataloging in 1925) [ClassicSimilarity], result of:
              0.06217414 = score(doc=1925,freq=2.0), product of:
                0.20397975 = queryWeight, product of:
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.051756795 = queryNorm
                0.30480546 = fieldWeight in 1925, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1925)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Cataloging and classification quarterly. 50(2012) no.5/7, S.670-687
  12. Taniguchi, S.: Viewing RDA from FRBR and FRAD : does RDA represent a different conceptual model? (2012) 0.02
    0.015543535 = product of:
      0.03108707 = sum of:
        0.03108707 = product of:
          0.06217414 = sum of:
            0.06217414 = weight(_text_:cataloging in 1936) [ClassicSimilarity], result of:
              0.06217414 = score(doc=1936,freq=2.0), product of:
                0.20397975 = queryWeight, product of:
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.051756795 = queryNorm
                0.30480546 = fieldWeight in 1936, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1936)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Cataloging and classification quarterly. 50(2012) no.8, S.929-943
  13. Taniguchi, S.: Aggregate and component entities in RDA : model and description (2013) 0.02
    0.015543535 = product of:
      0.03108707 = sum of:
        0.03108707 = product of:
          0.06217414 = sum of:
            0.06217414 = weight(_text_:cataloging in 1951) [ClassicSimilarity], result of:
              0.06217414 = score(doc=1951,freq=2.0), product of:
                0.20397975 = queryWeight, product of:
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.051756795 = queryNorm
                0.30480546 = fieldWeight in 1951, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1951)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Cataloging and classification quarterly. 51(2013) no.5, S.580-599
  14. Taniguchi, S.: User tasks in the RDA-based model (2013) 0.02
    0.015543535 = product of:
      0.03108707 = sum of:
        0.03108707 = product of:
          0.06217414 = sum of:
            0.06217414 = weight(_text_:cataloging in 1956) [ClassicSimilarity], result of:
              0.06217414 = score(doc=1956,freq=2.0), product of:
                0.20397975 = queryWeight, product of:
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.051756795 = queryNorm
                0.30480546 = fieldWeight in 1956, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1956)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Cataloging and classification quarterly. 51(2013) no.7, S.788-815
  15. Taniguchi, S.: Modeling resource description tasks in RDA (2015) 0.02
    0.015543535 = product of:
      0.03108707 = sum of:
        0.03108707 = product of:
          0.06217414 = sum of:
            0.06217414 = weight(_text_:cataloging in 2013) [ClassicSimilarity], result of:
              0.06217414 = score(doc=2013,freq=2.0), product of:
                0.20397975 = queryWeight, product of:
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.051756795 = queryNorm
                0.30480546 = fieldWeight in 2013, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=2013)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Cataloging and classification quarterly. 53(2015) no.1, S.88-111
  16. Taniguchi, S.: Examining BIBFRAME 2.0 from the viewpoint of RDA metadata schema (2017) 0.02
    0.015543535 = product of:
      0.03108707 = sum of:
        0.03108707 = product of:
          0.06217414 = sum of:
            0.06217414 = weight(_text_:cataloging in 5154) [ClassicSimilarity], result of:
              0.06217414 = score(doc=5154,freq=2.0), product of:
                0.20397975 = queryWeight, product of:
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.051756795 = queryNorm
                0.30480546 = fieldWeight in 5154, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5154)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Cataloging and classification quarterly. 55(2017) no.6, S.387-412
  17. Taniguchi, S.: Is BIBFRAME 2.0 a suitable schema for exchanging and sharing diverse descriptive metadata about bibliographic resources? (2018) 0.02
    0.015543535 = product of:
      0.03108707 = sum of:
        0.03108707 = product of:
          0.06217414 = sum of:
            0.06217414 = weight(_text_:cataloging in 5165) [ClassicSimilarity], result of:
              0.06217414 = score(doc=5165,freq=2.0), product of:
                0.20397975 = queryWeight, product of:
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.051756795 = queryNorm
                0.30480546 = fieldWeight in 5165, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5165)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Cataloging and classification quarterly. 56(2018) no.1, S.40-61
  18. Taniguchi, S.: Data provenance and administrative information in library linked data : reviewing RDA in RDF, BIBFRAME, and Wikidata (2024) 0.02
    0.015543535 = product of:
      0.03108707 = sum of:
        0.03108707 = product of:
          0.06217414 = sum of:
            0.06217414 = weight(_text_:cataloging in 1154) [ClassicSimilarity], result of:
              0.06217414 = score(doc=1154,freq=2.0), product of:
                0.20397975 = queryWeight, product of:
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.051756795 = queryNorm
                0.30480546 = fieldWeight in 1154, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1154)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Cataloging and classification quarterly. 61(2023) no.1, S.67-90
  19. Taniguchi, S.: A system for supporting evidence recording in bibliographic records (2006) 0.01
    0.011102525 = product of:
      0.02220505 = sum of:
        0.02220505 = product of:
          0.0444101 = sum of:
            0.0444101 = weight(_text_:cataloging in 282) [ClassicSimilarity], result of:
              0.0444101 = score(doc=282,freq=2.0), product of:
                0.20397975 = queryWeight, product of:
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.051756795 = queryNorm
                0.21771818 = fieldWeight in 282, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9411201 = idf(docFreq=2334, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=282)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Recording evidence for data values, in addition to the values themselves, in bibliographic records and descriptive metadata has been proposed in a previous study. Recorded evidence indicates why and how data values are recorded for elements. As a continuation of that study, this article first proposes a scenario in which a cataloger and a system interact with each other in recording evidence in bibliographic records for books, with the aim of minimizing costs and effort in recording evidence. Second, it reports on prototype system development in accordance with the scenario. The system (1) searches a string, corresponding to the data value entered by a cataloger or extracted from the Machine Readable Cataloging (MARC) record, within the scanned and optical character recognition (OCR)-converted title page and verso of the title page of an item being cataloged; (2) identifies the place where the string appears within the source of information; (3) identifies the procedure being used to form the value entered or recorded; and finally (4) displays the place and procedure identified for the data value as its candidate evidence. Third, this study reports on an experiment conducted to examine the system's performance. The results of the experiment show the usefulness of the system and the validity of the proposed scenario.
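    A rough sketch of the matching step in the scenario above, assuming nothing about the prototype beyond what the abstract states: a data value is searched for in OCR text of the title page and its verso, and the place plus an inferred forming procedure are reported as candidate evidence. The function name, the source labels, and the two procedure labels are our placeholders.

      import re

      def candidate_evidence(value, sources):
          """Locate a data value within OCR-converted chief sources and report
          where it occurs and how it appears to have been formed."""
          loose = re.compile(r"\W+".join(map(re.escape, value.split())), re.IGNORECASE)
          candidates = []
          for place, ocr_text in sources.items():        # e.g. "title page", "verso of title page"
              exact = ocr_text.lower().find(value.lower())
              fuzzy = loose.search(ocr_text)
              if exact >= 0:
                  candidates.append({"place": place, "offset": exact,
                                     "procedure": "transcribed as found"})
              elif fuzzy:
                  candidates.append({"place": place, "offset": fuzzy.start(),
                                     "procedure": "transcribed with spacing/punctuation normalized"})
          return candidates

      # Example: check a title proper against (invented) OCR text of the title page.
      print(candidate_evidence("Recording evidence in bibliographic records",
                               {"title page": "RECORDING  EVIDENCE IN BIBLIOGRAPHIC RECORDS / S. Taniguchi",
                                "verso of title page": "(c) 2005 ..."}))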