Search (6 results, page 1 of 1)

  • author_ss:"Vizine-Goetz, D."
  • year_i:[2000 TO 2010}
  1. Vizine-Goetz, D.; Hickey, C.; Houghton, A.; Thompson, R.: Vocabulary mapping for terminology services (2004) 0.03
    0.025896767 = product of:
      0.116535455 = sum of:
        0.0921318 = weight(_text_:readable in 918) [ClassicSimilarity], result of:
          0.0921318 = score(doc=918,freq=2.0), product of:
            0.2262076 = queryWeight, product of:
              6.1439276 = idf(docFreq=257, maxDocs=44218)
              0.036818076 = queryNorm
            0.4072887 = fieldWeight in 918, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              6.1439276 = idf(docFreq=257, maxDocs=44218)
              0.046875 = fieldNorm(doc=918)
        0.024403658 = weight(_text_:data in 918) [ClassicSimilarity], result of:
          0.024403658 = score(doc=918,freq=2.0), product of:
            0.11642061 = queryWeight, product of:
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.036818076 = queryNorm
            0.2096163 = fieldWeight in 918, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.046875 = fieldNorm(doc=918)
      0.22222222 = coord(2/9)
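    The explain tree above is Lucene's ClassicSimilarity (TF-IDF) breakdown. As a sanity check, the reported score can be reproduced from the constants shown: each term contributes queryWeight × fieldWeight, where queryWeight = idf × queryNorm and fieldWeight = √freq × idf × fieldNorm, and the sum is scaled by the coord factor (2 of 9 query clauses matched). A minimal sketch, using only the numbers from the tree:

    ```python
    import math

    # Reproduce the document score in the explain tree above using Lucene's
    # ClassicSimilarity formula. All constants are copied from the tree:
    #   score = coord * sum_over_terms(queryWeight * fieldWeight)
    #   queryWeight = idf * queryNorm
    #   fieldWeight = sqrt(freq) * idf * fieldNorm

    def term_score(freq, idf, query_norm, field_norm):
        query_weight = idf * query_norm
        field_weight = math.sqrt(freq) * idf * field_norm
        return query_weight * field_weight

    QUERY_NORM = 0.036818076  # shared queryNorm from the explain output

    readable = term_score(freq=2.0, idf=6.1439276,
                          query_norm=QUERY_NORM, field_norm=0.046875)
    data = term_score(freq=2.0, idf=3.1620505,
                      query_norm=QUERY_NORM, field_norm=0.046875)

    coord = 2 / 9  # coord(2/9): 2 of 9 query clauses matched
    doc_score = coord * (readable + data)

    print(f"{doc_score:.9f}")  # close to the 0.025896767 reported above
    ```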
    
    Abstract
    The paper describes a project to add value to controlled vocabularies by making inter-vocabulary associations. A methodology for mapping terms from one vocabulary to another is presented in the form of a case study applying the approach to the Educational Resources Information Center (ERIC) Thesaurus and the Library of Congress Subject Headings (LCSH). Our approach to mapping involves encoding vocabularies according to Machine-Readable Cataloging (MARC) standards, machine matching of vocabulary terms, and categorizing candidate mappings by likelihood of valid mapping. Mapping data is then stored as machine links. Vocabularies with associations to other schemes will be a key component of Web-based terminology services. The paper briefly describes how the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH) is used to provide access to a vocabulary with mappings.
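    The workflow the abstract describes (encode the vocabularies, machine-match terms, categorize candidate mappings by likelihood) can be sketched roughly as follows. The normalization rules and the "exact"/"partial" confidence tiers are illustrative assumptions, not the project's actual ERIC-to-LCSH matching algorithm:

    ```python
    # Illustrative sketch of inter-vocabulary term matching, loosely following
    # the abstract's workflow: normalize terms, machine-match across vocabularies,
    # then categorize candidate mappings by match quality. The normalization and
    # the confidence categories are assumptions, not the actual project rules.

    def normalize(term: str) -> str:
        """Reduce a heading to a comparable key: lowercase, strip punctuation."""
        kept = "".join(ch for ch in term.lower() if ch.isalnum() or ch.isspace())
        return " ".join(kept.split())  # collapse runs of whitespace

    def match_vocabularies(source_terms, target_terms):
        """Return (source, target, category) candidate mappings."""
        target_index = {normalize(t): t for t in target_terms}
        candidates = []
        for s in source_terms:
            key = normalize(s)
            if key in target_index:
                # Exact match after normalization: highest-likelihood candidate.
                candidates.append((s, target_index[key], "exact"))
            else:
                # Weaker signal for human review: shared leading word.
                head = key.split()[0] if key else ""
                for tkey, t in target_index.items():
                    if head and tkey.split()[0] == head:
                        candidates.append((s, t, "partial"))
        return candidates

    eric = ["Academic Achievement", "Educational Technology"]
    lcsh = ["Academic achievement", "Educational technology -- Study and teaching"]
    print(match_vocabularies(eric, lcsh))
    ```

    A real implementation would work against the MARC-encoded authority records rather than bare strings, and the "likelihood of valid mapping" would come from richer evidence than a shared leading word.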
  2. O'Neill, E.T.; Chan, L.M.; Childress, E.; Dean, R.; El-Hoshy, L.M.; Vizine-Goetz, D.: Form subdivisions : their identification and use in LCSH (2001) 0.01
    0.011545782 = product of:
      0.051956017 = sum of:
        0.036990993 = weight(_text_:bibliographic in 2205) [ClassicSimilarity], result of:
          0.036990993 = score(doc=2205,freq=2.0), product of:
            0.14333439 = queryWeight, product of:
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.036818076 = queryNorm
            0.2580748 = fieldWeight in 2205, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.046875 = fieldNorm(doc=2205)
        0.014965023 = product of:
          0.029930046 = sum of:
            0.029930046 = weight(_text_:22 in 2205) [ClassicSimilarity], result of:
              0.029930046 = score(doc=2205,freq=2.0), product of:
                0.12893063 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.036818076 = queryNorm
                0.23214069 = fieldWeight in 2205, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2205)
          0.5 = coord(1/2)
      0.22222222 = coord(2/9)
    
    Abstract
    Form subdivisions have always been an important part of the Library of Congress Subject Headings. However, when the MARC format was developed, no separate subfield code to identify form subdivisions was defined. Form and topical subdivisions were both included within a general subdivision category. In 1995, the USMARC Advisory Group approved a proposal defining subfield v for form subdivisions, and in 1999 the Library of Congress (LC) began identifying form subdivisions with the new code. However, there are millions of older bibliographic records lacking the explicit form subdivision coding. Identifying form subdivisions retrospectively is not a simple task. An algorithmic method was developed to identify form subdivisions coded as general subdivisions. The algorithm was used to identify 2,563 unique form subdivisions or combinations of form subdivisions in OCLC's WorldCat. The algorithm proved to be highly accurate with an error rate estimated to be less than 0.1%. The observed usage of the form subdivisions was highly skewed with the 100 most used form subdivisions or combinations of subdivisions accounting for 90% of the assignments.
    Date
    10. 9.2000 17:38:22
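    The retrospective identification described in the abstract amounts to scanning pre-1999 subject headings and recoding general ($x) subdivisions as form ($v) subdivisions when they match known form terms. A minimal sketch, with a toy known-forms list standing in for the actual authority data:

    ```python
    # Simplified sketch of retrospectively flagging form subdivisions that were
    # coded as general ($x) subdivisions before subfield $v was defined in 1995.
    # The known-forms set is a tiny illustrative sample, not the algorithm's
    # actual authority data.

    KNOWN_FORM_SUBDIVISIONS = {"periodicals", "bibliography",
                               "congresses", "dictionaries"}

    def recode_form_subdivisions(subfields):
        """subfields: list of (code, value) pairs from a 6XX heading field.
        Returns the list with $x subdivisions recoded to $v when the value
        matches a known form subdivision."""
        out = []
        for code, value in subfields:
            if code == "x" and value.lower().rstrip(".") in KNOWN_FORM_SUBDIVISIONS:
                out.append(("v", value))  # recode as form subdivision
            else:
                out.append((code, value))
        return out

    heading = [("a", "Chemistry"), ("x", "Periodicals.")]
    print(recode_form_subdivisions(heading))
    # $x Periodicals is recoded as $v, matching post-1999 practice
    ```

    The paper's actual algorithm must of course distinguish headings like "Periodicals" used topically from the same string used as a form, which is what makes the reported error rate of under 0.1% notable.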
  3. O'Neill, E.T.; Childress, E.; Dean, R.; Kammerer, K.; Vizine-Goetz, D.; Chan, L.M.; El-Hoshy, L.: FAST: faceted application of subject terminology (2003) 0.00
    0.0034250922 = product of:
      0.03082583 = sum of:
        0.03082583 = weight(_text_:bibliographic in 3816) [ClassicSimilarity], result of:
          0.03082583 = score(doc=3816,freq=2.0), product of:
            0.14333439 = queryWeight, product of:
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.036818076 = queryNorm
            0.21506234 = fieldWeight in 3816, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3816)
      0.11111111 = coord(1/9)
    
    Abstract
    The Library of Congress Subject Headings schema (LCSH) is by far the most commonly used and widely accepted subject vocabulary for general application. It is the de facto universal controlled vocabulary and has served as a model for the development of subject heading systems in many countries. However, LCSH's complex syntax and rules for constructing headings restrict its application by requiring highly skilled personnel and limit the effectiveness of automated authority control. Recent trends, driven to a large extent by the rapid growth of the Web, are forcing changes in bibliographic control systems to make them easier to use, understand, and apply, and subject headings are no exception. The purpose of adapting the LCSH with a simplified syntax to create FAST is to retain the very rich vocabulary of LCSH while making the schema easier to understand, control, apply, and use. The schema maintains upward compatibility with LCSH, and any valid set of LC subject headings can be converted to FAST headings.
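    The core idea of FAST (decomposing a precoordinated LCSH string into separately controlled facets) can be sketched as below. The facet-detection rules here are toy heuristics for illustration only; real FAST conversion relies on authority data, and FAST defines more facets (including geographic and name facets) than this sketch shows:

    ```python
    # Toy sketch of the faceting idea behind FAST: split a precoordinated LCSH
    # heading on "--" and sort the pieces into facets. The detection rules and
    # facet set are simplified assumptions, not the actual FAST conversion.

    FORM_TERMS = {"Periodicals", "Congresses", "Maps"}
    CHRONOLOGICAL_PREFIXES = ("16", "17", "18", "19", "20")

    def facet_heading(lcsh_heading: str) -> dict:
        parts = [p.strip() for p in lcsh_heading.split("--")]
        facets = {"topical": [], "chronological": [], "form": []}
        for part in parts:
            if part in FORM_TERMS:
                facets["form"].append(part)
            elif part[:2] in CHRONOLOGICAL_PREFIXES and any(c.isdigit() for c in part):
                facets["chronological"].append(part)
            else:
                facets["topical"].append(part)
        return facets

    print(facet_heading("Chemistry -- History -- 19th century -- Periodicals"))
    ```

    Because each facet is controlled independently, the faceted form sidesteps LCSH's heading-construction rules, which is what makes automated authority control against FAST more tractable.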
  4. Chan, L.M.; Childress, E.; Dean, R.; O'Neill, E.T.; Vizine-Goetz, D.: A faceted approach to subject data in the Dublin Core metadata record (2001) 0.00
    0.0031634374 = product of:
      0.028470935 = sum of:
        0.028470935 = weight(_text_:data in 6109) [ClassicSimilarity], result of:
          0.028470935 = score(doc=6109,freq=2.0), product of:
            0.11642061 = queryWeight, product of:
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.036818076 = queryNorm
            0.24455236 = fieldWeight in 6109, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.1620505 = idf(docFreq=5088, maxDocs=44218)
              0.0546875 = fieldNorm(doc=6109)
      0.11111111 = coord(1/9)
    
  5. Vizine-Goetz, D.: Dewey research : new uses for the DDC (2001) 0.00
    0.0027713005 = product of:
      0.024941705 = sum of:
        0.024941705 = product of:
          0.04988341 = sum of:
            0.04988341 = weight(_text_:22 in 190) [ClassicSimilarity], result of:
              0.04988341 = score(doc=190,freq=2.0), product of:
                0.12893063 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.036818076 = queryNorm
                0.38690117 = fieldWeight in 190, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.078125 = fieldNorm(doc=190)
          0.5 = coord(1/2)
      0.11111111 = coord(1/9)
    
    Date
    22. 6.2002 19:32:34
  6. Vizine-Goetz, D.; Beall, J.: Using literary warrant to define a version of the DDC for automated classification services (2004) 0.00
    0.0019399103 = product of:
      0.017459193 = sum of:
        0.017459193 = product of:
          0.034918386 = sum of:
            0.034918386 = weight(_text_:22 in 2645) [ClassicSimilarity], result of:
              0.034918386 = score(doc=2645,freq=2.0), product of:
                0.12893063 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.036818076 = queryNorm
                0.2708308 = fieldWeight in 2645, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=2645)
          0.5 = coord(1/2)
      0.11111111 = coord(1/9)
    
    Object
    DDC-22