Search (1 result, page 1 of 1)

  • author_ss:"Cousins, S.A."
  • theme_ss:"Verbale Doksprachen im Online-Retrieval" [verbal indexing languages in online retrieval]
  1. Cousins, S.A.: Enhancing subject access to OPACs : controlled vocabulary vs. natural language (1992) [relevance score: 0.00]
    0.0017847219 = product of:
      0.0071388874 = sum of:
        0.0071388874 = weight(_text_:information in 2230) [ClassicSimilarity], result of:
          0.0071388874 = score(doc=2230,freq=2.0), product of:
            0.06134496 = queryWeight, product of:
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.034944877 = queryNorm
            0.116372846 = fieldWeight in 2230, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.046875 = fieldNorm(doc=2230)
      0.25 = coord(1/4)
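    The breakdown above is Lucene's "explain" output for ClassicSimilarity (TF-IDF) scoring of the term "information" in document 2230. As a rough illustration, the sketch below recomputes the displayed score from the listed factors; the variable names are mine, and only the numbers shown in the explanation are used.

    ```python
    import math

    # Values taken directly from the ClassicSimilarity explanation above.
    freq = 2.0              # occurrences of "information" in doc 2230
    doc_freq = 20772        # documents containing the term
    max_docs = 44218        # documents in the index
    query_norm = 0.034944877
    field_norm = 0.046875
    coord = 0.25            # 1 of 4 query clauses matched

    tf = math.sqrt(freq)                             # 1.4142135
    idf = 1.0 + math.log(max_docs / (doc_freq + 1))  # ~1.7554779
    query_weight = idf * query_norm                  # ~0.06134496
    field_weight = tf * idf * field_norm             # ~0.116372846
    score = query_weight * field_weight * coord      # ~0.0017847219

    print(f"{score:.10f}")  # close to the 0.0017847219 shown above
    ```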
    
    Abstract
    Experimental evidence suggests that enhancing the subject content of OPAC records can improve retrieval performance. This is based on the use of natural language index terms derived from the table of contents and back-of-the-book index of documents. The research reported here investigates the alternative approach of translating these natural language terms into controlled vocabulary. Subject queries were collected by interview at the catalogue, and indexing of the queries demonstrated the impressive ability of PRECIS, and to a lesser extent LCSH, to represent users' information needs. DDC performed poorly in this respect. The assumption was made that an index language adequately specific to represent users' queries should be adequate to represent document contents. Searches were carried out on three test databases, and both natural language and PRECIS enhancement of MARC records increased the number of relevant documents found, with PRECIS showing the better performance. However, with weak stemming the advantage of PRECIS was lost. Consideration must also be given to the potential advantages of controlled vocabulary, over and above basic retrieval performance measures.
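    Purely as an illustration of the enhancement idea described in the abstract, the hypothetical sketch below shows how adding natural language terms (from contents pages and back-of-book indexes) or PRECIS-style controlled terms to a catalogue record can let a subject query match, and how weak stemming can erase the difference between the two strategies. The records, terms, query, and stemmer are invented for the example and are not taken from the study.

    ```python
    # Hypothetical sketch: the records, index terms, query, and stemmer are
    # invented to illustrate the enhancement idea, not taken from the study.

    def weak_stem(word: str) -> str:
        """Very crude 'weak stemming': strip a final plural 's'."""
        return word[:-1] if word.endswith("s") and len(word) > 3 else word

    def hits(query: str, terms: list[str], stem: bool = False) -> bool:
        """True if any query word appears among the record's indexed words."""
        norm = weak_stem if stem else (lambda w: w)
        q = {norm(w.lower()) for w in query.split()}
        indexed = {norm(w.lower()) for t in terms for w in t.split()}
        return bool(q & indexed)

    # A bare MARC-derived record vs. two enhanced versions of the same record.
    bare    = ["enhancing subject access to opacs"]   # title words only
    natural = bare + ["searching the catalogue"]      # term lifted from the contents page
    precis  = bare + ["online catalogues"]            # controlled-vocabulary heading

    query = "online catalogues"
    for name, rec in [("bare", bare), ("natural language", natural), ("PRECIS", precis)]:
        print(f"{name:17s} unstemmed={hits(query, rec)}  weak-stemmed={hits(query, rec, stem=True)}")
    ```

    Run as written, the bare record misses the query, the PRECIS-enhanced record matches without stemming, and the naturally enhanced record matches only once weak stemming is applied, mirroring the pattern reported above: both enhancements help, PRECIS helps more, and weak stemming removes its edge.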