Search (12 results, page 1 of 1)

  • theme_ss:"Konzeption und Anwendung des Prinzips Thesaurus"
  • type_ss:"a"
  • year_i:[2000 TO 2010}
  1. Qin, J.; Paling, S.: Converting a controlled vocabulary into an ontology : the case of GEM (2001) 0.02
    0.01845152 = product of:
      0.03690304 = sum of:
        0.03690304 = product of:
          0.07380608 = sum of:
            0.07380608 = weight(_text_:22 in 3895) [ClassicSimilarity], result of:
              0.07380608 = score(doc=3895,freq=2.0), product of:
                0.15896842 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045395818 = queryNorm
                0.46428138 = fieldWeight in 3895, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.09375 = fieldNorm(doc=3895)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    24. 8.2005 19:20:22
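    The indented figures beneath each hit are Lucene ClassicSimilarity (TF-IDF) explain output. As a cross-check, a minimal Python sketch can reproduce the arithmetic for this first hit, assuming the standard ClassicSimilarity formulas tf = sqrt(freq) and idf = 1 + ln(maxDocs/(docFreq+1)); the queryNorm, fieldNorm, and the two coord(1/2) factors are read off the explain tree rather than recomputed:

      import math

      # Values taken from the explain tree of hit 1 (term "22" in doc 3895).
      freq = 2.0                # termFreq: occurrences of the term in the field
      doc_freq = 3622           # docFreq: documents containing the term
      max_docs = 44218          # maxDocs: documents in the index
      query_norm = 0.045395818  # queryNorm, read from the tree
      field_norm = 0.09375      # fieldNorm(doc=3895), read from the tree

      tf = math.sqrt(freq)                           # 1.4142135
      idf = 1 + math.log(max_docs / (doc_freq + 1))  # 3.5018296
      query_weight = idf * query_norm                # 0.15896842 = queryWeight
      field_weight = tf * idf * field_norm           # 0.46428138 = fieldWeight
      term_score = query_weight * field_weight       # 0.07380608

      # The nested boolean query applies coord(1/2) at two levels of the tree.
      final_score = term_score * 0.5 * 0.5           # 0.01845152
      print(round(final_score, 8))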
  2. Wang, J.: Automatic thesaurus development : term extraction from title metadata (2006) 0.01
    0.0134376865 = product of:
      0.026875373 = sum of:
        0.026875373 = product of:
          0.053750746 = sum of:
            0.053750746 = weight(_text_:bibliographic in 5063) [ClassicSimilarity], result of:
              0.053750746 = score(doc=5063,freq=4.0), product of:
                0.17672792 = queryWeight, product of:
                  3.893044 = idf(docFreq=2449, maxDocs=44218)
                  0.045395818 = queryNorm
                0.30414405 = fieldWeight in 5063, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.893044 = idf(docFreq=2449, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5063)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    The application of thesauri in networked environments is seriously hampered by the challenges of introducing new concepts and terminology into the formal controlled vocabulary, which is critical for enhancing its retrieval capability. The author describes an automated process of adding new terms to thesauri as entry vocabulary by analyzing the association between words/phrases extracted from bibliographic titles and subject descriptors in the metadata record (subject descriptors are terms assigned from controlled vocabularies of thesauri to describe the subjects of the objects [e.g., books, articles] represented by the metadata records). The investigated approach uses a corpus of metadata for scientific and technical (S&T) publications in which the titles contain substantive words for key topics. The three steps of the method are (a) extracting words and phrases from the title field of the metadata; (b) applying a method to identify and select the specific and meaningful keywords based on the associated controlled vocabulary terms from the thesaurus used to catalog the objects; and (c) inserting selected keywords into the thesaurus as new terms (most of them are in hierarchical relationships with the existing concepts), thereby updating the thesaurus with new terminology that is being used in the literature. The effectiveness of the method was demonstrated by an experiment with the Chinese Classification Thesaurus (CCT) and bibliographic data in China Machine-Readable Cataloging Record (MARC) format (CNMARC) provided by Peking University Library. This approach is equally effective in large-scale collections and in other languages.
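    The three-step procedure in this abstract can be illustrated with a small sketch. The Python fragment below is not Wang's implementation; the record layout, the bigram-based phrase extraction, and the co-occurrence threshold are all assumptions made only to show the shape of steps (a) to (c):

      import re
      from collections import defaultdict

      # Hypothetical metadata: each record has a title and the controlled
      # descriptors assigned from the thesaurus.
      records = [
          {"title": "Ontology-based retrieval of gene sequences",
           "descriptors": ["Information retrieval", "Genetics"]},
          {"title": "Gene sequence alignment with suffix trees",
           "descriptors": ["Genetics", "Algorithms"]},
      ]
      thesaurus = {"Information retrieval", "Genetics", "Algorithms"}

      def candidate_phrases(title):
          # Step (a): extract lower-cased word bigrams from the title
          # (a crude stand-in for real phrase extraction).
          words = re.findall(r"[a-z]+", title.lower())
          return {" ".join(words[i:i + 2]) for i in range(len(words) - 1)}

      # Step (b): record which controlled descriptors each candidate co-occurs with.
      cooccurrence = defaultdict(set)
      for rec in records:
          for phrase in candidate_phrases(rec["title"]):
              cooccurrence[phrase].update(rec["descriptors"])

      # Step (c): candidates linked to at least two distinct descriptors
      # (an assumed threshold) become new entry terms in the thesaurus.
      for phrase, descriptors in sorted(cooccurrence.items()):
          if phrase not in thesaurus and len(descriptors) >= 2:
              print(f"new entry term: {phrase!r} -> related to {sorted(descriptors)}")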
  3. Riesthuis, G.J.A.: Multilingual subject access and the Guidelines for the establishment and development of multilingual thesauri : an experimental study (2000) 0.01
    0.011402255 = product of:
      0.02280451 = sum of:
        0.02280451 = product of:
          0.04560902 = sum of:
            0.04560902 = weight(_text_:bibliographic in 131) [ClassicSimilarity], result of:
              0.04560902 = score(doc=131,freq=2.0), product of:
                0.17672792 = queryWeight, product of:
                  3.893044 = idf(docFreq=2449, maxDocs=44218)
                  0.045395818 = queryNorm
                0.2580748 = fieldWeight in 131, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.893044 = idf(docFreq=2449, maxDocs=44218)
                  0.046875 = fieldNorm(doc=131)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    In this paper, after an introduction to the problems of multilingual information languages, the rules and recommendations of the Guidelines for the establishment and development of multilingual thesauri concerning non-equivalence and partial equivalence of terms in different languages are discussed. Artificial terms are not very useful in searching, because most users are not willing to use a thesaurus to find the right descriptor. On the other hand, indexers need guidance on how to index and therefore need a thesaurus with all desirable and necessary relations. It is suggested that bibliographic online systems can take over some of the thesaurus's functions for the searcher, and that a few new relations could be helpful to the indexer.
  4. Kuhr, P.S.: Putting the world back together : mapping multiple vocabularies into a single thesaurus (2003) 0.01
    0.011402255 = product of:
      0.02280451 = sum of:
        0.02280451 = product of:
          0.04560902 = sum of:
            0.04560902 = weight(_text_:bibliographic in 3813) [ClassicSimilarity], result of:
              0.04560902 = score(doc=3813,freq=2.0), product of:
                0.17672792 = queryWeight, product of:
                  3.893044 = idf(docFreq=2449, maxDocs=44218)
                  0.045395818 = queryNorm
                0.2580748 = fieldWeight in 3813, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.893044 = idf(docFreq=2449, maxDocs=44218)
                  0.046875 = fieldNorm(doc=3813)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    This paper describes an ongoing project in which the subject headings contained in twelve controlled vocabularies, covering multiple disciplines from the humanities to the sciences and including law and education among others, are being collapsed into a single vocabulary and reference structure. The design of the database, the algorithms created to programmatically link like-concepts, and daily maintenance are detailed. The problems and pitfalls of dealing with multiple vocabularies are noted, as well as the difficulties in relying purely on computer-generated algorithms. The application of this megathesaurus to bibliographic records and the methodology of retrieval are explained.
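    The abstract does not give the linking algorithms themselves; the sketch below only illustrates the general idea of programmatically linking like-concepts by collapsing headings that normalize to the same key. The source vocabularies, the normalization rule, and the data layout are all invented for the example:

      from collections import defaultdict

      # Hypothetical source vocabularies: name -> list of subject headings.
      vocabularies = {
          "law":       ["Copyright", "Intellectual property -- Law"],
          "education": ["Copyright", "Distance education"],
          "science":   ["Intellectual Property Law", "Genetics"],
      }

      def normalize(heading):
          # Crude like-concept key: lower-case, drop punctuation and stop words.
          words = [w.strip(".,-") for w in heading.lower().replace("--", " ").split()]
          return " ".join(w for w in words if w and w not in {"the", "of", "and"})

      # Collapse headings that share a normalized key into one concept,
      # keeping a reference to the vocabulary each variant came from.
      merged = defaultdict(list)
      for vocab, headings in vocabularies.items():
          for heading in headings:
              merged[normalize(heading)].append((vocab, heading))

      for key, variants in sorted(merged.items()):
          if len(variants) > 1:
              print(f"merged concept {key!r}: {variants}")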
  5. Dextre Clarke, S.G.: Thesaural relationships (2001) 0.01
    0.010763386 = product of:
      0.021526773 = sum of:
        0.021526773 = product of:
          0.043053545 = sum of:
            0.043053545 = weight(_text_:22 in 1149) [ClassicSimilarity], result of:
              0.043053545 = score(doc=1149,freq=2.0), product of:
                0.15896842 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045395818 = queryNorm
                0.2708308 = fieldWeight in 1149, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1149)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    22. 9.2007 15:45:57
  6. Nielsen, M.L.: Thesaurus construction : key issues and selected readings (2004) 0.01
    0.010763386 = product of:
      0.021526773 = sum of:
        0.021526773 = product of:
          0.043053545 = sum of:
            0.043053545 = weight(_text_:22 in 5006) [ClassicSimilarity], result of:
              0.043053545 = score(doc=5006,freq=2.0), product of:
                0.15896842 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045395818 = queryNorm
                0.2708308 = fieldWeight in 5006, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5006)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    18. 5.2006 20:06:22
  7. Schneider, J.W.; Borlund, P.: ¬A bibliometric-based semiautomatic approach to identification of candidate thesaurus terms : parsing and filtering of noun phrases from citation contexts (2005) 0.01
    0.010763386 = product of:
      0.021526773 = sum of:
        0.021526773 = product of:
          0.043053545 = sum of:
            0.043053545 = weight(_text_:22 in 156) [ClassicSimilarity], result of:
              0.043053545 = score(doc=156,freq=2.0), product of:
                0.15896842 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045395818 = queryNorm
                0.2708308 = fieldWeight in 156, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=156)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    8. 3.2007 19:55:22
  8. Dextre Clarke, S.G.: Evolution towards ISO 25964 : an international standard with guidelines for thesauri and other types of controlled vocabulary (2007) 0.01
    0.010763386 = product of:
      0.021526773 = sum of:
        0.021526773 = product of:
          0.043053545 = sum of:
            0.043053545 = weight(_text_:22 in 749) [ClassicSimilarity], result of:
              0.043053545 = score(doc=749,freq=2.0), product of:
                0.15896842 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045395818 = queryNorm
                0.2708308 = fieldWeight in 749, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=749)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    8.12.2007 19:25:22
  9. Aitchison, J.; Dextre Clarke, S.G.: ¬The Thesaurus : a historical viewpoint, with a look to the future (2004) 0.01
    0.00922576 = product of:
      0.01845152 = sum of:
        0.01845152 = product of:
          0.03690304 = sum of:
            0.03690304 = weight(_text_:22 in 5005) [ClassicSimilarity], result of:
              0.03690304 = score(doc=5005,freq=2.0), product of:
                0.15896842 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045395818 = queryNorm
                0.23214069 = fieldWeight in 5005, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=5005)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    22. 9.2007 15:46:13
  10. Bagheri, M.: Development of thesauri in Iran (2006) 0.01
    0.00922576 = product of:
      0.01845152 = sum of:
        0.01845152 = product of:
          0.03690304 = sum of:
            0.03690304 = weight(_text_:22 in 260) [ClassicSimilarity], result of:
              0.03690304 = score(doc=260,freq=2.0), product of:
                0.15896842 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045395818 = queryNorm
                0.23214069 = fieldWeight in 260, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=260)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Indexer. 25(2006) no.1, S.19-22
  11. Dextre Clarke, S.G.; Gilchrist, A.; Will, L.: Revision and extension of thesaurus standards (2004) 0.01
    0.0076015037 = product of:
      0.0152030075 = sum of:
        0.0152030075 = product of:
          0.030406015 = sum of:
            0.030406015 = weight(_text_:bibliographic in 2615) [ClassicSimilarity], result of:
              0.030406015 = score(doc=2615,freq=2.0), product of:
                0.17672792 = queryWeight, product of:
                  3.893044 = idf(docFreq=2449, maxDocs=44218)
                  0.045395818 = queryNorm
                0.17204987 = fieldWeight in 2615, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.893044 = idf(docFreq=2449, maxDocs=44218)
                  0.03125 = fieldNorm(doc=2615)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    The current standards for monolingual and multilingual thesauri are long overdue for an update. This applies to the international standards ISO 2788 and ISO 5964, as well as the corresponding national standards in several countries and the American standard ANSI/NISO Z39.19. Work is now under way in the UK and in the USA to revise and extend the standards, with particular emphasis on interoperability needs in our world of vast electronic networks. Work in the UK is starting with the British Standards, in the hope of leading on to one international standard to serve all. Some of the issues still under discussion include the treatment of facet analysis, coverage of additional types of controlled vocabulary such as classification schemes, taxonomies and ontologies, and mapping from one vocabulary to another. 1. Are thesaurus standards still needed? Since the 1960s, even before the renowned Cranfield experiments (Cleverdon et al., 1966; Cleverdon, 1967), arguments have raged over the usefulness or otherwise of controlled vocabularies. The case has never been proved definitively one way or the other. At the same time, a recognition has become widespread that no one search method can answer all retrieval requirements. In today's environment of very large networks of resources, the skilled information professional uses a range of techniques. Among these, controlled vocabularies are valued alongside others. The first international standard for monolingual thesauri was issued in 1974. In those days, the main application was for postcoordinate indexing and retrieval from document collections or bibliographic databases. For many information professionals, the only practicable alternative to a thesaurus was a classification scheme. And so the thesaurus developed a strong following. After computer systems with full text search capability became widely available, however, the arguments against controlled vocabularies gained more followers. The cost of building and maintaining a thesaurus or a classification scheme was a strong disincentive. Today's databases are typically immense compared with those of three decades ago. Full text searching is taken for granted, not just in discrete databases but across all the resources in an intranet or even the Internet. But intranets have brought particular frustration as users discover that despite all the computer power, they cannot find items which they know to be present on the network. So the trend against controlled vocabularies is now being reversed, as many information professionals are turning to them for help. Standards to guide them are still in demand.
  12. Burkart, M.: Thesaurus (2004) 0.01
    0.0061505064 = product of:
      0.012301013 = sum of:
        0.012301013 = product of:
          0.024602026 = sum of:
            0.024602026 = weight(_text_:22 in 2913) [ClassicSimilarity], result of:
              0.024602026 = score(doc=2913,freq=2.0), product of:
                0.15896842 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045395818 = queryNorm
                0.15476047 = fieldWeight in 2913, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03125 = fieldNorm(doc=2913)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    5. 4.2013 10:18:22