Search (262 results, page 1 of 14)

  • year_i:[2000 TO 2010}
  • type_ss:"el"
  1. Functional Requirements for Subject Authority Data (FRSAD) : a conceptual model (2009) 0.05
    0.054242246 = product of:
      0.15187828 = sum of:
        0.06574103 = weight(_text_:subject in 3573) [ClassicSimilarity], result of:
          0.06574103 = score(doc=3573,freq=30.0), product of:
            0.10738805 = queryWeight, product of:
              3.576596 = idf(docFreq=3361, maxDocs=44218)
              0.03002521 = queryNorm
            0.612182 = fieldWeight in 3573, product of:
              5.477226 = tf(freq=30.0), with freq of:
                30.0 = termFreq=30.0
              3.576596 = idf(docFreq=3361, maxDocs=44218)
              0.03125 = fieldNorm(doc=3573)
        0.013458292 = weight(_text_:classification in 3573) [ClassicSimilarity], result of:
          0.013458292 = score(doc=3573,freq=2.0), product of:
            0.09562149 = queryWeight, product of:
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03002521 = queryNorm
            0.14074548 = fieldWeight in 3573, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03125 = fieldNorm(doc=3573)
        0.01899904 = product of:
          0.03799808 = sum of:
            0.03799808 = weight(_text_:schemes in 3573) [ClassicSimilarity], result of:
              0.03799808 = score(doc=3573,freq=2.0), product of:
                0.16067243 = queryWeight, product of:
                  5.3512506 = idf(docFreq=569, maxDocs=44218)
                  0.03002521 = queryNorm
                0.2364941 = fieldWeight in 3573, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.3512506 = idf(docFreq=569, maxDocs=44218)
                  0.03125 = fieldNorm(doc=3573)
          0.5 = coord(1/2)
        0.04022163 = weight(_text_:bibliographic in 3573) [ClassicSimilarity], result of:
          0.04022163 = score(doc=3573,freq=8.0), product of:
            0.11688946 = queryWeight, product of:
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.03002521 = queryNorm
            0.34409973 = fieldWeight in 3573, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.03125 = fieldNorm(doc=3573)
        0.013458292 = weight(_text_:classification in 3573) [ClassicSimilarity], result of:
          0.013458292 = score(doc=3573,freq=2.0), product of:
            0.09562149 = queryWeight, product of:
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03002521 = queryNorm
            0.14074548 = fieldWeight in 3573, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03125 = fieldNorm(doc=3573)
      0.35714287 = coord(5/14)
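     The explain tree above is Lucene ClassicSimilarity (TF-IDF) debug output. As a sanity check, the numbers shown can be reproduced directly; this is a sketch of the scoring arithmetic using values copied from the output, not an invocation of Lucene itself:

```python
import math

# Values copied verbatim from the explain output for _text_:subject in doc 3573.
freq = 30.0
idf = 3.576596           # idf(docFreq=3361, maxDocs=44218)
query_norm = 0.03002521  # queryNorm
field_norm = 0.03125     # fieldNorm(doc=3573)

tf = math.sqrt(freq)                  # ClassicSimilarity: tf = sqrt(termFreq)
query_weight = idf * query_norm       # 0.10738805 in the tree above
field_weight = tf * idf * field_norm  # 0.612182 in the tree above
term_score = query_weight * field_weight

# The document score is the sum of the matching term scores, scaled by
# coord(5/14): five of the fourteen query clauses matched this document.
term_sum = 0.15187828
doc_score = term_sum * (5 / 14)

assert abs(term_score - 0.06574103) < 1e-6
assert abs(doc_score - 0.054242246) < 1e-6
```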
    
    Abstract
     Subject access to information has been the predominant approach of users to satisfy their information needs. Research demonstrates that the integration of controlled vocabulary information with an information retrieval system helps users perform more effective subject searches. This integration becomes possible when subject authority data (information about subjects from authority files) are linked to bibliographic files and are made available to users. The purpose of authority control is to ensure consistency in representing a value (a name of a person, a place name, or a subject term) in the elements used as access points in information retrieval. For example, "World War, 1939-1945" has been established as an authorized subject heading in the Library of Congress Subject Headings (LCSH). When using LCSH, in cataloging or indexing, all publications about World War II are assigned the established heading regardless of whether a publication refers to the war as the "European War, 1939-1945", "Second World War", "World War 2", "World War II", "WWII", "World War Two", or "2nd World War." The synonymous expressions are referred to by the authorized heading. This ensures that all publications about World War II can be retrieved by and displayed under the same subject heading, either in an individual institution's own catalog or database or in a union catalog that contains bibliographic records from a number of individual libraries or databases. In almost all large bibliographic databases, authority control is achieved manually or semi-automatically by means of an authority file. The file contains records of headings or access points - names, titles, or subjects - that have been authorized for use in bibliographic records. In addition to ensuring consistency in subject representation, a subject authority record also records and maintains semantic relationships among subject terms and/or their labels.
     Records in a subject authority file are connected through semantic relationships, which may be expressed statically in subject authority records or generated dynamically according to the specific needs (e.g., presenting the broader and narrower terms) of printed or online display of thesauri, subject headings lists, classification schemes, and other knowledge organization systems.
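     The authority-control mechanism the abstract describes, many variant expressions resolving to one authorized heading, can be sketched as a simple lookup. The variants below are the ones listed in the abstract; the dictionary structure is illustrative, not an actual authority-file format:

```python
# Illustrative sketch: variant headings resolve to one authorized LCSH heading.
AUTHORIZED = "World War, 1939-1945"
VARIANTS = {
    "European War, 1939-1945", "Second World War", "World War 2",
    "World War II", "WWII", "World War Two", "2nd World War",
}

def authorized_heading(term: str) -> str:
    """Return the authorized form for a known variant, else the term unchanged."""
    if term == AUTHORIZED or term in VARIANTS:
        return AUTHORIZED
    return term

# All publications about the war collocate under the same heading:
assert authorized_heading("WWII") == "World War, 1939-1945"
assert authorized_heading("Second World War") == authorized_heading("World War 2")
```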
    Editor
    IFLA Working Group on Functional Requirements for Subject Authority Records (FRSAR)
  2. Bourdon, F.; Landry, P.: Best practices for subject access to national bibliographies : interim report by the Working Group on Guidelines for Subject Access by National Bibliographic Agencies (2007) 0.05
    0.04665659 = product of:
      0.16329806 = sum of:
        0.066422306 = weight(_text_:subject in 698) [ClassicSimilarity], result of:
          0.066422306 = score(doc=698,freq=10.0), product of:
            0.10738805 = queryWeight, product of:
              3.576596 = idf(docFreq=3361, maxDocs=44218)
              0.03002521 = queryNorm
            0.61852604 = fieldWeight in 698, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              3.576596 = idf(docFreq=3361, maxDocs=44218)
              0.0546875 = fieldNorm(doc=698)
        0.023552012 = weight(_text_:classification in 698) [ClassicSimilarity], result of:
          0.023552012 = score(doc=698,freq=2.0), product of:
            0.09562149 = queryWeight, product of:
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03002521 = queryNorm
            0.24630459 = fieldWeight in 698, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.0546875 = fieldNorm(doc=698)
        0.04977173 = weight(_text_:bibliographic in 698) [ClassicSimilarity], result of:
          0.04977173 = score(doc=698,freq=4.0), product of:
            0.11688946 = queryWeight, product of:
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.03002521 = queryNorm
            0.4258017 = fieldWeight in 698, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.0546875 = fieldNorm(doc=698)
        0.023552012 = weight(_text_:classification in 698) [ClassicSimilarity], result of:
          0.023552012 = score(doc=698,freq=2.0), product of:
            0.09562149 = queryWeight, product of:
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03002521 = queryNorm
            0.24630459 = fieldWeight in 698, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.0546875 = fieldNorm(doc=698)
      0.2857143 = coord(4/14)
    
    Abstract
    The working group to establish guidelines for subject access by national bibliographic agencies was set up in 2005 in order to analyse the question of subject access and propose key elements for an indexing policy for national bibliographies. The group's mandate is to put forward recommendations based on best practices for subject access to national bibliographies. The group is presently assessing the elements which should be included in an indexing policy and will present an initial version of its recommendations in 2008.
    Content
     Lecture presented at: World Library and Information Congress: 73rd IFLA General Conference and Council, 19-23 August 2007, Durban, South Africa. - 89 - Bibliography with National Libraries and Classification and Indexing
  3. Yi, K.: Challenges in automated classification using library classification schemes (2006) 0.04
    0.039771687 = product of:
      0.1856012 = sum of:
        0.0659319 = weight(_text_:classification in 5810) [ClassicSimilarity], result of:
          0.0659319 = score(doc=5810,freq=12.0), product of:
            0.09562149 = queryWeight, product of:
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03002521 = queryNorm
            0.6895092 = fieldWeight in 5810, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.0625 = fieldNorm(doc=5810)
        0.053737402 = product of:
          0.107474804 = sum of:
            0.107474804 = weight(_text_:schemes in 5810) [ClassicSimilarity], result of:
              0.107474804 = score(doc=5810,freq=4.0), product of:
                0.16067243 = queryWeight, product of:
                  5.3512506 = idf(docFreq=569, maxDocs=44218)
                  0.03002521 = queryNorm
                0.66890633 = fieldWeight in 5810, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  5.3512506 = idf(docFreq=569, maxDocs=44218)
                  0.0625 = fieldNorm(doc=5810)
          0.5 = coord(1/2)
        0.0659319 = weight(_text_:classification in 5810) [ClassicSimilarity], result of:
          0.0659319 = score(doc=5810,freq=12.0), product of:
            0.09562149 = queryWeight, product of:
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03002521 = queryNorm
            0.6895092 = fieldWeight in 5810, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.0625 = fieldNorm(doc=5810)
      0.21428572 = coord(3/14)
    
    Abstract
     Major library classification schemes have long been the standard framework for organizing information sources in the traditional library environment, while text classification (TC) has become a popular and attractive tool for organizing digital information. This paper gives an overview of previous projects and studies on TC using major library classification schemes, and summarizes a discussion of TC research challenges.
  4. Koch, T.; Ardö, A.: Automatic classification of full-text HTML-documents from one specific subject area : DESIRE II D3.6a, Working Paper 2 (2000) 0.04
    0.038544483 = product of:
      0.17987426 = sum of:
        0.048010457 = weight(_text_:subject in 1667) [ClassicSimilarity], result of:
          0.048010457 = score(doc=1667,freq=4.0), product of:
            0.10738805 = queryWeight, product of:
              3.576596 = idf(docFreq=3361, maxDocs=44218)
              0.03002521 = queryNorm
            0.4470745 = fieldWeight in 1667, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.576596 = idf(docFreq=3361, maxDocs=44218)
              0.0625 = fieldNorm(doc=1667)
        0.0659319 = weight(_text_:classification in 1667) [ClassicSimilarity], result of:
          0.0659319 = score(doc=1667,freq=12.0), product of:
            0.09562149 = queryWeight, product of:
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03002521 = queryNorm
            0.6895092 = fieldWeight in 1667, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.0625 = fieldNorm(doc=1667)
        0.0659319 = weight(_text_:classification in 1667) [ClassicSimilarity], result of:
          0.0659319 = score(doc=1667,freq=12.0), product of:
            0.09562149 = queryWeight, product of:
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03002521 = queryNorm
            0.6895092 = fieldWeight in 1667, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.0625 = fieldNorm(doc=1667)
      0.21428572 = coord(3/14)
    
    Content
    1 Introduction / 2 Method overview / 3 Ei thesaurus preprocessing / 4 Automatic classification process: 4.1 Matching -- 4.2 Weighting -- 4.3 Preparation for display / 5 Results of the classification process / 6 Evaluations / 7 Software / 8 Other applications / 9 Experiments with universal classification systems / References / Appendix A: Ei classification service: Software / Appendix B: Use of the classification software as subject filter in a WWW harvester.
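     The matching and weighting steps in the outline above can be sketched roughly as term lookup against a controlled vocabulary followed by score accumulation per class. The mini-thesaurus entries, class codes, and weights below are invented for illustration; the actual DESIRE II system used the Ei thesaurus and classification:

```python
import re
from collections import Counter

# Hypothetical mini-thesaurus: term -> (class code, weight). Invented values.
THESAURUS = {
    "heat transfer": ("641.2", 2.0),
    "turbine": ("617.4", 1.5),
    "combustion": ("521.1", 1.0),
}

def classify(text: str) -> list[tuple[str, float]]:
    """Match thesaurus terms in the text and accumulate weights per class."""
    scores: Counter[str] = Counter()
    lowered = text.lower()
    for term, (cls, weight) in THESAURUS.items():
        hits = len(re.findall(re.escape(term), lowered))
        scores[cls] += hits * weight
    return [(cls, score) for cls, score in scores.most_common() if score > 0]

ranked = classify("Turbine combustion affects heat transfer; turbine blades ...")
```

Ranked classes can then be thresholded for display or, as Appendix B suggests, used as a subject filter in a harvester.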
  5. Robbio, A. de; Maguolo, D.; Marini, A.: Scientific and general subject classifications in the digital world (2001) 0.04
    0.03805627 = product of:
      0.13319694 = sum of:
        0.05092278 = weight(_text_:subject in 2) [ClassicSimilarity], result of:
          0.05092278 = score(doc=2,freq=18.0), product of:
            0.10738805 = queryWeight, product of:
              3.576596 = idf(docFreq=3361, maxDocs=44218)
              0.03002521 = queryNorm
            0.4741941 = fieldWeight in 2, product of:
              4.2426405 = tf(freq=18.0), with freq of:
                18.0 = termFreq=18.0
              3.576596 = idf(docFreq=3361, maxDocs=44218)
              0.03125 = fieldNorm(doc=2)
        0.026916584 = weight(_text_:classification in 2) [ClassicSimilarity], result of:
          0.026916584 = score(doc=2,freq=8.0), product of:
            0.09562149 = queryWeight, product of:
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03002521 = queryNorm
            0.28149095 = fieldWeight in 2, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03125 = fieldNorm(doc=2)
        0.028440988 = weight(_text_:bibliographic in 2) [ClassicSimilarity], result of:
          0.028440988 = score(doc=2,freq=4.0), product of:
            0.11688946 = queryWeight, product of:
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.03002521 = queryNorm
            0.24331525 = fieldWeight in 2, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.03125 = fieldNorm(doc=2)
        0.026916584 = weight(_text_:classification in 2) [ClassicSimilarity], result of:
          0.026916584 = score(doc=2,freq=8.0), product of:
            0.09562149 = queryWeight, product of:
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03002521 = queryNorm
            0.28149095 = fieldWeight in 2, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03125 = fieldNorm(doc=2)
      0.2857143 = coord(4/14)
    
    Abstract
    In the present work we discuss opportunities, problems, tools and techniques encountered when interconnecting discipline-specific subject classifications, primarily organized as search devices in bibliographic databases, with general classifications originally devised for book shelving in public libraries. We first state the fundamental distinction between topical (or subject) classifications and object classifications. Then we trace the structural limitations that have constrained subject classifications since their library origins, and the devices that were used to overcome the gap with genuine knowledge representation. After recalling some general notions on structure, dynamics and interferences of subject classifications and of the objects they refer to, we sketch a synthetic overview on discipline-specific classifications in Mathematics, Computing and Physics, on one hand, and on general classifications on the other. In this setting we present The Scientific Classifications Page, which collects groups of Web pages produced by a pool of software tools for developing hypertextual presentations of single or paired subject classifications from sequential source files, as well as facilities for gathering information from KWIC lists of classification descriptions. Further we propose a concept-oriented methodology for interconnecting subject classifications, with the concrete support of a relational analysis of the whole Mathematics Subject Classification through its evolution since 1959. Finally, we recall a very basic method for interconnection provided by coreference in bibliographic records among index elements from different systems, and point out the advantages of establishing the conditions of a more widespread application of such a method. 
     A part of these contents was presented under the title Mathematics Subject Classification and related Classifications in the Digital World at the Eighth International Conference Crimea 2001, "Libraries and Associations in the Transient World: New Technologies and New Forms of Cooperation", Sudak, Ukraine, June 9-17, 2001, in a special session on electronic libraries, electronic publishing and electronic information in science chaired by Bernd Wegner, Editor-in-Chief of Zentralblatt MATH.
    Object
    INSPEC Classification
  6. Automatic classification research at OCLC (2002) 0.04
    0.03687797 = product of:
      0.12907289 = sum of:
        0.04079328 = weight(_text_:classification in 1563) [ClassicSimilarity], result of:
          0.04079328 = score(doc=1563,freq=6.0), product of:
            0.09562149 = queryWeight, product of:
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03002521 = queryNorm
            0.42661208 = fieldWeight in 1563, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1563)
        0.03324832 = product of:
          0.06649664 = sum of:
            0.06649664 = weight(_text_:schemes in 1563) [ClassicSimilarity], result of:
              0.06649664 = score(doc=1563,freq=2.0), product of:
                0.16067243 = queryWeight, product of:
                  5.3512506 = idf(docFreq=569, maxDocs=44218)
                  0.03002521 = queryNorm
                0.41386467 = fieldWeight in 1563, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.3512506 = idf(docFreq=569, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1563)
          0.5 = coord(1/2)
        0.04079328 = weight(_text_:classification in 1563) [ClassicSimilarity], result of:
          0.04079328 = score(doc=1563,freq=6.0), product of:
            0.09562149 = queryWeight, product of:
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03002521 = queryNorm
            0.42661208 = fieldWeight in 1563, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1563)
        0.014238005 = product of:
          0.02847601 = sum of:
            0.02847601 = weight(_text_:22 in 1563) [ClassicSimilarity], result of:
              0.02847601 = score(doc=1563,freq=2.0), product of:
                0.10514317 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03002521 = queryNorm
                0.2708308 = fieldWeight in 1563, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1563)
          0.5 = coord(1/2)
      0.2857143 = coord(4/14)
    
    Abstract
     OCLC enlists the cooperation of the world's libraries to make the written record of humankind's cultural heritage more accessible through electronic media. Part of this goal can be accomplished through the application of the principles of knowledge organization. We believe that cultural artifacts are effectively lost unless they are indexed, cataloged and classified. Accordingly, OCLC has developed products, sponsored research projects, and encouraged participation in international standards communities whose outcomes have been improved library classification schemes, cataloging productivity tools, and new proposals for the creation and maintenance of metadata. Though cataloging and classification require expert intellectual effort, we recognize that at least some of the work must be automated if we hope to keep pace with cultural change.
    Date
    5. 5.2003 9:22:09
  7. Louie, A.J.; Maddox, E.L.; Washington, W.: Using faceted classification to provide structure for information architecture (2003) 0.04
    0.03587399 = product of:
      0.12555896 = sum of:
        0.02546139 = weight(_text_:subject in 2471) [ClassicSimilarity], result of:
          0.02546139 = score(doc=2471,freq=2.0), product of:
            0.10738805 = queryWeight, product of:
              3.576596 = idf(docFreq=3361, maxDocs=44218)
              0.03002521 = queryNorm
            0.23709705 = fieldWeight in 2471, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.576596 = idf(docFreq=3361, maxDocs=44218)
              0.046875 = fieldNorm(doc=2471)
        0.03496567 = weight(_text_:classification in 2471) [ClassicSimilarity], result of:
          0.03496567 = score(doc=2471,freq=6.0), product of:
            0.09562149 = queryWeight, product of:
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03002521 = queryNorm
            0.3656675 = fieldWeight in 2471, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.046875 = fieldNorm(doc=2471)
        0.030166224 = weight(_text_:bibliographic in 2471) [ClassicSimilarity], result of:
          0.030166224 = score(doc=2471,freq=2.0), product of:
            0.11688946 = queryWeight, product of:
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.03002521 = queryNorm
            0.2580748 = fieldWeight in 2471, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.046875 = fieldNorm(doc=2471)
        0.03496567 = weight(_text_:classification in 2471) [ClassicSimilarity], result of:
          0.03496567 = score(doc=2471,freq=6.0), product of:
            0.09562149 = queryWeight, product of:
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03002521 = queryNorm
            0.3656675 = fieldWeight in 2471, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.046875 = fieldNorm(doc=2471)
      0.2857143 = coord(4/14)
    
    Abstract
    This is a short, but very thorough and very interesting, report on how the writers built a faceted classification for some legal information and used it to structure a web site with navigation and searching. There is a good summary of why facets work well and how they fit into bibliographic control in general. The last section is about their implementation of a web site for the Washington State Bar Association's Council for Legal Public Education. Their classification uses three facets: Purpose (the general aim of the document, e.g. Resources for K-12 Teachers), Topic (the subject of the document), and Type (the legal format of the document). See Example Web Sites, below, for a discussion of the site and a problem with its design.
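     The three facets named in the report (Purpose, Topic, Type) act as independent attributes that can be filtered in combination, which is what gives faceted navigation its power. A minimal sketch, with hypothetical sample documents (only the "Resources for K-12 Teachers" purpose value comes from the report):

```python
# Hypothetical documents tagged with the report's three facets.
DOCS = [
    {"title": "Mock Trial Kit", "purpose": "Resources for K-12 Teachers",
     "topic": "Courts", "type": "Lesson plan"},
    {"title": "Tenant Rights FAQ", "purpose": "Public Information",
     "topic": "Housing", "type": "FAQ"},
    {"title": "Juror Handbook", "purpose": "Public Information",
     "topic": "Courts", "type": "Handbook"},
]

def filter_by_facets(docs, **facets):
    """Keep only documents matching every requested facet value."""
    return [d for d in docs
            if all(d.get(key) == value for key, value in facets.items())]

# Facets combine independently: same Topic, any Purpose or Type.
courts_docs = filter_by_facets(DOCS, topic="Courts")
assert len(courts_docs) == 2
```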
  8. Goldberg, J.: Classification of religion in LCC (2000) 0.04
    0.03527055 = product of:
      0.1645959 = sum of:
        0.047104023 = weight(_text_:classification in 5402) [ClassicSimilarity], result of:
          0.047104023 = score(doc=5402,freq=2.0), product of:
            0.09562149 = queryWeight, product of:
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03002521 = queryNorm
            0.49260917 = fieldWeight in 5402, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.109375 = fieldNorm(doc=5402)
        0.070387855 = weight(_text_:bibliographic in 5402) [ClassicSimilarity], result of:
          0.070387855 = score(doc=5402,freq=2.0), product of:
            0.11688946 = queryWeight, product of:
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.03002521 = queryNorm
            0.6021745 = fieldWeight in 5402, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.109375 = fieldNorm(doc=5402)
        0.047104023 = weight(_text_:classification in 5402) [ClassicSimilarity], result of:
          0.047104023 = score(doc=5402,freq=2.0), product of:
            0.09562149 = queryWeight, product of:
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03002521 = queryNorm
            0.49260917 = fieldWeight in 5402, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.109375 = fieldNorm(doc=5402)
      0.21428572 = coord(3/14)
    
    Footnote
     Lecture, IFLA General Conference, Division IV Bibliographic Control, Jerusalem, 2000
  9. SKOS Simple Knowledge Organization System Primer (2009) 0.04
    0.035095256 = product of:
      0.122833386 = sum of:
        0.02546139 = weight(_text_:subject in 4795) [ClassicSimilarity], result of:
          0.02546139 = score(doc=4795,freq=2.0), product of:
            0.10738805 = queryWeight, product of:
              3.576596 = idf(docFreq=3361, maxDocs=44218)
              0.03002521 = queryNorm
            0.23709705 = fieldWeight in 4795, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.576596 = idf(docFreq=3361, maxDocs=44218)
              0.046875 = fieldNorm(doc=4795)
        0.02018744 = weight(_text_:classification in 4795) [ClassicSimilarity], result of:
          0.02018744 = score(doc=4795,freq=2.0), product of:
            0.09562149 = queryWeight, product of:
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03002521 = queryNorm
            0.21111822 = fieldWeight in 4795, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.046875 = fieldNorm(doc=4795)
        0.05699712 = product of:
          0.11399424 = sum of:
            0.11399424 = weight(_text_:schemes in 4795) [ClassicSimilarity], result of:
              0.11399424 = score(doc=4795,freq=8.0), product of:
                0.16067243 = queryWeight, product of:
                  5.3512506 = idf(docFreq=569, maxDocs=44218)
                  0.03002521 = queryNorm
                0.7094823 = fieldWeight in 4795, product of:
                  2.828427 = tf(freq=8.0), with freq of:
                    8.0 = termFreq=8.0
                  5.3512506 = idf(docFreq=569, maxDocs=44218)
                  0.046875 = fieldNorm(doc=4795)
          0.5 = coord(1/2)
        0.02018744 = weight(_text_:classification in 4795) [ClassicSimilarity], result of:
          0.02018744 = score(doc=4795,freq=2.0), product of:
            0.09562149 = queryWeight, product of:
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.03002521 = queryNorm
            0.21111822 = fieldWeight in 4795, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.1847067 = idf(docFreq=4974, maxDocs=44218)
              0.046875 = fieldNorm(doc=4795)
      0.2857143 = coord(4/14)
    
    Abstract
     SKOS (Simple Knowledge Organization System) provides a model for expressing the basic structure and content of concept schemes such as thesauri, classification schemes, subject heading lists, taxonomies, folksonomies, and other types of controlled vocabulary. As an application of the Resource Description Framework (RDF), SKOS allows concepts to be documented, linked and merged with other data, while still being composed, integrated and published on the World Wide Web. This document is an implementors' guide for those who would like to represent their concept scheme using SKOS. In basic SKOS, conceptual resources (concepts) can be identified using URIs, labelled with strings in one or more natural languages, documented with various types of notes, semantically related to each other in informal hierarchies and association networks, and aggregated into distinct concept schemes. In advanced SKOS, conceptual resources can be mapped to conceptual resources in other schemes and grouped into labelled or ordered collections. Concept labels can also be related to each other. Finally, the SKOS vocabulary itself can be extended to suit the needs of particular communities of practice.
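     The basic SKOS model the primer describes, concepts identified by URIs, labelled in one or more languages, and linked by broader/narrower relations within a scheme, can be sketched without any RDF library. This is a library-free illustration with invented example URIs; in practice a SKOS scheme would be serialized as an RDF graph:

```python
from dataclasses import dataclass, field

@dataclass
class Concept:
    """A toy stand-in for a skos:Concept: URI, language-tagged labels, broader links."""
    uri: str
    pref_label: dict[str, str]                      # language tag -> label
    broader: list["Concept"] = field(default_factory=list)

    def narrower_of(self, other: "Concept") -> bool:
        return other in self.broader

# Toy concept scheme (URIs invented for illustration).
animals = Concept("ex:animals", {"en": "animals", "fr": "animaux"})
mammals = Concept("ex:mammals", {"en": "mammals"}, broader=[animals])
scheme = {"uri": "ex:animalScheme", "concepts": [animals, mammals]}

assert mammals.narrower_of(animals)      # informal hierarchy
assert animals.pref_label["fr"] == "animaux"  # multilingual labelling
```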
  10. SKOS Core Guide (2005) 0.03
    
    Abstract
    SKOS Core provides a model for expressing the basic structure and content of concept schemes such as thesauri, classification schemes, subject heading lists, taxonomies, 'folksonomies', other types of controlled vocabulary, and also concept schemes embedded in glossaries and terminologies. The SKOS Core Vocabulary is an application of the Resource Description Framework (RDF) that can be used to express a concept scheme as an RDF graph. Using RDF allows data to be linked to and/or merged with other data, enabling data sources to be distributed across the web while still being meaningfully composed and integrated. This document is a guide to using the SKOS Core Vocabulary, for readers who already have a basic understanding of RDF concepts. This edition of the SKOS Core Guide [SKOS Core Guide] is a W3C Public Working Draft. It is the authoritative guide to recommended usage of the SKOS Core Vocabulary at the time of publication.
  11. Godby, C. J.; Stuler, J.: ¬The Library of Congress Classification as a knowledge base for automatic subject categorization (2001) 0.03
    
    Abstract
    This paper describes a set of experiments in adapting a subset of the Library of Congress Classification for use as a database for automatic classification. A high degree of concept integrity was obtained when subject headings were mapped from OCLC's WorldCat database and filtered using the log-likelihood statistic.
    Footnote
    Paper, IFLA Preconference "Subject Retrieval in a Networked Environment", Dublin, OH, August 2001.
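The log-likelihood filtering mentioned in the abstract can be sketched with Dunning's G² statistic over a 2×2 heading/class contingency table. This standard formulation is an assumption for illustration; the paper's exact computation is not given here.

```python
import math

def log_likelihood(k11, k12, k21, k22):
    """Dunning's log-likelihood ratio G2 for a 2x2 contingency table:
    k11 = occurrences of the heading within the class,
    k12 = the heading outside the class,
    k21 = other headings within the class,
    k22 = other headings outside the class."""
    def entropy(*counts):
        # sum of c * ln(c / total), skipping zero cells
        total = sum(counts)
        return sum(c * math.log(c / total) for c in counts if c > 0)
    return 2 * (entropy(k11, k12, k21, k22)
                - entropy(k11 + k12, k21 + k22)
                - entropy(k11 + k21, k12 + k22)
                + entropy(k11 + k12 + k21 + k22))

# A heading strongly concentrated in one class scores far higher
# than one spread incidentally across the collection.
strong = log_likelihood(110, 2, 20, 10000)
weak = log_likelihood(10, 9, 1000, 10010)
```

Filtering then amounts to keeping only heading-class pairs whose G² exceeds a chosen threshold.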
  12. Shah, L.; Kumar, S.: Uniform form divisions (common isolates) for digital environment : a proposal (2006) 0.03
    
    Abstract
    The study proposes the unification of three major schemes, DDC, UDC and CC, and develops a uniform table of form divisions (common isolates) that can be used by any of the schemes of library classification, or by a uniform classification scheme devised for the digital environment. The paper suggests a new postulation for the arrangement of geographical divisions, and recommends further research to prepare a uniform classification code that can be applied in the digital environment. Concludes that IFLA may undertake a uniform classification code project and its application in the electronic environment.
  13. Broughton, V.: ¬A new classification for the literature for religion (2000) 0.03
    
    Footnote
    Paper, IFLA General Conference, Division IV Bibliographic Control, Jerusalem, 2000
  14. Elazar, D.H.: ¬The making of a classification scheme for libraries of Judaica (2000) 0.03
    
    Footnote
    Paper, IFLA General Conference, Division IV Bibliographic Control, Jerusalem, 2000
  15. McIlwaine, I.C.: Section on Classification and Indexing : review of activities 1999-2000 (2000) 0.03
    
    Footnote
    Paper, IFLA General Conference, Division IV Bibliographic Control, Jerusalem, 2000
  16. Si, L.E.; O'Brien, A.; Probets, S.: Integration of distributed terminology resources to facilitate subject cross-browsing for library portal systems (2009) 0.03
    
    Abstract
    Purpose: To develop a prototype middleware framework between different terminology resources in order to provide a subject cross-browsing service for library portal systems.
    Design/methodology/approach: Nine terminology experts were interviewed to collect appropriate knowledge to support the development of a theoretical framework for the research. Based on this, a simplified software-based prototype system was constructed incorporating the knowledge acquired. The prototype involved mappings between the computer science schedule of the Dewey Decimal Classification (which acted as a spine) and two controlled vocabularies, UKAT and the ACM Computing Classification. Subsequently, six further experts in the field were invited to evaluate the prototype system and provide feedback to improve the framework.
    Findings: The major findings showed that, given the large variety of terminology resources distributed on the web, the proposed middleware service is essential to integrate the different terminology resources technically and semantically in order to facilitate subject cross-browsing. A set of recommendations is also made outlining the important approaches and features that support such a cross-browsing middleware service.
    Content
    This paper is a pre-print version presented at the ISKO UK 2009 conference, 22-23 June, prior to peer review and editing. For published proceedings see special issue of Aslib Proceedings journal.
    Object
    ACM Computing Classification
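The spine-based mapping described in the abstract above can be sketched as follows: each vocabulary is mapped to a DDC class, and cross-vocabulary links are then inferred via the shared spine. The class numbers and terms below are illustrative, not taken from the actual prototype.

```python
# Sketch of spine-based cross-browsing: two vocabularies are each mapped
# to a DDC class; cross-vocabulary links follow from the shared spine.
# Class numbers and terms are illustrative assumptions.

ukat_to_ddc = {"computer programming": "005.1", "operating systems": "005.4"}
acm_to_ddc = {"Software engineering": "005.1", "OPERATING SYSTEMS": "005.4"}

def cross_map(src, dst):
    """For each source term, list destination terms sharing its DDC class."""
    by_class = {}
    for term, ddc in dst.items():
        by_class.setdefault(ddc, []).append(term)
    return {term: by_class.get(ddc, []) for term, ddc in src.items()}

links = cross_map(ukat_to_ddc, acm_to_ddc)
# links["computer programming"] → ["Software engineering"]
```

The spine design means each vocabulary needs only one set of mappings (to DDC) rather than pairwise mappings to every other vocabulary.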
  17. Giunchiglia, F.; Zaihrayeu, I.; Farazi, F.: Converting classifications into OWL ontologies (2009) 0.03
    
    Abstract
    Classification schemes, such as the DMoZ web directory, provide a convenient and intuitive way for humans to access classified contents. While easy for humans to deal with, classification schemes remain hard for automated software agents to reason about. Among other things, this hardness is conditioned by the ambiguous nature of the natural language used to describe classification categories. In this paper we describe how classification schemes can be converted into OWL ontologies, thus enabling Semantic Web applications to reason about them. The proposed solution is based on a two-phase approach in which category names are first encoded in a concept language and then, together with the structure of the classification scheme, converted into an OWL ontology. We demonstrate the practical applicability of our approach by showing how the results of reasoning on these OWL ontologies can help improve the organization and use of web directories.
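The structural half of the conversion described above (turning a classification scheme's category tree into OWL subclass axioms) can be sketched as below; the toy category paths and URIs are illustrative, and the first phase, encoding category names in a concept language, is assumed already done.

```python
# Sketch: converting a toy classification tree (DMoZ-style category paths
# mapped to their parents) into OWL subclass axioms serialized as Turtle.
# Paths and base URI are illustrative assumptions.

tree = {
    "Top/Computers": None,
    "Top/Computers/Programming": "Top/Computers",
    "Top/Computers/Programming/Languages": "Top/Computers/Programming",
}

def to_owl(tree, base="http://example.org/dmoz#"):
    def uri(path):
        return f"<{base}{path.replace('/', '_')}>"
    lines = ["@prefix owl: <http://www.w3.org/2002/07/owl#> .",
             "@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> ."]
    for cat, parent in tree.items():
        lines.append(f"{uri(cat)} a owl:Class .")
        if parent:
            lines.append(f"{uri(cat)} rdfs:subClassOf {uri(parent)} .")
    return "\n".join(lines)

ontology = to_owl(tree)
```

Once in OWL form, a standard reasoner can, for instance, infer that every page under Languages is also under Computers.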
  18. Gnoli, C.: "Classic"vs. "freely" faceted classification (2007) 0.03
    
    Abstract
    Claudio Gnoli of the University of Pavia in Italy, and Chair of ISKO Italy, explored the relative merits of classic 'faceted classification' (FC) and 'freely faceted classification' (FFC). In classic FC, the facets (and their relationships) which might be combined to express a compound subject are restricted to those prescribed as inherent in the subject area; FC is therefore largely bounded by, and restricted to, a specific subject area. At the other extreme, free classification (as in the Web or folksonomies) allows the combination of values from multiple, disparate domains, where the relationships among the elements are often indeterminate and the semantics obscure. Claudio described how punched cards were an early example of free classification, and cited the coordination of dogs : postmen : bites as one where the absence of defined relationships made the semantics ambiguous.
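The contrast drawn above can be sketched in code: classic FC validates a compound subject against the facets prescribed for its domain, while free coordination simply strings values together with no declared relationships. The domain and facet names are illustrative assumptions.

```python
# Sketch contrasting classic faceted classification (facets constrained per
# domain) with free coordination (any values, relationships undeclared).
# The "medicine" facet set is an illustrative assumption.

CLASSIC_FACETS = {"medicine": {"organ", "process", "agent"}}

def classic_compound(domain, **facets):
    """Accept a compound subject only if every facet is prescribed for the domain."""
    allowed = CLASSIC_FACETS[domain]
    unknown = set(facets) - allowed
    if unknown:
        raise ValueError(f"facets not prescribed for {domain}: {unknown}")
    return facets

def free_compound(*values):
    """Free coordination, e.g. dogs : postmen : bites --
    order alone; the semantics are left indeterminate."""
    return tuple(values)
```

In `classic_compound` the facet name itself (organ, process, agent) carries the relationship; in `free_compound` nothing does, which is exactly the ambiguity of dogs : postmen : bites.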
  19. Quick Guide to Publishing a Classification Scheme on the Semantic Web (2008) 0.03
    
    Abstract
    This document describes in brief how to express the content and structure of a classification scheme, and metadata about a classification scheme, in RDF using the SKOS vocabulary. RDF allows data to be linked to and/or merged with other RDF data by Semantic Web applications. The Semantic Web, which is based on the Resource Description Framework (RDF), provides a common framework that allows data to be shared and reused across application, enterprise, and community boundaries. Publishing classification schemes in SKOS will unify the great many existing classification efforts in the framework of the Semantic Web.
  20. Prabowo, R.; Jackson, M.; Burden, P.; Knoell, H.-D.: Ontology-based automatic classification for the Web pages : design, implementation and evaluation (2002) 0.03
    
    Abstract
    In recent years, we have witnessed continual growth in the use of ontologies to provide a mechanism for machine reasoning. This paper describes an automatic classifier that focuses on the use of ontologies for classifying Web pages with respect to the Dewey Decimal Classification (DDC) and Library of Congress Classification (LCC) schemes. Firstly, we explain how these ontologies can be built in a modular fashion and mapped into DDC and LCC. Secondly, we propose a formal definition of a DDC-LCC mapping and an ontology-classification-scheme mapping. Thirdly, we explain how the classifier uses these ontologies to assist classification. Finally, an experiment evaluating the accuracy of the classifier is presented. The experiment shows that our approach results in improved classification accuracy; this improvement, however, comes at the cost of a low coverage ratio due to the incompleteness of the ontologies used.
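The core idea of ontology-assisted classification, and the coverage limitation noted at the end of the abstract, can be sketched with a toy matcher: a page's terms are compared against per-class ontology vocabularies, and a page matching no vocabulary stays unclassified. The vocabularies, class labels and page text are illustrative assumptions, not the paper's actual ontologies.

```python
# Sketch of ontology-assisted classification: match a page's terms against
# per-class ontology vocabularies; the class with the most matches wins.
# Vocabularies and DDC labels below are illustrative assumptions.

ontologies = {
    "DDC 004": {"computer", "hardware", "processor"},
    "DDC 005": {"software", "programming", "language", "compiler"},
}

def classify(text):
    """Return the best-matching class, or None when no ontology covers the page."""
    terms = set(text.lower().split())
    scores = {cls: len(vocab & terms) for cls, vocab in ontologies.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None  # incomplete ontologies -> low coverage

label = classify("A tutorial on the Python programming language and its compiler")
# label → "DDC 005"
```

The `None` branch is the coverage problem in miniature: accuracy on classified pages can be high while many pages fall outside the ontologies entirely.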

Languages

  • e 224
  • d 26
  • a 6
  • el 2

Types

  • a 62
  • p 23
  • i 6
  • r 6
  • n 4
  • s 2
  • x 2
  • b 1
  • m 1