Search (30 results, page 1 of 2)

  • author_ss:"Beghtol, C."
  1. Beghtol, C.: Knowledge domains : multidisciplinarity and bibliographic classification systems (1998) 0.06
    0.057054784 = product of:
      0.11410957 = sum of:
        0.007175247 = weight(_text_:information in 2028) [ClassicSimilarity], result of:
          0.007175247 = score(doc=2028,freq=2.0), product of:
            0.0616574 = queryWeight, product of:
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.035122856 = queryNorm
            0.116372846 = fieldWeight in 2028, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.046875 = fieldNorm(doc=2028)
        0.008207779 = weight(_text_:for in 2028) [ClassicSimilarity], result of:
          0.008207779 = score(doc=2028,freq=2.0), product of:
            0.06594466 = queryWeight, product of:
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.035122856 = queryNorm
            0.12446466 = fieldWeight in 2028, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.046875 = fieldNorm(doc=2028)
        0.019223245 = weight(_text_:the in 2028) [ClassicSimilarity], result of:
          0.019223245 = score(doc=2028,freq=22.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.34689236 = fieldWeight in 2028, product of:
              4.690416 = tf(freq=22.0), with freq of:
                22.0 = termFreq=22.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.046875 = fieldNorm(doc=2028)
        0.013946345 = weight(_text_:of in 2028) [ClassicSimilarity], result of:
          0.013946345 = score(doc=2028,freq=12.0), product of:
            0.054923624 = queryWeight, product of:
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.035122856 = queryNorm
            0.25392252 = fieldWeight in 2028, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.046875 = fieldNorm(doc=2028)
        0.019223245 = weight(_text_:the in 2028) [ClassicSimilarity], result of:
          0.019223245 = score(doc=2028,freq=22.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.34689236 = fieldWeight in 2028, product of:
              4.690416 = tf(freq=22.0), with freq of:
                22.0 = termFreq=22.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.046875 = fieldNorm(doc=2028)
        0.046333708 = product of:
          0.092667416 = sum of:
            0.092667416 = weight(_text_:communities in 2028) [ClassicSimilarity], result of:
              0.092667416 = score(doc=2028,freq=4.0), product of:
                0.18632571 = queryWeight, product of:
                  5.3049703 = idf(docFreq=596, maxDocs=44218)
                  0.035122856 = queryNorm
                0.49734098 = fieldWeight in 2028, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  5.3049703 = idf(docFreq=596, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2028)
          0.5 = coord(1/2)
      0.5 = coord(6/12)
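The indented tree above is standard Lucene ClassicSimilarity "explain" output. Its per-term arithmetic can be reproduced directly from the printed constants; the sketch below uses the default ClassicSimilarity formulas (tf = sqrt(freq), idf = 1 + ln(maxDocs/(docFreq+1))), with the constants copied from the "information" clause of doc 2028.

```python
import math

# Reproducing the "information" clause of doc 2028 from the explain tree.
# Formulas are the Lucene ClassicSimilarity defaults; docFreq, maxDocs,
# queryNorm, fieldNorm and freq are copied verbatim from the output above.

def tf(freq: float) -> float:
    return math.sqrt(freq)                             # 1.4142135 for freq=2

def idf(doc_freq: int, max_docs: int) -> float:
    return 1.0 + math.log(max_docs / (doc_freq + 1))   # 1.7554779

term_idf     = idf(20772, 44218)
query_norm   = 0.035122856                     # query-level normalizer, as given
query_weight = term_idf * query_norm           # 0.0616574   = queryWeight
field_weight = tf(2.0) * term_idf * 0.046875   # 0.11637285  = tf * idf * fieldNorm
clause_score = query_weight * field_weight     # 0.007175247 = weight(_text_:information)
```

The same three factors explain the relative contributions visible in the tree: the rare term "communities" (docFreq=596, idf 5.30) contributes far more per occurrence than stopword-like terms such as "the" (docFreq=24812, idf 1.58).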
    
    Abstract
    Bibliographic classification systems purport to organize the world of knowledge for information storage and retrieval purposes in libraries and bibliographies, both manual and online. The major systems that have predominated during the 20th century were originally predicated on the academic disciplines. This structural principle is no longer adequate because multidisciplinary knowledge production has overtaken more traditional disciplinary perspectives and produced communities of cooperation whose documents cannot be accommodated in a disciplinary structure. This paper addresses the problems the major classifications face, reports some attempts to revise these systems to accommodate multidisciplinary works more appropriately, and describes some theoretical research perspectives that attempt to reorient classification research toward the pluralistic needs of multidisciplinary knowledge creation and the perspectives of different discourse communities. Traditionally, the primary desiderata of classification systems were mutual exclusivity and joint exhaustivity. The need to respond to multidisciplinary research may mean that hospitality will replace mutual exclusivity and joint exhaustivity as the most needed and useful characteristics of classification systems in both theory and practice.
  2. Beghtol, C.: Response to Hjoerland and Nicolaisen (2004) 0.04
    0.04409705 = product of:
      0.0881941 = sum of:
        0.0059192767 = weight(_text_:information in 3536) [ClassicSimilarity], result of:
          0.0059192767 = score(doc=3536,freq=4.0), product of:
            0.0616574 = queryWeight, product of:
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.035122856 = queryNorm
            0.0960027 = fieldWeight in 3536, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.02734375 = fieldNorm(doc=3536)
        0.0066718496 = weight(_text_:und in 3536) [ClassicSimilarity], result of:
          0.0066718496 = score(doc=3536,freq=2.0), product of:
            0.07784514 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.035122856 = queryNorm
            0.085706696 = fieldWeight in 3536, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02734375 = fieldNorm(doc=3536)
        0.016585674 = weight(_text_:for in 3536) [ClassicSimilarity], result of:
          0.016585674 = score(doc=3536,freq=24.0), product of:
            0.06594466 = queryWeight, product of:
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.035122856 = queryNorm
            0.25150898 = fieldWeight in 3536, product of:
              4.8989797 = tf(freq=24.0), with freq of:
                24.0 = termFreq=24.0
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.02734375 = fieldNorm(doc=3536)
        0.020565914 = weight(_text_:the in 3536) [ClassicSimilarity], result of:
          0.020565914 = score(doc=3536,freq=74.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.37112147 = fieldWeight in 3536, product of:
              8.602325 = tf(freq=74.0), with freq of:
                74.0 = termFreq=74.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.02734375 = fieldNorm(doc=3536)
        0.01788548 = weight(_text_:of in 3536) [ClassicSimilarity], result of:
          0.01788548 = score(doc=3536,freq=58.0), product of:
            0.054923624 = queryWeight, product of:
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.035122856 = queryNorm
            0.32564276 = fieldWeight in 3536, product of:
              7.615773 = tf(freq=58.0), with freq of:
                58.0 = termFreq=58.0
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.02734375 = fieldNorm(doc=3536)
        0.020565914 = weight(_text_:the in 3536) [ClassicSimilarity], result of:
          0.020565914 = score(doc=3536,freq=74.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.37112147 = fieldWeight in 3536, product of:
              8.602325 = tf(freq=74.0), with freq of:
                74.0 = termFreq=74.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.02734375 = fieldNorm(doc=3536)
      0.5 = coord(6/12)
    
    Abstract
    I am writing to correct some of the misconceptions that Hjoerland and Nicolaisen appear to have about my paper in the previous issue of Knowledge Organization. I would like to address aspects of two of these misapprehensions. The first is the faulty interpretation they have given to my use of the term "naïve classification," and the second is the kinds of classification systems that they appear to believe are discussed in my paper as examples of "naïve classifications." First, the term "naïve classification" is directly analogous to the widely-understood and widely-accepted term "naïve indexing." It is not analogous to the terms to which Hjoerland and Nicolaisen compare it (i.e., "naïve physics", "naïve biology"). The term as I have defined it is not pejorative. It does not imply that the scholars who have developed naïve classifications have not given profoundly serious thought to their own scholarly work. My paper distinguishes between classifications for new knowledge developed by scholars in the various disciplines for the purposes of advancing disciplinary knowledge ("naïve classifications") and classifications for previously existing knowledge developed by information professionals for the purposes of creating access points in information retrieval systems ("professional classifications"). This distinction rests primarily on the purpose of the kind of classification system in question and only secondarily on the knowledge base of the scholars who have created it. Hjoerland and Nicolaisen appear to have misunderstood this point, which is made clearly and adequately in the title, in the abstract and throughout the text of my paper.
    Second, the paper posits that these different reasons for creating classification systems strongly influence the content and extent of the two kinds of classifications, but not necessarily their structures. By definition, naïve classifications for new knowledge have been developed for discrete areas of disciplinary inquiry in new areas of knowledge. These classifications do not attempt to classify the whole of that disciplinary area. That is, naïve classifications have an explicit purpose that is significantly different from the purpose of the major disciplinary classifications Hjoerland and Nicolaisen provide as examples of classifications they think I discuss under the rubric of "naïve classifications" (e.g., classifications for the entire field of archaeology, biology, linguistics, music, psychology, etc.). My paper is not concerned with these important classifications for major disciplinary areas. Instead, it is concerned solely and specifically with scholarly classifications for small areas of new knowledge within these major disciplines (e.g., cloth of aresta, double harpsichords, child-rearing practices, anomalous phenomena, etc.). Thus, I have nowhere suggested or implied that the broad disciplinary classifications mentioned by Hjoerland and Nicolaisen are appropriately categorized as "naïve classifications." For example, I have not associated the Periodic System of the Elements with naïve classifications, as Hjoerland and Nicolaisen state that I have done. Indeed, broad classifications of this type fall well outside the definition of naïve classifications set out in my paper. In this case, too, I believe that Hjoerland and Nicolaisen have misunderstood an important point in my paper. I agree with a number of points made in Hjoerland and Nicolaisen's paper. In particular, I agree that researchers in the knowledge organization field should adhere to the highest standards of scholarly and scientific precision. For that reason, I am glad to have had the opportunity to respond to their paper.
    Footnote
    In response to: Hjoerland, B., J. Nicolaisen: Scientific and scholarly classifications are not "naïve": a comment to Beghtol (2003). In: Knowledge organization. 31(2004) no.1, pp.55-61. - Cf. the rejoinder by Nicolaisen and Hjoerland in KO 31(2004) no.3, pp.199-201.
  3. Beghtol, C.: Toward a theory of fiction analysis for information storage and retrieval (1992) 0.04
    0.04376505 = product of:
      0.0875301 = sum of:
        0.009566996 = weight(_text_:information in 5830) [ClassicSimilarity], result of:
          0.009566996 = score(doc=5830,freq=2.0), product of:
            0.0616574 = queryWeight, product of:
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.035122856 = queryNorm
            0.1551638 = fieldWeight in 5830, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.0625 = fieldNorm(doc=5830)
        0.02188741 = weight(_text_:for in 5830) [ClassicSimilarity], result of:
          0.02188741 = score(doc=5830,freq=8.0), product of:
            0.06594466 = queryWeight, product of:
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.035122856 = queryNorm
            0.33190575 = fieldWeight in 5830, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.0625 = fieldNorm(doc=5830)
        0.010929092 = weight(_text_:the in 5830) [ClassicSimilarity], result of:
          0.010929092 = score(doc=5830,freq=4.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.19722053 = fieldWeight in 5830, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.0625 = fieldNorm(doc=5830)
        0.015182858 = weight(_text_:of in 5830) [ClassicSimilarity], result of:
          0.015182858 = score(doc=5830,freq=8.0), product of:
            0.054923624 = queryWeight, product of:
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.035122856 = queryNorm
            0.27643585 = fieldWeight in 5830, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.0625 = fieldNorm(doc=5830)
        0.010929092 = weight(_text_:the in 5830) [ClassicSimilarity], result of:
          0.010929092 = score(doc=5830,freq=4.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.19722053 = fieldWeight in 5830, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.0625 = fieldNorm(doc=5830)
        0.019034648 = product of:
          0.038069297 = sum of:
            0.038069297 = weight(_text_:22 in 5830) [ClassicSimilarity], result of:
              0.038069297 = score(doc=5830,freq=2.0), product of:
                0.12299426 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.035122856 = queryNorm
                0.30952093 = fieldWeight in 5830, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=5830)
          0.5 = coord(1/2)
      0.5 = coord(6/12)
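The top line of each tree then assembles the document score: the matched clause scores are summed and the sum is multiplied by the coord factor (matched clauses / total query clauses). Reading off the `product of` / `sum of` / `coord` lines for entry 3 (doc 5830):

```python
# Assembling entry 3's document score (doc 5830) from the six clause
# scores listed in its explain tree. coord(6/12) penalizes the document
# for matching only 6 of the query's 12 clauses.
clause_scores = [0.009566996, 0.02188741, 0.010929092,
                 0.015182858, 0.010929092, 0.019034648]
coord = 6 / 12
doc_score = sum(clause_scores) * coord   # 0.04376505, the "0.04" shown in the list
```

Note that the last clause score already includes a nested coord: the "22" sub-clause scored 0.038069297 and was halved by its own coord(1/2) before entering the outer sum.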
    
    Abstract
    This paper examines various issues that arise in establishing a theoretical basis for an experimental fiction analysis system. It analyzes the warrants of fiction and of works about fiction. From this analysis, it derives classificatory requirements for a fiction system. Classificatory techniques that may contribute to the specification of data elements in fiction are suggested.
    Date
    5. 8.2006 13:22:08
    Source
    Classification research for knowledge representation and organization. Proc. 5th Int. Study Conf. on Classification Research, Toronto, Canada, 24.-28.6.1991. Ed. by N.J. Williamson and M. Hudon
  4. Beghtol, C.: Naïve classification systems and the global information society (2004) 0.04
    0.043586206 = product of:
      0.08717241 = sum of:
        0.011958744 = weight(_text_:information in 3483) [ClassicSimilarity], result of:
          0.011958744 = score(doc=3483,freq=8.0), product of:
            0.0616574 = queryWeight, product of:
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.035122856 = queryNorm
            0.19395474 = fieldWeight in 3483, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3483)
        0.00967296 = weight(_text_:for in 3483) [ClassicSimilarity], result of:
          0.00967296 = score(doc=3483,freq=4.0), product of:
            0.06594466 = queryWeight, product of:
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.035122856 = queryNorm
            0.14668301 = fieldWeight in 3483, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3483)
        0.019320088 = weight(_text_:the in 3483) [ClassicSimilarity], result of:
          0.019320088 = score(doc=3483,freq=32.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.34863994 = fieldWeight in 3483, product of:
              5.656854 = tf(freq=32.0), with freq of:
                32.0 = termFreq=32.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3483)
        0.015003879 = weight(_text_:of in 3483) [ClassicSimilarity], result of:
          0.015003879 = score(doc=3483,freq=20.0), product of:
            0.054923624 = queryWeight, product of:
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.035122856 = queryNorm
            0.27317715 = fieldWeight in 3483, product of:
              4.472136 = tf(freq=20.0), with freq of:
                20.0 = termFreq=20.0
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3483)
        0.019320088 = weight(_text_:the in 3483) [ClassicSimilarity], result of:
          0.019320088 = score(doc=3483,freq=32.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.34863994 = fieldWeight in 3483, product of:
              5.656854 = tf(freq=32.0), with freq of:
                32.0 = termFreq=32.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3483)
        0.011896656 = product of:
          0.023793312 = sum of:
            0.023793312 = weight(_text_:22 in 3483) [ClassicSimilarity], result of:
              0.023793312 = score(doc=3483,freq=2.0), product of:
                0.12299426 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.035122856 = queryNorm
                0.19345059 = fieldWeight in 3483, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=3483)
          0.5 = coord(1/2)
      0.5 = coord(6/12)
    
    Abstract
    Classification is an activity that transcends time and space and that bridges the divisions between different languages and cultures, including the divisions between academic disciplines. Classificatory activity, however, serves different purposes in different situations. Classifications for information retrieval can be called "professional" classifications and classifications in other fields can be called "naïve" classifications because they are developed by people who have no particular interest in classificatory issues. The general purpose of naïve classification systems is to discover new knowledge. In contrast, the general purpose of information retrieval classifications is to classify pre-existing knowledge. Different classificatory purposes may thus inform systems that are intended to span the cultural specifics of the globalized information society. This paper builds on previous research into the purposes and characteristics of naïve classifications. It describes some of the relationships between the purpose and context of a naïve classification, the units of analysis used in it, and the theory that the context and the units of analysis imply.
    Footnote
    Cf.: Jacob, E.K.: Proposal for a classification of classifications built on Beghtol's distinction between "Naïve Classification" and "Professional Classification". In: Knowledge organization. 37(2010) no.2, pp.111-120.
    Pages
    S.19-22
    Source
    Knowledge organization and the global information society: Proceedings of the 8th International ISKO Conference 13-16 July 2004, London, UK. Ed.: I.C. McIlwaine
  5. Beghtol, C.: ¬A proposed ethical warrant for global knowledge representation and organization systems (2002) 0.04
    0.036934543 = product of:
      0.08864291 = sum of:
        0.012427893 = weight(_text_:information in 4462) [ClassicSimilarity], result of:
          0.012427893 = score(doc=4462,freq=6.0), product of:
            0.0616574 = queryWeight, product of:
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.035122856 = queryNorm
            0.20156369 = fieldWeight in 4462, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.046875 = fieldNorm(doc=4462)
        0.021715743 = weight(_text_:for in 4462) [ClassicSimilarity], result of:
          0.021715743 = score(doc=4462,freq=14.0), product of:
            0.06594466 = queryWeight, product of:
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.035122856 = queryNorm
            0.32930255 = fieldWeight in 4462, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.046875 = fieldNorm(doc=4462)
        0.01738808 = weight(_text_:the in 4462) [ClassicSimilarity], result of:
          0.01738808 = score(doc=4462,freq=18.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.31377596 = fieldWeight in 4462, product of:
              4.2426405 = tf(freq=18.0), with freq of:
                18.0 = termFreq=18.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.046875 = fieldNorm(doc=4462)
        0.019723112 = weight(_text_:of in 4462) [ClassicSimilarity], result of:
          0.019723112 = score(doc=4462,freq=24.0), product of:
            0.054923624 = queryWeight, product of:
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.035122856 = queryNorm
            0.3591007 = fieldWeight in 4462, product of:
              4.8989797 = tf(freq=24.0), with freq of:
                24.0 = termFreq=24.0
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.046875 = fieldNorm(doc=4462)
        0.01738808 = weight(_text_:the in 4462) [ClassicSimilarity], result of:
          0.01738808 = score(doc=4462,freq=18.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.31377596 = fieldWeight in 4462, product of:
              4.2426405 = tf(freq=18.0), with freq of:
                18.0 = termFreq=18.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.046875 = fieldNorm(doc=4462)
      0.41666666 = coord(5/12)
    
    Abstract
    New technologies have made the increased globalization of information resources and services possible. In this situation, it is ethically and intellectually beneficial to protect cultural and information diversity. This paper analyzes the problems of creating ethically based, globally accessible, and culturally acceptable knowledge representation and organization systems, and foundation principles for the ethical treatment of different cultures are established on the basis of the United Nations Universal Declaration of Human Rights (UDHR). The concept of "cultural hospitality", which can act as a theoretical framework for the ethical warrant of knowledge representation and organization systems, is described. This broad discussion is grounded with an extended example of one cultural universal, the concept of time and its expression in calendars. Methods of achieving cultural and user hospitality in information systems are discussed for their potential to create ethically based systems. It is concluded that cultural hospitality is a promising concept for assessing the ethical foundations of new knowledge representation and organization systems and for planning revisions to existing systems.
    Source
    Journal of documentation. 58(2002) no.5, S.507-532
  6. Beghtol, C.: Knowledge representation and organization in the ITER project : A Web-based digital library for scholars of the middle ages and renaissance (http://iter.utoronto.ca) (2001) 0.04
    0.036484405 = product of:
      0.087562576 = sum of:
        0.010147331 = weight(_text_:information in 638) [ClassicSimilarity], result of:
          0.010147331 = score(doc=638,freq=4.0), product of:
            0.0616574 = queryWeight, product of:
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.035122856 = queryNorm
            0.16457605 = fieldWeight in 638, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.046875 = fieldNorm(doc=638)
        0.016415559 = weight(_text_:for in 638) [ClassicSimilarity], result of:
          0.016415559 = score(doc=638,freq=8.0), product of:
            0.06594466 = queryWeight, product of:
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.035122856 = queryNorm
            0.24892932 = fieldWeight in 638, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.046875 = fieldNorm(doc=638)
        0.022447916 = weight(_text_:the in 638) [ClassicSimilarity], result of:
          0.022447916 = score(doc=638,freq=30.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.40508303 = fieldWeight in 638, product of:
              5.477226 = tf(freq=30.0), with freq of:
                30.0 = termFreq=30.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.046875 = fieldNorm(doc=638)
        0.016103853 = weight(_text_:of in 638) [ClassicSimilarity], result of:
          0.016103853 = score(doc=638,freq=16.0), product of:
            0.054923624 = queryWeight, product of:
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.035122856 = queryNorm
            0.2932045 = fieldWeight in 638, product of:
              4.0 = tf(freq=16.0), with freq of:
                16.0 = termFreq=16.0
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.046875 = fieldNorm(doc=638)
        0.022447916 = weight(_text_:the in 638) [ClassicSimilarity], result of:
          0.022447916 = score(doc=638,freq=30.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.40508303 = fieldWeight in 638, product of:
              5.477226 = tf(freq=30.0), with freq of:
                30.0 = termFreq=30.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.046875 = fieldNorm(doc=638)
      0.41666666 = coord(5/12)
    
    Abstract
    The Iter Project ("iter" means "path" or "journey" in Latin) is an internationally supported non-profit research project created with the objective of providing electronic access to all kinds and formats of materials that relate to the Middle Ages and Renaissance (400-1700) and that were published between 1700 and the present. Knowledge representation and organization decisions for the Project were influenced by its potential international clientele of scholarly users, and these decisions illustrate the importance and efficacy of collaboration between specialized users and information professionals. The paper outlines the scholarly principles and information goals of the Project and describes in detail the methodology developed to provide reliable and consistent knowledge representation and organization for one component of the Project, the Iter Bibliography. Examples of fully catalogued records for the Iter Bibliography are included.
  7. Beghtol, C.: Ethical decision-making for knowledge representation and organization systems for global use (2005) 0.04
    0.036209572 = product of:
      0.086902976 = sum of:
        0.0118385535 = weight(_text_:information in 1648) [ClassicSimilarity], result of:
          0.0118385535 = score(doc=1648,freq=4.0), product of:
            0.0616574 = queryWeight, product of:
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.035122856 = queryNorm
            0.1920054 = fieldWeight in 1648, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1648)
        0.02708429 = weight(_text_:for in 1648) [ClassicSimilarity], result of:
          0.02708429 = score(doc=1648,freq=16.0), product of:
            0.06594466 = queryWeight, product of:
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.035122856 = queryNorm
            0.41071242 = fieldWeight in 1648, product of:
              4.0 = tf(freq=16.0), with freq of:
                16.0 = termFreq=16.0
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1648)
        0.016563525 = weight(_text_:the in 1648) [ClassicSimilarity], result of:
          0.016563525 = score(doc=1648,freq=12.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.2988965 = fieldWeight in 1648, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1648)
        0.0148530835 = weight(_text_:of in 1648) [ClassicSimilarity], result of:
          0.0148530835 = score(doc=1648,freq=10.0), product of:
            0.054923624 = queryWeight, product of:
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.035122856 = queryNorm
            0.2704316 = fieldWeight in 1648, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1648)
        0.016563525 = weight(_text_:the in 1648) [ClassicSimilarity], result of:
          0.016563525 = score(doc=1648,freq=12.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.2988965 = fieldWeight in 1648, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1648)
      0.41666666 = coord(5/12)
    
    Abstract
    In this article, ethical decision-making methods for creating, revising, and maintaining knowledge representation and organization systems are described, particularly in relation to the global use of these systems. The analysis uses a three-level model and the literature on ethically based decision-making in the social and technical sciences. In addition, methods for making these kinds of decisions in an ethical manner are presented. This multidisciplinary approach is generalizable to other information areas and is useful for encouraging the development of ethics policies for knowledge representation and organization systems and for other kinds of systems or institutions.
    Source
    Journal of the American Society for Information Science and Technology. 56(2005) no.9, S.903-912
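    The indented figures under each entry are Lucene "explain" output for the classic TF-IDF (ClassicSimilarity) ranking formula. As a minimal sketch of how the quoted components recompose (assuming Lucene's standard ClassicSimilarity definitions; the constants are copied from the entry 7 tree above), the "information" clause for doc 1648 can be reproduced:

    ```python
    import math

    # Constants quoted in the explain tree for entry 7, field "information", doc 1648
    idf = 1.7554779          # idf(docFreq=20772, maxDocs=44218)
    query_norm = 0.035122856
    field_norm = 0.0546875
    freq = 4.0               # termFreq

    # ClassicSimilarity: idf = 1 + ln(maxDocs / (docFreq + 1)) reproduces the quoted idf
    idf_check = 1.0 + math.log(44218 / (20772 + 1))

    tf = math.sqrt(freq)                         # tf = sqrt(termFreq) = 2.0
    query_weight = idf * query_norm              # matches the quoted "queryWeight"
    field_weight = tf * idf * field_norm         # matches the quoted "fieldWeight"
    clause_score = query_weight * field_weight   # matches the clause total 0.0118385535

    # Document score = (sum of the five clause scores) * coord(5/12)
    clause_sum = 0.086902976
    doc_score = clause_sum * (5.0 / 12.0)        # matches the quoted 0.036209572
    ```

    The same arithmetic applies to every clause in the listing: fieldWeight = sqrt(termFreq) * idf * fieldNorm, queryWeight = idf * queryNorm, and the per-document score multiplies the clause sum by the coord factor.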
  8. Beghtol, C.: L'efficacia del recupero [The effectiveness of retrieval] (1993) 0.04
    0.035984185 = product of:
      0.08636204 = sum of:
        0.009566996 = weight(_text_:information in 4018) [ClassicSimilarity], result of:
          0.009566996 = score(doc=4018,freq=2.0), product of:
            0.0616574 = queryWeight, product of:
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.035122856 = queryNorm
            0.1551638 = fieldWeight in 4018, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.0625 = fieldNorm(doc=4018)
        0.010943705 = weight(_text_:for in 4018) [ClassicSimilarity], result of:
          0.010943705 = score(doc=4018,freq=2.0), product of:
            0.06594466 = queryWeight, product of:
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.035122856 = queryNorm
            0.16595288 = fieldWeight in 4018, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.0625 = fieldNorm(doc=4018)
        0.024438193 = weight(_text_:the in 4018) [ClassicSimilarity], result of:
          0.024438193 = score(doc=4018,freq=20.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.44099852 = fieldWeight in 4018, product of:
              4.472136 = tf(freq=20.0), with freq of:
                20.0 = termFreq=20.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.0625 = fieldNorm(doc=4018)
        0.016974952 = weight(_text_:of in 4018) [ClassicSimilarity], result of:
          0.016974952 = score(doc=4018,freq=10.0), product of:
            0.054923624 = queryWeight, product of:
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.035122856 = queryNorm
            0.3090647 = fieldWeight in 4018, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.0625 = fieldNorm(doc=4018)
        0.024438193 = weight(_text_:the in 4018) [ClassicSimilarity], result of:
          0.024438193 = score(doc=4018,freq=20.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.44099852 = fieldWeight in 4018, product of:
              4.472136 = tf(freq=20.0), with freq of:
                20.0 = termFreq=20.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.0625 = fieldNorm(doc=4018)
      0.41666666 = coord(5/12)
    
    Abstract
    Proposes a new experimental methodology for evaluating the results of library research from the user's viewpoint. Illustrates the theory by comparing the efficacy of information retrieved from 2 document catalogues, identical except that one is alphabetical and the other numerical/verbal. The methodology utilises the concept of 3 dependent variables: 'promising references retrieved' by the researcher; 'documents read'; and 'documents cited'. Claims that the retrieval effectiveness of the techniques outlined compares favourably with that of W.S. Cooper's methodology
  9. Beghtol, C.: Classification for information retrieval and classification for knowledge discovery : relationships between "professional" and "naïve" classifications (2003) 0.03
    0.03446123 = product of:
      0.08270695 = sum of:
        0.013370283 = weight(_text_:information in 3021) [ClassicSimilarity], result of:
          0.013370283 = score(doc=3021,freq=10.0), product of:
            0.0616574 = queryWeight, product of:
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.035122856 = queryNorm
            0.21684799 = fieldWeight in 3021, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3021)
        0.01675406 = weight(_text_:for in 3021) [ClassicSimilarity], result of:
          0.01675406 = score(doc=3021,freq=12.0), product of:
            0.06594466 = queryWeight, product of:
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.035122856 = queryNorm
            0.2540624 = fieldWeight in 3021, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3021)
        0.017414892 = weight(_text_:the in 3021) [ClassicSimilarity], result of:
          0.017414892 = score(doc=3021,freq=26.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.3142598 = fieldWeight in 3021, product of:
              5.0990195 = tf(freq=26.0), with freq of:
                26.0 = termFreq=26.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3021)
        0.01775283 = weight(_text_:of in 3021) [ClassicSimilarity], result of:
          0.01775283 = score(doc=3021,freq=28.0), product of:
            0.054923624 = queryWeight, product of:
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.035122856 = queryNorm
            0.32322758 = fieldWeight in 3021, product of:
              5.2915025 = tf(freq=28.0), with freq of:
                28.0 = termFreq=28.0
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3021)
        0.017414892 = weight(_text_:the in 3021) [ClassicSimilarity], result of:
          0.017414892 = score(doc=3021,freq=26.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.3142598 = fieldWeight in 3021, product of:
              5.0990195 = tf(freq=26.0), with freq of:
                26.0 = termFreq=26.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3021)
      0.41666666 = coord(5/12)
    
    Abstract
    Classification is a transdisciplinary activity that occurs during all human pursuits. Classificatory activity, however, serves different purposes in different situations. In information retrieval, the primary purpose of classification is to find knowledge that already exists, but one of the purposes of classification in other fields is to discover new knowledge. In this paper, classifications for information retrieval are called "professional" classifications because they are devised by people who have a professional interest in classification, and classifications for knowledge discovery are called "naive" classifications because they are devised by people who have no particular interest in studying classification as an end in itself. This paper compares the overall purposes and methods of these two kinds of classifications and provides a general model of the relationships between the two kinds of classificatory activity in the context of information studies. This model addresses issues of the influence of scholarly activity and communication on the creation and revision of classifications for the purposes of information retrieval and for the purposes of knowledge discovery. Further comparisons elucidate the relationships between the universality of classificatory methods and the specific purposes served by naive and professional classification systems.
  10. Beghtol, C.: Within, among, between : three faces of interdisciplinarity (1995) 0.03
    0.033895288 = product of:
      0.081348695 = sum of:
        0.018718397 = weight(_text_:information in 1297) [ClassicSimilarity], result of:
          0.018718397 = score(doc=1297,freq=10.0), product of:
            0.0616574 = queryWeight, product of:
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.035122856 = queryNorm
            0.3035872 = fieldWeight in 1297, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1297)
        0.009575742 = weight(_text_:for in 1297) [ClassicSimilarity], result of:
          0.009575742 = score(doc=1297,freq=2.0), product of:
            0.06594466 = queryWeight, product of:
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.035122856 = queryNorm
            0.14520876 = fieldWeight in 1297, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1297)
        0.016563525 = weight(_text_:the in 1297) [ClassicSimilarity], result of:
          0.016563525 = score(doc=1297,freq=12.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.2988965 = fieldWeight in 1297, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1297)
        0.0199275 = weight(_text_:of in 1297) [ClassicSimilarity], result of:
          0.0199275 = score(doc=1297,freq=18.0), product of:
            0.054923624 = queryWeight, product of:
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.035122856 = queryNorm
            0.36282203 = fieldWeight in 1297, product of:
              4.2426405 = tf(freq=18.0), with freq of:
                18.0 = termFreq=18.0
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1297)
        0.016563525 = weight(_text_:the in 1297) [ClassicSimilarity], result of:
          0.016563525 = score(doc=1297,freq=12.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.2988965 = fieldWeight in 1297, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1297)
      0.41666666 = coord(5/12)
    
    Abstract
    Interdisciplinarity is a means by which the unity of information and the separateness of disciplines can work together harmoniously. Characterizes disciplinarity and the role of information science in serving other disciplines as well as its own. Interdisciplinary relationships occur among all knowledge-seeking disciplines (including LIS) and between LIS and every other field of knowledge production, utilization, and practical action. Considers how LIS can promote interdisciplinary relationships and research.
    Imprint
    Alberta : Alberta University, School of Library and Information Studies
    Source
    Connectedness: information, systems, people, organizations. Proceedings of CAIS/ACSI 95, the proceedings of the 23rd Annual Conference of the Canadian Association for Information Science. Ed. by Hope A. Olson and Denis B. Ward
  11. Beghtol, C.: From the universe of knowledge to the universe of concepts : the structural revolution in classification for information retrieval (2008) 0.03
    0.033117868 = product of:
      0.07948288 = sum of:
        0.005979372 = weight(_text_:information in 1856) [ClassicSimilarity], result of:
          0.005979372 = score(doc=1856,freq=2.0), product of:
            0.0616574 = queryWeight, product of:
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.035122856 = queryNorm
            0.09697737 = fieldWeight in 1856, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1856)
        0.00967296 = weight(_text_:for in 1856) [ClassicSimilarity], result of:
          0.00967296 = score(doc=1856,freq=4.0), product of:
            0.06594466 = queryWeight, product of:
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.035122856 = queryNorm
            0.14668301 = fieldWeight in 1856, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1856)
        0.02213394 = weight(_text_:the in 1856) [ClassicSimilarity], result of:
          0.02213394 = score(doc=1856,freq=42.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.39941722 = fieldWeight in 1856, product of:
              6.4807405 = tf(freq=42.0), with freq of:
                42.0 = termFreq=42.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1856)
        0.019562665 = weight(_text_:of in 1856) [ClassicSimilarity], result of:
          0.019562665 = score(doc=1856,freq=34.0), product of:
            0.054923624 = queryWeight, product of:
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.035122856 = queryNorm
            0.35617945 = fieldWeight in 1856, product of:
              5.8309517 = tf(freq=34.0), with freq of:
                34.0 = termFreq=34.0
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1856)
        0.02213394 = weight(_text_:the in 1856) [ClassicSimilarity], result of:
          0.02213394 = score(doc=1856,freq=42.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.39941722 = fieldWeight in 1856, product of:
              6.4807405 = tf(freq=42.0), with freq of:
                42.0 = termFreq=42.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1856)
      0.41666666 = coord(5/12)
    
    Abstract
    During the twentieth century, bibliographic classification theory underwent a structural revolution. The first modern bibliographic classifications were top-down systems that started at the universe of knowledge and subdivided that universe downward to minute subclasses. After the invention of faceted classification by S.R. Ranganathan, the ideal was to build bottom-up classifications that started with the universe of concepts and built upward to larger and larger faceted classes. This ideal has not been achieved, and the two kinds of classification systems are not mutually exclusive. This paper examines the process by which this structural revolution was accomplished by looking at the spread of facet theory after 1924 when Ranganathan attended the School of Librarianship, London, through selected classification textbooks that were published after that date. To this end, the paper examines the role of W.C.B. Sayers as a teacher and author of three editions of The Manual of Classification for Librarians and Bibliographers. Sayers influenced both Ranganathan and the various members of the Classification Research Group (CRG) who were his students. Further, the paper contrasts the methods of evaluating classification systems that arose between Sayers's Canons of Classification in 1915-1916 and J. Mills's A Modern Outline of Library Classification in 1960 in order to demonstrate the speed with which one kind of classificatory structure was overtaken by another.
  12. Beghtol, C.: Stories : applications of narrative discourse analysis to issues in information storage and retrieval (1997) 0.03
    0.032806836 = product of:
      0.07873641 = sum of:
        0.0118385535 = weight(_text_:information in 5844) [ClassicSimilarity], result of:
          0.0118385535 = score(doc=5844,freq=4.0), product of:
            0.0616574 = queryWeight, product of:
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.035122856 = queryNorm
            0.1920054 = fieldWeight in 5844, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5844)
        0.013542145 = weight(_text_:for in 5844) [ClassicSimilarity], result of:
          0.013542145 = score(doc=5844,freq=4.0), product of:
            0.06594466 = queryWeight, product of:
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.035122856 = queryNorm
            0.20535621 = fieldWeight in 5844, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5844)
        0.01789065 = weight(_text_:the in 5844) [ClassicSimilarity], result of:
          0.01789065 = score(doc=5844,freq=14.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.3228451 = fieldWeight in 5844, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5844)
        0.017574405 = weight(_text_:of in 5844) [ClassicSimilarity], result of:
          0.017574405 = score(doc=5844,freq=14.0), product of:
            0.054923624 = queryWeight, product of:
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.035122856 = queryNorm
            0.31997898 = fieldWeight in 5844, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5844)
        0.01789065 = weight(_text_:the in 5844) [ClassicSimilarity], result of:
          0.01789065 = score(doc=5844,freq=14.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.3228451 = fieldWeight in 5844, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5844)
      0.41666666 = coord(5/12)
    
    Abstract
    The arts, humanities, and social sciences commonly borrow concepts and methods from the sciences, but interdisciplinary borrowing seldom occurs in the opposite direction. Research on narrative discourse is relevant to problems of documentary storage and retrieval, for the arts and humanities in particular, but also for other broad areas of knowledge. This paper views the potential application of narrative discourse analysis to information storage and retrieval problems from 2 perspectives: 1) analysis and comparison of narrative documents in all disciplines may be simplified if fundamental categories that occur in narrative documents can be isolated; and 2) the possibility of subdividing the world of knowledge initially into narrative and non-narrative documents is explored with particular attention to Werlich's work on text types
  13. Beghtol, C.: The facet concept as a universal principle of subdivision (2006) 0.03
    0.03261022 = product of:
      0.07826453 = sum of:
        0.016742244 = weight(_text_:information in 1483) [ClassicSimilarity], result of:
          0.016742244 = score(doc=1483,freq=8.0), product of:
            0.0616574 = queryWeight, product of:
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.035122856 = queryNorm
            0.27153665 = fieldWeight in 1483, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1483)
        0.013542145 = weight(_text_:for in 1483) [ClassicSimilarity], result of:
          0.013542145 = score(doc=1483,freq=4.0), product of:
            0.06594466 = queryWeight, product of:
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.035122856 = queryNorm
            0.20535621 = fieldWeight in 1483, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1483)
        0.016563525 = weight(_text_:the in 1483) [ClassicSimilarity], result of:
          0.016563525 = score(doc=1483,freq=12.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.2988965 = fieldWeight in 1483, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1483)
        0.0148530835 = weight(_text_:of in 1483) [ClassicSimilarity], result of:
          0.0148530835 = score(doc=1483,freq=10.0), product of:
            0.054923624 = queryWeight, product of:
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.035122856 = queryNorm
            0.2704316 = fieldWeight in 1483, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1483)
        0.016563525 = weight(_text_:the in 1483) [ClassicSimilarity], result of:
          0.016563525 = score(doc=1483,freq=12.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.2988965 = fieldWeight in 1483, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1483)
      0.41666666 = coord(5/12)
    
    Abstract
    Facet analysis has been one of the foremost contenders as a design principle for information retrieval classifications, both manual and electronic, over the last fifty years. Evidence is presented that the facet concept has a claim to be considered a method of subdivision that is cognitively available to human beings, regardless of language, culture, or academic discipline. The possibility that faceting is a universal method of subdivision enhances the claim that facet analysis is an unusually useful design principle for information retrieval classifications in any field. This possibility needs further investigation in an age when information access across boundaries is both necessary and possible.
    Source
    Knowledge organization, information systems and other essays: Professor A. Neelameghan Festschrift. Ed. by K.S. Raghavan and K.N. Prasad
  14. Beghtol, C.: The Iter Bibliography : International standard subject access to medieval and renaissance materials (400-1700) (2003) 0.03
    0.032380946 = product of:
      0.07771427 = sum of:
        0.009566996 = weight(_text_:information in 3965) [ClassicSimilarity], result of:
          0.009566996 = score(doc=3965,freq=8.0), product of:
            0.0616574 = queryWeight, product of:
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.035122856 = queryNorm
            0.1551638 = fieldWeight in 3965, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.03125 = fieldNorm(doc=3965)
        0.013403247 = weight(_text_:for in 3965) [ClassicSimilarity], result of:
          0.013403247 = score(doc=3965,freq=12.0), product of:
            0.06594466 = queryWeight, product of:
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.035122856 = queryNorm
            0.20324993 = fieldWeight in 3965, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              1.8775425 = idf(docFreq=18385, maxDocs=44218)
              0.03125 = fieldNorm(doc=3965)
        0.019320088 = weight(_text_:the in 3965) [ClassicSimilarity], result of:
          0.019320088 = score(doc=3965,freq=50.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.34863994 = fieldWeight in 3965, product of:
              7.071068 = tf(freq=50.0), with freq of:
                50.0 = termFreq=50.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.03125 = fieldNorm(doc=3965)
        0.016103853 = weight(_text_:of in 3965) [ClassicSimilarity], result of:
          0.016103853 = score(doc=3965,freq=36.0), product of:
            0.054923624 = queryWeight, product of:
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.035122856 = queryNorm
            0.2932045 = fieldWeight in 3965, product of:
              6.0 = tf(freq=36.0), with freq of:
                36.0 = termFreq=36.0
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.03125 = fieldNorm(doc=3965)
        0.019320088 = weight(_text_:the in 3965) [ClassicSimilarity], result of:
          0.019320088 = score(doc=3965,freq=50.0), product of:
            0.05541559 = queryWeight, product of:
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.035122856 = queryNorm
            0.34863994 = fieldWeight in 3965, product of:
              7.071068 = tf(freq=50.0), with freq of:
                50.0 = termFreq=50.0
              1.5777643 = idf(docFreq=24812, maxDocs=44218)
              0.03125 = fieldNorm(doc=3965)
      0.41666666 = coord(5/12)
    
    Abstract
    Iter ("journey" or "path" in Latin) is a non-profit project for providing electronic access to materials pertaining to the Middle Ages and Renaissance (400-1700). Iter's background is described, and its centrepiece, the Iter Bibliography, is explicated. Emphasis is on the subject cataloguing process and on subject access to records for journal articles (using Library of Congress Subject Headings and the Dewey Decimal Classification). Basic subject analysis of the materials is provided by graduate students specializing in the Middle Ages and Renaissance periods, and, subsequently, subject access point systems are provided by information professionals. This close cooperation between subject and information experts would not be efficient without electronic capabilities.
    Content
    "1. Iter: Gateway to the Middle Ages and Renaissance Iter is a non-profit research project dedicated to providing electronic access to all kinds and formats of materials pertaining to the Middle Ages and Renaissance (400-1700). Iter began in 1995 as a joint initiative of the Renaissance Society of America (RSA) in New York City and the Centre for Reformation and Renaissance Studies (CRRS), Univ. of Toronto. By 1997, three more partners had joined: Faculty of Information Studies (FIS), Univ. of Toronto; Arizona Center for Medieval and Renaissance Studies (ACMRS), Arizona State Univ. at Tempe; and John P. Robarts Library, Univ. of Toronto. Iter was funded initially by the five partners and major foundations and, since 1998, has offered low-cost subscriptions to institutions and individuals. When Iter becomes financially self-sufficient, any profits will be used to enhance and expand the project. Iter databases are housed and maintained at the John P. Robarts Library. The interface is a customized version of DRA WebZ. A new interface using DRA Web can be searched now and will replace the DRA WebZ interface shortly. Iter was originally conceived as a comprehensive bibliography of secondary materials that would be an alternative to the existing commercial research tools for its period. These were expensive, generally appeared several years late, had limited subject indexing, were inconsistent in coverage, of uneven quality, and often depended on fragile networks of volunteers for identification of materials. The production of a reasonably priced, web-based, timely research tool was Iter's first priority. In addition, the partners wanted to involve graduate students in the project in order to contribute to the scholarly training and financial support of future scholars of the Middle Ages and Renaissance and to utilize as much automation as possible."
    Source
    Subject retrieval in a networked environment: Proceedings of the IFLA Satellite Meeting held in Dublin, OH, 14-16 August 2001 and sponsored by the IFLA Classification and Indexing Section, the IFLA Information Technology Section and OCLC. Ed.: I.C. McIlwaine
  15. Beghtol, C.: Exploring new approaches to the organization of knowledge : the subject classification of James Duff Brown (2004) 0.03
    Abstract
    James Duff Brown was an influential and energetic librarian in Great Britain in the late nineteenth and early twentieth centuries. His Subject Classification has characteristics that were unusual and idiosyncratic during his own time, but his work deserves recognition as one of the precursors of modern bibliographic classification systems. This article discusses a number of theories and classification practices that Brown developed. In particular, it investigates his views on the order of main classes, on the phenomenon of "concrete" subjects, and on the need for synthesized notations. It traces these ideas briefly into the future through the work of S. R. Ranganathan, the Classification Research Group, and the second edition of the Bliss Bibliographic Classification system. It concludes that Brown's work warrants further study for the light it may shed on current classification theory and practice.
    Footnote
    Contribution to a special issue: Pioneers in library and information science
  16. Beghtol, C.: 'Itself an education' : classification systems, theory, and research in the information studies curriculum (1997) 0.03
    Abstract
    The interdisciplinary field of information studies requires an eclectic and imaginative curriculum. Future information professionals need intellectual tools that will enable them to adapt to changed social and technological environments. In this situation, the study of classification, including both principles of application for current bibliographic systems and principles of construction that could be used to develop new systems for bibliographic and non-bibliographic materials, is one way to equip students with the balanced flexibility to adapt to changing needs. Knowledge of the organization of knowledge is basic to any kind of information work.
  17. Beghtol, C.: Domain analysis, literary warrant, and consensus : the case of fiction studies (1995) 0.03
    Abstract
    This article reports research that used descriptor subfields in the MLA Bibliography online to quantify literary warrant in the domain of scholarly work about fiction (i.e., 'fiction studies'). The research used Hulme's concept of literary warrant and Kernan's description of the interactive processes of literature and literary scholarship to justify quantifying existing subject indexing in existing bibliographic records as a first step in the domain analysis of a field. It was found that certain of the MLA Bibliography online's descriptor subfields and certain of the descriptor terms within those subfields occurred more often than would occur by chance. The techniques used in the research might be extended to domain analysis of other fields. Use of the methodology might improve the ability to evaluate existing subject access systems and to design future ones.
    Source
    Journal of the American Society for Information Science. 46(1995) no.1, S.30-44
  18. Beghtol, C.: Universal concepts, cultural warrant and cultural hospitality (2003) 0.03
    Abstract
    The problem of how to provide access to information regardless of linguistic or other domain boundaries or cultural traditions can be approached by examining how cultural universals are implemented in specific cultures at specific times and places. The universal concept of "time" and its implementation in calendars is used as an illustration, and how time is treated in knowledge organization systems is briefly described. A broadened definition for the concept of "hospitality" is proposed for use in evaluating the efficacy of knowledge organization systems. The identification of the complementary concept of "cultural hospitality" provides a theoretical framework to inform decisions about the types of access that can (and/or should) be provided by knowledge organization systems that purport to be globally useful and ethically balanced.
    Source
    Challenges in knowledge representation and organization for the 21st century: Integration of knowledge across boundaries. Proceedings of the 7th ISKO International Conference Granada, Spain, July 10-13, 2002. Ed.: M. López-Huertas
  19. Beghtol, C.: Nancy J. Williamson and the International Society for Knowledge Organization (ISKO) (2010) 0.03
    Abstract
    This article documents and analyzes Nancy J. Williamson's contributions to two of the major publications of the International Society for Knowledge Organization (ISKO), International Classification/Knowledge Organization and Advances in Classification Research. The results show her serious and long-standing commitment to the field of representing and organizing information and knowledge and her dedication to expanding worldwide interest and involvement in these fields. The Appendix provides access to each of Williamson's contributions to the two ISKO publications.
  20. Beghtol, C.: Classification theory (2010) 0.03
    Abstract
    In the library and information sciences, classification theories are used primarily for knowledge organization, either in a manual or in a machine environment. In this context, classification theories have usually been developed initially as a support for specific knowledge organization classification systems, although the theories and the systems have influenced and re-influenced each other in particular ways throughout their lives. This entry discusses theories for knowledge organization classifications using examples from a number of classification systems, but no one system is discussed at length. Instead, the entry is organized into sections that deal first with classificatory issues in general and then with theories of content, theories of structure, and theories of notation for knowledge organization classifications.
    Source
    Encyclopedia of library and information sciences. 3rd ed. Ed.: M.J. Bates