Search (38328 results, page 1 of 1917)

  1. Ackermann, E.: Piaget's constructivism, Papert's constructionism : what's the difference? (2001) 0.27
    0.27326033 = sum of:
      0.2698863 = product of:
        0.5397726 = sum of:
          0.21062453 = weight(_text_:3a in 692) [ClassicSimilarity], result of:
            0.21062453 = score(doc=692,freq=2.0), product of:
              0.44971764 = queryWeight, product of:
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.053045183 = queryNorm
              0.46834838 = fieldWeight in 692, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.0390625 = fieldNorm(doc=692)
          0.32914808 = weight(_text_:2c in 692) [ClassicSimilarity], result of:
            0.32914808 = score(doc=692,freq=2.0), product of:
              0.56218743 = queryWeight, product of:
                10.598275 = idf(docFreq=2, maxDocs=44218)
                0.053045183 = queryNorm
              0.5854775 = fieldWeight in 692, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                10.598275 = idf(docFreq=2, maxDocs=44218)
                0.0390625 = fieldNorm(doc=692)
        0.5 = coord(2/4)
      0.0033740045 = product of:
        0.006748009 = sum of:
          0.006748009 = weight(_text_:a in 692) [ClassicSimilarity], result of:
            0.006748009 = score(doc=692,freq=6.0), product of:
              0.06116359 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.053045183 = queryNorm
              0.11032722 = fieldWeight in 692, product of:
                2.4494898 = tf(freq=6.0), with freq of:
                  6.0 = termFreq=6.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.0390625 = fieldNorm(doc=692)
        0.5 = coord(1/2)
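    The score breakdowns shown for each entry are Lucene "explain" trees for a TF-IDF (ClassicSimilarity) ranking. As a sanity check of how the printed factors compose, here is a minimal Python sketch that recomputes the first leaf weight and the total score of entry 1 from the values above; it assumes the standard ClassicSimilarity composition (tf = sqrt(freq), queryWeight = idf * queryNorm, fieldWeight = tf * idf * fieldNorm, leaf score = queryWeight * fieldWeight, branches combined by sums, products and coord factors).

      import math

      # Factors copied from the explain tree of entry 1 (doc 692), term "_text_:3a".
      query_norm = 0.053045183
      idf_3a = 8.478011                             # idf(docFreq=24, maxDocs=44218)
      tf_3a = math.sqrt(2.0)                        # 1.4142135 = tf(freq=2.0)
      field_norm = 0.0390625

      query_weight = idf_3a * query_norm            # ~0.44971764
      field_weight = tf_3a * idf_3a * field_norm    # ~0.46834838
      leaf_3a = query_weight * field_weight         # ~0.21062453

      # The two rare-term leaves are summed and scaled by coord(2/4) = 0.5;
      # the "_text_:a" branch is scaled by coord(1/2) = 0.5; the total is their sum.
      leaf_2c = 0.32914808
      total = (leaf_3a + leaf_2c) * 0.5 + 0.006748009 * 0.5
      print(leaf_3a, total)   # approx. 0.2106245 and 0.2732603, matching the tree up to rounding

    The same structure repeats in every entry below, only with different terms, frequencies and normalization factors.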
    
    Abstract
    What is the difference between Piaget's constructivism and Papert's "constructionism"? Beyond the mere play on words, I think the distinction holds, and that integrating both views can enrich our understanding of how people learn and grow. Piaget's constructivism offers a window into what children are interested in, and able to achieve, at different stages of their development. The theory describes how children's ways of doing and thinking evolve over time, and under which circumstances children are more likely to let go of, or hold onto, their currently held views. Piaget suggests that children have very good reasons not to abandon their worldviews just because someone else, be it an expert, tells them they're wrong. Papert's constructionism, in contrast, focuses more on the art of learning, or 'learning to learn', and on the significance of making things in learning. Papert is interested in how learners engage in a conversation with [their own or other people's] artifacts, and how these conversations boost self-directed learning, and ultimately facilitate the construction of new knowledge. He stresses the importance of tools, media, and context in human development. Integrating both perspectives illuminates the processes by which individuals come to make sense of their experience, gradually optimizing their interactions with the world.
    Content
    Cf.: https://www.semanticscholar.org/paper/Piaget-%E2%80%99-s-Constructivism-%2C-Papert-%E2%80%99-s-%3A-What-%E2%80%99-s-Ackermann/89cbcc1e740a4591443ff4765a6ae8df0fdf5554, which also lists further related contributions. Also published in: Learning Group Publication 5(2001) no.3, S.438.
    Type
    a
  2. Carter, J.A.: PASSPORT/PRISM: authors and titles and MARC : oh my! (1993) 0.18
    0.17617816 = sum of:
      0.048720833 = product of:
        0.19488333 = sum of:
          0.19488333 = weight(_text_:authors in 527) [ClassicSimilarity], result of:
            0.19488333 = score(doc=527,freq=2.0), product of:
              0.24182312 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.053045183 = queryNorm
              0.80589205 = fieldWeight in 527, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.125 = fieldNorm(doc=527)
        0.25 = coord(1/4)
      0.12745732 = sum of:
        0.012467085 = weight(_text_:a in 527) [ClassicSimilarity], result of:
          0.012467085 = score(doc=527,freq=2.0), product of:
            0.06116359 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.053045183 = queryNorm
            0.20383182 = fieldWeight in 527, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.125 = fieldNorm(doc=527)
        0.11499024 = weight(_text_:22 in 527) [ClassicSimilarity], result of:
          0.11499024 = score(doc=527,freq=2.0), product of:
            0.1857552 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.053045183 = queryNorm
            0.61904186 = fieldWeight in 527, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.125 = fieldNorm(doc=527)
    
    Source
    OCLC systems and services. 9(1993) no.3, S.20-22
    Type
    a
  3. Gödert, W.; Hubrich, J.; Boteram, F.: Thematische Recherche und Interoperabilität : Wege zur Optimierung des Zugriffs auf heterogen erschlossene Dokumente (2009) 0.12
    0.12211744 = sum of:
      0.08228702 = product of:
        0.32914808 = sum of:
          0.32914808 = weight(_text_:2c in 193) [ClassicSimilarity], result of:
            0.32914808 = score(doc=193,freq=2.0), product of:
              0.56218743 = queryWeight, product of:
                10.598275 = idf(docFreq=2, maxDocs=44218)
                0.053045183 = queryNorm
              0.5854775 = fieldWeight in 193, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                10.598275 = idf(docFreq=2, maxDocs=44218)
                0.0390625 = fieldNorm(doc=193)
        0.25 = coord(1/4)
      0.039830416 = sum of:
        0.0038959642 = weight(_text_:a in 193) [ClassicSimilarity], result of:
          0.0038959642 = score(doc=193,freq=2.0), product of:
            0.06116359 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.053045183 = queryNorm
            0.06369744 = fieldWeight in 193, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0390625 = fieldNorm(doc=193)
        0.035934452 = weight(_text_:22 in 193) [ClassicSimilarity], result of:
          0.035934452 = score(doc=193,freq=2.0), product of:
            0.1857552 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.053045183 = queryNorm
            0.19345059 = fieldWeight in 193, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0390625 = fieldNorm(doc=193)
    
    Source
    https://opus4.kobv.de/opus4-bib-info/frontdoor/index/index/searchtype/authorsearch/author/%22Hubrich%2C+Jessica%22/docId/703/start/0/rows/20
    Type
    a
  4. Hotho, A.; Bloehdorn, S.: Data Mining 2004 : Text classification by boosting weak learners based on terms and concepts (2004) 0.12
    0.115659006 = sum of:
      0.06318735 = product of:
        0.2527494 = sum of:
          0.2527494 = weight(_text_:3a in 562) [ClassicSimilarity], result of:
            0.2527494 = score(doc=562,freq=2.0), product of:
              0.44971764 = queryWeight, product of:
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.053045183 = queryNorm
              0.56201804 = fieldWeight in 562, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.046875 = fieldNorm(doc=562)
        0.25 = coord(1/4)
      0.052471653 = sum of:
        0.009350315 = weight(_text_:a in 562) [ClassicSimilarity], result of:
          0.009350315 = score(doc=562,freq=8.0), product of:
            0.06116359 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.053045183 = queryNorm
            0.15287387 = fieldWeight in 562, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046875 = fieldNorm(doc=562)
        0.043121338 = weight(_text_:22 in 562) [ClassicSimilarity], result of:
          0.043121338 = score(doc=562,freq=2.0), product of:
            0.1857552 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.053045183 = queryNorm
            0.23214069 = fieldWeight in 562, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.046875 = fieldNorm(doc=562)
    
    Abstract
    Document representations for text classification are typically based on the classical Bag-Of-Words paradigm. This approach comes with deficiencies that motivate the integration of features on a higher semantic level than single words. In this paper we propose an enhancement of the classical document representation through concepts extracted from background knowledge. Boosting is used for actual classification. Experimental evaluations on two well known text corpora support our approach through consistent improvement of the results.
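    As a rough, generic illustration of the boosting setup the abstract describes (weak learners over term features), the sketch below trains scikit-learn's AdaBoost, whose default weak learner is a depth-1 decision stump, on a plain bag-of-words representation. The concept features extracted from background knowledge, which are the paper's actual contribution, are not reproduced here, and the toy corpus, labels and parameters are placeholders.

      from sklearn.ensemble import AdaBoostClassifier
      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.pipeline import make_pipeline

      # Toy stand-in for a labelled text corpus.
      docs = ["wheat prices rise", "corn harvest falls", "stock market rallies", "bond yields drop"]
      labels = ["agriculture", "agriculture", "finance", "finance"]

      # Bag-of-words terms only; the paper additionally appends concept features
      # drawn from background knowledge before boosting.
      model = make_pipeline(CountVectorizer(), AdaBoostClassifier(n_estimators=50))
      model.fit(docs, labels)
      print(model.predict(["wheat harvest report"]))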
    Content
    Cf.: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.91.4940&rep=rep1&type=pdf.
    Date
    8. 1.2013 10:22:32
    Type
    a
  5. Soergel, D.: Knowledge organization for learning (2014) 0.10
    0.10017557 = sum of:
      0.021315364 = product of:
        0.08526146 = sum of:
          0.08526146 = weight(_text_:authors in 1400) [ClassicSimilarity], result of:
            0.08526146 = score(doc=1400,freq=2.0), product of:
              0.24182312 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.053045183 = queryNorm
              0.35257778 = fieldWeight in 1400, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0546875 = fieldNorm(doc=1400)
        0.25 = coord(1/4)
      0.0788602 = sum of:
        0.0077136164 = weight(_text_:a in 1400) [ClassicSimilarity], result of:
          0.0077136164 = score(doc=1400,freq=4.0), product of:
            0.06116359 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.053045183 = queryNorm
            0.12611452 = fieldWeight in 1400, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1400)
        0.071146585 = weight(_text_:22 in 1400) [ClassicSimilarity], result of:
          0.071146585 = score(doc=1400,freq=4.0), product of:
            0.1857552 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.053045183 = queryNorm
            0.38301262 = fieldWeight in 1400, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1400)
    
    Abstract
    This paper discusses and illustrates through examples how meaningful or deep learning can be supported through well-structured presentation of material, through giving learners schemas they can use to organize knowledge in their minds, and through helping learners to understand knowledge organization principles they can use to construct their own schemas. It is a call to all authors, educators and information designers to pay attention to meaningful presentation that expresses the internal structure of the domain and facilitates the learner's assimilation of concepts and their relationships.
    Pages
    S.22-32
    Source
    Knowledge organization in the 21st century: between historical patterns and future prospects. Proceedings of the Thirteenth International ISKO Conference 19-22 May 2014, Kraków, Poland. Ed.: Wieslaw Babik
    Type
    a
  6. Siddiqui, M.A.: ¬A bibliometric study of authorship characteristics in four international information science journals (1997) 0.10
    0.09722459 = sum of:
      0.04475294 = product of:
        0.17901176 = sum of:
          0.17901176 = weight(_text_:authors in 853) [ClassicSimilarity], result of:
            0.17901176 = score(doc=853,freq=12.0), product of:
              0.24182312 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.053045183 = queryNorm
              0.7402591 = fieldWeight in 853, product of:
                3.4641016 = tf(freq=12.0), with freq of:
                  12.0 = termFreq=12.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.046875 = fieldNorm(doc=853)
        0.25 = coord(1/4)
      0.052471653 = sum of:
        0.009350315 = weight(_text_:a in 853) [ClassicSimilarity], result of:
          0.009350315 = score(doc=853,freq=8.0), product of:
            0.06116359 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.053045183 = queryNorm
            0.15287387 = fieldWeight in 853, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046875 = fieldNorm(doc=853)
        0.043121338 = weight(_text_:22 in 853) [ClassicSimilarity], result of:
          0.043121338 = score(doc=853,freq=2.0), product of:
            0.1857552 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.053045183 = queryNorm
            0.23214069 = fieldWeight in 853, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.046875 = fieldNorm(doc=853)
    
    Abstract
    Reports results of a bibliometric study of the authorship characteristics of articles published in 4 major information science periodicals: JASIS, Information technology and libraries, Journal of information science, and Program. The aim was to determine the details of their authors, such as: sex, occupation, affiliation, geographic distribution, and institutional affiliation. A total of 163 articles published in 1993 and written by 294 authors were analyzed. Results indicate that: men (206 or 70%) publish 3.0 times more articles than women (69 or 23.5%). Schools of library and information science contributed the most authors. The majority of authors came from the USA (148 or 50.3%), with the Midwest region claiming the largest share (110 or 25.0%). Academic libraries (110 or 37.4%) account for the major share of library publication. 12 schools of library and information science, in the USA, contributed 32 authors (50.0%) and assistant professors (25 or 39.1%) publish the most in these library schools. Male school of library and information science authors publish 1.6 times more than their female counterparts.
    Source
    International forum on information and documentation. 22(1997) no.3, S.3-23
    Type
    a
  7. Hickey, T.B.; Toves, J.; O'Neill, E.T.: NACO normalization : a detailed examination of the authority file comparison rules (2006) 0.10
    0.09667474 = sum of:
      0.036919296 = product of:
        0.14767718 = sum of:
          0.14767718 = weight(_text_:authors in 5760) [ClassicSimilarity], result of:
            0.14767718 = score(doc=5760,freq=6.0), product of:
              0.24182312 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.053045183 = queryNorm
              0.61068267 = fieldWeight in 5760, product of:
                2.4494898 = tf(freq=6.0), with freq of:
                  6.0 = termFreq=6.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0546875 = fieldNorm(doc=5760)
        0.25 = coord(1/4)
      0.059755445 = sum of:
        0.009447212 = weight(_text_:a in 5760) [ClassicSimilarity], result of:
          0.009447212 = score(doc=5760,freq=6.0), product of:
            0.06116359 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.053045183 = queryNorm
            0.1544581 = fieldWeight in 5760, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5760)
        0.05030823 = weight(_text_:22 in 5760) [ClassicSimilarity], result of:
          0.05030823 = score(doc=5760,freq=2.0), product of:
            0.1857552 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.053045183 = queryNorm
            0.2708308 = fieldWeight in 5760, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5760)
    
    Abstract
    Normalization rules are essential for interoperability between bibliographic systems. In the process of working with Name Authority Cooperative Program (NACO) authority files to match records with Functional Requirements for Bibliographic Records (FRBR) and developing the Faceted Application of Subject Terminology (FAST) subject heading schema, the authors found inconsistencies in independently created NACO normalization implementations. Investigating these, the authors found ambiguities in the NACO standard that need resolution, and came to conclusions on how the procedure could be simplified with little impact on matching headings. To encourage others to test their software for compliance with the current rules, the authors have established a Web site that has test files and interactive services showing their current implementation.
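    The precise NACO comparison rules, and their ambiguities, are the subject of the article and are not reproduced here. Purely to illustrate the kind of string normalization such heading comparison involves (case folding, stripping diacritics and most punctuation, collapsing whitespace), here is a generic, simplified Python sketch; it is an assumption-laden toy, not the NACO standard.

      import re
      import unicodedata

      def normalize_heading(heading: str) -> str:
          """Simplified, generic heading normalization; NOT the NACO rules."""
          # Decompose accented characters and drop the combining marks.
          decomposed = unicodedata.normalize("NFKD", heading)
          stripped = "".join(ch for ch in decomposed if not unicodedata.combining(ch))
          # Uppercase, turn most punctuation into spaces, collapse whitespace.
          cleaned = re.sub(r"[^\w\s]", " ", stripped.upper())
          return re.sub(r"\s+", " ", cleaned).strip()

      # Two spellings of the same heading normalize to the same comparison key.
      print(normalize_heading("Hubrich, Jessica") == normalize_heading("HUBRICH  Jessica."))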
    Date
    10. 9.2000 17:38:22
    Type
    a
  8. Alexandre Hannud Abdo, A.H. => Hannud Abdo, A.: 0.10
    0.09559299 = product of:
      0.19118598 = sum of:
        0.19118598 = sum of:
          0.01870063 = weight(_text_:a in 617) [ClassicSimilarity], result of:
            0.01870063 = score(doc=617,freq=2.0), product of:
              0.06116359 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.053045183 = queryNorm
              0.30574775 = fieldWeight in 617, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.1875 = fieldNorm(doc=617)
          0.17248535 = weight(_text_:22 in 617) [ClassicSimilarity], result of:
            0.17248535 = score(doc=617,freq=2.0), product of:
              0.1857552 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.053045183 = queryNorm
              0.92856276 = fieldWeight in 617, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.1875 = fieldNorm(doc=617)
      0.5 = coord(1/2)
    
    Date
    7. 6.2022 19:22:19
  9. Mandel, C.A.; Wolven, R.: Intellectual access to digital documents : joining proven principles with new technologies (1996) 0.09
    0.09494114 = sum of:
      0.036919296 = product of:
        0.14767718 = sum of:
          0.14767718 = weight(_text_:authors in 597) [ClassicSimilarity], result of:
            0.14767718 = score(doc=597,freq=6.0), product of:
              0.24182312 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.053045183 = queryNorm
              0.61068267 = fieldWeight in 597, product of:
                2.4494898 = tf(freq=6.0), with freq of:
                  6.0 = termFreq=6.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0546875 = fieldNorm(doc=597)
        0.25 = coord(1/4)
      0.058021847 = sum of:
        0.0077136164 = weight(_text_:a in 597) [ClassicSimilarity], result of:
          0.0077136164 = score(doc=597,freq=4.0), product of:
            0.06116359 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.053045183 = queryNorm
            0.12611452 = fieldWeight in 597, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0546875 = fieldNorm(doc=597)
        0.05030823 = weight(_text_:22 in 597) [ClassicSimilarity], result of:
          0.05030823 = score(doc=597,freq=2.0), product of:
            0.1857552 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.053045183 = queryNorm
            0.2708308 = fieldWeight in 597, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0546875 = fieldNorm(doc=597)
    
    Abstract
    This paper considers the relevance of Charles Ammi Cutter's principles of bibliographic access to the universe of Internet-accessible digital objects and explores new methods for applying these principles in the context of new information technologies. The paper examines the value for retrieval of collecting authors' names, identifying authors' roles, collocating works and versions, and providing subject access through classification and controlled vocabularies for digital resources available through the World Wide Web. The authors identify emerging techniques and technologies that can be used in lieu of or as a supplement to traditional cataloging to achieve these functions in organizing access to Internet resources.
    Source
    Cataloging and classification quarterly. 22(1996) nos.3/4, S.25-42
    Type
    a
  10. Mitchell, J.S.; Zeng, M.L.; Zumer, M.: Modeling classification systems in multicultural and multilingual contexts (2012) 0.09
    0.093432575 = sum of:
      0.025838124 = product of:
        0.103352495 = sum of:
          0.103352495 = weight(_text_:authors in 1967) [ClassicSimilarity], result of:
            0.103352495 = score(doc=1967,freq=4.0), product of:
              0.24182312 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.053045183 = queryNorm
              0.42738882 = fieldWeight in 1967, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.046875 = fieldNorm(doc=1967)
        0.25 = coord(1/4)
      0.06759445 = sum of:
        0.006611671 = weight(_text_:a in 1967) [ClassicSimilarity], result of:
          0.006611671 = score(doc=1967,freq=4.0), product of:
            0.06116359 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.053045183 = queryNorm
            0.10809815 = fieldWeight in 1967, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046875 = fieldNorm(doc=1967)
        0.060982786 = weight(_text_:22 in 1967) [ClassicSimilarity], result of:
          0.060982786 = score(doc=1967,freq=4.0), product of:
            0.1857552 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.053045183 = queryNorm
            0.32829654 = fieldWeight in 1967, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.046875 = fieldNorm(doc=1967)
    
    Abstract
    This paper reports on the second part of an initiative of the authors on researching classification systems with the conceptual model defined by the Functional Requirements for Subject Authority Data (FRSAD) final report. In an earlier study, the authors explored whether the FRSAD conceptual model could be extended beyond subject authority data to model classification data. The focus of the current study is to determine if classification data modeled using FRSAD can be used to solve real-world discovery problems in multicultural and multilingual contexts. The paper discusses the relationships between entities (same type or different types) in the context of classification systems that involve multiple translations and/or multicultural implementations. Results of two case studies are presented in detail: (a) two instances of the DDC (DDC 22 in English, and the Swedish-English mixed translation of DDC 22), and (b) Chinese Library Classification. The use cases of conceptual models in practice are also discussed.
    Type
    a
  11. Townsel-Winston, M.: What's new in public services? (1992) 0.09
    0.09265235 = sum of:
      0.024360416 = product of:
        0.097441666 = sum of:
          0.097441666 = weight(_text_:authors in 5550) [ClassicSimilarity], result of:
            0.097441666 = score(doc=5550,freq=2.0), product of:
              0.24182312 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.053045183 = queryNorm
              0.40294603 = fieldWeight in 5550, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0625 = fieldNorm(doc=5550)
        0.25 = coord(1/4)
      0.06829193 = sum of:
        0.010796814 = weight(_text_:a in 5550) [ClassicSimilarity], result of:
          0.010796814 = score(doc=5550,freq=6.0), product of:
            0.06116359 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.053045183 = queryNorm
            0.17652355 = fieldWeight in 5550, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0625 = fieldNorm(doc=5550)
        0.05749512 = weight(_text_:22 in 5550) [ClassicSimilarity], result of:
          0.05749512 = score(doc=5550,freq=2.0), product of:
            0.1857552 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.053045183 = queryNorm
            0.30952093 = fieldWeight in 5550, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0625 = fieldNorm(doc=5550)
    
    Abstract
    OCLC has added 5 new databases to the FirstSearch catalogue and EPIC service: ContentsFirst, ArticleFirst, Social Sciences Index, General Science Index and Event-Line. DiscLit is available in a British literature version. DiscLit: British Authors is a CD-ROM tool for high school and college literature students. Announces the winners of the OCLC On the Front Line Award and of Database Magazine's Product of the Year: Jane Bambrick and the Online Journal of Current Clinical Trials, respectively.
    Source
    OCLC micro. 8(1992) no.6, S.21-22
    Type
    a
  12. Münnich, M.: Katalogisieren auf dem PC : ein Pflichtenheft für die Formalkatalogisierung (1988) 0.09
    0.09265235 = sum of:
      0.024360416 = product of:
        0.097441666 = sum of:
          0.097441666 = weight(_text_:authors in 2502) [ClassicSimilarity], result of:
            0.097441666 = score(doc=2502,freq=2.0), product of:
              0.24182312 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.053045183 = queryNorm
              0.40294603 = fieldWeight in 2502, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0625 = fieldNorm(doc=2502)
        0.25 = coord(1/4)
      0.06829193 = sum of:
        0.010796814 = weight(_text_:a in 2502) [ClassicSimilarity], result of:
          0.010796814 = score(doc=2502,freq=6.0), product of:
            0.06116359 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.053045183 = queryNorm
            0.17652355 = fieldWeight in 2502, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0625 = fieldNorm(doc=2502)
        0.05749512 = weight(_text_:22 in 2502) [ClassicSimilarity], result of:
          0.05749512 = score(doc=2502,freq=2.0), product of:
            0.1857552 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.053045183 = queryNorm
            0.30952093 = fieldWeight in 2502, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0625 = fieldNorm(doc=2502)
    
    Abstract
    Examines a simpler cataloguing format made possible by PCs, one that does not disturb compatibility, using A-Z cataloguing rules for data input, category codes for tagging, and computer-supported data input through windows. Gives numerous examples of catalogue entries, basing the techniques on certain category schemes set out by Klaus Haller and Hans Popst. Examines catalogue entries with respect to database categories for authors and corporate names, titles, single-volume works, serial issues of collected works, and limited editions of works in several volumes.
    Source
    Bibliotheksdienst. 22(1988) H.9, S.841-856
    Type
    a
  13. Aptagiri, D.V.; Gopinath, M.A.; Prasad, A.R.D.: ¬A frame based knowledge representation paradigm for automating POPSI (1995) 0.09
    0.09265235 = sum of:
      0.024360416 = product of:
        0.097441666 = sum of:
          0.097441666 = weight(_text_:authors in 2887) [ClassicSimilarity], result of:
            0.097441666 = score(doc=2887,freq=2.0), product of:
              0.24182312 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.053045183 = queryNorm
              0.40294603 = fieldWeight in 2887, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0625 = fieldNorm(doc=2887)
        0.25 = coord(1/4)
      0.06829193 = sum of:
        0.010796814 = weight(_text_:a in 2887) [ClassicSimilarity], result of:
          0.010796814 = score(doc=2887,freq=6.0), product of:
            0.06116359 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.053045183 = queryNorm
            0.17652355 = fieldWeight in 2887, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0625 = fieldNorm(doc=2887)
        0.05749512 = weight(_text_:22 in 2887) [ClassicSimilarity], result of:
          0.05749512 = score(doc=2887,freq=2.0), product of:
            0.1857552 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.053045183 = queryNorm
            0.30952093 = fieldWeight in 2887, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0625 = fieldNorm(doc=2887)
    
    Abstract
    This paper is based on project work carried out by the authors at DRTC. Knowledge representation models are used in building intelligent systems for problem solving. The paper discusses a frame-based knowledge representation model built for automatic indexing. The system assigns POPSI indicators and produces subject strings for titles. The results are given in the appendices.
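    For readers unfamiliar with the formalism, the toy sketch below shows what a frame (a named unit with slots and inheritance) looks like in code. It is a generic illustration only; the actual frames, POPSI indicators and subject-string generation of the DRTC system are not reproduced.

      from dataclasses import dataclass, field
      from typing import Optional

      @dataclass
      class Frame:
          """A minimal frame: slots map attributes to fillers; parent gives inheritance."""
          name: str
          slots: dict = field(default_factory=dict)
          parent: Optional["Frame"] = None

          def get(self, slot: str):
              # Look up a slot locally, then fall back to the parent frame.
              if slot in self.slots:
                  return self.slots[slot]
              return self.parent.get(slot) if self.parent else None

      # Hypothetical example frames, not taken from the paper.
      action = Frame("Action", {"facet": "Energy"})
      indexing = Frame("Indexing", {"agent": "indexer"}, parent=action)
      print(indexing.get("facet"))   # inherited from the Action frame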
    Source
    Knowledge organization. 22(1995) nos.3/4, S.162-167
    Type
    a
  14. Popper, K.R.: Three worlds : the Tanner lecture on human values. Delivered at the University of Michigan, April 7, 1978 (1978) 0.09
    0.09121912 = sum of:
      0.08424981 = product of:
        0.33699924 = sum of:
          0.33699924 = weight(_text_:3a in 230) [ClassicSimilarity], result of:
            0.33699924 = score(doc=230,freq=2.0), product of:
              0.44971764 = queryWeight, product of:
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.053045183 = queryNorm
              0.7493574 = fieldWeight in 230, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.0625 = fieldNorm(doc=230)
        0.25 = coord(1/4)
      0.0069693136 = product of:
        0.013938627 = sum of:
          0.013938627 = weight(_text_:a in 230) [ClassicSimilarity], result of:
            0.013938627 = score(doc=230,freq=10.0), product of:
              0.06116359 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.053045183 = queryNorm
              0.22789092 = fieldWeight in 230, product of:
                3.1622777 = tf(freq=10.0), with freq of:
                  10.0 = termFreq=10.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.0625 = fieldNorm(doc=230)
        0.5 = coord(1/2)
    
    Abstract
    In this lecture I intend to challenge those who uphold a monist or even a dualist view of the universe; and I will propose, instead, a pluralist view. I will propose a view of the universe that recognizes at least three different but interacting sub-universes.
    Source
    https://tannerlectures.utah.edu/_documents/a-to-z/p/popper80.pdf
    Type
    a
  15. Oppenheim, C.: ¬The implications of copyright legislation for electronic access to journal collections (1994) 0.09
    0.0906711 = sum of:
      0.024360416 = product of:
        0.097441666 = sum of:
          0.097441666 = weight(_text_:authors in 7245) [ClassicSimilarity], result of:
            0.097441666 = score(doc=7245,freq=2.0), product of:
              0.24182312 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.053045183 = queryNorm
              0.40294603 = fieldWeight in 7245, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0625 = fieldNorm(doc=7245)
        0.25 = coord(1/4)
      0.06631068 = sum of:
        0.008815561 = weight(_text_:a in 7245) [ClassicSimilarity], result of:
          0.008815561 = score(doc=7245,freq=4.0), product of:
            0.06116359 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.053045183 = queryNorm
            0.14413087 = fieldWeight in 7245, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0625 = fieldNorm(doc=7245)
        0.05749512 = weight(_text_:22 in 7245) [ClassicSimilarity], result of:
          0.05749512 = score(doc=7245,freq=2.0), product of:
            0.1857552 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.053045183 = queryNorm
            0.30952093 = fieldWeight in 7245, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0625 = fieldNorm(doc=7245)
    
    Abstract
    The nature and implications of electrocopying are summarised. After a brief review of the principles of copyright, the issue of whether electrocopying infringes copyright is debated. Publishers are aware of the threat that electrocopying poses to their business. The various options available to publishers for responding to electrocopying are summarised. Patterns of scholarly communications and the relationships between authors, publishers and libraries are being challenged. Constructive dialogue is necessary if the issues are to be resolved
    Source
    Journal of document and text management. 2(1994) no.1, S.10-22
    Type
    a
  16. Klein, R.D.: ¬The problem of cataloguing world literature using the Nippon Decimal Classification (1994) 0.09
    0.0906711 = sum of:
      0.024360416 = product of:
        0.097441666 = sum of:
          0.097441666 = weight(_text_:authors in 867) [ClassicSimilarity], result of:
            0.097441666 = score(doc=867,freq=2.0), product of:
              0.24182312 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.053045183 = queryNorm
              0.40294603 = fieldWeight in 867, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0625 = fieldNorm(doc=867)
        0.25 = coord(1/4)
      0.06631068 = sum of:
        0.008815561 = weight(_text_:a in 867) [ClassicSimilarity], result of:
          0.008815561 = score(doc=867,freq=4.0), product of:
            0.06116359 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.053045183 = queryNorm
            0.14413087 = fieldWeight in 867, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0625 = fieldNorm(doc=867)
        0.05749512 = weight(_text_:22 in 867) [ClassicSimilarity], result of:
          0.05749512 = score(doc=867,freq=2.0), product of:
            0.1857552 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.053045183 = queryNorm
            0.30952093 = fieldWeight in 867, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0625 = fieldNorm(doc=867)
    
    Abstract
    The Nippon Decimal Classification (NDC) system, extensively used in Japanese libraries, was devised in 1929. It is difficult to use NDC to classify world literature, such as fiction in English by non-American, non-British writers. This is not necessarily straightforward in other classification schemes either, but a survey of 40 Japanese university libraries, of which 24 responded, showed remarkable inconsistencies in the treatment of 22 world literature authors. NDC clearly needs updating to deal with this problem.
    Type
    a
  17. CannCasciato, D.: ¬The OLUC from a NACO point of view : eliminating derived search keys (1994) 0.09
    0.0906711 = sum of:
      0.024360416 = product of:
        0.097441666 = sum of:
          0.097441666 = weight(_text_:authors in 6575) [ClassicSimilarity], result of:
            0.097441666 = score(doc=6575,freq=2.0), product of:
              0.24182312 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.053045183 = queryNorm
              0.40294603 = fieldWeight in 6575, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0625 = fieldNorm(doc=6575)
        0.25 = coord(1/4)
      0.06631068 = sum of:
        0.008815561 = weight(_text_:a in 6575) [ClassicSimilarity], result of:
          0.008815561 = score(doc=6575,freq=4.0), product of:
            0.06116359 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.053045183 = queryNorm
            0.14413087 = fieldWeight in 6575, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0625 = fieldNorm(doc=6575)
        0.05749512 = weight(_text_:22 in 6575) [ClassicSimilarity], result of:
          0.05749512 = score(doc=6575,freq=2.0), product of:
            0.1857552 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.053045183 = queryNorm
            0.30952093 = fieldWeight in 6575, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0625 = fieldNorm(doc=6575)
    
    Abstract
    Explains how the current resurgence of cooperative cataloguing initiatives requires that OCLC Online Union Catalog (OLUC) search keys evolve to allow more precise searching. Describes the results achieved by searching personal and corporate authors by word and phrase, to obtain guidance for decisions about heading formation and necessary references. Concludes that phrase searching needs to be introduced into the OLUC search mechanisms as quickly as possible to speed up cataloguing, and that OCLC needs to take note of these requirements by eliminating limitations on searching.
    Source
    OCLC systems and services. 10(1994) no.4, S.22-25
    Type
    a
  18. Diederichs, A.: Wissensmanagement ist Macht : Effektiv und kostenbewußt arbeiten im Informationszeitalter (2005) 0.09
    0.09012594 = product of:
      0.18025188 = sum of:
        0.18025188 = sum of:
          0.017631123 = weight(_text_:a in 3211) [ClassicSimilarity], result of:
            0.017631123 = score(doc=3211,freq=4.0), product of:
              0.06116359 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.053045183 = queryNorm
              0.28826174 = fieldWeight in 3211, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.125 = fieldNorm(doc=3211)
          0.16262075 = weight(_text_:22 in 3211) [ClassicSimilarity], result of:
            0.16262075 = score(doc=3211,freq=4.0), product of:
              0.1857552 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.053045183 = queryNorm
              0.8754574 = fieldWeight in 3211, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.125 = fieldNorm(doc=3211)
      0.5 = coord(1/2)
    
    Date
    22. 2.2005 9:16:22
    Type
    a
  19. Avramescu, A.: Teoria difuziei informatiei stiintifice (1997) 0.09
    0.08989992 = sum of:
      0.030144477 = product of:
        0.12057791 = sum of:
          0.12057791 = weight(_text_:authors in 3030) [ClassicSimilarity], result of:
            0.12057791 = score(doc=3030,freq=4.0), product of:
              0.24182312 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.053045183 = queryNorm
              0.49862027 = fieldWeight in 3030, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0546875 = fieldNorm(doc=3030)
        0.25 = coord(1/4)
      0.059755445 = sum of:
        0.009447212 = weight(_text_:a in 3030) [ClassicSimilarity], result of:
          0.009447212 = score(doc=3030,freq=6.0), product of:
            0.06116359 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.053045183 = queryNorm
            0.1544581 = fieldWeight in 3030, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0546875 = fieldNorm(doc=3030)
        0.05030823 = weight(_text_:22 in 3030) [ClassicSimilarity], result of:
          0.05030823 = score(doc=3030,freq=2.0), product of:
            0.1857552 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.053045183 = queryNorm
            0.2708308 = fieldWeight in 3030, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0546875 = fieldNorm(doc=3030)
    
    Abstract
    The theory of diffusion can be successfully applied to scientific information dissemination by identifying space with a series of successive authors, and potential (temperature) with the interest of new authors towards earlier published papers, measured by the number of citations. As the total number of citations equals the number of references, the conservation law is fulfilled and Fourier's parabolic differential equation can be applied.
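    For reference, the one-dimensional form of the parabolic (heat/diffusion) equation the abstract refers to can be written as

      \frac{\partial u}{\partial t} = \alpha \, \frac{\partial^2 u}{\partial x^2}

    where, in the analogy sketched above, x runs over the succession of authors, u plays the role of the potential (interest in earlier papers, measured by citations), and alpha is a diffusion constant. This is the textbook equation, not a reproduction of Avramescu's derivation.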
    Date
    22. 2.1999 16:16:11
    Type
    a
  20. Elovici, Y.; Shapira, Y.B.; Kantor, P.B.: ¬A decision theoretic approach to combining information filters : an analytical and empirical evaluation. (2006) 0.09
    0.08989992 = sum of:
      0.030144477 = product of:
        0.12057791 = sum of:
          0.12057791 = weight(_text_:authors in 5267) [ClassicSimilarity], result of:
            0.12057791 = score(doc=5267,freq=4.0), product of:
              0.24182312 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.053045183 = queryNorm
              0.49862027 = fieldWeight in 5267, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0546875 = fieldNorm(doc=5267)
        0.25 = coord(1/4)
      0.059755445 = sum of:
        0.009447212 = weight(_text_:a in 5267) [ClassicSimilarity], result of:
          0.009447212 = score(doc=5267,freq=6.0), product of:
            0.06116359 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.053045183 = queryNorm
            0.1544581 = fieldWeight in 5267, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5267)
        0.05030823 = weight(_text_:22 in 5267) [ClassicSimilarity], result of:
          0.05030823 = score(doc=5267,freq=2.0), product of:
            0.1857552 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.053045183 = queryNorm
            0.2708308 = fieldWeight in 5267, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5267)
    
    Abstract
    The outputs of several information filtering (IF) systems can be combined to improve filtering performance. In this article the authors propose and explore a framework based on the so-called information structure (IS) model, which is frequently used in Information Economics, for combining the output of multiple IF systems according to each user's preferences (profile). The combination seeks to maximize the expected payoff to that user. The authors show analytically that the proposed framework increases users' expected payoff from the combined filtering output for any user preferences. An experiment using the TREC-6 test collection confirms the theoretical findings.
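    As a loose illustration of the decision-theoretic idea (deliver a document when the expected payoff of delivering beats discarding it), here is a hypothetical Python sketch. The payoff values, the naive averaging of two filters' relevance estimates, and the function names are placeholders and do not reproduce the IS-model combination analyzed in the article.

      def expected_payoff_deliver(p_rel: float, payoffs: dict) -> float:
          return p_rel * payoffs["deliver_relevant"] + (1 - p_rel) * payoffs["deliver_irrelevant"]

      def expected_payoff_discard(p_rel: float, payoffs: dict) -> float:
          return p_rel * payoffs["discard_relevant"] + (1 - p_rel) * payoffs["discard_irrelevant"]

      # User-specific payoff matrix (placeholder numbers).
      payoffs = {"deliver_relevant": 3.0, "deliver_irrelevant": -1.0,
                 "discard_relevant": -2.0, "discard_irrelevant": 0.0}

      # Naive combination of two filters' relevance estimates by averaging.
      p_combined = (0.7 + 0.4) / 2
      deliver = expected_payoff_deliver(p_combined, payoffs) >= expected_payoff_discard(p_combined, payoffs)
      print(deliver)   # True for these placeholder numbers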
    Date
    22. 7.2006 15:05:39
    Type
    a
