Search (10288 results, page 1 of 515)

  • year_i:[2000 TO 2010}
  1. Ackermann, E.: Piaget's constructivism, Papert's constructionism : what's the difference? (2001) 0.15
    0.1524423 = sum of:
      0.14963627 = product of:
        0.44890878 = sum of:
          0.17516857 = weight(_text_:3a in 692) [ClassicSimilarity], result of:
            0.17516857 = score(doc=692,freq=2.0), product of:
              0.3740134 = queryWeight, product of:
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.0441157 = queryNorm
              0.46834838 = fieldWeight in 692, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.0390625 = fieldNorm(doc=692)
          0.2737402 = weight(_text_:2c in 692) [ClassicSimilarity], result of:
            0.2737402 = score(doc=692,freq=2.0), product of:
              0.46755034 = queryWeight, product of:
                10.598275 = idf(docFreq=2, maxDocs=44218)
                0.0441157 = queryNorm
              0.5854775 = fieldWeight in 692, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                10.598275 = idf(docFreq=2, maxDocs=44218)
                0.0390625 = fieldNorm(doc=692)
        0.33333334 = coord(2/6)
      0.0028060337 = product of:
        0.0056120674 = sum of:
          0.0056120674 = weight(_text_:a in 692) [ClassicSimilarity], result of:
            0.0056120674 = score(doc=692,freq=6.0), product of:
              0.050867476 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.0441157 = queryNorm
              0.11032722 = fieldWeight in 692, product of:
                2.4494898 = tf(freq=6.0), with freq of:
                  6.0 = termFreq=6.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.0390625 = fieldNorm(doc=692)
        0.5 = coord(1/2)
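    The explain tree above is Lucene's ClassicSimilarity (tf-idf) output: each clause score is queryWeight * fieldWeight, where queryWeight = idf * queryNorm and fieldWeight = tf * idf * fieldNorm, and coord(m/n) scales a Boolean query by the fraction of matched clauses. The short Python sketch below (an illustration, not the search engine's own code) reproduces the total of 0.1524423 for doc 692 from the numbers shown:
    import math

    def clause_score(freq, idf, query_norm, field_norm):
        query_weight = idf * query_norm       # idf(t) * queryNorm
        tf = math.sqrt(freq)                  # tf(freq) = sqrt(termFreq)
        field_weight = tf * idf * field_norm  # tf * idf * fieldNorm(doc)
        return query_weight * field_weight

    query_norm, field_norm = 0.0441157, 0.0390625
    s_3a = clause_score(2.0, 8.478011, query_norm, field_norm)   # ~0.17516857
    s_2c = clause_score(2.0, 10.598275, query_norm, field_norm)  # ~0.2737402
    s_a = clause_score(6.0, 1.153047, query_norm, field_norm)    # ~0.0056120674

    # coord(matched/total) down-weights Boolean queries with unmatched clauses
    total = (s_3a + s_2c) * (2 / 6) + s_a * (1 / 2)
    print(round(total, 7))  # ~0.1524423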
    
    Abstract
    What is the difference between Piaget's constructivism and Papert's "constructionism"? Beyond the mere play on the words, I think the distinction holds, and that integrating both views can enrich our understanding of how people learn and grow. Piaget's constructivism offers a window into what children are interested in, and able to achieve, at different stages of their development. The theory describes how children's ways of doing and thinking evolve over time, and under which circumstance children are more likely to let go of - or hold onto - their currently held views. Piaget suggests that children have very good reasons not to abandon their worldviews just because someone else, be it an expert, tells them they're wrong. Papert's constructionism, in contrast, focuses more on the art of learning, or 'learning to learn', and on the significance of making things in learning. Papert is interested in how learners engage in a conversation with [their own or other people's] artifacts, and how these conversations boost self-directed learning, and ultimately facilitate the construction of new knowledge. He stresses the importance of tools, media, and context in human development. Integrating both perspectives illuminates the processes by which individuals come to make sense of their experience, gradually optimizing their interactions with the world.
    Content
    Cf.: https://www.semanticscholar.org/paper/Piaget-%E2%80%99-s-Constructivism-%2C-Papert-%E2%80%99-s-%3A-What-%E2%80%99-s-Ackermann/89cbcc1e740a4591443ff4765a6ae8df0fdf5554. Further references to related contributions are listed there. Also published in: Learning Group Publication 5(2001) no.3, p.438.
    Type
    a
  2. Gödert, W.; Hubrich, J.; Boteram, F.: Thematische Recherche und Interoperabilität : Wege zur Optimierung des Zugriffs auf heterogen erschlossene Dokumente (2009) 0.08
    0.07874884 = sum of:
      0.04562337 = product of:
        0.2737402 = sum of:
          0.2737402 = weight(_text_:2c in 193) [ClassicSimilarity], result of:
            0.2737402 = score(doc=193,freq=2.0), product of:
              0.46755034 = queryWeight, product of:
                10.598275 = idf(docFreq=2, maxDocs=44218)
                0.0441157 = queryNorm
              0.5854775 = fieldWeight in 193, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                10.598275 = idf(docFreq=2, maxDocs=44218)
                0.0390625 = fieldNorm(doc=193)
        0.16666667 = coord(1/6)
      0.033125468 = sum of:
        0.003240128 = weight(_text_:a in 193) [ClassicSimilarity], result of:
          0.003240128 = score(doc=193,freq=2.0), product of:
            0.050867476 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0441157 = queryNorm
            0.06369744 = fieldWeight in 193, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0390625 = fieldNorm(doc=193)
        0.02988534 = weight(_text_:22 in 193) [ClassicSimilarity], result of:
          0.02988534 = score(doc=193,freq=2.0), product of:
            0.15448566 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0441157 = queryNorm
            0.19345059 = fieldWeight in 193, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0390625 = fieldNorm(doc=193)
    
    Source
    https://opus4.kobv.de/opus4-bib-info/frontdoor/index/index/searchtype/authorsearch/author/%22Hubrich%2C+Jessica%22/docId/703/start/0/rows/20
    Type
    a
  3. Hotho, A.; Bloehdorn, S.: Data Mining 2004 : Text classification by boosting weak learners based on terms and concepts (2004) 0.08
    0.07867243 = sum of:
      0.035033714 = product of:
        0.21020228 = sum of:
          0.21020228 = weight(_text_:3a in 562) [ClassicSimilarity], result of:
            0.21020228 = score(doc=562,freq=2.0), product of:
              0.3740134 = queryWeight, product of:
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.0441157 = queryNorm
              0.56201804 = fieldWeight in 562, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.046875 = fieldNorm(doc=562)
        0.16666667 = coord(1/6)
      0.043638717 = sum of:
        0.007776308 = weight(_text_:a in 562) [ClassicSimilarity], result of:
          0.007776308 = score(doc=562,freq=8.0), product of:
            0.050867476 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0441157 = queryNorm
            0.15287387 = fieldWeight in 562, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046875 = fieldNorm(doc=562)
        0.03586241 = weight(_text_:22 in 562) [ClassicSimilarity], result of:
          0.03586241 = score(doc=562,freq=2.0), product of:
            0.15448566 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0441157 = queryNorm
            0.23214069 = fieldWeight in 562, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.046875 = fieldNorm(doc=562)
    
    Abstract
    Document representations for text classification are typically based on the classical Bag-Of-Words paradigm. This approach comes with deficiencies that motivate the integration of features on a higher semantic level than single words. In this paper we propose an enhancement of the classical document representation through concepts extracted from background knowledge. Boosting is used for actual classification. Experimental evaluations on two well-known text corpora support our approach through consistent improvement of the results.
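    A minimal sketch of the general idea described above: bag-of-words term features extended with concept features drawn from background knowledge, then classified by boosting weak learners. The toy corpus and concept lookup are placeholders, not the authors' data, ontology, or implementation:
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.feature_extraction.text import CountVectorizer

    # Hypothetical background knowledge mapping terms to concepts
    concepts = {"cats": "animal", "dogs": "animal", "stocks": "finance", "bonds": "finance"}

    def add_concepts(text):
        # Append a concept token for every term covered by the background knowledge
        extra = [concepts[w] for w in text.split() if w in concepts]
        return text + " " + " ".join(extra)

    docs = ["cats and dogs", "stocks and bonds", "dogs bark loudly", "bonds yield interest"]
    labels = [0, 1, 0, 1]

    X = CountVectorizer().fit_transform([add_concepts(d) for d in docs])  # terms + concepts
    clf = AdaBoostClassifier(n_estimators=50)  # boosts decision stumps (weak learners) by default
    clf.fit(X.toarray(), labels)
    print(clf.predict(X.toarray()))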
    Content
    Cf.: http://www.google.de/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cad=rja&ved=0CEAQFjAA&url=http%3A%2F%2Fciteseerx.ist.psu.edu%2Fviewdoc%2Fdownload%3Fdoi%3D10.1.1.91.4940%26rep%3Drep1%26type%3Dpdf&ei=dOXrUMeIDYHDtQahsIGACg&usg=AFQjCNHFWVh6gNPvnOrOS9R3rkrXCNVD-A&sig2=5I2F5evRfMnsttSgFF9g7Q&bvm=bv.1357316858,d.Yms.
    Date
    8. 1.2013 10:22:32
    Type
    a
  4. Baker, T.; Dekkers, M.; Heery, R.; Patel, M.; Salokhe, G.: What Terms Does Your Metadata Use? : Application Profiles as Machine-Understandable Narratives (2002) 0.08
    0.07601857 = sum of:
      0.07277844 = product of:
        0.43667063 = sum of:
          0.43667063 = weight(_text_:baker in 1279) [ClassicSimilarity], result of:
            0.43667063 = score(doc=1279,freq=4.0), product of:
              0.35112646 = queryWeight, product of:
                7.9592175 = idf(docFreq=41, maxDocs=44218)
                0.0441157 = queryNorm
              1.2436278 = fieldWeight in 1279, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                7.9592175 = idf(docFreq=41, maxDocs=44218)
                0.078125 = fieldNorm(doc=1279)
        0.16666667 = coord(1/6)
      0.003240128 = product of:
        0.006480256 = sum of:
          0.006480256 = weight(_text_:a in 1279) [ClassicSimilarity], result of:
            0.006480256 = score(doc=1279,freq=2.0), product of:
              0.050867476 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.0441157 = queryNorm
              0.12739488 = fieldWeight in 1279, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.078125 = fieldNorm(doc=1279)
        0.5 = coord(1/2)
    
    Footnote
    http://jodi.ecs.soton.ac.uk/Articles/v02/i02/Baker/
    Type
    a
  5. Diederichs, A.: Wissensmanagement ist Macht : Effektiv und kostenbewußt arbeiten im Informationszeitalter (2005) 0.07
    0.074954376 = product of:
      0.14990875 = sum of:
        0.14990875 = sum of:
          0.014663147 = weight(_text_:a in 3211) [ClassicSimilarity], result of:
            0.014663147 = score(doc=3211,freq=4.0), product of:
              0.050867476 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.0441157 = queryNorm
              0.28826174 = fieldWeight in 3211, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.125 = fieldNorm(doc=3211)
          0.1352456 = weight(_text_:22 in 3211) [ClassicSimilarity], result of:
            0.1352456 = score(doc=3211,freq=4.0), product of:
              0.15448566 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0441157 = queryNorm
              0.8754574 = fieldWeight in 3211, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.125 = fieldNorm(doc=3211)
      0.5 = coord(1/2)
    
    Date
    22. 2.2005 9:16:22
    Type
    a
  6. RAK-NBM : Interpretationshilfe zu NBM 3b,3 (2000) 0.07
    0.07280701 = product of:
      0.14561401 = sum of:
        0.14561401 = sum of:
          0.0103684105 = weight(_text_:a in 4362) [ClassicSimilarity], result of:
            0.0103684105 = score(doc=4362,freq=2.0), product of:
              0.050867476 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.0441157 = queryNorm
              0.20383182 = fieldWeight in 4362, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.125 = fieldNorm(doc=4362)
          0.1352456 = weight(_text_:22 in 4362) [ClassicSimilarity], result of:
            0.1352456 = score(doc=4362,freq=4.0), product of:
              0.15448566 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0441157 = queryNorm
              0.8754574 = fieldWeight in 4362, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.125 = fieldNorm(doc=4362)
      0.5 = coord(1/2)
    
    Date
    22. 1.2000 19:22:27
    Type
    a
  7. Hawking, D.; Robertson, S.: On collection size and retrieval effectiveness (2003) 0.07
    0.07280701 = product of:
      0.14561401 = sum of:
        0.14561401 = sum of:
          0.0103684105 = weight(_text_:a in 4109) [ClassicSimilarity], result of:
            0.0103684105 = score(doc=4109,freq=2.0), product of:
              0.050867476 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.0441157 = queryNorm
              0.20383182 = fieldWeight in 4109, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.125 = fieldNorm(doc=4109)
          0.1352456 = weight(_text_:22 in 4109) [ClassicSimilarity], result of:
            0.1352456 = score(doc=4109,freq=4.0), product of:
              0.15448566 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0441157 = queryNorm
              0.8754574 = fieldWeight in 4109, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.125 = fieldNorm(doc=4109)
      0.5 = coord(1/2)
    
    Date
    14. 8.2005 14:22:22
    Type
    a
  8. Hickey, T.B.; Toves, J.; O'Neill, E.T.: NACO normalization : a detailed examination of the authority file comparison rules (2006) 0.07
    0.07016598 = sum of:
      0.020469602 = product of:
        0.122817606 = sum of:
          0.122817606 = weight(_text_:authors in 5760) [ClassicSimilarity], result of:
            0.122817606 = score(doc=5760,freq=6.0), product of:
              0.20111527 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0441157 = queryNorm
              0.61068267 = fieldWeight in 5760, product of:
                2.4494898 = tf(freq=6.0), with freq of:
                  6.0 = termFreq=6.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0546875 = fieldNorm(doc=5760)
        0.16666667 = coord(1/6)
      0.04969637 = sum of:
        0.007856894 = weight(_text_:a in 5760) [ClassicSimilarity], result of:
          0.007856894 = score(doc=5760,freq=6.0), product of:
            0.050867476 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0441157 = queryNorm
            0.1544581 = fieldWeight in 5760, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5760)
        0.041839477 = weight(_text_:22 in 5760) [ClassicSimilarity], result of:
          0.041839477 = score(doc=5760,freq=2.0), product of:
            0.15448566 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0441157 = queryNorm
            0.2708308 = fieldWeight in 5760, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5760)
    
    Abstract
    Normalization rules are essential for interoperability between bibliographic systems. In the process of working with Name Authority Cooperative Program (NACO) authority files to match records with Functional Requirements for Bibliographic Records (FRBR) and developing the Faceted Application of Subject Terminology (FAST) subject heading schema, the authors found inconsistencies in independently created NACO normalization implementations. Investigating these, the authors found ambiguities in the NACO standard that need resolution, and came to conclusions on how the procedure could be simplified with little impact on matching headings. To encourage others to test their software for compliance with the current rules, the authors have established a Web site that has test files and interactive services showing their current implementation.
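    As a rough illustration of what heading normalization of this kind involves (illustrative only; this is not the NACO rule set, whose ambiguities and special cases are exactly what the article examines), a short sketch:
    import re
    import unicodedata

    def normalize_heading(heading: str) -> str:
        # Fold case, strip diacritics and most punctuation, collapse whitespace
        s = unicodedata.normalize("NFD", heading)
        s = "".join(c for c in s if not unicodedata.combining(c))
        s = re.sub(r"[^\w\s]", " ", s)
        return re.sub(r"\s+", " ", s).strip().upper()

    # Two differently keyed forms of the same (hypothetical) heading now compare equal
    print(normalize_heading("Müller, Hans-Jörg") == normalize_heading("MULLER HANS JORG"))  # True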
    Date
    10. 9.2000 17:38:22
    Type
    a
  9. LeBlanc, J.; Kurth, M.: An operational model for library metadata maintenance (2008) 0.07
    0.068623245 = sum of:
      0.010129825 = product of:
        0.06077895 = sum of:
          0.06077895 = weight(_text_:authors in 101) [ClassicSimilarity], result of:
            0.06077895 = score(doc=101,freq=2.0), product of:
              0.20111527 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0441157 = queryNorm
              0.30220953 = fieldWeight in 101, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.046875 = fieldNorm(doc=101)
        0.16666667 = coord(1/6)
      0.058493417 = sum of:
        0.007776308 = weight(_text_:a in 101) [ClassicSimilarity], result of:
          0.007776308 = score(doc=101,freq=8.0), product of:
            0.050867476 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0441157 = queryNorm
            0.15287387 = fieldWeight in 101, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046875 = fieldNorm(doc=101)
        0.050717108 = weight(_text_:22 in 101) [ClassicSimilarity], result of:
          0.050717108 = score(doc=101,freq=4.0), product of:
            0.15448566 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0441157 = queryNorm
            0.32829654 = fieldWeight in 101, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.046875 = fieldNorm(doc=101)
    
    Abstract
    Libraries pay considerable attention to the creation, preservation, and transformation of descriptive metadata in both MARC and non-MARC formats. Little evidence suggests that they devote as much time, energy, and financial resources to the ongoing maintenance of non-MARC metadata, especially with regard to updating and editing existing descriptive content, as they do to maintenance of such information in the MARC-based online public access catalog. In this paper, the authors introduce a model, derived loosely from J. A. Zachman's framework for information systems architecture, with which libraries can identify and inventory components of catalog or metadata maintenance and plan interdepartmental, even interinstitutional, workflows. The model draws on the notion that the expertise and skills that have long been the hallmark for the maintenance of libraries' catalog data can and should be parlayed towards metadata maintenance in a broader set of information delivery systems.
    Date
    10. 9.2000 17:38:22
    19. 6.2010 19:22:28
    Type
    a
  10. Elovici, Y.; Shapira, Y.B.; Kantor, P.B.: A decision theoretic approach to combining information filters : an analytical and empirical evaluation (2006) 0.07
    0.06640973 = sum of:
      0.016713358 = product of:
        0.10028015 = sum of:
          0.10028015 = weight(_text_:authors in 5267) [ClassicSimilarity], result of:
            0.10028015 = score(doc=5267,freq=4.0), product of:
              0.20111527 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0441157 = queryNorm
              0.49862027 = fieldWeight in 5267, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0546875 = fieldNorm(doc=5267)
        0.16666667 = coord(1/6)
      0.04969637 = sum of:
        0.007856894 = weight(_text_:a in 5267) [ClassicSimilarity], result of:
          0.007856894 = score(doc=5267,freq=6.0), product of:
            0.050867476 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0441157 = queryNorm
            0.1544581 = fieldWeight in 5267, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5267)
        0.041839477 = weight(_text_:22 in 5267) [ClassicSimilarity], result of:
          0.041839477 = score(doc=5267,freq=2.0), product of:
            0.15448566 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0441157 = queryNorm
            0.2708308 = fieldWeight in 5267, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5267)
    
    Abstract
    The outputs of several information filtering (IF) systems can be combined to improve filtering performance. In this article the authors propose and explore a framework based on the so-called information structure (IS) model, which is frequently used in Information Economics, for combining the output of multiple IF systems according to each user's preferences (profile). The combination seeks to maximize the expected payoff to that user. The authors show analytically that the proposed framework increases users' expected payoff from the combined filtering output for any user preferences. An experiment using the TREC-6 test collection confirms the theoretical findings.
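    A toy illustration of the decision-theoretic idea sketched above (combine filter outputs, then act so as to maximize the user's expected payoff); the payoff values and the averaging step are invented for the example and are not the paper's IS-model derivation:
    def expected_payoffs(p_rel, payoff):
        # payoff[(action, is_relevant)] encodes the user's preferences (profile)
        show = p_rel * payoff[("show", True)] + (1 - p_rel) * payoff[("show", False)]
        skip = p_rel * payoff[("skip", True)] + (1 - p_rel) * payoff[("skip", False)]
        return {"show": show, "skip": skip}

    p1, p2 = 0.8, 0.4           # two filters' relevance estimates for one document
    p_combined = (p1 + p2) / 2  # naive combination, a placeholder for the paper's method
    payoff = {("show", True): 1.0, ("show", False): -0.5,
              ("skip", True): -1.0, ("skip", False): 0.0}
    scores = expected_payoffs(p_combined, payoff)
    print(max(scores, key=scores.get), scores)  # choose the action with the higher expected payoff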
    Date
    22. 7.2006 15:05:39
    Type
    a
  11. Buzydlowski, J.W.; White, H.D.; Lin, X.: Term Co-occurrence Analysis as an Interface for Digital Libraries (2002) 0.07
    0.06600367 = product of:
      0.13200735 = sum of:
        0.13200735 = sum of:
          0.007776308 = weight(_text_:a in 1339) [ClassicSimilarity], result of:
            0.007776308 = score(doc=1339,freq=2.0), product of:
              0.050867476 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.0441157 = queryNorm
              0.15287387 = fieldWeight in 1339, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.09375 = fieldNorm(doc=1339)
          0.12423103 = weight(_text_:22 in 1339) [ClassicSimilarity], result of:
            0.12423103 = score(doc=1339,freq=6.0), product of:
              0.15448566 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0441157 = queryNorm
              0.804159 = fieldWeight in 1339, product of:
                2.4494898 = tf(freq=6.0), with freq of:
                  6.0 = termFreq=6.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.09375 = fieldNorm(doc=1339)
      0.5 = coord(1/2)
    
    Date
    22. 2.2003 17:25:39
    22. 2.2003 18:16:22
    Type
    a
  12. Baker, T.: A grammar of Dublin Core (2000) 0.06
    0.06461188 = sum of:
      0.029111374 = product of:
        0.17466824 = sum of:
          0.17466824 = weight(_text_:baker in 1236) [ClassicSimilarity], result of:
            0.17466824 = score(doc=1236,freq=4.0), product of:
              0.35112646 = queryWeight, product of:
                7.9592175 = idf(docFreq=41, maxDocs=44218)
                0.0441157 = queryNorm
              0.4974511 = fieldWeight in 1236, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                7.9592175 = idf(docFreq=41, maxDocs=44218)
                0.03125 = fieldNorm(doc=1236)
        0.16666667 = coord(1/6)
      0.035500508 = sum of:
        0.011592236 = weight(_text_:a in 1236) [ClassicSimilarity], result of:
          0.011592236 = score(doc=1236,freq=40.0), product of:
            0.050867476 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0441157 = queryNorm
            0.22789092 = fieldWeight in 1236, product of:
              6.3245554 = tf(freq=40.0), with freq of:
                40.0 = termFreq=40.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.03125 = fieldNorm(doc=1236)
        0.023908272 = weight(_text_:22 in 1236) [ClassicSimilarity], result of:
          0.023908272 = score(doc=1236,freq=2.0), product of:
            0.15448566 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0441157 = queryNorm
            0.15476047 = fieldWeight in 1236, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.03125 = fieldNorm(doc=1236)
    
    Abstract
    Dublin Core is often presented as a modern form of catalog card -- a set of elements (and now qualifiers) that describe resources in a complete package. Sometimes it is proposed as an exchange format for sharing records among multiple collections. The founding principle that "every element is optional and repeatable" reinforces the notion that a Dublin Core description is to be taken as a whole. This paper, in contrast, is based on a much different premise: Dublin Core is a language. More precisely, it is a small language for making a particular class of statements about resources. Like natural languages, it has a vocabulary of word-like terms, the two classes of which -- elements and qualifiers -- function within statements like nouns and adjectives; and it has a syntax for arranging elements and qualifiers into statements according to a simple pattern. Whenever tourists order a meal or ask directions in an unfamiliar language, considerate native speakers will spontaneously limit themselves to basic words and simple sentence patterns along the lines of "I am so-and-so" or "This is such-and-such". Linguists call this pidginization. In such situations, a small phrase book or translated menu can be most helpful. By analogy, today's Web has been called an Internet Commons where users and information providers from a wide range of scientific, commercial, and social domains present their information in a variety of incompatible data models and description languages. In this context, Dublin Core presents itself as a metadata pidgin for digital tourists who must find their way in this linguistically diverse landscape. Its vocabulary is small enough to learn quickly, and its basic pattern is easily grasped. It is well-suited to serve as an auxiliary language for digital libraries. This grammar starts by defining terms. It then follows a 200-year-old tradition of English grammar teaching by focusing on the structure of single statements. It concludes by looking at the growing dictionary of Dublin Core vocabulary terms -- its registry, and at how statements can be used to build the metadata equivalent of paragraphs and compositions -- the application profile.
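    As a concrete, hypothetical illustration of the statement pattern described above (an element or qualified element paired with a value, every statement optional and repeatable), expressed here as a small Python structure for concreteness:
    record = [
        ("dc:title", "A grammar of Dublin Core"),  # element, functioning like a noun
        ("dc:creator", "Baker, Thomas"),           # element; statements are optional and repeatable
        ("dcterms:issued", "2000"),                # refined ("qualified") element, adjective-like
        ("dc:language", "en"),
    ]
    for term, value in record:
        print(f"this resource has {term} = {value!r}")  # one statement per pair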
    Date
    26.12.2011 14:01:22
    Footnote
    Cf.: http://dlib.ukoln.ac.uk/dlib/october00/baker/10baker.html.
    Type
    a
  13. Pesch, K.: Eine gigantische Informationsfülle : "Brockhaus multimedial 2004" kann jedoch nicht rundum überzeugen (2003) 0.06
    0.06370614 = product of:
      0.12741227 = sum of:
        0.12741227 = sum of:
          0.009072359 = weight(_text_:a in 502) [ClassicSimilarity], result of:
            0.009072359 = score(doc=502,freq=2.0), product of:
              0.050867476 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.0441157 = queryNorm
              0.17835285 = fieldWeight in 502, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.109375 = fieldNorm(doc=502)
          0.11833991 = weight(_text_:22 in 502) [ClassicSimilarity], result of:
            0.11833991 = score(doc=502,freq=4.0), product of:
              0.15448566 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0441157 = queryNorm
              0.76602525 = fieldWeight in 502, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.109375 = fieldNorm(doc=502)
      0.5 = coord(1/2)
    
    Date
    3. 5.1997 8:44:22
    22. 9.2003 10:02:00
    Type
    a
  14. Hemminger, B.M.: Introduction to the special issue on bioinformatics (2005) 0.06
    0.06370614 = product of:
      0.12741227 = sum of:
        0.12741227 = sum of:
          0.009072359 = weight(_text_:a in 4189) [ClassicSimilarity], result of:
            0.009072359 = score(doc=4189,freq=2.0), product of:
              0.050867476 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.0441157 = queryNorm
              0.17835285 = fieldWeight in 4189, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.109375 = fieldNorm(doc=4189)
          0.11833991 = weight(_text_:22 in 4189) [ClassicSimilarity], result of:
            0.11833991 = score(doc=4189,freq=4.0), product of:
              0.15448566 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0441157 = queryNorm
              0.76602525 = fieldWeight in 4189, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.109375 = fieldNorm(doc=4189)
      0.5 = coord(1/2)
    
    Date
    22. 7.2006 14:19:22
    Type
    a
  15. Copeland, A.; Hamburger, S.; Hamilton, J.; Robinson, K.J.: Cataloging and digitizing ephemera : one team's experience with Pennsylvania German broadsides and fraktur (2006) 0.06
    0.06272996 = sum of:
      0.011818129 = product of:
        0.07090877 = sum of:
          0.07090877 = weight(_text_:authors in 768) [ClassicSimilarity], result of:
            0.07090877 = score(doc=768,freq=2.0), product of:
              0.20111527 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0441157 = queryNorm
              0.35257778 = fieldWeight in 768, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0546875 = fieldNorm(doc=768)
        0.16666667 = coord(1/6)
      0.050911836 = sum of:
        0.009072359 = weight(_text_:a in 768) [ClassicSimilarity], result of:
          0.009072359 = score(doc=768,freq=8.0), product of:
            0.050867476 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0441157 = queryNorm
            0.17835285 = fieldWeight in 768, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0546875 = fieldNorm(doc=768)
        0.041839477 = weight(_text_:22 in 768) [ClassicSimilarity], result of:
          0.041839477 = score(doc=768,freq=2.0), product of:
            0.15448566 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0441157 = queryNorm
            0.2708308 = fieldWeight in 768, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0546875 = fieldNorm(doc=768)
    
    Abstract
    The growing interest in ephemera collections within libraries will necessitate the bibliographic control of materials that do not easily fall into traditional categories. This paper discusses the many challenges confronting catalogers when approaching a mixed collection of unique materials of an ephemeral nature. Based on their experience cataloging a collection of Pennsylvania German broadsides and Fraktur at the Pennsylvania State University, the authors describe the process of deciphering handwriting, preserving genealogical information, deciding on cataloging approaches at the format and field level, and furthering access to the materials through digitization and the Encoded Archival Description finding aid. Observations are made on expanding the skills of traditional book catalogers to include manuscript cataloging, and on project management.
    Date
    10. 9.2000 17:38:22
    Type
    a
  16. Feinberg, M.: Classificationist as author : the case of the Prelinger Library (2008) 0.06
    0.06272996 = sum of:
      0.011818129 = product of:
        0.07090877 = sum of:
          0.07090877 = weight(_text_:authors in 2237) [ClassicSimilarity], result of:
            0.07090877 = score(doc=2237,freq=2.0), product of:
              0.20111527 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0441157 = queryNorm
              0.35257778 = fieldWeight in 2237, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0546875 = fieldNorm(doc=2237)
        0.16666667 = coord(1/6)
      0.050911836 = sum of:
        0.009072359 = weight(_text_:a in 2237) [ClassicSimilarity], result of:
          0.009072359 = score(doc=2237,freq=8.0), product of:
            0.050867476 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0441157 = queryNorm
            0.17835285 = fieldWeight in 2237, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0546875 = fieldNorm(doc=2237)
        0.041839477 = weight(_text_:22 in 2237) [ClassicSimilarity], result of:
          0.041839477 = score(doc=2237,freq=2.0), product of:
            0.15448566 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0441157 = queryNorm
            0.2708308 = fieldWeight in 2237, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0546875 = fieldNorm(doc=2237)
    
    Content
    Within information science, neutrality and objectivity have been standard design goals for knowledge organization schemes; designers have seen themselves as compilers, rather than as authors or creators. The organization of resources in the Prelinger Library in San Francisco, however, shows a distinct authorial voice, or unique sense of expression and vision. This voice, in turn, works as a persuasive mechanism, facilitating a rhetorical purpose for the collection.
    Pages
    S.22-28
    Type
    a
  17. Camacho-Miñano, M.-del-Mar; Núñez-Nickel, M.: The multilayered nature of reference selection (2009) 0.06
    0.061852604 = sum of:
      0.014325736 = product of:
        0.08595441 = sum of:
          0.08595441 = weight(_text_:authors in 2751) [ClassicSimilarity], result of:
            0.08595441 = score(doc=2751,freq=4.0), product of:
              0.20111527 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0441157 = queryNorm
              0.42738882 = fieldWeight in 2751, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.046875 = fieldNorm(doc=2751)
        0.16666667 = coord(1/6)
      0.04752687 = sum of:
        0.011664462 = weight(_text_:a in 2751) [ClassicSimilarity], result of:
          0.011664462 = score(doc=2751,freq=18.0), product of:
            0.050867476 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0441157 = queryNorm
            0.22931081 = fieldWeight in 2751, product of:
              4.2426405 = tf(freq=18.0), with freq of:
                18.0 = termFreq=18.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046875 = fieldNorm(doc=2751)
        0.03586241 = weight(_text_:22 in 2751) [ClassicSimilarity], result of:
          0.03586241 = score(doc=2751,freq=2.0), product of:
            0.15448566 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0441157 = queryNorm
            0.23214069 = fieldWeight in 2751, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.046875 = fieldNorm(doc=2751)
    
    Abstract
    Why authors choose some references in preference to others is a question that is still not wholly answered despite its being of interest to scientists. The relevance of references is twofold: They are a mechanism for tracing the evolution of science, and because they enhance the image of the cited authors, citations are a widely known and used indicator of scientific endeavor. Following an extensive review of the literature, we selected all papers that seek to answer the central question and demonstrate that the existing theories are not sufficient: Neither citation nor indicator theory provides a complete and convincing answer. Some perspectives in this arena remain, which are isolated from the core literature. The purpose of this article is to offer a fresh perspective on a 30-year-old problem by extending the context of the discussion. We suggest reviving the discussion about citation theories with a new perspective, that of the readers, by layers or phases, in the final choice of references, allowing for a new classification in which any paper, to date, could be included.
    Date
    22. 3.2009 19:05:07
    Type
    a
  18. Jones, M.; Buchanan, G.; Cheng, T.-C.; Jain, P.: Changing the pace of search : supporting background information seeking (2006) 0.06
    0.0615145 = sum of:
      0.011818129 = product of:
        0.07090877 = sum of:
          0.07090877 = weight(_text_:authors in 5287) [ClassicSimilarity], result of:
            0.07090877 = score(doc=5287,freq=2.0), product of:
              0.20111527 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0441157 = queryNorm
              0.35257778 = fieldWeight in 5287, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0546875 = fieldNorm(doc=5287)
        0.16666667 = coord(1/6)
      0.04969637 = sum of:
        0.007856894 = weight(_text_:a in 5287) [ClassicSimilarity], result of:
          0.007856894 = score(doc=5287,freq=6.0), product of:
            0.050867476 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0441157 = queryNorm
            0.1544581 = fieldWeight in 5287, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5287)
        0.041839477 = weight(_text_:22 in 5287) [ClassicSimilarity], result of:
          0.041839477 = score(doc=5287,freq=2.0), product of:
            0.15448566 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0441157 = queryNorm
            0.2708308 = fieldWeight in 5287, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5287)
    
    Abstract
    Almost all Web searches are carried out while the user is sitting at a conventional desktop computer connected to the Internet. Although online, handheld, mobile search offers new possibilities, the fast-paced, focused style of interaction may not be appropriate for all user search needs. The authors explore an alternative, relaxed style for Web searching that asynchronously combines an offline handheld computer and an online desktop personal computer. They discuss the role and utility of such an approach, present a tool to meet these user needs, and discuss its relation to other systems.
    Date
    22. 7.2006 18:37:49
    Type
    a
  19. Kavcic-Colic, A.: Archiving the Web : some legal aspects (2003) 0.06
    0.060072735 = sum of:
      0.011818129 = product of:
        0.07090877 = sum of:
          0.07090877 = weight(_text_:authors in 4754) [ClassicSimilarity], result of:
            0.07090877 = score(doc=4754,freq=2.0), product of:
              0.20111527 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0441157 = queryNorm
              0.35257778 = fieldWeight in 4754, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0546875 = fieldNorm(doc=4754)
        0.16666667 = coord(1/6)
      0.048254605 = sum of:
        0.0064151273 = weight(_text_:a in 4754) [ClassicSimilarity], result of:
          0.0064151273 = score(doc=4754,freq=4.0), product of:
            0.050867476 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0441157 = queryNorm
            0.12611452 = fieldWeight in 4754, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0546875 = fieldNorm(doc=4754)
        0.041839477 = weight(_text_:22 in 4754) [ClassicSimilarity], result of:
          0.041839477 = score(doc=4754,freq=2.0), product of:
            0.15448566 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0441157 = queryNorm
            0.2708308 = fieldWeight in 4754, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0546875 = fieldNorm(doc=4754)
    
    Abstract
    Technological developments have changed the concepts of publication, reproduction and distribution. However, legislation, and in particular the Legal Deposit Law, has not adjusted to these changes - it is very restrictive in the sense of protecting the rights of authors of electronic publications. National libraries and national archival institutions, being aware of their important role in preserving the written and spoken cultural heritage, try to find different legal ways to live up to these responsibilities. This paper presents some legal aspects of archiving Web pages, examines the harvesting of Web pages, provision of public access to pages, and their long-term preservation.
    Date
    10.12.2005 11:22:13
    Type
    a
  20. Horn, M.E.: "Garbage" in, "refuse and refuse disposal" out : making the most of the subject authority file in the OPAC (2002) 0.06
    0.060072735 = sum of:
      0.011818129 = product of:
        0.07090877 = sum of:
          0.07090877 = weight(_text_:authors in 156) [ClassicSimilarity], result of:
            0.07090877 = score(doc=156,freq=2.0), product of:
              0.20111527 = queryWeight, product of:
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0441157 = queryNorm
              0.35257778 = fieldWeight in 156, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.558814 = idf(docFreq=1258, maxDocs=44218)
                0.0546875 = fieldNorm(doc=156)
        0.16666667 = coord(1/6)
      0.048254605 = sum of:
        0.0064151273 = weight(_text_:a in 156) [ClassicSimilarity], result of:
          0.0064151273 = score(doc=156,freq=4.0), product of:
            0.050867476 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0441157 = queryNorm
            0.12611452 = fieldWeight in 156, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0546875 = fieldNorm(doc=156)
        0.041839477 = weight(_text_:22 in 156) [ClassicSimilarity], result of:
          0.041839477 = score(doc=156,freq=2.0), product of:
            0.15448566 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0441157 = queryNorm
            0.2708308 = fieldWeight in 156, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0546875 = fieldNorm(doc=156)
    
    Abstract
    Subject access in the OPAC, as discussed in this article, is predicated on two different kinds of searching: subject (authority, alphabetic, or controlled vocabulary searching) or keyword (uncontrolled, free text, natural language vocabulary). The literature has focused on demonstrating that both approaches are needed, but very few authors address the need to integrate keyword into authority searching. The article discusses this difference and compares, with a query on the term garbage, search results in two online catalogs, one that performs keyword searches through the authority file and one where only bibliographic records are included in keyword searches.
    Date
    10. 9.2000 17:38:22
    Type
    a

Types

  • a 9260
  • m 641
  • el 501
  • s 207
  • x 52
  • b 40
  • i 28
  • r 28
  • n 17
  • p 10
  • l 1
