Search (90 results, page 1 of 5)

  • theme_ss:"Information Gateway"
  1. Facet analytical theory for managing knowledge structure in the humanities : FATKS (2003) 0.03
    0.034122285 = product of:
      0.2047337 = sum of:
        0.2047337 = sum of:
          0.119337276 = weight(_text_:theory in 2526) [ClassicSimilarity], result of:
            0.119337276 = score(doc=2526,freq=2.0), product of:
              0.16234003 = queryWeight, product of:
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.03903913 = queryNorm
              0.7351069 = fieldWeight in 2526, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.125 = fieldNorm(doc=2526)
          0.08539642 = weight(_text_:29 in 2526) [ClassicSimilarity], result of:
            0.08539642 = score(doc=2526,freq=2.0), product of:
              0.13732746 = queryWeight, product of:
                3.5176873 = idf(docFreq=3565, maxDocs=44218)
                0.03903913 = queryNorm
              0.6218451 = fieldWeight in 2526, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5176873 = idf(docFreq=3565, maxDocs=44218)
                0.125 = fieldNorm(doc=2526)
      0.16666667 = coord(1/6)
    
    Date
    29. 8.2004 9:17:18
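
The indented breakdown under each hit is Lucene's ClassicSimilarity (TF-IDF) explain output: for every matching term, queryWeight = idf x queryNorm and fieldWeight = tf x idf x fieldNorm are multiplied, the per-term products are summed, and the coordination factor (here coord(1/6), one of six query clauses matched) scales the sum. As a sanity check, the sketch below simply recombines the factors printed for hit 1; it is an illustrative recomputation, not the search engine's code.

```python
import math

# Factors copied from the explain tree of hit 1 (doc 2526); illustrative only.
terms = [
    # (freq, idf, fieldNorm) for _text_:theory and _text_:29
    (2.0, 4.1583924, 0.125),
    (2.0, 3.5176873, 0.125),
]
query_norm = 0.03903913
coord = 1.0 / 6.0           # coord(1/6): 1 of 6 query clauses matched

score = 0.0
for freq, idf, field_norm in terms:
    tf = math.sqrt(freq)                     # 1.4142135 = tf(freq=2.0)
    query_weight = idf * query_norm          # e.g. 0.16234003
    field_weight = tf * idf * field_norm     # e.g. 0.7351069
    score += query_weight * field_weight     # e.g. 0.119337276

print(round(score * coord, 9))  # ~0.034122285, matching the score shown for hit 1
```
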
  2. Lim, E.: Southeast Asian subject gateways : an examination of their classification practices (2000) 0.02
    0.02125308 = product of:
      0.06375924 = sum of:
        0.032023653 = product of:
          0.06404731 = sum of:
            0.06404731 = weight(_text_:29 in 6040) [ClassicSimilarity], result of:
              0.06404731 = score(doc=6040,freq=2.0), product of:
                0.13732746 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.03903913 = queryNorm
                0.46638384 = fieldWeight in 6040, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.09375 = fieldNorm(doc=6040)
          0.5 = coord(1/2)
        0.03173558 = product of:
          0.06347116 = sum of:
            0.06347116 = weight(_text_:22 in 6040) [ClassicSimilarity], result of:
              0.06347116 = score(doc=6040,freq=2.0), product of:
                0.1367084 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03903913 = queryNorm
                0.46428138 = fieldWeight in 6040, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.09375 = fieldNorm(doc=6040)
          0.5 = coord(1/2)
      0.33333334 = coord(2/6)
    
    Date
    22. 6.2002 19:42:47
    Source
    International cataloguing and bibliographic control. 29(2000) no.3, S.45-48
  3. LaBarre, K.: Adventures in faceted classification: a brave new world or a world of confusion? (2004) 0.01
    0.014928497 = product of:
      0.089570984 = sum of:
        0.089570984 = sum of:
          0.052210055 = weight(_text_:theory in 2634) [ClassicSimilarity], result of:
            0.052210055 = score(doc=2634,freq=2.0), product of:
              0.16234003 = queryWeight, product of:
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.03903913 = queryNorm
              0.32160926 = fieldWeight in 2634, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.0546875 = fieldNorm(doc=2634)
          0.03736093 = weight(_text_:29 in 2634) [ClassicSimilarity], result of:
            0.03736093 = score(doc=2634,freq=2.0), product of:
              0.13732746 = queryWeight, product of:
                3.5176873 = idf(docFreq=3565, maxDocs=44218)
                0.03903913 = queryNorm
              0.27205724 = fieldWeight in 2634, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5176873 = idf(docFreq=3565, maxDocs=44218)
                0.0546875 = fieldNorm(doc=2634)
      0.16666667 = coord(1/6)
    
    Abstract
    A preliminary, purposive survey of definitions and current applications of facet analytical theory (FA) is used to develop a framework for the analysis of Websites. This set of guidelines may well serve to highlight commonalities and differences among FA applications on the Web. Rather than identifying FA as the terrain of a particular interest group, the goal is to explore current practices, uncover common misconceptions, extend understanding, and highlight developments that augment the traditional practice of FA and faceted classification (FC).
    Date
    29. 8.2004 9:42:50
  4. Zeitz, G.: Wissenschaftliche Informationen per Mausklick : Bibliotheken und Forschungsinstitute eröffnen fächerübergreifendes Internetportal - Hessische Einrichtungen sind beteiligt (2003) 0.01
    0.00708436 = product of:
      0.02125308 = sum of:
        0.010674552 = product of:
          0.021349104 = sum of:
            0.021349104 = weight(_text_:29 in 1800) [ClassicSimilarity], result of:
              0.021349104 = score(doc=1800,freq=2.0), product of:
                0.13732746 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.03903913 = queryNorm
                0.15546128 = fieldWeight in 1800, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.03125 = fieldNorm(doc=1800)
          0.5 = coord(1/2)
        0.010578527 = product of:
          0.021157054 = sum of:
            0.021157054 = weight(_text_:22 in 1800) [ClassicSimilarity], result of:
              0.021157054 = score(doc=1800,freq=2.0), product of:
                0.1367084 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03903913 = queryNorm
                0.15476047 = fieldWeight in 1800, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03125 = fieldNorm(doc=1800)
          0.5 = coord(1/2)
      0.33333334 = coord(2/6)
    
    Date
    17. 7.1996 9:33:22
    Source
    Frankfurter Rundschau. Nr.196 vom 23.8.2003, S.29
  5. Woldering, B.: Die Europäische Digitale Bibliothek nimmt Gestalt an (2007) 0.01
    0.00708436 = product of:
      0.02125308 = sum of:
        0.010674552 = product of:
          0.021349104 = sum of:
            0.021349104 = weight(_text_:29 in 2439) [ClassicSimilarity], result of:
              0.021349104 = score(doc=2439,freq=2.0), product of:
                0.13732746 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.03903913 = queryNorm
                0.15546128 = fieldWeight in 2439, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.03125 = fieldNorm(doc=2439)
          0.5 = coord(1/2)
        0.010578527 = product of:
          0.021157054 = sum of:
            0.021157054 = weight(_text_:22 in 2439) [ClassicSimilarity], result of:
              0.021157054 = score(doc=2439,freq=2.0), product of:
                0.1367084 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03903913 = queryNorm
                0.15476047 = fieldWeight in 2439, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03125 = fieldNorm(doc=2439)
          0.5 = coord(1/2)
      0.33333334 = coord(2/6)
    
    Date
    22. 2.2009 19:10:56
    Source
    Dialog mit Bibliotheken. 20(2008) H.1, S.29-31
  6. MacLeod, R.: Promoting a subject gateway : a case study from EEVL (Edinburgh Engineering Virtual Library) (2000) 0.01
    0.006233457 = product of:
      0.03740074 = sum of:
        0.03740074 = product of:
          0.07480148 = sum of:
            0.07480148 = weight(_text_:22 in 4872) [ClassicSimilarity], result of:
              0.07480148 = score(doc=4872,freq=4.0), product of:
                0.1367084 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03903913 = queryNorm
                0.54716086 = fieldWeight in 4872, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.078125 = fieldNorm(doc=4872)
          0.5 = coord(1/2)
      0.16666667 = coord(1/6)
    
    Date
    22. 6.2002 19:40:22
  7. Internetportale unterstützen den Wissenstransfer : Die Gelben Seiten der Forschung (2004) 0.01
    0.006226822 = product of:
      0.03736093 = sum of:
        0.03736093 = product of:
          0.07472186 = sum of:
            0.07472186 = weight(_text_:29 in 2447) [ClassicSimilarity], result of:
              0.07472186 = score(doc=2447,freq=2.0), product of:
                0.13732746 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.03903913 = queryNorm
                0.5441145 = fieldWeight in 2447, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.109375 = fieldNorm(doc=2447)
          0.5 = coord(1/2)
      0.16666667 = coord(1/6)
    
    Date
    29. 1.1997 18:49:05
  8. Subject gateways (2000) 0.01
    0.0061708074 = product of:
      0.037024844 = sum of:
        0.037024844 = product of:
          0.07404969 = sum of:
            0.07404969 = weight(_text_:22 in 6483) [ClassicSimilarity], result of:
              0.07404969 = score(doc=6483,freq=2.0), product of:
                0.1367084 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03903913 = queryNorm
                0.5416616 = fieldWeight in 6483, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.109375 = fieldNorm(doc=6483)
          0.5 = coord(1/2)
      0.16666667 = coord(1/6)
    
    Date
    22. 6.2002 19:43:01
  9. Broughton, V.: Organizing a national humanities portal : a model for the classification and subject management of digital resources (2002) 0.01
    0.0053372756 = product of:
      0.032023653 = sum of:
        0.032023653 = product of:
          0.06404731 = sum of:
            0.06404731 = weight(_text_:29 in 4607) [ClassicSimilarity], result of:
              0.06404731 = score(doc=4607,freq=2.0), product of:
                0.13732746 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.03903913 = queryNorm
                0.46638384 = fieldWeight in 4607, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.09375 = fieldNorm(doc=4607)
          0.5 = coord(1/2)
      0.16666667 = coord(1/6)
    
    Date
    29. 8.2004 9:14:57
  10. LeVan, R.R.: Searching Digital Libraries (2001) 0.01
    0.0053372756 = product of:
      0.032023653 = sum of:
        0.032023653 = product of:
          0.06404731 = sum of:
            0.06404731 = weight(_text_:29 in 1054) [ClassicSimilarity], result of:
              0.06404731 = score(doc=1054,freq=2.0), product of:
                0.13732746 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.03903913 = queryNorm
                0.46638384 = fieldWeight in 1054, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.09375 = fieldNorm(doc=1054)
          0.5 = coord(1/2)
      0.16666667 = coord(1/6)
    
    Date
    6.10.2002 14:34:29
  11. Schmidt, J.; Horn, A.; Thorsen, B.: Australian Subject Gateways, the successes and the challenges (2003) 0.01
    0.0053372756 = product of:
      0.032023653 = sum of:
        0.032023653 = product of:
          0.06404731 = sum of:
            0.06404731 = weight(_text_:29 in 1745) [ClassicSimilarity], result of:
              0.06404731 = score(doc=1745,freq=2.0), product of:
                0.13732746 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.03903913 = queryNorm
                0.46638384 = fieldWeight in 1745, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.09375 = fieldNorm(doc=1745)
          0.5 = coord(1/2)
      0.16666667 = coord(1/6)
    
    Date
    26.12.2011 12:46:29
  12. Milanesi, C.: Möglichkeiten der Kooperation im Rahmen von Subject Gateways : das Euler-Projekt im Vergleich mit weiteren europäischen Projekten (2001) 0.01
    0.0052892636 = product of:
      0.03173558 = sum of:
        0.03173558 = product of:
          0.06347116 = sum of:
            0.06347116 = weight(_text_:22 in 4865) [ClassicSimilarity], result of:
              0.06347116 = score(doc=4865,freq=2.0), product of:
                0.1367084 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03903913 = queryNorm
                0.46428138 = fieldWeight in 4865, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.09375 = fieldNorm(doc=4865)
          0.5 = coord(1/2)
      0.16666667 = coord(1/6)
    
    Date
    22. 6.2002 19:41:59
  13. Hjoerland, B.: The methodology of constructing classification schemes : a discussion of the state-of-the-art (2003) 0.01
    0.005196493 = product of:
      0.031178957 = sum of:
        0.031178957 = product of:
          0.062357914 = sum of:
            0.062357914 = weight(_text_:methods in 2760) [ClassicSimilarity], result of:
              0.062357914 = score(doc=2760,freq=10.0), product of:
                0.15695344 = queryWeight, product of:
                  4.0204134 = idf(docFreq=2156, maxDocs=44218)
                  0.03903913 = queryNorm
                0.397302 = fieldWeight in 2760, product of:
                  3.1622777 = tf(freq=10.0), with freq of:
                    10.0 = termFreq=10.0
                  4.0204134 = idf(docFreq=2156, maxDocs=44218)
                  0.03125 = fieldNorm(doc=2760)
          0.5 = coord(1/2)
      0.16666667 = coord(1/6)
    
    Abstract
    Special classifications have been somewhat neglected in KO compared to general classifications. The methodology of constructing special classifications is, however, also important for the methodology of constructing general classification schemes. The methodology of constructing special classifications can be regarded as one among about a dozen approaches to domain analysis. The methodology of (special) classification in LIS has been dominated by the rationalistic facet-analytic tradition, which, however, neglects the question of the empirical basis of classification. The empirical basis is much better grasped by, for example, bibliometric methods. Even the combination of rational and empirical methods is insufficient. This presentation will provide evidence for the necessity of historical and pragmatic methods for the methodology of classification and will point to the necessity of analyzing "paradigms". The presentation covers the methods of constructing classifications from Ranganathan to the design of ontologies in computer science and further to the recent "paradigm shift" in classification research. 1. Introduction: Classification of a subject field is one among about eleven approaches to analyzing a domain that are specific to information science and, in my opinion, define the special competencies of information specialists (Hjoerland, 2002a). Classification and knowledge organization are commonly regarded as core qualifications of librarians and information specialists. Seen from this perspective one expects a firm methodological basis for the field. This paper tries to explore the state of the art concerning the methodology of classification. 2. Classification: Science or non-science? As it is part of the curriculum at universities and a subject of scientific journals and conferences like ISKO, one expects classification/knowledge organization to be a scientific or scholarly activity and a scientific field. However, very often when information specialists classify or index documents and when they revise classification systems, the methods seem to be rather ad hoc. Research libraries or scientific databases may employ people with adequate subject knowledge. When information scientists construct or evaluate systems, they very often elicit the knowledge from "experts" (Hjoerland, 2002b, p. 260). Mostly no specific arguments are provided for the specific decisions in these processes.
  14. Mayr, P.; Mutschke, P.; Petras, V.: Reducing semantic complexity in distributed digital libraries : Treatment of term vagueness and document re-ranking (2008) 0.01
    0.0050314823 = product of:
      0.030188894 = sum of:
        0.030188894 = product of:
          0.060377788 = sum of:
            0.060377788 = weight(_text_:methods in 1909) [ClassicSimilarity], result of:
              0.060377788 = score(doc=1909,freq=6.0), product of:
                0.15695344 = queryWeight, product of:
                  4.0204134 = idf(docFreq=2156, maxDocs=44218)
                  0.03903913 = queryNorm
                0.384686 = fieldWeight in 1909, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  4.0204134 = idf(docFreq=2156, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1909)
          0.5 = coord(1/2)
      0.16666667 = coord(1/6)
    
    Abstract
    Purpose - The general science portal "vascoda" merges structured, high-quality information collections from more than 40 providers on the basis of search engine technology (FAST) and a concept that treats semantic heterogeneity between different controlled vocabularies. First experiences with the portal show some weaknesses of this approach that appear in most metadata-driven digital libraries (DLs) or subject-specific portals. The purpose of the paper is to propose models to reduce the semantic complexity in heterogeneous DLs. The aim is to introduce value-added services (treatment of term vagueness and document re-ranking) that achieve a certain quality in DLs when combined with the heterogeneity components established in the project "Competence Center Modeling and Treatment of Semantic Heterogeneity". Design/methodology/approach - Two methods derived from scientometrics and network analysis will be implemented with the objective of re-ranking result sets by the following structural properties: ranking of the results by core journals (so-called Bradfordizing) and ranking by centrality of authors in co-authorship networks. Findings - The methods to be implemented focus on the query and on the result side of a search and are designed to positively influence each other. Conceptually, they will improve the search quality and guarantee that the most relevant documents in result sets will be ranked higher. Originality/value - The central impact of the paper lies in the integration of three structural value-adding methods, which aim to reduce the semantic complexity represented in distributed DLs at several stages of the information retrieval process: query construction, search and ranking, and re-ranking.
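
The abstract above names two structural re-ranking ideas, Bradfordizing (ranking by core journals) and author-centrality ranking. A minimal, hypothetical sketch of the first idea follows: hits from journals that contribute many documents to the current result set are moved to the top. The dict layout and the 'journal' key are assumptions for illustration, not the vascoda implementation.

```python
from collections import Counter

def bradfordize(results):
    """Re-rank a result set so that documents from 'core' journals
    (those contributing the most hits to this result set) come first;
    the original relevance order is kept as a tie-breaker.
    `results` is a list of dicts with a 'journal' key (assumed shape)."""
    journal_counts = Counter(doc["journal"] for doc in results)
    order = sorted(
        range(len(results)),
        key=lambda i: (-journal_counts[results[i]["journal"]], i),
    )
    return [results[i] for i in order]

hits = [
    {"title": "A", "journal": "J2"},
    {"title": "B", "journal": "J1"},
    {"title": "C", "journal": "J1"},
]
print([doc["title"] for doc in bradfordize(hits)])  # ['B', 'C', 'A']
```
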
  15. Price, A.: Five new Danish subject gateways under development (2000) 0.00
    0.0044077197 = product of:
      0.026446318 = sum of:
        0.026446318 = product of:
          0.052892637 = sum of:
            0.052892637 = weight(_text_:22 in 4878) [ClassicSimilarity], result of:
              0.052892637 = score(doc=4878,freq=2.0), product of:
                0.1367084 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03903913 = queryNorm
                0.38690117 = fieldWeight in 4878, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.078125 = fieldNorm(doc=4878)
          0.5 = coord(1/2)
      0.16666667 = coord(1/6)
    
    Date
    22. 6.2002 19:41:31
  16. Christof, J.: Metadata sharing : Die Verbunddatenbank Internetquellen der Virtuellen Fachbibliothek Politikwissenschaft und der Virtuellen Fachbibliothek Wirtschaftswissenschaften (2003) 0.00
    0.004403028 = product of:
      0.026418168 = sum of:
        0.026418168 = product of:
          0.052836336 = sum of:
            0.052836336 = weight(_text_:29 in 1916) [ClassicSimilarity], result of:
              0.052836336 = score(doc=1916,freq=4.0), product of:
                0.13732746 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.03903913 = queryNorm
                0.38474706 = fieldWeight in 1916, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1916)
          0.5 = coord(1/2)
      0.16666667 = coord(1/6)
    
    Date
    29. 9.2022 16:17:51
    Source
    Bibliotheken und Informationseinrichtungen - Aufgaben, Strukturen, Ziele: 29. Arbeits- und Fortbildungstagung der ASpB / Sektion 5 im DBV in Zusammenarbeit mit der BDB, BIB, DBV, DGI und VDB, zugleich DBV-Jahrestagung, 8.-11.4.2003 in Stuttgart. Red.: Margit Bauer
  17. Aksoy, C.; Can, F.; Kocberber, S.: Novelty detection for topic tracking (2012) 0.00
    0.004108188 = product of:
      0.024649128 = sum of:
        0.024649128 = product of:
          0.049298257 = sum of:
            0.049298257 = weight(_text_:methods in 51) [ClassicSimilarity], result of:
              0.049298257 = score(doc=51,freq=4.0), product of:
                0.15695344 = queryWeight, product of:
                  4.0204134 = idf(docFreq=2156, maxDocs=44218)
                  0.03903913 = queryNorm
                0.31409478 = fieldWeight in 51, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  4.0204134 = idf(docFreq=2156, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=51)
          0.5 = coord(1/2)
      0.16666667 = coord(1/6)
    
    Abstract
    Multisource web news portals provide various advantages such as richness in news content and an opportunity to follow developments from different perspectives. However, in such environments, news variety and quantity can have an overwhelming effect. New-event detection and topic-tracking studies address this problem. They examine news streams and organize stories according to their events; however, several tracking stories of an event/topic may contain no new information (i.e., no novelty). We study the novelty detection (ND) problem on the tracking news of a particular topic. For this purpose, we build a Turkish ND test collection called BilNov-2005 and propose the use of three ND methods: a cosine-similarity (CS)-based method, a language-model (LM)-based method, and a cover-coefficient (CC)-based method. For the LM-based ND method, we show that a simpler smoothing approach, Dirichlet smoothing, can have similar performance to a more complex smoothing approach, Shrinkage smoothing. We introduce a baseline that shows the performance of a system with random novelty decisions. In addition, a category-based threshold learning method is used for the first time in the ND literature. The experimental results show that the LM-based ND method significantly outperforms the CS- and CC-based methods, and category-based threshold learning achieves promising results when compared to general threshold learning.
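
The abstract above describes a cosine-similarity (CS)-based novelty decision: a tracking story counts as novel only if it is not too similar to any previously delivered story of the topic. The sketch below is a minimal, assumed version of that idea using raw term-frequency vectors and an arbitrary threshold; it is not the BilNov-2005 setup.

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def is_novel(story: str, seen: list, threshold: float = 0.4) -> bool:
    """CS-based novelty decision: the story is novel if it is not too
    similar to any previously delivered story of the same topic."""
    vec = Counter(story.lower().split())
    return all(cosine(vec, old) < threshold for old in seen)

seen_stories = []
for story in ["quake hits city", "quake hits city again", "rescue teams arrive"]:
    if is_novel(story, seen_stories):
        print("novel:", story)      # first and third stories are flagged
    seen_stories.append(Counter(story.lower().split()))
```
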
  18. Monopoli, M.; Nicholas, D.: ¬A user evaluation of Subject Based Information Gateways : case study ADAM (2001) 0.00
    0.004066899 = product of:
      0.024401393 = sum of:
        0.024401393 = product of:
          0.048802786 = sum of:
            0.048802786 = weight(_text_:methods in 696) [ClassicSimilarity], result of:
              0.048802786 = score(doc=696,freq=2.0), product of:
                0.15695344 = queryWeight, product of:
                  4.0204134 = idf(docFreq=2156, maxDocs=44218)
                  0.03903913 = queryNorm
                0.31093797 = fieldWeight in 696, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.0204134 = idf(docFreq=2156, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=696)
          0.5 = coord(1/2)
      0.16666667 = coord(1/6)
    
    Abstract
    Nowadays, end-users have quick and direct access to the massive amount of information available on the Net. However, this information is unorganized, and users are expected to identify and evaluate it in accordance with their information needs. Subject Based Information Gateways (SBIGs), organized collections of networked information, provide users with a catalogue of authoritative Internet resources, which can be searched and/or browsed. This paper provides an evaluation of one such gateway - the Art, Design, Architecture & Media Gateway (ADAM). It provides information on who these users are, how often they use the service, what their reasons for use are, which search methods and services they prefer, and what the advantages and disadvantages of an online information service are.
  19. Bainbridge, D.; Dewsnip, M.; Witten, I.H.: Searching digital music libraries (2005) 0.00
    0.004066899 = product of:
      0.024401393 = sum of:
        0.024401393 = product of:
          0.048802786 = sum of:
            0.048802786 = weight(_text_:methods in 997) [ClassicSimilarity], result of:
              0.048802786 = score(doc=997,freq=2.0), product of:
                0.15695344 = queryWeight, product of:
                  4.0204134 = idf(docFreq=2156, maxDocs=44218)
                  0.03903913 = queryNorm
                0.31093797 = fieldWeight in 997, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.0204134 = idf(docFreq=2156, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=997)
          0.5 = coord(1/2)
      0.16666667 = coord(1/6)
    
    Abstract
    There has been a recent explosion of interest in digital music libraries. In particular, interactive melody retrieval is a striking example of a search paradigm that differs radically from standard full-text search. Many different techniques have been proposed for melody matching, but the area lacks standard databases that allow them to be compared on common ground, and copyright issues have stymied attempts to develop such a corpus. This paper focuses on methods for evaluating different symbolic music matching strategies, and describes a series of experiments that compare and contrast results obtained using three dominant paradigms. Combining two of these paradigms yields a hybrid approach that is shown to have the best overall combination of efficiency and effectiveness.
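
For readers unfamiliar with symbolic melody matching, the sketch below illustrates one common paradigm (edit distance over pitch-interval sequences, which makes matching transposition-invariant). It is an assumed example, not necessarily one of the three paradigms compared in the paper.

```python
def intervals(pitches):
    """Turn a sequence of MIDI pitch numbers into successive intervals,
    so that transposed versions of a melody map to the same sequence."""
    return [b - a for a, b in zip(pitches, pitches[1:])]

def edit_distance(x, y):
    """Classic Levenshtein distance between two interval sequences."""
    dp = list(range(len(y) + 1))
    for i, xi in enumerate(x, 1):
        prev, dp[0] = dp[0], i
        for j, yj in enumerate(y, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,          # deletion
                                     dp[j - 1] + 1,      # insertion
                                     prev + (xi != yj))  # substitution
    return dp[-1]

# Query melody vs. a transposed copy and an unrelated melody (MIDI pitches).
query      = [60, 62, 64, 65, 67]   # C D E F G
transposed = [62, 64, 66, 67, 69]   # same tune, two semitones up
other      = [60, 60, 67, 67, 69]   # a different contour

print(edit_distance(intervals(query), intervals(transposed)))  # 0 (best match)
print(edit_distance(intervals(query), intervals(other)))       # > 0
```
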
  20. Theng, Y.-L.; Goh, D.H.-L.; Lim, E.-P.; Liu, Z.; Yin, M.; Pang, N.L.-S.; Wong, P.B.-B.: Applying scenario-based design and claims analysis to the design of a digital library of geography examination resources (2005) 0.00
    0.0037292899 = product of:
      0.022375738 = sum of:
        0.022375738 = product of:
          0.044751476 = sum of:
            0.044751476 = weight(_text_:theory in 1002) [ClassicSimilarity], result of:
              0.044751476 = score(doc=1002,freq=2.0), product of:
                0.16234003 = queryWeight, product of:
                  4.1583924 = idf(docFreq=1878, maxDocs=44218)
                  0.03903913 = queryNorm
                0.27566507 = fieldWeight in 1002, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.1583924 = idf(docFreq=1878, maxDocs=44218)
                  0.046875 = fieldNorm(doc=1002)
          0.5 = coord(1/2)
      0.16666667 = coord(1/6)
    
    Abstract
    This paper describes the application of Carroll's scenario-based design and claims analysis as a means of refining the initial design of a digital library of geographical resources (GeogDL) intended to prepare Singapore students for a national examination in geography. GeogDL is built on top of G-Portal, a digital library providing services over geospatial and georeferenced Web content. Beyond improving the initial design of GeogDL, a main contribution of the paper is to make explicit the use of Carroll's strongly theory-based but undercapitalized scenario-based design and claims analysis, which inspired the recommendations for refining GeogDL. The paper concludes with an overview of the implementation of some of the recommendations identified in the study to address "usability" and "usefulness" design issues in GeogDL, and discusses implications of the findings for geospatial digital libraries in general.

Languages

  • e 58
  • d 32

Types

  • a 84
  • el 10
  • s 1
  • x 1