Search (8 results, page 1 of 1)

  • × author_ss:"Kantor, P.B."
  1. Ng, K.B.; Loewenstern, D.; Basu, C.; Hirsh, H.; Kantor, P.B.: Data fusion of machine-learning methods for the TREC5 routing task (and other work) (1997) 0.06
    0.05997485 = product of:
      0.1199497 = sum of:
        0.1199497 = sum of:
          0.049157884 = weight(_text_:5 in 3107) [ClassicSimilarity], result of:
            0.049157884 = score(doc=3107,freq=2.0), product of:
              0.15247129 = queryWeight, product of:
                2.9180994 = idf(docFreq=6494, maxDocs=44218)
                0.052250203 = queryNorm
              0.32240748 = fieldWeight in 3107, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                2.9180994 = idf(docFreq=6494, maxDocs=44218)
                0.078125 = fieldNorm(doc=3107)
          0.07079182 = weight(_text_:22 in 3107) [ClassicSimilarity], result of:
            0.07079182 = score(doc=3107,freq=2.0), product of:
              0.18297131 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.052250203 = queryNorm
              0.38690117 = fieldWeight in 3107, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.078125 = fieldNorm(doc=3107)
      0.5 = coord(1/2)
    
    Date
    27. 2.1999 20:59:22
    Source
    The Fifth Text Retrieval Conference (TREC-5). Ed.: E.M. Voorhees and D.K. Harman
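The nested breakdown above is Lucene "explain" output for ClassicSimilarity (tf-idf) scoring. As a minimal sketch, assuming Lucene's classic formulas idf = 1 + ln(maxDocs / (docFreq + 1)) and tf = sqrt(freq), one weight(...) leaf can be reproduced as:

```python
import math

def classic_leaf_score(freq, doc_freq, max_docs, query_norm, field_norm):
    """Reproduce one weight(_text_:term) leaf of a ClassicSimilarity explain tree."""
    idf = 1.0 + math.log(max_docs / (doc_freq + 1.0))  # inverse document frequency
    tf = math.sqrt(freq)                               # term frequency factor
    query_weight = idf * query_norm                    # the "queryWeight" line
    field_weight = tf * idf * field_norm               # the "fieldWeight" line
    return query_weight * field_weight

# weight(_text_:5 in 3107): freq=2.0, docFreq=6494, maxDocs=44218
print(classic_leaf_score(2.0, 6494, 44218, 0.052250203, 0.078125))  # ≈ 0.049157884
```

The document score 0.05997485 then follows as the sum of the two leaf weights (0.1199497) multiplied by coord(1/2), because only one of two top-level query clauses matched.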
  2. Kantor, P.B.: Mathematical models in information science (2002) 0.02
    0.024777135 = product of:
      0.04955427 = sum of:
        0.04955427 = product of:
          0.09910854 = sum of:
            0.09910854 = weight(_text_:22 in 4112) [ClassicSimilarity], result of:
              0.09910854 = score(doc=4112,freq=2.0), product of:
                0.18297131 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.052250203 = queryNorm
                0.5416616 = fieldWeight in 4112, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.109375 = fieldNorm(doc=4112)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Bulletin of the American Society for Information Science. 28(2002) no.6, S.22-24
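In entry 2, only one of two query clauses matched at each nesting level, so the single leaf weight is halved twice by coord(1/2). A small sketch, assuming Lucene's coord(overlap, maxOverlap) = overlap / maxOverlap:

```python
def coord(overlap, max_overlap):
    # fraction of query clauses that matched this document
    return overlap / max_overlap

leaf = 0.09910854  # weight(_text_:22 in 4112)
score = leaf * coord(1, 2) * coord(1, 2)
print(round(score, 9))  # 0.024777135
```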
  3. Elovici, Y.; Shapira, Y.B.; Kantor, P.B.: A decision theoretic approach to combining information filters : an analytical and empirical evaluation (2006) 0.01
    0.012388567 = product of:
      0.024777135 = sum of:
        0.024777135 = product of:
          0.04955427 = sum of:
            0.04955427 = weight(_text_:22 in 5267) [ClassicSimilarity], result of:
              0.04955427 = score(doc=5267,freq=2.0), product of:
                0.18297131 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.052250203 = queryNorm
                0.2708308 = fieldWeight in 5267, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5267)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    22. 7.2006 15:05:39
  4. Kantor, P.B.; Lee, J.J.: Testing the maximum entropy principle for information retrieval (1998) 0.01
    0.009831577 = product of:
      0.019663153 = sum of:
        0.019663153 = product of:
          0.039326306 = sum of:
            0.039326306 = weight(_text_:5 in 3266) [ClassicSimilarity], result of:
              0.039326306 = score(doc=3266,freq=2.0), product of:
                0.15247129 = queryWeight, product of:
                  2.9180994 = idf(docFreq=6494, maxDocs=44218)
                  0.052250203 = queryNorm
                0.257926 = fieldWeight in 3266, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.9180994 = idf(docFreq=6494, maxDocs=44218)
                  0.0625 = fieldNorm(doc=3266)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Examines the Maximum Entropy Principle (MEP) retrieval method using the TREC-5 database. Evaluates the MEP through several tests and compares it with a naive ordering method and a lexicographic ordering method. The MEP does not provide any startling improvement, and it works reasonably well only in the case of a small number of keys and a relatively small collection
  5. Kantor, P.B.: Information theory (2009) 0.01
    0.009831577 = product of:
      0.019663153 = sum of:
        0.019663153 = product of:
          0.039326306 = sum of:
            0.039326306 = weight(_text_:5 in 3815) [ClassicSimilarity], result of:
              0.039326306 = score(doc=3815,freq=2.0), product of:
                0.15247129 = queryWeight, product of:
                  2.9180994 = idf(docFreq=6494, maxDocs=44218)
                  0.052250203 = queryNorm
                0.257926 = fieldWeight in 3815, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.9180994 = idf(docFreq=6494, maxDocs=44218)
                  0.0625 = fieldNorm(doc=3815)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    5. 8.2010 17:33:43
  6. Saracevic, T.; Kantor, P.B.: Studying the value of library and information services : Part II: Methodology and taxonomy (1997) 0.01
    0.0073736827 = product of:
      0.014747365 = sum of:
        0.014747365 = product of:
          0.02949473 = sum of:
            0.02949473 = weight(_text_:5 in 353) [ClassicSimilarity], result of:
              0.02949473 = score(doc=353,freq=2.0), product of:
                0.15247129 = queryWeight, product of:
                  2.9180994 = idf(docFreq=6494, maxDocs=44218)
                  0.052250203 = queryNorm
                0.19344449 = fieldWeight in 353, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.9180994 = idf(docFreq=6494, maxDocs=44218)
                  0.046875 = fieldNorm(doc=353)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Deals with the specifics of the study: the importance of taxonomy; the method used for gathering data on user assessments of value in 5 research libraries, involving 18 services and 528 interviews with users; the development and presentation of the taxonomy; and statistics and tests of the taxonomy. A novel aspect is the division of the value of information services into 3 general classes or facets: reasons for use of a service in the given instance; quality of interaction (use) related to that service; and worth, benefits, or implications of subsequent results from use
  7. Kantor, P.B.; Saracevic, T.: Quantitative study of the value of research libraries : a foundation for the evaluation of digital libraries (1999) 0.01
    0.0061447355 = product of:
      0.012289471 = sum of:
        0.012289471 = product of:
          0.024578942 = sum of:
            0.024578942 = weight(_text_:5 in 6711) [ClassicSimilarity], result of:
              0.024578942 = score(doc=6711,freq=2.0), product of:
                0.15247129 = queryWeight, product of:
                  2.9180994 = idf(docFreq=6494, maxDocs=44218)
                  0.052250203 = queryNorm
                0.16120374 = fieldWeight in 6711, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.9180994 = idf(docFreq=6494, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=6711)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    In anticipation of the explosive growth of digital libraries, a complex study was undertaken seeking to evaluate 21 diverse services at 5 major academic research libraries. This work stands as a model for evaluation of digital libraries, through its focus on both the costs of operations and the impacts of the services that those operations provide. The data have been analyzed using both statistical methods and methods of Data Envelopment Analysis. The results of the study, which are presented in detail, serve to demonstrate that a cross-functional approach to library services is feasible. They also highlight a new measure of impact, which is a weighted logarithmic combination of the amount of time that users spend interacting with the service, combined with a Likert-scale indication of the value of that service in relation to the time expended. The measure derived, incorporating simple information obtainable from the user, together with information which is readily available in server/client logs, provides an excellent foundation for transferring these measurement principles to the Digital Library environment
  8. Sun, Y.; Kantor, P.B.: Cross-evaluation : a new model for information system evaluation (2006) 0.01
    0.0061447355 = product of:
      0.012289471 = sum of:
        0.012289471 = product of:
          0.024578942 = sum of:
            0.024578942 = weight(_text_:5 in 5048) [ClassicSimilarity], result of:
              0.024578942 = score(doc=5048,freq=2.0), product of:
                0.15247129 = queryWeight, product of:
                  2.9180994 = idf(docFreq=6494, maxDocs=44218)
                  0.052250203 = queryNorm
                0.16120374 = fieldWeight in 5048, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.9180994 = idf(docFreq=6494, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5048)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Journal of the American Society for Information Science and Technology. 57(2006) no.5, S.614-628