Search (1 results, page 1 of 1)

  • author_ss:"Katzer, J."
  • author_ss:"Snyder, H."
  1. Katzer, J.; Snyder, H.: Toward a more realistic assessment of information retrieval performance (1990) 0.00
    0.0022989952 = product of:
      0.006896985 = sum of:
        0.006896985 = weight(_text_:a in 4865) [ClassicSimilarity], result of:
          0.006896985 = score(doc=4865,freq=6.0), product of:
            0.05209492 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.045180224 = queryNorm
            0.13239266 = fieldWeight in 4865, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046875 = fieldNorm(doc=4865)
      0.33333334 = coord(1/3)
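    The score breakdown above follows Lucene's ClassicSimilarity (TF-IDF): fieldWeight = tf · idf · fieldNorm, queryWeight = idf · queryNorm, and the final score is their product times the coord factor. A minimal sketch recomputing the figures (constants copied from the explain output; the formulas are standard ClassicSimilarity, the recomputation is illustrative only):

    ```python
    import math

    # Constants taken from the explain output for doc 4865.
    freq = 6.0                 # termFreq of "a" in the field
    doc_freq, max_docs = 37942, 44218
    query_norm = 0.045180224   # queryNorm
    field_norm = 0.046875      # fieldNorm(doc=4865)
    coord = 1.0 / 3.0          # coord(1/3): one of three query clauses matched

    # ClassicSimilarity formulas:
    tf = math.sqrt(freq)                        # 2.4494898 = tf(freq=6.0)
    idf = 1.0 + math.log(max_docs / (doc_freq + 1))  # 1.153047
    field_weight = tf * idf * field_norm        # 0.13239266 = fieldWeight
    query_weight = idf * query_norm             # 0.05209492 = queryWeight
    score = query_weight * field_weight * coord # 0.0022989952

    print(score)
    ```

    Each intermediate value matches the corresponding line of the explain tree, confirming how the 0.00 display score rounds away the tiny underlying value.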
    
    Abstract
    The dominant approach to information retrieval (IR) experiments contains several questionable assumptions which are no longer necessary for pragmatic reasons nor warranted conceptually. The consequence of continued acceptance of one particular untenable assumption, namely that the user's information need does not change as a result of interacting with the system or with the search intermediary, is that our understanding of the IR process and the evaluation of IR systems is distorted. This distortion tends to underestimate the performance of the system and its benefits to the user. Describes work in progress to test this assertion empirically and obtain estimates of the value added to the user's output by various components of the search process.
    Type
    a