Search (2 results, page 1 of 1)

  • author_ss:"Losee, R.M."
  • theme_ss:"Retrievalstudien"
  1. Losee, R.M.: Determining information retrieval and filtering performance without experimentation (1995)
    
    Date
    22.2.1996 13:14:10
  2. Losee, R.M.: Evaluating retrieval performance given database and query characteristics : analytic determination of performance surfaces (1996)
    
    Abstract
    An analytic method of information retrieval and filtering evaluation can quantitatively predict the expected number of documents examined in retrieving a relevant document. It also allows researchers and practitioners to qualitatively understand how variations in query parameter estimates affect retrieval performance. The incorporation of relevance feedback to increase our knowledge about the parameters of relevant documents and the robustness of parameter estimates is modeled. Single-term and two-term independence models, as well as a complete term-dependence model, are developed. An economic model of retrieval performance may be used to study the effects of database size and to provide analytic answers to questions comparing retrieval from small and large databases, as well as questions about the number of terms in a query. Results are presented as a performance surface, a three-dimensional graph showing the effects of two independent variables on performance.
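
    As a rough illustration of the quantity the abstract predicts, the expected number of documents a user must examine before reaching a relevant one, the sketch below computes that expectation for the simplest possible case: documents presented in random order, where the closed form (N + 1) / (R + 1) holds. This is only a baseline, not Losee's analytic method, which conditions the expectation on query and term characteristics; the function names and the Monte Carlo check are illustrative assumptions.

    import random

    def expected_search_length_random(n_docs, n_relevant):
        # Closed form for the expected rank of the first relevant document
        # when documents are presented in random order: (N + 1) / (R + 1).
        if n_relevant <= 0:
            raise ValueError("need at least one relevant document")
        return (n_docs + 1) / (n_relevant + 1)

    def simulated_search_length(n_docs, n_relevant, trials=10_000):
        # Monte Carlo check of the closed form above.
        total = 0
        for _ in range(trials):
            ranking = [1] * n_relevant + [0] * (n_docs - n_relevant)
            random.shuffle(ranking)
            total += ranking.index(1) + 1  # 1-based rank of first relevant doc
        return total / trials

    if __name__ == "__main__":
        print(expected_search_length_random(1000, 50))  # about 19.6 documents
        print(simulated_search_length(1000, 50))        # should land near 19.6

    A performance surface in the sense of the abstract would then plot such an expected search length against two model parameters, for example database size and the number of query terms, rather than assuming a random ordering.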