Information retrieval experiment (1981)
- Content
- Contains the contributions: ROBERTSON, S.E.: The methodology of information retrieval experiment; RIJSBERGEN, C.J. van: Retrieval effectiveness; BELKIN, N.: Ineffable concepts in information retrieval; TAGUE, J.M.: The pragmatics of information retrieval experimentation; LANCASTER, F.W.: Evaluation within the environment of an operating information service; BARRACLOUGH, E.D.: Opportunities for testing with online systems; KEEN, M.E.: Laboratory tests of manual systems; ODDY, R.N.: Laboratory tests: automatic systems; HEINE, M.D.: Simulation, and simulation experiments; COOPER, W.S.: Gedanken experimentation: an alternative to traditional system testing?; SPARCK JONES, K.: Actual tests - retrieval system tests; EVANS, L.: An experiment: search strategy variation in SDI profiles; SALTON, G.: The Smart environment for retrieval system evaluation - advantage and problem areas
- Editor
- Sparck Jones, K.