Information retrieval experiment (1981)
- Content
- Contains the contributions: ROBERTSON, S.E.: The methodology of information retrieval experiment; RIJSBERGEN, C.J. van: Retrieval effectiveness; BELKIN, N.: Ineffable concepts in information retrieval; TAGUE, J.M.: The pragmatics of information retrieval experimentation; LANCASTER, F.W.: Evaluation within the environment of an operating information service; BARRACLOUGH, E.D.: Opportunities for testing with online systems; KEEN, M.E.: Laboratory tests of manual systems; ODDY, R.N.: Laboratory tests: automatic systems; HEINE, M.D.: Simulation, and simulation experiments; COOPER, W.S.: Gedanken experimentation: an alternative to traditional system testing?; SPARCK JONES, K.: Actual tests - retrieval system tests; EVANS, L.: An experiment: search strategy variation in SDI profiles; SALTON, G.: The Smart environment for retrieval system evaluation - advantage and problem areas