Balog, K.; Schuth, A.; Dekker, P.; Tavakolpoursaleh, N.; Schaer, P.; Chuang, P.-Y.: Overview of the TREC 2016 Open Search track: Academic Search Edition (2016)
- Abstract
- We present the TREC Open Search track, which represents a new evaluation paradigm for information retrieval. It offers the possibility for researchers to evaluate their approaches in a live setting, with real, unsuspecting users of an existing search engine. The first edition of the track focuses on the academic search domain and features the ad-hoc scientific literature search task. We report on experiments with three different academic search engines: CiteSeerX, SSOAR, and Microsoft Academic Search.
- Type
- a