Search (1 result, page 1 of 1)

  • author_ss:"Dekker, P."
  • author_ss:"Schaer, P."
  • theme_ss:"Retrievalstudien"
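The three active facets above are Solr field filters on the multi-valued author_ss and theme_ss fields. A minimal sketch of what the underlying request could look like, assuming a standard Solr select handler (the endpoint URL and core name are placeholders, not taken from this page):

    import requests

    # Hypothetical Solr endpoint; host and core name are assumptions.
    SOLR_URL = "http://localhost:8983/solr/biblio/select"

    params = {
        "q": "*:*",
        # One fq per active facet, copied verbatim from the filter list above.
        "fq": [
            'author_ss:"Dekker, P."',
            'author_ss:"Schaer, P."',
            'theme_ss:"Retrievalstudien"',
        ],
        "wt": "json",
    }

    response = requests.get(SOLR_URL, params=params)
    print(response.json()["response"]["numFound"])  # 1, matching the result count above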
  1. Balog, K.; Schuth, A.; Dekker, P.; Tavakolpoursaleh, N.; Schaer, P.; Chuang, P.-Y.: Overview of the TREC 2016 Open Search Track Academic Search Edition (2016) 0.01
    0.006580358 = product of:
      0.036191966 = sum of:
        0.0062481174 = weight(_text_:a in 43) [ClassicSimilarity], result of:
          0.0062481174 = score(doc=43,freq=8.0), product of:
            0.030653298 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.026584605 = queryNorm
            0.20383182 = fieldWeight in 43, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0625 = fieldNorm(doc=43)
        0.02994385 = weight(_text_:k in 43) [ClassicSimilarity], result of:
          0.02994385 = score(doc=43,freq=2.0), product of:
            0.09490114 = queryWeight, product of:
              3.569778 = idf(docFreq=3384, maxDocs=44218)
              0.026584605 = queryNorm
            0.31552678 = fieldWeight in 43, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.569778 = idf(docFreq=3384, maxDocs=44218)
              0.0625 = fieldNorm(doc=43)
      0.18181819 = coord(2/11)
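    The breakdown above is Lucene's ClassicSimilarity (TF-IDF) explain output: each matching term contributes queryWeight × fieldWeight, where queryWeight = idf × queryNorm and fieldWeight = sqrt(termFreq) × idf × fieldNorm; the partial scores are summed and scaled by the coordination factor coord(2/11), since two of eleven query terms matched. A short sketch that recomputes the displayed score using only the constants listed above:

        from math import sqrt

        def term_score(tf, idf, query_norm, field_norm):
            # ClassicSimilarity partial score: queryWeight * fieldWeight
            query_weight = idf * query_norm              # idf(t) * queryNorm
            field_weight = sqrt(tf) * idf * field_norm   # tf(t in d) * idf(t) * fieldNorm(d)
            return query_weight * field_weight

        QUERY_NORM = 0.026584605   # queryNorm from the explain output
        FIELD_NORM = 0.0625        # fieldNorm(doc=43)

        w_a = term_score(tf=8.0, idf=1.153047, query_norm=QUERY_NORM, field_norm=FIELD_NORM)
        w_k = term_score(tf=2.0, idf=3.569778, query_norm=QUERY_NORM, field_norm=FIELD_NORM)

        coord = 2 / 11                 # coord(2/11) = 0.18181819
        score = (w_a + w_k) * coord
        print(score)                   # ~0.006580358, displayed as 0.01 after rounding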
    
    Abstract
    We present the TREC Open Search track, which represents a new evaluation paradigm for information retrieval. It offers the possibility for researchers to evaluate their approaches in a live setting, with real, unsuspecting users of an existing search engine. The first edition of the track focuses on the academic search domain and features the ad-hoc scientific literature search task. We report on experiments with three different academic search engines: CiteSeerX, SSOAR, and Microsoft Academic Search.
    Type
    a