Search (1 result, page 1 of 1)

  • author_ss:"Petrelli, D."
  • theme_ss:"Retrievalstudien"
  • year_i:[2000 TO 2010}
  1. Petrelli, D.: On the role of user-centred evaluation in the advancement of interactive information retrieval (2008) 0.03
    0.030485678 = product of:
      0.060971357 = sum of:
        0.060971357 = sum of:
          0.025592614 = weight(_text_:technology in 2026) [ClassicSimilarity], result of:
            0.025592614 = score(doc=2026,freq=2.0), product of:
              0.15554588 = queryWeight, product of:
                2.978387 = idf(docFreq=6114, maxDocs=44218)
                0.052224867 = queryNorm
              0.16453418 = fieldWeight in 2026, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                2.978387 = idf(docFreq=6114, maxDocs=44218)
                0.0390625 = fieldNorm(doc=2026)
          0.035378743 = weight(_text_:22 in 2026) [ClassicSimilarity], result of:
            0.035378743 = score(doc=2026,freq=2.0), product of:
              0.18288259 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.052224867 = queryNorm
              0.19345059 = fieldWeight in 2026, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0390625 = fieldNorm(doc=2026)
      0.5 = coord(1/2)
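The explain tree above can be reproduced with a few lines of arithmetic. The following is a minimal sketch in plain Python (not the Lucene API) of how Lucene's ClassicSimilarity combines the factors shown: tf = sqrt(freq), idf = 1 + ln(maxDocs / (docFreq + 1)), queryWeight = idf · queryNorm, fieldWeight = tf · idf · fieldNorm, and coord(1/2) halving the sum of the two matching terms. The function name and structure are illustrative, not Lucene's actual code.

```python
import math

def classic_term_score(freq, doc_freq, max_docs, query_norm, field_norm):
    """Reproduce one weight(...) node of the explain tree:
    score = queryWeight * fieldWeight (ClassicSimilarity)."""
    tf = math.sqrt(freq)                              # e.g. 1.4142135 for freq=2.0
    idf = 1.0 + math.log(max_docs / (doc_freq + 1.0)) # e.g. 2.978387 for docFreq=6114
    query_weight = idf * query_norm
    field_weight = tf * idf * field_norm
    return query_weight * field_weight

# Values taken from the two branches of the explain output above:
s_technology = classic_term_score(2.0, 6114, 44218, 0.052224867, 0.0390625)
s_22         = classic_term_score(2.0, 3622, 44218, 0.052224867, 0.0390625)
total        = 0.5 * (s_technology + s_22)           # coord(1/2)
```

Running this yields the per-term weights 0.025592614 and 0.035378743 and the final score 0.030485678, matching the explain tree.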
    
    Abstract
    This paper discusses the role of user-centred evaluation as an essential method for researching interactive information retrieval. It draws mainly on the work carried out during the Clarity Project, where different user-centred evaluations were run during the lifecycle of a cross-language information retrieval system. The iterative testing was not only instrumental to the development of a usable system; it also enhanced our knowledge of the potential, impact, and actual use of cross-language information retrieval technology. Indeed, the role of the user evaluation was dual: by testing a specific prototype it was possible to gain a micro-view and assess the effectiveness of each component of the complex system; by cumulating the results of all the evaluations (in total 43 people were involved) it was possible to build a macro-view of how cross-language retrieval would impact users and their tasks. By showing the richness of results that can be acquired, this paper aims to stimulate researchers to consider user-centred evaluation as a flexible, adaptable and comprehensive technique for investigating non-traditional information access systems.
    Source
    Information processing and management. 44(2008) no.1, S.22-38