Search (1 result, page 1 of 1)

  • author_ss:"Petrelli, D."
  • theme_ss:"Retrievalstudien"
  • year_i:[2000 TO 2010}
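The active filters above are Solr field queries; note the half-open range `[2000 TO 2010}`, which includes 2000 but excludes 2010. As a minimal sketch, the same filters could be assembled into a Solr request query string like this (the endpoint and parameter layout are assumptions; the `fq` values are taken verbatim from the facet list):

```python
from urllib.parse import urlencode

# Hypothetical Solr query string reproducing the three active filters;
# q=*:* matches everything, the fq clauses narrow the result set.
params = [
    ("q", "*:*"),
    ("fq", 'author_ss:"Petrelli, D."'),
    ("fq", 'theme_ss:"Retrievalstudien"'),
    ("fq", "year_i:[2000 TO 2010}"),  # half-open range: 2000 inclusive, 2010 exclusive
]
query_string = urlencode(params)
print(query_string)
```

Appended to a `/select` handler URL, this would reproduce the single-result listing shown below.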
  1. Petrelli, D.: On the role of user-centred evaluation in the advancement of interactive information retrieval (2008) 0.02
    0.02082137 = product of:
      0.08328548 = sum of:
        0.08328548 = sum of:
          0.049331643 = weight(_text_:project in 2026) [ClassicSimilarity], result of:
            0.049331643 = score(doc=2026,freq=2.0), product of:
              0.21156175 = queryWeight, product of:
                4.220981 = idf(docFreq=1764, maxDocs=44218)
                0.050121464 = queryNorm
              0.23317845 = fieldWeight in 2026, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.220981 = idf(docFreq=1764, maxDocs=44218)
                0.0390625 = fieldNorm(doc=2026)
          0.033953834 = weight(_text_:22 in 2026) [ClassicSimilarity], result of:
            0.033953834 = score(doc=2026,freq=2.0), product of:
              0.17551683 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.050121464 = queryNorm
              0.19345059 = fieldWeight in 2026, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0390625 = fieldNorm(doc=2026)
      0.25 = coord(1/4)
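The explain tree above follows Lucene's ClassicSimilarity (TF-IDF) formula: each term contributes queryWeight × fieldWeight, where queryWeight = idf × queryNorm and fieldWeight = tf × idf × fieldNorm, and the sum is scaled by the coord factor (1 of 4 query clauses matched). A minimal sketch reproducing the arithmetic, with all constants copied from the output (`term_score` is a helper name of my own):

```python
import math

# Constants copied from the Lucene explain output for doc 2026.
query_norm = 0.050121464
field_norm = 0.0390625  # fieldNorm(doc=2026)

def term_score(freq, idf):
    tf = math.sqrt(freq)                  # ClassicSimilarity tf(): sqrt(termFreq)
    query_weight = idf * query_norm       # idf * queryNorm
    field_weight = tf * idf * field_norm  # tf * idf * fieldNorm
    return query_weight * field_weight

s_project = term_score(2.0, 4.220981)   # ~0.049331643 (term "project")
s_22      = term_score(2.0, 3.5018296)  # ~0.033953834 (term "22")

coord = 1 / 4                            # coord(1/4): 1 of 4 clauses matched
total = (s_project + s_22) * coord       # ~0.02082137, the displayed score
print(round(total, 8))
```

Multiplying out the two terms and applying the coord factor recovers the 0.02 shown next to the result.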
    
    Abstract
    This paper discusses the role of user-centred evaluations as an essential method for researching interactive information retrieval. It draws mainly on the work carried out during the Clarity Project where different user-centred evaluations were run during the lifecycle of a cross-language information retrieval system. The iterative testing was not only instrumental to the development of a usable system, but it enhanced our knowledge of the potential, impact, and actual use of cross-language information retrieval technology. Indeed the role of the user evaluation was dual: by testing a specific prototype it was possible to gain a micro-view and assess the effectiveness of each component of the complex system; by cumulating the result of all the evaluations (in total 43 people were involved) it was possible to build a macro-view of how cross-language retrieval would impact on users and their tasks. By showing the richness of results that can be acquired, this paper aims at stimulating researchers into considering user-centred evaluations as a flexible, adaptable and comprehensive technique for investigating non-traditional information access systems.
    Source
    Information processing and management. 44(2008) no.1, S.22-38