Search (3 results, page 1 of 1)

  • × author_ss:"Freund, L."
  • × author_ss:"Toms, E.G."
  1. Toms, E.G.; Freund, L.; Li, C.: WiIRE: the Web interactive information retrieval experimentation system prototype (2004) 0.00
    0.003091229 = product of:
      0.012364916 = sum of:
        0.012364916 = weight(_text_:information in 2534) [ClassicSimilarity], result of:
          0.012364916 = score(doc=2534,freq=6.0), product of:
            0.06134496 = queryWeight, product of:
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.034944877 = queryNorm
            0.20156369 = fieldWeight in 2534, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.046875 = fieldNorm(doc=2534)
      0.25 = coord(1/4)
    
    Abstract
    We introduce WiIRE, a prototype system for conducting interactive information retrieval (IIR) experiments via the Internet. We conceived WiIRE to increase validity while streamlining procedures and adding efficiencies to the conduct of IIR experiments. The system incorporates password-controlled access, online questionnaires, study instructions and tutorials, conditional interface assignment, and conditional query assignment, as well as provision for data collection. As an initial evaluation, we used WiIRE in-house to conduct a Web-based IIR experiment using an external search engine with customized search interfaces and the TREC 11 Interactive Track search queries. Our evaluation of the prototype indicated significant cost efficiencies in the conduct of IIR studies and also yielded some novel findings about the human perspective: about half of the participants would have preferred some personal contact with the researcher, and the time participants spent on tasks decreased significantly over the course of a session.
    Source
    Information processing and management. 40(2004) no.4, S.655-676
  2. Freund, L.; Toms, E.G.: Interacting with archival finding aids (2016) 0.00
    0.003091229 = product of:
      0.012364916 = sum of:
        0.012364916 = weight(_text_:information in 2851) [ClassicSimilarity], result of:
          0.012364916 = score(doc=2851,freq=6.0), product of:
            0.06134496 = queryWeight, product of:
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.034944877 = queryNorm
            0.20156369 = fieldWeight in 2851, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.046875 = fieldNorm(doc=2851)
      0.25 = coord(1/4)
    
    Abstract
    This research aimed to gain a detailed understanding of how genealogists and historians interact with, and make use of, finding aids in print and digital form. The study uses the lens of human information interaction to investigate finding aid use. Data were collected through a lab-based study of 32 experienced archives users who completed two tasks with each of two finding aids. Participants were able to carry out the tasks, but they were somewhat challenged by the structure of the finding aid and employed various techniques to cope. Their patterns of interaction differed by task type, and they reported higher rates of satisfaction, ease of use, and clarity for the assessment task than for the known-item task. Four common patterns of interaction were identified: top-down, bottom-up, interrogative, and opportunistic. Results show how users interact with finding aids and identify features that support and hinder use. This research examines process and performance in addition to outcomes. Results contribute to the archival science literature and also suggest ways to extend models of human information interaction.
    Source
    Journal of the Association for Information Science and Technology. 67(2016) no.4, S.994-1008
  3. Wildemuth, B.; Freund, L.; Toms, E.G.: Untangling search task complexity and difficulty in the context of interactive information retrieval studies (2014) 0.00
    0.0021033147 = product of:
      0.008413259 = sum of:
        0.008413259 = weight(_text_:information in 1786) [ClassicSimilarity], result of:
          0.008413259 = score(doc=1786,freq=4.0), product of:
            0.06134496 = queryWeight, product of:
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.034944877 = queryNorm
            0.13714671 = fieldWeight in 1786, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1786)
      0.25 = coord(1/4)
    
    Abstract
    Purpose - One core element of interactive information retrieval (IIR) experiments is the assignment of search tasks. The purpose of this paper is to provide an analytical review of current practice in developing those search tasks to test, observe or control task complexity and difficulty.
    Design/methodology/approach - Over 100 prior studies of IIR were examined in terms of how each defined task complexity and/or difficulty (or related concepts) and subsequently interpreted those concepts in the development of the assigned search tasks.
    Findings - Search task complexity is found to include three dimensions: multiplicity of subtasks or steps, multiplicity of facets, and indeterminability. Search task difficulty is based on an interaction between the search task and the attributes of the searcher or the attributes of the search situation. The paper highlights the anomalies in our use of these two concepts, concluding with suggestions for future methodological research related to search task complexity and difficulty.
    Originality/value - By analyzing and synthesizing current practices, this paper provides guidance for future experiments in IIR that involve these two constructs.
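
A note on the relevance values: the numeric breakdowns attached to each result are raw Lucene ClassicSimilarity explain output. As a minimal sketch, assuming Lucene's classic tf-idf formulas (tf = sqrt(freq), idf = 1 + ln(maxDocs / (docFreq + 1))), the top-level score for result 1 can be reconstructed from the listed factors; the Python helper names below are illustrative, not part of any Lucene API.

    import math

    # Rebuild the explain tree for result 1 (doc 2534) from the factors shown above.
    # Formulas follow Lucene ClassicSimilarity; helper names are illustrative only.

    def classic_tf(freq):
        return math.sqrt(freq)                               # 2.4494898 for freq = 6

    def classic_idf(doc_freq, max_docs):
        return 1.0 + math.log(max_docs / (doc_freq + 1.0))   # 1.7554779

    freq, doc_freq, max_docs = 6.0, 20772, 44218
    query_norm, field_norm = 0.034944877, 0.046875
    coord = 0.25                                              # 1 of 4 query clauses matched

    idf = classic_idf(doc_freq, max_docs)
    query_weight = idf * query_norm                           # 0.06134496 = queryWeight
    field_weight = classic_tf(freq) * idf * field_norm        # 0.20156369 = fieldWeight
    score = coord * query_weight * field_weight               # 0.003091229

    print(f"{score:.9f}")

Result 3 follows the same computation with freq = 4.0 and fieldNorm = 0.0390625.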