Search (4 results, page 1 of 1)

  • author_ss:"Toms, E.G."
  • year_i:[2010 TO 2020}
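These facets are Solr filter queries: the _ss/_i suffixes follow Solr's dynamic-field naming for string and integer fields, and in the range year_i:[2010 TO 2020} the square bracket makes the lower bound inclusive while the curly brace makes the upper bound exclusive, so the filter matches 2010 through 2019. A minimal sketch of how such a request might be issued follows; the endpoint URL and query terms are assumptions (the page shows only the filters, not the query itself), and debugQuery=true is what produces the per-document explain trees shown below, one of which is reproduced numerically in the sketch at the end of the page.

import requests  # third-party HTTP client: pip install requests

# Hypothetical Solr endpoint; host and core name are not part of this
# page and are assumed for illustration only.
SOLR_URL = "http://localhost:8983/solr/literature/select"

params = {
    "q": "system retrieval",  # placeholder: the actual query terms are not shown
    # The two facet filters above, applied as non-scoring filter queries:
    "fq": ['author_ss:"Toms, E.G."', "year_i:[2010 TO 2020}"],
    "debugQuery": "true",  # return a per-hit scoring explanation
    "wt": "json",
}
response = requests.get(SOLR_URL, params=params)
print(response.json()["response"]["numFound"])  # expect 4 for this page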
  1. Dufour, C.; Bartlett, J.C.; Toms, E.G.: Understanding how webcasts are used as sources of information (2011) 0.01
    0.005470008 = product of:
      0.019145027 = sum of:
        0.0068760267 = product of:
          0.034380134 = sum of:
            0.034380134 = weight(_text_:system in 4195) [ClassicSimilarity], result of:
              0.034380134 = score(doc=4195,freq=6.0), product of:
                0.11408355 = queryWeight, product of:
                  3.1495528 = idf(docFreq=5152, maxDocs=44218)
                  0.03622214 = queryNorm
                0.30135927 = fieldWeight in 4195, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  3.1495528 = idf(docFreq=5152, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4195)
          0.2 = coord(1/5)
        0.0122690005 = product of:
          0.024538001 = sum of:
            0.024538001 = weight(_text_:22 in 4195) [ClassicSimilarity], result of:
              0.024538001 = score(doc=4195,freq=2.0), product of:
                0.12684377 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03622214 = queryNorm
                0.19345059 = fieldWeight in 4195, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4195)
          0.5 = coord(1/2)
      0.2857143 = coord(2/7)
    
    Abstract
    Webcasting systems were developed to provide remote access in real-time to live events. Today, these systems have an additional requirement: to accommodate the "second life" of webcasts as archival information objects. Research to date has focused on facilitating the production and storage of webcasts as well as the development of more interactive and collaborative multimedia tools to support the event, but research has not examined how people interact with a webcasting system to access and use the contents of those archived events. Using an experimental design, this study examined how 16 typical users interact with a webcasting system to respond to a set of information tasks: selecting a webcast, searching for specific information, and making a gist of a webcast. Using several data sources that included user actions, user perceptions, and user explanations of their actions and decisions, the study also examined the strategies employed to complete the tasks. The results revealed distinctive system-use patterns for each task and provided insights into the types of tools needed to make webcasting systems better suited for also using the webcasts as information objects.
    Date
    22.1.2011 14:16:14
  2. Wildemuth, B.; Freund, L.; Toms, E.G.: Untangling search task complexity and difficulty in the context of interactive information retrieval studies (2014) 0.00
    0.0049850564 = product of:
      0.017447697 = sum of:
        0.005178697 = product of:
          0.025893483 = sum of:
            0.025893483 = weight(_text_:retrieval in 1786) [ClassicSimilarity], result of:
              0.025893483 = score(doc=1786,freq=4.0), product of:
                0.109568894 = queryWeight, product of:
                  3.024915 = idf(docFreq=5836, maxDocs=44218)
                  0.03622214 = queryNorm
                0.23632148 = fieldWeight in 1786, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.024915 = idf(docFreq=5836, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1786)
          0.2 = coord(1/5)
        0.0122690005 = product of:
          0.024538001 = sum of:
            0.024538001 = weight(_text_:22 in 1786) [ClassicSimilarity], result of:
              0.024538001 = score(doc=1786,freq=2.0), product of:
                0.12684377 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03622214 = queryNorm
                0.19345059 = fieldWeight in 1786, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1786)
          0.5 = coord(1/2)
      0.2857143 = coord(2/7)
    
    Abstract
    Purpose - One core element of interactive information retrieval (IIR) experiments is the assignment of search tasks. The purpose of this paper is to provide an analytical review of current practice in developing those search tasks to test, observe or control task complexity and difficulty.
    Design/methodology/approach - Over 100 prior studies of IIR were examined in terms of how each defined task complexity and/or difficulty (or related concepts) and subsequently interpreted those concepts in the development of the assigned search tasks.
    Findings - Search task complexity is found to include three dimensions: multiplicity of subtasks or steps, multiplicity of facets, and indeterminability. Search task difficulty is based on an interaction between the search task and the attributes of the searcher or the attributes of the search situation. The paper highlights the anomalies in our use of these two concepts, concluding with suggestions for future methodological research related to search task complexity and difficulty.
    Originality/value - By analyzing and synthesizing current practices, this paper provides guidance for future experiments in IIR that involve these two constructs.
    Date
    6.4.2015 19:31:22
  3. Toms, E.G.: Task-based information searching and retrieval (2011) 0.00
    0.0020714786 = product of:
      0.01450035 = sum of:
        0.01450035 = product of:
          0.07250175 = sum of:
            0.07250175 = weight(_text_:retrieval in 544) [ClassicSimilarity], result of:
              0.07250175 = score(doc=544,freq=4.0), product of:
                0.109568894 = queryWeight, product of:
                  3.024915 = idf(docFreq=5836, maxDocs=44218)
                  0.03622214 = queryNorm
                0.6617001 = fieldWeight in 544, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.024915 = idf(docFreq=5836, maxDocs=44218)
                  0.109375 = fieldNorm(doc=544)
          0.2 = coord(1/5)
      0.14285715 = coord(1/7)
    
    Source
    Interactive information seeking, behaviour and retrieval. Eds.: Ruthven, I. and D. Kelly
  4. O'Brien, H.L.; Toms, E.G.: The development and evaluation of a survey to measure user engagement (2010) 0.00
    5.6712516E-4 = product of:
      0.003969876 = sum of:
        0.003969876 = product of:
          0.01984938 = sum of:
            0.01984938 = weight(_text_:system in 3312) [ClassicSimilarity], result of:
              0.01984938 = score(doc=3312,freq=2.0), product of:
                0.11408355 = queryWeight, product of:
                  3.1495528 = idf(docFreq=5152, maxDocs=44218)
                  0.03622214 = queryNorm
                0.17398985 = fieldWeight in 3312, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1495528 = idf(docFreq=5152, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=3312)
          0.2 = coord(1/5)
      0.14285715 = coord(1/7)
    
    Abstract
    Facilitating engaging user experiences is essential in the design of interactive systems. To accomplish this, it is necessary to understand the composition of this construct and how to evaluate it. Building on previous work that posited a theory of engagement and identified a core set of attributes that operationalized this construct, we constructed and evaluated a multidimensional scale to measure user engagement. In this paper we describe the development of the scale, as well as two large-scale studies (N=440 and N=802) that were undertaken to assess its reliability and validity in online shopping environments. In the first we used Reliability Analysis and Exploratory Factor Analysis to identify six attributes of engagement: Perceived Usability, Aesthetics, Focused Attention, Felt Involvement, Novelty, and Endurability. In the second we tested the validity of and relationships among those attributes using Structural Equation Modeling. The result of this research is a multidimensional scale that may be used to test the engagement of software applications. In addition, findings indicate that attributes of engagement are highly intertwined, a complex interplay of user-system interaction variables. Notably, Perceived Usability played a mediating role in the relationship between Endurability and Novelty, Aesthetics, Felt Involvement, and Focused Attention.
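
For reference, the scores in the explain trees above follow Lucene's ClassicSimilarity TF-IDF formula: tf = sqrt(termFreq), idf = 1 + ln(maxDocs / (docFreq + 1)), fieldWeight = tf x idf x fieldNorm, queryWeight = idf x queryNorm, and each clause's score is queryWeight x fieldWeight, with coord factors down-weighting documents that match only some of the query's clauses. A minimal sketch that reproduces result 1's score of 0.005470008 from the numbers in its tree:

import math

# Reproduce the ClassicSimilarity factors for the "system" term in
# result 1 (doc 4195), using only numbers from the explain tree above.
# queryNorm and the second clause's score are read off the tree rather
# than derived, since both depend on the full query.
doc_freq, max_docs = 5152, 44218
freq = 6.0
field_norm = 0.0390625        # 1/sqrt(field length), quantized by Lucene
query_norm = 0.03622214

idf = 1.0 + math.log(max_docs / (doc_freq + 1))   # 3.1495528
tf = math.sqrt(freq)                              # 2.4494898
query_weight = idf * query_norm                   # 0.11408355
field_weight = tf * idf * field_norm              # 0.30135927
term_score = query_weight * field_weight          # 0.034380134

# Combine with the "22" clause exactly as the tree does: each clause is
# scaled by its inner coord, summed, then scaled by the outer coord(2/7).
score_22 = 0.024538001
doc_score = (term_score * (1 / 5) + score_22 * (1 / 2)) * (2 / 7)
print(f"idf         = {idf:.7f}")
print(f"tf          = {tf:.7f}")
print(f"queryWeight = {query_weight:.8f}")
print(f"fieldWeight = {field_weight:.8f}")
print(f"doc score   = {doc_score:.9f}")           # 0.005470008

The same arithmetic applies to the other three trees; only the term frequencies, document frequencies, fieldNorms, and coord fractions change.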