Search (2 results, page 1 of 1)

  • author_ss:"Sahib, N.G."
  1. Sahib, N.G.; Tombros, A.; Stockman, T.: Investigating the behavior of visually impaired users for multi-session search tasks (2014) 0.00
    0.0016833913 = product of:
      0.016833913 = sum of:
        0.016833913 = weight(_text_:web in 1181) [ClassicSimilarity], result of:
          0.016833913 = score(doc=1181,freq=2.0), product of:
            0.0933738 = queryWeight, product of:
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.028611459 = queryNorm
            0.18028519 = fieldWeight in 1181, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1181)
      0.1 = coord(1/10)
    
    Abstract
    Multi-session search tasks are complex and span more than one web session. Such tasks are challenging because searchers must keep track of their search progress and the information they encounter across sessions. Multi-session tasks can be cognitively taxing for visually impaired users because the lack of persistence of screen readers causes the load on working memory to be high. In this article, we first discuss the habitual behavior of visually impaired participants for multi-session tasks when using popular search interfaces. We then present the evaluation of a search interface developed to support complex information seeking for visually impaired users. The user evaluation was structured in two sessions to simulate a multi-session task. Thus, we discuss the strategies observed among participants to resume the search, to review previously encountered information, and to satisfy their evolved information need. We also compare the information-seeking behavior across the two sessions and examine how the proposed interface supports participants for multi-session tasks. Findings from this evaluation contribute to our understanding of the information-seeking behavior of visually impaired users and have implications for the design of tools to support searchers to manage and make sense of information during multi-session search tasks.
  2. Sahib, N.G.; Tombros, A.; Stockman, T.: Evaluating a search interface for visually impaired searchers (2015) 0.00
    0.0016833913 = product of:
      0.016833913 = sum of:
        0.016833913 = weight(_text_:web in 2255) [ClassicSimilarity], result of:
          0.016833913 = score(doc=2255,freq=2.0), product of:
            0.0933738 = queryWeight, product of:
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.028611459 = queryNorm
            0.18028519 = fieldWeight in 2255, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.0390625 = fieldNorm(doc=2255)
      0.1 = coord(1/10)
    
    Abstract
    Understanding the information-seeking behavior of visually impaired users is essential to designing search interfaces that support them during their search tasks. In a previous article, we reported the information-seeking behavior of visually impaired users when performing complex search tasks on the web, and we examined the difficulties encountered when interacting with search interfaces via speech-based screen readers. In this article, we use our previous findings to inform the design of a search interface to support visually impaired users for complex information seeking. We particularly focus on implementing TrailNote, a tool to support visually impaired searchers in managing the search process, and we also redesign the spelling-support mechanism using nonspeech sounds to address previously observed difficulties in interacting with this feature. To enhance the user experience, we have designed interface features to be technically accessible as well as usable with speech-based screen readers. We have evaluated the proposed interface with 12 visually impaired users and studied how they interacted with the interface components. Our findings show that the search interface was effective in supporting participants for complex information seeking and that the proposed interface features were accessible and usable with speech-based screen readers.
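The two identical scores above come from Lucene's ClassicSimilarity (TF-IDF) formula, whose factors are spelled out in the explain trees: tf is the square root of the term frequency, idf is 1 + ln(maxDocs / (docFreq + 1)), queryWeight is idf scaled by the query norm, fieldWeight is tf x idf x fieldNorm, and coord down-weights the score because only 1 of 10 query clauses matched. A minimal sketch reproducing that arithmetic from the reported values (queryNorm and fieldNorm are taken directly from the explain output, since they depend on index-wide state not shown here):

```python
import math

# Inputs as reported in the explain trees for the term "web".
freq = 2.0                # termFreq=2.0
doc_freq = 4597           # docFreq from the idf line
max_docs = 44218          # maxDocs from the idf line
query_norm = 0.028611459  # queryNorm, taken as given from the output
field_norm = 0.0390625    # fieldNorm(doc=...), taken as given
coord = 1 / 10            # coord(1/10): 1 of 10 clauses matched

tf = math.sqrt(freq)                            # tf(freq=2.0) = 1.4142135...
idf = 1 + math.log(max_docs / (doc_freq + 1))   # idf = 3.2635105...
query_weight = idf * query_norm                 # queryWeight = 0.0933738...
field_weight = tf * idf * field_norm            # fieldWeight = 0.18028519...
score = query_weight * field_weight * coord     # final score = 0.0016833913...

print(score)
```

Both entries score identically because every factor (term frequency, field norm, and the shared idf and queryNorm) happens to be the same for the matched field in each document.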