Search (4 results, page 1 of 1)

  • author_ss:"Shah, C."
  1. Choi, E.; Shah, C.: User motivations for asking questions in online Q&A services (2016) 0.07
    0.06920077 = product of:
      0.13840154 = sum of:
        0.13840154 = product of:
          0.27680308 = sum of:
            0.27680308 = weight(_text_:q in 2896) [ClassicSimilarity], result of:
              0.27680308 = score(doc=2896,freq=14.0), product of:
                0.28916505 = queryWeight, product of:
                  6.5493927 = idf(docFreq=171, maxDocs=44218)
                  0.04415143 = queryNorm
                0.9572494 = fieldWeight in 2896, product of:
                  3.7416575 = tf(freq=14.0), with freq of:
                    14.0 = termFreq=14.0
                  6.5493927 = idf(docFreq=171, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2896)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
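    The tree above is standard Lucene ClassicSimilarity (TF-IDF) explain output. A minimal sketch reproducing its arithmetic, with all input values read directly from the tree (the helper names are illustrative, not Lucene API):

    ```python
    import math

    def idf(doc_freq, max_docs):
        # ClassicSimilarity: idf = 1 + ln(maxDocs / (docFreq + 1))
        return 1.0 + math.log(max_docs / (doc_freq + 1))

    def tf(freq):
        # ClassicSimilarity: tf = sqrt(termFreq)
        return math.sqrt(freq)

    query_norm = 0.04415143           # normalizes scores across the whole query
    field_norm = 0.0390625            # encoded length norm of the matched field

    idf_q = idf(171, 44218)           # idf of "q": ~6.5493927

    query_weight = idf_q * query_norm                 # ~0.28916505
    field_weight = tf(14.0) * idf_q * field_norm      # ~0.9572494

    raw_score = query_weight * field_weight           # ~0.27680308
    # the two coord(1/2) factors each halve the score:
    final_score = raw_score * 0.5 * 0.5               # ~0.06920077
    ```

    The same computation, with different `freq`, `docFreq`, and norm values, explains the scores of the remaining three results.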
    
    Abstract
    Online Q&A services are information sources where people identify their information need, formulate the need in natural language, and interact with one another to satisfy their needs. Even though online Q&A has grown considerably in popularity in recent years and has influenced information-seeking behaviors, we still know little about what motivates people to ask a question in online Q&A environments. Yahoo! Answers and WikiAnswers were selected as the test beds in the study, and a sequential mixed method employing an Internet-based survey, a diary method, and interviews was used to investigate user motivations for asking a question in online Q&A services. Cognitive needs were found to be the most significant motivation driving people to ask a question. Yet other motivational factors (e.g., tension-free needs) also played an important role, depending on the asker's contexts and situations. Understanding motivations for asking a question could provide a general framework for conceptualizing different contexts and situations of information needs in online Q&A. The findings have several implications, not only for developing better question-answering processes in online Q&A environments, but also for gaining a broader understanding of online information-seeking behaviors.
  2. Shah, C.; Kitzie, V.: Social Q&A and virtual reference : comparing apples and oranges with the help of experts and users (2012) 0.06
    0.05848532 = product of:
      0.11697064 = sum of:
        0.11697064 = product of:
          0.23394129 = sum of:
            0.23394129 = weight(_text_:q in 457) [ClassicSimilarity], result of:
              0.23394129 = score(doc=457,freq=10.0), product of:
                0.28916505 = queryWeight, product of:
                  6.5493927 = idf(docFreq=171, maxDocs=44218)
                  0.04415143 = queryNorm
                0.8090234 = fieldWeight in 457, product of:
                  3.1622777 = tf(freq=10.0), with freq of:
                    10.0 = termFreq=10.0
                  6.5493927 = idf(docFreq=171, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=457)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Online question-answering (Q&A) services are becoming increasingly popular among information seekers. We divide them into two categories, social Q&A (SQA) and virtual reference (VR), and examine how experts (librarians) and end users (students) evaluate information within both categories. To accomplish this, we first performed an extensive literature review and compiled a list of the aspects found to contribute to a "good" answer. These aspects were divided among three high-level concepts: relevance, quality, and satisfaction. We then interviewed both experts and users, asking them first to reflect on their online Q&A experiences and then to comment on our list of aspects. These interviews uncovered two main disparities: one between users' expectations of these services and how information was actually delivered, and the other between users' and experts' perceptions of the aforementioned three characteristics of relevance, quality, and satisfaction. Using qualitative analyses of both the interviews and relevant literature, we suggest ways to create better hybrid solutions for online Q&A and to bridge the gap between experts' and users' understandings of relevance, quality, and satisfaction, as well as the perceived importance of each in contributing to a good answer.
  3. Radford, M.L.; Connaway, L.S.; Mikitish, S.; Alpert, M.; Shah, C.; Cooke, N.A.: Shared values, new vision : collaboration and communities of practice in virtual reference and SQA (2017) 0.03
    0.026155427 = product of:
      0.052310854 = sum of:
        0.052310854 = product of:
          0.10462171 = sum of:
            0.10462171 = weight(_text_:q in 3352) [ClassicSimilarity], result of:
              0.10462171 = score(doc=3352,freq=2.0), product of:
                0.28916505 = queryWeight, product of:
                  6.5493927 = idf(docFreq=171, maxDocs=44218)
                  0.04415143 = queryNorm
                0.3618062 = fieldWeight in 3352, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  6.5493927 = idf(docFreq=171, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=3352)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    This investigation of new approaches to improving collaboration, user/librarian experiences, and sustainability for virtual reference services (VRS) reports findings from a grant project titled "Cyber Synergy: Seeking Sustainability between Virtual Reference and Social Q&A Sites" (Radford, Connaway, & Shah, 2011-2014). In-depth telephone interviews with 50 VRS librarians included questions on collaboration, referral practices, and attitudes toward Social Question and Answer (SQA) services, using the Critical Incident Technique (Flanagan, 1954). The Community of Practice (CoP) framework (Wenger, 1998; Davies, 2005) was found to be a useful conceptualization for understanding VRS professionals' approaches to their work. Findings indicate that participants usually refer questions from outside their area of expertise to other librarians, but occasionally refer them to nonlibrarian experts. These referrals are made possible because participants believe that other VRS librarians are qualified and willing collaborators. Barriers to collaboration include not knowing appropriate librarians/experts for referral, inability to verify credentials, and perceived unwillingness to collaborate. Facilitators of collaboration include knowing appropriate, qualified collaborators and a willingness to refer. Answers from SQA services were perceived as less objective and authoritative, but participants were open to collaborating with nonlibrarian experts upon confirmation of professional expertise or extensive knowledge.
  4. Wang, Y.; Shah, C.: Investigating failures in information seeking episodes (2017) 0.01
    0.0074773864 = product of:
      0.014954773 = sum of:
        0.014954773 = product of:
          0.029909546 = sum of:
            0.029909546 = weight(_text_:22 in 2922) [ClassicSimilarity], result of:
              0.029909546 = score(doc=2922,freq=2.0), product of:
                0.15461078 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04415143 = queryNorm
                0.19345059 = fieldWeight in 2922, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2922)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    20. 1.2015 18:30:22