Search (4 results, page 1 of 1)

  • × author_ss:"Shah, C."
  • × theme_ss:"Informationsdienstleistungen"
  1. Shah, C.; Kitzie, V.: Social Q&A and virtual reference : comparing apples and oranges with the help of experts and users (2012) 0.00
    
    Abstract
Online question-answering (Q&A) services are becoming increasingly popular among information seekers. We divide them into two categories, social Q&A (SQA) and virtual reference (VR), and examine how experts (librarians) and end users (students) evaluate information within both categories. To accomplish this, we first performed an extensive literature review and compiled a list of the aspects found to contribute to a "good" answer. These aspects were divided among three high-level concepts: relevance, quality, and satisfaction. We then interviewed both experts and users, asking them first to reflect on their online Q&A experiences and then to comment on our list of aspects. These interviews uncovered two main disparities. One lay between users' expectations of these services and how information was actually delivered by them; the other between the perceptions of users and experts with regard to the aforementioned three characteristics of relevance, quality, and satisfaction. Using qualitative analyses of both the interviews and relevant literature, we suggest ways to create better hybrid solutions for online Q&A and to bridge the gap between experts' and users' understandings of relevance, quality, and satisfaction, as well as the perceived importance of each in contributing to a good answer.
    Type
    a
  2. Shah, C.: Collaborative information seeking (2014) 0.00
    
    Abstract
The notions that information seeking is not always a solitary activity and that people working in collaboration on information-intensive tasks should be studied and supported have become more prevalent in recent years. Several new research questions, methodologies, and systems have emerged around these notions that may prove to be useful beyond the field of collaborative information seeking (CIS), with relevance to the broader area of information seeking and behavior. This article provides an overview of such key research work from a variety of domains, including library and information science, computer-supported cooperative work, human-computer interaction, and information retrieval. It starts with explanations of collaboration and how CIS fits in different contexts, emphasizing the interactive, intentional, and mutually beneficial nature of CIS activities. Relations to similar and related fields such as collaborative information retrieval, collaborative information behavior, and collaborative filtering are also clarified. Next, the article presents a synthesis of various frameworks and models that exist in the field today, along with a new synthesis of 12 different dimensions of group activities. A discussion on issues and approaches relating to evaluating various parameters in CIS follows. Finally, a list of known issues and challenges is presented to provide an overview of research opportunities in this field.
    Type
    a
  3. Le, L.T.; Shah, C.: Retrieving people : identifying potential answerers in Community Question-Answering (2018) 0.00
    
    Abstract
Community Question-Answering (CQA) sites have become popular venues where people can ask questions, seek information, or share knowledge with a user community. Although responses on CQA sites are obviously slower than information retrieved by a search engine, one of the most frustrating aspects of CQA occurs when an asker's posted question does not receive a reasonable answer or remains unanswered. CQA sites could improve users' experience by identifying potential answerers and routing appropriate questions to them. In this paper, we predict potential answerers based on question content and user profiles. Our approach builds user profiles from past activity. When a new question is posted, the proposed method computes scores between the question and all user profiles to find the potential answerers. We conduct extensive experimental evaluations on two popular CQA sites - Yahoo! Answers and Stack Overflow - to show the effectiveness of our algorithm. The results show that our technique is able to identify a group of 1,000 users that, with a probability higher than 50%, contains at least one user who will answer the question, on both sites. Further analysis indicates that topic interest and activity level can improve the accuracy of our approach.
    Type
    a
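The ranking approach described in the abstract above (user profiles built from past answering activity, scored against each new question) can be illustrated with a minimal bag-of-words sketch. All names and the toy data below are hypothetical; the paper's actual method uses richer features such as topic interest and activity level.

```python
# Minimal sketch of profile-based answerer ranking, assuming a simple
# bag-of-words profile per user and cosine similarity as the score.
import math
from collections import Counter

def tokenize(text):
    return text.lower().split()

def build_profiles(past_answers):
    """Aggregate the text of each user's past answered questions
    into one term-frequency profile per user."""
    profiles = {}
    for user, text in past_answers:
        profiles.setdefault(user, Counter()).update(tokenize(text))
    return profiles

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_answerers(question, profiles, k=3):
    """Score a new question against every user profile and return
    the top-k users with a nonzero match."""
    q = Counter(tokenize(question))
    scored = sorted(((cosine(q, p), u) for u, p in profiles.items()),
                    reverse=True)
    return [u for score, u in scored[:k] if score > 0]

# Hypothetical toy data: (user, text of a question the user answered).
past = [
    ("alice", "how to sort a python list in place"),
    ("alice", "python dict comprehension syntax"),
    ("bob", "java generics wildcard bounds"),
]
profiles = build_profiles(past)
print(rank_answerers("reverse a python list", profiles, k=2))
```

In the toy run, only "alice" shares vocabulary with the new question, so "bob" is filtered out by the nonzero-score check; at the paper's scale, the same scoring step would shortlist a candidate pool (e.g. 1,000 users) for question routing.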
  4. Radford, M.L.; Connaway, L.S.; Mikitish, S.; Alpert, M.; Shah, C.; Cooke, N.A.: Shared values, new vision : collaboration and communities of practice in virtual reference and SQA (2017) 0.00
    
    Abstract
This investigation of new approaches to improving collaboration, user/librarian experiences, and sustainability for virtual reference services (VRS) reports findings from a grant project titled "Cyber Synergy: Seeking Sustainability between Virtual Reference and Social Q&A Sites" (Radford, Connaway, & Shah, 2011-2014). In-depth telephone interviews with 50 VRS librarians included questions on collaboration, referral practices, and attitudes toward Social Question and Answer (SQA) services, using the Critical Incident Technique (Flanagan, 1954). The Community of Practice (CoP) framework (Wenger, 1998; Davies, 2005) was found to be a useful conceptualization for understanding VRS professionals' approaches to their work. Findings indicate that participants usually refer questions from outside their area of expertise to other librarians, but occasionally refer them to nonlibrarian experts. These referrals are made possible because participants believe that other VRS librarians are qualified and willing collaborators. Barriers to collaboration include not knowing appropriate librarians/experts for referral, inability to verify credentials, and perceived unwillingness to collaborate. Facilitators of collaboration include knowledge of appropriate, qualified collaborators and a willingness to refer. Answers from SQA services were perceived as less objective and authoritative, but participants were open to collaborating with nonlibrarian experts given confirmation of professional expertise or extensive knowledge.
    Type
    a