Search (5 results, page 1 of 1)

  • author_ss:"Hert, C.A."
  • year_i:[1990 TO 2000}
  1. Harter, S.P.; Hert, C.A.: Evaluation of information retrieval systems : approaches, issues, and methods (1997) 0.02
    0.023341617 = product of:
      0.046683233 = sum of:
        0.046683233 = product of:
          0.09336647 = sum of:
            0.09336647 = weight(_text_:systems in 2264) [ClassicSimilarity], result of:
              0.09336647 = score(doc=2264,freq=12.0), product of:
                0.16037072 = queryWeight, product of:
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.052184064 = queryNorm
                0.58219147 = fieldWeight in 2264, product of:
                  3.4641016 = tf(freq=12.0), with freq of:
                    12.0 = termFreq=12.0
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=2264)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
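The indented breakdown above is Lucene's ClassicSimilarity (TF-IDF) explain output. A minimal sketch reproducing its arithmetic, with the numbers copied from the tree (the variable names are my own, not Lucene API identifiers):

```python
import math

# Values taken from the explanation tree for result 1 (doc 2264).
idf = 3.0731742          # idf(docFreq=5561, maxDocs=44218)
query_norm = 0.052184064
freq = 12.0              # termFreq of "systems" in the field
field_norm = 0.0546875

# ClassicSimilarity idf formula: log(maxDocs / (docFreq + 1)) + 1
idf_check = math.log(44218 / (5561 + 1)) + 1   # ~3.0731742

query_weight = idf * query_norm                # 0.16037072
tf = math.sqrt(freq)                           # 3.4641016
field_weight = tf * idf * field_norm           # 0.58219147
term_score = query_weight * field_weight       # 0.09336647

# Two coord(1/2) factors: only one of two query clauses matched the document.
final_score = term_score * 0.5 * 0.5           # 0.023341617
```

The same structure (queryWeight × fieldWeight, then coord factors) explains the score trees of the other four results; only freq, fieldNorm, and therefore the products differ.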
    
    Abstract
    State-of-the-art review of information retrieval systems, defined as systems retrieving documents as opposed to numerical data. Explains the classic Cranfield studies that have served as a standard for retrieval testing since the 1960s and discusses the Cranfield model and its relevance-based measures of retrieval effectiveness. Details some of the problems with the Cranfield instruments and issues of validity and reliability, generalizability, usefulness, and basic concepts. Discusses the evaluation of Internet search engines in light of the Cranfield model, noting the very real differences between batch systems (Cranfield) and interactive systems (Internet). Because the Internet collection is not fixed, it is impossible to determine recall as a measure of retrieval effectiveness. Considers future directions in evaluating information retrieval systems.
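The abstract's point about recall on the open Web follows directly from the standard definitions. A minimal sketch with invented document IDs for illustration:

```python
# Hypothetical judged collection: the full relevant set must be known.
retrieved = {"d1", "d2", "d3", "d4"}
relevant = {"d2", "d3", "d7"}    # ALL relevant docs in the collection

hits = retrieved & relevant
precision = len(hits) / len(retrieved)  # 2/4 = 0.5
recall = len(hits) / len(relevant)      # 2/3
```

Precision needs only the retrieved set, so it can be judged for any engine; recall's denominator is the total number of relevant documents in the collection, which is unknowable when the collection (the Internet) is not fixed.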
  2. Nilan, M.S.; Hert, C.A.: Incorporating the user in system evaluation and design (1992) 0.01
    0.010890487 = product of:
      0.021780973 = sum of:
        0.021780973 = product of:
          0.043561947 = sum of:
            0.043561947 = weight(_text_:systems in 3867) [ClassicSimilarity], result of:
              0.043561947 = score(doc=3867,freq=2.0), product of:
                0.16037072 = queryWeight, product of:
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.052184064 = queryNorm
                0.2716328 = fieldWeight in 3867, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.0625 = fieldNorm(doc=3867)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Suggests a user-based approach to the design and evaluation of computerized library systems (e.g. OPACs) which is application specific. Data were collected on 93 end-user interactions with a newly implemented OPAC in a large US university library. Concludes that researchers need to explore questions relating directly to problems faced by users and to present findings in ways which provide action-oriented recommendations.
  3. Hert, C.A.; Nilan, M.S.: User-based information retrieval evaluation : an examination of an online public access catalog (1991) 0.01
    0.009529176 = product of:
      0.019058352 = sum of:
        0.019058352 = product of:
          0.038116705 = sum of:
            0.038116705 = weight(_text_:systems in 3671) [ClassicSimilarity], result of:
              0.038116705 = score(doc=3671,freq=2.0), product of:
                0.16037072 = queryWeight, product of:
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.052184064 = queryNorm
                0.23767869 = fieldWeight in 3671, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=3671)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Suggests that an appropriate method for evaluating information retrieval systems is to focus on the individual actions users attempt to perform on a system. The evaluation criterion 'degree of match' between attempted actions and actual system commands is one of a series of evaluation measures, including perceived measures and summary measures, that were investigated. Results indicate that there is potential for significant improvement of the OPAC's interface, particularly those parts of the interface which concern searching and revision activities.
  4. Hert, C.A.: Exploring a new model for understanding information retrieval interactions (1992) 0.01
    0.009529176 = product of:
      0.019058352 = sum of:
        0.019058352 = product of:
          0.038116705 = sum of:
            0.038116705 = weight(_text_:systems in 4521) [ClassicSimilarity], result of:
              0.038116705 = score(doc=4521,freq=2.0), product of:
                0.16037072 = queryWeight, product of:
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.052184064 = queryNorm
                0.23767869 = fieldWeight in 4521, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=4521)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Describes a project to pull together several different strands of research into the information retrieval process through the inductive development of a model of that process. Using the constant comparative method, user interactions with systems, as represented by talk-aloud protocols and post-search interviews, were analysed to develop the model. Preliminary results, based on an analysis of the interactions and interviews of 5 users of an OPAC, suggest new variables and elements of the information retrieval process which need to be considered in later research.
  5. Hert, C.A.: User goals on an online public access catalog (1996) 0.01
    0.008167865 = product of:
      0.01633573 = sum of:
        0.01633573 = product of:
          0.03267146 = sum of:
            0.03267146 = weight(_text_:systems in 4381) [ClassicSimilarity], result of:
              0.03267146 = score(doc=4381,freq=2.0), product of:
                0.16037072 = queryWeight, product of:
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.052184064 = queryNorm
                0.2037246 = fieldWeight in 4381, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0731742 = idf(docFreq=5561, maxDocs=44218)
                  0.046875 = fieldNorm(doc=4381)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    An ongoing thread in information retrieval research has been the exploration of user goals (or information needs, or problems) on information retrieval systems. It has been suggested that an understanding of goals and their role in the information retrieval interaction can provide insight into appropriate retrieval strategies, relevant documents, and general system design. This article reports on empirical findings concerning goals of users searching an OPAC at a northeastern United States university. These findings were generated during a large inductive and qualitative study of users' interactions with the OPAC. It was found that respondents came to the OPAC to search for a variety of course- or degree-related projects in which they were engaged. Respondent goals were not greatly modified during the course of these interactions. A set of situational elements associated with the respondent's goal was also identified. The implications of these findings for OPAC design and the training of information professionals are discussed.