Search (2 results, page 1 of 1)

  • author_ss:"Pentecost, J."
  • theme_ss:"Retrievalstudien"
  1. Hersh, W.; Pentecost, J.; Hickam, D.: A task-oriented approach to information retrieval evaluation : overview and design for empirical testing (1996) 0.01
    0.014540519 = product of:
      0.072702594 = sum of:
        0.072702594 = weight(_text_:thesaurus in 3001) [ClassicSimilarity], result of:
          0.072702594 = score(doc=3001,freq=2.0), product of:
            0.23732872 = queryWeight, product of:
              4.6210785 = idf(docFreq=1182, maxDocs=44218)
              0.051357865 = queryNorm
            0.30633712 = fieldWeight in 3001, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.6210785 = idf(docFreq=1182, maxDocs=44218)
              0.046875 = fieldNorm(doc=3001)
      0.2 = coord(1/5)
    
    Abstract
    As retrieval systems become more oriented towards end-users, there is an increasing need for improved methods to evaluate their effectiveness. We performed a task-oriented assessment of 2 MEDLINE searching systems, one which promotes traditional Boolean searching on human-indexed thesaurus terms and the other natural language searching on words in the title, abstract, and indexing terms. Medical students were randomized to one of the 2 systems and given clinical questions to answer. The students were able to use each system successfully, with no significant differences between the systems in questions correctly answered, time taken, relevant articles retrieved, or user satisfaction. This approach to evaluation was successful in measuring the effectiveness of system use and demonstrates that both types of systems can be used equally well with minimal training.
  2. Hersh, W.R.; Pentecost, J.; Hickam, D.H.: A task-oriented approach to retrieval system evaluation (1995) 0.01
    0.014540519 = product of:
      0.072702594 = sum of:
        0.072702594 = weight(_text_:thesaurus in 3867) [ClassicSimilarity], result of:
          0.072702594 = score(doc=3867,freq=2.0), product of:
            0.23732872 = queryWeight, product of:
              4.6210785 = idf(docFreq=1182, maxDocs=44218)
              0.051357865 = queryNorm
            0.30633712 = fieldWeight in 3867, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.6210785 = idf(docFreq=1182, maxDocs=44218)
              0.046875 = fieldNorm(doc=3867)
      0.2 = coord(1/5)
    
    Abstract
    There is a need for improved methods to evaluate the effectiveness of end-user information retrieval systems. Performs a task-oriented assessment of 2 MEDLINE searching systems, one which promotes Boolean searching on human-indexed thesaurus terms and the other natural language searching on words in the title, abstract, and indexing terms. Each was used by medical students to answer clinical questions. Students were able to use each system successfully, with no significant differences between the systems in questions correctly answered, time taken, relevant articles retrieved, or user satisfaction. This approach to evaluation was successful in measuring the effectiveness of system use and demonstrates that both types of systems can be used equally well with minimal training.
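
Note on the relevance scores: the breakdowns shown under each result are Lucene ClassicSimilarity (TF-IDF) explain output. As a minimal sketch, assuming the standard ClassicSimilarity definitions (tf = sqrt(freq), idf = 1 + ln(maxDocs / (docFreq + 1))), the displayed 0.01 can be reproduced from the listed factors:

    import math

    # Factors copied from the explain output for doc 3001 (doc 3867 lists the same values)
    freq = 2.0             # termFreq of "thesaurus" in the matched field
    doc_freq = 1182        # docFreq: documents containing "thesaurus"
    max_docs = 44218       # maxDocs: documents in the index
    query_norm = 0.051357865
    field_norm = 0.046875
    coord = 1 / 5          # one of five query clauses matched

    tf = math.sqrt(freq)                            # 1.4142135
    idf = 1 + math.log(max_docs / (doc_freq + 1))   # 4.6210785
    query_weight = idf * query_norm                 # 0.23732872
    field_weight = tf * idf * field_norm            # 0.30633712
    score = coord * query_weight * field_weight     # 0.014540519, displayed rounded as 0.01
    print(score)

Both entries receive the same score because their explain trees contain identical factors (same term frequency, docFreq, and fieldNorm).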