Search (2 results, page 1 of 1)

  • author_ss:"Evens, M.W."
  • theme_ss:"Suchoberflächen"
  1. Lee, Y.-H.; Evens, M.W.: Natural language interface for an expert system (1998)
    
    Source
    Expert systems. 15(1998) no.4, pp.233-239
  2. Evens, M.W.: Natural language interface for an expert system (2002)
    
    Abstract
    The explanation facility, the ability to display its reasoning to the user, has been a key component of the expert system from the very beginning. Even though this facility may not be used very often, its presence gives users the crucial reassurance that they can explore the system's decision-making processes and make a reasoned decision for themselves about whether or not to accept the advice the system gives. Elaine Rich (5) was the first to enunciate a fundamental principle of explanation generation in expert systems: it is essential that the explanations generated be derived from the actual decision-making process used by the system, so that as that process changes, the explanations change with it. If the system relies on previously stored "canned explanations," then changes in the rules or the inference processes will leave the system providing explanations that are no longer valid. She also argues that the system can give deeper explanations if it operates off its internal reasoning process.
    From the very beginning, expert systems were thought of as vehicles for learning, particularly through the text the system provides to explain its reasoning. When William Clancey (6) set out to produce a tutoring system based on the MYCIN system, people thought this would be a quick and easy thesis, but Clancey soon realized that MYCIN's rules, written by experts for other practicing physicians, were not an appropriate way to teach diagnosis to medical students. He spent 10 years building and rebuilding the NEOMYCIN/GUIDON system into an effective tutoring system for medical students. Because of the historic connection between expert systems and tutoring systems, we add a discussion of natural language interfaces for tutoring systems at the end of this article. Dialogue issues are becoming important as hardware systems speed up and software systems become sophisticated enough to carry on an actual dialogue with the user. This is particularly true in tutoring systems that teach languages. We will conclude with a brief mention of some systems of the future that are still in the research stage.
    Pages
    pp.228-258
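
    The principle the abstract attributes to Rich — explanations must be generated from the system's actual inference steps rather than from canned text — can be sketched as follows. This is a minimal illustration of my own (all names and rules are hypothetical, not from the article): a tiny forward-chaining rule engine records which rules actually fired, and the explanation is produced from that trace, so editing the rule base automatically changes the explanation.

    ```python
    # Minimal sketch (illustrative only, not the article's system): explanations
    # are derived from the recorded inference trace, never from canned strings.
    from dataclasses import dataclass, field

    @dataclass
    class Rule:
        name: str
        premises: tuple
        conclusion: str

    @dataclass
    class Engine:
        rules: list
        facts: set = field(default_factory=set)
        trace: list = field(default_factory=list)  # rules that actually fired

        def run(self):
            """Forward-chain until no rule can add a new fact."""
            changed = True
            while changed:
                changed = False
                for rule in self.rules:
                    if (rule.conclusion not in self.facts
                            and all(p in self.facts for p in rule.premises)):
                        self.facts.add(rule.conclusion)
                        self.trace.append(rule)  # record the real decision step
                        changed = True

        def explain(self, fact):
            """Build the explanation from the trace of fired rules."""
            for rule in self.trace:
                if rule.conclusion == fact:
                    because = " and ".join(rule.premises)
                    return f"{fact} because {because} (rule {rule.name})"
            if fact in self.facts:
                return f"{fact} was given as input"
            return f"{fact} unknown"

    # Hypothetical rule base for demonstration.
    rules = [
        Rule("R1", ("fever", "rash"), "suspect-measles"),
        Rule("R2", ("suspect-measles",), "recommend-isolation"),
    ]
    eng = Engine(rules, facts={"fever", "rash"})
    eng.run()
    print(eng.explain("recommend-isolation"))
    # -> recommend-isolation because suspect-measles (rule R2)
    ```

    If rule R2 is later rewritten, the next run records the new rule in the trace and the explanation changes with it — exactly the behavior canned explanations cannot guarantee.
    
    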