<?xml version="1.0" standalone="yes"?>
<Paper uid="C86-1006">
  <Title>USER MODELS: THE PROBLEM OF DISPARITY</Title>
  <Section position="5" start_page="29" end_page="30" type="relat">
    <SectionTitle>
3. RELATED WORK
</SectionTitle>
    <Paragraph position="0"> Several research efforts have addressed problems related to plan disparity. Kaplan [1982] and McCoy [1986] investigated misconceptions about domain knowledge and proposed responses intended to remove the misconception. However, such misconceptions may not be exhibited when they first influence the information-seeker's plan construction; in such cases, disparate plans may result, and correction will entail both a response correcting the misconception and further processing to bring the system's context model back into alignment with the plan under construction by the information-seeker.</Paragraph>
    <Paragraph position="1"> Pollack [1986] is studying removal of what she terms the &quot;appropriate query assumption&quot; of previous planning systems; she proposes a richer model of planning that explicitly reasons about the information-seeker's possible beliefs and intentions. Her overall goal is to develop a better model of plan inference. She addresses the problem of queries that indicate the information-seeker's plan is inappropriate to his overall goal, and attempts to isolate the erroneous beliefs that led to the inappropriate query.</Paragraph>
    <Paragraph position="2"> This is a subclass of &quot;erroneous plans&quot;, since upon hearing the query, the system should detect that its context model no longer agrees with that of the information-seeker. However, queries deemed inappropriate by the system may signal phenomena other than inappropriate user plans. For example, the information-seeker may have shifted focus to another aspect of the overall task without successfully conveying this to the system, the information-seeker may be addressing aspects of the task outside the system's limited knowledge, or the system's context model may have been in error prior to the query.</Paragraph>
    <Paragraph position="3"> Pollack is concerned with issues that arise when the information-seeker's plan is incorrect due to a misconception. She assumes that, immediately prior to the user making the &quot;problematic&quot; query, the system's partial model of the user's plan is correct. We argue that since the system's inference mechanisms are not infallible and communication itself is imperfect, the system must contend with the possibility that its inferred model does not accurately reflect the user's plan. Previous research has failed to address this problem.</Paragraph>
  </Section>
</Paper>