<?xml version="1.0" standalone="yes"?>
<Paper uid="W94-0305">
  <Title>Discourse Planning as an Optimization Process</Title>
  <Section position="3" start_page="0" end_page="37" type="metho">
    <SectionTitle>
2 Overview of the System
</SectionTitle>
    <Paragraph position="0"> WISHFUL-II receives as input a concept to be conveyed, e.g., Wallaby, a list of aspects that must be conveyed about this concept, e.g., habitat and body parts, and a desired level of expertise the user should attain as a result of the presentation. WISHFUL-II was used to generate descriptive discourse in various technical domains, such as chemistry, high-school algebra, animal taxonomy and Lisp. It produces multi-sentence paragraphs of connected English text. The discourse planning mechanism, which is the focus of this paper, generates a set of Rhetorical Devices (RDs), where each RD is a rhetorical action, such as Assert, Negate or Instantiate, applied to a proposition. This set of RDs is optimal with respect to a given optimization criterion, e.g., conciseness or depth.</Paragraph>
    <Paragraph position="1"> The following steps are performed by WISHFUL-II.</Paragraph>
    <Paragraph position="2">  1. Content Selection - WISHFUL-II consults a model of the user's beliefs in order to determine which propositions</Paragraph>
    <Section position="1" start_page="37" end_page="37" type="sub_section">
      <SectionTitle>
7th International Generation Workshop * Kennebunkport, Maine * June 21-24, 1994
</SectionTitle>
      <Paragraph position="0"> must be presented to convey the given aspects of a concept.</Paragraph>
      <Paragraph position="1"> This step selects propositions about which the user has misconceptions, propositions that are unknown to the user, and propositions that are not believed by the user to the extent demanded by the desired level of expertise.</Paragraph>
      <Paragraph position="2"> 2. Generation of the Optimal Set of RDs - WISHFUL-II searches for a set of RDs that conveys the propositions generated in the previous step while satisfying a given optimization objective. The process of generating candidate sets of RDs considers the following factors: (1) the effect of inferences from an RD on a user's beliefs; (2) the amount of prerequisite information required by the user to understand the concepts in an RD; and (3) the amount of information to be included in referring expressions which identify the concepts in an RD.</Paragraph>
      <Paragraph position="3"> 3. Discourse Structuring - A discourse structuring mechanism extracts discourse relations and constraints from the set of RDs generated in Step 2. It then generates an ordered sequence of the RDs in this set, where the strongest relations between the RDs are represented and no constraints are violated. Where necessary, the RDs in this sequence are interleaved with conjunctive expressions that signal the relationship between them \[Zukerman and McConachy, 1993b\].</Paragraph>
      <Paragraph position="4"> 4. Generation of Anaphoric Referring Expressions - Anaphoric referring expressions are generated for RDs that refer to a concept in focus. This process follows the organization of the discourse, since the appropriate use of anaphora depends on the structure of the discourse.</Paragraph>
    </Section>
  </Section>
  <Section position="4" start_page="37" end_page="37" type="metho">
    <SectionTitle>
5. Discourse Realization - The resulting sequence of RDs
</SectionTitle>
    <Paragraph position="0"> is realized in English by means of the Functional Unification Grammar described in \[Elhadad, 1992\].</Paragraph>
  </Section>
  <Section position="5" start_page="37" end_page="41" type="metho">
    <SectionTitle>
3 Generating the Optimal Set of RDs
</SectionTitle>
    <Paragraph position="0"> The main stage of the optimization procedure consists of generating alternative sets of RDs that can convey a set of propositions. The first step in this stage consists of generating candidate RDs that can convey each proposition separately.</Paragraph>
    <Paragraph position="1"> To this effect, WISHFUL-II reasons from the propositions determined in the content selection step to the RDs that may be used to convey these propositions. This reasoning mechanism has been widely used in NLG systems, e.g., \[Moore and Swartout, 1989; Cawsey, 1990\].</Paragraph>
    <Paragraph position="2"> The process of generating a set of RDs that can convey a set of propositions is not a straightforward extension of the process of generating candidate RDs that can convey each proposition separately. This is due to the following reasons: (1) an inference from an RD generated to convey a proposition pi may undermine the effect of an RD generated to convey a proposition pj; and (2) an RD generated to convey a proposition pi may be made obsolete by an RD which was generated to convey another proposition, but from which the user can infer pi. Further, it is not sufficient to propose a single set of RDs that can convey a set of propositions, because a set of RDs that initially appears to be promising may require a large number of RDs to convey its prerequisite information or to identify the concepts mentioned in it. Thus, after generating the RDs that can convey each of the intended propositions separately, the optimization procedure must consider concurrently the following inter-related factors in order to generate candidate sets of RDs that can convey the intended propositions: (1) the effect of the RDs in a set on a user's beliefs, (2) the prerequisite information that the user must know in order to understand these RDs, and (3) the referring expressions required to identify the concepts mentioned in these RDs.</Paragraph>
    <Paragraph position="3"> Owing to the interactions between the RDs in a set, the problems of generating the most concise set of RDs and generating the shallowest set of RDs are NP-hard 1. Since this level of complexity is likely to be maintained for other optimization objectives, we have chosen a weak search procedure for the implementation of the optimization process.</Paragraph>
    <Paragraph position="4"> In the following sections, we describe the optimization process as an application of the Graphsearch algorithm \[Nilsson, 1980\], and discuss the implementation of the main steps of this algorithm.</Paragraph>
    <Section position="1" start_page="37" end_page="38" type="sub_section">
      <SectionTitle>
3.1 The Basic Optimization Procedure
</SectionTitle>
      <Paragraph position="0"> Our optimization procedure, Optimize-RDs, receives as input the set of propositions generated in the content selection step of WISHFUL-II. It implements a simplified version of the Graphsearch algorithm \[Nilsson, 1980\] to generate a set of RDs that conveys these propositions and satisfies a given optimization criterion. The discourse planning considerations are incorporated during the expansion stage and the evaluation stage of Graphsearch.</Paragraph>
      <Paragraph position="1"> The expansion stage of our procedure activates algorithm Expand-sets-of-RDs, which generates alternative minimally sufficient sets of RDs that can convey a set of intended propositions (Step 5 in procedure Optimize-RDs). A set of RDs is minimally sufficient if the removal of any RD causes the set to stop conveying the intended information. Note that a minimally sufficient set of RDs is not necessarily a minimal set of RDs. For example, both of the alternatives in Table 1 are composed of minimally sufficient sets of RDs.</Paragraph>
      <Paragraph position="2"> In this stage, the procedure also determines which prerequisite propositions must be known to the user to enable a set of RDs to convey the intended propositions, and which referring expressions are required to identify the concepts in a set of RDs. During the evaluation stage, the procedure ranks each set of RDs in relation to the other candidates, and prunes redundant RDs (Step 7). Both the ranking process and the pruning process consider the extent to which a set of RDs is likely to satisfy a given optimization criterion.</Paragraph>
      <Paragraph position="3">  Algorithm Optimize-RDs({propositions})  1. Create a search graph G consisting solely of a start node s which contains {propositions}. Put s in a list called OPEN. 2. LOOP: If OPEN is empty, exit with failure.</Paragraph>
      <Paragraph position="4"> 3. Select the first node in OPEN, remove it from OPEN. Call this node n.</Paragraph>
      <Paragraph position="5"> 4. If n does not require prerequisite information or referring expressions, then exit successfully (n is a goal node).</Paragraph>
      <Paragraph position="6"> 5. Expansion: M ← Expand-sets-of-RDs(n).</Paragraph>
      <Paragraph position="7">  Install the sets of RDs in M as successors of n in G.</Paragraph>
      <Paragraph position="8"> 1 Finding a concise set of RDs that conveys a set of propositions reduces to the Minimum Covering problem, and finding a shallow set of RDs that conveys a set of propositions reduces to Satisfiability \[Garey and Johnson, 1979\].</Paragraph>
      <Paragraph position="9"> 6. Add the successors of node n to OPEN.</Paragraph>
      <Paragraph position="10"> 7. Evaluation: Reorder the nodes in OPEN and prune redundant nodes according to the given optimization criterion. 8. Go LOOP.</Paragraph>
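      <Paragraph> The control structure above can be rendered as a short Python sketch. The expansion, goal test and evaluation steps are supplied as callables, since their implementations are described in the following sections; the dictionary node representation is an illustrative assumption, not the paper's own data structure.

```python
def optimize_rds(propositions, expand, is_goal, rank_and_prune):
    """Simplified Graphsearch over sets of RDs (after Nilsson, 1980).

    expand(node)         -> successor nodes (Expand-sets-of-RDs, Step 5)
    is_goal(node)        -> True if the node requires no prerequisite
                            information or referring expressions (Step 4)
    rank_and_prune(open) -> reordered, pruned OPEN list (Step 7)
    """
    start = {"propositions": tuple(propositions), "rds": ()}
    open_list = [start]                        # Step 1
    while open_list:                           # Step 2: fail when OPEN empties
        node = open_list.pop(0)                # Step 3: take the first node
        if is_goal(node):                      # Step 4: goal test
            return node
        open_list.extend(expand(node))         # Steps 5-6: expand, add to OPEN
        open_list = rank_and_prune(open_list)  # Step 7: evaluation
    return None                                # exit with failure
```
</Paragraph>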
    </Section>
    <Section position="2" start_page="38" end_page="41" type="sub_section">
      <SectionTitle>
3.2 Expanding Sets of RDs
</SectionTitle>
      <Paragraph position="0"> Algorithm Expand-sets-of-RDs receives as input a node n to be expanded, and returns all the minimally sufficient sets of RDs that convey the set of propositions in this node, accompanied by their respective prerequisite propositions and referring expressions. We compute all the minimally sufficient sets of RDs, rather than just the minimal set of RDs, because a set of RDs that initially appears to be promising may require a large number of RDs in order to convey its prerequisite information or to identify the concepts in it.</Paragraph>
      <Paragraph position="1">  Algorithm Expand-sets-of-RDs(n) 1. Determine RDs that increase a user's belief in each proposition in node n. (Not all the RDs generated in this step are capable of conveying an intended proposition by themselves, but they may be able to do so in combination with other RDs.) (Section 3.2.1) 2. Use these RDs to construct minimally sufficient sets of RDs that convey all the propositions in n jointly. Put these sets of RDs in {MRD}. (Section 3.2.2) 3. Determine the prerequisite propositions required by each set of RDs in {MRD} so that the user can understand it.</Paragraph>
      <Paragraph position="2"> (Section 3.2.3) 4. Determine referring expressions which evoke the concepts  in each set of RDs in {MRD}. (Section 3.2.4) The output of Expand-sets-of-RDs takes the form of a set of RD-Graphs. An RD-Graph is a directed graph that contains the following components: (1) the set of propositions to be conveyed (p1,...,pn in Figure 1); (2) a minimally sufficient set of RDs (RD1,...,RDm); (3) the effect of the inferences from the RDs in this set on the intended propositions, and possibly on other (unintended) propositions (labelled wij); (4) the prerequisite propositions that enable these RDs to succeed (p'1,...,p'k); (5) the relations between the prerequisite propositions and the main RDs (in thick lines); (6) the sets of RDs that evoke concepts in the main RDs ({RDm+1},...,{RDm+l}); and (7) the relations between the evocative sets of RDs and the main RDs (labelled vm+i,j). The main set of RDs and the relations between the RDs in this set and the propositions to be conveyed are generated in Step 2 above. The weight wij contains information about the effect of RDi on the user's belief in proposition pj. The prerequisite propositions are generated in Step 3, and the evocative sets of RDs and their corresponding links are produced in Step 4.</Paragraph>
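      <Paragraph> The seven components of an RD-Graph can be collected into a simple record; a minimal Python sketch, where the field names and dictionary shapes are illustrative assumptions rather than the paper's notation:

```python
def make_rd_graph(propositions, main_rds, effects,
                  prerequisites, prereq_links,
                  evocative_sets, evocative_links):
    """Bundle the seven RD-Graph components (cf. Figure 1).

    effects:         {(rd, proposition): inference weight wij}
    prereq_links:    {prerequisite proposition: [main RDs it enables]}
    evocative_links: {(evocative set index, rd): weight vij}
    """
    return {
        "propositions": list(propositions),       # (1) p1..pn
        "main_rds": list(main_rds),               # (2) minimally sufficient set
        "effects": dict(effects),                 # (3) inference weights
        "prerequisites": list(prerequisites),     # (4) enabling propositions
        "prereq_links": dict(prereq_links),       # (5) prerequisite relations
        "evocative_sets": list(evocative_sets),   # (6) referring RD sets
        "evocative_links": dict(evocative_links), # (7) evocative relations
    }
```
</Paragraph>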
      <Paragraph position="3"> 3.2.1 Determining RDs Given a list of propositions to be conveyed, {p}, in this step we propose RDs that can increase a user's belief in each of these propositions. To this effect, for each proposition pi ∈ {p} we first consider the following RDs: Assertion (A), Negation (N), Instantiation (I) and Simile (S),</Paragraph>
      <Paragraph position="5"> where different Instantiations and Similes may be generated for each proposition. For example, the proposition \[Bracket-Simplification step-1 +/-\] may be instantiated with respect to Numbers, e.g., 2(3 + 5) = 2 x 8, and to Like Terms, e.g., 2(3x + 5x) = 2 x 8x. Those RDs that increase a user's belief in pi are then put in a list called RD-list(pi). Next, for each proposition pi, we consider the RDs in RD-list(pi). If an inference from any of these RDs increases a user's belief in a proposition pj ≠ pi, this RD is added to RD-list(pj). In addition, if any of the generated RDs yields an incorrect belief with respect to a proposition that is not in {p}, this proposition is added to {p}, and the process of determining RDs is repeated for this proposition in conjunction with the other propositions in {p}. This is necessary because RDs that are used to convey this new proposition could affect other propositions previously in {p} and vice versa. This process stops when no incorrect beliefs are inferred.</Paragraph>
      <Paragraph position="6"> This process is implemented in algorithm Determine-RDs, which receives three input parameters: (1) the propositions for which RDs were generated in the previous recursive call to Determine-RDs, (2) the propositions to be considered in the current activation of Determine-RDs, and (3) the RD-list generated in the previous recursive calls. Its initial activation is Determine-RDs(nil, {p}, nil), and its output is RD-list.</Paragraph>
      <Paragraph position="7">  stantiation is performed with respect to an instance I, and the Simile is performed between an object Oi, which is the subject of proposition pi, and another object O'.</Paragraph>
      <Paragraph position="8"> (Note that several instances I and objects O may be used to generate different Instantiations and Similes, respectively.) null  (b) Assign to RD-list(pi) the RDs that increase the user's belief in pl.</Paragraph>
      <Paragraph position="9"> 2. Forward reasoning: (a) For each proposition pi ∈ {newp} determine whether any RD in RD-list(pi) supports other propositions in {oldp} ∪ {newp}. If so, add this RD to the RD-lists of these propositions.</Paragraph>
      <Paragraph position="10"> 2 Other RDs that may be generated involve subclass or superclass concepts of the target concept in an intended proposition. However, the generation of these RDs has not been incorporated into WISHFUL-II yet.</Paragraph>
      <Paragraph position="11"> (b) For each proposition pi ∈ {oldp} determine whether any RD in RD-list(pi) supports propositions in {newp}. If so, add this RD to the RD-lists of these propositions.</Paragraph>
      <Paragraph position="12"> (c) Append the propositions in {newp} to {oldp}.</Paragraph>
      <Paragraph position="13"> (d) If any RD used to convey a proposition pi ∈ {newp} yields incorrect beliefs, then i. Assign to {newp} the propositions which contradict these beliefs.</Paragraph>
      <Paragraph position="14"> ii. Assign to RD-list the result returned by Determine-RDs({oldp},{newp},RD-list). 3. Return(RD-list)  To illustrate this process, let us consider a situation where we want to convey to the user the following propositions: \[Wallaby hop\] and \[Kangaroo hop\]. In the first stage, our procedure generates two RDs that can convey the proposition \[Wallaby hop\]: Assert\[Wallaby hop\] and Instantiate\[Wallaby hop\], where the Instantiation is performed with respect to a wallaby called Wally that is known to the user. Our procedure generates only the RD Assert\[Kangaroo hop\] to convey the proposition \[Kangaroo hop\] (an Instantiation is not generated since the user is not familiar with any particular kangaroos). In the forward reasoning stage, the inferences from these RDs are considered. If the user knows that wallabies are similar to kangaroos, the RD generated to convey the proposition \[Kangaroo hop\] can increase the user's belief in the proposition \[Wallaby hop\], and is therefore added to the RD-list of \[Wallaby hop\]. Similarly, the RDs generated to convey \[Wallaby hop\] are added to the RD-list of \[Kangaroo hop\]. From the above Assertions the user may also infer incorrectly that wombats hop. In this case, a proposition which negates this incorrect conclusion, i.e., \[Wombat ¬hop\], is assigned to {newp}. The RDs that can convey this proposition in our example are Negate\[Wombat hop\] and Instantiate\[Wombat ¬hop\], where the Instantiation is performed with respect to a wombat called Wimpy that is known to the user. These RDs in turn may yield the incorrect inferences that neither wallabies nor kangaroos hop, which contradict the intended propositions. However, since the propositions that contradict these inferences already exist in {oldp}, the process stops.  3.2.2 Constructing Sets of RDs In this step, we generate all the minimally sufficient sets of RDs that can convey jointly all the intended propositions. 
For each proposition pi, we first determine whether RDs that were generated to convey other propositions can decrease a user's belief in pi. Next, for each RD in RD-list(pi), we determine whether it can overcome the detrimental effect of these 'negative' RDs. This step identifies combinations of RDs that cannot succeed in conveying the intended propositions. It results in the following labelling of the RDs in RD-list: RDs that can overcome all negative effects are labelled with the symbol \[all\] (the only RDs that may be labelled in this manner are Assertions and Negations); RDs that cannot convey an intended proposition by themselves are labelled with the symbol \[-\]; RDs that can convey an intended proposition, but cannot overcome any negative effects are labelled with \[none\]; and the remaining RDs are labelled with the negative RDs they can overcome.</Paragraph>
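      <Paragraph> The labelling scheme above can be sketched in Python. The two predicates conveys(rd, p) and overcomes(rd, negs) are assumed oracles standing in for the system's belief reasoning; only the labelling logic itself is shown.

```python
from itertools import combinations

def label_rd(rd, proposition, neg_rds, conveys, overcomes):
    """Label an RD against the negative RDs of a proposition.

    Returns "[-]" if rd cannot convey the proposition by itself,
    "[all]" if it overcomes every combination of negative RDs,
    "[none]" if it overcomes none, and otherwise the list of
    negative-RD combinations it can overcome.
    """
    # all non-empty combinations of the negative RDs must be considered
    combos = [list(c) for r in range(1, len(neg_rds) + 1)
              for c in combinations(neg_rds, r)]
    if not conveys(rd, proposition):
        return "[-]"
    if all(overcomes(rd, c) for c in combos):
        return "[all]"
    beaten = [c for c in combos if overcomes(rd, c)]
    return beaten if beaten else "[none]"
```
</Paragraph>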
      <Paragraph position="15"> We then use a search procedure to generate all the sets of RDs which consist of one RD from each RD-list. The sets of RDs that convey all the intended propositions are then stored in a list called SUCCEED; and the sets of RDs that fail to convey one or more propositions are stored in a list called FAILED, together with the proposition(s) that failed to be conveyed. Additional minimally sufficient sets of RDs are then generated from FAILED as follows: we select a proposition pi that was not conveyed, and create pairs of RDs composed of the RD that failed to convey pi and each of the other RDs in RD-list(pi) that is not labelled \[all\] (the RDs that are labelled \[all\] can convey pi by themselves, and therefore there is no need to combine them with other RDs). Each pair of RDs inherits the negative RDs that can be overcome by its parents, and may be able to overcome additional negative RDs which caused its parents to fail separately. For each pair of RDs, a new set of RDs is created by replacing the RD which failed to convey pi with this pair of RDs. The search is then continued for each of these new sets of RDs until a minimally sufficient set of RDs is generated or failure occurs again. In this case, the process of generating pairs of RDs is repeated, and the search is resumed. If a pair of RDs fails, then it forms the basis for triplets, and so on. The search stops when the RD-list of a proposition which failed to be conveyed contains no RDs with which the failed RDs (or RD-tuples) can be combined.</Paragraph>
      <Paragraph position="16">  Algorithm Construct-sets-of-RDs({p}, RD-list) 1. Initialize two lists, SUCCEED and FAILED, to empty.</Paragraph>
      <Paragraph position="17"> 2. Determine-RDs(nil, {p}, nil).</Paragraph>
      <Paragraph position="18"> 3. For each proposition pi ∈ {p} (a) Put in NegRDs(pi) all the RDs in RD-list that have a negative effect on pi.</Paragraph>
      <Paragraph position="19"> (b) Label each RD in RD-list(pi) according to the RDs in NegRDs(pi) whose effect it can overcome. If there are several RDs in NegRDs(pi) then all the combinations of these RDs must be considered.</Paragraph>
      <Paragraph position="20"> 4. Exhaustively generate all the combinations of RDs consisting of one RD from the RD-list of each proposition.</Paragraph>
      <Paragraph position="21"> Consider the combined effect of several RDs to determine whether a set of RDs completely conveys a set of propositions.</Paragraph>
      <Paragraph position="22"> 6. Append the failed combinations of RDs to FAILED together with the reason for the failure, i.e., the RDs that failed and the propositions that were not conveyed.</Paragraph>
      <Paragraph position="23"> 7. If FAILED is empty, then exit.</Paragraph>
      <Paragraph position="24"> 8. Assign to CURRD the first set of RDs in FAILED, and remove it from FAILED.</Paragraph>
      <Paragraph position="25"> 9. Select from CURRD a proposition pi that was not conveyed, and generate successors of CURRD as follows: (a) Generate children of the RD that failed to convey pi by combining it with other RDs in RD-list(pi) that are not labelled \[all\].</Paragraph>
      <Paragraph position="26"> (b) If the failed RD has no children in pi, then remove from FAILED all the sets of RDs which failed when this RD tried to convey pi, and go to Step 7.</Paragraph>
      <Paragraph position="27">  (c) Attach to each combination of RDs the list of negative RDs it can overcome.</Paragraph>
      <Paragraph position="28"> (d) Create sets of RDs such that in each set the failed RD is replaced with one of its children.</Paragraph>
      <Paragraph position="29"> 10. Go to Step 5.</Paragraph>
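      <Paragraph> The exhaustive combination and subsumption-pruning steps (Steps 4-5) can be sketched as follows. The conveys_all oracle and the dictionary shape of RD-list are assumptions; the continuation of the search from FAILED with RD pairs and tuples (Steps 7-10) is omitted for brevity.

```python
from itertools import product

def candidate_sets(rd_lists, conveys_all):
    """Combine one RD from each proposition's RD-list and keep the
    successful combinations, pruning sets that subsume other successes.

    rd_lists:    {proposition: [candidate RDs]}
    conveys_all: oracle; True if a set of RDs conveys every proposition
    """
    succeed, failed = [], []
    for combo in product(*rd_lists.values()):
        rd_set = frozenset(combo)   # a repeated RD is mentioned only once
        if conveys_all(rd_set):
            succeed.append(rd_set)
        else:
            failed.append(rd_set)
    # Step 5: remove sets that subsume (strictly contain) other successes
    succeed = [s for s in succeed if not any(t < s for t in succeed)]
    return succeed, failed
```
</Paragraph>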
      <Paragraph position="30">  To illustrate this process, let us reconsider the example discussed in Section 3.2.1. Table 2 contains the RD-lists for the propositions in this example, where each RD is labelled according to the RDs whose negative effect it can overcome. For instance, Negate(p3) can overcome the combined negative effect of Assert(p1) and Instantiate(p1), and also the effect of Assert(p2). However, it cannot overcome the combined effect of Assert(p1) and Assert(p2), or Assert(p2) and Instantiate(p1). Instantiate(p1) can convey proposition p1, but cannot overcome any negative effects. Assert(p2) contributes to the belief in p1 but cannot convey it alone. Figure 2 contains part of the search tree generated in Step 4 of algorithm Construct-sets-of-RDs. Each path in this tree contains one RD from each row in Table 2. Successful paths are drawn with thick lines and are marked S. Failed paths are marked F accompanied by the propositions which were not conveyed by the RDs in these paths. An RD that increases a user's belief in more than one proposition may appear in a path more than once. The repeated instances of such an RD appear in brackets, e.g., {Assert(p1)}, indicating that the RD will be mentioned only once. In the successful path to the left, Assert(p1) can overcome all negative effects to convey p1. In addition, it can overcome the negative effect of</Paragraph>
      <Paragraph position="32"> In the successful path to the right, Assert(p1) together with Instantiate(p1) overcome the negative effect of Negate(p3) to convey p2, even though neither could do so by itself; and Negate(p3) can overcome the joint effect of Assert(p1) and Instantiate(p1).</Paragraph>
      <Paragraph position="33"> Table 3 contains the successful minimally sufficient sets of RDs generated by this search and the failed sets of RDs accompanied by the propositions that were not conveyed.</Paragraph>
      <Paragraph position="34"> In Step 9 of Construct-sets-of-RDs, the RDs that failed to convey a proposition are combined with other RDs that can increase the user's belief in this proposition. For instance, Negate(p3) is combined with Instantiate(¬p3) for all the paths where ¬p3 failed to be conveyed, and the search is continued. Our procedure does not generate children from repeated RDs that failed to convey a proposition, since this would yield already existing combinations of RDs. Table 4 contains the minimally sufficient sets of RDs returned by algorithm Construct-sets-of-RDs. Set 5-6 is obtained by complementing Set 5 in Table 3 with the RD Instantiate(¬p3), and also by complementing Set 6 with the RD Negate(p3).</Paragraph>
      <Paragraph position="35"> Additional successful sets of RDs are generated by appending complementary RDs to the failed sets of RDs in Table 3. However, these sets subsume Sets 1, 2 and 5-6, and hence are not minimally sufficient. For example, when Set 4 in Table 3 is complemented with Instantiate(p1), it yields a set of RDs that is equal to Set 2. This set is removed in Step 5 of Construct-sets-of-RDs.</Paragraph>
      <Paragraph position="36"> This process ensures that only minimally sufficient sets of RDs are generated, because it generates RD-tuples only from the unsuccessful RDs in the RD-list of a proposition, and it prunes sets of RDs that subsume other sets of RDs. In addition, this process ensures that all the minimally sufficient sets of RDs are generated, because it considers all the RD-tuples resulting from the unsuccessful RDs in the RD-list of a proposition.</Paragraph>
      <Paragraph position="38"/>
    </Section>
    <Section position="3" start_page="41" end_page="41" type="sub_section">
      <SectionTitle>
3.2.3 Determining Prerequisite Propositions
</SectionTitle>
      <Paragraph position="0"> The prerequisite propositions to be conveyed depend on the user's expertise with respect to the concepts mentioned in a set of RDs, and on the context where these concepts are mentioned. The context influences both the aspects of these concepts that must be understood by the user and the extent to which these aspects must be understood.</Paragraph>
      <Paragraph position="1"> The process of determining the relevant aspects of a concept and the required level of expertise is described in \[Zukerman and McConachy, 1993a\]. The relevant aspects of a concept are determined by considering the predicates of the propositions where a concept is mentioned, and the role of the concept in these propositions. For example, in order to understand the RD Assert\[Marsupial has-part pouch\], the user must know the aspects type and structure of a pouch, i.e., what it is and what it looks like. The extent to which a user must know the selected aspects of a concept depends on the relevance of this concept to the original propositions to be conveyed, i.e., the system demands a high level of expertise with respect to the more relevant concepts, and a lower level of expertise with respect to the less relevant ones.</Paragraph>
      <Paragraph position="2"> After the relevant aspects and required level of expertise of each concept have been determined, WISHFUL-II applies the content selection step described in Section 2 to determine the prerequisite propositions of each concept. WISHFUL-II then merges into a single set the prerequisite propositions generated for individual concepts. This merger is executed because some prerequisite propositions of two or more concepts may be conveyed by a single RD. A special case of this happens when two or more concepts have common prerequisite propositions. For example, consider the situation depicted in Figure 3, where prerequisite information for the set of RDs {RD1,RD2} is being conveyed. RD1 requires the prerequisite propositions {p1,p2}, while RD2 requires the prerequisite propositions {p2,p3,p4}. If we considered separately the prerequisites of these RDs, we would generate RD3 to convey {p1,p2}, and {RD4,RD5} to convey {p2,p3,p4}. This would result in a total of three RDs. However, by considering jointly all the prerequisite propositions of {RD1,RD2}, we will require two RDs only, namely {RD3,RD5}.</Paragraph>
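      <Paragraph> The benefit of covering the merged prerequisites jointly can be illustrated with a small greedy-covering sketch. The paper does not prescribe a covering algorithm, so the greedy strategy and the coverage sets below are illustrative assumptions consistent with the Figure 3 example.

```python
def greedy_cover(targets, rd_coverage):
    """Greedily pick RDs until every target proposition is conveyed.

    rd_coverage: {rd: set of prerequisite propositions it conveys}
    """
    remaining, chosen = set(targets), []
    while remaining:
        # pick the RD conveying the most still-unconveyed propositions
        rd = max(rd_coverage, key=lambda r: len(rd_coverage[r] & remaining))
        if not rd_coverage[rd] & remaining:
            break                      # nothing left can help
        chosen.append(rd)
        remaining -= rd_coverage[rd]
    return chosen

# Assumed coverage, consistent with Figure 3: RD3 conveys {p1,p2},
# RD4 conveys {p2}, RD5 conveys {p3,p4}.
coverage = {"RD3": {"p1", "p2"}, "RD4": {"p2"}, "RD5": {"p3", "p4"}}
```

Covering the merged prerequisite set {p1,p2,p3,p4} of {RD1,RD2} jointly yields the two RDs {RD3,RD5}, whereas covering each RD's prerequisites in isolation can use three.</Paragraph>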
      <Paragraph position="3"> 3.2.4 Evoking the Concepts in a Set of RDs RDs that convey referring information differ from RDs that convey prerequisite propositions in that the former identify a concept by means of information known to the user, while the latter convey information that the user does not know about a concept. Further, the process of generating referring information has the flexibility of selecting the propositions that can identify a concept uniquely, while the propositions that convey prerequisite information are dictated by the context and by the user's expertise.</Paragraph>
      <Paragraph position="4"> In order to generate referring expressions for the concepts mentioned in a set of RDs, we propose for each concept a list of candidate lexical items that can be used to refer to it. If there is a lexical item that identifies each concept uniquely and is known to the user, the evocation process is finished.</Paragraph>
      <Paragraph position="5"> However, if there are concepts that are not identified uniquely by any of their candidate lexical items, then these lexical items are complemented with additional RDs that help them identify the intended concepts. This task is performed by iteratively selecting propositions that identify an intended concept until this concept is singled out, and generating RDs that convey these propositions. This algorithm differs from the procedure described in \[Dale, 1990\] in that we generate several alternative sets of complementing RDs in order to avoid dead-end situations where the only identifying information that is generated for a set of concepts is circular. The evocation process then selects the most concise non-circular combination of referring expressions that identifies all the concepts in a set of RDs. For example, Table 5 illustrates candidate referring expressions generated for the concepts Like-Terms and Algebraic-Terms. Each referring expression is composed of a lexical item and a complement 3. The non-circular alternatives in this example comprise three of the complement pairs in Table 5.</Paragraph>
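      <Paragraph> The selection of a concise non-circular combination of referring expressions can be sketched as follows. The data shapes are assumptions: each candidate is a lexical item paired with the set of other concepts its complementing RDs rely on, and conciseness is approximated by the total number of reliances.

```python
from itertools import product

def pick_referring(candidates):
    """Select one referring expression per concept, avoiding circularity.

    candidates: {concept: [(lexical_item, frozenset of concepts the
                            complementing RDs rely on)]}
    Returns the non-circular combination with the fewest total
    reliances, or None if every combination is circular.
    """
    def acyclic(deps):
        # Kahn-style check: repeatedly resolve concepts whose reliances
        # are themselves resolved; anything left over lies on a cycle.
        resolved = set()
        while True:
            ready = [c for c, r in deps.items()
                     if c not in resolved and r & set(deps) <= resolved]
            if not ready:
                return len(resolved) == len(deps)
            resolved.update(ready)

    combos = [dict(zip(candidates, combo))
              for combo in product(*candidates.values())]
    combos.sort(key=lambda ch: sum(len(r) for _, r in ch.values()))
    for choice in combos:
        if acyclic({c: r for c, (_, r) in choice.items()}):
            return {c: item for c, (item, _) in choice.items()}
    return None
```
</Paragraph>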
    </Section>
    <Section position="4" start_page="41" end_page="41" type="sub_section">
      <SectionTitle>
3.3 Two Optimization Criteria
</SectionTitle>
      <Paragraph position="0"> As indicated in Section 3.1, the optimization criterion determines the manner in which the nodes in OPEN are ranked and pruned. In our implementation we have tried two optimization criteria: (1) conciseness and (2) depth.</Paragraph>
      <Paragraph position="1"> 3.3.1 Optimizing the Depth of the Generated RDs When optimizing the depth of a set of RDs, the nodes in OPEN are pruned according to the following rule:</Paragraph>
    </Section>
  </Section>
  <Section position="6" start_page="41" end_page="42" type="metho">
    <SectionTitle>
3 A referring expression may contain a null lexical item, i.e., complementing information only.
</SectionTitle>
    <Paragraph position="0"> However, at present this option is not generated by WISHFUL-II.</Paragraph>
    <Paragraph position="1">  The weight of a node reflects the number of RDs in this node and their type. The total weight of a node is the sum of the weights of the nodes in the path from the root of the search tree to this node. All the RDs have a weight of 1, except an Instantiation of a proposition p that accompanies an Assertion of p or a Negation of ¬p. Such an Instantiation has a weight of 1/2, because it does not contain new information; rather, it is a continuation of the idea presented in the Assertion or the Negation. For example, the Instantiation in Set 1 in Table 6 has a weight of 1, because the instantiated proposition is different from the asserted proposition. In contrast, in Set 2, the weight of the Instantiation is 1/2 because it instantiates the asserted proposition.</Paragraph>
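    <Paragraph> The weighting rule above can be sketched in Python. The encoding of RDs as (action, proposition) pairs, with "~p" denoting the negation of p, is a hypothetical representation for illustration.

```python
def negate(p):
    """Hypothetical proposition encoding: "~p" is the negation of "p"."""
    return p[1:] if p.startswith("~") else "~" + p

def node_weight(rds):
    """Weight of a set of RDs: every RD weighs 1, except an
    Instantiation of p accompanying Assert(p) or Negate(~p), which
    weighs 1/2 (it continues an idea rather than adding a new one)."""
    w = 0.0
    for action, p in rds:
        companion = ("Assert", p) in rds or ("Negate", negate(p)) in rds
        w += 0.5 if action == "Instantiate" and companion else 1.0
    return w
```
</Paragraph>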
    <Paragraph position="2"> The above rule is also applied after Step 4 of algorithm Expand-sets-of-RDs to prune the list of minimally sufficient sets of RDs (Section 3.2). It removes a node if its prerequisites subsume those of another node. It considers the number of referring expressions of a node only when the same prerequisite propositions are required by two nodes, and considers the total weight of a node only when two nodes have the same prerequisite propositions and the same number of referring expressions. This rule compares only nodes at the same depth, because even if the prerequisites of a node at level i subsume the prerequisites of a node at level i + 1, the node at level i may lead to discourse that has depth i + 1, while the node at level i + 1 can lead to discourse of depth i + 2 at best.</Paragraph>
    <Paragraph position="3"> To illustrate the pruning process let us reconsider the minimally sufficient sets of RDs in Table 4, assuming that the prerequisite propositions required by these sets are as shown in Table 6 4 . Here, the pruning rule removes Set 2, since its prerequisite propositions subsume those of Set 1.</Paragraph>
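The pruning step can be sketched as below. The depths, node names, and prerequisite propositions are invented to reproduce the outcome described in the text; in particular, a node that still needs prerequisite propositions is assumed to lead to deeper discourse, so Set 5-6 sits at a shallower level than Sets 1 and 2 and is never compared against them.

```python
# Illustrative node records; only Set 2 should be pruned, since its
# prerequisites subsume Set 1's and both sit at the same depth.
NODES = {
    "Set 1":   {"depth": 2, "prereqs": {"p11"}},
    "Set 2":   {"depth": 2, "prereqs": {"p11", "p21"}},
    "Set 5-6": {"depth": 1, "prereqs": set()},
}

def prune(nodes):
    kept = dict(nodes)
    for a, na in nodes.items():
        for b, nb in nodes.items():
            # Only nodes at the same depth are compared; a node is removed
            # when its prerequisites properly subsume another node's.
            if a != b and na["depth"] == nb["depth"] and na["prereqs"] > nb["prereqs"]:
                kept.pop(a, None)
    return kept
```

Running `prune(NODES)` removes Set 2 and leaves Set 1 and Set 5-6, as in the text's example.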
    <Paragraph position="4"> The nodes remaining in OPEN are ordered as follows: 1. In increasing order of their depth, so that we expand the shallower nodes first during the optimization process.</Paragraph>
    <Paragraph position="5"> 2. In increasing order of the number of prerequisite propositions they require, so that the nodes that contain the sets of RDs with the fewest prerequisites are preferred among the nodes at the same level.</Paragraph>
    <Paragraph position="6"> 3. In increasing order of the number of referring expressions they have, so that the nodes with the fewest referring expressions are preferred among the nodes with the same number of prerequisite propositions.</Paragraph>
    <Paragraph position="7"> 4. In increasing order of their total weight, so that the most concise set of RDs is preferred when all else is equal.</Paragraph>
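The four ordering criteria amount to a lexicographic sort key. The field names and the concrete values below are assumptions chosen to be consistent with the example that follows in the text (n5-6 requires no prerequisite propositions, so it is ranked ahead of n1).

```python
def depth_ordering_key(node):
    # Lexicographic key: shallower first, then fewest prerequisite
    # propositions, then fewest referring expressions, then lowest
    # total weight.
    return (node["depth"],
            len(node["prereqs"]),
            node["num_ref_exprs"],
            node["total_weight"])

# Invented values for the two surviving nodes.
OPEN = [
    {"name": "n1",   "depth": 2, "prereqs": {"p11"},
     "num_ref_exprs": 2, "total_weight": 2.0},
    {"name": "n5-6", "depth": 1, "prereqs": set(),
     "num_ref_exprs": 2, "total_weight": 2.5},
]
OPEN.sort(key=depth_ordering_key)
```

Because Python compares tuples element by element, the four criteria are applied in exactly the stated priority order.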
  </Section>
  <Section position="7" start_page="42" end_page="42" type="metho">
    <SectionTitle>
4 The first coefficient of each prerequisite proposition indicates
</SectionTitle>
    <Paragraph position="0"> the RD for which it is required, e.g., p11 is a prerequisite of Assert(p1).</Paragraph>
    <Paragraph position="1"> To illustrate this process let us consider the minimally sufficient sets of RDs that remain after pruning, namely Set 1 and Set 5-6, and assign them to nodes n1 and n5-6 respectively. Since Set 5-6 has the fewest prerequisite propositions, n5-6 will precede n1 in OPEN, and will be the next node to be expanded by algorithm Optimize-RDs (Section 3.1). If upon expansion of n5-6 we find that there is a minimally sufficient set of RDs that conveys propositions {p21, p51} and requires no prerequisite information, then the node which contains this set of RDs is a goal node, and the search is finished.</Paragraph>
    <Paragraph position="2"> 3.3.2 Optimizing the Number of Generated RDs
When optimizing the total number of RDs to be presented, the following rule is used to prune the nodes in OPEN:</Paragraph>
    <Paragraph position="4"> IF the prerequisite propositions of a node ni subsume those of a node nj, AND the total weight of ni is greater than or equal to the total weight of nj, THEN remove ni.</Paragraph>
    <Paragraph position="5"> As in depth optimization, this rule is also applied after Step 4 of algorithm Expand-sets-of-RDs to prune the list of minimally sufficient sets of RDs.</Paragraph>
    <Paragraph position="6"> The nodes remaining in OPEN are sorted in increasing order of their total weight.</Paragraph>
    <Paragraph position="7"> To illustrate this process let us consider once more the minimally sufficient sets of RDs in Table 6. Since the prerequisite propositions of Set 2 subsume those of Set 1, and the total weight of Set 2 is higher than that of Set 1, Set 2 is removed in the pruning stage. The ordering of the remaining nodes in OPEN is different from the ordering obtained for the depth optimization, i.e., n1 precedes n5-6 in OPEN, since the total weight of Set 1 is less than the total weight of Set 5-6.</Paragraph>
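Under the conciseness criterion the ordering collapses to a single key, the total weight. The weights below are invented to reproduce the ordering described in the text.

```python
# With conciseness as the criterion, OPEN is ordered by total weight only,
# so the lighter node n1 now precedes n5-6 (the reverse of the depth case).
open_nodes = [
    {"name": "n5-6", "total_weight": 2.5},
    {"name": "n1",   "total_weight": 2.0},
]
open_nodes.sort(key=lambda n: n["total_weight"])
```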
  </Section>
</Paper>