<?xml version="1.0" standalone="yes"?>
<Paper uid="P92-1002">
<Title>AN ALGORITHM FOR VP ELLIPSIS</Title>
<Section position="15" start_page="11" end_page="304" type="evalu">
<SectionTitle> RESULTS </SectionTitle>
<Paragraph position="0"> The algorithm selected the correct antecedent in 285 cases, or 94% of the total. For comparison purposes, I present results of an alternative strategy, namely a simple linear scan of the preceding text. In this strategy, the first verb that is encountered is taken to be the head of the antecedent VP.</Paragraph>
<Paragraph position="1"> The results of the algorithm and the &quot;Linear Scan&quot; approach are displayed in the following table. The algorithm performs considerably better than Linear Scan. Much of the improvement is due to &quot;impossible antecedents&quot;, which are selected by the Linear Scan approach because they are closest to the VPE. A frequent case of this involves containing antecedents, i.e., antecedents that contain the VPE itself, which are ruled out by the algorithm. Another case distinguishing the algorithm from Linear Scan involves coreferential subjects. There were several cases in which the coreferential subject preference rule caused an antecedent to be selected that was not the nearest to the VPE. One example is:
(13) a. But, darn it all, why should we help a couple of spoiled snobs who had looked down their noses at us?
b. But, in the end, we did.</Paragraph>
<Paragraph position="2"> Here, the correct antecedent is the more distant &quot;help a couple of...&quot;, rather than &quot;looked down their noses...&quot;. There were no cases in which Linear Scan succeeded where the algorithm failed.</Paragraph>
</Section>
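As an illustration of the &quot;Linear Scan&quot; baseline described above, the following is a minimal sketch in Python; it is not the paper's implementation, and the function name, the (word, tag) token representation, and the is_verb predicate are hypothetical.

    def linear_scan_antecedent(tagged_tokens, vpe_index, is_verb):
        """Return the index of the first verb preceding the ellipsis site.

        tagged_tokens: list of (word, pos_tag) pairs in surface order.
        vpe_index:     index of the elliptical (bare) auxiliary.
        is_verb:       predicate on a (word, pos_tag) pair.
        """
        # Scan backwards from the VPE; the nearest verb is taken as the head
        # of the antecedent VP, with no filtering and no preferences.
        for i in range(vpe_index - 1, -1, -1):
            if is_verb(tagged_tokens[i]):
                return i
        return None

Applied to (13), scanning backwards from &quot;did&quot; reaches &quot;looked&quot; first, so Linear Scan picks &quot;looked down their noses...&quot;; the algorithm instead rules out impossible (e.g., containing) antecedents and prefers an antecedent whose subject corefers with the subject of the ellipsis, which is why it selects the more distant &quot;help a couple of...&quot;.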
<Section position="16" start_page="304" end_page="304" type="evalu">
<SectionTitle> SOURCES OF ERROR </SectionTitle>
<Paragraph position="0"> I will now look at sources of error for the algorithm. The performance was worst in the Long Distance category, in which at least one sentence intervenes between antecedent and VPE. In several problem cases in the Long Distance category, it appears that the intervening text contains some mechanism that causes the antecedent to remain salient. For example:
(14) a. &quot;...in Underwater Western Eye I'd have a chance to act. I could show what I can do&quot;.</Paragraph>
<Paragraph position="1"> b. As far as I was concerned, she had already and had dandily shown what she could do.</Paragraph>
<Paragraph position="2"> In this case, the elliptical VP &quot;had already&quot; means &quot;had already had a chance to act&quot;. The algorithm incorrectly selects &quot;show what I can do&quot; as the antecedent. The intervening sentence causes the previous antecedent to remain salient, since it is understood as &quot;(If I had a chance to act then) I could show what I can do.&quot; Furthermore, the choice made by the algorithm might perhaps be eliminated on pragmatic grounds, given the oddness of &quot;she had already shown what she could do and had dandily shown what she could do.&quot; Another way in which the algorithm could be generalized is illustrated by the following example:
(15) a. &quot;I didn't ask you to fight for the ball club&quot;, Phil said slowly.</Paragraph>
<Paragraph position="3"> b. &quot;Nobody else did, either&quot;.</Paragraph>
<Paragraph position="4"> Here the algorithm incorrectly selects &quot;fight for the ball club&quot; as the antecedent, instead of &quot;ask you to fight for the ball club&quot;. The subject coreference rule does not apply, since &quot;Nobody else&quot; is not coreferential with the subject of any of the possible antecedents. However, its interpretation is dependent on the subject &quot;I&quot; of &quot;ask you to fight for the ball club&quot;. Thus, if one generalized the subject coreference rule to include such forms of dependence, the algorithm would succeed on such examples.</Paragraph>
<Paragraph position="5"> Many of the remaining errors involve an antecedent that takes a VP or S as complement, often leading to subtle ambiguities. One example of this is the following:
(16) a. Usually she marked the few who did thank you, you didn't get that kind much in a place like this: and she played a little game with herself, seeing how downright rude she could act to the others, before they'd take offense, threaten to call the manager.</Paragraph>
<Paragraph position="6"> b. Funny how seldom they did: used to it, probably.</Paragraph>
<Paragraph position="7"> Here the algorithm selects &quot;call the manager&quot; as the antecedent, instead of &quot;threaten to call the manager&quot;, which I determined to be the correct antecedent. It may be that many of these cases involve a genuine ambiguity.</Paragraph>
</Section>
</Paper>