<?xml version="1.0" standalone="yes"?>
<Paper uid="P04-1020">
  <Title>Learning Noun Phrase Anaphoricity to Improve Coreference Resolution: Issues in Representation and Optimization</Title>
  <Section position="9" start_page="2" end_page="2" type="concl">
    <SectionTitle>
7 Conclusions
</SectionTitle>
    <Paragraph position="0"> We have examined two largely unexplored issues in computing and using anaphoricity information for improving learning-based coreference systems: representation and optimization. In particular, we have systematically evaluated all four combinations of local vs. global optimization and constraint-based vs. feature-based representation of anaphoricity information in terms of their effectiveness in improving a learning-based coreference system.</Paragraph>
    <Paragraph position="1"> Extensive experiments on the three ACE coreference data sets, using a symbolic learner (RIPPER) and a statistical learner (MaxEnt) to train coreference classifiers, demonstrate the effectiveness of the constraint-based, globally-optimized approach to anaphoricity determination, which employs our conservativeness-based anaphoricity model. Not only does this approach improve a &quot;no anaphoricity&quot; baseline coreference system, it is also more effective than the commonly-adopted locally-optimized approach, without relying on additional labeled data.</Paragraph>
  </Section>
</Paper>