<?xml version="1.0" standalone="yes"?>
<Paper uid="W06-2904">
  <Title>Improved Large Margin Dependency Parsing via Local Constraints and Laplacian Regularization</Title>
  <Section position="11" start_page="27" end_page="27" type="concl">
    <SectionTitle>
9 Conclusion
</SectionTitle>
    <Paragraph position="0"> We have presented two improvements to the standard large margin training approach for dependency parsing. To cope with the sparse data problem, we smooth the parameters according to their underlying word similarities by introducing a Laplacian regularizer. More signi cantly, we use more re ned local constraints in the large margin criterion, rather than the global parse-level losses that are commonly considered. We achieve state of the art parsing accuracy for predicting undirected dependencies in test data, competitive with previous large margin and previous probabilistic approaches in our experiments.</Paragraph>
    <Paragraph position="1"> Much work remains to be done. One extension is to consider directed features, and contextual features like those used in current probabilistic parsers (Wang et al., 2005). We would also like to apply our approach to parsing English, investigate the confusion showed in Table 3 more carefully, and possibly re-investigate the use of parts-of-speech features in this context.</Paragraph>
  </Section>
</Paper>