<?xml version="1.0" standalone="yes"?>
<Paper uid="N06-1012">
  <Title>Reducing Weight Undertraining in Structured Discriminative Learning</Title>
  <Section position="8" start_page="93" end_page="93" type="concl">
    <SectionTitle>
7 Conclusion
</SectionTitle>
    <Paragraph position="0"> Discriminatively-trained probabilistic models have had much success in applications because of their flexibility in defining features, but sometimes even highly-indicative features can fail to increase performance. We have shown that this can be due to feature undertraining, where highly-indicative features prevent training of many weaker features. One solution to this is feature bagging: repeatedly selecting feature subsets, training separate models on each subset, and averaging the individual models.</Paragraph>
    <Paragraph position="1"> On large, real-world natural-language processing tasks, feature bagging significantly improves performance, even with only two feature subsets. In this work, we choose the subsets based on our intuition of which features are complementary for this task, but automatically determining the feature subsets is an interesting area for future work.</Paragraph>
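The feature-bagging procedure summarized above can be sketched as follows. This is a minimal, hypothetical illustration (not the authors' implementation): a tiny logistic-regression learner is trained once per hand-chosen feature subset, and prediction averages the per-model probabilities. All names and the toy data are assumptions for illustration.

```python
import math

def train(examples, labels, features, epochs=50, lr=0.1):
    """Toy logistic regression restricted to the given feature subset."""
    w = {f: 0.0 for f in features}
    for _ in range(epochs):
        for x, y in zip(examples, labels):
            score = sum(w[f] * x.get(f, 0.0) for f in features)
            p = 1.0 / (1.0 + math.exp(-score))
            for f in features:
                w[f] += lr * (y - p) * x.get(f, 0.0)
    return w

def predict_bagged(models, x):
    """Average the probabilities of models trained on different subsets."""
    ps = []
    for w in models:
        score = sum(wf * x.get(f, 0.0) for f, wf in w.items())
        ps.append(1.0 / (1.0 + math.exp(-score)))
    return sum(ps) / len(ps)

# Toy data: "strong" and "weak" are stand-ins for a highly-indicative
# feature and a weaker one. The two subsets are chosen by hand, mirroring
# the paper's use of intuitively complementary subsets.
examples = [{"strong": 1.0, "weak": 1.0}, {"weak": 1.0}, {}, {}]
labels = [1, 1, 0, 0]
subsets = [["strong"], ["weak"]]
models = [train(examples, labels, fs) for fs in subsets]
p = predict_bagged(models, {"strong": 1.0, "weak": 1.0})
```

Because each model sees only its own subset, the weaker feature receives weight updates that the highly-indicative feature would otherwise dominate; averaging then combines both signals.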
  </Section>
</Paper>