<?xml version="1.0" standalone="yes"?>
<Paper uid="W00-0710">
  <Title>Learning Distributed Linguistic Classes</Title>
  <Section position="9" start_page="59" end_page="59" type="concl">
    <SectionTitle>
6 Conclusions
</SectionTitle>
    <Paragraph position="0"> The use of error-correcting output codes (ECOC) for representing natural language classes has been empirically validated on a suite of linguistic tasks. Results indicate that ECOC can be useful for datasets whose features have high class predictivity. Such datasets typically benefit from the Modified Value Difference Metric, which creates a condensed hyperspace of features. This in turn reduces the number of class boundaries to be learned per bit function, which simplifies the binary subclassification tasks. A voting algorithm for learning blocks of bits proves as accurate as an expensive feature-selecting algorithm. Future research will address further mechanisms for learning complex regions of the class boundary landscape, as well as alternative error-correcting approaches to classification.</Paragraph>
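    <!-- Editor's illustration: the ECOC scheme summarized above maps each class to a bit string, trains one binary classifier per bit, and at prediction time picks the class whose code word is nearest to the predicted bit vector. The following is a minimal sketch of that idea using scikit-learn's OutputCodeClassifier on a toy dataset; the learner, code length, and data are assumptions and do not reproduce the paper's memory-based setup or MVDM metric. -->
```python
# Minimal ECOC sketch (assumed setup, not the paper's experiments):
# each class gets a bit string, each bit is learned by a separate
# binary classifier, and prediction chooses the class whose code
# word is closest to the predicted bit vector.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OutputCodeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ecoc = OutputCodeClassifier(
    estimator=LogisticRegression(max_iter=1000),
    code_size=2.0,   # code length = code_size * n_classes
    random_state=0,
)
ecoc.fit(X_train, y_train)
print("held-out accuracy:", ecoc.score(X_test, y_test))
```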
  </Section>
</Paper>