<?xml version="1.0" standalone="yes"?>
<Paper uid="W04-3223">
  <Title>Incremental Feature Selection and ℓ1 Regularization for Relaxed Maximum-Entropy Modeling</Title>
  <Section position="1" start_page="0" end_page="0" type="abstr">
    <SectionTitle>
Abstract
</SectionTitle>
    <Paragraph position="0"> We present an approach to bounded constraint-relaxation for entropy maximization that corresponds to using a double-exponential prior or lscript1 regularizer in likelihood maximization for log-linear models. We show that a combined incremental feature selection and regularization method can be established for maximum entropy modeling by a natural incorporation of the regularizer into gradient-based feature selection, following Perkins et al.</Paragraph>
    <Paragraph position="1"> (2003). This provides an efficient alternative to standard lscript1 regularization on the full feature set, and a mathematical justification for thresholding techniques used in likelihood-based feature selection.</Paragraph>
    <Paragraph position="2"> Also, we motivate an extension to n-best feature selection for linguistic features sets with moderate redundancy, and present experimental results showing its advantage over lscript0, 1-best lscript1, lscript2 regularization and over standard incremental feature selection for the task of maximum-entropy parsing.1</Paragraph>
  </Section>
class="xml-element"></Paper>