<?xml version="1.0" standalone="yes"?>
<Paper uid="P04-1006">
  <Title>Attention Shifting for Parsing Speech</Title>
  <Section position="1" start_page="0" end_page="0" type="abstr">
    <SectionTitle>
Abstract
</SectionTitle>
    <Paragraph position="0"> We present a technique that improves the efficiency of word-lattice parsing as used in speech recognition language modeling. Our technique applies a probabilistic parser iteratively; on each iteration it focuses on a different subset of the word-lattice. The parser's attention is shifted towards word-lattice subsets for which there are few or no syntactic analyses posited. This attention-shifting technique provides a six-times increase in speed (measured as the number of parser analyses evaluated) while performing equivalently when used as the first-stage of a multi-stage parsing-based language model.</Paragraph>
  </Section>
</Paper>
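
The attention-shifting loop described in the abstract can be illustrated with a short sketch. The Python below is a minimal, assumed rendering of the idea, not the authors' implementation: the word lattice is modeled as a set of arc identifiers, the parser as a callable returning analyses (each analysis being the set of arcs it covers), and names such as attention_shifting_parse, Arc, and Parser are hypothetical.

from typing import Callable, List, Set

Arc = int                                   # an arc (word edge) in the lattice
Analysis = Set[Arc]                         # arcs covered by one syntactic analysis
Parser = Callable[[Set[Arc]], List[Analysis]]


def attention_shifting_parse(lattice_arcs: Set[Arc],
                             parse: Parser,
                             max_iterations: int = 5) -> List[Analysis]:
    """Iteratively re-run the parser, each time focusing only on the
    sub-lattice of arcs that no posited analysis has covered yet."""
    analyses: List[Analysis] = []
    covered: Set[Arc] = set()

    for _ in range(max_iterations):
        uncovered = lattice_arcs - covered
        if not uncovered:
            break                           # every arc has at least one analysis
        new_analyses = parse(uncovered)     # shift attention to the uncovered subset
        if not new_analyses:
            break                           # parser posits nothing new; stop
        analyses.extend(new_analyses)
        for analysis in new_analyses:
            covered |= analysis             # mark these arcs as analysed

    return analyses

Each pass points the parser only at arcs that no previously posited analysis covers, which is the sense in which its attention is "shifted"; the loop ends once every arc carries at least one analysis or the parser stops producing new ones.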