<?xml version="1.0" standalone="yes"?>
<Paper uid="P98-1091">
  <Title>An Empirical Evaluation of Probabilistic Lexicalized Tree Insertion Grammars *</Title>
  <Section position="5" start_page="562" end_page="562" type="concl">
    <SectionTitle>
4 Conclusion and Future Work
</SectionTitle>
    <Paragraph position="0"> In this paper, we have presented the results of two empirical experiments using Probabilistic Lexicalized Tree Insertion Grammars. Comparing PLTIGs with PCFGs and N-grams, our studies show that a lexicalized tree representation drastically improves the quality of language modeling of a context-free grammar to the level of N-grams without degrading the parsing accuracy. In the future, we hope to continue to improve on the quality of parsing and language modeling by making more use of the lexical information. For example, currently, the initial untrained PLTIGs consist of elementary trees that have uniform configurations (i.e., every auxiliary tree has the same number of adjunction sites) to mirror the CNF representation of PCFGs. We hypothesize that a grammar consisting of a set of elementary trees whose number of adjunction sites depend on their lexical anchors would make a closer approximation to the &amp;quot;true&amp;quot; grammar. We also hope to apply PLTIGs to natural language tasks that may benefit from a good language model, such as speech recognition, machine translation, message understanding, and keyword and topic spotting.</Paragraph>
  </Section>
class="xml-element"></Paper>