<?xml version="1.0" standalone="yes"?>
<Paper uid="P98-1119">
  <Title>Automatic Acquisition of Language Model based on Head-Dependent Relation between Words</Title>
  <Section position="6" start_page="726" end_page="726" type="concl">
    <SectionTitle>
5 Conclusions
</SectionTitle>
    <Paragraph position="0"> In this paper, we presented a language model based on a kind of simple dependency grammar. The grammar consists of head-dependent relations between words and can be learned automatically from a raw corpus by the reestimation algorithm which is also introduced in this paper. By the preliminary experiments, it was shown that the proposed language model performs better than n-gram models in test corpus entropy. This means that the reestimation algorithm can find out the hidden information of head-dependent relation between words in a raw corpus, and the information is more useful than the naive word sequences of n-gram, for language modeling.</Paragraph>
    <Paragraph position="1"> We are planning to experiment the performance of the proposed language model for large corpus, for various domains, and with various smoothing methods. For the size of the model, we are planning to test the effects of excluding the dependency relations with near zero probabilities. null</Paragraph>
  </Section>
class="xml-element"></Paper>
Download Original XML