<?xml version="1.0" standalone="yes"?>
<Paper uid="P95-1031">
<Title>Bayesian Grammar Induction for Language Modeling</Title>
<Section position="8" start_page="232" end_page="232" type="concl">
<SectionTitle>
7 Conclusion
</SectionTitle>
<Paragraph position="0"> This research represents a step forward in the quest to develop grammar-based language models for natural language. We induce models that, while substantially more compact, outperform n-gram language models in medium-sized domains.</Paragraph>
<Paragraph position="1"> The algorithm runs in time and space essentially linear in the size of the training data, so larger domains are within our reach.</Paragraph>
<Paragraph position="2"> However, we feel the largest contribution of this work lies not in the particular algorithm specified, but rather in its indication of the potential of the induction framework described by Solomonoff in 1964.</Paragraph>
<Paragraph position="3"> We have implemented only a subset of the moves that we have developed, and inspection of our results gives us reason to believe that these additional moves may significantly improve the performance of our algorithm.</Paragraph>
<Paragraph position="4"> Solomonoff's induction framework is not restricted to probabilistic context-free grammars. After completing the implementation of our move set, we plan to explore the modeling of context-sensitive phenomena. This work demonstrates that Solomonoff's elegant framework deserves much further consideration.</Paragraph>
</Section>
</Paper>
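The Bayesian induction objective the conclusion alludes to can be written as a maximum a posteriori search over grammars. The following is a minimal sketch of that objective, not the paper's exact formulation; the symbols G (a candidate grammar) and O (the training observations) are assumptions introduced here for illustration:

\[
  G^{*} \;=\; \operatorname*{arg\,max}_{G} \; p(G \mid O)
        \;=\; \operatorname*{arg\,max}_{G} \; p(G)\, p(O \mid G)
\]

Here p(G) is a prior that favors compact grammars, in the spirit of Solomonoff's description-length framework, and p(O | G) is the likelihood of the training data under the probabilistic grammar G; maximizing the product trades grammar size against fit to the data.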
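The "moves" discussed above can be read as grammar-rewriting operators applied in a greedy hill-climbing search over that objective. The sketch below illustrates the general technique under stated assumptions; Grammar, propose_moves, log_prior, and log_likelihood are hypothetical names, not the paper's actual interface.

```python
# Hypothetical sketch of move-based grammar induction: greedily apply
# grammar-rewriting "moves" while they improve the Bayesian objective
# log p(G) + log p(O | G). All helper names are illustrative assumptions.

def induce_grammar(grammar, data, propose_moves, log_prior, log_likelihood):
    """Greedy hill-climbing over candidate grammar moves."""

    def objective(g):
        # log p(g) + log p(data | g): the prior rewards compact grammars,
        # the likelihood rewards grammars that model the training data well.
        return log_prior(g) + log_likelihood(g, data)

    best_score = objective(grammar)
    improved = True
    while improved:
        improved = False
        for move in propose_moves(grammar):
            candidate = move(grammar)       # apply one rewriting move
            score = objective(candidate)
            if score > best_score:          # keep only improving moves
                grammar, best_score = candidate, score
                improved = True
                break                       # restart from the new grammar
    return grammar
```

Because each pass considers only local rewrites and accepts the first improvement, the search visits each training token a bounded number of times per accepted move, which is consistent with the roughly linear time and space behavior the conclusion reports.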