<?xml version="1.0" standalone="yes"?>
<Paper uid="W04-1303">
  <Title>Putting Meaning into Grammar Learning</Title>
  <Section position="6" start_page="23" end_page="24" type="concl">
    <SectionTitle>
5 Discussion and future directions
</SectionTitle>
    <Paragraph position="0"> The work presented here is intended to offer an alternative formulation of the grammar learning problem, one in which meaning in context plays a pivotal role in the acquisition of grammar. Specifically, meaning is incorporated directly into the target grammar (via the construction representation), the input data (via the context representation) and the evaluation criteria (which are usage-based, i.e., intended to improve comprehension). To the extent possible, the assumptions made with respect to the structures and processes available to a human language learner at this stage are consistent with evidence from across the cognitive spectrum. Though only preliminary conclusions can be drawn, the model is a concrete computational step toward validating a meaning-oriented approach to grammar learning.</Paragraph>
    <Paragraph position="1"> The model draws from a number of computational forerunners in both the logical and probabilistic traditions, including Bayesian models of word learning (Bailey, 1997; Stolcke, 1994) for the overall optimization model, and work by Wolff (1982) modeling language acquisition (primarily of production rules) using data compression techniques similar to the MDL approach taken here. The use of the results of analysis to hypothesize new mappings can be seen as related to both explanation-based learning (DeJong and Mooney, 1986) and inductive logic programming (Muggleton and Raedt, 1994).</Paragraph>
    <Paragraph position="2"> The model also has some precedents in the work of Siskind (1997) and Thompson (1998), both of which based learning on the discovery of isomorphic structures in syntactic and semantic representations, though in less linguistically rich formalisms. In current work we are applying the model to the full corpus of English verbs, as well as crosslinguistic data including Russian case markers and Mandarin directional particles and aspect markers.</Paragraph>
    <Paragraph position="3"> These experiments will further test the robustness of the model's theoretical assumptions and protect against model overfitting and typological bias. We are also developing alternative means of evaluating the system's progress based on a rudimentary model of production, which would enable it to label scene descriptions using its current grammar and thus facilitate detailed studies of how the system generalizes (and overgeneralizes) to unseen data.</Paragraph>
  </Section>
</Paper>