<?xml version="1.0" standalone="yes"?>
<Paper uid="W04-0310">
  <Title>Incrementality in Syntactic Processing: Computational Models and Experimental Evidence</Title>
  <Section position="1" start_page="0" end_page="0" type="abstr">
    <SectionTitle>
Abstract
</SectionTitle>
    <Paragraph position="0"> It is a well-known intuition that human sentence understanding works in an incremental fashion, with a seemingly constant update of the interpretation through the left-to-right processing of a string. Such intuitions are backed up by experimental evidence dating from at least as far back as Marslen-Wilson (1973), showing that under many circumstances, interpretations are indeed updated very quickly.</Paragraph>
    <Paragraph position="1"> From a parsing point of view it is interesting to consider the structure-building processes that might underlie incremental interpretation: what kinds of partial structures are built during sentence processing, and with what time-course? In this talk I will give an overview of the state-of-the-art of experimental psycholinguistic research, paying particular attention to the time-course of structure-building. The discussion will focus on a new line of research (some as yet unpublished) in which syntactic phenomena such as binding relations (e.g., Sturt, 2003) and unbounded dependencies (e.g., Aoshima, Phillips, &amp; Weinberg, in press) are exploited to make a very direct test of the availability of syntactic structure over time.</Paragraph>
    <Paragraph position="2"> The experimental research will be viewed from the perspective of a space of computational models, which make different predictions about the time-course of structure building. One dimension in this space is represented by the parsing algorithm used: For example, within the framework of Generalized Left Corner Parsing (Demers, 1977), algorithms can be characterized in terms of the point at which a context-free rule is recognized, in relation to the recognition-point of the symbols on its right-hand side. Another relevant dimension is represented by the type of grammar formalism that is assumed. For example, with bottom-up parsing algorithms, the degree to which structure-building is delayed in right-branching structures depends heavily on whether we employ a traditional phrase-structure formalism with rigid constituency, or a categorial formalism with flexible constituency (e.g., Steedman, 2000).</Paragraph>
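(The contrast above can be illustrated with a minimal sketch. The grammar, function names, and fragment-counting measure below are hypothetical illustrations, not taken from the talk: for a purely right-branching rule such as X -> w X | w, a strictly bottom-up parser cannot apply any reduction until the final word arrives, so the number of disconnected fragments on its stack grows with the prefix, whereas a fully connected strategy maintains a single structure throughout.)

```python
def bottom_up_fragments(n):
    """Disconnected tree fragments on the stack after each of n words,
    for the assumed right-branching grammar X -> w X | w under purely
    bottom-up (shift-reduce) parsing. No reduction is licensed until
    the last word, at which point the reductions cascade to one tree."""
    counts = []
    stack = 0
    for i in range(1, n + 1):
        stack += 1          # shift word i onto the stack
        if i == n:          # only now can X -> w fire, then X -> w X cascades
            stack = 1
        counts.append(stack)
    return counts

def fully_connected_fragments(n):
    """A fully connected strategy keeps exactly one structure at every word."""
    return [1] * n

print(bottom_up_fragments(4))        # fragments grow, then collapse at the end
print(fully_connected_fragments(4))  # always a single connected structure
```

The linear growth in the first list is the "systematic delay" at issue: under the bottom-up strategy, structure over the prefix remains unintegrated until the right edge of the string.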
    <Paragraph position="3"> I will argue that the evidence is incompatible with models which predict systematic delays in the construction of syntactic structure. In particular, I will argue against both head-driven strategies (e.g., Mulders, 2002), and purely bottom-up parsing strategies, even when flexible constituency is employed. Instead, I will argue that to capture the data in the most parsimonious way, we should turn our attention to those models in which a fully connected syntactic structure is maintained throughout the processing of a string.</Paragraph>
  </Section>
</Paper>