<?xml version="1.0" standalone="yes"?> <Paper uid="W06-1637"> <Title>Priming Effects in Combinatory Categorial Grammar</Title> <Section position="4" start_page="308" end_page="309" type="intro"> <SectionTitle> 2 Background </SectionTitle> <Paragraph position="0"/> <Section position="1" start_page="308" end_page="308" type="sub_section"> <SectionTitle> 2.1 Structural Priming </SectionTitle> <Paragraph position="0"> Previous studies of structural priming (Bock, 1986; Branigan et al., 2000) have made few theoretical assumptions about syntax, regardless of whether the studies were based on planned experiments or corpora. They leverage the fact that alternations such as He gave Raquel the car keys vs.</Paragraph> <Paragraph position="1"> He gave the car keys to Raquel are nearly equivalent in semantics, but differ in their syntactic structure (double object vs. prepositional object). In such experiments, subjects are first exposed to a prime, i.e., they have to comprehend or produce either the double object or the prepositional object structure. In the subsequent trial, the target, they are then free to produce or comprehend either of the two structures, but they tend to prefer the one that has been primed. In corpus studies, the frequencies of the alternative constructions can be compared in a similar fashion (Gries, 2005; Szmrecsanyi, 2005).</Paragraph> <Paragraph position="2"> Reitter et al. (2006b) present a different method to examine priming effects in the general case.</Paragraph> <Paragraph position="3"> Rather than selecting specific syntactic alternations, general syntactic units are identified. This method detects syntactic repetition in corpora and correlates its probability with the distance between prime and target; at great distances, any repetition can be attributed to chance. The size of the priming effect is then estimated as the difference between the repetition probability close to the prime and far away from the prime. 
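The distance-based estimate can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' implementation: `priming_effect` and its `near`/`far` parameters are hypothetical names, and the input is simplified to a flat sequence of syntactic-rule labels, one per unit.

```python
def repetition_probability(pairs):
    """Fraction of (prime, target) pairs whose rule labels match."""
    if not pairs:
        return 0.0
    return sum(p == t for p, t in pairs) / len(pairs)


def priming_effect(rules, near=1, far=10):
    """Estimate the priming effect as the repetition probability at a
    small prime-target distance minus the probability at a large
    distance, where repetition is attributed to chance.

    `rules` is a sequence of syntactic-rule labels, one per unit.
    """
    near_pairs = [(rules[i], rules[i + near])
                  for i in range(len(rules) - near)]
    far_pairs = [(rules[i], rules[i + far])
                 for i in range(len(rules) - far)]
    return repetition_probability(near_pairs) - repetition_probability(far_pairs)
```

On a sequence with clustered repetition, e.g. `["DO", "DO", "PO", "PO"]`, the near-distance probability exceeds the far-distance one, yielding a positive effect; on a uniformly repetitive sequence the two probabilities coincide and the estimate is zero.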
This is a way of factoring out chance repetition (which is necessary if we do not restrict ourselves to syntactic alternations). By relying on syntactic units, the priming model includes implicit assumptions about the particular syntactic framework used to annotate the corpus under investigation.</Paragraph> </Section> <Section position="2" start_page="308" end_page="309" type="sub_section"> <SectionTitle> 2.2 Priming and Lexicalized Grammar </SectionTitle> <Paragraph position="0"> Previous work has demonstrated that priming effects on different linguistic levels are not independent (Pickering and Branigan, 1998). Lexical repetition makes repetition on the syntactic level more likely. For instance, suppose we have two verb phrases (prime, target) produced only a few seconds apart. Priming means that the target is more likely to assume the same syntactic form (e.g., a passive) as the prime. Furthermore, if the head verbs in prime and target are identical, experiments have demonstrated a stronger priming effect. This effect seems to indicate that lexical and syntactic representations in the grammar share the same information (e.g., subcategorization information), and therefore these representations can prime each other.</Paragraph> <Paragraph position="1"> Consequently, we treat subcategorization as coterminous with syntactic type, rather than as a feature exclusively associated with lexemes. Such types determine the context of a lexeme or phrase, and are determined by derivation. Such an analysis is exactly what categorial grammars suggest.</Paragraph> <Paragraph position="2"> The rich set of syntactic types that categories afford may be just sufficient to describe all and only the units that can show priming effects during syntactic processing. 
That is to say, syntactic priming is categorial type-priming, rather than structural priming.</Paragraph> <Paragraph position="3"> Consistent with this view, Pickering and Branigan (1998) assume that morphosyntactic features such as tense, aspect or number are represented independently of the combinatorial properties which specify the contextual requirements of a lexical item. Property groups are represented centrally and shared between lexicon entries, so that they may, separately, prime each other. For example, the pre-nominal adjective red in the red book primes other pre-nominal adjectives, but not a post-nominal relative clause (the book that's red) (Cleland and Pickering, 2003; Scheepers, 2003).</Paragraph> <Paragraph position="4"> However, if a lexical item can prime a phrasal constituent of the same type, and vice versa, then a type-driven grammar formalism like CCG can provide a simple account of the effect: lexical and derived syntactic types have the same combinatory potential, which is completely specified by the type, whereas in structure-driven theories this information is only implicitly given in the derivational process.</Paragraph> </Section> <Section position="3" start_page="309" end_page="309" type="sub_section"> <SectionTitle> 2.3 Combinatory Categorial Grammar </SectionTitle> <Paragraph position="0"> CCG (Steedman, 2000) is a mildly context-sensitive, lexicalized grammar formalism with a transparent syntax-semantics interface and a flexible constituent structure that is of particular interest to psycholinguistics, since it allows the construction of incremental derivations. CCG has also enjoyed the interest of the NLP community, with high-accuracy wide-coverage parsers (Clark and Curran, 2004; Hockenmaier and Steedman, 2002) and generators available (White and Baldridge, 2003).</Paragraph> <Paragraph position="1"> Words are associated with lexical categories which specify their subcategorization behaviour, e.g. 
((S[dcl]\NP)/NP)/NP is the lexical category for (tensed) ditransitive verbs in English such as gives or sends, which expect two NP objects to their right, and one NP subject to their left. Complex categories X/Y or X\Y are functors which yield a constituent with category X if they are applied to a constituent with category Y to their right (/Y) or to their left (\Y).</Paragraph> <Paragraph position="2"> Constituents are combined via a small set of combinatory rule schemata. Function application is the most basic operation (and is used by all variants of categorial grammar): Forward application: X/Y Y ⇒ X; Backward application: Y X\Y ⇒ X. Coordination combines two constituents of the same category: X conj X ⇒ X. Composition and type-raising are necessary for the analysis of long-range dependencies and for incremental derivations. CCG uses the same lexical categories for long-range dependencies that arise e.g. in wh-movement or coordination as for local dependencies, and does not require traces (as in the relative clause the man that I saw). The combinatory rules of CCG allow multiple, semantically equivalent, syntactic derivations of the same sentence. This spurious ambiguity is the result of CCG's flexible constituent structure, which can account for long-range dependencies and coordination (as in the above example), and also for interaction with information structure. CCG parsers often limit the use of the combinatory rules (in particular: type-raising) to obtain a single right-branching normal form derivation (Eisner, 1996) for each possible semantic interpretation. Such normal form derivations only use composition and type-raising where syntactically necessary (e.g. in relative clauses).</Paragraph> </Section> </Section> </Paper>
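The application schemata above can be illustrated with a minimal Python sketch. The encoding is hypothetical (atomic categories as strings, complex categories as (result, slash, argument) triples; the [dcl] feature is omitted), and `fapp`/`bapp` are invented names, not part of any CCG toolkit.

```python
# Minimal sketch of CCG categories and function application.
# Atomic categories are strings; a complex category is a
# (result, slash, argument) triple, e.g. ('S', '\\', 'NP') for S\NP.

def fapp(fn, arg):
    """Forward application:  X/Y  Y  =>  X."""
    result, slash, y = fn
    return result if slash == '/' and y == arg else None

def bapp(arg, fn):
    """Backward application:  Y  X\\Y  =>  X."""
    result, slash, y = fn
    return result if slash == '\\' and y == arg else None

# ((S\NP)/NP)/NP -- the category of a ditransitive verb such as "gives"
gives = ((('S', '\\', 'NP'), '/', 'NP'), '/', 'NP')

vp1 = fapp(gives, 'NP')   # gives Raquel      ->  (S\NP)/NP
vp2 = fapp(vp1, 'NP')     # ... the car keys  ->  S\NP
s = bapp('NP', vp2)       # He ...            ->  S
```

Each application step consumes one argument of the functor category, so the derivation of He gave Raquel the car keys reduces the ditransitive category to S\NP and finally to S; a slash or argument mismatch blocks the rule and returns no constituent.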