<?xml version="1.0" standalone="yes"?>
<Paper uid="J97-4003">
<Title>A Computational Treatment of Lexical Rules in HPSG as Covariation in Lexical Entries</Title>
<Section position="7" start_page="563" end_page="564" type="evalu">
<SectionTitle> 7. Related Work </SectionTitle>
<Paragraph position="0"> The powerful mechanism of lexical rules (Carpenter 1991) has been used in many natural language processing systems. In this section we briefly discuss some of the more prominent approaches and compare them with the treatment proposed in this paper.</Paragraph>
<Section position="1" start_page="563" end_page="563" type="sub_section">
<SectionTitle> 7.1 Off-line Expansion of Lexical Rules </SectionTitle>
<Paragraph position="0"> A common computational treatment of lexical rules, adopted, for example, in the ALE system (Carpenter and Penn 1994), consists of computing the transitive closure of the base lexical entries under lexical rule application at compile time. While this provides a front end for including lexical rules in grammars, it has the disadvantage that the generalizations captured by lexical rules are not used for computation. We mentioned in Section 2.2 that eliminating lexical rules in a precompilation step makes it impossible to process lexical rules or lexical entries that impose constraints that can only be properly executed once information from syntactic processing is available. A related problem is that, for analyses resulting in infinite lexica, the number of lexical rule applications needs to be limited; in the ALE system, for example, a depth bound can be specified for this purpose. Finally, as shown in Section 6, using an expanded-out lexicon can be less time- and space-efficient than using a lexicon encoding that makes computational use of generalizations over lexical information, such as the covariation encoding.</Paragraph>
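<Paragraph position="1"> To make the expansion strategy concrete, the following Python sketch computes such a closure for a toy lexicon under a depth bound. It is only an illustration under simplifying assumptions: entries are plain strings, rules are functions, and the names expand_lexicon, base, and rules are invented for this sketch; it is not the ALE implementation, which operates on typed feature structures with unification-based lexical rules.

# Minimal sketch (illustrative, not the ALE encoding): off-line expansion of a
# base lexicon under lexical rule application, cut off by a depth bound.
def expand_lexicon(base_entries, lexical_rules, depth_bound=3):
    lexicon = set(base_entries)
    frontier = [(entry, 0) for entry in base_entries]
    while frontier:
        entry, depth = frontier.pop()
        if depth >= depth_bound:        # limits rule applications for infinite lexica
            continue
        for rule in lexical_rules:
            derived = rule(entry)       # a rule returns None if it does not apply
            if derived is not None and derived not in lexicon:
                lexicon.add(derived)
                frontier.append((derived, depth + 1))
    return lexicon

# Toy usage: one base entry and one "third-singular" lexical rule.
base = {"walk/verb-base"}
rules = [lambda e: e.replace("verb-base", "verb-3sg")
         if e.endswith("verb-base") else None]
print(sorted(expand_lexicon(base, rules)))  # ['walk/verb-3sg', 'walk/verb-base']
</Paragraph>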
</Section>
<Section position="2" start_page="563" end_page="564" type="sub_section">
<SectionTitle> 7.2 Lexical Rules as Unary Phrase Structure Rules </SectionTitle>
<Paragraph position="0"> Another common approach to lexical rules is to encode them as unary phrase structure rules. This approach is taken, for example, in the LKB system (Copestake 1992), where lexical rules are introduced on a par with phrase structure rules and the parser makes no distinction between lexical and nonlexical rules (Copestake 1993, 31). A similar method is included in PATR-II (Shieber et al. 1983) and can be used to encode lexical rules as binary relations in the CUF system (Dörre and Eisele 1991; Dörre and Dorna 1993b) or the TFS system (Emele and Zajac 1990; Emele 1994). The covariation approach described in this paper can be viewed as a domain-specific refinement of such a treatment of lexical rules.</Paragraph>
<Paragraph position="1"> The encoding of lexical rules used in the covariation approach is related to the work of van Noord and Bouma (1994), who describe the hand-encoding of a single lexical rule as definite relations and show how these relations can be used to constrain a lexical entry. The covariation approach builds on this proposal and extends it in three ways. First, it shows how to detect and encode the interaction of a set of lexical rules. Second, it provides a way to automatically obtain a definite clause encoding of lexical rules and their interaction. Finally, it automatically derives the frame specification for lexical rules so that, following standard HPSG practice, only the information changed by a lexical rule needs to be specified.</Paragraph>
</Section>
<Section position="3" start_page="564" end_page="564" type="sub_section">
<SectionTitle> 7.3 Alternative Ways to Express Lexical Generalizations </SectionTitle>
<Paragraph position="0"> Lexical rules have not gone unchallenged as a mechanism for expressing generalizations over lexical information. In a number of proposals, lexical generalizations are captured using lexical underspecification (Kathol 1994; Krieger and Nerbonne 1992; Riehemann 1993; Oliva 1994; Frank 1994; Opalka 1995; Sanfilippo 1995). The lexical entries are only partially specified, and the various specializations are encoded via the type hierarchy, definite clause attachments, or a macro hierarchy.</Paragraph>
<Paragraph position="1"> These approaches seem to propose a completely different way to capture lexical generalizations. It is therefore interesting that the covariation lexical rule compiler produces a lexicon encoding that essentially uses an underspecification representation: the definite clause representation resulting from constraint propagation represents the common information in the base lexical entry and uses a definite clause attachment to encode the different specializations.</Paragraph>
</Section>
</Section>
</Paper>