<?xml version="1.0" standalone="yes"?> <Paper uid="C94-1061"> <Title>CONCURRENT IJ~XICAIJZEI) I)EI)ENI)ENCY PARSING: THE ParseTalk MODEL</Title> <Section position="3" start_page="0" end_page="379" type="metho"> <SectionTitle> ('LI'J-- Comtmtational Linguistics Research Group </SectionTitle> <Paragraph position="0"> in in(xtem grammar theory, of. Daelemans, De Smedt & Gazdar, 1992), and by specifying a message passing protocol that is grounded in the actor computation model (Agha & I lewilt, 1987). As this protocol allows for asynchronous message passing, concurrency enters as a theoretical notion :it the level of grammar specification, not only as an implementatiolml feature. The ParseThlk m(xlel outlined in this paper can tllerefore be considered as an attempt to replace the static, global-control paradigm of natural language processing hy a dynamic, local-control mcxlel.</Paragraph> <Paragraph position="1"> The design of such a grammar and its associated parser responds to the demands of complex language performance problems. By this, we mean tmderstanding tasks, such as large-scale text o, speech understanding, which not only require considerable portions of grammatical knowledge but also a vast amount of so-called non-line guistic, e.g., domain and discourse knowledge. A major problem then relates to the interaction of the different knowledge sources involved, an issue that is not so presstug when monolithic grammar knowledge essentially boils down to syntactic regularities. Instead of subscribing to any serial model of control, we Imild Ul)On evidences flom COml)utational text understanding studies (Granger, Eiselt & llolbrook, 1986; Yu & Simmons, 1990) as well as psycholinguistic experiments, in particular those worked out for the class of inter:lctive language processing m(xlels (Marslen-Wilson & Tyler, 1980; Thibadeau, Just &. Cart)enter, 1982). They reveal that various knowledge sources are accessed in an a priori unpredictable order and that a signilicant amoullt of parallel processing occurs tit various slat',es of Ihe (htlll~all) I.:|nguage l)fOCCSSor. Therefore, computationally and cognitively plausil)le models of nalural language tmde,slandin,e, should account for parallelism at the lheoretical level of language description. Currenlly, I'arse'lalk provides a specificalion platform fo,' computational language performance modeling. I In the future, this vehiclc can be used as a testbed for the configuration of cognitively adequate parsers. Moving performance considerations to Ihe level of grammar design is thus in strong 1 We ()lily nlellti(lll lhat l)cIfoIlllallCe issues hi, COllie eVCll illOl't~ pl'essillg wht:n ilatl.lral \]angtlage illld(:rsland\[llg tasks are placed ill real-world environments and Ihtls addillonal complexity is added by uttgrammatical natural language inl:,Ul , noisy dala, as well as lexical, grammatical, and cofleeplual specification gaps. hi these cases, not only multiple knowledge sources have Io Im balanced but additional processing Sll'RIegles must be supplied to cope wilh these phenomena in a robust way. This places extra requirements on Ihe in((elation of procedural linguistic knowledl;c widfin a perR)(mance-orientcd language analysis framework, viz. 
strategic knowledge of how to handle incomplete or faulty language data and grammar specifications.</Paragraph> </Section>
<Section position="4" start_page="379" end_page="381" type="metho"> <SectionTitle> 2 ParseTalk's CONCEPTUAL FRAMEWORK </SectionTitle>
<Paragraph position="0"> The ParseTalk model is based on a fully lexicalized grammar. Grammatical specifications are given in the format of valency constraints attached to each lexical unit, on which the computation of concrete dependency relations is based. By way of inheritance the entire collection of lexical items is organized in lexical hierarchies (these constitute the lexical grammar), the lexical items forming their leaves and the intermediary nodes representing grammatical generalizations in terms of word classes. This specification is similar to various proposals currently investigated within the unification grammar community (Evans & Gazdar, 1990). The concurrent computation model builds upon and extends the formal foundations of the actor model, a theory of object-oriented computation that is based on asynchronous message passing.</Paragraph>
<Section position="1" start_page="379" end_page="381" type="sub_section"> <SectionTitle> 2.1 The Grammar Model </SectionTitle>
<Paragraph position="0"> The grammar model underlying the ParseTalk approach considers dependency relations between words as the fundamental notion of linguistic analysis. A modifier is said to depend on its head if the modifier's occurrence is permitted by the head but not vice versa.2 Dependencies are thus asymmetric binary relations that can be established by local computations involving only two lexical items; they are tagged by dependency relation names from the set D = {spec, subj, ppatt, ...}.3 Co-occurrence restrictions between lexical items are specified as sets of valencies that express various constraints a head places on permitted modifiers. These constraints incorporate the following descriptive units:
1. categorial: C = {WordActor, Noun, Substantive, Preposition, ...} denotes the set of word classes, and isa_C = {(Noun, WordActor), (Substantive, Noun), (Preposition, WordActor), ...} ⊆ C x C denotes the subclass relation yielding a hierarchical ordering in C (cf. also Fig. 1).</Paragraph>
<Paragraph position="1"> 2. morphosyntactic: A unification formalism (similar in spirit to Shieber, 1986) is used to represent morphosyntactic regularities. It includes atomic terms from the set T = {nom, acc, ..., sg, pl, ...}, complex terms associating labels from the set L = {case, num, agr, ...} ∪ D with embedded terms, value disjunction (in curly braces), and coreferences (numbers in angle brackets). U denotes the set of allowed feature structures, ∨ the unification operation, and ⊥ the inconsistent element.</Paragraph>
<Paragraph position="2"> 2 Although phrases are not explicitly represented (e.g., by non-lexical categories), we consider each complete subtree of the dependency tree a phrase (this convention allows discontinuous phrases as well). A dependency is not treated as a relation between words (as in Word Grammar (Hudson, 1990, p. 117)), but between a word and a dependent phrase (as in Dependency Unification Grammar (Hellwig, 1988)).
The root of a phrase is taken to be the representative of the whole phrase. 3 Additionally, D contains the symbol self, which denotes the currently considered lexical item. This symbol occurs in feature structures (see 2.) and in the ordering relations order and occurs (see 4.).</Paragraph>
<Paragraph position="3"> Given u ∈ U and l ∈ L, the expansion [l : u] denotes the complex term containing only one label, l, with value u. If u is a complex term containing l at top level, the extraction u\l is defined to be the value of l in u. By definition, u\l yields ⊥ in all other cases.</Paragraph>
<Paragraph position="4"> 3. conceptual: The concept hierarchy consists of a set of concept names F = {Hardware, Computer, Notebook, Harddisk, ...} and a subclass relation isa_F = {(Computer, Hardware), (Notebook, Computer), ...} ⊆ F x F. The set of conceptual role names R = {HasPart, HasPrice, ...} contains labels of possible conceptual relations (a frame-style, classification-based knowledge representation model in the spirit of MacGregor (1991) is assumed). The relation ic ⊆ F x R x F implements conceptual integrity constraints: (f, r, g) ∈ ic iff any concept subsumed by f ∈ F may be modified by any concept subsumed by g ∈ F in relation r ∈ R, e.g., (Computer, HasPart, Harddisk) ∈ ic. From ic the relation permit = {(f', r, g') ∈ F x R x F | ∃ f, g ∈ F : (f, r, g) ∈ ic ∧ f' isa_F* f ∧ g' isa_F* g} (* denotes the transitive closure) can be derived, which explicitly states the range of concepts that can actually be related. For brevity, we restrict this exposition to the attribution of concepts and do not consider quantification, etc. (cf. Creary & Pollard, 1985).</Paragraph>
<Paragraph position="5"> 4. ordering: The (word-class specific) set order ⊆ D* contains n-tuples which express ordering constraints on the valencies of each word class. Legal orders of modifiers must correspond to an element of order. The (word-specific) function occurs : D → N0 associates dependency names with the modifier's (and self's) text position (0 for valencies not yet occupied). Both specifications appear at the lexical head only, since they refer to the head and all of its modifiers.</Paragraph>
<Paragraph position="6"> With these definitions, a valency can be characterized as an element of the set V ⊆ D x C x U x 2^R, i.e., a dependency name, a word class constraint, a feature structure constraint, and a set of admissible conceptual roles. Focusing on one dependency relation from the example &quot;Compaq entwickelt einen Notebook mit einer 120-MByte-Harddisk&quot; [&quot;Compaq develops a notebook with a 120-MByte hard disk&quot;], the above criteria are illustrated in Table 1. The feature structures of the two heads, &quot;mit&quot; and &quot;Notebook&quot;, are given prior to and after the establishment of the dependency relation. The concepts of each of the phrases, 120MB-HARDDISK-00004 and NOTEBOOK-00003, are stated. The order constraint of &quot;Notebook&quot; says that it may be preceded by a specifier (spec) and attributive adjectives (attr), and that it may be followed by prepositional phrases (ppatt). The valency for prepositional phrases described in the last row states which class, feature, and domain constraints must be fulfilled by candidate modifiers.</Paragraph>
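To make these descriptive units more concrete, the following minimal sketch shows how a lexical specification along the lines of Table 1 might be written down as plain data structures. It is an illustration in Python only: the Valency and WordEntry containers and all concrete values are simplifying assumptions of this presentation, not part of the ParseTalk specification, and feature structures are reduced to flat dictionaries.

from dataclasses import dataclass, field

@dataclass
class Valency:
    # one co-occurrence constraint a head places on a permitted modifier
    name: str          # dependency relation name, e.g. "ppatt"
    word_class: str    # categorial constraint, e.g. "Preposition"
    features: dict     # morphosyntactic constraint (a much simplified feature structure)
    domain: set        # conceptual roles that may link the head and modifier concepts

@dataclass
class WordEntry:
    # a simplified lexical item bundling the descriptive units of Section 2.1
    form: str
    word_class: str
    features: dict
    concept: str
    position: int = 0                              # text position of the word itself
    valencies: list = field(default_factory=list)
    order: list = field(default_factory=list)      # legal orders of dependency names
    occurs: dict = field(default_factory=dict)     # dependency name -> position (0 = unfilled)

# "Notebook" as a head, loosely following Table 1 (all values illustrative only)
notebook = WordEntry(
    form="Notebook", word_class="Substantive",
    features={"case": "acc", "num": "sg"},
    concept="NOTEBOOK-00003", position=4,
    valencies=[Valency(name="ppatt", word_class="Preposition",
                       features={"form": "mit"},
                       domain={"HasHarddisk", "HasPrice"})],
    order=[("spec", "attr", "self", "ppatt")],
    occurs={"spec": 3, "attr": 0, "self": 4, "ppatt": 0},
)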
<Paragraph position="7"> The predicate SATISFIES (cf. Table 2) holds when a candidate modifier fulfills the constraints stated in a specified valency of a candidate head. If SATISFIES evaluates to true, a dependency valency.name is established (object.attribute denotes the value of the property attribute at object). As can easily be verified, SATISFIES is fulfilled for the combination of &quot;mit&quot;, the prepositional valency, and &quot;Notebook&quot; from Table 1.</Paragraph>
<Paragraph position="8"> TABLE 1. An illustration of grammatical specifications in the ParseTalk model: the head &quot;Notebook&quot; (concept NOTEBOOK-00003) carries the order constraint {<spec, attr, self, ppatt>}, the occurs assignment {(spec, 3), (attr, 0), (self, 4), (ppatt, 5)}, and a ppatt valency with features [form: mit] and conceptual domain {HasHarddisk, HasPrice, ...}; order and occurs are not applicable for the dependent phrase headed by &quot;mit&quot; (concept 120MB-HARDDISK-00004).</Paragraph>
<Paragraph position="9"> SATISFIES (modifier, valency, head) :⇔
modifier.class isa_C* valency.class
∧ (([valency.name : (modifier.features\self)] ∨ valency.features) ∨ head.features) ≠ ⊥
∧ ∃ role ∈ valency.domain : (head.concept, role, modifier.concept) ∈ permit
∧ ∃ <d_1, ..., d_n> ∈ head.order : ∃ k ∈ {1, ..., n} : (valency.name = d_k
∧ (∀ 1 ≤ i < k : head.occurs(d_i) < modifier.position)
∧ (∀ k < i ≤ n : head.occurs(d_i) = 0 ∨ head.occurs(d_i) > modifier.position))
TABLE 2. The SATISFIES predicate</Paragraph>
<Paragraph position="10"> Note that unlike most previous dependency grammar formalisms (Starosta & Nomura, 1986; Hellwig, 1988; Jäppinen, Lassila & Lehtola, 1988; Fraser & Hudson, 1992) this criterion assigns equal opportunities to syntactic as well as conceptual conditions for computing valid dependency relations. Information on word classes, morphosyntactic features, and order constraints is purely syntactic, while conceptual compatibility introduces an additional description layer to be satisfied before a grammatical relation may be established (cf. Muraki, Ichiyama & Fukumochi, 1985; Lesmo & Lombardo, 1992). Note that we restrict the scope of the unification module in our framework, as only morphosyntactic features are described using this subformalism. This contrasts sharply with standard unification grammars (and with designs for dependency parsing as advocated by Hellwig (1988) and Lombardo (1992)), where virtually all information is encoded in terms of the unification formalism.4 4 Typed unification formalisms (Emele & Zajac, 1990) would easily allow for the integration of word class information. Ordering constraints and conceptual restrictions (such as value range restrictions or elaborated integrity constraints), however, are not so easily transferable, because, e.g., the conceptual constraints go far beyond the level of atomic semantic features still prevailing in unification formalisms.</Paragraph>
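Building on the WordEntry and Valency sketch above, the SATISFIES test of Table 2 can be paraphrased procedurally as follows. This is a hedged illustration, not the ParseTalk implementation: isa_c, unify, and permit are passed in as stand-ins for the word class hierarchy, the unification module, and the terminological classifier, the extraction of self from the modifier's features is omitted, and the stubs used in the final check are deliberately trivial.

def satisfies(modifier, valency, head, isa_c, unify, permit):
    # procedural paraphrase of the SATISFIES predicate of Table 2
    # 1. categorial: the modifier's word class must be subsumed by the valency's class
    if not isa_c(modifier.word_class, valency.word_class):
        return False
    # 2. morphosyntactic: embed the modifier's features under the dependency name and
    #    unify the result with the valency's and the head's feature structures
    merged = unify({valency.name: modifier.features}, valency.features)
    if merged is None or unify(merged, head.features) is None:
        return False
    # 3. conceptual: some role of the valency's domain must be permitted between
    #    the head concept and the modifier concept
    if not any(permit(head.concept, role, modifier.concept) for role in valency.domain):
        return False
    # 4. ordering: the new dependency must fit into one legal order, with every valency
    #    to its left placed before the modifier and every valency to its right either
    #    unoccupied or placed after it
    for legal_order in head.order:
        if valency.name not in legal_order:
            continue
        k = legal_order.index(valency.name)
        left_ok = all(head.occurs.get(d, 0) < modifier.position for d in legal_order[:k])
        right_ok = all(head.occurs.get(d, 0) == 0 or head.occurs[d] > modifier.position
                       for d in legal_order[k + 1:])
        if left_ok and right_ok:
            return True
    return False

# toy check of "mit" (standing in for the whole PP) against the ppatt valency of "Notebook"
mit = WordEntry(form="mit", word_class="Preposition", features={"form": "mit"},
                concept="120MB-HARDDISK-00004", position=5)
print(satisfies(mit, notebook.valencies[0], notebook,
                isa_c=lambda sub, sup: sub == sup,           # toy subsumption test
                unify=lambda f1, f2: {**f1, **f2},           # toy unification that never fails
                permit=lambda f, r, g: r == "HasHarddisk"))  # toy conceptual admissibility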
The grammatical specification of a lexical entry consists of structural criteria (valencies) and behavioral descriptions (protocols). In order to capture relevant generalizations and to support easy maintenance of grammar specifications, both are represented in hierarchies (cf. Genthial, Courtin & Kowarski (1990) and Fraser & Hudson (1992) for inheritance that is restricted to structural criteria). The valency hierarchy assigns valencies to lexemes. We will not consider it in depth here, since it captures only traditional grammatical notions, like transitivity or reflexivity. The organizing principle is the subset relation on valency sets. The word class hierarchy contains word class specifications that cover distributional and behavioral properties. Fig. 1 illustrates the behavioral criterion by defining different messages for each class (the messages for WordActor are discussed in Sections 3 and 4). Within the Noun part of the word class hierarchy, there are different methods for anaphora resolution reflecting different structural constraints on possible antecedents for nominal anaphora, reflexives and personal pronouns. The word class hierarchy cannot be generated automatically, since classification of program specifications (communication protocols, in our case) falls outside the scope of state-of-the-art classifier algorithms. On the other hand, the concept hierarchy is based on the subsumption relation holding between concepts, which is computed by a terminological classifier.</Paragraph>
<Paragraph position="11"> Most lexicon entries refer to a corresponding domain concept and thus allow conceptual restrictions to be checked.</Paragraph> </Section>
<Section position="2" start_page="381" end_page="381" type="sub_section"> <SectionTitle> 2.2 The Actor Computation Model </SectionTitle>
<Paragraph position="0"> The actor model of computation combines object-oriented features with concurrency and distribution in a methodologically clean way. It assumes a collection of independent objects, the actors, communicating via asynchronous message passing. An actor can send messages only to other actors it knows about, its acquaintances. The arrival of a message at an actor is called an event; it triggers the execution of a method that is composed of atomic actions, viz. creation of new actors (create actorType (acquaintances)), sending of messages to acquainted or newly created actors (send actor message), or specification of new acquaintances (become (acquaintances)). An actor system is dynamic, since new actors can be created and the communication topology is reconfigurable. We assume actors that process a single message at a time, step by step (Hewitt & Atkinson, 1979). For convenience, we establish a synchronous request-reply protocol (Lieberman, 1987) to compute functions such as unification of feature structures and queries to a (conceptual) knowledge base. In contrast to simple messages, which unconditionally trigger the execution of a method at the receiving actor, we define complex word actor messages as full-fledged actors with independent computational abilities. Departure and arrival of complex messages are actions which are performed by the message itself, taking the sender and the target actors as parameters. Upon arrival, a complex message determines whether a copy is forwarded to selected acquaintances of its receiver and whether the receiver may process the message on its own (cf. Schacht, Hahn & Bröker (1994) for a treatment of the parser's behavioral aspects).</Paragraph>
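For readers less familiar with the actor model, the following minimal Python sketch mimics the atomic actions on top of one mailbox per actor. Threads and queues merely stand in for a genuine actor runtime, and the Greeter actor is an invented toy example, not part of ParseTalk.

import queue, threading, time

class Actor:
    # a minimal actor: one mailbox, messages processed strictly one at a time
    def __init__(self, **acquaintances):
        self.acq = dict(acquaintances)        # the actors (and values) this actor knows about
        self.mailbox = queue.Queue()
        threading.Thread(target=self._run, daemon=True).start()

    def send(self, key, **params):            # asynchronous message passing
        self.mailbox.put((key, params))

    def become(self, **acquaintances):        # install new acquaintances
        self.acq.update(acquaintances)

    def _run(self):
        while True:
            key, params = self.mailbox.get()  # arrival of a message is an event ...
            getattr(self, key)(**params)      # ... and triggers the corresponding method

class Greeter(Actor):
    def greet(self, name):
        print(self.acq["form"], "greets", name)
        if self.acq.get("next") is not None:  # forward to an acquaintance, if any
            self.acq["next"].send("greet", name=name)

# usage: create two actors and let a message travel between them asynchronously
a = Greeter(form="a", next=None)
b = Greeter(form="b", next=a)
b.send("greet", name="ParseTalk")
time.sleep(0.1)                               # give the daemon threads time to run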
<Paragraph position="1"> The following syntax elements will be used subsequently: a program contains actor definitions (declaring the acquaintances and defining the methods of actors instantiated from this definition) and actor message definitions (stating distribution and computation conditions). Method definitions contain the message key, the formal parameters, and a composite action. condition is a locally computable predicate, written as PREDICATE (actor); actor stands for acquaintances, parameters, newly created actors, the performing actor itself (self), or the undefined value (nil); actor.acquaintance yields the corresponding acquaintance of actor; for var in set : (action) evaluates action for each element of set.</Paragraph> </Section> </Section>
<Section position="5" start_page="381" end_page="382" type="metho"> <SectionTitle> 3 A SIMPLIFIED PROTOCOL FOR ESTABLISHING DEPENDENCY RELATIONS </SectionTitle>
<Paragraph position="0"> The protocol described below allows dependency relations to be established. It integrates structural restrictions on dependency trees and provides for domesticated concurrency.</Paragraph>
<Section position="1" start_page="381" end_page="382" type="sub_section"> <SectionTitle> 3.1 Synchronizing Actor Activities: Reception Protocol </SectionTitle>
<Paragraph position="0"> A reception protocol allows an actor to determine when all events (transitively) caused by a message have terminated.</Paragraph>
<Paragraph position="1"> This is done by sending replies back to the initiator of the message. Since complex messages can be quasi-recursively forwarded, the number of replies cannot be determined in advance. In addition, each actor receiving such a message may need an arbitrary amount of processing time to terminate the actions caused by the message (e.g., the establishment of a dependency relation requires communication via messages that takes indeterminate time). Therefore, each actor receiving the message must reply to the initiator once it has terminated processing, informing the initiator to which actors the message has been forwarded.</Paragraph>
<Paragraph position="2"> A message is a reception message if (1) the receiver is required to (asynchronously) reply to the initiator with a receipt message, and (2) the initiator queues a reception task. An (explicit) receipt message is a direct message containing a set of actor identities as a parameter. This set indicates to which actors the reception message has been forwarded or delegated. The enclosed set enables the receiver (which is the initiator of the reception message) to wait until all receipt messages have arrived.5 In addition to explicit receipts, which are messages solely used for termination detection, there are regular messages that serve a similar purpose besides their primary function within the parsing process. They are called implicit receipt messages (one example is the headAccepted message described in Section 3.3). A reception task consists of a set of partial descriptions of the messages that must be received (implicit as well as explicit), and an action to be executed after all receipts have arrived (usually, sending a message).</Paragraph>
<Paragraph position="3"> 5 This, of course, only happens if the distribution is limited: The searchHead message discussed below is only distributed to the head of each receiver, which must occur in the same sentence. This ensures a finite actor collection to distribute the message to, and guarantees that the reception task is actually triggered.</Paragraph>
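The bookkeeping behind a reception task can be sketched as follows. The ReceptionTask class and the string-valued actor names are illustrative assumptions of this presentation; in ParseTalk the same effect is achieved by receipt messages flowing back to the initiator rather than by a shared object.

class ReceptionTask:
    # termination detection for a reception message (Section 3.1)
    def __init__(self, initial_receivers, on_complete):
        self.pending = set(initial_receivers)   # actors that still owe a receipt
        self.on_complete = on_complete          # action executed once all receipts have arrived

    def receipt(self, sender, forwarded_to):
        # process an (explicit or implicit) receipt: the sender is accounted for,
        # and the actors it forwarded the reception message to are now awaited
        self.pending.discard(sender)
        self.pending.update(forwarded_to)
        if not self.pending:
            self.on_complete()

# usage: a searchHead initiator waits until the whole head chain has replied
task = ReceptionTask({"w3"}, on_complete=lambda: print("all receipts arrived"))
task.receipt("w3", forwarded_to={"w2"})   # w3 forwarded the message to its head w2
task.receipt("w2", forwarded_to=set())    # w2 has no head; termination is detected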
wordActor (head deps vals feats ...)  # head, dependencies, valencies, and features are acquaintances
searchHead (sender target init)  # processed at candidate heads (cf. the message definition)
  (for val in vals:  # check all valencies of the possible head
    (if SATISFIES (init val self)  # valency check adapted from Table 2
      ((send (create headFound (self init val.name feats\val.name)) depart)  # reply to initiator, imposing restrictions
       (become (head deps vals (feats ∨ init.feats) ...)))  # expand grammatical description of head
      else (send (create receipt (self init {head})) depart)))  # send a receipt with the head the message was forwarded to
  # depart realizes the departure of a complex message
headFound (sender target name headFeats)  # processed at the initiator of a searchHead message
  (send (create headAccepted (self sender name)) depart)  # reply to head
  (become (sender deps vals (feats ∨ headFeats) ...))  # store sender as head of self, restrict self's features
headAccepted (modifier target name)  # processed at the head only
  (for dep in deps:  # check all dependencies
    (if (name = dep.name)  # relation name is identical
      (send dep store (modifier))))  # send the dependency the message store to store the modifier
  (send (create receipt (self modifier {head})) depart)  # send a receipt with the head the message was forwarded to
TABLE 3. Method definitions for searchHead, headFound, headAccepted</Paragraph> </Section>
<Section position="2" start_page="382" end_page="382" type="sub_section"> <SectionTitle> 3.2 Encoding Structural Restrictions </SectionTitle>
<Paragraph position="0"> Word actors conduct a bottom-up search for possible heads; the principle of non-crossing arcs (projectivity of the dependency tree) is guaranteed by the following forwarding mechanism. Consider the case of a newly instantiated word actor w_n searching its head to the left (the opposite direction is handled in a similar way). In order to guarantee projectivity one has to ensure that only word actors occupying the outer fringe of the dependency structure (between the current absolute head w_j and the rightmost element w_n-1) receive the search message of w_n (these are circled in Fig. 2).6 This forwarding scheme is reflected in the following simplified message definition:
defMsg searchHead (sender target initiator)
  ((if GOVERNED (target) distribute to head)  # forward a copy to the head acquaintance of the target
   (if true compute))  # the message is always processed at the target; the computation event is concretized in the word actor specification in Table 3
Thus, a message searching for a head of its initiator is locally processed at each actor receiving it, and is forwarded to the head of each receiver, if one already exists.</Paragraph>
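Abstracting from the actor machinery, the combined effect of the distribution and computation conditions of searchHead can be sketched as follows. The loop serializes what ParseTalk performs concurrently, GOVERNED is reduced to a simple head test, and the Node class is an invented stand-in for word actors.

class Node:
    # minimal stand-in for a word actor; only the head acquaintance matters here
    def __init__(self, form, head=None):
        self.form, self.head = form, head

def distribute_search_head(initiator, target, process_locally):
    # the message is always processed at its target, and a copy is forwarded to the
    # target's head as long as the target is governed; only actors on the fringe
    # between the rightmost word and the current absolute head are ever reached
    receiver = target
    while receiver is not None:
        process_locally(receiver, initiator)   # the compute condition is always true
        receiver = receiver.head               # forward a copy iff GOVERNED(receiver)

# usage: a new word actor searches its head to the left
w1 = Node("entwickelt")                        # current absolute head
w2 = Node("Notebook", head=w1)                 # rightmost element, governed by w1
new_word = Node("mit")                         # newly instantiated word actor
distribute_search_head(new_word, w2,
                       lambda r, init: print(r.form, "checks its valencies for", init.form))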
<Paragraph position="2"> Additionally, direct messages are used to establish a dependency relation. They involve no forwarding and may be specified as follows:
defMsg <directMessage> (sender target ...)
  (if true compute)  # a direct message is always processed at the target; no distribution condition applies
Below, a number of messages of this type are used for negotiating dependencies, e.g., headFound, headAccepted, receipt (each with different parameters, as represented by &quot;...&quot; above).</Paragraph>
6 Additionally, w_n may be governed by any word actor governing w_j, but due to the synchronization implemented by the receipt protocol, each head of w_j must be located to the right of w_n.</Paragraph> </Section>
<Section position="3" start_page="382" end_page="382" type="sub_section"> <SectionTitle> 3.3 An Excerpt from the Word Actor Script </SectionTitle>
<Paragraph position="0"> The protocol for bottom-up establishment of dependencies consists of three steps: the search for a head (searchHead), the reply of a suitable head to the initiator of the search (headFound), and the acceptance by the initiator (headAccepted), which thereby becomes a modifier of the head. The corresponding method definitions are given in Table 3 (note that these methods are defined for one actor type here, but are executed by different actors during parsing). The protocol allows alternative attachments to be checked concurrently, since each actor receiving searchHead may process it locally, while the message is simultaneously distributed to its head.</Paragraph>
<Paragraph position="1"> The specification of methods as above gives a local view of an actor system, stating how each actor behaves when it receives a message. For a global view taking the actors' interaction patterns into account, cf. Schacht, Hahn & Bröker (1994).</Paragraph> </Section> </Section>
<Section position="6" start_page="382" end_page="383" type="metho"> <SectionTitle> 4 AMBIGUITY HANDLING </SectionTitle>
<Paragraph position="0"> There are two alternative processing strategies for ambiguities, viz. serial vs. parallel processing. We here focus on a parallel mode, specifying only necessary serializations.</Paragraph>
<Paragraph position="1"> Whenever an ambiguity is detected, additional actors are created to represent different readings. The standard three-step negotiation scheme for dependencies can easily be accommodated to this duplication process. When a word actor receives the second (or n-th) headFound message it does not immediately reply with a headAccepted message, but initiates the copying of itself, its modifiers, and the prospective head (which, in turn, initiates copying its modifiers and head, if any). Copying modifiers proceeds by sending a copyStructure message to each actor involved, which evokes a (standard) headAccepted message returned by the actor copy. Copying the head is done via a duplicateStructure message, which will result in another headFound message being returned. Since this headFound message is addressed to the ungoverned copy, the copy may reply as usual by sending a headAccepted message. Duplication of actors allows the concurrent processing of alternatives, and requires only limited overhead for the distribution of messages among duplicated actors.</Paragraph>
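A rough sketch of this duplication step is given below. The WordNode class, the concrete relation names, and the use of deepcopy in place of the copyStructure / duplicateStructure message exchanges are simplifications made for illustration only.

from copy import deepcopy

class WordNode:
    # minimal stand-in for a word actor: a head link plus named modifiers
    def __init__(self, form):
        self.form, self.head, self.modifiers = form, None, {}

    def accept_head(self, head, relation):     # the effect of a headAccepted reply
        self.head = head
        head.modifiers[relation] = self

def fork_reading(word, new_head, relation):
    # on a second (or n-th) headFound message, the structure around `word` is copied
    # and the new attachment is established on the copy only, so that both readings
    # can be processed concurrently
    copy = deepcopy(word)                      # copies word, its modifiers, and its head chain
    copy.head = None                           # the copy starts out ungoverned ...
    copy.accept_head(new_head, relation)       # ... and replies with headAccepted as usual
    return copy

# usage: an ambiguous PP attachment spawns a second reading
develops, notebook2, pp = WordNode("entwickelt"), WordNode("Notebook"), WordNode("mit ... Harddisk")
notebook2.accept_head(develops, "obj")             # "Notebook" depends on the verb
pp.accept_head(develops, "ppadj")                  # first reading: the PP attaches to the verb
second = fork_reading(pp, notebook2, "ppatt")      # second reading: the PP attaches to "Notebook"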
<Section position="1" start_page="383" end_page="383" type="sub_section"> <SectionTitle> 4.1 Packing Ambiguities </SectionTitle>
<Paragraph position="0"> Usually, a packed representation of ambiguous structures is preferred in the parsing literature (Tamura et al., 1991).</Paragraph>
<Paragraph position="1"> This is feasible when syntactic analysis is the only determining factor for the distribution of partial structures. But if conceptual knowledge is taken into account, the distribution of a phrase is not fully determined by its syntactic structure. Possible conceptual relations equally influence the distribution of the phrase. Additionally, the inclusion of an ambiguous phrase in a larger syntactic context requires the modification of the conceptual counterparts.</Paragraph>
<Paragraph position="2"> In a packed representation, there would have to be several conceptual counterparts, i.e., only the syntactic representation can be packed (and it might even be necessary to unpack it on-the-fly). Consequently, whenever conceptual analysis is integrated into the parsing process (as opposed to its interpretation in a later stage, thereby producing numerous ambiguities in the syntactic analysis), structure sharing is impossible, since different syntactic attachments result in different conceptual analyses, and no common structure is accessible that can be shared (cf. Akasaka (1991) for a similar argument). We expect that the overhead of duplication is compensated for by the ambiguity-reducing effects of integrating several knowledge sources.</Paragraph> </Section>
<Section position="2" start_page="383" end_page="383" type="sub_section"> <SectionTitle> 4.2 Relation to Psycholinguistic Performance Models </SectionTitle>
<Paragraph position="0"> It has been claimed that human language understanding proceeds in a more sequential mode, choosing one alternative and backtracking if that path fails (e.g., Hemforth, Konieczny & Strube, 1993). This model requires the ranking of all alternatives according to criteria referring to syntactic or conceptual knowledge. The protocol outlined so far could easily be accommodated to this processing strategy: All headFound messages must be collected, and the corresponding attachments ranked. The best attachment is selected, and only one headAccepted message is sent. In case the analysis fails, the next-best attachment would be tried, until an analysis is found or no alternatives are left. Additionally, the dependencies established during a failed path would have to be released.7</Paragraph>
<Paragraph position="1"> 7 Note that all psycholinguistic studies we know of refer to a constituency-based grammar model. Since our grammar is based on dependency relations, principles such as Minimal Attachment cannot be transferred without profound modification, since in a dependency tree the number of nodes is identical for all readings. Therefore, principles adapted to the structural properties of dependency trees must be formulated for preferential ranking.</Paragraph> </Section> </Section>
<Section position="7" start_page="383" end_page="383" type="metho"> <SectionTitle> 5 COMPARISON TO RELATED WORK </SectionTitle>
<Paragraph position="0"> The issue of object-oriented parsing and concurrency (for a survey, cf. Hahn & Adriaens, 1994) has long been considered from a purely implementational perspective. Message passing as an explicit control mechanism is inherent to various object-oriented implementations of standard rule-based parsers (cf. Yonezawa & Ohsawa (1988) for context-free and Phillips (1984) for augmented PSGs).</Paragraph>
<Paragraph position="1"> Actor-based implementations are provided by Uehara et al. (1985) for LFGs and Abney & Cole (1986) for GB grammars.
Similarly, a parallel implementation of a rule-based, syntax-oriented dependency parser has been described by Akasaka (1991). The consideration of concurrency at the grammar specification level has recently been investigated by Milward (1992), who properly relates notions from categorial and dependency grammar with a state logic approach, a formal alternative to the event-algebraic formalization underlying the ParseTalk model.</Paragraph>
<Paragraph position="2"> Almost all of these proposals lack serious accounts of the integration of syntactic knowledge with conceptual knowledge (cf. the end of Section 2.1 for similar considerations related to dependency grammars). The development of conceptual parsers (Riesbeck & Schank, 1978), however, was entirely dominated by conceptual expectations driving the parsing process and specifically provided no mechanisms to integrate linguistic knowledge into such a lexical parser in a systematic way. The pseudo-parallelism inherent to these early proposals, word expert parsing in particular (Small & Rieger, 1982), has in the meantime been replaced by true parallelism, either using parallel logic programming environments (Devos, Adriaens & Willems, 1988), actor specifications (Hahn, 1989) or a connectionist methodology (Riesbeck & Martin, 1986), while the lack of linguistic sophistication has remained.</Paragraph>
<Paragraph position="3"> A word of caution should be expressed regarding the superficial similarity between object-oriented and connectionist models. Connectionist methodology (cf. the survey by Selman (1989) of some now classical connectionist natural language parsing systems) is restricted in two ways compared with object-oriented computing. First, its communication patterns are determined by the hard-wired topology of connectionist networks, whereas in object-oriented systems the topology is flexible and reconfigurable. Second, the type and amount of data that can be exchanged in a connectionist network is restricted to marker and value passing together with severely limited computation logic (and-ing, or-ing of Boolean bit markers, determining maximum/minimum values, etc.), while none of these restrictions apply to message passing models. These considerations equally extend to spreading activation models of natural language parsing (Charniak, 1986; Hirst, 1987), which are not as constrained as connectionist models but less expressive than general message passing models underlying the object-oriented paradigm. As should be evident from the preceding exposition of the ParseTalk model, the complexity of the data exchanged and the computations performed in our case requires a full-fledged message-passing model.</Paragraph> </Section> </Paper>