CONCURRENT LEXICALIZED DEPENDENCY PARSING:
THE ParseTalk MODEL

Norbert Bröker, Udo Hahn & Susanne Schacht
Abstract. A grammar model for concurrent, object-oriented natural language parsing is introduced. Complete lexical distribution of grammatical knowledge is achieved building upon the head-oriented notions of valency and dependency, while inheritance mechanisms are used to capture lexical generalizations. The underlying concurrent computation model relies upon the actor paradigm. We consider message passing protocols for establishing dependency relations and ambiguity handling.
1 INTRODUCTION 
In this paper, we propose a grammar model that combines lexical organization of grammatical knowledge with lexicalized control of the corresponding parser in an object-oriented specification framework. Recent developments in the field of linguistic grammar theory have already yielded a rigid lexical modularization. This fine-grained decomposition of linguistic knowledge can be taken as a starting point for lexicalized control. Current lexicalized grammars (for instance, HPSG: Pollard & Sag, 1987; CG: Hepple, 1992; Lexicalized TAG: Schabes, Abeillé & Joshi, 1988), however, still consider lexical items as passive data containers whose content is uniformly interpreted by global control mechanisms (e.g., unification, functional composition, tree adjunction). Diverging from these premises, we assign full procedural autonomy to lexical units and treat them as active lexical processes communicating with each other by message passing. Thus, they dynamically establish heterogeneous communication lines in order to determine each lexical item's functional role. While the issue of lexicalized control was investigated early on in the paradigm of conceptual parsing (Riesbeck & Schank, 1978), and word expert parsing in particular (Small & Rieger, 1982), these proposals are limited in several ways. First, they do not provide any general mechanism for the systematic incorporation of grammatical knowledge. Second, they do not supply any organizing facility to formulate generalizations over sets of lexical items. Third, lexical communication is based on an entirely informal protocol that lacks any grounding in principles of distributed computing.
We intend to remedy these methodological shortcomings by designing a radically lexicalized grammar on the basis of valency and dependency (these head-oriented notions already figure in different shapes in many modern linguistic theories, e.g., as subcategorizations, case frames, theta roles), by introducing inheritance as a major organizational mechanism (for a survey of applying inheritance in modern grammar theory, cf. Daelemans, De Smedt & Gazdar, 1992), and by specifying a message passing protocol that is grounded in the actor computation model (Agha & Hewitt, 1987). As this protocol allows for asynchronous message passing, concurrency enters as a theoretical notion at the level of grammar specification, not only as an implementational feature. The ParseTalk model outlined in this paper can therefore be considered as an attempt to replace the static, global-control paradigm of natural language processing by a dynamic, local-control model.

Computational Linguistics Research Group
Freiburg University
D-79085 Freiburg, Germany
email: {nobi, hahn, sue}@coling.uni-freiburg.de
The design of such a grammar and its associated parser responds to the demands of complex language performance problems. By this, we mean understanding tasks, such as large-scale text or speech understanding, which not only require considerable portions of grammatical knowledge but also a vast amount of so-called non-linguistic, e.g., domain and discourse knowledge. A major problem then relates to the interaction of the different knowledge sources involved, an issue that is not so pressing when monolithic grammar knowledge essentially boils down to syntactic regularities. Instead of subscribing to any serial model of control, we build upon evidence from computational text understanding studies (Granger, Eiselt & Holbrook, 1986; Yu & Simmons, 1990) as well as psycholinguistic experiments, in particular those worked out for the class of interactive language processing models (Marslen-Wilson & Tyler, 1980; Thibadeau, Just & Carpenter, 1982). They reveal that various knowledge sources are accessed in an a priori unpredictable order and that a significant amount of parallel processing occurs at various stages of the (human) language processor. Therefore, computationally and cognitively plausible models of natural language understanding should account for parallelism at the theoretical level of language description. Currently, ParseTalk provides a specification platform for computational language performance modeling.1 In the future, this vehicle can be used as a testbed for the configuration of cognitively adequate parsers. Moving performance considerations to the level of grammar design is thus in strong
1 We only mention that performance issues become even more pressing when natural language understanding tasks are placed in real-world environments and thus additional complexity is added by ungrammatical natural language input, noisy data, as well as lexical, grammatical, and conceptual specification gaps. In these cases, not only multiple knowledge sources have to be balanced, but additional processing strategies must be supplied to cope with these phenomena in a robust way. This places extra requirements on the integration of procedural linguistic knowledge within a performance-oriented language analysis framework, viz. strategic knowledge how to handle incomplete or faulty language data and grammar specifications.
contrast to any competence-based account which assigns structural well-formedness conditions to the grammar level and leaves their computation to (general-purpose) parsing algorithms, often at the cost of vast amounts of ambiguous structural descriptions.
2 ParseTalk's CONCEPTUAL FRAMEWORK
The ParseTalk model is based on a fully lexicalized grammar. Grammatical specifications are given in the format of valency constraints attached to each lexical unit, on which the computation of concrete dependency relations is based. By way of inheritance the entire collection of lexical items is organized in lexical hierarchies (these constitute the lexical grammar), the lexical items forming their leaves and the intermediary nodes representing grammatical generalizations in terms of word classes. This specification is similar to various proposals currently investigated within the unification grammar community (Evans & Gazdar, 1990). The concurrent computation model builds upon and extends the formal foundations of the actor model, a theory of object-oriented computation that is based on asynchronous message passing.
2.1 The Grammar Model
The grammar model underlying the ParseTalk approach considers dependency relations between words as the fundamental notion of linguistic analysis. A modifier is said to depend on its head if the modifier's occurrence is permitted by the head but not vice versa2. Dependencies are thus asymmetric binary relations that can be established by local computations involving only two lexical items; they are tagged by dependency relation names from the set D = {spec, subj, ppatt, ...}3. Co-occurrence restrictions between lexical items are specified as sets of valencies that express various constraints a head places on permitted modifiers. These constraints incorporate the following descriptive units:

1. categorial: C = {WordActor, Noun, Substantive, Preposition, ...} denotes the set of word classes, and isa_C = {(Noun, WordActor), (Substantive, Noun), (Preposition, WordActor), ...} ⊆ C × C denotes the subclass relation yielding a hierarchical ordering in C (cf. also Fig. 1).
2. morphosyntactic: A unification formalism (similar in spirit to Shieber, 1986) is used to represent morphosyntactic regularities. It includes atomic terms from the set T = {nom, acc, ..., sg, pl, ...}, complex terms associating labels from the set L = {case, num, agr, ...} ∪ D with embedded terms, value disjunction (in curly braces), and coreferences (numbers in angle brackets). U denotes the set of allowed feature structures, ∨ the unification operation, ⊥ the inconsistent element. Given u ∈ U and l ∈ L, the expansion [l : u] denotes the complex term containing only one label, l, with value u. If u is a complex term containing l at top level, the extraction u\l is defined to be the value of l in u. By definition, u\l yields ⊥ in all other cases.

2 Although phrases are not explicitly represented (e.g., by non-lexical categories), we consider each complete subtree of the dependency tree a phrase (this convention allows discontinuous phrases as well). A dependency is not treated as a relation between words (as in Word Grammar (Hudson, 1990, p.117)), but between a word and a dependent phrase (as in Dependency Unification Grammar (Hellwig, 1988)). The root of a phrase is taken to be the representative of the whole phrase.
3 Additionally, D contains the symbol self which denotes the currently considered lexical item. This symbol occurs in feature structures (see 2. above) and in the ordering relations order and occurs (4. below).
3. conceptual: The concept hierarchy consists of a set of concept names F = {Hardware, Computer, Notebook, Harddisk, ...} and a subclass relation isa_F = {(Computer, Hardware), (Notebook, Computer), ...} ⊆ F × F. The set of conceptual role names R = {HasPart, HasPrice, ...} contains labels of possible conceptual relations (a frame-style, classification-based knowledge representation model in the spirit of MacGregor (1991) is assumed). The relation dc ⊆ F × R × F implements conceptual integrity constraints: (f, r, g) ∈ dc iff any concept subsumed by f ∈ F may be modified by any concept subsumed by g ∈ F in relation r ∈ R, e.g., (Computer, HasPart, Harddisk) ∈ dc. From dc the relation permit = {(f', r, g') ∈ F × R × F | ∃ f, g ∈ F : (f, r, g) ∈ dc ∧ f' isa_F* f ∧ g' isa_F* g} (* denotes the transitive closure) can be derived which explicitly states the range of concepts that can actually be related. For brevity, we restrict this exposition to the attribution of concepts and do not consider quantification, etc. (cf. Creary & Pollard, 1985).
4. ordering: The (word-class specific) set order ⊆ D* contains n-tuples which express ordering constraints on the valencies of each word class. Legal orders of modifiers must correspond to an element of order. The (word specific) function occurs : D → N associates dependency names with the modifier's (and self's) text position (0 for valencies not yet occupied). Both specifications appear at the lexical head only, since they refer to the head and all of its modifiers.
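The derivation of permit from dc (item 3 above) can be sketched in a few lines of Python. This is a hedged illustration only: the concept names follow the example hierarchy above, while the extra pair (Harddisk, Hardware) and all function names are our own assumptions.

```python
# Sketch: deriving the `permit` relation from the integrity constraints
# `dc` and the reflexive-transitive closure of isa_F (illustrative names).

def transitive_closure(pairs):
    """Reflexive-transitive closure of a subclass relation, returned as
    a dict mapping each concept to the set of its superconcepts
    (including itself). Assumes an acyclic hierarchy."""
    supers = {}
    def climb(c):
        if c in supers:
            return supers[c]
        result = {c}
        for (sub, sup) in pairs:
            if sub == c:
                result |= climb(sup)
        supers[c] = result
        return result
    for (sub, sup) in pairs:
        climb(sub)
        climb(sup)
    return supers

isa_F = [("Computer", "Hardware"), ("Notebook", "Computer"),
         ("Harddisk", "Hardware")]          # last pair is our assumption
dc = {("Computer", "HasPart", "Harddisk")}  # integrity constraint

supers = transitive_closure(isa_F)

def permits(f, role, g):
    """(f, role, g) is in permit iff some (f0, role, g0) in dc
    subsumes both f and g."""
    return any(role == r
               and f0 in supers.get(f, {f})
               and g0 in supers.get(g, {g})
               for (f0, r, g0) in dc)
```

Note that permit only extends dc downwards in the hierarchy: a Notebook may have a Harddisk as part (Notebook isa* Computer), but Hardware in general may not, since Hardware is not subsumed by Computer.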
With these definitions, a valency can be characterized as an element of the set V ⊆ D × C × U × 2^R. Focusing on one dependency relation from the example "Compaq entwickelt einen Notebook mit einer 120-MByte-Harddisk" ["Compaq develops a notebook with a 120-MByte hard disk"], the above criteria are illustrated in Table 1. The feature structure of the two heads, "mit" and "Notebook", is given prior to and after the establishment of the dependency relation. The concepts of each of the phrases, 120MB-HARDDISK-00004 and NOTEBOOK-00003, are stated. The order constraint of "Notebook" says that it may be preceded by a specifier (spec) and attributive adjectives (attr), and that it may be followed by prepositional phrases (ppatt). The valency for prepositional phrases described in the last row states which class, feature, and domain constraints must be fulfilled by candidate modifiers.

The predicate SATISFIES (cf. Table 2) holds when a candidate modifier fulfills the constraints stated in a specified valency of a candidate head. If SATISFIES evaluates to true, a dependency valency.name is established (object.attribute denotes the value of the property attribute at object). As can easily be verified, SATISFIES is fulfilled for the combination of "mit", the prepositional valency, and "Notebook" from Table 1.
TABLE 1. An illustration of grammatical specifications in the ParseTalk model
(lexical items, heads underlined in the original: "mit einer 120-MByte-Harddisk" and "einen Notebook")

Possible Modifier ("mit einer 120-MByte-Harddisk"):
  class    ∈ C:  Preposition
  features ∈ U (prior to and after dependency establishment):
             [self: [form: mit, pobj: [case: dat, agr: [gen: fem, num: sg]]]]
  concept  ∈ F:  120MB-HARDDISK-00004
  position ∈ N:  5

Possible Head ("einen Notebook"):
  features ∈ U, prior to dependency establishment:
             [self: [case: acc, agr: <1> = [gen: mas, num: sg]], spec: [agr: <1>]]
  features ∈ U, after dependency establishment:
             [self: [case: acc, agr: <1> = [gen: mas, num: sg]], spec: [agr: <1>],
              ppatt: [form: mit]]
  concept  ∈ F:  NOTEBOOK-00003
  order    ⊆ D*: { <spec, attr, self, ppatt> }
  occurs   : D → N: prior {(spec, 3), (attr, 0), (self, 4), (ppatt, 0)};
                    after {(spec, 3), (attr, 0), (self, 4), (ppatt, 5)}

Valency (only one of the set is considered):
  name     ∈ D:  ppatt
  class    ∈ C:  Preposition
  features ∈ U:  [ppatt: [form: mit]]
  domain   ⊆ R:  {HasHarddisk, HasPrice, ...}
SATISFIES (modifier, valency, head) :⟺
     modifier.class isa_C valency.class
  ∧ (([valency.name : (modifier.features\self)] ∨ valency.features)
        ∨ head.features) ≠ ⊥
  ∧ ∃ role ∈ valency.domain :
        (head.concept, role, modifier.concept) ∈ permit
  ∧ ∃ <d_1, ..., d_n> ∈ head.order : ∃ k ∈ {1, ..., n} :
        (valency.name = d_k
         ∧ (∀ 1 ≤ i < k : head.occurs(d_i) < modifier.position)
         ∧ (∀ k < i ≤ n : head.occurs(d_i) = 0
                          ∨ head.occurs(d_i) > modifier.position))

TABLE 2. The SATISFIES predicate
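A minimal Python rendering of SATISFIES may clarify how the four constraint layers interact. This is a sketch under simplifying assumptions: feature structures are flat dictionaries, unification ignores disjunction and coreferences, and the example data only mimics Table 1.

```python
# Illustrative rendering of the SATISFIES predicate (Table 2).
# All data structures and names are our own simplifications.

def unify(f1, f2):
    """Very simplified unification of flat feature dicts;
    returns None (standing in for bottom) on a clash."""
    result = dict(f1)
    for key, val in f2.items():
        if key in result and result[key] != val:
            return None
        result[key] = val
    return result

def satisfies(modifier, valency, head, isa_c, permit):
    # 1. categorial: modifier's word class must lie below the valency's class
    if valency["class"] not in isa_c[modifier["class"]]:
        return False
    # 2. morphosyntactic: embed the modifier's features under the valency
    #    name and unify with the valency's and the head's features
    embedded = {valency["name"]: modifier["features"]}
    u = unify(embedded, valency["features"])
    if u is None or unify(u, head["features"]) is None:
        return False
    # 3. conceptual: some role in the valency's domain must be permitted
    if not any((head["concept"], role, modifier["concept"]) in permit
               for role in valency["domain"]):
        return False
    # 4. ordering: the valency slot must fit the occupied positions
    for order in head["order"]:
        for k, name in enumerate(order):
            if name != valency["name"]:
                continue
            before_ok = all(head["occurs"].get(d, 0) < modifier["position"]
                            for d in order[:k])
            after_ok = all(head["occurs"].get(d, 0) == 0
                           or head["occurs"].get(d, 0) > modifier["position"]
                           for d in order[k + 1:])
            if before_ok and after_ok:
                return True
    return False

# Example mirroring Table 1: the "mit ..." phrase as a candidate
# ppatt modifier of "Notebook".
isa_c = {"Preposition": {"Preposition", "WordActor"}}
permit = {("Notebook", "HasHarddisk", "Harddisk")}
valency = {"name": "ppatt", "class": "Preposition", "features": {},
           "domain": {"HasHarddisk", "HasPrice"}}
head = {"features": {}, "concept": "Notebook",
        "order": [("spec", "attr", "self", "ppatt")],
        "occurs": {"spec": 3, "attr": 0, "self": 4, "ppatt": 0}}
mod = {"class": "Preposition", "features": {"form": "mit"},
       "concept": "Harddisk", "position": 5}
```

With this data, satisfies(mod, valency, head, isa_c, permit) holds; moving the modifier to a position left of the head (e.g., position 2) violates the ordering constraint and makes it fail.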
Note that unlike most previous dependency grammar formalisms (Starosta & Nomura, 1986; Hellwig, 1988; Jäppinen, Lassila & Lehtola, 1988; Fraser & Hudson, 1992) this criterion assigns equal opportunities to syntactic as well as conceptual conditions for computing valid dependency relations. Information on word classes, morphosyntactic features, and order constraints is purely syntactic, while conceptual compatibility introduces an additional description layer to be satisfied before a grammatical relation may be established (cf. Muraki, Ichiyama & Fukumochi, 1985; Lesmo & Lombardo, 1992). Note that we restrict the scope of the unification module in our framework, as only morphosyntactic features are described using this subformalism. This contrasts sharply with standard unification grammars (and with designs for dependency parsing as advocated by Hellwig (1988) and Lombardo (1992)), where virtually all information is encoded in terms of the unification formalism4.
2.1.1 A Look at Grammatical Hierarchies
The grammatical specification of a lexical entry consists of structural criteria (valencies) and behavioral descriptions (protocols). In order to capture relevant generalizations and to support easy maintenance of grammar specifications, both are represented in hierarchies (cf. Genthial, Courtin & Kowarski (1990) and Fraser & Hudson (1992) for inheritance that is restricted to structural criteria). The valency hierarchy assigns valencies to lexemes. We will not consider it in depth here, since it captures only traditional grammatical notions, like transitivity or reflexivity. The organizing principle is the subset relation on valency sets. The word class hierarchy contains word class specifications that cover distributional and behavioral properties. Fig. 1 illustrates the behavioral criterion by defining for each class different messages (the messages for WordActor are discussed in Sections 3 and 4). Within the Noun part of the word class hierarchy, there are different methods for anaphora resolution reflecting different structural constraints on possible antecedents for nominal anaphora, reflexives and personal pronouns. The word class hierarchy cannot be generated automatically, since classification of program specifications (communication protocols, in our case) falls out of the scope of state-of-the-art classifier
4 Typed unification formalisms (Emele & Zajac, 1990) would easily allow for the integration of word class information. Ordering constraints and conceptual restrictions (such as value range restrictions or elaborated integrity constraints), however, are not so easily transferable, because, e.g., the conceptual constraints go far beyond the level of atomic semantic features still prevailing in unification formalisms.
[FIGURE 1. Fragment of the word class hierarchy: the class WordActor defines the messages startUp, makeAdjacent, leftContext, searchHead, headFound, headAccepted, duplicateStructure, and copyStructure; among its subclasses is Preposition, with the lexical instance "mit".]
algorithms. On the other hand, the concept hierarchy is 
based on the subsumption relation holding between con- 
cepts, which is computed by a terminological classifier. 
Most lexicon entries refer to a corresponding domain concept and thus allow conceptual restrictions to be checked.
2.2 The Actor Computation Model 
The actor model of computation combines object-oriented features with concurrency and distribution in a methodologically clean way. It assumes a collection of independent objects, the actors, communicating via asynchronous message passing. An actor can send messages only to other actors it knows about, its acquaintances. The arrival of a message at an actor is called an event; it triggers the execution of a method that is composed of atomic actions, viz. creation of new actors (create actorType (acquaintances)), sending of messages to acquainted or newly created actors (send actor message), or specification of new acquaintances (become (acquaintances)). An actor system is dynamic, since new actors can be created and the communication topology is reconfigurable. We assume actors that process a single message at a time, step by step (Hewitt & Atkinson, 1979). For convenience, we establish a synchronous request-reply protocol (Lieberman, 1987) to compute functions such as unification of feature structures and queries to a (conceptual) knowledge base. In contrast to simple messages which unconditionally trigger the execution of a method at the receiving actor, we define complex word actor messages as full-fledged actors with independent computational abilities. Departure and arrival of complex messages are actions which are performed by the message itself, taking the sender and the target actors as parameters. Upon arrival, a complex message determines whether a copy is forwarded to selected acquaintances of its receiver and whether the receiver may process the message on its own (cf. Schacht, Hahn & Bröker (1994) for a treatment of the parser's behavioral aspects).
The following syntax elements will be used subsequently: a program contains actor definitions (declaring the acquaintances and defining the methods of actors instantiated from this definition) and actor message definitions (stating distribution and computation conditions). Method definitions contain the message key, the formal parameters and a composite action:

actorDef  ::= defActor actorType (acquaintance*)
              methodDef*
methodDef ::= meth messageKey (param*) (action)
messDef   ::= defMsg messageType (acquaintance*)
              (((if condition distributeTo tag))*
               if condition compute
               ((if condition distributeTo tag))*)
action    ::= action; action
            | if condition (action) [ else (action) ]
            | send actor messageKey (param*)
            | become (acquaintance*)
            | create actorType (acquaintance*)
            | for var in set : (action)

condition is a locally computable predicate, written as PREDICATE (actor*); actor stands for acquaintances, parameters, newly created actors, the performing actor itself (self) or the undefined value (nil); actor.acquaintance yields the corresponding acquaintance of actor; for var in set : (action) evaluates action for each element of set.
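The create/send/become primitives above can be mimicked in Python with a central event queue standing in for truly distributed, asynchronous delivery. This is a sketch only; all class and method names below are our own illustration, not part of the ParseTalk specification.

```python
# Minimal sketch of serialized actors with asynchronous message passing.
from collections import deque

scheduler = deque()  # pending (target, message_key, params) events

class Actor:
    def __init__(self, *acquaintances):
        self.acquaintances = list(acquaintances)

    def become(self, *acquaintances):
        """Replace the actor's acquaintances."""
        self.acquaintances = list(acquaintances)

    def send(self, target, key, *params):
        """Asynchronous send: merely enqueue the event."""
        scheduler.append((target, key, params))

    def receive(self, key, params):
        """Arrival of a message triggers the method named by its key."""
        getattr(self, key)(*params)

def run():
    """Process one event at a time; each actor handles one message at a time."""
    while scheduler:
        target, key, params = scheduler.popleft()
        target.receive(key, params)

# Example: a word actor that answers a ping by replying to the sender.
class WordActor(Actor):
    def __init__(self):
        super().__init__()
        self.log = []

    def ping(self, sender):
        self.log.append("ping")
        self.send(sender, "pong")

    def pong(self):
        self.log.append("pong")

a, b = WordActor(), WordActor()
a.send(b, "ping", a)
run()
# afterwards: b.log == ["ping"], a.log == ["pong"]
```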
3 A SIMPLIFIED PROTOCOL FOR ESTABLISHING DEPENDENCY RELATIONS
The protocol described below allows dependency relations to be established. It integrates structural restrictions on dependency trees and provides for domesticated concurrency.
3.1 Synchronizing Actor Activities: Reception Protocol 
A reception protocol allows an actor to determine when all events (transitively) caused by a message have terminated. This is done by sending replies back to the initiator of the message. Since complex messages can be quasi-recursively forwarded, the number of replies cannot be determined in advance. In addition, each actor receiving such a message may need an arbitrary amount of processing time to terminate the actions caused by the message (e.g., the establishment of a dependency relation requires communication via messages that takes indeterminate time). Therefore, each actor receiving the message must reply to the initiator once it has terminated processing, informing the initiator to which actors the message has been forwarded.

A message is a reception message if (1) the receiver is required to (asynchronously) reply to the initiator with a receipt message, and (2) the initiator queues a reception task. An (explicit) receipt message is a direct message containing a set of actor identities as a parameter. This set indicates to which actors the reception message has been forwarded or delegated. The enclosed set enables the receiver (which is the initiator of the reception message) to wait until all receipt messages have arrived5. In addition to explicit receipts, which are messages solely used for termination detection, there are regular messages that serve a similar purpose besides their primary function within the parsing process. They are called implicit receipt messages (one example is the headAccepted message described in Section 3.3). A reception task consists of a set of partial descriptions of the messages that must be received (implicit as well as explicit), and an action to be executed after all receipts have arrived (usually, sending a message).
5 This, of course, only happens if the distribution is limited: The searchHead message discussed below is only distributed to the head of each receiver, which must occur in the same sentence. This ensures a finite actor collection to distribute the message to, and guarantees that the reception task is actually triggered.
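The bookkeeping performed by a reception task can be sketched as follows. This is an illustrative Python fragment with invented names; for simplicity it assumes that a receipt naming a forwardee always arrives before that forwardee's own receipt.

```python
# Sketch of termination detection via receipts: the initiator tracks
# which actors still owe a receipt; each receipt names the actors the
# reception message was forwarded to, which are then awaited in turn.

class ReceptionTask:
    def __init__(self, first_receivers, on_complete):
        self.pending = set(first_receivers)  # actors that owe a receipt
        self.on_complete = on_complete
        self.done = False

    def receipt(self, sender, forwarded_to):
        """Sender has terminated; the actors it forwarded the
        reception message to now owe receipts as well."""
        self.pending.discard(sender)
        self.pending |= set(forwarded_to)
        if not self.pending and not self.done:
            self.done = True
            self.on_complete()

# Example: w1 forwards the message to w2, then both terminate.
finished = []
task = ReceptionTask({"w1"}, lambda: finished.append(True))
task.receipt("w1", {"w2"})   # w1 done, but it forwarded to w2
assert not finished          # still waiting for w2's receipt
task.receipt("w2", set())    # w2 done, nothing forwarded
assert finished == [True]    # reception task fires its action
```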
defActor wordActor (head deps vals feats ...)     # head, dependencies, valencies, and features acquaintances

meth searchHead (sender target init)              # processed at candidate heads (init from the message definition)
  (for val in vals:                               # check all valencies of the possible head
    (if SATISFIES (init val self)                 # valency check adapted from Table 2
       (send (create headFound (self init val.name feats\val.name)) departure);  # reply to initiator, imposing restrictions
       become (head deps vals (feats ∨ init.feats) ...)            # expand grammatical description of head
     else (send (create receipt (self init {head})) departure)))   # send a receipt with the head the message was forwarded to
                                                  # departure realizes the departure of a complex message

meth headFound (sender target name headFeats)     # processed at the initiator of a searchHead message
  (send (create headAccepted (self sender name)) departure);       # reply to head
  become (sender deps vals (feats ∨ headFeats) ...)                # store sender as head of self, restrict self's features

meth headAccepted (modifier target name)          # processed at the head only
  (for dep in deps:                               # check all dependencies
    (if (name = dep.name)                         # relation name is identical
       (send dep store (modifier))));             # send the dependency the message store to store the modifier
  send (create receipt (self modifier {head})) departure           # send a receipt with the head the message was forwarded to

TABLE 3. Method definitions for searchHead, headFound, headAccepted
3.2 Encoding Structural Restrictions 
Word actors conduct a bottom-up search for possible heads; the principle of non-crossing arcs (projectivity of the dependency tree) is guaranteed by the following forwarding mechanism. Consider the case of a newly instantiated word actor wn searching its head to the left (the opposite direction is handled in a similar way). In order to guarantee projectivity one has to ensure that only word actors occupying the outer fringe of the dependency structure (between the current absolute head wj and the rightmost element wn-1) receive the search message of wn (these are circled in Fig. 2)6. This forwarding scheme is reflected in the following simplified message definition:

defMsg searchHead (sender target initiator)
  ((if GOVERNED (target) distributeTo head)
   # forward a copy to head, identified by head ∈ D
   if true compute)
   # the message is always processed at the target;
   # the computation event is concretized in the word
   # actor specification in Table 3

Thus, a message searching for a head of its initiator is locally processed at each actor receiving it, and is forwarded to the head of each receiver, if one already exists.
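The effect of this forwarding scheme can be illustrated by walking the head chain explicitly. In the sketch below, dictionaries stand in for word actors and all names are our own; the real delivery is asynchronous, whereas the loop is a deliberately serialized stand-in.

```python
# Sketch of searchHead forwarding: the message is processed locally at
# every receiver, and a copy travels up the receiver's head chain, so
# only actors on the fringe between the rightmost word and the current
# absolute head are visited.

def deliver_search_head(first_receiver, process):
    """Walk up the head chain, processing the message at each actor."""
    visited = []
    actor = first_receiver
    while actor is not None:
        process(actor)             # local computation at the target
        visited.append(actor["name"])
        actor = actor["head"]      # forward a copy iff GOVERNED(target)
    return visited

# Example fringe: w_k (rightmost) governed by w_m, governed by w_j
# (the current absolute head).
w_j = {"name": "w_j", "head": None}
w_m = {"name": "w_m", "head": w_j}
w_k = {"name": "w_k", "head": w_m}
assert deliver_search_head(w_k, lambda a: None) == ["w_k", "w_m", "w_j"]
```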
[FIGURE 2. Forwarding a search message. (i, j, k: text positions)]
Additionally, direct messages are used to establish a dependency relation. They involve no forwarding and may be specified as follows:

defMsg <directMessage> (sender target ...)
  (if true compute)
  # a direct message is always processed at the
  # target, no distribution condition can apply

Below, a number of messages of this type are used for negotiating dependencies, e.g., headFound, headAccepted, receipt (each with different parameters, as represented by "..." above).
6 Additionally, wn may be governed by any word actor governing wj, but due to the synchronization implemented by the receipt protocol, each head of wj must be located to the right of wn.
3.3 An Excerpt from the Word Actor Script
The protocol for bottom-up establishment of dependencies consists of three steps: the search for a head (searchHead), the reply of a suitable head to the initiator of the search (headFound), and the acceptance by the initiator (headAccepted), thereby becoming a modifier of the head. The corresponding method definitions are given in Table 3 (note that these methods are defined for one actor type here, but are executed by different actors during parsing). The protocol allows alternative attachments to be checked concurrently, since each actor receiving searchHead may process it locally, while the message is simultaneously distributed to its head.

The specification of methods as above gives a local view of an actor system, stating how each actor behaves when it receives a message. For a global view taking the actors' interaction patterns into account, cf. Schacht, Hahn & Bröker (1994).
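The three-step negotiation can be re-enacted synchronously in a few lines of Python. This is a deliberately serialized sketch of the asynchronous protocol; the data structures and the trivial SATISFIES stand-in are our own simplifications.

```python
# Compressed re-enactment of searchHead -> headFound -> headAccepted.

def search_head(modifier, candidate_heads, satisfies):
    """The modifier asks candidate heads; a satisfied head answers with
    headFound; the modifier accepts the first offer (headAccepted)."""
    offers = []
    for head in candidate_heads:                   # searchHead
        for val in head["vals"]:
            if satisfies(modifier, val, head):
                offers.append((head, val["name"])) # headFound
    if not offers:
        return None
    head, name = offers[0]                         # headAccepted
    modifier["head"] = head
    head["deps"].append((name, modifier))
    return name

# Example: a noun offering a ppatt valency to a prepositional phrase.
noun = {"vals": [{"name": "ppatt", "cls": "Preposition"}], "deps": []}
prep = {"cls": "Preposition", "head": None}
ok = search_head(prep, [noun],
                 lambda m, v, h: m["cls"] == v["cls"])  # stand-in check
```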
4 AMBIGUITY HANDLING
There are two alternative processing strategies for ambiguities, viz. serial vs. parallel processing. We here focus on a parallel mode, specifying only necessary serializations. Whenever an ambiguity is detected, additional actors are created to represent different readings. The standard three-step negotiation scheme for dependencies can easily be accommodated to this duplication process. When a word actor receives the second (or n-th) headFound message it does not immediately reply with a headAccepted message, but initiates the copying of itself, its modifiers, and the prospective head (which, in turn, initiates copying its modifiers and head, if any). Copying modifiers proceeds by sending a copyStructure message to each actor involved, which evokes a (standard) headAccepted message returned by the actor copy. Copying the head is done via a duplicateStructure message, which will result in another headFound message to be returned. Since this headFound message is addressed to the ungoverned copy, the copy may reply as usual by sending a headAccepted message. Duplication of actors allows the concurrent processing of alternatives, and requires only limited overhead for the distribution of messages among duplicated actors.
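The duplication step can be sketched as follows. This is purely illustrative Python: deepcopy stands in for the copyStructure/duplicateStructure message exchange, and all names are our own.

```python
# Sketch of ambiguity handling by duplication: on a second headFound
# offer, the word is copied together with its dependents so that both
# attachments can be pursued concurrently.
import copy

def on_head_found(word, offer, readings):
    """First offer: attach directly. Later offers: duplicate the word
    (and, recursively, its modifiers) and attach the copy instead."""
    if word["head"] is None:
        word["head"] = offer
        readings.append(word)
    else:
        duplicate = copy.deepcopy(word)  # copyStructure, simplified
        duplicate["head"] = offer
        readings.append(duplicate)

# Example: a PP with two attachment offers (low and high).
readings = []
pp = {"name": "mit ...", "head": None, "deps": []}
on_head_found(pp, "Notebook", readings)    # low attachment
on_head_found(pp, "entwickelt", readings)  # high attachment, duplicated
```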
4.1 Packing Ambiguities 
Usually, a packed representation of ambiguous structures is preferred in the parsing literature (Tamura et al., 1991).
This is feasible when syntactic analysis is the only deter- 
mining factor for the distribution of partial structures. But 
if conceptual knowledge is taken into account, the distri- 
bution of a phrase is not fully determined by its syntactic 
structure. Possible conceptual relations equally influence 
the distribution of the phrase. Additionally, the inclusion 
of an ambiguous phrase in a larger syntactic context 
requires the modification of the conceptual counterparts. 
In a packed representation, there would have to be several 
conceptual counterparts, i.e., only the syntactic representa- 
tion can be packed (and it might even be necessary to 
unpack it on-the-fly). Consequently, whenever conceptual 
analysis is integrated into the parsing process (as opposed 
to its interpretation in a later stage, thereby producing 
numerous ambiguities in the syntactic analysis), structure 
sharing is impossible, since different syntactic attachments 
result in different conceptual analyses, and no common 
structure is accessible that can be shared (cf. Akasaka 
(1991) for a similar argument). We expect that the over- 
head of duplication is compensated for by the ambiguity- 
reducing effects of integrating several knowledge sources. 
4.2 Relation to Psycholinguistic Performance Models
It has been claimed that human language understanding proceeds in a more sequential mode, choosing one alternative and backtracking if that path fails (e.g., Hemforth, Konieczny & Strube, 1993). This model requires the ranking of all alternatives according to criteria referring to syntactic or conceptual knowledge. The protocol outlined so far could easily be accommodated to this processing strategy: All headFound messages must be collected, and the corresponding attachments ranked. The best attachment is selected, and only one headAccepted message sent. In case the analysis fails, the next-best attachment would be tried, until an analysis is found or no alternatives are left. Additionally, the dependencies established during a failed path would have to be released.7
5 COMPARISON TO RELATED WORK
The issue of object-oriented parsing and concurrency (for 
a survey, cf. Hahn & Adriaens, 1994) has long heen con- 
sidered from a purely implementational perspective. Mes- 
sage passing as an explicit control mechanism is inherent 
to various object-oriented inaplementations of standard 
rule-based parsers (cf. Yonezawa & Ohsawa (1988) for 
context-free and Phillips (1984) for augmented PSGs). 
Actor-based implementations are provkted by Uehara et 
al. (1985) for LFGs and Abney & Cole (1986) for GP, 
grammars. Similarly, a parallel implementation of a rule- 
7 Note that all psycholinguistic studies we know of are referring to a con-
stituency-based grammar model. Since our grammar is based on
dependency relations, principles such as Minimal Attachment cannot
be transferred without profound modification, since in a dependency
tree the number of nodes is identical for all readings. Therefore, princi-
ples adapted to the structural properties of dependency trees must be
formulated for preferential ranking.
based, syntax-oriented dependency parser has been
described by Akasaka (1991). The consideration of con-
currency at the grammar specification level has recently
been investigated by Milward (1992), who properly relates
notions from categorial and dependency grammar with a
state logic approach, a formal alternative to the event-alge-
braic formalization underlying the ParseTalk model.
Almost all of these proposals lack serious accounts of
the integration of syntactic knowledge with conceptual
knowledge (cf. the end of Section 2.1 for similar consider-
ations related to dependency grammars). The development
of conceptual parsers (Riesbeck & Schank, 1978), how-
ever, was entirely dominated by conceptual expectations
driving the parsing process and specifically provided no
mechanisms to integrate linguistic knowledge into such a
lexical parser in a systematic way. The pseudo-parallelism
inherent to these early proposals, word expert parsing in
particular (Small & Rieger, 1982), has in the meantime
been replaced by true parallelism, either using parallel
logic programming environments (Devos, Adriaens &
Willems, 1988), actor specifications (Hahn, 1989) or a
connectionist methodology (Riesbeck & Martin, 1986),
while the lack of linguistic sophistication has remained.
A word of caution should be expressed regarding the
superficial similarity between object-oriented and connec-
tionist models. Connectionist methodology (cf. a survey
by Selman (1989) of some now classical connectionist nat-
ural language parsing systems) is restricted in two ways
compared with object-oriented computing. First, its com-
munication patterns are determined by the hard-wired
topology of connectionist networks, whereas in object-ori-
ented systems the topology is flexible and reconfigurable.
Second, the type and amount of data that can be exchanged
in a connectionist network is restricted to marker and
value passing together with severely limited computation
logic (and-ing, or-ing of Boolean bit markers, determining
maximum/minimum values, etc.), while none of these re-
strictions apply to message passing models. These consid-
erations equally extend to spreading activation models of
natural language parsing (Charniak, 1986; Hirst, 1987)
which are not as constrained as connectionist models but
less expressive than general message passing models
underlying the object-oriented paradigm. As should be
evident from the preceding exposition of the ParseTalk
model, the complexity of the data exchanged and compu-
tations performed, in our case, requires a full-fledged mes-
sage-passing model.
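The contrast between the two restrictions can be made concrete with a deliberately simplified sketch. Both functions below are our own illustration (not drawn from any of the cited systems): the first confines a node to the Boolean and max/min operations available under marker and value passing, while the second lets a receiver inspect and transform an arbitrarily structured message, as in an actor system.

```python
# Marker/value passing: computation restricted to and-ing/or-ing
# Boolean bit markers and comparing numeric activation values.
def marker_node(bits_in: list, values_in: list) -> tuple:
    return all(bits_in), max(values_in)  # and-ing markers, maximum value

# Message passing: a "message" may carry a full feature structure that
# the receiving object inspects and rewrites with unrestricted logic.
# The message type "headFound" is reused here purely for illustration.
def actor_receive(message: dict) -> dict:
    if message.get("type") == "headFound":
        return {**message,
                "status": "ranked",
                "score": len(message.get("features", {}))}
    return message

bit, val = marker_node([True, True, False], [0.2, 0.7])
reply = actor_receive({"type": "headFound",
                       "features": {"case": "nom", "num": "sg"}})
```

The point of the contrast: no amount of wiring lets marker_node return a feature structure, whereas actor_receive is limited only by what the programming language can express.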
6 CONCLUSIONS
The ParseTalk model of natural language understanding
aims at the integration of a lexically distributed, depen-
dency-based grammar specification with a solid formal
foundation for concurrent, object-oriented parsing (cf.
Hahn, Schacht & Bröker (forthcoming) for a more elabo-
rate presentation). It conceives communication among
and within different knowledge sources (grammar, domain
and discourse knowledge) as the backbone for complex
language understanding tasks. The main specification ele-
ments of the grammar model consist of categorial, mor-
phosyntactic, conceptual, and ordering constraints in terms
of valency specifications attached to single lexical items.
The associated concurrent computation model is based on
the actor paradigm of object-oriented programming. The
ParseTalk model has been experimentally validated by a
prototype system, a parser for German (for its implementa-
tional status, cf. Schacht, Hahn & Bröker, 1994).
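The specification elements just summarized, four constraint types bundled into valencies attached to single lexical items, can be pictured as a simple record structure. The field and class names below are our own illustrative choices, not the actual ParseTalk interface:

```python
# Illustrative sketch (hypothetical names): a lexical item carries valency
# specifications combining categorial, morphosyntactic, conceptual, and
# ordering constraints, as summarized in the conclusions.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Valency:
    category: str          # categorial constraint on the dependent
    morphosyntax: Dict     # agreement features the dependent must carry
    concept: str           # conceptual role the filler must play
    order: str             # linear order relative to the head

@dataclass
class LexicalItem:
    form: str
    valencies: List[Valency] = field(default_factory=list)

    def admits(self, cat: str, features: Dict) -> bool:
        """Does some valency license a dependent of this category
        with these features?"""
        return any(v.category == cat and
                   all(features.get(k) == val
                       for k, val in v.morphosyntax.items())
                   for v in self.valencies)

# a German verb licensing an accusative nominal dependent
verb = LexicalItem("sieht",
                   [Valency("noun", {"case": "acc"}, "patient", "before")])
```

In the actual model such a record would belong to an active lexical actor rather than a passive data container; the sketch shows only the declarative side of a valency.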
Acknowledgments
The work reported in this paper is funded by grants from DFG (grants no.
Ha 2097/1-1, Ha 2097/1-2) within a special research programme on cog-
nitive linguistics. We like to thank our colleagues, P. Neuhaus, K. Schnat-
tinger, M. Klenner, and Th. Hanneforth, for valuable comments and sup-
port.
REFERENCES
ABNEY, S. & COLE, J. (1986). A government-binding parser. Proc. 16th
NELS. pp.1-17.
AGHA, G. & HEWITT, C. (1987). Concurrent programming using
actors. In A. Yonezawa & M. Tokoro, Eds. Object-Oriented Concur-
rent Programming. pp.37-53. MIT Press.
AKASAKA, K. (1991). Parallel parsing system based on dependency
grammar. In C. Brown & G. Koch, Eds. Natural Language Under-
standing and Logic Programming, III. pp.147-157. North-Holland.
CHARNIAK, E. (1986). A neat theory of marker passing. AAAI-86:
Proc. 5th National Conf. on Artificial Intelligence. Vol.1. pp.584-
588.
CREARY, L.G. & POLLARD, C.J. (1985). A computational semantics
for natural language. Proc. 23rd Annual Meeting of the Association
for Computational Linguistics. pp.172-179.
DAELEMANS, W.; De SMEDT, K. & GAZDAR, G. (1992). Inheritance
in natural language processing. Computational Linguistics, 18 (2),
205-218.
DEVOS, M.; ADRIAENS, G. & WILLEMS, Y. (1988). The parallel
expert parser (PEP): a thoroughly revised descendant of the word
expert parser (WEP). COLING '88: Proc. 12th Intl. Conf. on Com-
putational Linguistics. Vol.1. pp.142-147.
EMELE, M. & ZAJAC, R. (1990). Typed unification grammars. COL-
ING '90: Proc. 13th Intl. Conf. on Computational Linguistics. Vol.3.
pp.293-298.
EVANS, R. & GAZDAR, G. (1990). The DATR Papers, Vol.1. Univ. of
Sussex, Brighton. (Cognitive Science Research Paper, CSRP 139).
FRASER, N.M. & HUDSON, R.A. (1992). Inheritance in word grammar.
Computational Linguistics, 18 (2), 133-158.
GENTHIAL, D.; COURTIN, J. & KOWARSKI, I. (1990). Contribution
of a category hierarchy to the robustness of syntactic parsing. COL-
ING '90: Proc. 13th Intl. Conf. on Computational Linguistics. Vol.2.
pp.139-144.
GRANGER, R.; EISELT, K. & HOLBROOK, J. (1986). Parsing with
parallelism: a spreading-activation model of inference processing
during text understanding. In J.L. Kolodner & C.K. Riesbeck, Eds.
Experience, Memory, and Reasoning. pp.227-246. L. Erlbaum.
HAHN, U. (1989). Making understanders out of parsers: semantically
driven parsing as a key concept for realistic text understanding ap-
plications. International J. of Intelligent Systems, 4 (3), 345-393.
HAHN, U. & ADRIAENS, G. (1994). Parallel natural language process-
ing: background and overview. In G. Adriaens & U. Hahn, Eds.
Parallel Natural Language Processing. pp.1-134. Ablex.
HAHN, U.; SCHACHT, S. & BRÖKER, N. (forthcoming). Concurrent,
object-oriented natural language parsing: the ParseTalk model.
International Journal of Human-Computer Studies, Special Issue
on Object-oriented Approaches in Artificial Intelligence and
Human-Computer Interaction.
HELLWIG, P. (1988). Chart parsing according to the slot and filler prin-
ciple. COLING '88: Proc. 12th Intl. Conf. on Computational Lin-
guistics. Vol.1. pp.242-244.
HEMFORTH, B.; KONIECZNY, L. & STRUBE, G. (1993). Incremental
syntax processing and parsing strategies. Proc. 15th Annual Conf.
of the Cognitive Science Society. pp.539-545.
HEPPLE, M. (1992). Chart parsing Lambek grammars: modal extensions
and incrementality. COLING '92: Proc. 15th Intl. Conf. on Compu-
tational Linguistics. Vol.1. pp.134-140.
HEWITT, C. & ATKINSON, R. (1979). Specification and proof tech-
niques for serializers. IEEE Transactions on Software Engineering,
SE-5 (1), 10-23.
HIRST, G. (1987). Semantic Interpretation and the Resolution of Ambi-
guity. Cambridge University Press.
HUDSON, R. (1990). English Word Grammar. Basil Blackwell.
JÄPPINEN, H.; LASSILA, E. & LEHTOLA, A. (1988). Locally gov-
erned trees and dependency parsing. COLING '88: Proc. 12th Intl.
Conf. on Computational Linguistics. Vol.1. pp.275-277.
LESMO, L. & LOMBARDO, V. (1992). The assignment of grammatical
relations in natural language processing. COLING '92: Proc. 15th
Intl. Conf. on Computational Linguistics. Vol.4. pp.1090-1094.
LIEBERMAN, H. (1987). Concurrent object-oriented programming in
Act 1. In A. Yonezawa & M. Tokoro, Eds. Object-Oriented Concur-
rent Programming. pp.9-36. MIT Press.
LOMBARDO, V. (1992). Incremental dependency parsing. Proc. 30th
Annual Meeting of the Association for Computational Linguistics.
pp.291-293.
MacGREGOR, R. (1991). The evolving technology of classification-
based knowledge representation systems. In J. Sowa, Ed. Principles
of Semantic Networks. pp.385-400. Morgan Kaufmann.
MARSLEN-WILSON, W. & TYLER, L. (1980). The temporal structure
of spoken language understanding: the perception of sentences and
words in sentences. Cognition, 8, 1-71.
MILWARD, D. (1992). Dynamics, dependency grammar and incremental
interpretation. COLING '92: Proc. 15th Intl. Conf. on Computa-
tional Linguistics. Vol.4. pp.1095-1099.
MURAKI, K.; ICHIYAMA, Sh. & FUKUMOCHI, Y. (1985). Aug-
mented dependency grammar: a simple interface between the gram-
mar rule and the knowledge. Proc. 2nd Conf. European Chapter of
the Association for Computational Linguistics. pp.198-204.
PHILLIPS, B. (1984). An object-oriented parser. In B.G. Bara & G.
Guida, Eds. Computational Models of Natural Language Process-
ing. pp.297-321. North-Holland.
POLLARD, C. & SAG, I. (1987). Information-Based Syntax and Seman-
tics. Vol.1: Fundamentals. Chicago University Press.
RIESBECK, C. & MARTIN, C. (1986). Direct memory access parsing.
In J.L. Kolodner & C.K. Riesbeck, Eds. Experience, Memory and
Reasoning. pp.209-226. L. Erlbaum.
RIESBECK, C. & SCHANK, R. (1978). Comprehension by computer:
expectation-based analysis of sentences in context. In W.J.M. Lev-
elt & G.B. Flores d'Arcais, Eds. Studies in the Perception of Lan-
guage. pp.247-293. J. Wiley.
SCHABES, Y.; ABEILLE, A. & JOSHI, A. (1988). Parsing strategies
with 'lexicalized' grammars: application to tree adjoining gram-
mars. COLING '88: Proc. 12th Intl. Conf. on Computational Lin-
guistics. Vol.2. pp.578-583.
SCHACHT, S.; HAHN, U. & BRÖKER, N. (1994). Concurrent lexical-
ized dependency parsing: a behavioral view on ParseTalk events.
COLING '94: Proc. 15th Intl. Conf. on Computational Linguistics.
(this volume)
SHIEBER, S. (1986). An Introduction to Unification-Based Approaches
to Grammar. Chicago University Press.
SELMAN, B. (1989). Connectionist systems for natural language under-
standing. Artificial Intelligence Review, 3, 23-31.
SMALL, S. & RIEGER, C. (1982). Parsing and comprehending with
word experts (a theory and its realization). In W. Lehnert & M. Rin-
gle, Eds. Strategies for Natural Language Processing. pp.89-147.
L. Erlbaum.
STAROSTA, St. & NOMURA, H. (1986). Lexicase parsing: a lexicon-
driven approach to syntactic analysis. COLING '86: Proc. 11th Intl.
Conf. on Computational Linguistics. pp.127-132.
TAMURA, N.; IIOS, M.; MURAKAMI, H.; NISHIDA, O.; YOSHIMI,
T. & JELINEK, J. (1991). Lazy evaluation of preference on a
packed shared forest without unpacking. In C.G. Brown & G. Koch,
Eds. Natural Language Understanding and Logic Programming,
III. pp.13-26. North-Holland.
THIBADEAU, R.; JUST, M. & CARPENTER, P. (1982). A model of the
time course and content of reading. Cognitive Science, 6, 157-203.
UEHARA, K.; OCHITANI, R.; MIKAMI, O. & TOYODA, J. (1985). An
integrated parser for text understanding: viewing parsing as passing
messages among actors. In V. Dahl & P. Saint-Dizier, Eds. Natural
Language Understanding and Logic Programming. Proc. of the 1st
Intl. Workshop. pp.79-95. North-Holland.
YONEZAWA, A. & OHSAWA, I. (1988). Object-oriented parallel pars-
ing for context-free grammars. COLING '88: Proc. 12th Intl. Conf.
on Computational Linguistics. Vol.2. pp.773-778.
YU, Y. & SIMMONS, R. (1990). Truly parallel understanding of text.
AAAI-90: Proc. 8th National Conf. on Artificial Intelligence. Vol.2.
pp.996-1001.
