A Parser based on Connectionist Model 
Hiroshi NAKAGAWA, Tatsunori MORI
Dept. of Electronics and Computer Engineering, Yokohama National University
156 Tokiwadai, Hodogaya-ku, Yokohama 240 Japan
net-mail-address: a36646@ec.cent.u-tokyo.junet
Abstract 
This paper proposes a parser based fully upon the
connectionist model (called the "CM parser" hereafter). In
order to realize the CM parser, we use Sigma-Pi
units to implement the constraint of grammatical
category order or word order, and a copy mechanism for
sub-parse trees. Furthermore, we suppose there exist
weak suppressive connection links between every pair
of CM units. By these suppressive links, our CM
parser explains why garden path sentences and/or
deeply nested sentences are hard to recognize. Our CM
parser also explains the preference principles for
syntactically ambiguous sentences.
1. Introduction
In order to clarify the human parsing mechanism
for natural language sentences, there remain some
phenomena that are difficult to explain by one
integrated principle. These phenomena include the
cognitive difficulty of recognizing garden path
sentences or deeply nested sentences, and the preference
among readings of structurally ambiguous sentences. All the parsing
mechanisms proposed so far, for instance top-down
parsing /Pereira 1980/, left corner parsing
/Johnson-Laird 1983/, Marcus's parsing model /Marcus
1980/, Shieber's shift-reduce parser /Shieber 1983/,
and so on, have not yet succeeded in explaining all of
these phenomena under one simple integrated
principle. Note that all of them are based on the symbol
manipulation paradigm.
Recently the connectionist model (called the CM
hereafter) approach has drawn attention in many areas of
cognitive science, including natural language
recognition. This approach has some advantages that
the symbol manipulation approaches do not have. One
advantage is that it is easy to use not only
syntactic information but also semantic and/or
contextual information in a uniform manner /Reilly
1984/. One fruitful result of this approach is the
explanation of the recognition of semantic garden path
sentences like "The astronomer married the star"
/Waltz 1985/. Another advantage is as follows. Since
the connectionist model is a parallel system without
any central controller, and the activation level of
each unit and the connection strength between units may
be represented as continuous values, it allows much
more flexible approaches than symbol manipulation
approaches do. We also expect that it can simulate
some aspects of human mental processing in sentence
parsing.
This paper is concerned with the second
advantage in parsing. The paper proposes a CM parser
which can explain the above-mentioned phenomena, such as
preferences, under one integrated principle.
2. Parser based on connectionist model
Here we omit the technical details of the CM
/McClelland & Rumelhart 1986/, but we must make clear
that we take the so-called "localist" view, in
which one symbol corresponds to one unit. Therefore
in our CM parser, syntactic categories like noun
phrase are each represented by a unit in the CM, and a
parse tree is represented as a network in which
suitably activated syntactic categories are
connected. In order to realize a CM parser, we have
to solve the following two problems:
(1) How to express a word order or an order of syntactic
categories appearing in phrase structure
rules. For example, in a rule S -> NP VP, NP must
precede VP.
(2) How to represent the case when a parse tree is
generated by recursive phrase structure rules.
Consider rules as follows: S -> NP VP, NP -> NP S̄ and
S̄ -> Comp S. The same pattern, in this case the pattern
corresponding to S -> NP VP, may appear more than once in
the parse tree of one sentence. In order to represent
this case, we need a mechanism for copying the partial
parse tree pattern corresponding to a phrase
structure rule in a connection network. Otherwise we
would have to prepare an infinite number of copies of each
partial parse tree pattern in advance. Of course this
preparation is unrealistic, not on computer hardware
but on human wetware. In Fanty's CM parser mentioned
in /McClelland & Kawamoto 1986/, the length of sentences
is limited because of this preparation.
2.1 Phrase structure sub-network 
Consider the next rule.
C -> A B     (3)
This rule has at least two meanings. One is that the
category C consists of the category A and the
category B. The other is that the category B follows
the category A. This second meaning is concerned directly
with problem (1). To represent the case where a word
coincides with some syntactic category, we
modify (3) as follows.
C -> word
Since this rule is a variant of a rule of type (3), we
study only rules of type (3) hereafter. We will
explain the sub-network that corresponds to the
phrase structure rule (3).
We solve problem (1) by introducing a
trigger link, presented as -t-> in figures.
Namely, "A -t-> B" expresses that B follows A. From
the viewpoint of the CM, the meaning of this trigger
link is that the unit for category B (called the "B
unit" hereafter) can be activated only when the unit
for category A (called the "A unit" hereafter) is fully
activated. Due to the trigger link, the A unit must
be activated chronologically earlier than the B unit.
The trigger link is realized by a Sigma-Pi unit
/McClelland & Rumelhart 1986/ that includes a
multiply operation. Figure 1 shows the concept of a
Sigma-Pi unit in the CM.
Figure 1. Sigma-Pi-Unit 
In Figure 1, B and C are CM units. They send outputs
whose values are fb and fc, expressed as positive
values, to the A unit. These values correspond
to the B and C units' activation levels
respectively. W1A is the weight of the link from B and C
to A. The input to the A unit is as follows.
W1A * fb * fc
If the B unit's activation level fb = 0, then the C
unit's activation level is not transmitted to the A
unit at all. In other words, the B (or C) unit's
activation level is an on-off switch for activation
transmission from the C (or B) unit to the A unit.
Using Sigma-Pi units, the sub-network of phrase
structure rule (3) is represented as shown in Figure
2. The weight WA>B is very small in this case, but
note that it depends on some semantic information.
Figure 2. Sub-network of C -> A B (the C unit is fed by links of weight WA>C and WB>C)
This network will be presented in a simpler form,
using the trigger link "A -t-> B", hereafter, as shown
in Figure 3. The structures of the A-, B-, and C-connectors
appearing in Figure 3 are explained in Section 2.3.
Figure 3. Simpler form of Figure 2's network, with A-, B-, and C-connectors
2.2 Copying sub-network
Our final goal is to make clear a mechanism for
building a parse tree for a whole sentence by
connecting sub-networks. For this purpose, the
simplest method is to prepare parse trees of all
possible sentence structures. In principle this
method is impossible, because there is an infinite
number of possible sentence structures. Another method
is to prepare a number of copies of a sub-network for
each phrase structure rule in advance: for example,
ten sub-networks of S -> NP VP, ten sub-networks of
VP -> V NP, and so on. When the parser reads a
sentence, it selects some sub-networks from this
prepared set and connects them to make a parse tree
of the input sentence. This method seems to work well
and solves the above-mentioned problem (2).
Unfortunately it has a serious deficiency, as follows.
From the viewpoint of learning in the CM, all the
connection weights of sub-networks are learned by
parsing or recognizing a number of sentences. It is a
plausible hypothesis that once a human becomes able
to parse some sentence structure, he/she can parse
that structure ever after. In order to explain this
hypothesis, the above-mentioned weight learning must
be done uniformly for all copies of sub-networks of
the same phrase structure rule. But such uniform
learning is too artificial for human mental learning
processes.
A solution that avoids these difficulties is as
follows. There is only one central sub-network for
each phrase structure rule, and all learning is done
on it. In parsing, when the parser needs a
sub-network of some rule, it makes copies of the
sub-network and connects them into a suitable place
in the parse tree under construction.
The sub-network copying mechanism is implemented
as an application of the connection information
distribution (CID) mechanism /McClelland 1986/.
Figure 4 is a simple example of copying. The
programmable sub-networks are implemented with
Sigma-Pi units. There are many yet-to-be-programmed,
i.e. blank, sub-networks. When the input comes in,
the corresponding connection pattern of the central
network is copied to the programmable sub-networks
via the connection activation system. In order to
implement a copying mechanism for phrase structure
rules of the form C -> A B, we use three CID
mechanisms: for the bidirectional connections
between the A unit and the C unit, between the B unit
and the C unit, and between the A unit and the B unit
respectively. We omit the further details because of
the limited paper space.
Figure 4. A simple connection information distribution (CID) mechanism (a central network and a connection activation system)
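The idea of one central, learned pattern stamped into blank programmable sub-networks can be sketched as follows (our own simplification of CID; the rule name and weight values are invented purely for illustration):

```python
# Hypothetical sketch of CID-style copying: all learning happens on one
# central connection pattern per rule; each parse-time instance is a
# copy programmed into a blank sub-network.

CENTRAL_PATTERNS = {
    # invented weights for the rule C -> A B instantiated as S -> NP VP
    "S -> NP VP": {("NP", "S"): 0.8, ("VP", "S"): 0.9, ("NP", "VP"): 1.0},
}

def program_blank_subnetwork(rule):
    """Copy the central pattern into a fresh (blank) sub-network."""
    return dict(CENTRAL_PATTERNS[rule])

copy1 = program_blank_subnetwork("S -> NP VP")
copy2 = program_blank_subnetwork("S -> NP VP")
copy1[("NP", "S")] = 0.0   # changing one copy at parse time...
# ...does not disturb the central, learned weights or any other copy:
print(CENTRAL_PATTERNS["S -> NP VP"][("NP", "S")], copy2[("NP", "S")])
```

Because every copy is stamped from the same central pattern, weight learning automatically applies to all future copies at once, which is the point of the scheme.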
2.3 Connecting sub-networks 
To generate a parse tree, we need a mechanism for
generating connection links dynamically.
Unfortunately the CM does not yet have such a mechanism.
Instead, we use connectors that change connections
dynamically by means of Sigma-Pi units.
There are three kinds of connectors, namely the A-, B-,
and C-connectors shown in Figure 3. We will explain
these connectors' functions in this section.
C-connector: If the C unit of a sub-network is
activated, the C-connector sends requests for
connection to the A-connectors of blank sub-networks, and to
B-connectors whose sub-network's B unit has the same
syntactic category as the sender sub-network's C
unit. More than one connection may be established by
these requests; however, they suppress each other,
and at last the connection from the most strongly
activated B unit wins. Even if a C unit is not very
strongly activated, the C-connector sends these
requests. Before a human has read a whole sentence,
or even when he/she has read only a few words, he/she
predicts a complete or fairly large part of the parse
tree of a possible sentence. This is why we adopt
this low-threshold strategy of request sending.
A-connector: When an A-connector receives a request
for connection from another sub-network's C-connector,
and the A-connector has not yet received
any other connection request, it makes a copy of a
sub-network whose A unit's syntactic category is the
same as the syntactic category of the sender
sub-network's C unit. By this copying, a parse tree
grows in a bottom-up manner.
B-connector: When a B-connector receives a request
for connection from another sub-network's C-connector,
and the B unit's syntactic category is the
same as the sender sub-network's C unit's syntactic
category, a connection between the sender's C-connector
and the receiver's B-connector is
established. If more than one connection is
established, they suppress each other. Finally the
most strongly activated connection inhibits the other
connections. Such suppressive or exclusive
connections are expressed as [ X Y ] in figures;
in this expression, the connections between X
and Y are mutually suppressive or exclusive.
The structures of the connectors described above are
shown in Figures 5, 6 and 7 respectively.
Figure 5. C-connector (unit C; request/acknowledge lines to A- or B-connectors; dashed lines are negative-weight links)
Figure 6. A-connector (request/acknowledge lines from a C-connector)
Figure 7. B-connector (request/acknowledge lines from C-connectors; trigger input from the A unit; output to unit B)
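The mutual suppression among competing connections described for the B-connector can be sketched as a simple winner-take-all iteration (our own idealization; the update rule, rate and activation values are assumptions, not the paper's equations):

```python
# Each candidate connection is suppressed in proportion to the total
# activation of its rivals; after enough iterations only the most
# strongly activated connection survives.

def winner_take_all(candidates, rate=0.5, steps=50):
    act = dict(candidates)
    for _ in range(steps):
        snapshot = act
        act = {name: max(0.0, a - rate * (sum(snapshot.values()) - a))
               for name, a in snapshot.items()}
    return {name: a for name, a in act.items() if a > 0.0}

# two near-equal candidate attachments: the stronger one wins outright
print(winner_take_all({"to_VP2": 0.7, "to_NP1": 0.6}))
```
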
2.4 Parsing on the CM parser 
To summarize the CM parser described above, we
sketch the parsing process for the sentence "I eat
apples." The phrase structure rules used in this example
are as follows: S -> N VP and VP -> V N.
Parsing process
(1) The CM parser reads "I", and a unit for category
N is activated.
(2) The C-connector of the N unit sends a request for
connection to an A-connector of a currently usable
blank sub-network.
(3) When an A-connector receives the request, it
makes a copy of the sub-network of S -> N VP. Since the N
unit of the copied sub-network is fully activated,
the trigger link from the N unit to the VP unit
becomes active.
(4) The CM parser reads "eat", and a unit for
category V is activated, and a request for connection
is sent from its C-connector to some A-connector.
(5) When an A-connector receives this request, it
makes a copy of the sub-network of VP -> V N. Not only the V
unit but also the VP unit is activated. Of course
the trigger link from the V unit to the N unit is
activated.
(6) The VP unit sends a request for connection via
its C-connector. This request is received by the B-connector
of the previously copied sub-network for
the phrase structure rule S -> N VP, because this
sub-network's B unit's category is VP, and the sender
sub-network's C unit's category is also VP and was
triggered at stage (3).
(7) The CM parser reads "apples", and a unit for
category N is activated, and a request for connection
is sent from its C-connector.
(8) This request is received by the B-connector of
the sub-network copied at (5). This activates the C
unit of this sub-network, whose category is VP. This
activation in turn activates the B unit of the sub-network
of S -> N VP. Finally its C unit, whose category is S,
becomes fully activated; namely, the sentence is
recognized and the parse tree is completed.
The resulting parse tree is shown in Figure 8. For
compactness, the A-, B- and C-connectors are
omitted in figures in the rest of the paper.
Figure 8. An example parse tree for "I eat apples" made by the CM parser
Intuitively, our CM parser is a parallel left
corner parser. More precisely, owing to the use of
trigger links, which predict the syntactic categories
of the next incoming word, our CM parser can be regarded
as a parallel left corner parser with a continuous
activation level for each generated nonterminal
symbol representing a syntactic category.
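The eight stages above can be re-enacted with a toy bottom-up routine (a drastic simplification of ours: continuous activation levels, connectors and competition are replaced by exact category matching, and the rules and lexicon are those of the example only):

```python
# Toy re-enactment of the parsing process: each word activates its
# category; a request either completes a waiting copied sub-network
# (B-connector role) or stamps a fresh copy of a matching rule
# (A-connector role); a completed rule activates its parent category.

RULES = {"S": ("N", "VP"), "VP": ("V", "N")}      # C: (A, B)
LEXICON = {"I": "N", "eat": "V", "apples": "N"}

def parse(words):
    waiting = []                      # copied sub-networks awaiting their B unit
    for w in words:
        cat = LEXICON[w]
        while True:
            # B-connector: does some copy's B unit expect this category?
            for net in waiting:
                if net["B"] == cat and not net["done"]:
                    net["done"] = True
                    cat = net["C"]    # C unit fires; propagate upward
                    break
            else:
                # A-connector: copy a rule whose A unit matches `cat`
                parents = [c for c, (a, b) in RULES.items() if a == cat]
                if not parents:
                    break
                waiting.append({"C": parents[0],
                                "B": RULES[parents[0]][1],
                                "done": False})
                break
    return waiting

nets = parse(["I", "eat", "apples"])
print([n["C"] for n in nets], all(n["done"] for n in nets))
```

Running it yields one completed copy each of S -> N VP and VP -> V N, mirroring stages (1)-(8).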
3. Control under a resource-bounded condition
It is well known that the human memory system
consists of at least two levels, namely short term
memory and long term memory. The capacity of short
term memory is limited to 7 ± 2 chunks. In the CM,
an implementation of short term memory has not yet
been clarified. But intuitively, the sum of all units'
activation levels is bounded. We implement this bound
by an almost equivalent mechanism: namely, there exist
weak suppressive connection links between every pair of
units. Owing to this limitation, even though our CM
parser is a parallel one, it is impossible to maintain
all possible candidate parse trees during parsing. Since
our parser is based on the CM, the most promising
parse tree is the most strongly activated one. Other
parse trees are suppressed by the most promising one
through the suppressive or exclusive connections
described in Section 2.3. In the rest of the paper,
we propose explanations for the control mechanisms of the
CM parser, especially concerning the parsing of deeply nested
sentences, garden path sentences, and preferences among
syntactically ambiguous sentences.
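The effect of the weak pairwise suppressive links can be put in toy numbers (our own idealization: a single constant epsilon stands in for all the weak links, and the three starting activations are arbitrary candidate-tree strengths):

```python
# Every unit is weakly suppressed by the summed activation of all the
# others, so the parser cannot keep many candidate parse trees strongly
# active at once: the most promising one survives, the rest decay.

def suppress_step(act, epsilon=0.05):
    total = sum(act)
    return [max(0.0, a - epsilon * (total - a)) for a in act]

act = [0.9, 0.5, 0.4]          # three competing candidate parse trees
for _ in range(30):
    act = suppress_step(act)
print(act)                      # only the strongest candidate stays non-zero
```
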
4. Recognition of deeply nested sentences 
Our CM parser can explain why deeply nested
sentences like "The man who the girl who the dog
chased liked laughed" are hard for us humans to
recognize. Figure 9 shows the network built just
after the CM parser reads "The man who the girl who
the dog chased". Here, since the NP3 unit is strongly
activated, the VP2/NP unit is strongly predicted, and
it is the right prediction. But since the NP1 unit
and the S unit are also activated, the VP1 unit is
also predicted. Therefore, when the CM parser reads
"liked", it is not very easy to select the VP2/NP
unit definitively. As seen in this example, when the CM
parser reads a word at a deeply nested level, there
may be more than one strongly activated and predicted
unit. If they have nearly the same activation level,
it is not easy to select the right unit. This is one
possible explanation of why it is hard for us humans
to recognize deeply nested sentences, if the CM is a
plausible model of the human mental process.
Figure 9. A parse tree (connected network) just after "The man who the girl who the dog chased"
5. Garden path sentences
If there is more than one possible syntactic
structure for the input sentence, the CM parser
builds more than one parse tree network during the
parsing process. If one of them is much more strongly
activated than the others, the parser easily selects
it as the right network. But often more than one
network is activated to almost the same level. In
that case, how to select one of them depends on many
factors, for instance contextual or semantic
information. There is a worse case, as follows.
Assume that the parser has read some words of the
sentence, there is more than one parse tree, and one
of them has a higher activation level than the others
at that time. When the parser reads the next word, if
this most active parse tree turns out to be
syntactically impossible, some weakly activated parse
tree is suddenly forced up to the highest activation
level. This forced, sudden change of activation
levels may cause us humans difficulty in recognizing
the sentence. This is an informal explanation of the
cognitive difficulty of recognizing garden path
sentences.
In order to explain which parse tree is chosen,
we have to identify which exclusive connection plays
the main role in the preference between possible parse
trees. Without loss of generality, it is sufficient
to explain how one of two parse trees is chosen. In
short, the choice point is such that the upper part
of the tree above this point is common to both trees,
and the parts of the trees below this choice point
are different. Figure 10 shows the network generated
for the garden path sentence "The cotton clothing is
made of grows in Mississippi." The wrong parse tree,
including the S2 unit, is preferred while our CM
parser reads "The cotton clothing is made of",
because in the phrase structure rule S̄ -> S/NP the
connection link from the S unit to the S̄ unit is
weak, and "clothing" is an NP. But when the CM parser
reads "grows", the wrong parse tree including the S2
unit is rejected syntactically, and the right but
weakly predicted VP1 unit must be connected to the VP
unit for "grows". Presumably humans feel cognitive
difficulty at that moment. Note that although our CM
parser has to do a lot of work to parse a garden
path sentence, namely the forced sudden change of
activation levels, it finally succeeds in parsing the
garden path sentence, just as humans do. This is a main
difference in performance between our CM parser and
Shieber's shift-reduce parser.
Figure 10. The parse tree network just after "The cotton clothing is made of grows"
6. Parsing Preference
If there is more than one possible syntactic
structure for the input sentence after the entire
sentence has been input, one of them is preferred over
the others. In order to explain such parsing preferences,
syntactic preference principles such as Right
Association, Minimal Attachment and so on have been
proposed, e.g. in /Ford 1982/. But there are
some problems with these principles. The most
important problem is which principle should be used
in parsing a given sentence /Schubert 1984/. Since
our parser is based on the CM, parsing
preferences are uniformly explained using the
activation levels of the units that are the components of
the parse trees for the given sentence. This preference
mechanism based on activation levels behaves as
the minimal attachment principle in some cases and
as the right association principle in other cases.
In this section, we will show some examples of
this.
The first example is the sentence
"John bought the book which I had selected for Mary."
If we adopt the phrase structure rule VP -> VP PP,
the parse trees of this sentence generated by
our CM parser are the ones shown in Figure 11.
Figure 11. Example parse trees of a structurally ambiguous sentence
There are two promising parse trees for this
sentence, as shown in Figure 11. If the tree including
the VP1 unit is preferred, the PP unit for "for Mary"
is strongly connected to the VP1/NP unit. If the tree
including the VP2 unit is preferred, the PP unit is
strongly connected to the VP2 unit. Now we examine
the activation levels of these two units. The VP1/NP
unit is activated directly by the V unit for "had
selected". It is also indirectly activated and
triggered by the N unit for "I". On the other hand,
the VP2 unit is only indirectly activated, by the V unit
for "bought", the NP unit for "the book which ...",
and so on. By this comparison, the VP1/NP unit is
seen to be more strongly activated than the VP2
unit. Therefore the PP unit for "for Mary" is more
strongly connected to the VP1/NP unit than to the VP2
unit, and the parse tree including the VP1 unit is
preferred. This result coincides with the right
association principle, which humans most likely use
when parsing this example sentence.
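The comparison of direct versus indirect activation can be put in toy numbers (our own illustration; the attenuation constant and the path lengths are assumptions chosen only to mirror the argument above):

```python
# A unit reached over fewer connection links receives more activation,
# so the PP "for Mary" attaches to the lower, directly activated
# VP1/NP unit: the right association outcome described in the text.

ATTENUATION = 0.5   # assumed activation loss per intervening link

def total_activation(path_lengths):
    """Sum contributions from activation sources n links away."""
    return sum(ATTENUATION ** n for n in path_lengths)

vp1_np = total_activation([1, 2])  # direct from V "had selected", indirect from N "I"
vp2 = total_activation([2, 2])     # only indirect sources ("bought", "the book ...")
print(vp1_np > vp2)                # low attachment wins
```
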
As seen in this example, many cases of
connecting to the most strongly activated unit are
explained as the right association principle. But
there are other cases in which the control mechanism
is not so clear. Consider the next example:
"John carried the groceries for Mary."
Here we use phrase structure rules in Chomsky normal
form, for instance VP -> VP PP, VP -> V NP, and so
on. The resulting parse trees are shown in Figure 12.
Notice that native speakers of English show a
definite preference for the parse tree including
VP2. Now we are required to explain a parsing control
mechanism which causes this preference. If the PP
unit for "for Mary" is connected to the NP1 unit, the
parse tree including the VP1 unit is preferred. If
the PP unit is connected to the VP2 unit, the parse
tree including the VP2 unit is preferred. The NP1
unit is activated directly by the NP unit for "the
groceries". On the other hand, the VP2 unit is
activated by both the NP unit for "the groceries" and
the V unit for "carried", but only indirectly. We cannot
determine which parse tree is preferred without
further information, for instance the weight of every
connection link. If the weight of the connection link
from the VP unit to the VP2 unit is very heavy, our
parser prefers the parse tree including the VP2 unit.
From the viewpoint of phrase structure rules, given
this connection link's heavy weight we can regard
the two phrase structure rules VP -> V NP and VP -> VP PP
as the single rule VP -> V NP PP. Using this rule in
parsing minimizes the resulting number of nodes. If
we adopt the minimal attachment principle, the parse
tree including the VP2 unit is preferred. In short,
the minimal attachment principle is explained by our
parser's performance.
Figure 12. The parse trees for "John carried ..." by VP -> VP PP and VP -> V NP etc.
As these examples show, the minimal
attachment principle and the right association
principle are integrated in our CM parser by
determining appropriate weights for the connection
links. This result is completely compatible with the CM's
principle that all information is represented as
connection link weights.
7. Conclusions
We have proposed a parser based fully on the CM. By
introducing an upper bound on the sum of the units'
activation levels into this CM parser, we can explain
why garden path sentences and deeply nested sentences
are hard to recognize. Our CM parser integrates
the minimal attachment principle and the right
association principle into the single principle that the
most strongly activated unit is selected. Future work
is to unify semantic and contextual
information into this CM parser.
Acknowledgement
We thank the members of the special interest group
on artificial intelligence called "AIUEO", and Dr.
K. Hashida at ETL, whose elegant theory encouraged us to
study this field. The research was
supported by the Grant-in-Aid for Special Project
Research of the Ministry of Education, Science and
Culture, and by the Inamori Foundation.

References 

Ford, M., Bresnan, J. & Kaplan, R. (1982), "A competence-based theory of syntactic closure", in Bresnan, J. (ed.), The Mental Representation of Grammatical Relations, MIT Press

Johnson-Laird, P.N. (1983), "Mental Models", Cambridge University Press

Marcus, M. (1980), "A Theory of Syntactic Recognition for Natural Language", MIT Press

McClelland, J.L. (1986), "The Programmable Blackboard Model of Reading", in Parallel Distributed Processing Vol. 2, The MIT Press

McClelland, J.L. & Kawamoto, A.H. (1986), "Mechanisms of Sentence Processing: Assigning Roles to Constituents", in Parallel Distributed Processing Vol. 2, The MIT Press

McClelland, J.L. & Rumelhart, D.E. et al. (1986), Parallel Distributed Processing Vols. 1-2, The MIT Press

Pereira, F.C.N. & Warren, D.H.D. (1980), "Definite Clause Grammars for Language Analysis", Artificial Intelligence 13

Reilly, R.G. (1984), "A Connectionist Model of Some Aspects of Anaphor Resolution", COLING '84, pp. 144-149

Schubert, L.K. (1984), "On Parsing Preferences", COLING '84, pp. 247-250

Shieber, S.M. (1983), "Sentence Disambiguation by a Shift-Reduce Parsing Technique", SRI International Technical Note 281

Waltz, D.L. & Pollack, J.B. (1985), "Massively Parallel Parsing: A Strongly Interactive Model of Natural Language Interpretation", Cognitive Science 9, pp. 51-74
