INCREMENTAL INTERPRETATION:
APPLICATIONS, THEORY, AND RELATIONSHIP TO DYNAMIC SEMANTICS*
David Milward & Robin Cooper
Centre for Cognitive Science, University of Edinburgh
2, Buccleuch Place, Edinburgh, EH8 9LW, Scotland, davidm@cogsci.ed.ac.uk
ABSTRACT 
Why should computers interpret language incrementally? In recent years psycholinguistic evidence for incremental interpretation has become more and more compelling, suggesting that humans perform semantic interpretation before constituent boundaries, possibly word by word. However, possible computational applications have received less attention. In this paper we consider various potential applications, in particular graphical interaction and dialogue. We then review the theoretical and computational tools available for mapping from fragments of sentences to fully scoped semantic representations. Finally, we tease apart the relationship between dynamic semantics and incremental interpretation.
APPLICATIONS 
Following the work of, for example, Marslen-Wilson (1973), Just and Carpenter (1980) and Altmann and Steedman (1988), it has become widely accepted that semantic interpretation in human sentence processing can occur before sentence boundaries and even before clausal boundaries. It is less widely accepted that there is a need for incremental interpretation in computational applications.
In the 1970s and early 1980s several computational implementations motivated the use of incremental interpretation as a way of dealing with structural and lexical ambiguity (a survey is given in Haddock 1989). A sentence such as the following has 4862 different syntactic parses due solely to attachment ambiguity (Stabler 1991).
1) I put the bouquet of flowers that you gave me for Mothers' Day in the vase that you gave me for my birthday on the chest of drawers that you gave me for Armistice Day.
Although some of the parses can be ruled out using structural preferences during parsing (such as Late Closure or Minimal Attachment (Frazier 1979)), extraction of the correct set of plausible readings requires use of real world knowledge. Incremental interpretation allows on-line semantic filtering, i.e. parses of initial fragments which have an implausible or anomalous interpretation are rejected, thereby preventing ambiguities from multiplying as the parse proceeds.

*This research was supported by the UK Science and Engineering Research Council, Research Grant RR30718.
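The filtering regime just described can be sketched in a few lines. The parser step and the plausibility judge below are invented stand-ins for illustration, not the systems surveyed by Haddock (1989): the point is only that implausible partial analyses are discarded as each word is absorbed, rather than at the sentence's end.

```python
# Sketch of on-line semantic filtering during an incremental parse.
# The "grammar", partial analyses, and plausibility test are toy
# stand-ins: analyses are word lists, interpretations are strings.

def plausible(interpretation):
    """Toy plausibility judge: reject interpretations marked anomalous."""
    return "anomalous" not in interpretation

def extend_analyses(analyses, word):
    """Hypothetical parser step: extend each partial analysis with the
    next word (a real parser could return several extensions each)."""
    return [a + [word] for a in analyses]

def interpret(analysis):
    """Hypothetical mapping from a partial analysis to an interpretation."""
    return " ".join(analysis)

def parse_with_filtering(words):
    analyses = [[]]
    for word in words:
        candidates = extend_analyses(analyses, word)
        # On-line filtering: drop implausible partial parses immediately,
        # so ambiguities cannot multiply as the parse proceeds.
        analyses = [a for a in candidates if plausible(interpret(a))]
    return analyses
```

An implausible fragment is pruned as soon as it arises, so none of its continuations are ever built.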
However, on-line semantic filtering for sentence processing does have drawbacks. Firstly, for sentence processing using a serial architecture (rather than one in which syntactic and semantic processing is performed in parallel), the savings in computation obtained from on-line filtering have to be balanced against the additional costs of performing semantic computations for parses of fragments which would eventually be ruled out anyway from purely syntactic considerations.
Moreover, there are now relatively sophisticated ways of packing ambiguities during parsing (e.g. by the use of graph-structured stacks and packed parse forests (Tomita 1985)). Secondly, the task of judging plausibility or anomaly according to context and real world knowledge is a difficult problem, except in some very limited domains. In contrast, statistical techniques using lexeme co-occurrence provide a relatively simple mechanism which can imitate semantic filtering in many cases. For example, instead of judging bank as a financial institution as more plausible than bank as a riverbank in the noun phrase the rich bank, we can compare the number of co-occurrences of the lexemes rich and bank1 (= riverbank) versus rich and bank2 (= financial institution) in a semantically analysed corpus. Cases where statistical techniques seem less appropriate are where plausibility is affected by local context. For example, consider the ambiguous sentence, The decorators painted a wall with cracks in the two contexts The room was supposed to look run-down vs. The clients couldn't afford wallpaper. Such cases involve reasoning with an interpretation in its immediate context, as opposed to purely judging the likelihood of a particular linguistic expression in a given application domain (see e.g. Cooper 1993 for discussion).
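The co-occurrence comparison for the rich bank can be sketched as follows. The tiny "semantically analysed corpus" and its counts are invented purely for illustration; a real corpus would supply the frequencies.

```python
# Sketch of lexeme co-occurrence disambiguation for "the rich bank".
# The corpus of (modifier, sense-tagged noun) pairs is invented:
# bank1 = riverbank, bank2 = financial institution.

corpus = [
    ("rich", "bank2"), ("rich", "bank2"), ("rich", "bank2"),
    ("rich", "bank1"),
    ("muddy", "bank1"), ("muddy", "bank1"),
]

def cooccurrences(lexeme_a, lexeme_b):
    """Count co-occurrences of the two lexemes in the analysed corpus."""
    return sum(1 for pair in corpus if pair == (lexeme_a, lexeme_b))

def preferred_sense(modifier, senses):
    """Pick the sense with the most co-occurrences with the modifier."""
    return max(senses, key=lambda s: cooccurrences(modifier, s))
```

On these counts, preferred_sense("rich", ["bank1", "bank2"]) selects the financial-institution sense, imitating a semantic filter without any real world reasoning.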
Although the usefulness of on-line semantic filtering during the processing of complete sentences is debatable, filtering has a more plausible role to play in interactive, real-time environments, such as interactive spell checkers (see e.g. Wirén (1990) for arguments for incremental parsing in such environments). Here the choice is between whether or not to have semantic filtering at all, rather than whether to do it on-line or at the end of the sentence.
The concentration in early literature on using incremental interpretation for semantic filtering has perhaps distracted from some other, less controversial applications. We will consider two in detail here: graphical interfaces, and dialogue.
The Foundations for Intelligent Graphics Project (FIG)¹ considered various ways in which natural language input could be used within computer aided design systems (the particular application studied was computer aided kitchen design, where users would not necessarily be professional designers). Incremental interpretation was considered to be useful in enabling immediate visual feedback. Visual feedback could be used to provide confirmation (for example, by highlighting an object referred to by a successful definite description), or it could be used to give the user an improved chance of achieving successful reference. For example, if sets of possible referents for a definite noun phrase are highlighted during word by word processing, then the user knows how much, or how little, information is required for successful reference.²
Human dialogue, in particular task oriented dialogue, is characterised by a large number of self-repairs (Levelt 1983, Carletta et al. 1993), such as hesitations, insertions, and replacements. It is also common to find interruptions requesting extra clarification, or disagreements before the end of a sentence. It is even possible for sentences started by one dialogue participant to be finished by another. Applications involving the understanding of dialogues include information extraction from conversational databases, or computer monitoring of conversations. It also may be useful to include some features of human dialogue in man-machine dialogue. For example, interruptions can be used for early signalling of errors and ambiguities.

Let us first consider some examples of self-repair. Insertions add extra information, usually modifiers e.g.

2) We start in the middle with ..., in the middle of the paper with a blue disc (Levelt 1983:ex.3)

Replacements correct pieces of information e.g.

3) Go from left again to uh ..., from pink again to blue (Levelt 1983:ex.2)

In some cases information from the corrected material is incorporated into the final message. For example, consider³:

4) a The three main sources of data come, uh ..., they can be found in the references

   b John noticed that the old man and his wife, uh
¹Joint Councils Initiative in Cognitive Science/HCI, Grant 8826213, EdCAAD and Centre for Cognitive Science, University of Edinburgh.

²This example was inspired by the work of Haddock (1987) on incremental interpretation of definite noun phrases. Haddock used an incremental constraint based approach following Mellish (1985) to provide an explanation of why it is possible to use the noun phrase the rabbit in the hat even when there are two hats, but only one hat with a rabbit in it.

³Example (a) is reconstructed from an actual utterance. Examples (b) and (c) were constructed.
     ..., that the man got into the car and the wife was with him when they left the house

   c Every boy took, uh ..., he should have taken a water bottle with him

In (a), the corrected material the three main sources of data come, provides the antecedent for the pronoun they. In (b) the corrected material tells us that the man is both old and has a wife. In (c), the pronoun he is bound by the quantifier every boy.

For a system to understand dialogues involving self repairs such as those in (4) would seem to require either an ability to interpret incrementally, or the use of a grammar which includes self repair as a syntactic construction akin to non-constituent coordination (the relationship between coordination and self-correction is noted by Levelt (1983)). For a system to generate self repairs might also require incremental interpretation, assuming a process where the system performs on-line monitoring of its output (akin to Levelt's model of the human self-repair mechanism). It has been suggested that generation of self repairs is useful in cases where there are severe time constraints, or where there is rapidly changing background information (Carletta, p.c.).

A more compelling argument for incremental interpretation is provided by considering dialogues involving interruptions. Consider the following dialogue from the TRAINS corpus (Gross et al. 1993):

5) A: so we should move the engine at Avon, engine E, to ...
   B: engine E1
   A: E1
   B: okay
   A: engine E1, to Bath ...

This requires interpretation by speaker B before the end of A's sentence to allow objection to the apposition, the engine at Avon, engine E. An example of the potential use of interruptions in human computer interaction is the following:

6) User: Put the punch onto ...
   Computer: The punch can't be moved. It's bolted to the floor.

In this example, interpretation must not only be before the end of the sentence, but before a constituent boundary (the verb phrase in the user's command has not yet been completed).
CURRENT TOOLS

1. Syntax to Semantic Representation

In this section we shall briefly review work on providing semantic representations (e.g. lambda expressions) word by word. Traditional layered models of sentence processing first build a full syntax tree for a sentence, and then extract a semantic representation from this. To adapt this to an incremental perspective, we need to be able to provide syntactic structures (of some sort) for fragments of sentences, and be able to extract semantic representations from these.
One possibility, which has been explored mainly within the Categorial Grammar tradition (e.g. Steedman 1988), is to provide a grammar which can treat most if not all initial fragments as constituents. They then have full syntax trees from which the semantics can be calculated.

However, an alternative possibility is to directly link the partial syntax trees which can be formed for non-constituents with functional semantic representations. For example, a fragment missing a noun phrase such as John likes can be associated with a semantics which is a function from entities to truth values.
Hence, the partial syntax tree given in Fig. 1⁴,

        s
       / \
     np   vp
   John  / \
        v   np↓
     likes

Fig. 1

can be associated with a semantic representation, λx.likes(john,x).
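The pairing of a partial tree with a functional semantics can be sketched directly: the missing np becomes a lambda-bound argument, and supplying the object later is just function application. The tuple encoding of semantic terms below is an assumption for illustration.

```python
# Sketch: the semantics of the fragment "John likes" as a function from
# entities to (toy, tuple-encoded) propositions: lambda x. likes(john, x).

fragment = lambda x: ("likes", "john", x)

# When the object noun phrase arrives, function application completes
# the proposition.
completed = fragment("mary")
```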
Both Categorial approaches to incremental interpretation and approaches which use partial syntax trees get into difficulty in cases of left recursion. Consider the sentence fragment, Mary thinks John. A possible partial syntax tree is provided by Fig. 2.

        s
       / \
     np   vp
   Mary  / \
        v   s
  thinks   / \
         np   vp↓
       John

Fig. 2
However, this is not the only possible partial tree. In fact there are infinitely many different trees possible. The completed sentence may have an arbitrarily large number of intermediate nodes between the lower s node and the lower np. For example, John could be embedded within a gerund e.g. Mary thinks John leaving here was a mistake, and this in turn could be embedded e.g. Mary thinks John leaving here being a mistake is surprising. John could also be embedded within a sentence which has a sentence modifier requiring its own s node e.g. Mary thinks John will go home probably⁵, and this can be further embedded
⁴The downarrow notation for missing constituents is adopted from Synchronous Tree Adjoining Grammar (Shieber & Schabes 1990).

⁵The treatment of probably as a modifier of a sentence is perhaps controversial. However, treatment of it as a verb phrase modifier would merely shift the potential left recursion to the verb phrase node.
e.g. Mary thinks John will go home probably because he is tired.
The problem of there being an arbitrary number of different partial trees for a particular fragment is reflected in most current approaches to incremental interpretation being either incomplete, or not fully word by word. For example, incomplete parsers have been proposed by Stabler (1991) and Moortgat (1988). Stabler's system is a simple top-down parser which does not deal with left recursive grammars. Moortgat's M-System is based on the Lambek Calculus: the problem of an infinite number of possible tree fragments is replaced by a corresponding problem of initial fragments having an infinite number of possible types. A complete incremental parser, which is not fully word by word, was proposed by Pulman (1986). This is based on arc-eager left-corner parsing (see e.g. Resnik 1992).
To enable complete, fully word by word parsing requires a way of encoding an infinite number of partial trees. There are several possibilities. The first is to use a language describing trees where we can express the fact that John is dominated by the s node, but do not have to specify what it is immediately dominated by (e.g. D-Theory, Marcus et al. 1983). Semantic representations could be formed word by word by extracting 'default' syntax trees (by strengthening dominance links into immediate dominance links wherever possible).

A second possibility is to factor out recursive structures from a grammar. Thompson et al. (1991) show how this can be done for a phrase structure grammar (creating an equivalent Tree Adjoining Grammar (Joshi 1987)). The parser for the resulting grammar allows linear parsing for an (infinitely) parallel system, with the absorption of each word performed in constant time. At each choice point, there are only a finite number of possible new partial TAG trees (the TAG trees represent the possibly infinite number of trees which can be formed using adjunction). It should again be possible to extract 'default' semantic values, by taking the semantics from the TAG tree (i.e. by assuming that there are to be no adjunctions). A somewhat similar system has recently been proposed by Shieber and Johnson (1993).
The third possibility is suggested by considering the semantic representations which are appropriate during a word by word parse. Although there are any number of different partial trees for the fragment Mary thinks John, the semantics of the fragment can be represented using just two lambda expressions⁶:

λP. thinks(mary,P(john))
λP. λQ. Q(thinks(mary,P(john)))
Consider the first. The lambda abstraction (over a
⁶Two representations are appropriate if there are no VP-modifiers, as in dependency grammar. If VP-modification is allowed, two more expressions are required:
λP. λR. (R(λx.thinks(mary,x)))(P(john)) and
λP. λR. λQ. Q((R(λx.thinks(mary,x)))(P(john))).
functional item of type e→t) can be thought of as a way of encoding an infinite set of partial semantic (tree) structures. For example, the eventual semantic structure may embed john at any depth e.g.

thinks(mary,sleeps(john))
thinks(mary,possibly(sleeps(john)))
etc.

The second expression (a functional item over type e→t and t→t), allows for eventual structures where the main sentence is embedded e.g.

possibly(thinks(mary,sleeps(john)))

This third possibility is therefore to provide a syntactic correlate of lambda expressions. In practice, however, provided we are only interested in mapping from a string of words to a semantic representation, and don't need explicit syntax trees to be constructed, we can merely use the types of the 'syntactic lambda expressions', rather than the expressions themselves. This is essentially the approach taken in Milward (1992) in order to provide complete, word by word, incremental interpretation using simple lexicalised grammars, such as a lexicalised version of formal dependency grammar and simple categorial grammar⁷.
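How a single lambda expression packs infinitely many partial trees can be illustrated by treating the abstracted variable as a continuation: any function embedding john at any depth can be supplied later. The tuple encoding of terms is again an illustrative assumption, not the paper's implementation.

```python
# Sketch: lambda P. thinks(mary, P(john)) packs the infinite set of
# partial semantic structures for "Mary thinks John". Terms are tuples.

frag = lambda P: ("thinks", "mary", P("john"))

# Continuation embedding john at depth one: "... sleeps".
shallow = frag(lambda x: ("sleeps", x))

# Continuation embedding john more deeply, as in
# "... leaving here was a mistake".
deep = frag(lambda x: ("mistake", ("leaving", x)))

# The second expression, lambda P. lambda Q. Q(thinks(mary, P(john))),
# further allows the whole sentence to be embedded, e.g. under "possibly".
frag2 = lambda P: lambda Q: Q(("thinks", "mary", P("john")))
wide = frag2(lambda x: ("sleeps", x))(lambda s: ("possibly", s))
```

One fragment representation thus covers every depth of later embedding, which is exactly what the infinitely many partial trees could not do finitely.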
2. Logical Forms to Semantic Filtering

In processing the sentence Mary introduced John to Susan, a word-by-word approach such as Milward (1992) provides the following logical forms after the corresponding sentence fragments are absorbed:

Mary                          λP.P(mary)
Mary introduced               λx.λy.intr(mary,x,y)
Mary introduced John          λy.intr(mary,john,y)
Mary introduced John to       λy.intr(mary,john,y)
Mary introduced John to Sue   intr(mary,john,sue)

Each input level representation is appropriate for the meaning of an incomplete sentence, being either a proposition or a function into a proposition.
In Chater et al. (1990) it is argued that the incrementally derived meanings are not judged for plausibility directly, but instead are first turned into existentially quantified propositions. For example, instead of judging the plausibility of λx.λy.intr(mary,x,y), we judge the plausibility of ∃(x,T,∃(y,T,intr(mary,x,y)))⁸. This is just the proposition Mary introduced something to something using a generalized quantifier notation of the form Quantifier(Variable,Restrictor,Body).
Although the lambda expressions are built up monotonically, word by word, the propositions formed
⁷The version of categorial grammar used is AB Categorial Grammar with Associativity.

⁸The proposition T is always true. See Chater et al. (1994) for discussion of whether it is more appropriate to use a non-trivial restrictor.
from them may need to be retracted, along with all the resulting inferences. For example, Mary introduced something to something is inappropriate if the final sentence is Mary introduced noone to anybody. A rough algorithm is as follows:
1. Parse a new word, Wordᵢ.
2. Form a new lambda expression by combining the lambda expression formed after parsing Wordᵢ₋₁ with the lexical semantics for Wordᵢ.
3. Form a proposition, Pᵢ, by existentially quantifying over the lambda abstracted variables.
4. Assert Pᵢ. If Pᵢ does not entail Pᵢ₋₁, retract Pᵢ₋₁ and all conclusions made from it⁹.
5. Judge the plausibility of Pᵢ. If implausible, block this derivation.
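A minimal sketch of steps 3–5 follows, under the simplifying assumption that each existentially quantified proposition is represented as a set of ground conjuncts and entailment is reduced to the subset test; a real system would call a theorem prover, and would also retract the conclusions drawn from a retracted proposition.

```python
# Toy assert/entail/retract loop for the algorithm above.
# Propositions are sets of conjuncts; P entails Q iff Q's conjuncts
# are a subset of P's. This is an illustrative stand-in only.

def entails(p, q):
    return q <= p

def incremental_filter(propositions, plausible):
    """propositions: the proposition P_i formed after each word (step 3).
    Returns the asserted propositions, or None if a derivation is
    blocked as implausible (step 5)."""
    asserted = []
    for p in propositions:
        # Step 4: if P_i does not entail P_(i-1), retract P_(i-1)
        # (a fuller system would also retract its conclusions).
        if asserted and not entails(p, asserted[-1]):
            asserted.pop()
        # Step 5: implausible propositions block the derivation.
        if not plausible(p):
            return None
        asserted.append(p)
    return asserted
```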
It is worth noting that the need for retraction is not due to a failure to extract the correct 'least commitment' proposition from the semantic content of the fragment Mary introduced. It is due to the fact that it is possible to find pairs of possible continuations which are the negation of each other (e.g. Mary introduced noone to anybody and Mary introduced someone to somebody). The only proposition compatible with both a proposition, p, and its negation, ¬p, is the trivial proposition, T (see Chater et al. for further discussion).
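The tagged-database style of retraction mentioned in the footnote can be sketched as follows; the propositions and source labels u4, u5 follow the footnote's example, and the string encoding of propositions is an illustrative assumption.

```python
# Sketch of retraction with a tagged database: each proposition is
# paired with the set of sources its derivation used, so retracting a
# source retracts everything derived from it.

database = {
    "P->Q": {"u4"},
    "P": {"u5"},
}

def derive():
    """Modus ponens with union of source tags:
    from (P->Q,{u4}) and (P,{u5}) deduce (Q,{u4,u5})."""
    if "P->Q" in database and "P" in database:
        database["Q"] = database["P->Q"] | database["P"]

def retract(source):
    """Remove every proposition whose derivation used the source."""
    for prop in [p for p, tags in database.items() if source in tags]:
        del database[prop]

derive()                # adds (Q,{u4,u5})
retract("u5")           # retracting P also removes Q; P->Q survives
```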
3. Incremental Quantifier Scoping

So far we have only considered semantic representations which do not involve quantifiers (except for the existential quantifier introduced by the mechanism above).

In sentences with two or more quantifiers, there is generally an ambiguity concerning which quantifier has wider scope. For example, in sentence (a) below the preferred reading is for the same kid to have climbed every tree (i.e. the universal quantifier is within the scope of the existential) whereas in sentence (b) the preferred reading is where the universal quantifier has scope over the existential.

7) a A tireless kid climbed every tree.
   b There was a fish on every plate.
Scope preferences sometimes seem to be established before the end of a sentence. For example, in sentence (a) below, there seems a preference for an outer scope reading for the first quantifier as soon as we interpret child. In (b) the preference, by the time we get to e.g. grammar, is for an inner scope reading for the first quantifier.

8) a A teacher gave every child a great deal of homework on grammar.
⁹Retraction can be performed by using a tagged database, where each proposition is paired with a set of sources e.g. given (P→Q,{u4}), and (P,{u5}), then (Q,{u4,u5}) can be deduced.
   b Every girl in the class showed a rather strict new teacher the results of her attempt to get the grammar exercises correct.
This intuitive evidence can be backed up by considering garden path effects with quantifier scope ambiguities (called jungle paths by Barwise 1987). The original examples, such as the following,

9) Statistics show that every 11 seconds a man is mugged here in New York city. We are here today to interview him

showed that preferences for a particular scope are established and are overturned. To show that preferences are sometimes established before the end of a sentence, and before a potential sentence end, we need to show garden path effects in examples such as the following:

10) Mary put the information that statistics show that every 11 seconds a man is mugged here in New York city and that she was to interview him in her diary
Most psycholinguistic experimentation has been concerned with which scope preferences are made, rather than the point at which the preferences are established (see e.g. Kurtzman and MacDonald, 1993). Given the intuitive evidence, our hypothesis is that scope preferences can sometimes be established early, before the end of a sentence. This leaves open the possibility that in other cases, where the scoping information is not particularly of interest to the hearer, preferences are determined late, if at all.
3.1 Incremental Quantifier Scoping: Implementation

Dealing with quantifiers incrementally is a rather similar problem to dealing with fragments of trees incrementally. Just as it is impossible to predict the level of embedding of a noun phrase such as John from the fragment Mary thinks John, it is also impossible to predict the scope of a quantifier in a fragment with respect to the arbitrarily large number of quantifiers which might appear later in the sentence. Again the problem can be avoided by a form of packing. A particularly simple way of doing this is to use unscoped logical forms where quantifiers are left in situ (similar to the representations used by Hobbs and Shieber (1987), or to Quasi Logical Form (Alshawi 1990)). For example, the fragment Every man gives a book can be given the following representation:
11) λz.gives(<∀,x,man(x)>,<∃,y,book(y)>,z)

Each quantified term consists of a quantifier, a variable and a restrictor, but no body. To convert lambda expressions to unscoped propositions, we replace an occurrence of each argument with an empty existential quantifier term. In this case we obtain:

12) gives(<∀,x,man(x)>,<∃,y,book(y)>,<∃,z,T>)

Scoped propositions can then be obtained by using an outside-in quantifier scoping algorithm (Lewin, 1990), or an inside-out algorithm with a free variable constraint (Hobbs and Shieber, 1987). The propositions formed can then be judged for plausibility.
To imitate jungle path phenomena, these plausibility judgements need to feed back into the scoping procedure for the next fragment. For example, if every man is taken to be scoped outside a book after processing the fragment Every man gave a book, then this preference should be preserved when determining the scope for the full sentence Every man gave a book to a child. Thus instead of doing all quantifier scoping at the end of the sentence, each new quantifier is scoped relative to the existing quantifiers (and operators such as negation, intensional verbs etc.). A preliminary implementation achieves this by annotating the semantic representations with node names, and recording which quantifiers are 'discharged' at which nodes, and in which order.
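The conversion from (11) to (12) can be sketched as follows, with in-situ quantified terms encoded as (quantifier, variable, restrictor) triples and restrictors as strings; the encoding is an assumption for illustration, not the implementation's actual representation.

```python
# Sketch of closing off a lambda-abstracted argument with an empty
# existential quantifier term, turning (11) into the unscoped
# proposition (12). Terms are tuples; restrictors are strings.

def unscoped_proposition(lambda_body, var):
    """Replace the abstracted variable with an empty existential term
    (quantifier 'exists', the variable, trivial restrictor T)."""
    empty = ("exists", var, "T")
    return tuple(empty if arg == var else arg for arg in lambda_body)

# (11): lambda z. gives(<forall,x,man(x)>, <exists,y,book(y)>, z)
body = ("gives", ("forall", "x", "man(x)"), ("exists", "y", "book(y)"), "z")
prop = unscoped_proposition(body, "z")
```

A scoping algorithm can then discharge the packed quantifier terms in whatever order plausibility judgements prefer.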
DYNAMIC SEMANTICS 
l)ynamic semantics adopts the view that "the mea-- 
ning of a sentence does not lie ill its truth conditi- 
ons, but rather in the way ill which it changes (tile 
representation of) the in\[brmation of the intcrl)reter" 
(Groencndijk and Stokho\[', \] 991). At first glance such 
a. view seems ideally suited t.o incremental interpreta- 
tion. Indeed, Groenendijk and Stokhof claim that the 
compositional nature o\[' l)ynamic Predicate Logic en- 
ables one to "interpret a text ir~ an on-line ntauner, 
i.e., incrementally, processing a.nd interpreting each 
basic unit as it comes along, in the context created 
by the interpretation of the t.ext so fa.r'. 
Putting these two quotes together is, however, misleading, since it suggests a more direct mapping between incremental semantics and dynamic semantics than is actually possible. In an incremental semantics, we would expect the information state of an interpreter to be updated word by word. In contrast, in dynamic semantics, the order in which states are updated is determined by semantic structure, not by left-to-right order (see e.g. Lewin, 1992 for discussion). For example, in Dynamic Predicate Logic (Groenendijk & Stokhof, 1991), states are threaded from the antecedent of a conditional into the consequent, and from a restrictor of a quantifier into the body. Thus, in interpreting,

13) John will buy it right away, if a car impresses him

the input state for evaluation of John will buy it right away is the output state from the antecedent a car impresses him. In this case the threading through semantic structure is in the opposite order to the order in which the two clauses appear in the sentence.
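The threading order can be illustrated with a toy state-passing evaluator. Representing states as variable assignments and updates as dictionary extensions is a simplifying assumption, not DPL proper; the point is only that the consequent is evaluated in the antecedent's output state, whatever the surface order.

```python
# Toy illustration of semantic-structure threading for a conditional:
# the antecedent's output state is the consequent's input state, even
# when the consequent comes first in the sentence, as in (13).
# States are variable assignments; updates just extend them.

def update(state, contribution):
    """Hypothetical dynamic update: a clause may introduce a discourse
    referent (e.g. "a car" introduces x) visible to later updates."""
    new = dict(state)
    new.update(contribution)
    return new

# "John will buy it right away, if a car impresses him":
# evaluation threads antecedent -> consequent, not left -> right.
initial = {}
after_antecedent = update(initial, {"x": "a car"})
# "it" in the consequent can now be resolved to the referent x.
after_consequent = update(after_antecedent, {"it": "x"})
```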
Some intuitive justification for the direction of threading in dynamic semantics is provided by considering appropriate orders for evaluation of propositions against a database: the natural order in which to evaluate a conditional is first to add the antecedent, and then see if the consequent can be proved. It is only at the sentence level in simple narrative texts that the presentation order and the natural order of evaluation necessarily coincide.
The ordering of anaphors and their antecedents is often used informally to justify left-to-right threading or threading through semantic structure. However, threading from left-to-right disallows examples of optional cataphora, as in example (13), and examples of compulsory cataphora, as in:

14) Beside her, every girl could see a large crack

Similarly, threading from the antecedents of conditionals into the consequent fails for examples such as:

15) Every boy will be able to see out of a window if he wants to

It is also possible to get sentences with 'donkey' readings, but where the indefinite is in the consequent:

16) A student will attend the conference if we can get together enough money for her air fare

This sentence seems to get a reading where we are not talking about a particular student (an outer existential), or about a typical student (a generic reading). Moreover, as noted by Zeevat (1990), the use of any kind of ordered threading will tend to fail for Bach-Peters sentences, such as:

17) Every man who loves her appreciates a woman who lives with him

For this kind of example, it is still possible to use a standard dynamic semantics, but only if there is some prior level of reference resolution which reorders the antecedents and anaphors appropriately. For example, if (17) is converted into the 'donkey' sentence:

18) Every man who loves a woman who lives with him appreciates her
When we consider threading of possible worlds, as in Update Semantics (Veltman 1990), the need to distinguish between the order of evaluation and the order of presentation becomes more clear cut. Consider trying to perform threading in left-to-right order during interpretation of the sentence, John left if Mary left. After processing the proposition John left, the set of worlds is refined down to those worlds in which John left. Now consider processing if Mary left. Here we want to reintroduce some worlds, those in which neither Mary nor John left. However, this is not allowed by Update Semantics, which is eliminative: each new piece of information can only further refine the set of worlds.
It is worth noting that the difficulties in trying to combine eliminative semantics with left-to-right threading apply to constraint based semantics as well as to Update Semantics. Haddock (1987) uses incremental refinement of sets of possible referents. For example, the effect of processing the rabbit in the noun phrase the rabbit in the hat is to provide a set of all rabbits. The processing of in refines this set to rabbits which are in something. Finally, processing of the hat refines the set to rabbits which are in a hat. However, now consider processing the rabbit in none of the boxes. By the time the rabbit in has been processed, the only rabbits remaining in consideration are rabbits which are in something. This incorrectly rules out the possibility of the noun phrase referring to a rabbit which is in nothing at all. The case is actually a parallel to the earlier example of Mary introduced someone to something being inappropriate if the final sentence is Mary introduced noone to anybody.

Although this discussion has argued that it is not possible to thread the states which are used by a dynamic or eliminative semantics from left to right, word by word, this should not be taken as an argument against the use of such a semantics in incremental interpretation. What is required is a slightly more indirect approach. In the present implementation, semantic structures (akin to logical forms) are built word by word, and each structure is then evaluated independently using a dynamic semantics (with threading performed according to the structure of the logical form).
IMPLEMENTATION
At present there is a limited implementation, which
performs a mapping from sentence fragments to fully
scoped logical representations. To illustrate its operation, consider the following discourse:
19) London has a tower. Every parent shows it ...
We assume that the first sentence has been processed,
and concentrate on processing the fragment. The implementation consists of five modules:
1. A word-by-word incremental parser for a lexicalised version of a dependency grammar (Milward, 1992).
This takes fragments of sentences and maps them to
unscoped logical forms.
INPUT: Every parent shows it
OUTPUT:
λz.show(<∀,x,parent(x)>,<pronoun,y>,z)
2. A module which replaces lambda abstracted variables with existential quantifiers in situ.
INPUT: Output from 1.
OUTPUT: show(<∀,x,parent(x)>,<pronoun,y>,<∃,z,T>)
3. A pronoun coindexing procedure which replaces
pronoun variables with a variable from the same sentence, or from the preceding context.
INPUT: Output(s) from 2 and a list of variables available from the context.
OUTPUT: show(<∀,x,parent(x)>,w,<∃,z,T>)
4. An outside-in quantifier scoping algorithm based
on Lewin (1990).
INPUT: Output from 3.
OUTPUT1: ∀(x,parent(x),∃(z,T,show(x,w,z)))
OUTPUT2: ∃(z,T,∀(x,parent(x),show(x,w,z)))
5. An 'evaluation' procedure based on Lewin (1992),
which takes a logical form containing free variables
(such as the w in the LF above), and evaluates it using
a dynamic semantics in the context given by the preceding sentences. The output is a new logical form
representing the context as a whole, with all variables
correctly bound.
INPUT: Output(s) from 4, and the context,
∃(w,T,tower(w) & has(london,w))
OUTPUT1: ∃(w,T,tower(w) & has(london,w) &
∀(x,parent(x),∃(z,T,show(x,w,z))))
OUTPUT2: ∃(w,T,∃(z,T,tower(w) & has(london,w)
& ∀(x,parent(x),show(x,w,z))))
At present, the coverage of module 5 is limited, and
module 3 is a naive coindexing procedure which allows a pronoun to be coindexed with any quantified
variable or proper noun in the context or the current
sentence.
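As a rough illustration of how modules 2 and 3 might operate, the following sketch performs in-situ existential closure and naive pronoun coindexing over a tuple-encoded logical form (a simplified reconstruction; the term encoding and function names are invented for this example, not taken from the actual implementation):

```python
# Simplified reconstruction of modules 2 and 3 over tuple-encoded logical
# forms. Encoding (invented for this sketch): ("lam", var, body) for lambda
# abstraction, ("q", quant, var, restrictor) for an in-situ quantifier term,
# ("pronoun", var) for an unresolved pronoun.

def substitute(lf, var, term):
    """Replace every occurrence of variable `var` in `lf` by `term`."""
    if lf == var:
        return term
    if isinstance(lf, tuple):
        return tuple(substitute(part, var, term) for part in lf)
    return lf

def close_lambdas(lf):
    """Module 2: replace a lambda-abstracted variable with an in-situ
    existential quantifier term <exists, var, T>."""
    if isinstance(lf, tuple) and lf[0] == "lam":
        _, var, body = lf
        return substitute(close_lambdas(body), var, ("q", "exists", var, "T"))
    return lf

def replace_pronouns(lf, var):
    if isinstance(lf, tuple) and lf[0] == "pronoun":
        return var
    if isinstance(lf, tuple):
        return tuple(replace_pronouns(part, var) for part in lf)
    return lf

def coindex(lf, available_vars):
    """Module 3 (naive): one reading per variable the pronoun could pick up."""
    return [replace_pronouns(lf, v) for v in available_vars]

# "Every parent shows it" after module 1 (unscoped, lambda-abstracted z):
lf = ("lam", "z", ("show", ("q", "forall", "x", ("parent", "x")),
                   ("pronoun", "y"), "z"))
closed = close_lambdas(lf)               # module 2
readings = coindex(closed, ["w"])        # module 3; w from "London has a tower"
```

Returning one reading per candidate variable mirrors the naive behaviour noted above, where a pronoun may be coindexed with any variable available from the context or the current sentence.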
CONCLUSIONS 
The paper described some potential applications of incremental interpretation. It then described the series
of steps required in mapping from initial fragments
of sentences to propositions which can be judged for
plausibility. Finally, it argued that the apparently
close relationship between the states used in incremental semantics and dynamic semantics fails to hold
below the sentence level, and briefly presented a more
indirect way of using dynamic semantics in incremental interpretation.
REFERENCES
Alshawi, H. (1990). Resolving Quasi Logical Forms. Computational Linguistics, 16, p.133-144.
Altmann, G.T.M. and M.J. Steedman (1988). Interaction with Context during Human Speech Comprehension. Cognition, 30, p.191-238.
Barwise, J. (1987). Noun Phrases, Generalized Quantifiers and Anaphora. In P. Gärdenfors, Ed., Generalized Quantifiers, p.1-29, Dordrecht: Reidel.
Carletta, J., R. Caley and S. Isard (1993). A Collection of Self-repairs from the Map Task Corpus. Research Report, HCRC/TR-47, University of Edinburgh.
Chater, N., M.J. Pickering and D.R. Milward (1994). What is Incremental Interpretation? ms. To appear in Edinburgh Working Papers in Cognitive Science.
Cooper, R. (1993). A Note on the Relationship between Linguistic Theory and Linguistic Engineering. Research Report, HCRC/RP-42, University of Edinburgh.
Frazier, L. (1979). On Comprehending Sentences: Syntactic Parsing Strategies. Ph.D. Thesis, University of Connecticut. Published by Indiana University Linguistics Club.
Groenendijk, J. and M. Stokhof (1991). Dynamic Predicate Logic. Linguistics and Philosophy, 14, p.39-100.
Gross, D., J. Allen and D. Traum (1993). The TRAINS 91 Dialogues. TRAINS Technical Note 92-1, Computer Science Dept., University of Rochester.
Haddock, N.J. (1987). Incremental Semantic Interpretation and Incremental Syntactic Analysis. Ph.D. Thesis, University of Edinburgh.
Haddock, N.J. (1989). Computational Models of Incremental Semantic Interpretation. Language and Cognitive Processes, 4(3/4), Special issue, p.337-368.
Hobbs, J.R. and S.M. Shieber (1987). An Algorithm for Generating Quantifier Scopings. Computational Linguistics, 13, p.47-63.
Joshi, A.K. (1987). An Introduction to Tree Adjoining Grammars. In Manaster-Ramer, Ed., Mathematics of Language, Amsterdam: John Benjamins.
Just, M. and P. Carpenter (1980). A Theory of Reading: from Eye Fixations to Comprehension. Psychological Review, 87, p.329-354.
Kurtzman, H.S. and M.C. MacDonald (1993). Resolution of Quantifier Scope Ambiguities. Cognition, 48(3), p.243-279.
Lewin, I. (1990). A Quantifier Scoping Algorithm without a Free Variable Constraint. In Proceedings of COLING 90, Helsinki, vol 3, p.190-194.
Lewin, I. (1992). Dynamic Quantification in Logic and Computational Semantics. Research Report, Centre for Cognitive Science, University of Edinburgh.
Levelt, W.J.M. (1983). Monitoring and Self-Repair in Speech. Cognition, 14, p.41-104.
Marcus, M., D. Hindle, and M. Fleck (1983). D-Theory: Talking about Talking about Trees. In Proceedings of the 21st ACL, Cambridge, Mass., p.129-136.
Marslen-Wilson, W. (1973). Linguistic Structure and Speech Shadowing at Very Short Latencies. Nature, 244, p.522-523.
Mellish, C.S. (1985). Computer Interpretation of Natural Language Descriptions. Chichester: Ellis Horwood.
Milward, D.R. (1991). Axiomatic Grammar, Non-Constituent Coordination, and Incremental Interpretation. Ph.D. Thesis, University of Cambridge.
Milward, D.R. (1992). Dynamics, Dependency Grammar and Incremental Interpretation. In Proceedings of COLING 92, Nantes, vol 4, p.1095-1099.
Moortgat, M. (1988). Categorial Investigations: Logical and Linguistic Aspects of the Lambek Calculus. Dordrecht: Foris.
Pulman, S.G. (1986). Grammars, Parsers, and Memory Limitations. Language and Cognitive Processes, 1(3), p.197-225.
Resnik, P. (1992). Left-corner Parsing and Psychological Plausibility. In Proceedings of COLING 92, Nantes, vol 1, p.191-197.
Shieber, S.M. and M. Johnson (1993). Variations on Incremental Interpretation. Journal of Psycholinguistic Research, 22(2), p.287-318.
Shieber, S.M. and Y. Schabes (1990). Synchronous Tree-Adjoining Grammars. In Proceedings of COLING 90, Helsinki, vol 3, p.253-258.
Stabler, E.P. (1991). Avoid the Pedestrian's Paradox. In R. Berwick, S. Abney, and C. Tenny, Eds., Principle-Based Parsing: Computation and Psycholinguistics. Kluwer.
Steedman, M. (1988). Combinators and Grammars. In R. Oehrle et al., Eds., Categorial Grammars and Natural Language Structures, p.417-442.
Thompson, H., M. Dixon, and J. Lamping (1991). Compose-Reduce Parsing. In Proceedings of the 29th ACL, p.87-97.
Tomita, M. (1985). Efficient Parsing for Natural Language. Kluwer.
Veltman, F. (1990). Defaults in Update Semantics. In H. Kamp, Ed., Conditionals, Defaults and Belief Revision, DYANA Report 2.5.A, Centre for Cognitive Science, University of Edinburgh.
Wirén, M. (1990). Incremental Parsing and Reason Maintenance. In Proceedings of COLING 90, Helsinki, vol 3, p.287-292.
Zeevat, H. (1990). Static Semantics. In J. van Benthem, Ed., Partial and Dynamic Semantics I, DYANA Report 2.1.A, Centre for Cognitive Science, University of Edinburgh.