American Journal of Computational Linguistics 
Microfiche 13
UNDERSTANDING 
CONCEPTUAL INFERENCE 
Computer Science Department 
University of Maryland 
College Park 
Copyright 1975 
Association for Computational Linguistics 
ABSTRACT
Any theory of language must also be a theory of inference
and memory. It does not appear to be possible to "understand"
even the simplest of utterances in a contextually meaningful way
in a system in which language fails to interact with a language-
free memory and belief system, or in a system which lacks a
spontaneous inference reflex.
People apply a tremendous amount of cognitive effort to
understanding the meaning content of language in context. Most
of this effort is of the form of spontaneous conceptual inferences
which occur in a language-independent meaning environment. I
have developed a theory of how humans process the meaning content
of utterances in context. The theory is called Conceptual Memory,
and has been implemented by a computer program which is designed
to accept as input analyzed Conceptual Dependency (Schank et al.)
meaning graphs, to generate many conceptual inferences as auto-
matic responses, then to identify points of contact among those
inferences in "inference space". Points of contact establish new
pathways through existing memory structures, and hence "knit"
each utterance in with its surrounding context.
Sixteen classes of conceptual inference have been identified
and implemented, at least at the prototype level. These classes
appear to be essential to all higher-level language comprehension
processes. Among them are causative/resultative (those which
predict cause and effect relations), motivational (those which
predict and describe actors' intentions), enablement (those which
predict the surrounding context of actions), state-duration (those
which predict the fuzzy duration of various states in the world),
normative (those which assess the "normality" of a piece of
information - how unusual it is), and specification (those which
predict and fill in missing conceptual information in a language-
communicated meaning graph).
Interactions of conceptual inference with the language
processes of (1) word sense promotion in context, and (2) ident-
ification of referents to memory tokens are discussed. A theoreti-
cally important inference-reference "relaxation cycle" is identified,
and its solution discussed.
The theory provides the basis of a computationally effective 
model of language comprehension at a deep conceptual level, and 
should therefore be of interest to computational linguists, 
psychologists and computer scientists alike. 
TABLE OF CONTENTS

1. The Need for a Theory
   of Conceptual Memory and Inference
2. A Simple Example .......................................  7
3. Background ............................................. 10
4. A Brief Overview of the Conceptual
   Memory's Inference Control Structure ................... 15
5. The Sixteen Theoretical Classes
   of Conceptual Inference ................................ 20
   5.1.  CLASS 1: Specification Inferences ................ 21
   5.2.  CLASSES 2 and 3: Resultative
         and Causative Inferences ......................... 25
   5.3.  CLASS 4: Motivational Inferences ................. 26
   5.4.  CLASS 5: Enabling Inferences ..................... 27
   5.5.  CLASS 6: Action Prediction Inferences ............ 28
   5.6.  CLASS 7: Enablement Prediction
         Inferences ....................................... 30
   5.7.  CLASS 8: Function Inferences ..................... 31
   5.8.  CLASSES 9 and 10: Missing Enablement
         and Intervention Inferences ...................... 33
   5.9.  CLASS 11: Knowledge Propagation
         Inferences ....................................... 34
   5.10. CLASS 12: Normative Inferences ................... 35
   5.11. CLASS 13: State Duration Inferences .............. 38
   5.12. CLASSES 14 and 15: Feature and
         Situation Inferences ............................. 41
   5.13. CLASS 16: Utterance Intent
         Inferences ....................................... 42
6. Summary of the Inference Component ..................... 43
7. The Inference-Reference Relaxation Cycle
   in Conceptual Memory ................................... 44
8. Word Sense Promotion and Implicit Concept
   Activation in the Conceptual Memory .................... 48
Conclusion
APPENDIX A. Causal Chain Expansion
            Computer Example .............................. 52
APPENDIX B. Inference-Reference Relaxation Cycle
            Computer Example .............................. 57
REFERENCES ................................................ 62
1. The Need for a Theory of Conceptual Memory and Inference
Research in natural language over the past twenty years has
been focused primarily on processes relating to the analysis of
individual sentences (parsing). Most of the early work was devoted
to syntax. Recently, however, there has been a considerable
thrust in the areas of semantic and, importantly, conceptual
analysis (see (R2), (M1), (S1) and (C1) for example). Whereas a
syntactic analysis elucidates a sentence's surface syntactic
structure, typically by producing some type of phrase-structure
parse tree, conceptual analysis elucidates a sentence's meaning
(the "picture" it produces), typically via production of an
interconnected network of concepts which specifies the interrela-
tionships among the concepts referenced by the words of the
sentence. On the one hand, syntactic sentence analysis can more
often than not be performed "locally", that is, on single
sentences, disregarding any sort of global context; and it is
reasonably clear that syntax has generally very little to do
with the meaning of the thoughts it expresses. Hence, although
syntax is an important link in the understanding chain, it is
little more than an abstract system of encoding which does not
for the most part relate in any meaningful way to the information
it encodes. On the other hand, conceptual sentence analysis, by
its very definition, is forced into the realm of general world
knowledge; a conceptual analyzer's "syntax" is the set of rules
which can produce the range of all "reasonable" events that
might occur in the real world. Hence, in order to parse concep-
tually, the conceptual analyzer must interact with a repository
of world knowledge and world knowledge handlers (inferential
processes). This need for such an analyzer-accessible world
knowledge repository has provided part of the motivation for
the development of the following theory of conceptual inference
and memory.
However, the production of a conceptual network from an
isolated sentence is only the first step in the understanding
process. After this first step, the real question is: what
happens to this conceptual network after it has been produced
by the analyzer? That is, if we regard the conceptual analyzer
as a specialized component of a larger memory, then the allocation
of memory resources in reaction to each sentence follows the
pattern: (phase 1) get the sentence into a form which is under-
standable, then (phase 2) understand it! It is a desire to
characterize phase 2 which has served as the primary motivation
for developing this theory of memory and inference. In this sense,
the theory is intended to be a charting-out of the kinds of pro-
cesses which must surely occur each time a sentence's conceptual
network enters the system. Although it is not intended to be an
adequate or verifiable model of how these processes might actually
occur in humans, the theory described in this paper has never-
theless been implemented as a computer model under PDP-10
Stanford 1.6 LISP. While the implementation follows as best it
can an intuitively correct approach to the various processes
described, the main intent of the underlying theory is to propose
a set of memory processes which, taken together, could behave
in a manner similar to the way a human behaves when he "understands
language".
2. A Simple Example 
The attentive human mind is a volatile processor. My conjec-
ture is that information simply cannot be put into it in a passive
way; there are very primitive inference reflexes in its logical
architecture which each input meaning stimulus triggers. I will
call these primitive inference reflexes "conceptual inferences",
and regard them as one class of subconscious memory process. I
say "subconscious" because the concern is with a relatively low-
level stratum of "higher-level cognition", particularly insofar
as a human applies it to the understanding of language-communicated
information. The eventual goal is to synthesize in an artificial
system the rough flow of information which occurs in any normal
adult response to a meaningfully-connected sequence of natural
language utterances. This of course is a rather ambitious project.
In this paper I will discuss some important classes of conceptual
inference and their relation to a specific formalism I have
developed (R1).
Let me first attempt, by a fairly ludicrous example, to
convince you (1) that your mind is more than a simple receptacle
for data, and (2) that you often have little control over the
thoughts that pop up in response to something you perceive. Read
the following sentence, pretending you were in the midst of an
absorbing novel:
EARLIER THAT EVENING, MARY SAID SHE HAD KILLED HERSELF. 
One of two things probably occurred: either you chose as referent
of "herself" some person other than Mary (in which case every-
thing works out fine), or (as many people seem to do) you first
identified "herself" as a reference to Mary. In this case,
something undoubtedly seemed awry: you realized either that your
choice of referent was erroneous, that the sentence was part of
some unspecified "weird" context, or that there was simply an
out-and-out contradiction. Of course, all three interpretations
are unusual in some sense because of a "patently obvious"
contradiction in the picture this utterance elicits. The sentence
is syntactically and semantically impeccable; only when we "think
about it" does the big fog horn upstairs alert us to the implicit
contradiction:
    MARY SPEAK AT TIME T
          |  enablement inference
          v
    MARY ALIVE AT TIME T
          <---- contradiction ---->
    MARY NOT ALIVE AT TIME T
          ^  state-duration inference
          |
    MARY CEASES BEING ALIVE AT TIME T-d
          ^  resultative inference
          |
    MARY KILLS HERSELF AT TIME T-d
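The chain of reflex inferences just diagrammed can be sketched mechanically as a small forward-chaining rule set. The following fragment is only an illustrative sketch with invented rule and predicate names, not the Conceptual Memory's machinery: the three rules stand in for the enablement, resultative, and state-duration inferences, and a contradiction is any fact whose negation is also present.

```python
# Hypothetical forward-chaining sketch of the three reflex inferences
# (enablement, resultative, state-duration); not the CM's actual code.

def infer(facts):
    """Apply the reflex rules repeatedly until no new facts appear."""
    rules = [
        # enablement: speaking at T requires being alive at T
        lambda f: {("ALIVE", "MARY", "T")} if ("SPEAK", "MARY", "T") in f else set(),
        # resultative: killing oneself at T-d means ceasing to be alive at T-d
        lambda f: {("CEASES-ALIVE", "MARY", "T-d")} if ("KILL-SELF", "MARY", "T-d") in f else set(),
        # state-duration: having ceased to be alive at T-d, not alive at the later time T
        lambda f: {("NOT-ALIVE", "MARY", "T")} if ("CEASES-ALIVE", "MARY", "T-d") in f else set(),
    ]
    facts = set(facts)
    changed = True
    while changed:
        new = set().union(*(r(facts) for r in rules)) - facts
        changed = bool(new)
        facts |= new
    return facts

def contradictions(facts):
    """A fact clashes when its NOT- counterpart is also believed."""
    return [f for f in facts if ("NOT-" + f[0], f[1], f[2]) in facts]

facts = infer({("SPEAK", "MARY", "T"), ("KILL-SELF", "MARY", "T-d")})
print(contradictions(facts))  # the ALIVE / NOT-ALIVE clash at time T
```

Note that no goal direction is involved: the rules fire blindly, and the contradiction simply falls out of the expanded fact set.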
Here is the argument: before reading the sentence, you
probably had no suspicion that what you were about to read contained
an implicit contradiction. Yet you probably discovered that
contradiction effortlessly! Could there have been any a priori
"goal direction" to the three simple inferences above? My
conclusion is that there could not have been. If we view the
mind as a multi-dimensional "inference space", then each incoming
thought produces a spherical burst of activity about the point
where it lands in this space (the place where the conceptual
network representing it is stored). The horizon of this sphere
consists of an advancing wavefront of inferences - spontaneous
probes which are sent out from the point. Most will
lose momentum and eventually atrophy; but a few will conjoin with
inferences on the horizons of other points' spheres. The sum of
these "points of contact" represents the integration of the
thought into the existing fabric of the memory, in that each point
of contact establishes a new pathway between the new thought and
existing knowledge (or perhaps among several existing pieces of
knowledge). This to me is a pleasing memory paradigm, and there
is a tempting analogy to be drawn with neurons and actual physical
wavefronts as proposed years ago by researchers such as John
Eccles (E1). The drawing of this analogy is, however, left for
the pleasure of you, the reader.
This killing example was of course more pedagogical than
serious, since it is a loaded utterance involving rather black
and white, almost trivial inferences. But it suggests a powerful
low-level mechanics for general language comprehension. Later,
I will refer you to an example which shows how the implemented
model, called MEMORY and described in (R1), reacts to the more
interesting example MARY KISSED JOHN BECAUSE HE HIT BILL, which
is perceived in a particular context. It does so in a way that
integrates the thought into the framework of that context and
which results in a "causal chain expansion" involving six
probabilistic inferences.
3. Background 
Central to this theory are sixteen classes of spontaneous
conceptual inferences. These classes are abstract enough to be
divorced from any particular meaning representation formalism.
However, since they were developed concurrently with a larger
model of conceptual memory (R1) which is functionally a part of
a language comprehension system involving a conceptual analyzer
and generator (MARGIE (S3)), it will help make the following
presentation more concrete if we first have a brief look at the
operation and goals of the conceptual memory, in the context of
the complete language comprehension system.
The memory adopts Schank et al.'s theory (S1, S2) of Conceptual
Dependency (CD) as its basis for representation. CD is a theory
of meaning representation which posits the existence of a small
number of primitive actions (eleven are used by the conceptual
memory), a number of primitive states, and a small set of
connectives (links) which can join the actions and states
together into conceptual graphs (networks). Typical of the links
are:

    the ACTOR-ACTION "main" link
    the ACTION-OBJECT link
    the CAUSAL link
    the DIRECTIVE link
    and the STATECHANGE link

Each primitive action has a case framework which defines conceptual
slots which must be filled whenever the act appears in a conceptual
graph. There are in addition TIME, LOCation and INSTrumental
links, and these, as are all conceptual cases, are obligatory,
even if they must be inferentially filled in by the conceptual
memory (CM). Figure 1 illustrates the CD representation of the
sentence MARY KISSED JOHN BECAUSE HE (JOHN) HIT BILL. That
conceptual graph is read as follows: John propelled some unspec-
ified object X from himself toward Bill, causing X to come into
physical contact with Bill, and this entire event caused Mary to
do something which resulted in her lips being in physical contact
with John! Furthermore, the entire event occurred sometime in
the past. Chapter 2 of (R1) contains a fairly complete overview
of the CD representation.
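Read as data, a CD graph of this sort is just a small network of primitive acts, case slots, and causal links. The toy encoding below is purely illustrative - the field names are invented, not Schank's notation - but it shows the shape of the graph for the kissing sentence and the sense in which a case slot can be present but unfilled:

```python
# Toy encoding of the CD graph for "Mary kissed John because he hit Bill."
# Field names are illustrative inventions, not CD's actual notation.

hit = {
    "act": "PROPEL",               # one of CD's small set of primitive acts
    "actor": "JOHN",
    "object": "X?",                # unspecified object, left to inference
    "from": "JOHN", "to": "BILL",  # the DIRECTIVE case
    "result": ("PHYSCONT", "X?", "BILL"),
    "time": "PAST",
}
kiss = {
    "act": "DO",                   # Mary did something unspecified...
    "actor": "MARY",
    "result": ("PHYSCONT", "LIPS-OF-MARY", "JOHN"),  # ...lips in contact with John
    "time": "PAST",
}
event = {"link": "CAUSE", "antecedent": hit, "consequent": kiss}

# Case slots are conceptually obligatory even when the utterance leaves
# them unfilled; here the object of the PROPEL is still unknown.
missing = [k for k, v in hit.items() if isinstance(v, str) and v.endswith("?")]
print(missing)  # ['object']
```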
Assuming the conceptual analyzer (see (R2)) has constructed,
in consultation with the CM, a conceptual graph of the sort
typified by Figure 1, the first step for the CM is to begin
"integrating" it into some internal memory structure which is more
amenable to the kinds of active inference manipulations the CM
wants to perform. This initial integration occurs in three stages.
First is an initial attempt to replace the symbols (JOHN, MARY,
BILL, X, etc.) by pointers to actual memory concepts and tokens
of concepts. Each concept and token in the CM is represented by
a unique LISP atom (such as C0347) which itself bears no intrinsic
meaning. Instead, the essence of the concept or token is captured
in a set of features associated with the symbol. Thus, for
instance, an internal memory token with no features is simply
"something" if it must be expressed by language, whereas the
token illustrated in Figure 2 would represent part of our
knowledge about Bill's friend Mary Smith, a female human who
owns a red Edsel, lives at 222 Avenue St., is 26 years old, and
so forth. This set of features is called C0948's occurrence set,
and is in the implementation merely a set of pointers to all
other memory structures in which C0948 occurs. The process of
referent identification will attempt to isolate one token, or
second best, a set of candidate tokens for each concept symbol
in the incoming graph by means of a feature-intersecting algorithm
described in (R1).
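The feature-intersecting idea can be sketched in a few lines. The fragment below is a hypothetical simplification of the algorithm described in (R1): each candidate token is scored by how many of the incoming features its occurrence set shares, and the best-scoring tokens survive. Token names and features echo the Mary Smith example; everything else is invented for illustration.

```python
# Sketch of feature-intersection referent identification (a hypothetical
# simplification of the algorithm described in (R1)).

memory = {
    "C0948": {("ISA", "#PERSON"), ("SEX", "#FEMALE"), ("NAME", "MARY"),
              ("SURNAME", "SMITH")},
    "C1022": {("ISA", "#PERSON"), ("SEX", "#FEMALE"), ("NAME", "MARY"),
              ("SURNAME", "JONES")},
    "C0713": {("ISA", "#CAR"), ("COLOR", "#RED")},
}

def candidates(features):
    """Return the tokens whose occurrence sets best intersect the features."""
    scores = {tok: len(feats & features) for tok, feats in memory.items()}
    best = max(scores.values())
    return [tok for tok, s in scores.items() if s == best and s > 0]

# "Mary" with no surname: two candidate tokens survive (second-best outcome)
print(sorted(candidates({("ISA", "#PERSON"), ("NAME", "MARY")})))
# "Mary Smith": a unique referent
print(candidates({("NAME", "MARY"), ("SURNAME", "SMITH")}))
```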
Reference identification is the first stage of the initial
integration of the graph into internal memory structures.

[FIGURE 1: Conceptual Dependency representation of the sentence
"Mary kissed John because he hit Bill." (The graph does not
reproduce legibly here; its reading is given in the text above.)]

FIGURE 2: The memory token and its occurrence set which
represent Mary Smith:

    (ISA # #PERSON)
    (SEX # #FEMALE)
    (NAME # MARY)
    (SURNAME # SMITH)
    (OWNS # C0713)
    (RESIDENCE # C0846)
    (TSTART # C0654)

(C0713 is the token representing the Edsel which Mary owns,
C0846 is the token for Mary's place of residence, C0654 is a
time token with a numeric value on the CM's time scale rep-
resenting Mary's time of birth in 1948)

The second and third stages are (2) the isolation of the subgraphs
which will form the beginning inference queue (input to the spon-
taneous inference component), and (3) the storage of the graph
dependency links themselves as pointers in the memory. Just as for
simple concepts and tokens, composite structures (actions and
states) are stored under a unique internal symbol, and this symbol
may also have an occurrence set. In addition, there are several
other properties associated with each composite structure S:
the recency of S's activation by explicit reference (RECENCY),
the recency of S's activation by implicit (inferential) reference
(TOUCHED), the degree to which S is held to be true (STRENGTH),
a list of other composite structures from which S arose in the
memory - its inferential antecedents (REASONS), and a list of
other composite structures in whose generation S played a role
as antecedent (OFFSPRING). RECENCY and TOUCHED are also prop-
erties of concepts and tokens, and are used in the referent
identification process.
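These bookkeeping properties can be pictured as a record attached to each composite structure. The class below is only an illustrative rendering: the field names follow the text, but the representation itself is invented, not the CM's property-list implementation.

```python
from dataclasses import dataclass, field

# Illustrative record of the properties the CM keeps on each composite
# structure; names follow the text, the representation is invented.

@dataclass
class Composite:
    symbol: str                      # unique internal symbol, e.g. "C2496"
    recency: int = 0                 # last explicit reference (RECENCY)
    touched: int = 0                 # last inferential reference (TOUCHED)
    strength: float = 1.0            # degree of belief (STRENGTH)
    reasons: list = field(default_factory=list)    # inferential antecedents
    offspring: list = field(default_factory=list)  # inferences it helped spawn

premise = Composite("C2301", strength=0.9)
inference = Composite("C2496", strength=0.85, reasons=[premise])
premise.offspring.append(inference)   # REASONS and OFFSPRING are inverses

print(inference.reasons[0].symbol)
```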
Figure 3 shows the memory structures which result from the
conceptual graph of Figure 1 after the initial integration. The
net result of the initial integration is a set of starting memory
structures (actually, a list of pointers to their symbols, such
as (C2496 C2301 C2207)). Each of these structures references
memory concepts, tokens and other composite structures.
Regarding the referent identification process, for those
concepts and tokens which could not be uniquely identified, new
temporary tokens will have been created, each having as its
initial occurrence set a list of what is known about the entity
so far.

After the initial integration, the inference component is
applied simultaneously to each memory structure ("point in
inference space") on the starting inference queue.
[FIGURE 3: The internal memory structures resulting from the
CD graph of Fig. 1. The structures do not reproduce legibly
here; they link the CAUSE, PROPEL and PHYSCONT structures to
the tokens for John, Mary, Bill, the unspecified object, and
a time token. (N is the numeric "now" on the CM's time scale.)]
4. A Brief Overview of the Conceptual Memory's Inference
Control Structure

The control structure which implements the CM inference reflex is a
breadth-first monitor whose queue at any moment is a list of pointers to
dependency structures which have arisen by inference from the beginning
structures isolated during the initial integration. It is the inference
monitor's task to examine each dependency structure on the queue in turn,
isolate its predicate, prepare its arguments in a standard format, collect
several time aspects from the structure's occurrence set, then call the
inference molecule associated with the predicate, passing along the argu-
ments and time information.
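The loop just described can be sketched as an ordinary breadth-first queue. The fragment below shows only the shape of the control structure, not the CM's LISP implementation: names are hypothetical, inference molecules are reduced to plain functions from an argument tuple to a list of new structures, and time aspects are omitted.

```python
from collections import deque

# Breadth-first inference monitor sketch (hypothetical names; inference
# molecules reduced to plain functions keyed by conceptual predicate).

def monitor(start_structures, molecules, max_new=100):
    queue = deque(start_structures)
    produced = []
    while queue and len(produced) < max_new:
        struct = queue.popleft()
        predicate, args = struct[0], struct[1:]   # isolate predicate + arguments
        molecule = molecules.get(predicate)       # one molecule per predicate
        if molecule is None:
            continue
        for inference in molecule(args):          # the molecule's responses
            produced.append(inference)
            queue.append(inference)               # expands the next "ring" of the sphere
    return produced

# Toy molecules: PROPEL yields physical contact; PHYSCONT yields nothing more.
molecules = {
    "PROPEL": lambda args: [("PHYSCONT", args[1], args[2])],
    "PHYSCONT": lambda args: [],
}
out = monitor([("PROPEL", "JOHN", "X?", "BILL")], molecules)
print(out)  # [('PHYSCONT', 'X?', 'BILL')]
```

Because new inferences rejoin the same queue, each pass of the loop advances the entire wavefront by one level, which is the breadth-first character the text describes.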
All inferential knowledge in the CM is contained in inference molecules,
which lie in one-to-one correspondence with conceptual predicates. An inference
molecule is a structured LISP program which can perform arbitrary discrimina-
tion tests on a relevant dependency structure's features and features of all
involved concepts and tokens, and which can call on specialist programs to
carry out standard tests and information retrieval functions. ("CAUSER" is an
example of such a specialist. It will scan back causal sequences from struc-
ture S until it locates a volitional action, then it returns the actor of
that action as the primary causing agent of S.) Inference molecules are
hence multiple-response discrimination networks whose responses are concep-
tual inferences (of the various theoretical types to be described) which can
be made from the dependency structure. Each potential inference within the
inference molecule is called an inference atom.
An inference atom which has been found applicable to the dependency
structure reports eight pieces of information to a component of the monitor
called the structure generator, whose job it is to embody each new inference
in a memory structure. These eight pieces of information are the following:

1. a unique mnemonic which indicates to which of the 16
   theoretical classes the new inference belongs (this
   mnemonic is associated with the new structure only
   temporarily on the inference queue for subsequent
   control purposes)

2. the "reference name" of the generating inference atom
   (each atom has a unique name which is associated with
   the new memory structure for control purposes)

3. the dependency structure (a predicate which binds to-
   gether several pointers to concepts, tokens and other
   structures), which is the substance of the new inference

4. a default "significance factor" which is a rough, ad hoc
   measure of the inference's probable relative significance
   (this is used only if a more sophisticated process, to be
   described, fails)

5. a REASONS list, which is a list of all other structures
   in the CM which were tested by the discrimination net
   leading up to this inference atom. Every dependency
   structure has a REASONS list recording how the struc-
   ture arose, and the REASONS list plays a vital role in
   the generation of certain types of inference

6. a "propagation strength factor" which, when multiplied
   by the STRENGTHs (degrees of belief) of all structures
   on the REASONS list, produces the STRENGTH of the new
   inference. (There is a need for better heuristics here
   incidentally -- see (Z1) for instance.)

7. a list of modifying structures (typically time aspects)
   which become the new inferred structure's initial occur-
   rence set

8. propagation and strength factors for each modifying struc-
   ture

Figure 4 illustrates the small implemented NEGCHANGE (something
undergoes a negative change on some scale) inference molecule. It is
included to communicate the gestalt rather than correct specifics at
this early stage of development.
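Item 6 in the list above amounts to a simple multiplicative rule: the new structure's STRENGTH is the propagation factor times the product of the STRENGTHs on its REASONS list. A minimal sketch (the function name is invented):

```python
import math

# STRENGTH of a new inference = propagation factor * product of the
# STRENGTHs of the structures on its REASONS list (item 6 above).

def inference_strength(propagation, reason_strengths):
    return propagation * math.prod(reason_strengths)

s = inference_strength(0.95, [1.0, 0.9])
print(round(s, 3))  # 0.855
```

The multiplicative form means that an inference resting on several uncertain antecedents is automatically weaker than any of them, which is the behavior the text's cutoff mechanism relies on.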
The two other main components of the inference monitor are the eval-
uator and the structure merger. It is the function of the evaluator to
detect exact and fuzzy contradictions and confirmations (points of contact)
between each new inference as it arises and existing memory dependency struc-
tures. Because "fuzziness" in the matching process implies access to a vast
number of heuristics (to illustrate: would it be more likely for our friend the
(IPROG NEGCHANGE (UN PE SC) (X1 X2) (
  (COND
    ( (EVENT UN)
      (COND
        ( (F1 (@ISA PE @#PERSON))
          (IR @NEGCHANGE1
              (@WANT PE (@POSCHANGE PE SC))
              (0.95 1.0 (CAR UN))           ;PEOPLE OFTEN WANT TO BETTER
              (@TS @* (TI UN))              ;THEMSELVES AFTER SOME NEGCHANGE
              (1.0 (CAR UN)))
          (COND ( (AND (SETQ X1 (F1 (@MFEEL* @- @#NEGEMOTION PE)))
                       (SETQ X2 !GLOBALFEED))
                  (IR @NEGCHANGE2
                      (@POSCHANGE X1 @#JOY)
                      (0.9 1.0 (CAR UN) X2) ;PERSON GETS HAPPY WHEN ENEMY
                      (@TIME @* (TI UN))    ;SUFFERS NEGCHANGE
                      (1.0 (CAR UN)))))
          (COND ( (AND (SETQ X1 (CAUSER (CAR UN)))
                       (NOT (EQ (CAR X1) PE)))
                  (IR @NEGCHANGE3
                      (@MFEEL* PE @#NEGEMOTION (CAR X1))  ;PEOPLE DON'T LIKE
                      (0.95 1.0 (CAR UN) (CDR X1))        ;OTHERS WHO HURT THEM
                      (@TS @* (TI UN))
                      (1.0 (CAR UN)))))
        )
      )
    )
    ( (HASPROP PE (@ISA PE @#PHYSOBJ))
      (COND ( (AND (SETQ X1 (F1 (@OWN* PE)))
                   (SETQ X2 (CAUSER (CAR UN)))
                   (NOT (EQ X1 (CAR X2))))
              (IR @NEGCHANGE4
                  (@MFEEL* X1 @#NEGEMOTION (CAR X2))  ;IF X DAMAGES Y'S PROPERTY,
                  (0.85 1.0 (CAR UN) X1 (CDR X2))     ;THEN X MIGHT FEEL ANGER
                  (@TS @* (TI UN))                    ;TOWARD Y
                  (1.0 (CAR UN)))))
    )
  )
))

An inference molecule used by the current program.

FIGURE 4
The NEGCHANGE inference molecule.
lawyer or our friend the carpenter to own a radial arm saw?),
the evaluator delegates most of the matching responsibility
to programs - again organized by conceptual predicates - called
normality molecules ("N-molecules"). N-molecules, which will
be discussed more later, can apply detailed heuristics to ferret
out fuzzy confirmations and contradictions. As I will describe,
N-molecules also implement one class of conceptual inference.
Confirmations and contradictions discovered by the evaluator
are noted on special lists which serve as sources for possible
subsequent responses by the CM. In addition, confirmations lead
to invocation of the structure merger, which physically replaces
the two matching structures by one new aggregate structure, and
thereby knits together two lines of inference. As events go, this
is one of the most exciting in the CM.
Inference cutoff occurs when the product of an inference's
STRENGTH (likelihood) and its significance factor falls below
a threshold (0.25). This ultimately restricts the radius of each
sphere in the inference space; in the current model, the
threshold is set low to allow considerable expansion.
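The cutoff rule itself is a one-line computation; the sketch below (invented function name, threshold value from the text) simply tests the product against 0.25:

```python
# Inference cutoff: expansion stops along a line of inference when
# STRENGTH * significance falls below the threshold (0.25 in the text).

THRESHOLD = 0.25

def keep(strength, significance):
    return strength * significance >= THRESHOLD

print(keep(0.9, 0.5))   # True:  0.45 survives
print(keep(0.9, 0.2))   # False: 0.18 is cut off
```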
Figure 5 depicts the overall strategy of the inference
monitor. (R1) contains a fuller account of the inference control
structure, which will not be described further here.
Enter, then, the sixteen theoretical classes of conceptual
inference which fuel this inference reflex.
[FIGURE 5: The inference monitor. Schematically: the current
inference queue !NEWINFS feeds each structure to the inference
molecule for its predicate; the structure generator embodies
each response as a new memory structure; the evaluator and the
N-molecules test it against existing structures; cut-off
inferences are set aside on !OFFINFS, and the surviving next
level of inferences, after reordering, becomes the new !NEWINFS.]
5. The Sixteen Theoretical Classes of Conceptual Inference

It is phenomenological that most of the human language experience
focuses on actions, their intended and/or resulting states, and the causal-
ity and enabling states which surround them. There seems to be an ines-
capable core of notions related to actions, causation and enablement which
almost anyone who introspects long enough will independently discover. In
his "Cold Warrior" model, Abelson (A1) was perhaps the first to attempt a
computationally formal systematization of this fundamental core of meaning
relations. It is of the utmost primacy in his system, which models the
political ideologies and behavior patterns of a rabid right-winger, to dis-
cover and relate the underlying purposes, enablement and causality surround-
ing events in some hypothetical international scenario or crisis. Again, in
Schank et al.'s CD theory, the same emphasis arose more or less independently
in a system of meaning representation for everyday utterances: causality,
actions, state-changes and enablement were recurring themes. Not surprisingly,
the same notions have emerged as central in my analysis of the inference re-
flex: over half of the 16 classes relate to this "action-intention-causality-
enablement-knowledge" complex.
In the following descriptions of these 16 classes, keep in mind that all
types of inference are applicable to every subcomponent of every utterance,
and that the CM is essentially a parallel simulation. Also bear in mind that
the inference evaluator is constantly performing matching operations on each
new inference in order to detect interesting interactions between inference
spheres. It should also be emphasized that conceptual inferences are prob-
abilistic and predictive in nature, and that by making them in apparently
wasteful quantities, the CM is not seeking one result or truth. Rather,
inferential expansion is an endeavor which broadens each piece of information
into its surrounding spectrum, to fill out the information-rich situation to
which the information-lean utterance might refer. The CM's gropings will
resemble more closely the solution of a jigsaw puzzle than the more goal-
directed solution of a crossword puzzle.
The following discussions can only sketch the main ideas
behind each inference class. See (R1) for a more comprehensive
treatment.
5.1 CLASS 1: SPECIFICATION INFERENCES

PRINCIPLE: The CM must be able to identify and attempt to fill in
each missing slot of an incoming conceptual graph.

EXAMPLES: **John was driving home from work.
            He hit Bill's cat.
            (inference) It was a car which John propelled into
            the cat.
          **John bought a chalk line.
            (inference) It was probably from a hardware store
            that John bought the chalk line.
DISCUSSION:

Our use of language presupposes a tremendous underlying knowledge about
the world. Because of this, even in, say, the most explicit technical writ-
ing, certain assumptions are made by the writer (speaker) about the compre-
hender's knowledge -- that he can fill in the plethora of detail surrounding
each thought. In the CM, this corresponds to filling in all the missing con-
ceptual slots in a graph.

The utility of such a process is twofold. First, CM failures to specify
a missing concept can serve as a source of requests for more information (or
of goals to seek out that information by CM actions, if the CM is controlling
a robot). Second, by predictively completing the graph through application
of general pattern knowledge of the modeled world, novel relations among
specific concepts and tokens will arise, and these can lead to potentially
significant discoveries by other inferences.
To illustrate, a very common missing slot is the instrumental case.
We generally leave it to the imaginative powers of the hearer to surmise
the probable instrumental action by which some action occurred:

    (husband to wife) I went to SEARS today.
    (wife to husband) How? I had the car all day!

Here, the wife fills in the instrumental slot as "Husband drove a car to
SEARS" (clearly relying on some specific heuristics, such as the distance
from their home to SEARS, etc.), and this led to her discovery of a
contradiction. That she may have been premature in the specification (and
had later to undo it) is of secondary importance to the phenomenon that
she did so spontaneously.
In the CM, specification inferences, as all inferences, are implemented
in the form of structured programs which realize discrimination nets whose
terminal nodes are concepts and tokens rather than inferences, as in general
inference molecules. These specification procedures are called specifier
molecules ("S-molecules"), and are quite similar to inference molecules.
Fig. 6 shows a small prototype of the PROPEL specifier molecule which can
predictively fill in the missing object of a PROPEL action, as in "John hit
Pete." That particular "specifier atom" is sensitive to context along one
simple dimension: if the actor is known to be grasping an object at the time
of the action (this prototype doesn't care whether it's a wet noodle or a
bludgeon), the molecule will infer that it was the grasped object which
was propelled, as in "John picked up the flower pot. He hit Pete." Other-
wise, the molecule will assume "hand of the actor". This is ridiculously
oversimplified, but it represents a certain philosophy I will digress a
moment to reveal.
I, as many other people (see W1, H1, C1, for instance), have come to
believe that passive data structures are fundamentally awkward for repre-
senting knowledge in any detail, particularly for the purposes typified by
(SPROG *PROPEL* (V AC OB ...) #(X1 X2 X3)
  (COND ((NULL (CADR V))
         (COND ((AND (SETQ X1 (... #HAND ... (PART # AC) ...))
                     ...)
                ...))))
  (... other specifier atoms go here ...)
  (RETURN V))

This is a simplified specifier molecule containing just an object
specifier atom. (NULL (CADR V)) is the test for lack of object specification.
If unspecified, the atom locates the hand of the actor, assigning it to X1.
It then checks to see if anything is located in X1. If something is found,
it is bound to X2, and the LOC structure which expresses this information is
bound to X3. If nothing is located in the actor's hand, his hand itself (X1)
is inferred. The (LIST X3) in the first SP call is the list of REASONS (just
one here) justifying the specification of the object the actor was holding
as the object of the PROPEL.

FIGURE 6
The PROPEL specifier molecule.
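The discrimination the Fig. 6 molecule performs can be rendered as a short procedural sketch. The following Python is my own loose caricature, not the CM's MLISP code: the names `Memory`, `hand_of`, and `located_in` are invented stand-ins for the memory lookups the specifier atom makes.

```python
class Memory:
    """Minimal stand-in for the CM's token memory (hypothetical API)."""
    def __init__(self, hands, holdings):
        self.hands = hands          # actor -> hand token
        self.holdings = holdings    # location token -> object held there

    def hand_of(self, actor):
        return self.hands[actor]

    def located_in(self, place):
        return self.holdings.get(place)   # None if nothing is there

def propel_specifier(graph, memory):
    """Fill the missing object of a PROPEL action, as in Fig. 6."""
    if graph.get("object") is not None:   # object already specified: do nothing
        return graph
    hand = memory.hand_of(graph["actor"]) # X1: locate the actor's hand
    held = memory.located_in(hand)        # X2: anything grasped at act time?
    # If something is grasped, infer it was propelled; else "hand of the actor"
    graph["object"] = held if held is not None else hand
    return graph

# "John picked up the flower pot. He hit Pete."
m = Memory(hands={"John": "John-hand"}, holdings={"John-hand": "flowerpot1"})
g = propel_specifier({"actor": "John", "object": None}, m)
# g["object"] is now "flowerpot1"; with an empty hand it would be "John-hand"
```

As in the prototype, this atom probes only one dimension of context; a life-size version would add many more tests.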
this simple PROPEL example. The needs for "special case heuristics" in
even such a modest operation as this quickly overtake one's progress at
devising "declarative" memory structures. Programs, on the other hand,
are quick and to the point, quite flexible, and have as much "aesthetic
potential" as even the most elegant declarative structures.
A life-size procedure for this very narrow process of specifying the
missing object of a PROPEL action would obviously require many more tests
for related contexts ("John was racing down the hill on his bike. He hit
Bill.") But independent of the fidelity with which any given S-molecule
executes its task, there is a very important claim buried both here and
in the other inferential procedures in the CM. It is that there are cer-
tain central tasks in which the decision process must seek out the context,
rather than context seeking out the appropriate decision process. In other
words, much of the inference capability requires specialists who know a
priori exactly what dimensions of context could possibly affect the gener-
ation of every potential inference, and these specialists carry out active
probes to search for those dimensions before any inference is generated.
I can imagine no "uniform context mechanism" which accounts for the human's
diverse ability to attend to the relevant and ignore the superfluous. My
conjecture is that the mechanism for contextual guidance of inference is
highly distributed throughout the memory rather than centralized as a com-
ponent of the memory's control structure.
5.2 CLASSES 2 and 3: RESULTATIVE and CAUSATIVE INFERENCES
PRINCIPLE: If an action is perceived, its probable resulting states
should be inferred (RESULTATIVE). If a state is perceived,
the general nature of its probable causing action (or a
specific action, if possible) should be inferred (CAUSATIVE).
EXAMPLES: **Mary hit Pete with a rock.
(inference) Pete probably became hurt. (RESULTATIVE)
**Bill was angry at Mary.
(inference) Mary may have done something to Bill. (CAUSATIVE)
DISCUSSION:
These two classes of inference embody the CM's ability to relate ac-
tions and states in causal sequences relative to the CM's models of
causality. In addition to serving as the basis for MOTIVATIONAL in-
ferences and contributing to the general expansion process, CAUSATIVE
and RESULTATIVE inferences often achieve the rather exotic form of under-
standing I have termed "causal chain expansion." It is this process which
makes explicit the oft-abbreviated statements of causality: language-com-
municated predications of causality must always (if only subconsciously)
be explained in terms of the comprehender's models of causality, and fail-
ures to do so signal a lack of understanding and form another source of CM
queries for more information. Causal expansion successes, on the other hand,
result in important intervening actions and states which draw out ("touch")
surrounding context and serve as the basis for inferences in other cate-
gories.
Appendix A contains the computer printout from MEMORY, tracing a
causal expansion for "Mary kissed John because he hit Bill" in a particular
context which makes the explanation plausible.
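The two directions of causal inference can be caricatured with a toy rule table. This Python sketch is mine, not the CM's representation; the predicate spellings (PROPEL, HURT, etc.) are invented shorthand for Conceptual Dependency structures.

```python
# Forward model: action pattern -> probable resulting states (RESULTATIVE)
RESULTS = {
    "PROPEL(x, obj, y)": ["HURT(y)", "PHYSCONT(obj, y)"],
}
# Backward model: state pattern -> general class of causing actions (CAUSATIVE)
CAUSES = {
    "ANGRY(x, y)": ["DO(y, something-negative-to-x)"],
}

def resultative(action):
    """Action perceived: infer its probable resulting states."""
    return RESULTS.get(action, [])

def causative(state):
    """State perceived: infer the general nature of its probable cause."""
    return CAUSES.get(state, [])
```

Chaining these two in alternation is, in miniature, what causal chain expansion does: each inferred state invites a causative step, and each inferred action invites a resultative one.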
5.3 CLASS 4: MOTIVATIONAL INFERENCES
PRINCIPLE: The desires (intentions) of an actor can frequently be
inferred by analyzing the states (RESULTATIVE inferences)
which result from an action he executes. These WANT-STATE
patterns are essential to understanding and should be made
in abundance.
EXAMPLES: **John pointed out to Mary that she hadn't done her chores.
(inference) Mary may have felt guilty. (RESULTATIVE)
(inference) John may have wanted Mary to feel guilty.
(MOTIVATIONAL)
**Andy blew on the hot meat.
(inference) Andy may have wanted the meat to decrease
in temperature. (MOTIVATIONAL)
DISCUSSION:
Language is a dual system of communication in that it usually com-
municates both the actual and, either explicitly or by inference, the
intentional. Where the intentions of actors (the set of states they de-
sire) are not explicitly communicated, they must be inferred as the
immediate causality of the action. In the CM, candidates for MOTIVATIONAL
inferences are the RESULTATIVE inferences the CM can produce from
an action A: for each RESULTATIVE inference Ri which the CM could make
from A, it conjectures that perhaps the actor of A desired Ri.
Since the generation of MOTIVATIONAL inferences is dependent upon the
results of another class of inference (in general, the actor could have
desired things causally removed by several inferences from the immediate
results of his action), the MOTIVATIONAL inference process is implemented
by a special procedure, POSTSCAN, which is invoked between "passes" of the
main breadth-first monitor. These passes will be discussed more later.
Once generated, each MOTIVATIONAL inference will generally lead back-
ward, via CAUSATIVE inferences, into an entire causal chain which led up
to the action. This chain will frequently connect in interesting ways with
chains working forward from other actions.
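The MOTIVATIONAL step POSTSCAN performs is simple enough to sketch directly. The function below is an illustration of the conjecture "perhaps the actor desired Ri"; the tuple forms and names are mine, not the CM's structures.

```python
def postscan_motivational(actor, resultative_inferences):
    """POSTSCAN's MOTIVATIONAL step, sketched: for each RESULTATIVE
    inference Ri already drawn from the actor's action, conjecture
    that the actor wanted Ri, i.e. produce (WANT actor Ri)."""
    return [("WANT", actor, ri) for ri in resultative_inferences]

# "Andy blew on the hot meat" -> RESULTATIVE: the meat's temperature drops
wants = postscan_motivational("Andy", ["TEMP-DECREASE(meat)"])
# -> [("WANT", "Andy", "TEMP-DECREASE(meat)")]
```

Because it consumes the output of another inference class, this step naturally runs after a breadth-first pass rather than inside it, which is why the CM segregates it into POSTSCAN.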
5.4 CLASS 5: ENABLING INFERENCES
PRINCIPLE: Every action has a set of enabling conditions -- conditions
which must be met for the action to begin or proceed. The
CM needs a rich knowledge of these conditions (states), and
should infer suitable ones to surround each perceived action.
EXAMPLES: **John saw Mary yesterday.
(inference) John and Mary were in the same general location
sometime yesterday.
**Mary told Pete that John was at the store.
(inference) Mary knew that John was at the store.
DISCUSSION:
The example at the beginning of the paper contained a contradiction
which could be discovered only by making a very simple enabling inference
about the action of speaking (any action for that matter), namely that the
actor was alive at the time! Enabling inferences can fruitfully lead from
the known action through the enabling states to predictions about other
actions the actor might have performed in order to set up the enabling states
for the primary action. This idea is closely related to the next class of
inference.
5.5 CLASS 6: ACTION PREDICTION INFERENCES
PRINCIPLE: Whenever some WANT STATE of a potential actor is known,
predictions about possible actions the actor might perform
to achieve the state should be attempted. These predic-
tions will provide potent potential points of contact
for subsequently perceived actions.
EXAMPLES: **John wants some nails.
(inference) John might attempt to acquire some nails.
**Mary is furious at Rita.
(inference) Mary might do something to hurt Rita.
DISCUSSION:
Action prediction inferences serve the inverse role of MOTIVATIONAL
inferences, in that they work forward from a known WANT STATE pattern into
predictions about future actions which could produce the desired state.
Just as a MOTIVATIONAL inference relies upon RESULTATIVE inferences, an
ACTION PREDICTION inference relies upon CAUSATIVE inferences which can be
generated from the state the potential actor desires. Because it is often
impossible to anticipate the specific causing action, ACTION PREDICTION
inferences typically will be more general expectancies for a class of pos-
sible actions. In the nails example above, the general expectancy is sim-
ply that John may do something which normally causes a PTRANS (in CD ter-
minology, a change of location of some object) of some nails from somewhere
to himself. Often the nature of the desired state is such that some specific
action can be predicted ("John is hungry. . . John will ingest food.") By mak-
ing specific action predictions, a new crop of enabling inferences can be pre-
dicted ("John must be near food.", etc.), and those conditions which cannot be
assumed to be already satisfied can serve as new WANT-STATEs of the actor.
Thus it is through MOTIVATIONAL, ACTION PREDICTION and ENABLING inferences
that the CM can model (predict) the problem-solving behavior of each actor.
Predicted actions which match up with subsequently perceived conceptual
input serve as a very real measure of the CM's success at piecing together con-
nected discourse and stories. I suspect in addition that ACTION PREDICTION
inferences will play a key role in the eventual solution of the "contextual
guidance of inference" problem. Levy (L1) has some interesting beginning
thoughts on this topic.
Fig. 7 illustrates the ACTION PREDICTION inference cycle.
[Diagram: Starting from a structure (WANT P S), "cancause" relations
(CAUSATIVE inferences from S) lead to candidate actions A1, A2, A3, ....
For each enabling state AiEj of action Ai which cannot be explicitly
found or assumed to be true, the process infers a new structure
(WANT P AiEj).]

AiEj is enabling state j for action Ai.

FIGURE 7
The action prediction inference process.
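One turn of this cycle can be sketched as a function over a WANT structure. This Python is a hypothetical rendering of the Fig. 7 process, not CM code; `causative` and `enablements` stand in for the CM's CAUSATIVE inference and enabling-condition knowledge.

```python
def action_prediction(want_state, causative, enablements, known_states):
    """One turn of the action prediction cycle: from (WANT P S), find
    candidate actions Ai that can cause S (via CAUSATIVE inference), and
    for each enabling state AiEj that cannot be found or assumed true,
    spawn a new want (WANT P AiEj)."""
    person, state = want_state
    new_wants = []
    for action in causative(state):          # actions Ai that can cause S
        for cond in enablements(action):     # enabling states AiEj of Ai
            if cond not in known_states:     # not explicitly found/assumable
                new_wants.append(("WANT", person, cond))
    return new_wants

# "John wants some nails": the expectancy is a PTRANS of nails to John,
# which in turn requires John to be near some nails.
wants = action_prediction(
    ("John", "HAS(John, nails)"),
    causative=lambda s: ["PTRANS(John, nails, John)"]
                        if s == "HAS(John, nails)" else [],
    enablements=lambda a: ["NEAR(John, nails)"],
    known_states=set(),
)
# -> [("WANT", "John", "NEAR(John, nails)")]
```

Feeding each new want back into the same function models the actor's problem-solving regress from goal to subgoal.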
5.6 CLASS 7: ENABLEMENT PREDICTION INFERENCES
PRINCIPLE: If a potential actor desires a state which is a common
enabling condition for some specific action, then it
can be inferred that the actor might wish to execute
that action.
EXAMPLES: **Mary asked John to turn on the light.
(inference) Mary probably wants to see something.
**Andy wants the meat to be cool.
(inference) Andy might want to eat the meat.
DISCUSSION:
Inferences in this class are, in a sense, the inverse of ENABLING
inferences, because they attempt to predict an action from an enabling
state known to be desired by a would-be actor. Whereas an ACTION PREDIC-
TION inference predicts a possible future action to fulfill the desired
state, enablement prediction draws out the motivation of the desire for the
state by identifying a probable action the state would enable. Although
(as with ACTION PREDICTION inferences) it will frequently happen that no
specific action can be anticipated (since most states could enable infinite-
ly many specific actions), it is nevertheless possible to form general pre-
dictions about the nature of (restrictions on) the enabled action. If,
for example, John walks over to Mary, then a RESULTATIVE inference is that
he is near Mary, and a MOTIVATIONAL inference is that he wants to be near
Mary. At this point an ENABLEMENT PREDICTION inference can be made to repre-
sent the general class of interactions John might have in mind. This will
be of particular significance if, for instance, the CM knows already that
John had something to tell her, since then the inferred action pattern would
match quite well the action of verbal communication in which the state of
spatial proximity plays a key enabling role.
5.7 CLASS 8: FUNCTION INFERENCES
PRINCIPLE: Control over some physical object P is usually desired by
a potential actor because he is engaged in an algorithm in
which P plays a role. The CM should attempt to infer
a probable action from its knowledge of P's normal func-
tion.
EXAMPLES: **Mary wants the book.
(inference) Mary probably wants to read the book.
**John wants a knife.
(inference) John probably wants to cut something with
the knife.
**Bill likes to pour sundaes down girls' dresses.
Bill asked Pete to hand him the sundae. . .
DISCUSSION:
Function inferences form a very diverse, rather colorful subclass
of ENABLEMENT PREDICTION inference. The underlying principle is that
desire of immediate control over an object is usually tantamount to a
desire to use that object in the normal function of objects of that type,
or in some function which is peculiar to the object and/or actor (third
example above). In the CM, normal functions of objects are stored as
(NFCT S Y) patterns, as in Fig. 8 for things that are printed matter.
Before applying NFCT patterns, the CM first checks for unusual relations
involving the specific actor and specific object (by excluding paths which
include the normal ISA relations between, say, sundae and food). Thus, that
Bill is known to require sundaes for slightly different algorithms from most
people will be discovered and used in the prediction. The result of a
FUNCTION inference is always some predicted action, assumed to be part of
some algorithm in which the actor is engaged:
[Structure: (MTRANS * * #PRINTEDMATTER ...), with
(ISA # #PERSON), (ISA ... #CP), (PART # *)]

The memory structure which stores
the normal function of printed matter.

FIGURE 8
A "normal-function-of"
memory structure.
5.8 CLASSES 9 and 10: MISSING ENABLEMENT and
INTERVENTION INFERENCES
PRINCIPLE: If a would-be actor is known to have been unsuccessful
in achieving some action, it is often possible to infer
the absence of one of the action's enabling states (MISS-
ING ENABLEMENT). If a potential actor is known to desire
that some action cease, it can be predicted that he will
attempt to remove one or more enabling states of the
action (INTERVENTION).
EXAMPLES: **Mary couldn't see the horses finish.
(inference) Something blocked Mary's view. (MISSING
ENABLEMENT)
She cursed the man in front of her. . .
**Mary saw that Baby Billy was running out into the street.
(inference) Mary will pick Billy up off the ground. (INTER-
VENTION)
She ran after him ...
DISCUSSION:
Closely related to the other enabling inferences, these forms attempt
to apply knowledge about enablement relations to infer the cause of an
action's failure (in the case of MISSING ENABLEMENT), or to predict a WANT
NOT-STATE which can lead by action prediction inference to possible actions
of intervention on the part of the WANTer. In the second example above,
Mary (and the CM) first must realize (via RESULTATIVE inferences) the
potentially undesirable consequences of Billy's running action (i.e.,
possible NEGCHANGE for Billy). From this, the CM can retrace, locate the
running action which could lead to such a NEGCHANGE, collect its enabling
states, then conjecture that Mary might desire to annul one or more of them.
Among them, for instance, would be that Billy's feet be in intermittent PHYS-
CONT with the ground. From the (WANT (NOT (PHYSCONT FEET GROUND))) struc-
ture, a subsequent ACTION PREDICTION inference can arise, predicting that
Mary might put an end to (PHYSCONT FEET GROUND). This will in turn require
her to be located near Billy, and that prediction will match the RESULTATIVE
inference made from her directed running (the next utterance input), knitting
the two thoughts together.
5.9 CLASS 11: KNOWLEDGE PROPAGATION INFERENCES
PRINCIPLE: Based on what the CM knows an actor to know, it can often
infer other knowledge which must also be available to the
actor. Since most conceptual inferences involve the in-
tentions of actors, this modeling of knowledge is crucial.
EXAMPLES:
**John saw Mary beating Pete with a baseball bat.
(inference) John probably knew that Pete was getting hurt.
**Betty asked Bill for the aspirin.
(inference) Bill probably surmised that Betty wasn't feel-
ing well.
DISCUSSION:
Modeling the knowledge of potential actors is fundamentally difficult.
Yet it is essential, since almost all intention/prediction-related inferences
must be based in part on guesses about what knowledge each actor has avail-
able to him at various times. The CM currently models others' knowledge
by "introspecting" on its own: assuming another person P has access to
the same kinds of information as the CM, P might be expected to make
some of the same inferences the CM does. Since the CM preserves a logical
connectivity among all its inferred structures (by the REASONS and OFFSPRING
properties of each structure), after inferences of other types have arisen
from some unit of information U, the CM can return, determine who knows the
original fact U, locate U's OFFSPRING (those other memory structures which
arose by inference from U), then infer that P may also be aware of each
of the offspring. As with MOTIVATIONAL inferences (which rely on the
RESULTATIVE inferences from a structure), KNOWLEDGE PROPAGATION inferences
are implemented in the procedure POSTSCAN, which runs after the initial
breadth-first inference expansion by the monitor.
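The OFFSPRING-walking step can be sketched compactly. This Python is illustrative only; `offspring_of` is my stand-in for the CM's OFFSPRING property lookup, and the tuple forms are invented.

```python
def knowledge_propagation(knower, fact, offspring_of):
    """If person P is known to know fact U, infer that P may also be
    aware of U's OFFSPRING -- the structures that arose from U by
    inference (a sketch of POSTSCAN's knowledge propagation step)."""
    return [("KNOWS", knower, child) for child in offspring_of(fact)]

# John saw Mary beating Pete; the CM had inferred HURT(Pete) from the beating
offspring = {"BEAT(Mary, Pete)": ["HURT(Pete)"]}
known = knowledge_propagation("John", "BEAT(Mary, Pete)",
                              lambda u: offspring.get(u, []))
# -> [("KNOWS", "John", "HURT(Pete)")]
```

The "introspection" assumption is visible here: the offspring were produced by the CM's own inference reflex, and the sketch simply credits P with the same reflex.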
Modeling others' knowledge demands a rich knowledge of what is normal
in the world ("does John Smith know that kissing is a sign of affection?").
In fact, all inferences must rely upon default assumptions about normality,
since most of the CM's knowledge (and presumably a human's) exists in the
form of general patterns, rather than specific relations among specific con-
cepts and tokens. The next class of inference implements my belief that
patterns, just as inferences, should be realized in the CM by active programs
rather than by passive declarative data structures.
5.10 CLASS 12: NORMATIVE INFERENCES
PRINCIPLE: The CM must make heavy reliance upon programs which encode
commonsense pattern information about the modeled world.
When the retrieval of a sought-after unit of information
fails, the relevant normality program should be executed
on (pattern applied to) that information to assess its
likelihood in the absence of explicit information.
EXAMPLES: **Does John Smith own a book?
(inference) Probably so; middle-class business executives
normally own books.
**Was John likely to have been asleep at 3 pm yesterday?
(inference) Most likely not, since he has a normal day-
time job, and yesterday was a workday.
DISCUSSION:
There are several low-level information retrieval procedures in the
CM which search for explicit information units as directed by specific
inference molecules. Such searches are on the basis of form alone, and
successes result in precise matches, while failures are total. If there
were no recourse for such failures, the CM would quickly grind to a halt,
being unable to make intelligent assumptions. There must be some more
positive and flexible mechanism to ameliorate "syntactic" lookup failures.
In the CM, this ability to make intelligent assumptions is implement-
ed by having the low-level lookup procedures defer control to the appro-
priate normality molecule (N-molecule), which will apply systematic tests,
organized in single-response discrimination nets, to the unlocatable in-
formation. The goal is to arrive at a terminal node in the net where a
real number between 0 and 1 is located. If some sequence of tests
leads to such a number, the N-molecule returns it as the assessed likeli-
hood ("compatibility" in fuzzy logic terminology (Z1)) of X being true.
Although the tests in the N-molecules are themselves discrete, they
result in the fuzzy compatibility. The point of course is that the tests
can encode quite diverse and very specific heuristics peculiar to each small
domain of patterns. For instance, based on known (or N-molecule inferrable --
one N-molecule can call upon others in its testing process!) features of
either John or the hammer, we would expect the compatibility of each of the
following four conjectures to form a decreasing sequence:
1. John Smith owns something. (very likely, but dependent
on his age, the society in which he lives, etc.)
2. John Smith owns a hammer. (probably, but potentially
related to features of John, such as his profession)
3. John Smith owns a claw hammer with a wooden handle.
(maybe, but again dependent on features of John and
models of hammers in general -- i.e., how likely is
any given hammer to have a claw and wooden handle?)
4. John Smith owns a 16 oz. Stanley claw hammer with a
steel-reinforced wooden handle and a tack puller on
the claw. (likelihood is quite low unless the N-mole-
cule can locate some specific hints, such as that
John usually buys good equipment, etc.)
A successful N-molecule assessment results in the creation of the
assessed information as a permanent, explicit memory structure whose STRENGTH
is the assessed compatibility. This structure is the normative inference.
One is quickly awed by his own ability to rate (usually quite accurately)
commonsense conjectures such as these, and the process seems usually to be
quite sensitive to features of the entities involved in the conjecture. It
is my feeling that important insights can be gained via a more thorough
investigation of the "normative inference" process in humans.
Another role of N-molecules is mentioned in (R1) with respect to
the inference-reference cycle I will describe shortly. Fig. 9 shows the
substance of a prototype N-molecule for assessing dependency structures of
the form (OWNS P X) (person P owns object X).
Is P a member of a pure communal society, or is it an infant?
  if so, very unlikely that P owns X
otherwise, does X have any distinctive conceptual features?
  if so, assess each one, form the product of likelihoods, and call it
  N. N will be used at the end to mitigate the likelihood which would
  normally be assigned.
Is X living?
  if so, is X a person?
    is P a slave owner, and does X possess characteristics
    of a slave? if so, likelihood is low but non-zero
    otherwise, likelihood is zero
  otherwise, is X a non-human animal or a plant?
    if so, is X domestic in P's culture?
      if so, does P have a fear of X's, or is P allergic
      to X's of this type?
        if so, likelihood is low
        otherwise, likelihood is moderate
    otherwise, is X related to actions P does in any special
    way?
      if so, likelihood is low, but non-zero
      otherwise, likelihood is near-zero
otherwise, does X have a normal function?
  if so, does P do actions like this normal function? (Note here
  that we would want to look at P's profession, and actions commonly
  associated with that profession.)
    if so, likelihood is moderately high
  otherwise, is X a common personal item?
    if so, is its value within P's means?
      if so, likelihood is high
      if not, likelihood is low, but non-zero
    otherwise, is X a common household item?
      if so, is P a homeowner?
        if so, is X within P's means?
          if so, likelihood is high
          otherwise, likelihood is moderate
      otherwise, likelihood is low, but non-zero

How we might go about deciding
whether person P owns object X.

FIGURE 9
The normality-molecule
discrimination network for
(OWNS P X).
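A small executable fragment conveys the flavor of such a net. The Python below covers only a few branches of Fig. 9; the feature names and the numeric compatibilities are invented stand-ins, since the paper gives only qualitative likelihoods ("high", "moderate", "near-zero").

```python
def owns_likelihood(p, x):
    """Tiny fragment of an OWNS normality molecule (hypothetical).
    p and x are feature dicts; returns a compatibility in [0, 1]."""
    if p.get("communal_society") or p.get("infant"):
        return 0.05                                   # very unlikely P owns X
    if x.get("is_person"):
        return 0.0                                    # (slave-owner branch omitted)
    if x.get("matches_normal_function_of_p"):
        return 0.8                                    # moderately high
    if x.get("common_personal_item"):
        return 0.9 if x.get("within_means") else 0.2  # high / low but non-zero
    return 0.1                                        # near-zero default

# "Does a carpenter own a hammer?" under these invented features:
carpenter_hammer = owns_likelihood({}, {"matches_normal_function_of_p": True})
```

The discreteness of the tests and the fuzziness of the result coexist exactly as the text describes: each branch is a crisp question, but the terminal nodes hold graded compatibilities.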
5.11 CLASS 13: STATE DURATION INFERENCES
PRINCIPLE: Most interesting states in the world are transient. The
CM must have the ability to make specific predictions
about the expected (fuzzy) duration of an arbitrary state
so that information in the CM can be kept up to date.
EXAMPLES: **John handed Mary the orange peel.
(tomorrow) Is Mary still holding the orange peel?
(inference) Almost certainly not.
**Rita ate lunch a half hour ago.
Is she hungry yet?
(inference) Unlikely.
DISCUSSION:
Time features of states relate in critical ways to the likelihood
those states will be true at some given time. The thought of a scenario
wherein the CM is informed that Mary is holding an orange peel, then 50
years later uses that information in the generation of some other infer-
ence, is a bit unsettling! The CM must simply possess a low-level function
whose job it is to predict normal durations of states based on the particulars
of the states, and to use that information in marking as "terminated" those
states whose likelihood has diminished below some threshold.
My conjecture is that a human notices and updates the temporal truth
of a state only when he is about to use it in some cognitive activity --
that most of the transient knowledge in our heads is out of date until we
again attempt to use it in, say, some inference. Accordingly, before using
any state information, the CM first filters it through the STATE DURATION
inference process to arrive at an updated estimate of the state's likeli-
hood as a function of its known starting time (its TS feature, in CD
notation).
The implementation of this process in the CM is as follows: an (NDUR S ?)
structure is constructed for the state S whose duration is to be pre-
dicted, and this is passed to the NDUR specifier molecule. The NDUR S-
molecule applies discrimination tests on features of the objects involved
in S. Terminal nodes in the net are duration concepts (typically fuzzy
ones), such as #ORDERHOUR, #ORDERYEAR. If a terminal node can be success-
fully reached, thus locating such a concept D, the property CHARF (charac-
teristic time function) is retrieved from D's property list. CHARF is
a step function of STRENGTH vs. the amount of time some state
has been in existence (Fig. 10). From this function a STRENGTH is computed
for S and becomes S's predicted likelihood. If the STRENGTH turns out
to be sufficiently low, a (TF S ?now) structure is predictively generated
to make S's low likelihood explicit. The STATE DURATION inference thus
acts as a cleansing filter on state information which is fed to various
other inference processes.
[Graph: a typical STRENGTH function for the fuzzy duration #ORDERHOUR,
plotting STRENGTH against elapsed time as a decreasing step function.]

0 ≤ T'-T < W1MAX has strength S1
W1MAX ≤ T'-T < W2MAX has strength S2
...
T'-T ≥ WnMAX has strength 0

The format of a fuzzy duration concept's step function.

FIGURE 10
A typical characteristic
STRENGTH function for the
state-duration inference
process.
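Evaluating such a step function is mechanical, and a brief sketch makes the CHARF idea concrete. The Python below is illustrative; the #ORDERHOUR boundaries and strengths are invented numbers, since the paper does not give specific values.

```python
def strength(elapsed, charf):
    """Evaluate a characteristic STRENGTH step function (CHARF) for a
    state that has existed for `elapsed` time units. charf is a list
    of (WiMAX, Si) steps in increasing order of WiMAX; beyond the last
    boundary WnMAX the strength is 0, as in Fig. 10."""
    for wmax, s in charf:
        if elapsed < wmax:
            return s
    return 0.0

# An invented CHARF for #ORDERHOUR (times in hours, strengths made up):
ORDERHOUR = [(1.0, 0.95), (3.0, 0.5), (8.0, 0.1)]
# strength(0.5, ORDERHOUR) -> 0.95; strength(10.0, ORDERHOUR) -> 0.0
```

A state whose computed STRENGTH falls below the termination threshold would then trigger the predictive (TF S ?now) structure described above.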
5.12 CLASSES 14 and 15: FEATURE and SITUATION INFERENCES
PRINCIPLE: Many inferences can be based solely on commonly observed
or learned associations, rather than upon "logical" re-
lations such as causation, motivation, and so forth. In
a rough way, we can compare these inferences to the phe-
nomenon of visual imagery which constructs a "picture"
of a thought's surrounding environment. Such inferences
should be made in abundance.
EXAMPLES: **Andy's diaper is wet.
(inference) Andy is a youngster. (FEATURE)
**John was on his way to a masquerade.
(inference) John was probably wearing a costume. (SITUATION)
DISCUSSION:
Many "associative" inferences can be made to produce new features of
an object (or aspects of a situation) from known features. If something
wags its tail, it is probably an animal of some sort; if it bites the mail-
man's leg, it is probably a dog; if it has a gray beard and speaks, it is
probably an old man; if it honks in a distinctive way, it is probably some
sort of vehicle; etc. These classes are inherently unstructured, so I will
say no more about them here, except that they frequently contribute fea-
tures which help clear up reference ambiguities and initial reference fail-
ures.
5.13 CLASS 16: UTTERANCE INTENT INFERENCES
PRINCIPLE: Based on the way a thought is communicated (especially the
often telling presence or absence of information), infer-
ences can be made about the speaker's reasons for speaking.
EXAMPLES:
**Don't eat green gronks.
(inference) Other kinds of gronks are probably edible.
**Mary threw out the rotten part of the fig.
(inference) She threw it out because it was rotten.
**John was unable to get an aspirin.
(inference) John wanted to get an aspirin.
**Rita liked the chair, but it was green.
(inference) The chair's color is a negative feature to
Rita (or the speaker).
DISCUSSION:
I have included this class only to represent the largely unexplored
domain of inferences drawn from the way a thought is phrased. The CM will
eventually need an explicit model of conversation, and this model will in-
corporate inferences from this class. Typical of such inferences are those
which translate the inclusion of referentially superfluous features of an
object into an implied causality relation (the fig example), those which
infer desire from failure (the aspirin example), those which infer features
of an ordinary X from features of special kinds of X (the gronk example),
and so forth. These issues will lead to a more goal-directed model than I
am currently exploring.
6. Summary of the Inference Component
I have now sketched 16 inference classes which, I conjecture, lie
at the core of the human inference reflex. The central hypothesis is
that a human language comprehender performs more subconscious computation
on meaning structures than any other theory of language comprehension has
yet acknowledged. When the current CM is turned loose, it will often gen-
erate upwards of 100 inferences from a fairly banal stimulus such as "John
gave Mary the book." While most are irrefutable, they are for the most part
quite mundane and "uninteresting" to a critical human observer, and are,
after the fact, forgotten. But change the context and the banal becomes
salient -- even crucial. I can see no other mechanism for realizing the
contextual interaction of information than this spontaneous, subconscious
groping.
I should perhaps briefly address the adequacy and applicability of the
inference classes in the current model. There is undoubtedly a number
of additional interesting inference classes I have ignored or overlooked,
but I feel the number is not large, and that other classes will
submit to the same sorts of systematization as described here. While the
isolated examples I have used to illustrate the various inferences were not
drawn from any coherent domain such as a "blocks world" -- and this is
intentional -- I believe the net result (these inference classes and their
control structure) to prove central to any restricted domain which involves
volitional actors. It is a current challenge to find such a restricted, yet
interesting, domain to which these ideas can be transplanted and applied in
slightly more goal-directed environments.
7. The 1nf.erence-~eference Relaxa,tiqn cyele in Conceptual - Memory, 
It is frequently impossible, at the time an utterance is heard, to identify the referent (concept or token in memory) of a language construction (noun group, pronoun, etc.). Yet an attentive listener seldom fails eventually to identify the intended referent, and he will seldom lose information because of the reference delay. Furthermore, incorrect reference decisions are empirically few and far between. I believe that these phenomena are intimately related to the inference reflex.
In the CM, initial reference attempts are made for concepts and tokens from descriptive sets -- collections of conceptual features gleaned from an utterance by Riesbeck's conceptual analyzer (R2). Fig. 11 illustrates the descriptive set for "the big red dog who ate the bird." Potential memory concept and token referents are identified by an intersection search procedure which locates memory objects whose features satisfy all the features of the descriptive set. This search results in either (a) unique identification of some memory entity, (b) a failure to locate any satisfactory entities, or (c) a set of candidates, one of which is the probable referent. Case (a) requires no decision, but (b) and (c) do. In either case, a new, possibly temporary token T is created and, for case (b), T receives as its initial occurrence set the descriptive set identically. In case (c), where a set of candidates can be located, T receives the set of features lying in the intersection of all candidates' occurrence sets (this will be at least the descriptive set). In either case, the CM then has an internal token to work with, allowing the conceptual graph in which references to it occur to be tentatively integrated into memory.
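The intersection search and its three outcomes can be sketched as follows. This is a minimal illustration, not the CM's actual implementation; the names (find_referent, the toy memory table) and the feature encoding as sets of tuples are invented for the example.

```python
# Hypothetical sketch of the descriptive-set intersection search:
# a memory object is a candidate referent only if it satisfies
# every feature in the descriptive set.

def find_referent(memory, descriptive_set):
    """Return ('a', entity), ('b', None), or ('c', candidates),
    depending on how many memory entities satisfy all features."""
    candidates = [ent for ent, features in memory.items()
                  if descriptive_set <= features]   # subset test
    if len(candidates) == 1:
        return ('a', candidates[0])    # unique identification
    if not candidates:
        return ('b', None)             # no satisfactory entity
    return ('c', candidates)           # several plausible referents

# Toy memory, invented for illustration.
memory = {
    'DOG1': {('ISA', 'DOG'), ('COLOR', 'RED'), ('RELSIZE', 'LARGE')},
    'DOG2': {('ISA', 'DOG'), ('COLOR', 'BLACK')},
}

case, result = find_referent(memory, {('ISA', 'DOG'), ('COLOR', 'RED')})
```

Here "the big red dog" picks out DOG1 uniquely (case (a)); "the dog" alone would yield case (c) with both dogs as candidates.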
The inference reflex I have described then generates all the various inferences, and eventually returns to its quiescent state. One byproduct of the inferencing is that each memory object involved in the original structures will emerge with a possibly enhanced occurrence set which may contain inferred information sufficient either (1) to identify the temporary token of category (b) above, or (2) to narrow the set of candidates associated with the temporary token of category (c) (hope-
X: ((ISA X #DOG)
    (COLOR X #RED)
    (RELSIZE X #LARGE)
    (C: (INGEST X ...)
        ((ISA # #BIRD) (REF # *THE*))
        ((ISA # #MOUTH))
        ((ISA # #STOMACH))
        (TIME C ((ISA # #TIME) (BEFORE # #NOW))))
    (REF X *THE*))

Descriptive set for "The big red dog who ate the bird"
FIGURE 11
fully to exactly one).
Thus, when the inference reflex has ceased, the CM re-applies the reference intersection algorithms to each unidentified token to seek out any inference-clarified references. Successful identifications at this point result in the merging (by the same structure merger mentioned earlier) of the temporary token's occurrence set with the identified token's occurrence set, thus preserving all information collected to that point about the temporary token. (Implicit in the merge operation is the substitution of all references to the temporary token by references to the identified one.) If, on the other hand, the results of inferencing serve only to narrow the candidate set of case (c), the occurrence sets of the remaining candidates are re-intersected, and (if this increases the size of the set) the set is reattached to the temporary token. In either case progress has been made.
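Temporary-token creation and the later merge can be sketched as below. This is an illustrative reduction under assumed names (Token, make_temporary_token, merge); in particular, the list of references stands in for all the memory structures that point at the token.

```python
# Sketch of temporary tokens and the occurrence-set merge.

class Token:
    def __init__(self, occurrence_set):
        self.occurrence_set = set(occurrence_set)

def make_temporary_token(case, descriptive_set, candidates=None):
    if case == 'b':                        # no candidates: start from
        return Token(descriptive_set)      # the descriptive set itself
    # case 'c': intersect all candidates' occurrence sets; this is at
    # least the descriptive set, since every candidate satisfied it
    common = set.intersection(*(c.occurrence_set for c in candidates))
    return Token(common | set(descriptive_set))

def merge(temporary, identified, references):
    """Fold the temporary token into the identified one, preserving
    all collected information and redirecting every reference."""
    identified.occurrence_set |= temporary.occurrence_set
    return [identified if r is temporary else r for r in references]
```

A case-(c) token created from two candidate dogs would carry only their common features until inference adds more.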
Now comes a key point. If any referents were in fact identified on this second attempt (making their entire occurrence sets accessible), or if any candidate-set decrease caused new features to be associated with the temporary tokens, then there is the possibility that more inferences (which can make use of the newly accessible features) can be made. The CM thus re-applies the inference reflex to all memory structures which were produced on the first pass. (The monitor is conditioned not to duplicate work already done on the first pass.) But a potential byproduct of the second pass is further feature generation which can again restrict candidate sets or produce positive identifications. This inference-reference interaction can proceed until no new narrowings or identifications occur; hence the term "relaxation cycle." Fig. 12 illustrates two examples of this phenomenon which are handled by the current CM, and Appendix B contains the computer trace of the second example.
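The control structure of the cycle is a simple iterate-to-fixpoint loop, which can be sketched as follows. The functions infer and re_reference are stand-ins for the CM's inference reflex and reference intersection algorithms; here each is a toy that just reports how many changes it made.

```python
# Minimal sketch of the inference-reference relaxation cycle:
# alternate inference and re-referencing until neither produces
# any new narrowings or identifications (quiescence).

def relaxation_cycle(infer, re_reference):
    passes = 0
    while True:
        changed = infer() + re_reference()   # number of changes made
        passes += 1
        if changed == 0:                     # quiescent state reached
            return passes

# Toy stand-ins: pretend three rounds each produce one new fact.
budget = [3]

def infer():
    if budget[0] > 0:
        budget[0] -= 1
        return 1
    return 0

def re_reference():
    return 0

passes = relaxation_cycle(infer, re_reference)
```

Termination is guaranteed in practice because each pass either shrinks a candidate set or adds a structure the monitor will not reprocess.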
EXAMPLE 1:  Andy Rieger is a youngster.
            Andy Mooker is an adult.
            Andy's diaper is wet.
    INFERENCE-REFERENCE: Andy Rieger's diaper is wet.

EXAMPLE 2:  John was in Palo Alto yesterday.
            Jenny Jones was in Palo Alto yesterday.
            Jenny Smith was in France yesterday.
            Bill loves Jenny Jones.
            Bill saw John kiss Jenny yesterday.
    INFERENCE-REFERENCE, FIRST PASS: It was Jenny Jones
        that John kissed.
    INFERENCE-REFERENCE, SECOND PASS: Bill felt anger
        toward John.

FIGURE 12a
Two examples of inference-reference interaction.
[FIGURE 12b: The inference-reference relaxation cycle (multiple reference-inference interaction passes). The diagram shows the starting inference queue feeding the SUBPROP EXTRACTOR and the INFERENCER. A temporary token (marked #) stands for an unestablished reference such as "Pete" during the first pass, e.g. (ATRANS ... # ...). The RE-REFERENCER then uses newly inferred features -- e.g. (ISA # #PERSON), (NAME # PETE) -- to identify the token, and a second pass of inferencing produces more new information, this time about the identified token #PETE17.]
8. Word Sense Promotion and Implicit Concept Activation in the Conceptual Memory
Another byproduct of the generation of an abundance of probabilistic conceptual patterns from each input is that many related concepts and tokens implicitly involved in the situation are activated, or "touched." This can be put to use in two ways.

First, implicitly touched concepts can clarify what might otherwise be an utterly opaque subsequent reference. If, for instance, someone says (outside of a particular context): "The nurses were nice", you will probably inquire "What nurses?" If, on the other hand, someone says: "John was run over by a milk truck. When he woke up the nurses were nice", you will experience neither doubt about the referents of "the nurses", nor surprise at their mention. I presume that a subconscious filling-out of the situation "John was run over by a milk truck" implicitly activates an entire set of conceptually relevant concepts, "precharging" ideas of hospitals and their relation to patients.
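The "precharging" effect can be sketched with a toy activation table. Everything here is invented for illustration (the ACTIVATES table, the one-step spread, the two helper names); the CM's actual activation arises from its inference patterns, not a fixed table.

```python
# Toy sketch of precharging: a filled-out situation activates
# related concepts, so a later definite noun phrase ("the nurses")
# finds a ready referent. The activation table is invented.

ACTIVATES = {
    'RUN-OVER': {'HOSPITAL', 'AMBULANCE', 'INJURY'},
    'HOSPITAL': {'NURSE', 'DOCTOR', 'PATIENT'},
}

def touched(concepts):
    """Spread one step of activation from each input concept."""
    active = set(concepts)
    for c in concepts:
        active |= ACTIVATES.get(c, set())
    return active

def resolvable(noun_concept, context_concepts):
    # the definite reference is opaque unless its concept was touched
    return noun_concept in touched(touched(context_concepts))
```

"John was run over" precharges HOSPITAL and thence NURSE, so "the nurses" resolves; a context of mere MUD does not.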
Other theories founded more on concept associationism than conceptual inference have suggested that such activation occurs through word-word or concept-concept free associations (see (A2) and (91) for instance). While these more direct associations play an undoubted role in many language functions, it is my belief that these straight associative phenomena are not fundamentally powerful enough to explain the kind of language behavior underlying the nurse example. It is more often than not the "gestalt" meaning context of an utterance which restricts the kinds of meaningful associations a human makes. In contrast to the nurse example above, most people would agree that the reference to "the nurses" in the following situation is a bit peculiar:

    In the dark of the night, John had wallowed
    through the knee-deep mud to the north wall
    of the deserted animal hospital. The nurses
    were nice.

A simple hospital-nurses association model cannot account for this. On the other hand, those concepts touched by the more restrictive conceptual inference patterns would presumably be quite distant from the medical staff of a hospital in this example, thus explaining the incongruity.
Related to this idea of concept activation through conceptual inference structures is another mechanism which, I presume, underlies a comprehender's ability to select (almost unerringly) the proper senses of words in context during the linguistic analysis of each utterance. This mechanism is frequently called word sense promotion, and its exact nature is one of the major mysteries of language analysis. It underlies our ability to avoid -- almost totally -- backing up to reinterpret words. It is as though at each moment during our comprehension we possess a dynamically shifting predisposition toward a unique sense of just about any word we are likely to hear next. Fig. 13 contains some illustrations of this phenomenon.
I have only a thought (which I plan to develop) on this issue. At each instant in the CM, there is a powerful inference momentum which is the product of conceptual inferences. Obviously, those concepts which the inference patterns touch will correspond to senses of words. These senses can be "promoted" in the same way implicit activation promotes certain referents. This is a partial explanation of word sense promotion. Suppose, however, that in addition the CM had an independent parallel process which took each inference as it arose and mapped it back into a near-language "proto-sentence", a linear sequence of concepts which is almost a sentence of the language, except that the actual word realizates of each concept have not yet been chosen. In other words, a generation process (see (G1) for example) would be applied to each inference, but would be stopped short of the final lexical substitutions of word senses. By precharging all the senses of the various words which could be substituted in such a proto-sentence, the CM would have a word sense "set" which would be a function of the kind of restrictive inferential context which I feel is so vital to the process of analysis. This idea is obviously computationally exorbitant, but it might model a very real mechanism. We often catch ourselves subvocalizing what we expect to hear next (especially when listening to an annoyingly slow speaker), and this is tantalizing evidence that something like a proto-sentence generator is thrashing about upstairs.
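The promotion step itself can be sketched as a preference for precharged senses. The sense inventory and function names below are invented for illustration; they are not the CM's lexicon.

```python
# Sketch of word sense promotion: a sense whose concept has been
# touched by recent inference is preferred over its alternatives.
# The sense inventory is a toy stand-in.

SENSES = {
    'picked': {'SELECT', 'PLUCK'},
    'grass':  {'LAWN-GRASS', 'MARIJUANA'},
}

def promote(word, touched_concepts):
    """Choose the sense precharged by the inferential context;
    fall back to an arbitrary (alphabetically first) sense."""
    for sense in SENSES[word]:
        if sense in touched_concepts:
            return sense
    return sorted(SENSES[word])[0]
```

With "getting high" precharging MARIJUANA, "the grass smelled good" resolves one way; in a meadow context it resolves the other, with no backup or reinterpretation.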
EXAMPLE 1:  (CONTEXT) John asked Mary which piece of fruit she wanted.
            (SENSE)   Mary picked the apple.
     versus (CONTEXT) Mary climbed the apple tree.
            (SENSE)   Mary picked the apple.

EXAMPLE 2:  (CONTEXT) John was in a meadow.
            (SENSE)   The grass smelled good.
     versus (CONTEXT) John was looking forward to getting high.
            (SENSE)   The grass smelled good.

EXAMPLE 3:  (Riesbeck's example (R2))
            John went on a hunting trip. He shot two bucks.
            It was all he had!

FIGURE 13a
Examples of word sense promotion.
[FIGURE 13b: Mapping inferences back into proto-sentences, activating many word senses. Conceptual structures map each inference back into proto-sentences -- the various ways each inference might be expressed by language -- and these involve many alternative word senses.]
9. Conclusion
Any theory of language must also be a theory of inference and memory. It does not appear to be possible to "understand" even the simplest of utterances in a contextually meaningful way in a system in which language fails to interact with a language-free belief system, or in a system which lacks a spontaneous inference reflex.

One very important theoretical issue concerns exactly how much "inference energy" is expended before the fact (prediction, expectation) versus how much is expended after the fact to clear up specific problems of how the utterance fits the context. My belief is that there is a great deal of exploratory, essentially undirected inferencing which is frequently overlooked and which cannot be repressed because it is the language-related manifestation of the much broader motivational structure of the brain. Rather than argue at an unsubstantiatable neurophysiological level, I have compiled evidence for this hypothesis within the domain of language. I believe, however, that spontaneity of inference pervades all other modes of perception as well, and that quantity -- as much as quality -- of spontaneous inference is a necessary requirement for general intelligence.
APPENDIX A: CAUSAL CHAIN EXPANSION COMPUTER EXAMPLE

(CONTEXT) Bill swiped Mary's book.
(CAUSAL)  Mary kissed John because he hit Bill.

WORKING "FORWARD", GENERATING RESULTATIVE INFERENCES FROM THE PROPEL UNDERLYING "HIT":
* John propelled his hand toward Bill
      |  resultative
* John's hand came into physical contact with Bill
      |  resultative
* Because it was propelled, the physical contact was probably forceful
      |  resultative
* Bill probably suffered a negative change in physical state
      |  resultative
* Because Bill suffered a negative change, and Mary felt a negative emotion toward Bill at the time, Mary might have experienced a positive change in joy
      |  resultative
* Because Mary may have experienced this positive change, and because it was John whose action indirectly caused her positive change, she might feel a positive emotion toward John
WORKING "BACKWARD", GENERATING CAUSATIVE INFERENCES FROM THE PHYSCONT UNDERLYING "KISS":

POINT OF CONTACT: Mary probably feels a positive emotion toward John.

* Mary's placing her lips in contact with John was probably caused by Mary feeling a positive emotion toward John
      |  causative
* Mary's lips were in contact with John

Figure 5-21. One explanation of why Mary's kissing was related to John's hitting.
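The bidirectional expansion above can be sketched as two chain walks that meet in inference space. The tables and event labels below are toy stand-ins for the CM's inference molecules, not its actual representation.

```python
# Sketch of causal chain expansion: work forward from the "hit"
# via resultative inferences and backward from the "kiss" via
# causative ones; a shared structure is a point of contact.

RESULTATIVE = {                       # event -> probable result
    'PROPEL-HAND':        'PHYSCONT-HAND',
    'PHYSCONT-HAND':      'NEGCHANGE-BILL',
    'NEGCHANGE-BILL':     'POSCHANGE-MARY-JOY',
    'POSCHANGE-MARY-JOY': 'MFEEL-MARY-POS-JOHN',
}
CAUSATIVE = {                         # event -> probable cause
    'PHYSCONT-LIPS':      'MFEEL-MARY-POS-JOHN',
}

def expand(table, start):
    """Follow the inference table until no further step applies."""
    chain = [start]
    while chain[-1] in table:
        chain.append(table[chain[-1]])
    return chain

forward = expand(RESULTATIVE, 'PROPEL-HAND')    # from the hit
backward = expand(CAUSATIVE, 'PHYSCONT-LIPS')   # from the kiss
contact = set(forward) & set(backward)          # point of contact
```

The nonempty intersection is what licenses "knitting" the two utterance structures into one explained causal chain.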
--------
(MARY KISSED JOHN BECAUSE HE HIT BILL)

This is the input sentence. Its underlying conceptual graph is shown next.

((CON ((CON ((ACTOR (JOHN) <=> (*PROPEL*) OBJECT (*PHYSOBJ* SPEC (*U*))
             FROM (JOHN) TO (BILL)) TIME (TIM01))
       <= ((ACTOR (*PHYSOBJ* SPEC (*U*)) <=> (*PHYSCONT* VAL (BILL)))
           TIME (TIM01))))
  <= ((CON ((ACTOR (MARY) <=> (*DO*)) TIME (TIM02))
       <= ((ACTOR (*LIPS* PART (MARY)) <=> (*PHYSCONT* VAL (JOHN)))
           TIME (TIM00))))))
(TIM00 ((VAL X)))
(TIM01 ((BEFORE TIM02 X)))
(TIM02 ((BEFORE TIM00 X)))

This is the partially integrated memory structure, after references have been established. No reference ambiguity is assumed to exist for this example. C0035 is the resulting memory structure for this utterance.

STARTING INFERENCE QUEUE:
((X 1.0 C0035))

We suppress all but this structure on the starting inference queue. (We will be seeing about one fourth of the original trace output for this example.)

ABOUT TO APPLY @CAUSE1 TO C0035
C0035: (CAUSE (CAUSE (*PROPEL* #JOHN1 C0013 #JOHN1 #BILL1)
              (*PHYSCONT* C0013 #BILL1))
       (CAUSE (*DO* #MARY1 C0010)
              (*PHYSCONT* C0021 #JOHN1)))
INFERRING: C0028

Here, the CAUSE inference molecule is injecting the two subconceptualizations, A and B in Fig. 1, into the inference stream.

ABOUT TO APPLY @CAUSE2 TO C0035
C0035: (CAUSE (CAUSE (*PROPEL* #JOHN1 C0013 #JOHN1 #BILL1)
              (*PHYSCONT* C0013 #BILL1))
       (CAUSE (*DO* #MARY1 C0010)
              (*PHYSCONT* C0021 #JOHN1)))
INFERRING: C0034
RECORDING CAUSAL RELATION: (C0024 . C0032)

The causal structure of this conceptualization indicated that a path should be found relating structure C to structure D in Fig. 1. This is noted. C0024 corresponds to C, C0032 to D.

ABOUT TO APPLY @PHYSCONT1 TO C0032
C0032: (*PHYSCONT* C0021 #JOHN1)
INFERRING: (*MFEEL* #MARY1 #POSEMOTION #JOHN1)
ALSO GENERATING: (TIME C0039 C0017)

Here, the causative inference that Mary's kissing was probably caused by her feeling a positive emotion toward John is made.
ABOUT TO APPLY @PROPEL1 TO C0024
C0024: (*PROPEL* #JOHN1 C0048 #JOHN1 #BILL1)
INFERRING: (*FORCECONT* C0048 #BILL1)
ALSO GENERATING: (TS C0052 C0016)

Because the PHYSCONT of John's hand and Bill was caused by a PROPEL, MEMORY here makes the inference that it was a forceful contact.

ABOUT TO APPLY @FORCECONT2 TO C0052
C0052: (*FORCECONT* C0048 #BILL1)
INFERRING: (NEGCHANGE #BILL1 #PSTATE)
ALSO GENERATING: (TIME C0055 C0016)

Since one of the objects involved in the FORCECONT was a person, MEMORY predicts a small NEGCHANGE on his part. The degree of the NEGCHANGE is dependent upon the type of object which came into contact with him.

ABOUT TO APPLY @NEGCHANGE2 TO C0055
C0055: (NEGCHANGE #BILL1 #PSTATE)
INFERRING: (POSCHANGE #MARY1 #JOY)
ALSO GENERATING: (TIME C0061 C0016)

ABOUT TO APPLY @POSCHANGE1 TO C0061
C0061: (POSCHANGE #MARY1 #JOY)
INFERRING: (*MFEEL* #MARY1 #POSEMOTION #JOHN1)
ALSO GENERATING: (TS C0068 C0016)
--------
CAUSAL EXPANSION ACHIEVED:
(C0024 C0032)
CONTACT POINTS ARE: (C0068 C0039)
MERGING:
C0068: (*MFEEL* #MARY1 #POSEMOTION #JOHN1)
--------
* !EXPANDED-CAUSALS
* (CAUSAL-PATH 'C0024 'C0032)
(C0024 C0052 C0055 C0061 C0068 C0032)
--------
C0024: (*PROPEL* #JOHN1 C0048 #JOHN1 #BILL1)
ASET:
  C0054: (CAUSE # C0052)
  C0028: (CAUSE # C0026)
  C0025: (TIME # C0016)
RECENCY: 9908
TRUTH: T, STRENGTH: 1.0
REASONS:
  C0023: (CAUSE C0024 C0026)
OFFSPRING:
  C0070: (CAUSE C0061 C0068)
  C0068: (*MFEEL* #MARY1 #POSEMOTION #JOHN1)
  C0065: (CAUSE C0055 C0063)
  C0063: (*MFEEL* #BILL1 #NEGEMOTION #JOHN1)
  C0054: (CAUSE C0024 C0052)
  C0053: (TS C0052 C0016)
  C0052: (*FORCECONT* C0048 #BILL1)
ISEEN: (@PROPEL1)
Here, because Mary was feeling a negative emotion toward Bill at the time, when Bill underwent a small NEGCHANGE the prediction can be made that Mary may have experienced a degree of joy.

Looking back along the causal path which led to Mary's likely change in joy, the POSCHANGE inference molecule discovers that it was an action on John's part which was most directly responsible for her joy. The inference that Mary might have started feeling a positive emotion toward John is made.

As this last inference is made, the inference evaluator notices that the same information exists elsewhere in the memory. This is a point of contact in inference space. It is furthermore noticed that the two MFEEL structures join a causal path between two structures which have been related causally by language. The two MFEEL structures are merged into one, and this event is noted as a causal chain expansion. To the left, C0068 and C0039 are the contact points, and C0024 and C0032 are the two structures which have now been causally related.

Inference proceeds, and finally stops. At that point, we took a look at the structures lying along this explained causal path. C0024 is the original PROPEL structure; C0032 is the PHYSCONT-lips structure. The service function CAUSAL-PATH will track down the causal linkage for us. The causal chain consists of the six structures to the left.

This is the original PROPEL. During the process, but not shown, C0048 was detected as unspecified, and filled in as John's hand. Notice on the REASONS and OFFSPRING sets the results of other inferencing which was not discussed above.
------------------------------------
C0052: (*FORCECONT* C0048 #BILL1)
ASET:
  C0077: (WANT #JOHN1 #)
  C0057: (CAUSE # C0055)
  C0054: (CAUSE C0024 #)
  C0053: (TS # C0016)
RECENCY: 15416
TRUTH: T, STRENGTH: 0.89999999
REASONS:
  C0024: (*PROPEL* #JOHN1 C0048 #JOHN1 #BILL1)
OFFSPRING:
  C0078: (TS C0077 C0016)
  C0077: (WANT #JOHN1 C0052)
  C0057: (CAUSE C0052 C0055)
  C0056: (TIME C0055 C0016)
  C0055: (NEGCHANGE #BILL1 #PSTATE)
ISEEN: (@FORCECONT2)

Here is the FORCECONT which was inferred from the PROPEL.

------------------------------------
C0055: (NEGCHANGE #BILL1 #PSTATE)
ASET:
  C0079: (WANT #JOHN1 #)
  C0067: (CAUSE # C0059)
  C0066: (CAUSE # C0061)
  C0065: (CAUSE # C0063)
  C0057: (CAUSE C0052 #)
  C0056: (TIME # C0016)
RECENCY: 9833
TRUTH: T, STRENGTH: 0.85500000
REASONS:
  C0052: (*FORCECONT* C0048 #BILL1)
  I0008: ...
OFFSPRING:
  C0081: (WANT #JOHN1 C0055)
  C0067: (CAUSE C0055 C0059)
  C0066: (CAUSE C0055 C0061)
  C0065: (CAUSE C0055 C0063)
  C0064: (TS C0063 C0016)
  C0063: (*MFEEL* #BILL1 #NEGEMOTION #JOHN1)
  C0062: (TIME C0061 C0016)
  C0061: (POSCHANGE #MARY1 #JOY)
  C0060: (TS C0059 C0016)
  C0059: (WANT #BILL1 C0058)
ISEEN: (@NEGCHANGE3 @NEGCHANGE2 @NEGCHANGE1)

This is Bill's likely (small) change in PSTATE which resulted from the FORCECONT.

------------------------------------
C0061: (POSCHANGE #MARY1 #JOY)
ASET:
  C0070: (CAUSE # C0068)
  C0066: (CAUSE C0055 #)
  C0062: (TIME # C0016)
RECENCY: 24616
TRUTH: T, STRENGTH: NIL
REASONS:
  C0055: (NEGCHANGE #BILL1 #PSTATE)
  I0137: (*MFEEL* #MARY1 #NEGEMOTION #BILL1)
OFFSPRING:
  C0070: (CAUSE C0061 C0068)
  C0069: (TS C0068 C0016)
  C0068: (*MFEEL* #MARY1 #POSEMOTION #JOHN1)
ISEEN: (@POSCHANGE1)

This is the important inference that Bill's NEGCHANGE may have caused a small degree of happiness in Mary. Notice that one of the REASONS was assumed to be the case beforehand (I0137).
------------------------------------
Here, Mary is feeling a positive emotion toward John, whose action indirectly caused her joy. This structure is the point of contact, and is the structure which resulted from the merge. Notice that its STRENGTH has assumed the higher STRENGTH of the two structures which were merged.

C0068: (*MFEEL* #MARY1 #POSEMOTION #JOHN1)
ASET:
  C0085: (WANT #JOHN1 #)
  C0040: (TIME # C0017)
  C0044: (*MLOC* # C0041)
  C0047: (CAUSE # C0032)
  C0070: (CAUSE C0061 #)
  C0069: (TS # C0016)
RECENCY: 27366
TRUTH: T, STRENGTH: 0.95000000
REASONS: ...
OFFSPRING:
  C0087: (TS C0085 C0016)
  C0086: (TIME C0085 C0017)
  C0085: (WANT #JOHN1 C0068)
ISEEN: NIL

------------------------------------
This is the original PHYSCONT-lips structure which led, via a causative inference, to the prediction that Mary may have felt a positive emotion toward John.

C0032: (*PHYSCONT* C0021 #JOHN1)
ASET:
  C0088: (WANT #JOHN1 #)
  C0071: (WANT #MARY1 #)
  C0047: (CAUSE C0068 #)
  C0046: (CAUSE # C0044)
  C0034: (CAUSE C0029 #)
  C0033: (TIME # C0017)
RECENCY: 12016
TRUTH: T, STRENGTH: 1.0
REASONS:
  C0034: (CAUSE C0029 C0032)
OFFSPRING:
  C0083: (TIME C0088 C0017)
  C0088: (WANT #JOHN1 C0032)
  C0072: (TIME C0071 C0017)
  C0071: (WANT #MARY1 C0032)
  C0047: (CAUSE C0068 C0032)
  C0046: (CAUSE C0032 C0044)
  C0045: (TS C0044 C0017)
  C0044: (*MLOC* C0068 C0041)
  C0040: (TIME C0068 C0017)
ISEEN: (@PHYSCONT2 @PHYSCONT1)

This WANT is a prediction that one reason Mary may have kissed John is so that he would know she felt a positive emotion toward him. This MLOC represents the inference that John probably now knows that Mary MFEELS a positive emotion toward him.
APPENDIX B
INFERENCE-REFERENCE RELAXATION CYCLE, COMPUTER EXAMPLE

This computer example illustrates inference-reference interaction (two inference passes). Hearing the input "Bill saw John kiss Jenny.", MEMORY is unable to decide upon the referent of "Jenny": it could be Jenny Jones or Jenny Smith. MEMORY therefore creates a temporary token having as features all the common features of Jenny Jones and Jenny Smith. By inference, MEMORY is able to decide upon Jenny Jones. At that point, the temporary token is merged into the concept for Jenny Jones, and a second pass of inferencing is initiated. However, on the second pass a new inference arises: because Bill loves Jenny Jones, and he saw John kiss her, he (probably) became angry at John. This inference was not triggered on the first inference pass because being loved by Bill was not a common feature of both Jennys, and hence not accessible then (i.e. it had not been copied to the temporary token's occurrence set).

The example begins with a few lines to set the scene for MEMORY. Inferencing on these setup lines (which is normally spontaneous) has been suppressed for the sake of simplicity in this example.
--------
This example illustrates reference-inference interaction. That is, MEMORY is unable to establish a reference, so it creates a temporary token and proceeds with inference. Inferencing generates new information which solves the reference, so more inferencing can be undertaken. And because more features of the referent are accessible on the second inference pass, new inferences are possible.

The synopsis of this short plot is as follows: There are two Jennys: Jenny Jones and Jenny Smith. Bill loves Jenny Jones. John and Jenny Jones were in Palo Alto yesterday. Jenny Smith was in France yesterday. The climax comes when Bill sees John kiss Jenny. It is MEMORY's job to figure out which Jenny. MEMORY will decide upon Jenny Jones, then re-inference and infer that Bill probably got angry at John -- something which wouldn't have happened if Bill had seen John kiss Jenny Smith.

To the left, MEMORY is reading in some information which is relevant to this demonstration. Each of these inputs would normally produce inferences as it is processed, but inferencing has been suppressed for the first four sentences of this example. The four sentences are shown with their partial integrations and final structures: C0002, C0005, C0008, C0010.

JOHN WAS IN PALO ALTO YESTERDAY
((*LOC* (#JOHN1) (#PALOALTO))
 (TIME - (C0001)))
C0002

JENNY JONES WAS IN PALO ALTO YESTERDAY
((*LOC* (#JENNY2) (#PALOALTO))
 (TIME - (C0004)))
C0005

JENNY SMITH WAS IN FRANCE YESTERDAY
((*LOC* (#JENNY1) (#FRANCE))
 (TIME - (C0007)))
C0008

BILL LOVES JENNY JONES
((*MFEEL* (#BILL1) (#LOVE) (#JENNY2)))
C0010

BILL SAW JOHN KISS JENNY YESTERDAY
COPYING COMMON FEATURES TO C0015
FROM (#JENNY2 #JENNY1)
((*MTRANS* (#BILL1)
   ((CAUSE ((*DO* (#JOHN1) (#UNSPECIFIED)) (TIME (C0011)))
           ((*PHYSCONT* (C0012) (C0015)) (TIME (C0011))))
    (C0018) (C0021))
  (TIME - (C0011))
  (INST ((*LOOKAT* (#BILL1) (C0015 #JOHN1)) (TIME - (C0011)))))

To the left, the climax line is in the process of being read and internalized. Its final structure is C0031. Notice that C0015 was created to stand for some Jenny, and that all common features of the two Jenny candidates were copied to it.
------------------------------------
We interrupt MEMORY at this point to have a look at the two Jennys and C0015, the token representing one of these Jennys.

#JENNY1: NIL
ASET:
  I0019: (SURNAME # SMITH)
  I0018: (ISA # #PERSON)
  I0017: (NAME # JENNY)
RECENCY: NIL

#JENNY2: NIL
ASET:
  I0022: (SURNAME # JONES)
  I0021: (ISA # #PERSON)
  I0020: (NAME # JENNY)
RECENCY: NIL

C0015: NIL
ASET:
  C0023: (*LOOKAT* #BILL1 #)
  C0026: (*PHYSCONT* C0012 #)
  C0017: (ISA # #PERSON)
  C0016: (NAME # JENNY)
RECENCY: 9866

This is the person named Jenny who Bill saw yesterday, and who John kissed. C0012 is the token representing John's lips, which were in *PHYSCONT* with this person named Jenny (C0015) at time C0011.

STARTING INFERENCE QUEUE:
((X 1.0 C0031) (X 1.0 C0017) (X 1.0 C0016))

MEMORY begins inferencing from this input. The starting inference queue consists of the main structure for the sentence, together with all other facts known about C0015. In this case, these are simply that C0015 is a person, and that its name is Jenny. These will not be of use in this example. All other subpropositions have been suppressed from the starting inference queue for this example.

APPLYING INF MOLECULE *MLOC* TO
C0037: (*MLOC* (CAUSE (*DO* #JOHN1 #UNSPECIFIED)
               (*PHYSCONT* C0012 C0015)) C0021)
ABOUT TO APPLY @MLOC1 TO C0037
INFERRING: (*MLOC* C0028 C0040)
ALSO GENERATING: (TS C0043 C0011)

One inference from Bill's seeing this event is that he knows that the event occurred. That is, the event went from his eyes to his conscious processor, C0021. To the left, the inference that Bill knows about John's kissing Jenny is being generated: information in Bill's CP (C0021) will also enter his LTM, C0040. This fact will be of use during the second pass of inferencing (after MEMORY decides that C0015 is Jenny Jones).

APPLYING INF MOLECULE *PHYSCONT* TO
C0026: (*PHYSCONT* C0012 C0015)
ABOUT TO APPLY @PHYSCONT1 TO C0026
INFERRING: (*MFEEL* #JOHN1 #POSEMOTION C0015)
ALSO GENERATING: (TIME C0049 C0011)

Another inference arises from John's lips being in PHYSCONT with C0015: that John feels a positive emotion toward C0015. The structure representing this inference is C0049.

ABOUT TO APPLY @PHYSCONT2 TO C0026
INFERRING: (*MLOC* C0049 C0051)
ALSO GENERATING: (TS C0054 C0011)

Another inference from John's kissing action is that C0015 knows that John feels a positive emotion toward C0015. C0051 is C0015's LTM. This inference will be of no direct consequence in this example.

ABOUT TO APPLY @PHYSCONT3 TO C0026
INFERRING: (*LOC* C0015 #PALOALTO)
ALSO GENERATING: (TIME C0056 C0011)

MEMORY also infers from John's kissing C0015 that John and C0015 shared the same location at the event time, C0011 (yesterday). Since MEMORY knows that John was in Palo Alto, and has no information concerning C0015's location, MEMORY infers that C0015 was also in Palo Alto yesterday. This information will solve the reference ambiguity.

APPLYING @POSTSCAN TO C0043:
C0043: (*MLOC* (CAUSE (*DO* #JOHN1 #UNSPECIFIED)
               (*PHYSCONT* C0012 C0015)) C0040)
INFERRING: (*MLOC* C0049 C0040)
COPYING TIMES FROM C0043 TO C0086

During the postscan inferencing, the fact that Bill saw John kiss C0015 leads to the inference that Bill knows that John feels a positive emotion toward C0015. This inference type implements the principle that if a person knows X, he is likely also to know the inferences which can be drawn from X. That is, MEMORY assumes that other people possess the same inference powers as MEMORY does.

------------------------------------
Inferencing eventually ceases. We interrupt processing at this point to examine C0015, the unknown Jenny. Notice the new information which has been built up about C0015. C0012 is John's lips.

C0015: NIL
ASET:
  C0016: (NAME # JENNY)
RECENCY: 9350

Since it will settle the reference ambiguity, we have a closer look at the structure which represents C0015's being in Palo Alto yesterday (C0011). C0078 represents Bill's knowledge of C0015's location yesterday (but has no direct relevance to this example). Notice that the reasons for MEMORY believing that C0015 was in Palo Alto at time C0011 are twofold: that John was in Palo Alto at that time, and that a body part of John was in PHYSCONT with C0015 then.

C0056: (*LOC* C0015 #PALOALTO)
ASET:
  C0078: (*MLOC* # C0040)
  C0057: (TIME # C0011)
RECENCY: 42533
TRUTH: T, STRENGTH: 0.90250000
REASONS:
  C0002: (*LOC* #JOHN1 #PALOALTO)
  C0026: (*PHYSCONT* C0012 C0015)
OFFSPRING:
  C0101
ISEEN: NIL

We also examine the structure which represents the inference that Bill knows that John feels a positive emotion toward C0015. This information will come into play after C0015's identity is solved (on the second inference pass). C0087 indicates when Bill started knowing this fact (C0040 is his LTM).

C0086: (*MLOC* C0049 C0040)
ASET:
  C0087: (TS # C0011)
RECENCY: 25750
TRUTH: T, STRENGTH: 0.95000000
REASONS:
  C0043: (*MLOC* ...)
ISEEN: (@MLOC2)
RETRY I NG REFERENCE: 
(COB15 #JENNY2 #JENNY1 1 
REFERENCE AMB I GU I TY SOLVED, 
I 
OLD: (COB15 #JENNY2 #JENNY11 
NEU: #JENNY2 
MERG I NG: 
#JENNY2: #JENNY2 
PURG! NG (rkLOC* C0015 BPALOALTO) 
PURG I FIG; (*nLoc* (*LOC* ~0015 #PALOALTO) 
C0040) 
PURG 1,NG: (75 (*flifi~*-i*~~t* C0015 
#PALOALTO) C0040) C80111 
PURG! FIG: IT I ME (*LOC* C0015 #PALOALTO1 
PURGING: 
PURG I NG : 
C0011) 
( 1 SA C0015 #PERSON1 
(NAME 60015 JENNY 1 
Since it will settle the reference ambiguity,
we have a closer look at the structure which
represents C0015's being in Palo Alto
yesterday (C0011). C0078 represents Bill's
knowledge of C0015's location yesterday
(but has no direct relevance to this example).
Notice that the reasons for MEMORY believing
that C0015 was in Palo Alto at time C0011
are twofold: that John was in Palo Alto at
that time, and that a body part of John
was in PHYSCONT with C0015 then.
We also examine the structure which represents
the inference that Bill knows that John
feels a positive emotion toward C0015. This
information will come into play after C0015's
identity is solved (on the second inference
pass). C0087 indicates when Bill started
knowing this fact (C0040 is his LTM).
The first pass of inferencing is now finished.
We allow MEMORY to proceed. It notices that
a reference decision is pending, and attempts
to decide between #JENNY1 and #JENNY2 as the
referent of C0015 by using newly-inferred
information about C0015 (from the first
pass). It succeeds, because #JENNY2 was
known to be in Palo Alto yesterday, and
this matches new C0015 information, C0056.
MEMORY merges C0015 into #JENNY2, purging
old information which is not used to augment
#JENNY2. Recall that the merge replaces
occurrence set pointers, so that every
MEMORY structure which referenced C0015 now
references #JENNY2.
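The merge-and-purge step can be sketched as follows (an illustrative Python rendering with hypothetical structure ids; the actual program rewrites occurrence set pointers in place):

```python
# Sketch of the reference merge: every structure mentioning the
# temporary token is rewritten to mention the chosen referent, and
# units that merely duplicate information already recorded for the
# referent are purged. Illustrative code, not the original program.

def merge(structures, temp, referent):
    """structures: dict of id -> fact tuple (insertion-ordered)."""
    merged, seen = {}, set()
    for sid, fact in structures.items():
        new_fact = tuple(referent if term == temp else term for term in fact)
        if new_fact in seen:             # duplicate after substitution
            continue                     # ("PURGING" in the trace)
        seen.add(new_fact)
        merged[sid] = new_fact
    return merged

structures = {
    "I0001": ("ISA", "#JENNY2", "#PERSON"),     # already known of #JENNY2
    "I0018": ("ISA", "C0015", "#PERSON"),       # duplicate: purged
    "C0026": ("*PHYSCONT*", "C0012", "C0015"),  # pointer rewritten
}
merged = merge(structures, "C0015", "#JENNY2")
# merged keeps I0001 and C0026 (now referencing #JENNY2); I0018 is purged
```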
We have another look at #JENNY2 before the
second inference pass begins.
ASET:
  C0117: (IDENTIFIES # C0015)
  C0026: (*PHYSCONT* C0012 #)
  C0029: (*LOOKAT* #BILL1 #)
  C0049: (*MFEEL* #JOHN1 #POSEMOTION #)
  C0053: (PART C0051 #)
  C0010: (*MFEEL* #BILL1 #LOVE #)
  C0005: (*LOC* # #PALOALTO)
  I0019: (SURNAME # JONES)
  I0018: (ISA # #PERSON)
  I0017: (NAME # JENNY)
RECENCY: 8958
RE-INFERRING...
APPLYING INF MOLECULE *MLOC* TO
C0086: (*MLOC* (*MFEEL* #JOHN1
  #POSEMOTION #JENNY2) C0040)
ABOUT TO APPLY MLOC3 TO C0086
INFERRING: (*MFEEL* #BILL1 #ANGER
  #JOHN1)
ALSO GENERATING: (TS C0119 C0011)
ASET:
  C0121: (CAUSE C0086 #)
  C0120: (TS # C0011)
RECENCY: 87600
TRUTH: T, STRENGTH: 0.90250000
REASONS:
  C0086: (*MLOC* C0049 C0040)
  C0010: (*MFEEL* #BILL1 #LOVE #JENNY2)
ISEEN: NIL
--------
MEMORY begins the second pass of inferencing.
This consists of subjecting each inference
which arose from the first pass to inference
again. The ISEEN property prevents duplication
of inferences during second and subsequent
passes.
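A minimal sketch of this multi-pass control with an ISEEN-style marker (Python, with a hypothetical toy rule chain; the real property records which inference molecules have already been applied to each structure):

```python
# Sketch of repeated inference passes: each pass expands only facts not
# yet marked, so second-pass inference reaches conclusions that needed
# first-pass results without re-deriving anything. Illustrative rules.

def run_passes(initial_facts, draw_inferences, max_passes=10):
    facts, iseen = set(initial_facts), set()
    for _ in range(max_passes):
        frontier = [f for f in facts if f not in iseen]
        if not frontier:
            break                        # inferencing eventually ceases
        for fact in frontier:
            iseen.add(fact)              # never expand the same fact twice
            facts.update(draw_inferences(fact))
    return facts

# Toy chain: a first-pass result enables a second-pass inference.
rules = {"kiss-seen": ["bill-knows-mfeel"],
         "bill-knows-mfeel": ["bill-angry-at-john"]}
facts = run_passes({"kiss-seen"}, lambda f: rules.get(f, []))
# facts: {"kiss-seen", "bill-knows-mfeel", "bill-angry-at-john"}
```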
One new inference which was not possible
on the first pass is that Bill probably
became angry at John. This inference arises
from Bill's knowing that John feels a positive
emotion toward #JENNY2, someone Bill loves.
C0119 is the structure representing Bill's
incipient anger toward John. The crucial
point is that this inference became possible
only after #JENNY2's features became
available after a reference decision, which
was in turn made possible through first-
pass inferencing.
Finally, we have a look at this second-pass
inference.
C0121 represents the cause of Bill's anger
as being C0086, his knowing about the kissing
event, C0049.
Notice the reasons MEMORY believes that Bill
became angry at John: he knew John kissed
#JENNY2 (this structure is C0049), and he
loves #JENNY2.

REFERENCES 

(A1) Abelson, R. P., "The Structure of Belief Systems," in Schank
and Colby (eds.), Computer Models of Thought and Language, W. H.
Freeman, San Francisco, 1973

(A2) Anderson, J., and Bower, G., Human Associative Memory, Wiley &
Sons, New York, 1973

(C1) Charniak, E., "Toward a Model of Children's Story Comprehension,"
Doctoral Dissertation, MIT, 1972

(E1) Eccles, J., "The Physiology of Imagination," Scientific American,
Sept. 1958

(G1) Goldman, N., "Computer Generation of Natural Language from a
Deep Conceptual Base," Doctoral Dissertation, Stanford, 1974

(H1) Hewitt, C., "Procedural Embedding of Knowledge in PLANNER,"
Proc. 2nd Joint Conf. on Artificial Intelligence, 1971

(L1) Levy, D., "Contextual Guidance of Inference," working paper,
Stanford, 1974

(M1) Marcus, M., "Wait-and-See Strategies for Parsing Natural
Language," working paper, M.I.T. AI Lab, August 1974

(Q1) Quillian, R., "Semantic Memory," in Minsky (ed.), Semantic
Information Processing, MIT Press, Cambridge, Mass., 1968

(R1) Rieger, C., "Conceptual Memory: A Theory and Computer Program
for Processing the Meaning Content of Natural Language Utterances,"
Doctoral Dissertation, Stanford, 1974

(R2) Riesbeck, C., "Computational Understanding: Analysis of
Sentences and Context," Doctoral Dissertation, Stanford, 1974

(S1) Schank, R., "Conceptual Dependency: A Theory of Natural
Language Understanding," Cognitive Psychology, 3(4), 1972

(S2) Schank, R., Goldman, N., Rieger, C., and Riesbeck, C., "Primitive
Concepts Underlying Verbs of Thought," Stanford AI Memo #162,
1972

(S3) Schank, R., Goldman, N., Rieger, C., and Riesbeck, C., "MARGIE:
Memory, Analysis, Response Generation and Inference on English,"
in Proc. 3rd Joint Conf. on Artificial Intelligence, 1973

(W1) Winograd, T., "Procedures as a Representation for Data in a
Computer Program for Understanding Natural Language," Doctoral
Dissertation, MIT, 1971

(Z1) Zadeh, L., "The Concept of a Linguistic Variable and Its
Application to Approximate Reasoning," working paper, U.C.
Berkeley, 1973
