<?xml version="1.0" standalone="yes"?> <Paper uid="C86-1051"> <Title>ABSTRACT ~ IDEAL v PROPOSITIONAL v QUANTITY v IRREAL REAL ~ (PHYSICAL v TEMPORAL v SENTIENT) & (NATURAL v SOCIAL) PHYSICAL ~ (STATIONARY v NONSTATIONARY) & (ANIMATE v INANIMATE) NONSTATIONARY ~ SELFMOVING v NONSELFMOVING COLLECTIVE ~ MASS v SET v STRUCTURE STATIONARY ~ ~MOVEABLE TEMPORAL ~ STATIVE v NONSTATIVE NONSTATIVE ~ (GOAL v NONGOAL) & (PROCESS v ACTIVITY v MOTION) PROCESS ~ POSITIVE v NEGATIVE ACTIVITY ~ OCCUPATIONAL v INTERACTIONAL OCCUPATIONAL ~ AGRICULTURAL v MININGMANU v TRADE v SERVICE v EDUCATION INTERACTIONAL ~ POSSESSIVE v ASSISTIVE v CONTACTUAL v CONFRONTATIONAL MOTION ~ (FAST v SLOW) & (TOWARD v AWAY)</Title> <Section position="4" start_page="216" end_page="218" type="metho"> <SectionTitle> INDIVIDUAL: ENTITY, ABSTRACT, REAL, CONCRETE, ANIMAL, COW. COLLECTIVE: ENTITY, ABSTRACT, REAL, CONCRETE, ANIMAL, HERD. </SectionTitle> <Paragraph position="0"> Inheritance of properties works differently for the collectives than it does for individuals. Because cow is under ANIMAL, &quot;cow is a kind of animal&quot; is true. In contrast, herd attaches to ANIMAL, but &quot;a herd is a kind of animal&quot; is not true. A herd consists of animals. We have found that though there are gaps among the collectives, a surprising number of types of entities have collective names in English. For example, propositions come in collectives (discourse). Another important cross-classification involves SOCIAL vs NATURAL. Entities (or events) which come into being (or take place) naturally must be distinguished from those which arise through some sort of social intervention.</Paragraph> <Paragraph position="1"> ARTIFACT is one of the SOCIAL nodes. The distinction needs to be made high up in the ontology because it affects most kind types. For example, events may either be SOCIAL (party) or NATURAL (earthquake). (Section IV expands upon the justifications for the ontology). 
The ontology also assumes the possibility of multiple attachments of instantiations to nodes. Thus the representation is actually a lattice rather than a tree. For example, an entity, John, is both a HUMAN with the physical properties of a mammal, and is also a PERSON who thinks.</Paragraph> <Paragraph position="2"> The latter makes John very similar to other sentients such as institutions and social roles. Instead of loading all of that complexity into a single HUMAN node, we make the SENTIENT/NON-SENTIENT distinction high up in the hierarchy. There is ample philosophical (Strawson 53) and psychological (Gelman and Spelke 81) support for this decision. Any actual person is attached to both the HUMAN and PERSON nodes in the ontology.</Paragraph> <Paragraph position="3"> III.2 Generic Information In the generic features database, each sort is represented as a predicate with two arguments. The first is a list of prototype features and the second is a list of inherent features. A prototype feature is typically associated with a sort or predicate. Most entities have more prototypical features than inherent features. From our sample, a miner is typically &quot;male&quot;; a nurse is typically &quot;female&quot;; a town typically has &quot;houses&quot;, &quot;a square&quot;, &quot;a fountain&quot;, and so on. Inherent features are rationally unrevisable properties of a sort or predicate.</Paragraph> <Paragraph position="4"> Thus, a man is inherently &quot;male&quot;, a wife is inherently &quot;married&quot;, a house is inherently &quot;house-sized&quot;. From our sample, a miner inherently &quot;works in a mine&quot;, a nurse inherently is &quot;educated&quot;, a town inherently contains &quot;buildings&quot;, and so on.</Paragraph> <Paragraph position="5"> III.3 Feature Types The prototype features are represented by the same set of predicates used to represent the inherent features, thus achieving some economy in the rules. 
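The lattice structure and dual attachment described above can be sketched as follows. This is a minimal Python illustration, not the paper's PROLOG implementation; the particular parent links are assumptions made for the example.

```python
# Illustrative sketch: the ontology as a lattice in which an instance
# (here John) attaches to more than one node, and inherits ontological
# categories along every path up toward ENTITY.
PARENTS = {
    "HUMAN": ["PHYSICAL"], "PERSON": ["SENTIENT"],   # assumed links
    "PHYSICAL": ["REAL"], "SENTIENT": ["REAL"],
    "REAL": ["INDIVIDUAL"], "INDIVIDUAL": ["ENTITY"], "ENTITY": [],
}
ATTACHMENTS = {"john": ["HUMAN", "PERSON"]}          # dual attachment

def categories(instance):
    """Collect every category inherited along all attachment paths."""
    seen, stack = set(), list(ATTACHMENTS[instance])
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(PARENTS[node])
    return seen
```

Because John attaches to both HUMAN and PERSON, a query for his categories returns SENTIENT as well as PHYSICAL, which a single-parent tree could not deliver.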
Nevertheless, the number of predicates needed to encode the inherent and prototype features is theoretically limitless. Fortunately, a small and manageable set of 33 feature types encodes a great deal of information, although not exhaustively. The features themselves were chosen empirically to correspond with psycholinguistic data gathered by Rosch et al (1976), Ashcraft (1976) and Dahlgren (1985a). When asked to list prototypical features of various concrete objects, subjects tend to name features which fall into a small number of types such as SIZE, COLOR, SHAPE, and FUNCTION.</Paragraph> <Paragraph position="6"> Similarly, a few types of features such as STATUS, SEX, INTERNAL TRAIT and RELATION are named for social roles.</Paragraph> <Paragraph position="7"> Notice that a feature type such as SIZE or COLOR may be inherent for one sort but only prototypical for another. For instance, while blood inherently has COLOR &quot;red&quot;, a brick is only prototypically &quot;red&quot;. While a brick inherently has SHAPE &quot;rectangular parallelepiped&quot;, bread is only prototypically &quot;loaf-shaped&quot;. In some cases, a sort has a feature type both inherently and prototypically. For example, a doctor has the inherent FUNCTION &quot;treats sick people&quot; and the prototypical FUNCTION &quot;consoles sick people&quot;.</Paragraph> <Paragraph position="8"> III.4 Kind Types as Metasorts Most knowledge representation systems permit any combination of the features in descriptions. KT limits these combinations by taking advantage of several important ontological constraints affecting the possible real-world objects and therefore possible combinations of features in commonsense knowledge. Objects fall into kinds. In particular, natural kinds exist because their members share some underlying trait, while artifacts and social kinds exist because of social convention (Schwartz 1979; Dahlgren 1985b). 
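The inherent/prototypical split over a shared feature vocabulary can be sketched as follows. This is an illustrative Python translation of the representation the section describes (the actual system uses PROLOG); the feature entries are taken from the blood/brick/bread examples above.

```python
# Sketch of the generic-features database: each sort carries a prototype
# list and an inherent list, drawn from the same feature vocabulary.
GENERIC = {
    "blood": {"prototype": [], "inherent": [("color", "red")]},
    "brick": {"prototype": [("color", "red")],
              "inherent": [("shape", "rectangular parallelepiped")]},
    "bread": {"prototype": [("shape", "loaf-shaped")], "inherent": []},
}

def feature_status(sort, feature):
    """Report whether a sort holds a feature inherently, prototypically, or not."""
    entry = GENERIC[sort]
    if feature in entry["inherent"]:
        return "inherent"
    if feature in entry["prototype"]:
        return "prototype"
    return "unknown"
```

The same pair ("color", "red") thus comes back "inherent" for blood but only "prototype" for brick, mirroring the distinction in the text.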
We call classifications of kinds KIND TYPES, so that NATURAL KIND constitutes one kind type, ARTIFACT another, and so on. Kind types constrain the commonsense knowledge base in several ways. First, each kind type is understood in terms of certain predictable feature types. NATURAL KIND is conceived primarily in terms of perceptual features, while ARTIFACT adds functional features. Second, there is a correlational structure to the features of real-world objects. Given that an object is a mammal, certain features will be found (eg. &quot;fur&quot;) and others will be absent (eg. &quot;feathers&quot;).</Paragraph> <Paragraph position="9"> Associated with each node in the ontology is kind type information encoding feature types entities attached at that node may have. Entities may be described by features falling into some or all of these feature types, and no others. Inheritance up the tree ensures that any lower node has all the feature types of higher nodes on any path to ENTITY. For instance, any node under PHYSICAL may have certain feature types, and any node under ARTIFACT may have those inherited from PHYSICAL, as well as further feature types, as below: PHYSICAL - Shape, Size, Color, Material, Texture, Odor, Hasparts, Partof</Paragraph> </Section> <Section position="5" start_page="218" end_page="220" type="metho"> <SectionTitle> ARTIFACT - [PHYSICAL], Function, Operation, Construction, Owner </SectionTitle> <Paragraph position="0"> At each node, only certain feature types are applicable. Conversely, each feature known to KT is classified by type as a COLOR, SIZE, FUNCTION, INTERNAL TRAIT or other. Cohn (1985) describes the economy of the use of sorts in logic programming. In the KT system, sorts and predicates appear at the terminal nodes of the ontology. 
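The PHYSICAL/ARTIFACT listing above can be sketched as code. This is a minimal Python illustration of kind types as metasorts, assuming only the two-node PHYSICAL-to-ARTIFACT chain given in the text; it is not the system's implementation.

```python
# Sketch of kind types: each ontological node licenses certain feature
# types, and a node inherits the licensed types of every node above it.
KIND_TYPES = {
    "PHYSICAL": {"shape", "size", "color", "material", "texture",
                 "odor", "hasparts", "partof"},
    "ARTIFACT": {"function", "operation", "construction", "owner"},
}
PARENT = {"ARTIFACT": "PHYSICAL", "PHYSICAL": None}

def licensed_feature_types(node):
    """Union a node's feature types with those inherited from above."""
    types = set()
    while node is not None:
        types |= KIND_TYPES[node]
        node = PARENT[node]
    return types

def may_have(node, feature_type):
    # A description is admissible only if each feature falls in a licensed type.
    return feature_type in licensed_feature_types(node)
```

So an ARTIFACT may have a COLOR (inherited from PHYSICAL) as well as a FUNCTION, while a bare PHYSICAL node licenses no FUNCTION feature at all.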
In addition, the kind types employed by the system represent metasorts, in that they constrain the possible types of sorts recognized by the system.</Paragraph> <Paragraph position="1"> III.5 Encoding the Common Sense Knowledge The representations described above will be illustrated with the sort nurse. Nurse is attached to the ontology in axiom (9): 9) nurse(X) -> role(X).</Paragraph> <Paragraph position="2"> From this axiom nurse inherits SENTIENT, SOCIAL, PHYSICAL, REAL, INDIVIDUAL and ENTITY from the ontology. In the generic database, axiom (10) lists the prototype and inherent features of nurse.</Paragraph> <Paragraph position="3"> 10) nurse({caring, female}, {educated, assistant, help(*,*)}).</Paragraph> <Paragraph position="5"> Notice that the last inherent feature is in the form of a PROLOG clause.</Paragraph> <Paragraph position="6"> This makes it possible to use the whole complex feature as input to the English grammar in order to formulate an English response to a question such as &quot;What does the nurse do?&quot;, or &quot;Does the nurse help people?&quot;. The feature typing database classifies the features as follows: relation(assistant).</Paragraph> <Paragraph position="7"> internaltrait(caring).</Paragraph> <Paragraph position="8"> internaltrait(educated).</Paragraph> <Paragraph position="9"> sex(female).</Paragraph> <Paragraph position="10"> function(help(*,*)).</Paragraph> <Paragraph position="11"> The kind types predict that as a ROLE, nurse will have certain types of features. Inherited from the SENTIENT kind type are feature types INTERNAL TRAIT (&quot;caring&quot;) and GOAL (&quot;tries to help&quot;). Inherited from the SOCIAL kind type are feature types FUNCTION (&quot;takes care of patients&quot;) and REQUIREMENT (&quot;license&quot;). In addition, RELATION type features (&quot;assistant&quot;) are predicted with a ROLE. IV. 
The Inference Mechanism Built into the natural language component by Stabler and Tarnawsky is a metainterpreter which solves queries of all axioms active in the system. This permits us to query ontological and generic information as well as textual information. The translation of the first sentence of Sample Text is as in (11).</Paragraph> <Paragraph position="12"> 11) miner(john) & town(town220) The problem solver derives the answers to queries as in (12), matching logic translations of the queries, which are in the form of Prolog goals, to the database.</Paragraph> <Paragraph position="13"> 12) Is John a miner? -- Yes Does John live in a town? -- Yes In addition, KT is able to make a number of inferences from the text which are not directly stated there. The inferences are drawn from various aspects of the common-sense knowledge built into KT.</Paragraph> <Paragraph position="14"> IV.1 Inheritance Using the ontological database and the same problem solver, the KT system deduces taxonomically inherited information about the entities mentioned in the text, as in (13)-(14).</Paragraph> <Paragraph position="15"> 13) What is a miner? --A miner is a role, sentient, concrete, social, individual and an entity.</Paragraph> <Paragraph position="16"> 14) What does a miner do? --A miner digs for minerals.</Paragraph> <Section position="1" start_page="218" end_page="220" type="sub_section"> <SectionTitle> What is digging? </SectionTitle> <Paragraph position="0"> --A goal-oriented, natural, nonmental, real, temporal activity. If an entity has dual attachment, for example as a human and as a role, or as a place and as an institution, then KT explains inheritance relations along both paths of the ontology. A clinic is both a social place and an institution, and so when asked (15), 15) What is the clinic? 
KT replies both that &quot;A clinic is an institution, sentient, physical, real, collective, structure.&quot; and that &quot;A clinic is a social place, place, inanimate, physical, stationary, social, real, individual.&quot; Direct ontological questions such as (16) are also answered: 16) Is the clinic a social place? --Yes Is the clinic collective? --Yes The inheritance path is followed in answering such questions, so that the system can answer not only queries of node attachments at the terminal nodes of the ontology, but at all higher levels.</Paragraph> <Paragraph position="1"> IV.2 Complete and Incomplete Knowledge In reasoning with this schema, the system knows which valid inferences it can derive ontologically, and therefore definitively, and which knowledge is incomplete. For example, KT knows that it knows the following for certain:</Paragraph> <Paragraph position="3"> It also knows that if something is HUMAN, it is not ABSTRACT. When asked &quot;Is the teacher abstract?&quot; it answers &quot;No&quot;. Thus it handles the exclusivity of sets called for by Hendrix (1979) and Tenenbaum (1985).</Paragraph> <Paragraph position="4"> On the other hand, it knows which information is incomplete. With generic descriptions, KT knows that it only knows at the probabilistic level.</Paragraph> <Paragraph position="5"> If asked, &quot;Is Mary intelligent?&quot; it responds &quot;Probably so.&quot; This reflects the fact that most English speakers share a prototype of teachers as intelligent. The logic works this way. If a question is ontological, KT gives definitive (yes/no) answers. If the question is generic, the answer is qualified as either prototype or inherent. If no answer can be derived to a non-ontological question, KT responds &quot;I don't know.&quot; Thus KT makes the open world assumption except with regard to ontological classifications.</Paragraph> <Paragraph position="6"> 
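The answering policy just described can be sketched as a small decision procedure. This is an illustrative Python stand-in, not the system's metainterpreter; the toy fact tables are assumptions built from the teacher/Mary/chicken examples in the text.

```python
# Sketch of KT's answer policy: ontological queries are definitive,
# generic queries are qualified, and everything else is open-world.
ONTOLOGY = {("teacher", "human"): True, ("teacher", "abstract"): False}
PROTOTYPE = {("teacher", "intelligent")}     # shared prototype features
INHERENT = {("chicken", "lays eggs")}        # rationally unrevisable features

def answer(entity, predicate, ontological=False):
    if ontological:                          # closed within the ontology
        return "Yes" if ONTOLOGY[(entity, predicate)] else "No"
    if (entity, predicate) in INHERENT:
        return "Inherently so"
    if (entity, predicate) in PROTOTYPE:
        return "Probably so"
    return "I don't know"                    # open world elsewhere
```

The three answer grades ("No", "Probably so", "I don't know") reproduce the behavior the section attributes to KT for ontological, generic, and underivable queries respectively.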
This ability to reason about incomplete definitions is similar to Levesque's proposal for incomplete databases (Levesque 84).</Paragraph> <Paragraph position="7"> KT also answers queries about features of the entities in the text, both directly and by types of features.</Paragraph> <Paragraph position="8"> Direct feature queries are of the form (17). The form of the answer depends upon whether the feature is prototypical or inherent.</Paragraph> <Paragraph position="9"> 17) Is the miner rugged? --Probably so.</Paragraph> <Paragraph position="10"> Is the clinic a place? --Inherently so.</Paragraph> <Paragraph position="11"> Does the chicken lay eggs? --Inherently so.</Paragraph> <Paragraph position="12"> Are the eggs white? --Probably so.</Paragraph> <Paragraph position="13"> How is digging done? --Probably with a shovel.</Paragraph> <Paragraph position="14"> Where is digging done? --Probably in the earth.</Paragraph> <Paragraph position="15"> IV.4 Overriding Features Generic information is handled differently from ontological information. First, it is tentatively inferred, and checked against the current knowledge base of information built up from reading the text. If anything in the textual database conflicts with a generic inference, the latter is overridden. KT takes the text as the authority, and if the text says that an entity has a feature contradicting those in its commonsense knowledge of the entity, the text's claim comes first. For example, Sample Text says that the eggs are &quot;brown&quot;, which overrides the prototypical generic feature &quot;white&quot; which is listed for egg, as in (18).</Paragraph> <Paragraph position="16"> 18) Are the eggs brown? --The text says so.</Paragraph> <Paragraph position="17"> The cancellation takes place simply by matching to the textual database first. Similarly, if a text said that an elephant had three legs, the KT system would reason that it had three legs, and not the inherent four that elephants have. 
By overriding inherent features, KT gets around the cancellation problem which arises when features are viewed as logically necessary. If &quot;has four legs&quot; is taken to be a logically necessary feature, any three-legged elephant forces a contradiction, or special processing for exceptions (Brachman and Schmolze 1985). The KT system accepts both facts as true, with no contradiction. This particular elephant has three legs, and elephants inherently have four legs.</Paragraph> <Paragraph position="18"> In attempting to match to both the textual and generic databases, the possibility of infinite recursion arises. This is true in principle for the human reasoner, as well. KT prevents infinite recursion by limiting inferences to a depth of 5.</Paragraph> <Paragraph position="19"> Because of the feature typing, KT can answer queries as in (19).</Paragraph> <Paragraph position="20"> 19) What color are the eggs? What function does the clinic have? Feature typing classifies &quot;brown&quot; as of type COLOR. When KT looks first at the translation of the text to see whether it contains an assertion which states a color for the eggs, it must distinguish the facts in the text which are relevant to the feature type queried. With respect to Sample Text, in order for KT to answer &quot;What color are the eggs?&quot;, KT must know that &quot;brown&quot; is a COLOR. Without feature types, KT would not contrast &quot;white&quot; with &quot;brown&quot;.</Paragraph> <Paragraph position="21"> KT deduces sets of facts, as well as individual facts. When queried for a type of feature, such as FUNCTION, KT responds with all functions listed for a sort. For example, clinics prototypically have both outpatient and emergency functions, and KT lists both when queried for function. For sorts which are structural, that is, concrete objects and institutions, KT is able to describe the structure. 
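The text-first matching and feature typing described above can be sketched together. This is an illustrative Python rendering of the mechanism (the system itself matches PROLOG databases); the feature-type table is an assumption built from the brown/white egg example.

```python
# Sketch of overriding: a typed feature query matches the textual
# database before the generic database, so a fact read from the text
# (brown eggs) shadows the prototype (white eggs) with no contradiction.
FEATURE_TYPE = {"brown": "color", "white": "color", "loaf-shaped": "shape"}
TEXT_DB = {"egg": ["brown"]}                       # built up while reading
GENERIC_DB = {"egg": ["white"], "bread": ["loaf-shaped"]}

def query_feature(entity, feature_type):
    """Return the entity's feature of the requested type, text facts first."""
    for db in (TEXT_DB, GENERIC_DB):               # the text is the authority
        for feature in db.get(entity, []):
            if FEATURE_TYPE.get(feature) == feature_type:
                return feature
    return None
```

Typing is what makes the query work: only because "brown" is classified as a COLOR can it be recognized as the relevant textual fact and returned ahead of the generic "white".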
If asked &quot;What structure does the clinic have?&quot;, KT answers that typically it has a hierarchy of head-assistant-clientele and has roles of doctor for head, nurse for assistant and patient for clientele. Similarly, if asked &quot;What structure does the fish have?&quot;, KT answers, inherently it has these parts: fins, 1 tail, 1 head, 2 eyes, scales. When KT lists parts, bare plurals mean an unspecified number greater than one.</Paragraph> <Paragraph position="22"> IV.5 Kind Types The kind types are useful in both parsing and inferencing for text understanding. In the parsing phase, kind types can be used four ways. First, verb sense ambiguity can be resolved by the kind types of subject and object head nouns. In a sentence with the verb take, knowing that the subject is a vehicle forces the choice of one sense of the verb, and knowing that it is a human forces another. Secondly, KT can reason the other way around, and use selection restrictions on verbs to infer the kind types of entities referred to in the sentence. Consider sentence (20).</Paragraph> <Paragraph position="23"> 20) ABC sued the man.</Paragraph> <Paragraph position="24"> Using kind types for selection restrictions, KT infers that the entity named ABC is a SENTIENT. Given the further information in (21), KT infers that ABC is an INSTITUTION and not a PERSON, because the verb join requires an INSTITUTION as object.</Paragraph> <Paragraph position="25"> 21) The man had joined ABC illegally.</Paragraph> <Paragraph position="26"> Thirdly, certain anaphoric references can be resolved using kind types.</Paragraph> <Paragraph position="27"> When verb selection restrictions classify the entity referred to by a pronoun as in a certain kind type, then possible antecedents are correspondingly constrained. Consider the relationships in (22). It corefers with milk because, when intransitive, spill requires a LIQUID as subject.</Paragraph> <Paragraph position="28"> 22) The cat drank the milk. 
It spilled.</Paragraph> <Paragraph position="29"> Fourthly, attachment ambiguities for prepositional phrases can be resolved using kind types. Consider sentence (23).</Paragraph> <Paragraph position="30"> 23) John bought the lock in the afternoon.</Paragraph> <Paragraph position="31"> It is syntactically possible for the prepositional phrase in the afternoon to modify the lock, the verb phrase or the whole sentence. Since afternoon is in the kind type TEMPORAL, KT can resolve this syntactic ambiguity, and attach the prepositional phrase so that it modifies the whole sentence (Dahlgren and McDowell, 1986).</Paragraph> <Paragraph position="32"> IV.6 Summary of Inference Mechanism In summary, predications used to derive inferences in the text are found in five aspects of common-sense knowledge. In using KT, queries drive these inferences. After a text such as the Sample Text has been read, KT can respond to queries and seem to understand the text in a more human-like way using the various aspects of knowledge indicated above. Below are listed some queries and responses. Q: Who is John? A: The man who lives in the town.</Paragraph> <Paragraph position="33"> ---Prototypical town has people living in it.</Paragraph> <Paragraph position="34"> ---Prototypical male person is a man (not a boy).</Paragraph> <Paragraph position="35"> Q: Was the town built? 
A: Yes.</Paragraph> <Paragraph position="36"> ---By ontology of artifacts Q: Who built the town? A: People.</Paragraph> <Paragraph position="37"> ---By ontology of artifacts</Paragraph> <Paragraph position="38"> Q: Does John wear pants? A: Probably so.</Paragraph> <Paragraph position="39"> ---By prototype database.</Paragraph> <Paragraph position="40"> Q: Does John eat eggs?</Paragraph> <Paragraph position="41"> A: Yes.</Paragraph> <Paragraph position="42"> ---Because eggs are food.</Paragraph> <Paragraph position="43"> Q: What does health think? A: Health doesn't think.</Paragraph> <Paragraph position="44"> ---By kind types.</Paragraph> <Paragraph position="45"> Q: Does John look like a clinic? A: No.</Paragraph> <Paragraph position="46"> ---By ontology database.</Paragraph> <Paragraph position="47"> Q: Does John live in a tent? A: Probably not.</Paragraph> <Paragraph position="48"> ---By prototype of town Q: Does John have a function? A: Yes. ---By kind types V. Basis for the Commonsense Knowledge Results in linguistic research underline the importance of category distinctions, such as those between abstract and concrete objects, and persons as opposed to other objects. These actively affect sentence interpretation and generation. The sentence &quot;The rock read the book&quot; must either be interpreted as anomalous or metaphorical because only persons read. These constraints provide an empirical basis for the ontology. Cognitive psychological research provides a further basis for the ontology. Keil's work on ontological categorization in cognitive development was consulted in constructing the schema (Keil 79). Gelman and Spelke's results suggested placing SENTIENT higher in the schema (Gelman and Spelke 81).</Paragraph> <Paragraph position="49"> Graesser and Clark's studies were the basis of the verb ontology (Graesser & Clark, 1985). 
Psycholinguistic research in the prototype theory provided descriptions of the actual prototypes shared by English speakers for a number of these categories (Rosch, et al 76) (Dahlgren 85). The ontological schema was developed in two steps. First, the verbs from the corpus of geography texts were classified according to selectional restrictions (SRs) on subjects and objects. Second, the minimal categories needed to accommodate these SRs were arranged in a hierarchical schema. Certain SRs, such as HUMAN, ANIMATE, CONCRETE, were expected. Others were surprises. Some verbs required complements that were marked for PLACE, and others required either subjects or objects to have certain moveability features. These are summarized below.</Paragraph> <Paragraph position="50"> STATIONARY: normally immobile, attached to the earth, moved only at great effort.</Paragraph> <Paragraph position="51"> SELFMOVING: normally in motion or designed for motion, in some cases with no apparent initial source.</Paragraph> <Paragraph position="52"> NONSELFMOVING: normally immobile but can be moved with slight effort. A source for the motion is expected, usually something SELFMOVING.</Paragraph> <Paragraph position="53"> One other interesting result from this stage of the project is that a number of verbs take either a PROPOSITIONAL or a SENTIENT subject. Both a book and a person can say something.</Paragraph> <Paragraph position="54"> Once the set of categories had been established, the next stage was fitting them into a hierarchy from which inheritance of features could be computed by KT. There were several constraints guiding this process. First, we wanted the ontology to be as compact as possible. Second, we wished to minimize nonexistent leaf nodes. Third, we preferred that the system infer too little than too much. 
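Selection restrictions as category checks can be sketched as follows. This is an illustrative Python sketch, not the system's grammar; the verb entries are assumptions, with "say" taking either a SENTIENT or PROPOSITIONAL subject as the text observes.

```python
# Sketch of verb selection restrictions: each verb role constrains the
# kind types of its argument, so noun categories can disambiguate the
# verb, or the verb can classify an unknown noun.
SELECTION = {
    ("say", "subject"): {"SENTIENT", "PROPOSITIONAL"},  # a person or a book can say
    ("stampede", "subject"): {"COLLECTIVE"},            # assumed entry
}
CATEGORIES = {"book": {"PROPOSITIONAL"}, "person": {"SENTIENT"},
              "herd": {"COLLECTIVE"}}

def satisfies(noun, verb, role):
    """Check the noun's kind types against the verb's selection restriction."""
    return bool(CATEGORIES[noun] & SELECTION[(verb, role)])
```

Run forward, the check filters verb senses by argument category; run backward, a noun that satisfies the restriction can be inferred to belong to one of the licensed kind types.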
During this process it was also necessary to decide which of the SRs represented true category cuts in an ontological schema and which were merely features on individual lexical items. The guiding principle here was that if the distinction under examination (e.g. ANIMATE/INANIMATE) pervaded some subtree, then it was assigned to a branching point. But if some distinction was needed in isolated parts of the tree, then it was represented as a feature. For instance, we found that the INDIVIDUAL/COLLECTIVE distinction pervades the lexicon and must be a primary cut in the ontology. Many verbs select only INDIVIDUAL or only COLLECTIVE (stampede) subjects or objects. Properties which were assigned feature status were items like EDIBLE and SIZE.</Paragraph> <Paragraph position="55"> The NATURAL/SOCIAL distinction was placed high on the tree because human intervention pervades the world. All abstract entities are products of the human mind, but every category of real entities, including events and states, contains dozens of examples of the products of society. We therefore reserved the term ARTIFACT for inanimate man-made objects to distinguish them from natural inanimate objects. The SENTIENT/PHYSICAL distinction is also fairly high. SENTIENT is often placed as a subordinate of ANIMATE, but in commonsense reasoning, the properties of people and things are very different. The NATURAL/SOCIAL distinction applies to SENTIENT just as it does to PHYSICAL. A NATURAL, SENTIENT entity is a PERSON, that is a man, woman, whereas a SOCIAL, SENTIENT entity is a ROLE: secretary, miner, president. A collection of PERSON is a BODY: crowd, mob. A collection of ROLE is an INSTITUTION: hospital, school. The INDIVIDUAL/COLLECTIVE cut had to be made at the level of ENTITY (the highest level) at the same place as the ABSTRACT/REAL cut. 
This was not the only place where multiple distinctions applied (see Figure I).</Paragraph> <Paragraph position="56"> Our term COLLECTIVE applies to all collections of entities, classified into three subgroups. True collectives are sets in which each member of the set is identical to all the others (herd, mob). Masses are collections whose members are referred to only in terms of measurable units (sand, water). Finally, there are structures where the members have specified relations, such as in institutions (school, company). It was consideration of both the constraints listed above, and the assignments of SRs to feature or node status that led us to abandon both binary branching and planar trees as useful representational devices. While it was possible to model some distinctions as binary, others required more than two branches. For example, ABSTRACT entities divide into IDEAL, PROPOSITIONAL, QUANTITY, and IRREAL, all of which have equivalent status as SRs.</Paragraph> <Paragraph position="57"> We were still faced with the fact that many entities still seemed to straddle the hierarchy. Is an individual human a PRIMATE or a PERSON, or both? Is a hospital an INSTITUTION or a PLACE, or both? If we were to establish a hierarchy which would reflect these differences, we would end up with a very large and unwieldy schema with huge gaps. Therefore, we decided on multiple attachment for those entities which required it. This decision was justified as well by examination of the texts, which revealed that a human being was generally dealt with in a context as either a person or a physiological being, but rarely as both at the same time. Figure III gives examples of some nouns, their assignment to categories and rules by which terminal nodes in the schema are generated from higher-level nodes. 
Figure III shows only a few examples of terminal nodes in the schema. However, every path through the ontology results in a terminal node which is named and which represents a unique class defined by inheritance of features up the tree. Terminal node names distinguish the individuals from the collectives. For instance, the collective node corresponding to PLANT is FLORA. The individual node corresponding to DISCOURSE is PROPOSITION. Similarly, STUFF is the collective of MINERAL, INSTITUTION is the collective of ROLE, and BODY is the collective of PERSON, etc.</Paragraph> <Paragraph position="58"> The types of features which occurred in the data at each node in the ontology were the basis of the kind types. It is an empirical fact that feature types are correlated in relation to ontological classifications. At each node in the ontology is a kind type encoding certain sets of properties that any entity classified at that node may have. Inheritance up the tree ensures that any lower node has all the properties of higher nodes on a single path to ENTITY. For each property at a node, a set of values applies. While the values for items such as COLOR are fairly obvious, we have had to construct value ranges elsewhere. For SIZE, we have started with the set {microscopic, tiny, small, handleable, medium, large, huge, building-sized, skyscraper-sized, mountainous, region-sized}, which is a reality-oriented scale to be applied loosely. The kind types were extracted empirically from the generic data after all the features were typed, by inspection of types of features associated with sorts and predicates at each node of the ontology.</Paragraph> <Paragraph position="59"> The texts in the corpus describe lifestyle and industry in various countries. Generic descriptions of the nouns in the text were drawn from the psycholinguistic literature, to the extent possible ((Rosch 76); (Ashcraft 76); (Dahlgren 85)). 
For P, OLE, we used generic descriptions of social roles collected by Dahlgren and partially published in (Dahlgren 85). For PHYSICAL we used generic descriptions from (Ashcraft 76). For those nouns where no data existed, generic descriptions were created cenforming to the types of information generated by subjects for similar nouns. We do not consider this a defect of our system, since we are not trying to argue for the psychological reality of any particular generic description, but merely for the efficacy of a reasoning system which uses them. The decision to place features in the prototype list or the inherent list for a sort or predicate was decided by two judges. It is a research goal to verify judgments experimentally.</Paragraph> <Paragraph position="60"> Co eclusjot~ hi conclusion, KT encodes an ontology which omdels the top level of typical t~'nglish speaker's cognitive model ef the actual wolld. It employs several different types of information to reason in human-like ways about text that it reads. In addition to the onlolngy, iI uses velb selection restrictions and generic in\[ormatlon associated with COIleel)ls. By enlploying systematic constraints in the form of kind types assoeialed with nodes in the ontology, KT reasons efficiently. All of lhe information KT uses is drawn from empirical studies of human cognitive psychology, linguistics or the corpus of text which KT reads. Because of this empirical basis, and the breadth of the ontology, KT is a transportable syslcm which is potentially useful for understanding any text of a general, literal nat'tn'e,</Paragraph> </Section> </Section> class="xml-element"></Paper>