<?xml version="1.0" standalone="yes"?> <Paper uid="P85-1040"> <Title>GRAMMAR VIEWED AS A FUNCTIONING PART OF A COGNITIVE SYSTEM</Title> <Section position="4" start_page="0" end_page="330" type="evalu"> <SectionTitle> 2. Background </SectionTitle> <Paragraph position="0"> There are currently several approaches to developing cognitive models of linguistic function (Cottrell, 1984; Cottrell and Small, 1983; Gigley, 1981; 1982a; 1982b; 1983; 1984; 1985; Small, Cottrell and Shastri, 1982; Waltz and Pollack, in press). These models include assumptions about memory processing within a spreading activation framework (Collins and Loftus, 1975; Hinton, 1981; Quillian, 1968/1980), and a parallel, interactive control paradigm for the processing. They differ in the explicit implementations of these theories and the degree to which they claim to be psychologically valid.</Paragraph> <Paragraph position="1"> Computational Neurolinguistics (CN), first suggested as a problem domain by Arbib and Caplan (1979), is an Artificial Intelligence (AI) approach to modelling neural processes which subserve natural language performance. As CN has developed, such models have become highly constrained by behavioral evidence, both normal and pathological.</Paragraph> <Paragraph position="2"> CN provides a framework for defining cognitive models of natural language performance that include claims of validity at two levels: the natural computation, or neural-like processing, level, and the system result, or behavioral, level.</Paragraph> <Paragraph position="3"> Using one implementation of a CN model, HOPE (Gigley, 1981; 1982a; 1982b; 1983), a model of single sentence comprehension, the remainder of the paper will illustrate how the role of grammar can be integrated into the design of such a model. It will emphasize the importance of the parallel control assumptions in constraining the representation in which the grammar is encoded. 
It will demonstrate how the grammar contributes to controlling the coordination of the parallel, asynchronous processes included in the model.</Paragraph> <Paragraph position="4"> The HOPE model is chosen explicitly because the underlying assumptions in its design are intended to be psychologically valid on two levels, while the other referenced models do not make such claims. The complete model is discussed in Gigley (1982a; 1982b; 1983) and will be summarized here to illustrate the role of the grammar in its function. The suggested implications and goals for including neurophysiological evidence in designing such models have been discussed elsewhere in Lavorel and Gigley (1983) and will be included only as they relate to the role and function of the grammar.</Paragraph> <Paragraph position="5"> 2.1 Summary of Included Knowledge and its Representation Types of representations included in the HOPE model, phonetic, categorially accessed meanings, grammar, and pragmatic or local context, receive support as separately definable knowledge within studies of aphasia. There is a vast literature concerning what aspects of language are independently affected in aphasia that has been used as a basis for deciding these representations. (See Gigley, 1982b for complete documentation.) Information that is defined within the HOPE model is presented at a phonological level as phonetic representations of words (a stub for a similar interactive process underlying word recognition). Information at the word meaning level is represented as multiple representations, each of which has a designated syntactic category type and orthographic spelling associate to represent the phonetic word's meaning (also a stub). The grammatical representation has two components.</Paragraph> <Paragraph position="6"> One is strictly a local representation of the grammatical structural co-occurrences in normal language. 
The other is a functional representation, related to interpretation, that is unique for each syntactic category type. Please note that &quot;type&quot; is not used in the strictest sense of its use within a typed semantic system.</Paragraph> <Paragraph position="7"> It will be described in detail later. Finally, the pragmatic interpretation is assumed to reflect the sentential context of the utterance.</Paragraph> <Paragraph position="8"> Each piece of information is a thresholding device with memory. Associational interconnections are made by using a hierarchical graph which includes a hypergraph facility that permits simultaneous multiple interpretations for any active information in the process. Using this concept, an active node can be ambiguous, representing information that is shared among many interpretations. Sentence comprehension is viewed as the resolution of the ambiguities that are activated over the time course of the process.</Paragraph> <Paragraph position="9"> Within our implementation, graphs can represent an aspect of the problem representation by name. Any name can be attached to a node, or an edge, or a space (hypergraph) of the graph. There are some naming constraints required due to the graph processing system implementation, but they do not affect the conceptual representation on which the encoding of the cognitive linguistic knowledge relies.</Paragraph> <Paragraph position="10"> Any name can have multiple meanings associated with it. These meanings can be interpreted differently by viewing each space in which the name is referenced as a different viewpoint for the same information. This means that whenever the name is the same for any information, it is indeed the same information, although it may mean several things simultaneously. 
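A minimal sketch of this naming scheme may make it concrete. All class and attribute names below are hypothetical (the paper does not give HOPE's internal data structures): a single node carries one activity value but is registered in several spaces, each of which assigns the name its own interpretation.

```python
# Hypothetical sketch of HOPE-style shared names across hypergraph spaces.
# One node = one activity value; each space gives the name its own "meaning".

class Node:
    def __init__(self, name, resting=0.1):
        self.name = name
        self.activity = resting          # single activity value, shared everywhere
        self.interpretations = {}        # space name -> space-local meaning

    def add_viewpoint(self, space, meaning):
        self.interpretations[space] = meaning

    def meaning_in(self, space):
        # The same name is evaluated only relative to a context (space).
        return self.interpretations.get(space)

# "saw" as a NOUN-typed meaning: one piece of information, two viewpoints.
saw_noun = Node("NOUN-saw")
saw_noun.add_viewpoint("word-meanings", "a cutting tool")
saw_noun.add_viewpoint("grammar", "instance of category NOUN")

saw_noun.activity += 0.4                 # raising activity is visible in BOTH spaces
print(saw_noun.meaning_in("grammar"))    # interpretation depends on the space
```

Raising the node's activity in one viewpoint raises it everywhere, while its interpreted meaning stays space-relative, which is the behavior the text describes.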
An example related to the grammatical representation is that the syntactic category aspect of each meaning of a phonetic word is also a part of the grammatical representation where it makes associations with other syntactic categories. The associations visible in the grammatical representation and interpreted as grammatical &quot;meanings&quot; are not viewable within the phonetic word meaning perspective. However, any information associated with a name, for instance, an activity value, is viewable from any spaces in which the name exists. This means that any interpreted meaning associated with a name can only be evaluated within the context, or contexts, in which the name occurs. Meaning for any name is contextually evaluable. The explicit meaning within any space depends on the rest of the state of the space, which furthermore depends on what previous processing has occurred to affect the state of that space.</Paragraph> <Section position="1" start_page="324" end_page="325" type="sub_section"> <SectionTitle> 2.2 Summary of the Processing Paradigm </SectionTitle> <Paragraph position="0"> The development of CN models emphasizes process. A primary assumption of this approach is that neural-like computations must be included in models which attempt to simulate any cognitive behavior (Cf. Lavorel and Gigley, 1983), specifically natural language processing in this case. Furthermore, CN includes the assumption that time is a critical factor in neural processing mechanisms and that it can be a significant factor in language behavior in its degraded or &quot;lesioned&quot; state.</Paragraph> <Paragraph position="1"> Simulation of a process paradigm for natural language comprehension in HOPE is achieved by incorporating a neurally plausible control that is internal to the processing mechanism. There is no external process that decides which path or process to execute next based on the current state of the solution space. 
The process is time-locked; at each process time interval, there are six types of serial-order computations that can occur.</Paragraph> <Paragraph position="2"> They apply to all representation viewpoints or spaces simultaneously, and uniformly. Threshold firing can affect multiple spaces, and has a local effect within the space of firing.</Paragraph> <Paragraph position="3"> Each of these serial-order computations is intended to represent an aspect of &quot;natural computation&quot; as defined in Lavorel and Gigley, 1983. A natural computation, as opposed to a mechanistic one, is a &quot;computation&quot; that is achieved by neural processing components, such as threshold devices and energy transducers, rather than by components such as are found in digital devices. The most important aspect of the control is that all of the serial-order computations can occur simultaneously and can affect any information that has been defined in the instantiated model.</Paragraph> <Paragraph position="4"> Processing control is achieved using activity values on information. As there is no preset context in the current implementation, all information initially has a resting activity value.</Paragraph> <Paragraph position="5"> This activity value can be modified over time depending on the sentential input. Furthermore, there is an automatic activity decay scheme intended to represent memory processing which is based on the state of the information, whether it has reached threshold and fired or not.</Paragraph> <Paragraph position="6"> Activity is propagated in a fixed-time scheme to all &quot;connected&quot; aspects of the meaning of the words by spreading activation (Collins and Loftus, 1975; 1983; Hinton, 1981; Quillian, 1968/1980).</Paragraph> <Paragraph position="7"> Simultaneously, information interacts asynchronously due to threshold firing. 
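The activity dynamics just described, a resting level, exponential decay toward rest, summation of propagated input, and threshold firing, can be illustrated with a toy unit. The constants and names here are invented for the sketch and are not HOPE's actual parameters:

```python
# Illustrative "thresholding device with memory", loosely following the text.
# REST, THRESHOLD, and DECAY are invented constants, not HOPE's tuned values.

REST, THRESHOLD, DECAY = 0.1, 1.0, 0.5

class Unit:
    def __init__(self, name):
        self.name = name
        self.activity = REST
        self.fired = False

    def receive(self, amount):
        # Summed input over time: spread activation, firing effects, etc.
        self.activity += amount

    def step(self):
        # One time-locked interval: fire if at threshold, otherwise decay.
        if self.activity >= THRESHOLD and not self.fired:
            self.fired = True     # would enter the refractory state next
            return True           # signals asynchronous firing propagation
        # No firing: exponential decay back toward the resting level.
        self.activity = REST + (self.activity - REST) * DECAY
        return False

u = Unit("NOUN-boy")
u.receive(0.6); u.step()          # below threshold: decays, does not fire
u.receive(1.2); fired = u.step()  # summed input now crosses threshold
print(fired)
```

A unit that receives input in a single interval may still fail to fire; only summation across intervals (or a large combined input) pushes it over threshold, which is the behavior exploited by the feedforward and feedback effects discussed later.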
A state of threshold firing is realized as a result of summed inputs over time that are the result of the fixed-time spreading activation, other threshold firing or memory decay effects in combination.</Paragraph> <Paragraph position="8"> The time course of new information introduction, which initiates activity spread and automatic memory decay, is parameterized due to the underlying reason for designing such models (Gigley, 1982b; 1983; 1985).</Paragraph> <Paragraph position="9"> The exact serial-order processes that occur at any time-slice of the process depend on the &quot;current state&quot; of the global information; they are context dependent. The serial-order processes include: (1) NEW-WORD-RECOGNITION: Introduction of the next phonetically recognized word in the sentence.</Paragraph> <Paragraph position="10"> (2) DECAY: Automatic memory decay exponentially reduces the activity of all active information that does not receive additional input. It is an important part of the neural processes that occur during memory processing.</Paragraph> <Paragraph position="11"> (3) REFRACTORY-STATE-ACTIVATION: An automatic change of state that occurs after active information has reached threshold and fired. In this state, the information cannot affect or be affected by other information in the system.</Paragraph> <Paragraph position="12"> (4) POST-REFRACTORY-STATE-ACTIVATION: An automatic change of state which all fired information enters after it has existed in the REFRACTORY-STATE. The decay rate is different from that before firing, although still exponential.</Paragraph> <Paragraph position="13"> (5) MEANING-PROPAGATION: Fixed-time spreading activation to the distributed parts of recognized words' meanings.</Paragraph> <Paragraph position="14"> (6) FIRING-INFORMATION-PROPAGATION: Asynchronous activity propagation that occurs when information reaches threshold and fires.</Paragraph> <Paragraph position="15"> It can be INHIBITORY and EXCITATORY in its effect. 
INTERPRETATION results in activation of a pragmatic representation of a disambiguated word meaning.</Paragraph> <Paragraph position="16"> Processes (2) through (6) are applicable to all active information in the global representation, while process (1) provides the interface with the external input of the sentence to be understood. The state of the grammar representation affects inhibitory and excitatory firing propagation, and coordinates &quot;meaning&quot; interpretation with on-going &quot;input&quot; processing. It is in the interaction of the results of these asynchronous processes that the process of comprehension is simulated.</Paragraph> <Paragraph position="17"> 3. The Role of a Grammar in Cognitive Processing</Paragraph> </Section> <Section position="2" start_page="325" end_page="325" type="sub_section"> <SectionTitle> Models </SectionTitle> <Paragraph position="0"> Within our behavioral approach to studying natural language processing, several considerations must be met. Justification must be made for separate representations of information and, whenever possible, neural processing support must be found.</Paragraph> </Section> <Section position="3" start_page="325" end_page="325" type="sub_section"> <SectionTitle> 3.1 Evidence for a Separate Representation of Grammar </SectionTitle> <Paragraph position="0"> Neurolinguistic and psycholinguistic evidence supports a separately interpretable representation for a grammar. 
The neurolinguistic literature demonstrates that the grammar can be affected in isolation from other aspects of language function.</Paragraph> <Paragraph position="1"> (Cf. studies of agrammatic and Broca's aphasia as described in Goodenough, Zurif, and Weintraub, 1977; Goodglass, 1976; Goodglass and Berko, 1960; Goodglass, Gleason, Bernholtz, and Hyde, 1970; Zurif and Blumstein, 1978).</Paragraph> <Paragraph position="2"> In the HOPE model, this separation is achieved by including all relevant grammatical information within a space or hypergraph called the grammar. The associated interpretation functions for each grammatical type provide the interface with the pragmatic representation. Before describing the nature of the local representation of the currently included grammar, a brief discussion of the structure of the grammar and the role of the grammar in the global nature of the control must be given.</Paragraph> </Section> <Section position="4" start_page="325" end_page="326" type="sub_section"> <SectionTitle> 3.2 The Local Representation of the Grammar </SectionTitle> <Paragraph position="0"> The grammar space contains the locally defined grammar for the process. The current model defined within the HOPE system includes a form of a Categorial Grammar (Ajdukiewicz, 1935; Lewis, 1972). Although the original use of the grammar is not heeded, the relationship that ensues between a well defined syntactic form and a &quot;final state&quot; meaning representation is borrowed.</Paragraph> <Paragraph position="1"> Validity of the &quot;final state&quot; meaning is not the issue. Final state here means at the end of the process. As previously mentioned, typed semantics is also not rigidly enforced in the current model.</Paragraph> <Paragraph position="2"> HOPE allows one to define a lexicon within user-selected syntactic types, and allows one to define a suitable grammar of the selected types in the prescribed form as well. 
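A user-defined lexicon and grammar of this kind might be sketched as follows. The definition syntax is hypothetical (the paper does not show HOPE's actual notation); only the category names and the example words come from the text:

```python
# Hypothetical sketch of defining a lexicon and a categorial grammar in the
# spirit of HOPE; the actual definition syntax is not given in the paper.

# Lexicon: each phonetic word maps to one or more typed meanings.
LEXICON = {
    "TH-UH": [("DET", "the")],
    "B-OY":  [("NOUN", "boy")],
    "S-AO":  [("NOUN", "saw"), ("VTP", "see")],   # ambiguous: noun or verb
}

# Grammar: for each category, the category it predicts next in the input and
# the derived category its correct interpretation yields.
GRAMMAR = {
    "DET":  {"predicts": "NOUN", "derives": "TERM"},
    "VTP":  {"predicts": "TERM", "derives": "VIP"},
    "NOUN": {"predicts": None,   "derives": None},
}

def categories_of(word):
    return [cat for cat, _ in LEXICON.get(word, [])]

print(categories_of("S-AO"))  # both meanings are activated in parallel
```

Because the lexicon and grammar are data, not procedures, swapping in other categories, lexical items, or interpretations leaves the processing machinery untouched, which is the flexibility the text claims for HOPE.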
The grammar may be defined to suit the aspects of language performance being modelled.</Paragraph> <Paragraph position="3"> There are two parts to the grammatical aspect of the HOPE model. One is a form of the structural co-occurrences that constitute context free phrase structure representations of grammar.</Paragraph> <Paragraph position="4"> However, these specifications only make one &quot;constituent&quot; prediction for subsequent input types, where each constituent may have additional substructure. Predictions at this time do not spread to substructures because of the &quot;time&quot; factor between computational updates that is used. A spread to substructures will require a refinement in time-sequence specifications.</Paragraph> <Paragraph position="5"> The second aspect of the representation is an interpretation function for each specified syntactic type in the grammar definition. Each interpretation function is activated when a word meaning fires for whatever reason. The interpretation function represents a firing activation level for the &quot;concept&quot; of the meaning and includes its syntactic form. For this reason, each syntactic form has a unique functional description that uses the instantiated meaning that is firing (presently, the spelling notation) to activate structures and relations in the pragmatic space that represent the &quot;meaning understood.&quot; Each function activates different types of structures and relations, some of which depend on prior activation of other types to complete the process correctly. These functions can trigger semantic feature checks and morphological matches where appropriate.</Paragraph> <Paragraph position="6"> Syntactic types in the HOPE system are of two forms, lexical and derived. A lexical category type is one which can be a category type of a lexical item. 
A derived category type is one which is &quot;composed.&quot; Derived category types represent the occurrence of proper &quot;meaning&quot; interpretation in the pragmatic space.</Paragraph> <Paragraph position="7"> The current represented grammar in HOPE contains the following lexical categories: DET for determiner, ENDCONT for end of sentence intonation, NOUN for common noun, PAUSE for end of clause intonation, TERM for proper nouns, VIP for intransitive verb, VTP for transitive verb. As is seen, the lexical &quot;categories&quot; relate &quot;grammatical&quot; structure to aspects of the input signal; hence, in this sense ENDCONT and PAUSE are categories.</Paragraph> <Paragraph position="8"> The derived categories in the current instantiated model include: SENTENCE, representing a composition of agent determination of a TERM for an appropriate verb phrase, TERM, representing a composed designated DET NOUN referent, and VIP, representing the state of proper composition of a TERM object with a VTP, transitive verb sense.</Paragraph> <Paragraph position="9"> TERM and VIP are examples of category types in this model that are both lexical and derived.</Paragraph> <Paragraph position="10"> &quot;Rules&quot; in the independently represented grammar are intended to represent what is considered in HOPE as the &quot;syntactic meaning&quot; of the respective category. They are expressed as local interactions, not global ones. Global effects of grammar, the concern of many rule based systems, can only be studied as the result of the time sequenced processing of an &quot;input&quot;. Table 1 contains examples of &quot;rules&quot; in our current model. 
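The compositional reading of such rules can be sketched in code. This is a toy rendering, not HOPE's actual rule encoding, and the rule set shown is only the fragment of categories named in the text:

```python
# Toy sketch of categorial composition in the spirit of the rules discussed:
# a defined category composes with an adjacent type to yield a derived type.
# Not HOPE's actual encoding; rule set limited to categories named in the text.

RULES = {
    ("DET", "NOUN"): "TERM",     # determiner + common noun -> composed referent
    ("VTP", "TERM"): "VIP",      # transitive verb + object TERM -> verb phrase
    ("TERM", "VIP"): "SENTENCE", # agent TERM + verb phrase -> sentence
}

def compose(left, right):
    # Returns the derived category, or None if composition is blocked
    # (in the full model, by feature or morphological mismatches).
    return RULES.get((left, right))

term = compose("DET", "NOUN")
print(compose(term, compose("VTP", "TERM")))
```

Note that each rule is a purely local pairing; any global parse emerges only from the time-sequenced interaction of these local compositions, as the text emphasizes.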
Other categories may be defined; other lexical items defined; other interpretations defined within the HOPE paradigm.</Paragraph> <Paragraph position="12"> In Table 1, the &quot;numerator&quot; of the specification is the derived type which results from composition of the &quot;denominator&quot; type interpretation with the interpretation of the category whose meaning is being defined. For example, DETerminer, the defined category, combines with a NOUN category type to produce an interpretation which is a TERM type. When a category occurs in more than one place, any interpretation and resultant activity propagation of the correct type may affect any &quot;rule&quot; in which it appears. Effects are in parallel and simultaneous. Interpretation can be blocked for composition by unsuccessful matches on designated attribute features or morphological inconsistencies.</Paragraph> <Paragraph position="13"> Successful completion of function execution results in a pragmatic representation that will either fire immediately if it is non-compositional or in one time delay if the &quot;meaning&quot; is composed. Firing is of the syntactic type that represents the correctly &quot;understood&quot; entity. This &quot;top-down&quot; firing produces feedback activity whose effect is &quot;directed&quot; by the state of the grammar space, i.e. what information is active and its degree of activity.</Paragraph> <Paragraph position="14"> The nature of the research in its present state has not addressed the generality of the linguistic structures it can process. This is left to future work. The concentration at this time is on initial validation of model produced simulation results before any additional effort on expansion is undertaken. 
With so many assumptions included in the design of such models, initial assessment of the model's performance was felt to be more critical than its immediate expansion along any of the possible dimensions previously noted as stubs.</Paragraph> <Paragraph position="15"> The initial investigation is also intended to suggest how to expand these stubs.</Paragraph> </Section> <Section position="5" start_page="326" end_page="327" type="sub_section"> <SectionTitle> 3.3 The Grammar as a Feedback Control System </SectionTitle> <Paragraph position="0"> The role of the grammar as it is encoded in HOPE is to function in a systems theoretic manner.</Paragraph> <Paragraph position="1"> It provides the representation of the feedforward, or prediction, and feedback, or confirmation, interconnections among syntactic entities which have produced appropriate entities as pragmatic interpretations. It coordinates the serial-order expectations with what actually occurs in the input signal, and with any suitable meaning interpretations that can affect the state of the process in a top-down sense. It represents the interface between the serial-order input and the parallel functioning system.</Paragraph> <Paragraph position="2"> Grammatical categories are activated via spreading activation that is the result of word meaning activation as words are recognized.</Paragraph> <Paragraph position="3"> Firing of an instance of a grammatical type activates that type's interpretation function, which results in the appropriate pragmatic interpretation for it, including the specific meaning that was fired.</Paragraph> <Paragraph position="4"> Interpretation functions are defined for syntactic types, not specific items within each type. Each type interpretation has one form with specific lexical &quot;parameters&quot;. All nouns are interpreted the same; all intransitive verbs the same. What differs in interpretation is the attributes that occur for the lexical item being interpreted. 
These also affect the interpretation. The meaning representation for all instances of a certain category has the same metastructure. General nouns (NOUN) are presently depicted as nodes in the pragmatic space. The node name is the &quot;noun meaning.&quot; For transitive verbs, nodes named as the verb stem are produced with a directed edge designating the appropriate TERM category as agent. The effect of firing of a grammatical category can trigger feature propagations or morphological checks depending on which category fires and the current pragmatic state of the on-going interpretation.</Paragraph> <Paragraph position="5"> Successful interpretation results in threshold firing of the &quot;meaning.&quot; This &quot;meaning&quot; has a syntactic component which can affect grammatical representations that have an activity value. This process is time constrained depending on whether the syntactic type of the interpretation is lexical or derived.</Paragraph> </Section> <Section position="6" start_page="327" end_page="327" type="sub_section"> <SectionTitle> 3.4 Spreading Activation of the Grammar </SectionTitle> <Paragraph position="0"> Input to HOPE is time-sequenced, as phonetically recognized words (a stub for future development). Each phonetic &quot;word&quot; activates all of its associated meanings. (HOPE uses homophones to access meanings.) Using spreading activation, the syntactic category aspect of each meaning in turn activates the category's meaning in the grammar space representation.</Paragraph> <Paragraph position="1"> Part of the grammatical meaning of any syntactic category is the meaning category that is expected to follow it in the input. 
The other part of the grammatical meaning for any category type is the type it can derive by its correct interpretation within the context of a sentence.</Paragraph> <Paragraph position="2"> Because each of these predictions and interpretations is encoded locally, one can observe interactions among the global &quot;rules&quot; of the grammar during the processing. This is one of the motivating factors for designing the neurally motivated model, as it provides insights into how processing deviations can produce degraded language performance.</Paragraph> </Section> <Section position="7" start_page="327" end_page="327" type="sub_section"> <SectionTitle> 3.5 Grammar State and Its Effect on Processing </SectionTitle> <Paragraph position="0"> Lexical category types have different effects than derived ones with respect to timing and pragmatic interpretation. However, both lexical and derived category types have the same effect on the subsequent input. This section will describe the currently represented grammar and provide example processing effects that arise due to its interactive activation.</Paragraph> <Paragraph position="1"> Through spreading activation, the state of the syntactic types represented in the grammar affects subsequent category biases in the input (feedforward) and on-going interpretation or disambiguation of previously &quot;heard&quot; words (feedback). The order of processing of the input appears to be both right to left and left to right. Furthermore, each syntactic type, on firing, triggers the interpretation function that is particular to each syntactic type.</Paragraph> <Paragraph position="2"> Rules, as previously discussed, are activated during processing via spreading activation. Each recognized word activates all &quot;meanings&quot; in parallel. Each &quot;meaning&quot; contains a syntactic type. 
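One feedforward step of this kind can be sketched as follows. The function name, the boost constant, and the two-entry prediction table are illustrative assumptions, not HOPE's actual mechanism or tuning:

```python
# Toy sketch of one feedforward step: a recognized word activates all of its
# typed meanings, and each type boosts the category the grammar says should
# come next. Names and the boost constant are illustrative only.

GRAMMAR_PREDICTS = {"DET": "NOUN", "VTP": "TERM"}   # e.g., DET predicts NOUN

def feedforward(word_meanings, category_activity, boost=0.3):
    # word_meanings: list of (category, meaning) pairs for the heard word
    for category, _ in word_meanings:
        predicted = GRAMMAR_PREDICTS.get(category)
        if predicted is not None:
            # sub-threshold predictive activation of the expected category
            category_activity[predicted] = category_activity.get(predicted, 0.0) + boost
    return category_activity

activity = feedforward([("DET", "the")], {})
print(activity)   # NOUN meanings of the next word start with a head start
```

The boost stays below threshold on its own; it only biases the competition when the next word's own activation is added in, which is the feedforward effect the following sections walk through.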
Spreading activation along &quot;syntactic type associates&quot; (defined in the grammar) predictively activates the &quot;expected&quot; subsequent categories in the input.</Paragraph> <Paragraph position="3"> In the HOPE model, spreading activation currently propagates this activity, which is not at the &quot;threshold&quot; level. Propagated activity due to firing is always a parameter controlled percentage of the above threshold activity and in the presently &quot;tuned&quot; simulations always propagates a value that is under threshold by a substantial amount.</Paragraph> <Paragraph position="4"> All activations occur in parallel and affect subsequent &quot;meaning&quot; activities of later words in the sentence. In addition, when composition succeeds (or pragmatic interpretation is finalized) the state of the grammar is affected to produce changes in category aspects of all active meanings in the process.</Paragraph> <Paragraph position="5"> The remainder of this section will present instances of the feedforward and feedback effects of the grammar during simulation runs to illustrate the role of grammar in the process. The last example will illustrate how a change in state of the grammar representation can affect the process. All examples will use snapshots of the sentence: &quot;The boy saw the building.&quot; This is input phonetically as: (TH-UH B-OY S-AO TH-UH B-IH-L-D-IH-NG).</Paragraph> </Section> <Section position="8" start_page="327" end_page="328" type="sub_section"> <SectionTitle> 3.5.1 An Example of Feedforward, Feedback, and Composition </SectionTitle> <Paragraph position="0"> This example will illustrate the feedforward activation of NOUN for the DETerminer grammatical meaning during interpretation of the initial TERM or noun phrase of the sentence. All figures are labelled to correspond with the text. Each interval is labelled at the top, t1, t2, etc. The size of each node reflects the activity level; larger means more active. 
Threshold firing is represented as F. Other changes of state that affect memory are denoted by additional markers and are shown for completeness. They indicate serial-order changes of state described earlier, but are not critical to the following discussion.</Paragraph> <Paragraph position="2"> Figure 1 On &quot;hearing&quot; /TH-UH/ (a) at t1, the represented meaning &quot;DET-the&quot; is activated as the only meaning (b). At the next time interval, t2, the meaning of DET is activated, which spreads activity to what DET predicts, a NOUN (c). All NOUN meanings are activated by spread in the next time interval, t3, in combination with new activity.</Paragraph> <Paragraph position="3"> This produces a threshold which &quot;fires&quot; the &quot;meaning&quot; selected (d). At completion of interpretation (e), in t4, feedback occurs to all instances of meaning with category types in the grammar associated as predictors of the interpreted category. DET is the only active category that predicts NOUN, so all active meanings of type DET will receive the feedback activity. In Figure 1, DET-the is ready to fire (f). The increase or decrease in activity of all related types, competitive ones for the meaning (inhibitory) (g) as well as syntactic ones for composition (excitatory) (f), is propagated at the next interval after firing, shown in t3 and t4. In t5, /S-AO/ enters the process (h) with its associated meanings. The effect of DET-the firing is also seen in t5 where the compositional TERM is activated (i).</Paragraph> <Paragraph position="4"> NOTE: DETerminers are not physically represented as entities in the pragmatic space. Their meaning is only functional and has a &quot;semantic&quot; compositional effect. Here 'the' requires a &quot;one and only one&quot; NOUN that is unattached as a TERM to successfully denote the meaning of the boy as a proper TERM (i). As this is a compositional &quot;meaning&quot;, the firing will affect t6. 
Because there is no active TERM prediction in the grammar space, and no competitive meanings, the top-down effect in t6 will be null and is not shown. The next example will illustrate a top-down effect following TERM composition.</Paragraph> </Section> <Section position="9" start_page="328" end_page="330" type="sub_section"> <SectionTitle> 3.5.2 An Example of Feedforward, Feedback, </SectionTitle> <Paragraph position="0"> Composition, and Subsequent Feedback This example, shown in Figure 2, will be very similar to the previous one. Only active information discussed is shown, as otherwise the figures become cluttered. The grammar is in a different state in Figure 2 when successful TERM interpretation occurs at t11 (a). This is due to the activation at t9 of all meanings of B-IH-L-D-IH-NG (b).</Paragraph> <Paragraph position="1"> The VTP meanings of /S-AO/ and then /B-IH-L-D-IH-NG/ make a TERM prediction shown as it remains in t10 (c). After composition of &quot;the building&quot; (a) shown in t11, TERM will fire top-down. It subsequently, through feedback, activates all meanings of the category type which predicted the TERM, all VTP type meanings in this case. This excitatory feedback raises both VTP meanings in t12, for saw (d), as well as building (e). However, the activity level of &quot;building&quot; does not reach threshold because of previous disambiguation of its NOUN meaning. When the VTP meaning, saw, fires (d) in t12, additional composition occurs. The VTP interpretation composes with a suitable TERM (a), one which matches feature attribute specifications of saw, to produce a VIP type at t13; this will subsequently produce feedback at t14. Neither is shown.</Paragraph> <Paragraph position="2"> The final example, Figure 3, will use one of the &quot;lesion&quot; simulations using HOPE. The grammar representations remain intact. 
This example will present the understanding of the first three words of the sentence under the condition that they are presented faster than the system is processing. Effectively, a slow-down of activation spread to the grammar is assumed. Figures such as Figure 1 and Figure 3 can be compared to suggest possible language performance problems and to gain insights into their possible causes.</Paragraph> <Paragraph position="3"> In Figure 3, when /TH-UH/ is introduced at t1 (a), all meanings are activated (b) as in Figure 1. The spread of activation to the grammar occurs in t2 (c). However, the second word, /B-OY/ (d), is &quot;heard&quot; at the same time as the activity reaches the grammar. The predictive activation spread from the grammar takes effect at t3, when the new word /S-AO/ (e) is &quot;heard.&quot; The immediate result is that the NOUN meaning, saw (f), fires and is interpreted at t4 (g).</Paragraph> <Paragraph position="4"> This shows, in a very simple case, how the grammar can affect the processing states of an interactive parallel model. Timing can be seen to be critical. There are many more critical results that occur in such &quot;lesion&quot; simulations that better illustrate such grammatical effects; however, they are very difficult to present in a static form, other than within a behavioral analysis of the overall linguistic performance of the entire model. This is considered a hypothesized patient profile and is described in Gigley (1985). Other examples of processing are presented in detail in Gigley (1982b; 1983).</Paragraph> </Section> <Section position="10" start_page="330" end_page="330" type="sub_section"> <SectionTitle> 3.6 Summary </SectionTitle> <Paragraph position="0"> The above figures present a very simple example of the interactive process. It is hoped that they provide an idea of the interactions and feedback, feedforward processing that is coordinated by the state of the grammar. 
Any prediction in the grammar that is not sufficiently active affects the process. Any decay that accidentally reduces a grammatical aspect can affect the process. The timing of activation, the categorial content, and the interactions between interpretation and prediction are important factors when one considers grammar as part of a functioning dynamic system.</Paragraph> <Paragraph position="1"> Finally, the Categorial Grammar is one form of a Context-Free (CF) grammar which provides a suitable integration of syntactic and semantic processing. In addition, it has been used in many studies of English, so that instances of grammars sufficiently defined for the current implementation level of processing could be found. Other forms of grammar, such as Lexical-Functional Grammar (Kaplan and Bresnan, 1982) or Generalized Phrase Structure Grammar (Gazdar, 1982; 1983) could be equally suitable.</Paragraph> <Paragraph position="2"> The criteria to be met are that they can be encoded as predictive mechanisms, not necessarily unambiguous or deterministic, and also that they specify constraints on compositionality. The composition depends on adequate definition of interpretation constraints to assure that it is &quot;computed&quot; properly or else suitably marked for its deviation.</Paragraph> </Section> </Section> </Paper>