META-COMPILING TEXT GRAMMARS AS A MODEL FOR HUMAN BEHAVIOR

Sheldon Klein
Computer Sciences Department, University of Wisconsin

I. BACKGROUND

In our efforts to model the totality of synchronic and diachronic language behavior in complex social groups, we developed a meta-symbolic simulation system that includes a powerful behavioral simulation programming language that models, generates and manipulates events in the notation of a semantic network that changes through time, and a generalized, semantics-to-surface structure generation mechanism that can describe changes in the semantic universe in the syntax of any natural language for which a grammar is supplied. Because the system is a meta-theoretical device, it can handle generative semantic grammars formulated within a variety of theoretical frameworks. A key feature of the system is that the semantic deep structure of the non-verbal, behavioral rules may be represented in the same network notation as the semantics for natural language grammars, and, as a consequence, provide non-verbal context for linguistic rules. We are also experimenting with a natural language meta-compiling capability, that is, the use of the semantic network to generate productions in the simulation language itself -- productions in the form of "texts" that may themselves be compiled as new behavioral rules during the flow of the simulation -- rules that may themselves control the process of deriving new rules. This feature permits non-verbal behavioral rules to be derived from natural language conversational inputs, through inference techniques identical with those for inferring natural language generative semantic grammars. The total system has the power of at least the 2nd order predicate calculus, and will facilitate the formulation of highly abstract meta-models of discourse, including the logical quantification of such models.

Achievements with the generative portion of the system include a text grammar model that generates 2100-word murder mystery stories in less than 19 seconds each, complete with calculation of the plot and specification of the deep structure as well as the surface syntax (Klein et al. 1973). This generation is 100 to 1000 times faster than other existing programs using transformational grammars. (The algorithm for the semantics-to-surface structure generative component is such that processing time increases only linearly as a function of sentence length and syntactic complexity.) More recent achievements include models of portions of Levi-Strauss' mythology work in The Raw and the Cooked (Levi-Strauss 1969) and a model for Propp's Morphology of the Folktale (Propp 1968) which generated 50 Russian fairytales, according to the rules of his text grammar, at an average speed of 128 words a second, again including plot computations and specification of deep structure as well as surface syntax (Klein et al. 1974, Klein et al. 1975).

Our earliest automatic text generation work used syntactic dependency networks with 2-valued labelling of edges as an approximation to semantic networks with multi-valued labelling of edges (Klein & Simmons 1963, Klein 1965a, 1965b). Our work on automatic inference of grammars includes the world's first program for learning context-free, phrase structure grammars, for both natural and artificial languages, and the first program for learning transformational grammars (Klein 1967, Klein et al. 1968, Klein & Kuppin 1970). More recent inference work includes the formulation of techniques for automatic inference of generative semantic grammars (Klein 1973) and for the ontogeny of Pidgin and Creole languages (Klein & Rozencvejg 1974). In formulating components for automatic inference of rules in the meta-symbolic simulation system, we find that the common notation for the semantics of the non-verbal behavioral simulation rules and natural language means that the same learning heuristics may be used to infer behavioral rules as well as linguistic rules. The implication is that the totality of human verbal and non-verbal behavior, in complex social groups, both synchronically and diachronically, may now be modelled within the same notational framework. What for us started as a generalized device for testing varying theoretical models as part of an effort to model language change and variation (Klein 1974a, Klein & Rozencvejg 1974) now appears as the basis for a higher level theory of the linguistic basis of human behavior (Klein 1974b).
 
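The semantics-to-surface generation step described above can be suggested with a minimal sketch. The triple notation, the template rules, and the function name `generate` are illustrative assumptions of ours, not the actual notation of the simulation system; the sketch shows only the shape of the idea: one pass over the event nodes of a semantic network yields surface text, so generation time grows linearly with the amount of semantic content.

```python
# A minimal, hypothetical sketch of semantics-to-surface generation.
# The network notation and rule format are our illustrative assumptions.

# Semantic network as (node, relation, node) triples.
network = [
    ("wolf", "agent-of", "plunder"),
    ("plunder", "patient", "sheep"),
]

# Generation rules: map each event predicate to a surface template.
templates = {
    "plunder": "THE {agent} PLUNDERS THE {patient}.",
}

def generate(network, templates):
    """Realize each event node of the network as a surface sentence.

    One rule application per event node keeps processing time linear
    in the number of events, echoing the linearity claim in the text.
    """
    # Index the relations once, by event node.
    agents = {o: s for (s, r, o) in network if r == "agent-of"}
    patients = {s: o for (s, r, o) in network if r == "patient"}
    sentences = []
    for event, template in templates.items():
        sentences.append(template.format(
            agent=agents.get(event, "someone").upper(),
            patient=patients.get(event, "something").upper(),
        ))
    return " ".join(sentences)

print(generate(network, templates))  # prints: THE WOLF PLUNDERS THE SHEEP.
```

A real grammar would of course interpose deep-structure and syntactic stages between network and string; the sketch collapses these into a single template step.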
II. WHAT IS A TEXT GRAMMAR?
 
The text grammarian movement, centered in Germany and Holland, includes work such as that of van Dijk, Ihwe, Petöfi and Rieser (1972), Petöfi and Rieser (1973), Petöfi (1973), van Dijk (1973), and van Dijk and Petöfi (1974). The underlying motivation of this group is the belief that Chomskian-derived linguistic theories are inadequate to handle the complexities of narrative and discourse -- that more powerful logical devices are needed. An attempted refutation of the text grammarian position appeared in Dascal & Margalit (1974). Our own work on Propp and Levi-Strauss models refutes the refutation by demonstration (Klein et al. 1974). To provide the reader with an intuitive view of the nature of a text grammar, we offer the following two Russian fairytales generated by our automated model of Propp (Klein et al. 1974). The same text grammar generated both stories from a structural model at a level of abstraction that provided a semantic unification of the apparent surface diversity.
 
Tale 1

THE BORISIEVICHES LIVE IN A DISTANT PROVINCE. THE FATHER IS EMELYA. THE ONLY SON IS BORIS. MARTHA IS THE ONLY DAUGHTER. EMELYA HAS THE SHEEP. BORIS, MARTHA AND THE SHEEP ARE IN THE WOODS. BORIS SAYS MARTHA, DO NOT LEAVE THE WOODS. BORIS LEAVES TO GO BERRY GATHERING. MARTHA LEAVES THE WOODS. A WOLF APPEARS IN THE DISTANT PROVINCE. EMELYA ASKS THE WOLF WHERE IS YOUR WISDOM. THE WOLF SAYS THAT MY WISDOM IS IN A MAGIC EGG. THE WOLF PLUNDERS THE SHEEP. EMELYA SENDS MARTHA TO SEARCH FOR THE WOLF. MARTHA DECIDES TO SEARCH FOR THE WOLF. MARTHA LEAVES ON A SEARCH. MARTHA MEETS A WITCH ALONG THE WAY. THE WITCH PROPOSES THAT MARTHA LISTEN TO THE GUSLA WITHOUT FALLING ASLEEP. MARTHA RESPONDS BY STAYING AWAKE WHILE LISTENING TO THE GUSLA. A MAGIC WAFER IS CONSUMED BY MARTHA. MARTHA OBTAINS SUPER-HUMAN STRENGTH. MARTHA TRAVELS TO THE LOCATION OF THE WOLF IN ANOTHER KINGDOM. MARTHA IS DIRECTED BY A HEDGEHOG. MARTHA FINDS THE WOLF. THEY FIGHT IN AN OPEN FIELD. MARTHA IS WOUNDED. MARTHA DEFEATS THE WOLF WITH THE AID OF SUPER-HUMAN STRENGTH. THE WOLF IS CAUGHT BY MARTHA. MARTHA STARTS BACK HOME. MARTHA RETURNS HOME.
 
Tale 2

THE MOREVNAS LIVE IN A DISTANT PROVINCE. THE FATHER IS EREMA. THE MOTHER IS VASILISA. THE OLDEST SON IS BALDAK. THE YOUNGER SON IS MARCO. THE YOUNGEST SON IS BORIS. THE OLDEST DAUGHTER IS MARIA. THE YOUNGER DAUGHTER IS KATRINA. THE YOUNGEST DAUGHTER IS MARTHA. NICHOLAS ALSO LIVES IN THE SAME LAND. NICHOLAS IS OF MIRACULOUS BIRTH. BALDAK HAS A MAGIC STEED. A BEAR APPEARS IN THE DISTANT PROVINCE. THE BEAR SEIZES THE MAGIC STEED. BALDAK CALLS FOR HELP FROM NICHOLAS. NICHOLAS DECIDES TO SEARCH FOR THE MAGIC STEED. NICHOLAS LEAVES ON A SEARCH. NICHOLAS MEETS A JUG ALONG THE WAY. THE JUG IS FIGHTING WITH ELENA OVER A MAGIC BOW. THE JUG ASKS NICHOLAS TO DIVIDE THE MAGIC BOW. NICHOLAS TRICKS THE DISPUTANTS INTO LEAVING THE MAGIC BOW UNPROTECTED. THE MAGIC BOW, A MAGIC CARPET AND A MAGIC BOX ARE SEIZED BY NICHOLAS. NICHOLAS TRAVELS TO THE LOCATION OF THE MAGIC STEED IN ANOTHER KINGDOM. NICHOLAS TRAVELS BY THE MAGIC CARPET. NICHOLAS FINDS THE BEAR. NICHOLAS SURPRISES THE BEAR. NICHOLAS KILLS THE BEAR WITH THE AID OF THE MAGIC BOW. THE MAGIC STEED APPEARS FROM THE MAGIC BOX. NICHOLAS STARTS BACK HOME. THE BEAR'S FATHER CHASES AFTER NICHOLAS. NICHOLAS ESCAPES BY FLYING ON A FALCON. NICHOLAS RETURNS HOME.

III. THE KEY QUESTION

We perceive the locus of theoretical interest to be the process of verbal and non-verbal behavior transmission across generations. Our work on modelling speech communities includes designs for simulations in which many modelled individuals, each with his own semantic network, his own grammar(s), his own behavior rules, interact with each other according to the modelled rules of the social structure of the society (Klein 1974a). It is our hope to be able to model the transmission process of all the rules in the system. This means that newly born modelled individuals will infer rules for natural language and also for non-verbal behavioral simulation rules, as a function of inputs of texts supplied by other modelled individuals. The texts may be verbal discourse, or non-verbal sequences of behavior. The learning individual will actually compile and recompile new versions of his own behavioral rules as the simulation process proceeds. His own test productions of behavior scenarios as well as natural language discourse will be subject to evaluation and possible correction by other members of the modelled community, and their reactions, as well as the consequences of the productions, will serve as a control on the entire learning process. And, as indicated earlier, the rules to be inferred, compiled and recompiled will include rules that govern the process of inference and compilation itself.
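The kind of Propp-style text grammar that produced the tales above can be suggested in miniature. The function inventory and surface realizations below are drastic simplifications of ours (Propp's scheme has 31 narrative functions), not the rules of the actual model; the point is only how a single structural model, realized against different casts of characters, yields surface-diverse tales.

```python
# A hypothetical miniature of a Propp-style text grammar: a fixed
# sequence of abstract narrative functions, each realized against a
# cast of characters bound at generation time.

FUNCTIONS = ["villainy", "departure", "struggle", "victory", "return"]

# Surface realization of each abstract function (illustrative only).
REALIZATIONS = {
    "villainy":  "A {villain} SEIZES THE {object}.",
    "departure": "{hero} LEAVES ON A SEARCH.",
    "struggle":  "{hero} FIGHTS THE {villain}.",
    "victory":   "{hero} DEFEATS THE {villain}.",
    "return":    "{hero} RETURNS HOME.",
}

def generate_tale(cast):
    """One structural model, many surface tales: the same function
    sequence is realized with whatever cast is supplied."""
    return " ".join(REALIZATIONS[f].format(**cast) for f in FUNCTIONS)

tale1 = generate_tale({"hero": "MARTHA", "villain": "WOLF", "object": "SHEEP"})
tale2 = generate_tale({"hero": "NICHOLAS", "villain": "BEAR", "object": "MAGIC STEED"})
```

The two tales share their abstract structure exactly, differing only in the bindings, which is the "semantic unification of apparent surface diversity" that the text describes.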
 
IV. LOGICAL QUANTIFICATION, SEMANTIC PARSING, PRESUPPOSITIONAL ANALYSIS

We have mentioned the 2nd order or higher predicate calculus. For our purposes, the essential feature is that the logical quantification of the rules may be quantified by the contents of the rules themselves. Meta-compiling of rules governing meta-compiling is an example of this process. There are other techniques available. The behavioral rules operate with high-level classes that make it possible to formulate rules that can treat complex actions as objects, characters and manifestations of the same abstract semantic unit. A major type of behavior rule modification and extension is the ability to requantify the rules as a heuristic function of experience. The process does not involve recompilation -- rather modification of the domain of applicability of an existing rule.

One of the types of semantic parsing possible in the system is the determination of the presuppositions of the semantic content of input text. The scenario rules that could have generated the text have preconditions, and these preconditions also have their own preconditions as specified by other rules. In cases where the semantic content of an input text is not potentially derivable from existing behavioral rules, the system can posit requantification (assignments and reassignments to semantic classes) to make the input text derivable. Or, if necessary, the same end can be achieved by compiling new rules that would make the text plausible. Generalization of the method makes it possible to build complex learning models for highly abstract, semantically driven text grammars. Perhaps the ultimate test is the modelling of the heuristic processes of Levi-Strauss. We hope to be able to build a model that learns text grammars with arbitrarily abstract semantics such as that manifested in Levi-Strauss (1969). At the moment, we are working on modelling the text grammar he himself has derived (Klein et al. 1975). The potential of our work is to handle a degree and kind of abstraction in semantics heretofore untouched by linguistics, including the modelling of the automatic creation of text grammars for dreams and myths as a function of cultural rules.

V. GENERALITY OF THE META-SYMBOLIC SIMULATION SYSTEM AS A THEORY TESTING DEVICE
 
Our methodology and programming style have yielded a system wherein all the rules, and even the form of the theories in which they are cast, are input as data. As far as we can determine, this permits us to encode in our system virtually all the theoretical models currently prevalent in linguistics, plus heretofore unformulated models of vastly greater power. (Preliminary work in the classroom, for example, indicates that models of the work of Schank and his students may easily be implemented in our system, with an increased speed of execution of about 50 to 1 in favor of our versions.)

VI. THE METHODOLOGICAL SIGNIFICANCE OF OUR WORK

Our work over the years has suggested and reinforced the following methodological principles:

1. No significant theories can be formulated in Linguistics that are not computer based.

2. The theoretical foundations of Computer Science are identical with those of Linguistics.

3. Theoretical linguistic models that are not strongly linked to objective tasks are meaningless. No semantics is meaningful except in terms of the objective tasks it facilitates.

4. The future of Linguistics, Computational Linguistics, Artificial Intelligence, and psychological models of human behavior lies in the future of the Foundations of Programming Languages and the Theory of Operating Systems. The human mind is at least as complicated as an operating system for a 4th generation computer.

5. An adequate linguistic theory must account for the function of language in social groups and its transmission through time and space. At the same time, such a theory must account for the highest semantic attainments of the human mind, including literature and art, and, in fact, the totality of symbolic processes.

6. Input/output equivalence of model and modelled does not imply isomorphism between model and modelled. (Chomskian beliefs to the contrary have their roots in Leibniz' Theory of Monads and its required ontological argument.) There are no models of performance, only models of competence which can be compared, one against the other, for accuracy in predicting relations between input and output in real world systems.

VII. THEORETICAL IMPLICATIONS

A. The Non-innateness of Human Mental Structures

Our work constitutes a refutation by counterexample of the necessity for a correlation between models of human mental structures and the structure of the human brain. (A software system can operate with no inherent isomorphisms with a particular computer.) Nothing need be innate except the meta-compiling capacity and the perception of time. Our work suggests the logical possibility that the human mind can learn to learn, and learn how to learn to learn, and that each human may do it differently. The basic principles of language inference, which can be derived from a behavioristic psychological framework, can alone account for the structuring of mental processes as a software phenomenon, independent of physiological reality. It follows that humans can have different rules, different data structures, different hierarchical organizations, where the only controlling factor is the requirement that the internalized models permit the individuals to function and interact with the inputs and outputs of other individuals in a social group.

B. History as the Meta-language of Linguistics

Implicit in our approach is an alternative to the concept of an infinite hierarchy of meta-languages, as formulated by Bertrand Russell in his Theory of Types in Principia Mathematica (Whitehead & Russell 1911-1913). The concept of successive states of time, each linked with the possibility of defining (meta-compiling) new rules of the universe for the next state (including the rules for defining new rules), suggests that there need be only a single meta-language and a single language in any state at any point in time, and that each serves, in turn, as the meta-language for the other in successive time frames. This is not a stochastic process. It is the concepts of time and meta-compiling that appear to be the fundamental aspects of human cognition. The principle may be universal for all human behavioral/symbolic processes, and students of the philosophy of history will now recognize our meta-symbolic simulation system as equivalent to an automated Hegelian dialectic philosophy which specifies that each successive state of historical development is controlled by the meta-language of its previous state, and becomes the meta-language of its successor state.
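The alternation of language and meta-language across successive time states can be suggested with a small sketch. Representing rules as plain functions, and the names `make_initial_rules` and `run`, are our illustrative assumptions, not the notation of the simulation system; the sketch shows only the core loop in which the rules of state t compile the rules of state t+1.

```python
# A hypothetical sketch of meta-compiling over successive time states:
# each state's rule set contains a meta-rule that produces the rule
# set of the next state.

def make_initial_rules():
    # One object-level rule plus one meta-rule that derives the next state.
    return {
        "greeting": "HELLO",
        "compile_next": lambda rules: {
            "greeting": rules["greeting"] + "!",   # object rule revised
            "compile_next": rules["compile_next"],  # meta-rule carried forward
        },
    }

def run(states):
    """Step the universe forward: at each state, the current rules act
    as the meta-language that compiles the rules of the next state."""
    rules = make_initial_rules()
    history = [rules["greeting"]]
    for _ in range(states):
        rules = rules["compile_next"](rules)  # state t compiles state t+1
        history.append(rules["greeting"])
    return history

print(run(3))  # prints: ['HELLO', 'HELLO!', 'HELLO!!', 'HELLO!!!']
```

In a fuller model the meta-rule would itself be subject to revision (rules for defining new rules), giving the two-level alternation described in the text; here it is held fixed for brevity.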
 
REFERENCES

Klein, S. 1973. Automatic inference of semantic deep structure rules in generative semantic grammars. Univ. of Wisconsin Comp. Sci. Dept. Tech Report 180. Also in 1974. Computational and Mathematical Linguistics, Proceedings of the Int. Conf. on Computational Linguistics, Pisa, 1973. A. Zampolli, ed., Florence: Olschki.

Klein, S. 1974a. Computer simulation of language contact models. In Towards Tomorrow's Linguistics, Shuy & Bailey, editors, Washington, D.C.: Georgetown University Press.

Klein, S. 1974b. A computer model for the linguistic basis of the transmission of culture. Presented at 1974 Meeting of the American Anthropological Association, Mexico City, Nov. 1974. (Final draft in preparation)

Klein, S., J. F. Aeschlimann, D. F. Balsiger, S. L. Converse, C. Court, M. Foster, R. Lao, J. D. Oakley, and J. Smith. 1973. Automatic Novel Writing: a status report. Univ. of Wisc. Comp. Sci. Dept. Tech Report 186. Presented at 1973 Int. Conf. on Computers in the Humanities.

Klein, S., J. F. Aeschlimann, M. A. Appelbaum, D. F. Balsiger, E. J. Curtis, M. Foster, S. D. Kalish, S. J. Kamin, Y-D. Lee, L. A. Price, and D. F. Salsieder. 1974. Modelling Propp and Levi-Strauss in a Meta-symbolic Simulation System. Univ. of Wisc. Comp. Sci. Dept. Tech Report 226. In press in Patterns in Oral Literature, edited by Heda Jason and Dimitri Segal as a retroactive contribution to this volume of the 1973 World Conference of Anthropological and Ethnological Sciences, Chicago.

Klein, S., W. Fabens, R. Herriot, W. Katke, M. A. Kuppin, and A. Towster. 1968. The AUTOLING system. Univ. of Wisc. Comp. Sci. Dept. Tech Report 43.

Klein, S. and M. A. Kuppin. 1970. An interactive, heuristic program for learning transformational grammars. Computer Studies in the Humanities and Verbal Behavior, 3:144-162.

Klein, S., L. A. Price, J. F. Aeschlimann, D. A. Balsiger, and E. J. Curtis. 1975. A Meta-symbolic Simulation Model for Five Myths from Levi-Strauss' The Raw and the Cooked. Univ. of Wisc. Comp. Sci. Dept. Tech Report. Presented at 2nd Int. Conf. on Computers in the Humanities, Chicago, April 1975.

Klein, S. and V. Rozencvejg. 1974. A Computer Model for the Ontogeny of Pidgin and Creole Languages. Univ. of Wisc. Comp. Sci. Dept. Tech Report 238. Presented at the 1975 Int. Conf. on Pidgins and Creoles, Hawaii, January 1975.

Levi-Strauss, C. 1969. The Raw and the Cooked. (English translation) New York: Harper & Row.

Petöfi, J. S. 1973. Toward an empirically motivated grammatical theory of verbal texts. In Studies in Text Grammar, edited by J. S. Petöfi & H. Rieser. Dordrecht: Reidel.

Petöfi, J. S. & H. Rieser. 1973. Probleme der modelltheoretischen Interpretation von Texten. Hamburg: Buske Verlag.

Propp, V. 1968. Morphology of the Folktale. (English translation) 2nd Edition, Austin: University of Texas Press.

Whitehead, A. N. and B. Russell. 1911-1913. Principia Mathematica. (3 volumes) London: Cambridge University Press.
