{ "paper_id": "C69-0201", "header": { "generated_with": "S2ORC 1.0.0", "date_generated": "2023-01-19T12:32:28.600032Z" }, "title": "", "authors": [], "year": "", "venue": null, "identifiers": {}, "abstract": "", "pdf_parse": { "paper_id": "C69-0201", "_pdf_hash": "", "abstract": [], "body_text": [ { "text": "limitations and non-linguistic knowledge. This is certainly what must be simulated as an inclusive part of linguistic theory. The kind of theory of 'performance' of which Chomsky speaks may well be in the far distant future to which Chomsky relegates it (1965) . However, a theory of simulative performance is not so far off. It would seem very reasonable that the possibility of the construction of a linguistic theory that both accounts for the data and does this in such a way as to appear to be consonant with the human method for doing so, is not so remote. Clearly, such a theory must deal with non-linguistic knowledge and problems of human memory as well as the problems that Chomsky designates as 'competence'. Thus, it seems that the sharp distinction between competence and performance is artificial at best. In particular, after elimination of some of the behavioristic problems such as distraction, we can expect to find a linguistic theory that is neither one of 'competence' nor 'performance' but something in between and therefore inclusive of both. Chomsky (1965:139) has stated: 'thus it seems absurd to suppose that the speaker first forms a generalized Phrase-marker by base rules and then tests it for well-formedness by applying transformational rules to see if it gives, finally, a well-formed sentence. 
But this absurdity is simply a corollary to the deeper absurdity of regarding the system of generative rules as a point-by-point model for the actual construction of a sentence by a speaker.\"", "cite_spans": [ { "start": 254, "end": 260, "text": "(1965)", "ref_id": null }, { "start": 1066, "end": 1084, "text": "Chomsky (1965:139)", "ref_id": null } ], "ref_spans": [], "eq_spans": [], "section": "", "sec_num": null }, { "text": "We could, on the other hand, attempt to formulate a system of rules that are a point-by-point model for the actual construction of a sentence by a speaker. Furthermore, we might expect that that system could also be a point-by-point model for the actual analysis of a sentence by a hearer. These claims, however, would be largely unverifiable except by the use of computers as simulative devices. Chomsky (1965:141) has further stated that: 'The grammar does not, in itself, provide any sensible procedure for finding the deep structure of a given sentence, or for producing a given sentence, just as it provides no sensible procedure for finding a paraphrase to a given sentence. It merely defines these tasks in a precise way. 
A performance model must certainly incorporate a grammar; it is not to be confused with grammar.\"", "cite_spans": [ { "start": 397, "end": 415, "text": "Chomsky (1965:141)", "ref_id": null } ], "ref_spans": [], "eq_spans": [], "section": "", "sec_num": null }, { "text": "Insofar as the notion of a performance model here can be taken as being somewhere between Chomsky's notion of competence and performance, our notion of grammar also lies somewhere between Chomsky's notion of a grammar and the incorporation of a grammar.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "", "sec_num": null }, { "text": "The Conceptual Dependency framework (see Schank [ 1969] ) is a stratified linguistic system that attempts to provide a computational theory of simulative performance.", "cite_spans": [ { "start": 41, "end": 55, "text": "Schank [ 1969]", "ref_id": "BIBREF6" } ], "ref_spans": [], "eq_spans": [], "section": "II. Conceptual Dependency", "sec_num": null }, { "text": "The highest level of the stratificational system (similar to Lamb [ 1966] , Sgall [1965] and others) employed by the Conceptual Dependency framework is an interlingua consisting of a network of language-free dependent concepts, where a concept may be considered to be an unambiguous word-sense (except see Schank,[1968] ).", "cite_spans": [ { "start": 67, "end": 74, "text": "[ 1966]", "ref_id": null }, { "start": 77, "end": 89, "text": "Sgall [1965]", "ref_id": null }, { "start": 308, "end": 321, "text": "Schank,[1968]", "ref_id": "BIBREF7" } ], "ref_spans": [], "eq_spans": [], "section": "II. 
Conceptual Dependency", "sec_num": null }, { "text": "(The notion of dependency used here is related to those of Hays (1964) and Klein (1965).) The semantics that Conceptual Dependency employs is a conceptual semantics in that it serves only to limit the range of conceptualizations in such a way as to make them consonant with experience. The form and major content of this semantics is thus universal, but since we are dealing with experience we are required to speak of someone's experience. Since we are dealing with a computer, we can talk of the system's experience. Thus, the conceptual semantics consists of lists of potential dependents for any given concept. These lists are organized by semantic categories if there is a generalization that can be made on that basis.", "cite_spans": [ { "start": 59, "end": 70, "text": "Hays (1964)", "ref_id": "BIBREF2" }, { "start": 75, "end": 87, "text": "Klein (1965)", "ref_id": "BIBREF4" } ], "ref_spans": [], "eq_spans": [], "section": "II. Conceptual Dependency", "sec_num": null }, { "text": "The Conceptual Dependency framework is used for a natural language parser by reversing the realization rules and using the semantics as a check with reality. The system for analyzing a sentence into its conceptual representation operates on pieces of a sentence, looking up the potential conceptual realizates.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "III. The Parser", "sec_num": null }, { "text": "to see if that particular part of the construction has occurred before.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "All conceptualizations are checked against a list of experiences", "sec_num": null }, { "text": "If the construction has not occurred, or has occurred only in some peculiar context, this is noted. Thus, in the construction 'ideas sleep', it is discovered that this connection has never been made before, and is therefore meaningless to the system. 
If the user says that this construction is all right, it is added to the memory; otherwise the construction is looked up in a metaphor list or aborted.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "All conceptualizations are checked against a list of experiences", "sec_num": null }, { "text": "The parser is presently operating in a limited form. It is coded in MLISP for the PDP-10 and can be adapted to other LISP processors with minor revisions. If there are several senses from which to choose, the program sees whether it was anticipating a concept or connective from some specific category; if so, it restricts its first guesses to senses in that category. Recent contextual usage of some sense also can serve to prefer one interpretation over others. To choose among several senses with otherwise equal likelihoods, the sense with lowest subscript is chosen first. Thus, by ordering senses in the dictionary according to their empirical frequency of occurrence, the system can try to improve its guessing ability.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "IV. Implementation", "sec_num": null }, { "text": "The realization rules that apply to each word sense are referenced in the dictionary under each sense. Most of the rules fall into categories that cover large conceptual classes and are referenced by many concepts. Such categories are PP, PA, AA, PPloc, PPt, LOC, T, simply transitive ACT, intransitive ACT, ACT that can take an entire conceptualization as direct object (\"state ACT\") and ACT that can take an indirect object without a preposition (\"transport ACT\"). In contrast to most concepts, each connective (e.g., an auxiliary, preposition, or determiner) tends to have its own rules or to share its rules with a few other words.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "IV. Implementation", "sec_num": null }, { "text": "A realization rule consists of two parts: a recognizer and a dependency chart. 
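Such a two-part rule might be encoded as in the following sketch. This is an illustrative Python reconstruction, not the original MLISP; the encoding of the recognizer and chart is hypothetical, using the 'very hot' example discussed below.

```python
# Hypothetical encoding of a realization rule: the recognizer is an ordered
# pattern of categories, and the dependency chart names which slot is the
# dependent and which the governor when the pattern matches.

VERY_RULE = {
    'recognizer': ['AA', 'PA'],               # an AA followed by a PA
    'chart': {'dependent': 0, 'governor': 1}, # slot 0 depends on slot 1
}

def rule_applies(rule, categories):
    # the recognizer part: do the observed categories occur in the stated order?
    return categories == rule['recognizer']

def chart_dependency(rule, words):
    # the chart part: the dependency that holds when the rule applies
    c = rule['chart']
    return (words[c['dependent']], words[c['governor']])
```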
The recognizer determines whether the rule applies and the dependency chart shows the dependencies that exist when it does.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "IV. Implementation", "sec_num": null }, { "text": "In the recognizer are specified the ordering, categories, and inflection of the concepts and connectives that normally would appear in a sentence if the rule applied. If certain concepts or connectives are omissible in the input, the rule can specify what to assume when they are missing.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "IV. Implementation", "sec_num": null }, { "text": "Agreement of inflected words can be specified in an absolute (e.g., \"plural\") or a relative manner (e.g., \"same tense\"). Rules for a language like English have a preponderance of word order specifications while rules for a more highly inflected language would have a preponderance of inflection specifications.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "IV. Implementation", "sec_num": null }, { "text": "Realization rules are used both to fit concepts into the network as they are encountered and to anticipate further concepts and their potential realizates in the network. When a rule is selected for the current word sense, it is compared with the rules of preceding word senses to find one that \"fits\". For example, if \"very hot\" is heard, one realization rule for \"very\" applies, where the tags \"0\" and \"1\" indicate the relative order of the word senses in the recognizer and identify them for reference by the dependency chart; \"~\" means the current word. One rule for \"hot\" is:", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "IV. Implementation", "sec_num": null }, { "text": "0 AA PA : -1 0 -1", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "IV. 
Implementation", "sec_num": null }, { "text": "The program notices that \"very\" fits in the \"-1\" slot of the \"hot\" rule and verifies that \"hot\" fits in the \"1\" slot of the \"very\" rule. Therefore, the dependency suggested by the chart can be postulated. That Tom loves Sue is beautiful.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "IV. Implementation", "sec_num": null }, { "text": "For Tom to love Sue is beautiful.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "IV. Implementation", "sec_num": null }, { "text": "In conceptual dependency theory, connectives like \"that\", \"for\", \"to\", and \"of\" are cues to the structure of the network and need not appear in the network at all. The network above demonstrates such a situation.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "IV. Implementation", "sec_num": null }, { "text": "Conversely, portions of the network may be absent from the sentence. The square brackets indicate optional words. The tags \"(a0)\" and \"(a1)\"", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "IV. Implementation", "sec_num": null }, { "text": "indicate that \"for\" precedes the \"PP\" but the whole phrase may occur in any position of the construct. \"(a1)!one\" in the dependency chart means that if \"(a1)\", i.e., \"for PP\", is omitted, and the subject of the action is not obvious from context, then the concept \"one\" is to be assumed. ", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "IV. 
Implementation", "sec_num": null } ], "back_matter": [ { "text": "inanimate motion object", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "ball,", "sec_num": null } ], "bib_entries": { "BIBREF0": { "ref_id": "b0", "title": "Aspects of the Theory of Syntax", "authors": [ { "first": "N", "middle": [], "last": "Chomsky", "suffix": "" } ], "year": null, "venue": "", "volume": "", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "Chomsky, N., Aspects of the Theory of Syntax, MIT Press, Cambridge, 1965.", "links": null }, "BIBREF1": { "ref_id": "b1", "title": "Heuristic Methods for Computer Understanding of Natural Language in Context-Restricted On-Line Dialogues", "authors": [ { "first": "K", "middle": [], "last": "Colby", "suffix": "" }, { "first": "H", "middle": [], "last": "Enea", "suffix": "" } ], "year": 1967, "venue": "Mathematical Biosciences", "volume": "", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "Colby, K., and Enea, H., \"Heuristic Methods for Computer Understanding of Natural Language in Context-Restricted On-Line Dialogues,\" Mathematical Biosciences, 1967.", "links": null }, "BIBREF2": { "ref_id": "b2", "title": "Dependency Theory: A Formalism and Some Observations", "authors": [ { "first": "D", "middle": [], "last": "Hays", "suffix": "" } ], "year": 1964, "venue": "", "volume": "", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "Hays, D., \"Dependency Theory: A Formalism and Some Observations\", V.40, December 1964.", "links": null }, "BIBREF3": { "ref_id": "b3", "title": "Experiments with a Powerful Parser", "authors": [ { "first": "M", "middle": [], "last": "Kay", "suffix": "" } ], "year": 1967, "venue": "", "volume": "", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "Kay, M., \"Experiments with a Powerful Parser\", RAND, Santa Monica, California, 1967.", "links": null }, "BIBREF4": { "ref_id": "b4", 
"title": "Automatic Paraphrasing in Essay Format", "authors": [ { "first": "S", "middle": [], "last": "Klein", "suffix": "" } ], "year": 1965, "venue": "Mechanical Translation", "volume": "", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "Klein, S., \"Automatic Paraphrasing in Essay Format\", Mechanical Translation, 1965.", "links": null }, "BIBREF5": { "ref_id": "b5", "title": "The Sememic Approach to Structural Semantics", "authors": [ { "first": "S", "middle": [], "last": "Lamb", "suffix": "" } ], "year": 1964, "venue": "American Anthropologist", "volume": "", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "Lamb, S., \"The Sememic Approach to Structural Semantics\", American Anthropologist, 1964.", "links": null }, "BIBREF6": { "ref_id": "b6", "title": "A Conceptual Dependency Representation for a Computer-Oriented Semantics", "authors": [ { "first": "R", "middle": [], "last": "Schank", "suffix": "" } ], "year": 1969, "venue": "Also available as Stanford AI Memo", "volume": "83", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "Schank, R., \"A Conceptual Dependency Representation for a Computer-Oriented Semantics\", Ph.D. Thesis, University of Texas, Austin, 1969. (Also available as Stanford AI Memo 83, Stanford Artificial Intelligence Project, Computer Science Department, Stanford University, Stanford, California.)", "links": null }, "BIBREF7": { "ref_id": "b7", "title": "A Notion of Linguistic Concept: A Prelude to Mechanical Translation", "authors": [ { "first": "R", "middle": [], "last": "Schank", "suffix": "" } ], "year": 1968, "venue": "Stanford AI Memo 7~", "volume": "", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "Schank, R., \"A Notion of Linguistic Concept: A Prelude to Mechanical Translation\", Stanford AI Memo 7~. 
December 1968.", "links": null }, "BIBREF8": { "ref_id": "b8", "title": "Generation, Production and Translation", "authors": [ { "first": "P", "middle": [], "last": "Sgall", "suffix": "" } ], "year": null, "venue": "Presented to 1965 International Conference on Computational Linguistics", "volume": "", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "Sgall, P., \"Generation, Production and Translation\", Presented to 1965 International Conference on Computational Linguistics, New York.", "links": null }, "BIBREF9": { "ref_id": "b9", "title": "The Syntactic Analysis of English by Machine", "authors": [ { "first": "J", "middle": [], "last": "Thorne", "suffix": "" }, { "first": "P", "middle": [], "last": "Bratley", "suffix": "" }, { "first": "H", "middle": [], "last": "Dewar", "suffix": "" } ], "year": 1968, "venue": "Machine Intelligence III", "volume": "", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "Thorne, J., Bratley, P., and Dewar, H., \"The Syntactic Analysis of English by Machine\", in Machine Intelligence III, University of Edinburgh, 1968.", "links": null }, "BIBREF10": { "ref_id": "b10", "title": "Recent Developments in the MITRE Syntactic Analysis Procedure", "authors": [ { "first": "D", "middle": [], "last": "Walker", "suffix": "" }, { "first": "P", "middle": [], "last": "Chapin", "suffix": "" }, { "first": "M", "middle": [], "last": "Geis", "suffix": "" }, { "first": "L", "middle": [], "last": "Gross", "suffix": "" } ], "year": null, "venue": "MITRE Corp", "volume": "", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "
Walker, D., Chapin, P., Geis, M., and Gross, L., \"Recent Developments in the MITRE Syntactic Analysis Procedure\", MITRE Corp., Bedford, Mass., June 1966.", "links": null } }, "ref_entries": { "FIGREF0": { "type_str": "figure", "text": "location of a conceptualization. A time of conceptualization; often has variant forms consisting of parts of a conceptualization. Attribute of a PP; corresponds (in English) to adjectives and some abstract nouns. Attribute of an ACT; corresponds (in English) to adverbs and indirectly objective abstract nouns. Thus, the categories assigned in the above network correspond closely to their syntactic correlates: PP ~ ACT ~ PP ~ PP PA PA However, in the sentence, 'Visiting relatives can be a nuisance', the syntactic categories often do not correspond with the conceptual actors and actions. The ambiguous interpretations of this sentence are: use the conditional present [denoted by c] form of the two-way dependency link, one of eight possible tense-mood forms.) relatives ~ bother ~ one PP ~ ACT ~ PP written in a conceptual dependency analysis on a straight line. Dependents written perpendicular to the line are attributes of their governor except when they are part of another conceptualization line. Whole conceptualizations can relate to other conceptualizations as actors ([1] and [3]) or attributes ([2]), where ~ indicates that the PP at its head is the actor in a main and subordinate conceptualization [ ~ is the subordinate, written below the line]. The Conceptual Dependency framework, at the conceptual level, is thus responsible for representing the meaning of a piece of written language in language-free terms. The representation is in terms of actor-action-object conceptualizations in a topic-cogent form. Thus, words that have many syntactic forms will have only one conceptual form. This is true interlinguistically as well as intralinguistically. The meaning of a construction is always the consideration used in representation. 
For example, 'of' in 'a cup of water' is realized as '~-~ contains X' where X is water. cup contains water Similarly, in 'John's love is good', 'love' is realized conceptually as X = loves ~Y. John ~ good love t one In order to make this framework serve as a generative theory, semantics and realization rules must be added. The realization rules are used in conjunction with a dictionary of realizates. These rules map pieces of the network in accord with the grammar. Thus, a simple rule in English might be: PP = Adj + N PA In fact, the rules are not this simple since criteria of usualness and context enter into each application of a rule. These problems are discussed elsewhere (Schank [1969] ) and are not the point of this paper.", "num": null, "uris": null }, "FIGREF1": { "type_str": "figure", "text": ". The system thus employs a record of what it has heard before in order to analyze what it is presently hearing. In order for the system to choose between two analyses of a sentence both of which are feasible with respect to the conceptual rules (see Appendix), the conceptual semantics is incorporated. The conceptual semantics limits the possible conceptual dependencies to statements consonant with the system's knowledge of the real world. The definition of each concept is composed of records organized by dependency type and by the conceptual category of the dependent. For each type of dependency, semantic categories (such as animate object, human institution, animal motion) are delimited with respect to the conceptual category of a given concept, and defining characteristics are inserted when they are known. 
For example, concepts in the semantic category 'physical object' all have the characteristic 'shape'. Sometimes this information is intrinsic to the particular concept involved, for example, 'balls are round'. The semantic categories are organized into hierarchical structures in which limitations on any category are assumed to apply as well to all categories subordinate to it. The system of semantic categories and a method of constructing semantic files is discussed more fully in Schank (1969). In the present system, the files are constructed by incorporating information derived from rules presented as English sentences. The program parses each of these sentences and observes which dependencies are new and then adds them to the files. As an example of the use of the conceptual semantics, consider the parse of 'the tall boy went to the park with the girl'. At the point in the parse where the network is boy ~ go ~ park t tall we are faced with the problem of where to attach the construct 'with girl'. A problem exists since at least two realization rules may apply: 'ACT PREP PP: 1 ~3'; 'PP PREP PP: ~2'. The problem is resolved by the", "num": null, "uris": null }, "FIGREF2": { "type_str": "figure", "text": "Rather than attaching new dependencies to a growing network during the parse, the program determines all the dependencies present in the network and then assembles the entire network at the end. Thus, the sentence 'The big boy gives apples to the pig.' is parsed into: 1) ~ boy t big 2) boy ~ give 3) give ~ apples 4) give <~ pig and then these are assembled into: boy ~ give ~ apples ~ pig t big The input sentence is processed word-by-word. After \"hearing\" each word, the program attempts to determine as much as it can about the sentence before \"listening\" for more. To this end, dependencies are discovered as each word is processed. 
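The two-stage process just described (collect dependencies word-by-word, assemble the whole network at the end) might be sketched as follows. This is an illustrative Python reconstruction, not the original MLISP; the function name assemble is hypothetical.

```python
# Hypothetical sketch: dependencies found during the scan are kept as
# (governor, dependent) pairs and only assembled into one network at the end.

def assemble(dependency_pairs):
    network = {}
    for governor, dependent in dependency_pairs:
        network.setdefault(governor, []).append(dependent)
    return network

# pairs found for 'The big boy gives apples to the pig.'
pairs = [('boy', 'big'), ('give', 'boy'), ('give', 'apples'), ('give', 'pig')]
network = assemble(pairs)
```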
Furthermore, the program anticipates what kinds of concepts and structures may be expected later in the sentence. If what it hears does not conform with its anticipation, it may be \"confused\", \"surprised\", or even \"amused\". In case of semantic or syntactic ambiguity, the program should determine which of several possible interpretations was intended by the \"speaker\". It first selects one interpretation by means of miscellaneous heuristics and stacks the rest. In case later tests and further input refute or cast doubt upon the initial guess, that guess is discarded or shelved, and a different interpretation is removed from the stack to be processed. To process an interpretation, it may be necessary to back up the scan to an earlier point in the sentence and rescan several words. To avoid repetitious work during rescans, any information learned about the words of the sentence is kept in core memory. The parse involves five steps: the dictionary lookup, the application of realization rules, the elimination of idioms, the rewriting of abstracts, and the check against the conceptual semantics. The dictionary of words is kept mostly on the disk, but the most frequently encountered words remain in core memory to minimize processing time. Under each word are listed all its senses. \"Senses\" are defined pragmatically as interpretations of the word that can lead to different network structures or that denote different concepts. For example, some of the senses of \"fly\" are: fly 1 -(intransitive ACT): what a passenger does in an airplane. fly 2 -(intransitive ACT): what an airplane or bird does in the air. fly 3 -(PP): an insect. fly 4 -(transitive ACT): what a pilot does by operating an airplane. fly 5 -(intransitive ACT--metaphoric): to go fast. fly 6 -(PP): a flap as on trousers.", "num": null, "uris": null }, "FIGREF3": { "type_str": "figure", "text": "for two adjacent word senses are processed, other rules are tried, and more distant word senses are checked. 
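The sense-selection heuristics described earlier (restrict to an anticipated category, prefer recent contextual usage, else take the lowest subscript) might be sketched against this dictionary entry as follows. The Python below is an illustrative reconstruction with hypothetical names (FLY_SENSES, choose_sense), not the original program.

```python
# Hypothetical sketch of sense selection: restrict to an anticipated category
# if one fits, prefer a recently used sense, else take the lowest subscript
# (senses are ordered by empirical frequency of occurrence).

FLY_SENSES = [
    ('fly1', 'ACT'),  # what a passenger does in an airplane
    ('fly2', 'ACT'),  # what an airplane or bird does in the air
    ('fly3', 'PP'),   # an insect
    ('fly4', 'ACT'),  # what a pilot does by operating an airplane
    ('fly5', 'ACT'),  # metaphoric: to go fast
    ('fly6', 'PP'),   # a flap as on trousers
]

def choose_sense(senses, anticipated=None, recent=()):
    candidates = [s for s in senses if s[1] == anticipated] or senses
    for sense in candidates:
        if sense[0] in recent:          # recent contextual usage wins
            return sense
    return candidates[0]                # else lowest subscript first
```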
file to see if it is an idiom or a proper name and should be restructured. Thus, the construct 'make up' is reduced to the single concept 'make-up'. This idiom will be detected by the parser even if several words intervene between \"make\" and \"up\" in the sentence. After eliminating idioms from the network, there still may be constructs that do not reflect language-free conceptions. The most conspicuous cases are caused in English by abstract nouns. Most such nouns do not correspond to PP's but rather are abbreviations for conceptualizations in which the concept represented is actually an ACT or a PA. The program treats an abstract noun as a PP temporarily in order to obtain its dependents, because abstract nouns have the syntax not of ACT's but of PP's. After obtaining its dependents, the PP is rewritten as an entire conceptualization according to rules obtained from an abstract file. These rules also specify what to do with the dependents of the PP; they may be dependent on the entire conceptualization, dependent on the ACT only, or appear elsewhere in the conceptualization. By way of example, the sentence: Tom's love for Sue is beautiful. leads to the following dependencies: love (PP) love (PP) for Tom (PP) Sue. After hearing \"is\", the program expects no more dependents for \"love\" (by a heuristic in the program), so it checks the abstract file and finds rules for \"love\" in which \"(a)\" and \"(b)\" identify concepts without reference to sentential order. The network is now rewritten: Tom love t Sue, where the horizontal main link represents \"is\", waiting for a right-hand concept. When \"beautiful\" is heard, the network is complete, giving: may be realized alternatively as either of the paraphrases:", "num": null, "uris": null }, "FIGREF4": { "type_str": "figure", "text": "For example, the sentence: It is good to hit the ball near the fence. \"one\" and \"place\" are not realized. 
Notice that the relevant realization rule for \"it\" is:", "num": null, "uris": null }, "FIGREF5": { "type_str": "figure", "text": "The conceptual network should reflect the beliefs inherent in the original discourse in a language-free representation. The interlinguistic file of conceptual semantics is checked to verify that the dependencies are indeed acceptable. This check is made after abstracts have been rewritten. After the five parsing steps are completed, the program proceeds to the next word. At the end of the sentence, it outputs the final network in two dimensions on a printed page or on a display console. V. Examples of Algorithm Only a few of the relevant realization rules will be shown in new dependencies John ~\"see[~ Pe (note: \"to\" means \"tense of ACT Number 0) 3: (to saw):.., etc. birds 1: (all PP patterns). But now there are two main links on one line so go back and try as object of 'see'. who was in the park yesterday, wanted to hit Fred in the mouth today'. John was persuaded by the doctor in New York to be easy to please.'", "num": null, "uris": null }, "TABREF0": { "type_str": "table", "num": null, "content": "
The conceptual level works with a system of rules (shown in the Appendix) that operate on conceptual categories. These rules generate all the permissible dependencies in a conceptualization. Multiple combinations of conceptualizations in various relationships are intended to account for the totality of human language activity at the conceptual level.

The conceptual categories are divided into governing and assisting groups:

Governing Categories
PP - An actor or object; corresponds syntactically (in English) to concrete nominal nouns or noun forms.
ACT - An action; corresponds syntactically (in English) to verbs, verbal nouns, and most abstract nouns.
LOC
T

Assisting Categories
PA
AA

have been referred to previously. 'Big' refers to the concept 'big', which cannot stand alone conceptually. The concept 'man' can stand alone and is modified conceptually by 'big', so it is realized in the network as a governor with its dependent. 'Steals' denotes an action that is dependent on the concept that is doing the acting. A conceptualization (a proposition about a conceptual actor) cannot be complete without a concept acting (or an attribute statement), so a two-way dependency link may be said to exist between 'man' and 'steal'. That is, they are dependent on each other and govern each other. Every conceptualization must have a two-way dependency link. 'Book' governs 'red' attributively and the whole entity is placed as objectively dependent on 'steals'. The construction 'from the girl' is realized as being dependent on the action through the conceptual object. This is a different type of dependency. There are different forms of this 'prepositional dependency', each of which is noted by writing the preposition over the link to indicate the kind of prepositional relationship. (Although a language may use inflections or nothing at all instead of prepositions to indicate prepositional dependency, we are discussing a language-free system here and it is only the relation of the parts conceptually that is under consideration.)

The conceptual network representation of this sentence is then as follows:

                    from
man ~ steals ~ book ~ girl
 t              t
big            red
", "text": "For example, in the sentence, 'The big man steals the red book from the girl.' the analysis is as follows: 'The' is stored for use in connecting sentences in paragraphs, i.e., 'the' specifies that 'man' may", "html": null }, "TABREF1": { "type_str": "table", "num": null, "content": "
The parser's output is a language-free network consisting of unambiguous concepts and their relations to other concepts. Pieces of discourse with identical meanings, whether in the same or different languages, parse into the same conceptual network.

The parser is being used to understand natural language statements in Colby's (1967) on-line dialogue program for psychiatric interviewing, but is not restricted to this context. In interviewing programs like Colby's, as well as in question-answering programs, a discourse-generating algorithm must be incorporated to reverse the function of the parser. The conceptual parser is based on a linguistic theory that uses the same rules for both parsing and generating, thus facilitating man-machine dialogues. In an interviewing program, the input may contain words that the program has never encountered, or which it has encountered only in different environments. The input may deal with a conceptual structure that is outside the range of experience of the program, or even use a syntactic combination that is unknown. The program is designed to learn new words and word-senses, new semantic possibilities, and new rules of syntax both by encountering new examples during the dialogue and by receiving explicit instruction.

(1) boy ~ go ~to park ~with girl
     t
    tall

(2) boy ~ go ~to park
     t   ~with girl
    tall

Thus the Conceptual Dependency parser is a conceptual analyzer rather than a syntactic parser. It is primarily concerned with explicating the underlying meaning and conceptual relationships present in a piece of discourse in any natural language. Although the parser described here bears some similarity to certain deep structure parsers (Kay [1967] ,
formulation that can be operated on, realized in a paraphrase, or translated.
", "text": "'Girl' is not found so the network (1) is allowed while (2) is aborted. Although 'with girl' is dependent on 'go', it is dependent through 'park'. That is, these are not isolated dependencies since we would want to be able to answer the question 'Did the girl go to the park?' affirmatively. Thus, the system purports to be analyzing a sentence in a way analogous to the human method. It handles input one word at a time as it is encountered, checks potential linkings with its own knowledge of the world and past experience, and places its output into a language-free", "html": null }, "TABREF2": { "type_str": "table", "num": null, "content": "
VII. Conclusion

ACT
size        any              in   phys obj   specific        bounce
shape       round            on   phys obj   motion object   roll, come, spin
color       any              for  phys obj   concrete        fall, hit ...
texture     usually smooth   by   place      any             begin, cause ...
elasticity  bounces          of   animal
                             at   no
                             to   no

I ~ like
", "text": "Before computers can understand natural language, they must be able to make a decision as to precisely what has been said. The conceptual parser described here is intended to take a natural language input and place the concepts derivable from that input into a network that explicates the relations between those concepts. The conceptual network that is then formed is not intended to point out the syntactic relations present, and there is some question as to why any system would want this information. Although Chomsky's deep structures convey a good deal more information than just syntactic relations, it is clear that a parser that uses deep structures for output would be oriented syntactically. We see no point in limiting our system by trying to test out a previously formulated grammar. The output of a transformational parser, while making explicit some important aspects of the meaning of the sentence, does not make explicit all the conceptual relationships that are to be found, does not limit its parses with a check with reality, and most importantly is syntax based. The parser presented here is semantics based. We aver that the system that humans employ is also semantics based. It seems clear to us that our parser satisfies the requirements that a parser must satisfy and in so doing points out the advantages of regarding language from a Conceptual Dependency point of view.", "html": null } } } }