{ "paper_id": "P85-1020", "header": { "generated_with": "S2ORC 1.0.0", "date_generated": "2023-01-19T09:39:23.853508Z" }, "title": "MOVEMENT IN ACTIVE PRODUCTION NETWORKS", "authors": [ { "first": "Mark", "middle": [ "A" ], "last": "Jones", "suffix": "", "affiliation": { "laboratory": "", "institution": "AT&T Bell Laboratories Murray Hill", "location": { "postCode": "07974", "country": "New Jersey" } }, "email": "" }, { "first": "Alan", "middle": [ "S" ], "last": "Driacoll", "suffix": "", "affiliation": { "laboratory": "", "institution": "AT&T Bell Laboratories Murray Hill", "location": { "postCode": "07974", "country": "New Jersey" } }, "email": "" } ], "year": "", "venue": null, "identifiers": {}, "abstract": "We describe how movement is handled in a class of computational devices called active production networks (APNs). The APN model is a parallel, activation-basod framework that ha= been applied to other aspects of natural language processing. The model is briefly defined, the notation and mechanism for movement is explained, and then several examples are given which illustrate how various conditions on movement can naturally be explained in terms of limitations of the APN device.", "pdf_parse": { "paper_id": "P85-1020", "_pdf_hash": "", "abstract": [ { "text": "We describe how movement is handled in a class of computational devices called active production networks (APNs). The APN model is a parallel, activation-basod framework that ha= been applied to other aspects of natural language processing. The model is briefly defined, the notation and mechanism for movement is explained, and then several examples are given which illustrate how various conditions on movement can naturally be explained in terms of limitations of the APN device.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Abstract", "sec_num": null } ], "body_text": [ { "text": "Movement is an important phenomenon in natural languages. Recently, proposals such as Gazdar's dcrivod rules (Gazdar, 1982) and Pereira's extraposition grammars (Pereirao 1983) have attemptod to find minimal extensions to the context-free framework that would allow the description of movement. In this paper, we describe a class of computational devices for natural language processing.", "cite_spans": [ { "start": 109, "end": 123, "text": "(Gazdar, 1982)", "ref_id": "BIBREF6" }, { "start": 161, "end": 176, "text": "(Pereirao 1983)", "ref_id": null } ], "ref_spans": [], "eq_spans": [], "section": "I. INTRODUCTION", "sec_num": null }, { "text": "called active production networks (APNs), and explore how certain kinds of movement are handled. In particular. we are concerned with left extraposition, such as Subjectauxiliary Inversion. Wh-movement, and NP holes in relative clauses, in these cos\u2022s, the extraposod constituent leaves a trace which is insertod at a later point in the processing. This paper builds on the research reported in Jones (1983) and Jones (forthcoming).", "cite_spans": [ { "start": 395, "end": 407, "text": "Jones (1983)", "ref_id": "BIBREF7" } ], "ref_spans": [], "eq_spans": [], "section": "I. INTRODUCTION", "sec_num": null }, { "text": "Our contention is that only a class of parallel devices will prove to be powerful enough to allow broad contextual priming, to pursue alternative hypotheses, and to explain the paradox that the performance of a sequential system often degrades with new knowledge, whereas human performance usually improves with learning and experience. 
There are a number of new parallel processing (connectionist) models which are sympathetic to this view -- Anderson (1983), Feldman and Ballard (1982), Waltz and Pollack (1985), McClelland and Rumelhart (1981, 1982), and Fahlman, Hinton and Sejnowski (1983).", "cite_spans": [ { "start": 447, "end": 462, "text": "Anderson (1983)", "ref_id": "BIBREF0" }, { "start": 465, "end": 491, "text": "Feldman and Ballard (1982)", "ref_id": "BIBREF5" }, { "start": 494, "end": 518, "text": "Waltz and Pollack (1985)", "ref_id": "BIBREF11" }, { "start": 521, "end": 535, "text": "McClelland and", "ref_id": null }, { "start": 536, "end": 558, "text": "Rumelhart (1981, 1982)", "ref_id": null }, { "start": 565, "end": 601, "text": "Fahlman, Hinton and Sejnowski (1983)", "ref_id": "BIBREF4" } ], "ref_spans": [], "eq_spans": [], "section": "2.1 The Device", "sec_num": null }, { "text": "Many of the connectionist models use iterative relaxation techniques with networks containing excitatory and inhibitory links. They have primarily been used as best-fit categorizers in large recognition spaces, and it is not yet clear how they will implement the rule-governed behavior of parsers or problem solvers. Rule-based systems need a strong notion of an operating state, and they depend heavily on appropriate variable binding schemes for operations such as matching (e.g., unification) and recursion. The APN model directly supports a rule-based interpretation, while retaining much of the general flavor of connectionism. 1. The human ability to perform computationally expensive operations [...] reinforces this belief. An active production network is a rule-oriented, distributed processing system based on the following principles:", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "2.1 The Device", "sec_num": null }, { "text": "1. Each node in the network executes a uniform activation algorithm and assumes states in response to messages (such as expectation, inhibition, and activation) that arrive locally; the node can, in turn, relay messages, initiate messages, and spawn new instances to process message activity. Although the patterns that define a node's behavior may be quite idiosyncratic or specialized, the algorithm that interprets the pattern is the same for each node in the network. In parsing applications, partially instantiated nodes are viewed as phrase structure rules whose next constituent is expected; the sources primarily arise from exogenous strobings of the network by external inputs. In generation or problem solving applications, partially instantiated nodes are viewed as partially satisfied goals which have outstanding subgoals whose solutions are desired. The sources in this case are endogenously generated. The compatibility of these two views not only allows the same network to be used for both parsing and generation, but also permits processes to share in the interaction of internal and external sources of information. This compatibility, somewhat surprisingly, turned out to be crucial to our treatment of movement, but it is also clearly desirable for other aspects of natural language processing in which parsing and problem solving interact (e.g., reference resolution and inference).", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "2.1 The Device", "sec_num": null }, { "text": "Each node in an APN is defined by a pattern, written in the pattern language of Figure 2. A pattern describes the messages to which a node responds, and the new messages and internal states that are produced.
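The following is a minimal sketch of one way the pattern language of Figure 2 could be encoded as data structures; Python is used purely for illustration (the authors' implementation is in Zetalisp), and the names Bind, Seq, And and Or are our own labels for the constructs of Figure 2, not part of the APN definition.

from dataclasses import dataclass
from typing import Tuple, Union

# binding-pattern ::= node | (and binding-pattern ...) | (or binding-pattern ...)
BindingPat = Union[str, tuple]          # e.g. "Q", or ("or", "Q", "Q0")

@dataclass(frozen=True)
class Bind:                             # binding-site ::= ($ var binding-pattern)
    var: str
    pat: BindingPat

@dataclass(frozen=True)
class Seq:                              # (seq pattern ...)
    parts: Tuple["Pattern", ...]

@dataclass(frozen=True)
class And:                              # (and pattern ...)
    parts: Tuple["Pattern", ...]

@dataclass(frozen=True)
class Or:                               # (or pattern ...)
    parts: Tuple["Pattern", ...]

Pattern = Union[Bind, Seq, And, Or]
# The ?, + and * operators of Figure 2 are convenience forms and are omitted here.

# Node R of Figure 1, as quoted in the text: (seq ($ v1 Q) ($ v2 c))
R_pattern = Seq((Bind("v1", "Q"), Bind("v2", "c")))

# Node U of Figure 1, as quoted in the text: (and ($ v1 (and T1)) ($ v2 S))
U_pattern = And((Bind("v1", ("and", "T1")), Bind("v2", "S")))

# Distinction drawn in the text: (and ($ v1 X) ($ v2 Y)) requires activations of
# X and Y from distinct sources; ($ v (and X Y)) requires a single shared source.
distinct_sources = And((Bind("v1", "X"), Bind("v2", "Y")))
same_source = Bind("v", ("and", "X", "Y"))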
Each subpattern of the form ($ v binding-pat) in the pattern for node N is a variable binding site; a variable binding takes place when an instance of a node in binding-pat activates a reference to variable v of node N. Implicitly, a pattern defines the set of states and state transitions for a node. The ? (optionality), + (repetition) and * (optional repetition) operators do not extend the expressiveness of the language, but have been added for convenience. They can be replaced in preprocessing by equivalent expressions. Formal semantic definitions of the message passing behavior for each primitive operator have been specified. In the graphical representation of an APN, synchrony is indicated by a short tail above the subpattern expression; the definition of U in Figure 1 illustrates both conventions:", "cite_spans": [], "ref_spans": [ { "start": 80, "end": 88, "text": "Figure 2", "ref_id": null }, { "start": 980, "end": 988, "text": "Figure 1", "ref_id": null } ], "eq_spans": [], "section": "2.1 The Device", "sec_num": null }, { "text": "(and ($ v1 (and T1)) ($ v2 S)). Figure 3 shows the stages in parsing the string acd. An exogenous source Exog-src0 first activates a, which is not currently supported by a source and, hence, is in an inactive state. The activation of an inactive or inhibited node gives rise to a new instance (n0) to record the binding. The instance is effectively a new node in the network, and derives its pattern from the spawning node. The activation spreads upward to the other instances shown in Figure 3(a). The activation of a node causes its pattern to be (re)instantiated and a variable to be (re)bound. For example, in the activation of R0, the pattern (seq ($ v1 Q) ($ v2 c)) is replaced by (seq ($ v1 (or Q Q0)) ($ v2 c)), and the variable v1 is bound to Q0. For simplicity, only the active links are shown in Figure 3. R0 posts an expectation message for node c which can further its pattern. The source Exog-src0 is said to be supporting the activation of nodes n0, Q0, R0 and P0 above it, and the expectations or inhibitions that are generated by these nodes. For the current paper we will assume that exogenous sources remain fully on for the duration of the sentence.", "cite_spans": [], "ref_spans": [ { "start": 32, "end": 40, "text": "Figure 3", "ref_id": null }, { "start": 486, "end": 495, "text": "Figure 3", "ref_id": null }, { "start": 806, "end": 814, "text": "Figure 3", "ref_id": null } ], "eq_spans": [], "section": "2.1 The Device", "sec_num": null }, { "text": "In Figure 3(b), another exogenous source Exog-src1 activates c, which furthers the pattern for R0. R0 sends an inhibition message to Q0, posts expectations for S, and relays an activation message to P0, which rebinds its variable to R0 and assumes a new activation value. Figure 3(c) shows the final situation after d has been activated. The synchronous conjunction of S0 is satisfied by T0 and d0. R0 is fully satisfied (activation value of 9), and P0 is re-satisfied.", "cite_spans": [], "ref_spans": [ { "start": 3, "end": 14, "text": "Figure 3(b)", "ref_id": null }, { "start": 272, "end": 284, "text": "Figure 3(c)", "ref_id": null } ], "eq_spans": [], "section": "An Example", "sec_num": "2.3" }, { "text": "The APN in Figure 1 illustrates several grammar writing paradigms. The situation in which an initial prefix string (a or b) satisfies a constituent (P), but can be followed by optional suffix strings (cd or ce) occurs frequently in natural language grammars.
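Before these paradigms are elaborated, the instance-spawning, expectation and inhibition behavior of the Figure 3 walk-through above can be summarized in a small sketch. The SeqInstance class and its printed trace are invented for this illustration and deliberately simplified (a single instance, only the two-element pattern quoted for R, no activation values); it should be read as a didactic sketch, not the authors' activation algorithm.

from itertools import count

# Toy replay of the Figure 3 walk-through for the string "acd": an exogenous
# source activates a leaf, an inactive node spawns an instance to record the
# binding, an expectation is posted for the legal continuation of a seq pattern,
# and inhibition is sent down the previous link when a new activation arrives.

_fresh = count()

class SeqInstance:
    def __init__(self, name, elements):
        self.name = f"{name}{next(_fresh)}"   # e.g. "R0"
        self.elements = elements              # ordered constituents, e.g. ["Q", "c"]
        self.bound = []                       # links activated so far

    def activate(self, constituent, source):
        if self.bound:                        # new activation: inhibit the previous link
            print(f"{self.name}: inhibition sent down link to {self.bound[-1]}")
        self.bound.append(constituent)
        remaining = self.elements[len(self.bound):]
        if remaining:
            print(f"{self.name}: bound {constituent} from {source}; "
                  f"expectation posted for {remaining[0]}")
        else:
            print(f"{self.name}: bound {constituent} from {source}; fully satisfied")

R0 = SeqInstance("R", ["Q", "c"])   # instance of node R spawned by the first input
R0.activate("Q0", "Exog-src0")      # after a: binds Q0, posts an expectation for c
R0.activate("c0", "Exog-src1")      # after c: inhibits Q0 and satisfies the pattern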
Noun phrase heads in English, for example, have optional prenominal and postnominal modifiers. The synchronous disjunction at P allows the local role of a or b to change, while preserving its interpretation as part of a P. It is also simple to encode optional prefixes.", "cite_spans": [], "ref_spans": [ { "start": 11, "end": 19, "text": "Figure 1", "ref_id": null } ], "eq_spans": [], "section": "2.4 Grammar Writing Paradigms", "sec_num": null }, { "text": "Another common situation in natural language grammars is specialization of a constituent based on some internal feature. Noun phrases in English, for example, can be specialized by case; verb phrases can be specialized as participial, tensed or infinitive. In Figure 1, node S is a specialization which represents "Ts with d-ness or e-ness, but not f-ness." The specialization is constructed by a synchronous conjunction of features that arise from subtrees somewhere below the node to be specialized.", "cite_spans": [], "ref_spans": [ { "start": 260, "end": 268, "text": "Figure 1", "ref_id": null } ], "eq_spans": [], "section": "2.4 Grammar Writing Paradigms", "sec_num": null }, { "text": "The APN model also provides for node outputs to be partitioned into independent classes for the purposes of the activation algorithm. The nodes in the classes form levels in the network and represent orthogonal systems of classification. The cascading of expectations from different levels can implement context-sensitive behaviors such as feature agreement and semantic selectional restrictions. This is described in Jones (forthcoming). In the next section, we will introduce a grammar writing paradigm to represent movement, another type of non-context-free behavior.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "2.4 Grammar Writing Paradigms", "sec_num": null }, { "text": "5. It is interesting to speculate on the consequences of various relaxations of this assumption. Fundamental limitations in the allocation of sources may be related to limitations in short term memory (or buffer space in deterministic models, see Marcus, 1980). Linguistic constraints on constituent length could be related to source decay. Syntactic garden path behavior might be related to accelerated source decay caused by inhibition from a competing hypothesis. Anything more than a footnote is premature at this point. 3. MOVEMENT From the APN perspective, movement (limited here to left-extraposition) necessitates the endogenous reactivation of a trace that was created earlier in the process. To capture the trace so that expectations for its reactivation can be posted, we use the following type of rule: (seq ($ v1 ... X ...) ($ v2 ... (and X X-src Y) ...)). When an instance, X0, first activates this rule, v1 is bound to X0; the second occurrence of X in the rule is constrained to match instances of X0, and expectations for X0, X-src and Y are created. No new exogenous source can satisfy the synchronous conjunction; only an endogenous X-src can. The rule is similar to the notion of an X followed by a Y with an X hole in it (cf. Gazdar, 1982). Non-exogenous sources such as PRO-src and Rel-src are activated in contexts in which they are expected and then receive inhibition. Figure 5(c) shows the resulting network after PRO-src has been activated. The inserted pronoun behaves precisely as an input pronoun with respect to subsequent movement.", "cite_spans": [ { "start": 810, "end": 835, "text": "($ v2 ... 
(and X X-see Y)", "ref_id": null }, { "start": 1216, "end": 1229, "text": "Gazdar, 1982)", "ref_id": "BIBREF6" } ], "ref_spans": [ { "start": 1350, "end": 1361, "text": "Figure 5(c)", "ref_id": null } ], "eq_spans": [], "section": "1,4 Gramm~ Writbql P~Ulpm", "sec_num": null }, { "text": "NP-t raell CNP V V", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "1,4 Gramm~ Writbql P~Ulpm", "sec_num": null }, { "text": "The trace generation necessary for movement uses the same insertion mechanism described above. Figures 6(a) Figure 6(a) , after parsing the cat which, synchronous expectations are posted for an S which contains a reactivation of the RelO trace by Rel. see. The signal sent to S by Rei.src will be in the form of an NP (through NP-trace).", "cite_spans": [], "ref_spans": [ { "start": 95, "end": 107, "text": "Figures 6(a)", "ref_id": null }, { "start": 108, "end": 119, "text": "Figure 6(a)", "ref_id": null } ], "eq_spans": [], "section": "1,4 Gramm~ Writbql P~Ulpm", "sec_num": null }, { "text": "Figure 6(b) shows how the input of ran produces inhibition on Rei-src from SI. The inhibition on Rei-src caus~ it to activate (just as in the null pronoun insertion) to try to satisfy the current contextual expectations. Figure 6(c) shows the network after Rel-src has activated to supply the trace. The only remaining problem is that Rel-src is actively inhibiting itself through .~0. 6 When Rel-src activates again, new instances are created for the inhibited nodes as they are re-activated; the uninhibited nodes are simply rebound. The final structure is shown in Figure 6(d) .", "cite_spans": [], "ref_spans": [ { "start": 221, "end": 232, "text": "Figure 6(c)", "ref_id": null }, { "start": 568, "end": 579, "text": "Figure 6(d)", "ref_id": null } ], "eq_spans": [], "section": "1,4 Gramm~ Writbql P~Ulpm", "sec_num": null }, { "text": "it is interesting that the network automatically enforces the restriction that the relative pronoun, complementizer and subject of the embedded sentence cannot all be missing. PRO must be generated before its trace can be inserted as the subject. Furthermore. since expectations are strongest for the first link of a sequence, expectations will be much weaker for the VP in the relative clause (under S under S\") than for the top-level VP under SO.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "1,4 Gramm~ Writbql P~Ulpm", "sec_num": null }, { "text": "The fact that the device blocks certai'n structures, without explicit weli-formedness constraints, is quite significant. Wherever possible, we would like to account for the complexity of the data through the composite behavior of a universal device and a simple, general grammar. We consider the description of a device which embodies the appropriate principles more parsimonious than a list of complex conditions and filters, and, to the extent that its architecture is independently motivated by proc,'ssink (i.e.. performance) considerations, of greater thcorctical interestf As we have seen, certain interpretations can be suppressed by expectations from elsewhere in the network. Furthermore, the occurrence of traces and empty constituents is severely constrained because they must be supplied by endogenous sources, which can only suppurt a sintie constituent at any given time. For NP movement, these two properties of the device, taken together. elfectively enforce Ross's Complex NP Constraint (Ross. 1967) , which states that, \"No element contained in a 6. 
Another ,~sy o4\" rut\u2022inS thi,J iJ that the noa~ynchroetM:ity of the two vanaMea in the I~ttern hat ~ viohtted. Figure 4 ). as in. [NP the rat [~\" whichi [S he read a report [~\" that [$ the cat chased fl]]]]], then there is only one trace in the intended interpretation, but there is nondeterminlsm during parsing between the noun complement and the relative clause interpretation. The interference eaus\u00a2,, the trace to be bound to the innermost relative pronoun in the relative clause interpretation.' Thus, the combined properties of the device and grammar consistently block those structures which violate the Complex NP Constraint. Our preliminary findings for other types of movement (e.g., Subjectauxiliary Inversion, Wh-movement, and Raising) indicate that they also have natural APN explanations.", "cite_spans": [ { "start": 1004, "end": 1016, "text": "(Ross. 1967)", "ref_id": "BIBREF10" }, { "start": 1198, "end": 1213, "text": "[NP the rat [~\"", "ref_id": null } ], "ref_spans": [ { "start": 1179, "end": 1187, "text": "Figure 4", "ref_id": "FIGREF4" } ], "eq_spans": [], "section": "1,4 Gramm~ Writbql P~Ulpm", "sec_num": null }, { "text": "Although the re.torch described in this summary is primarily of a theoretic nature, the basic ideas involved in using APNs for recognition and generation are being implemented and tested in Zetalisp on a Symbolics Lisp Machine. We have also hand-simulated data on movement from the literature to design the theory and algorithms presented in this paper. We are currently designing networks for a broad coverage syntactic grammar of English and for additional, cascaded levels for NP role mapping and case frames. The model has aLso been adapted as a general, context-driven problem solver, although more work remains to be done.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "IMPLF.aMENTATION 8ml Fu'ruRg DIMF.CrlONS", "sec_num": "4." }, { "text": "We are considering ways of integrating iterative relaxation techniques with the rule-based framework of APNs. This is particularly necessary in helping the network to identify expectation coalitions. In Figure 5(a) , for exampie. there should be virtually no expectations for Rel-src, since it cannot satisfy any of the dominating synchronous conjunctions. Some type of non-activating feedback from the sources seems to be necessary.", "cite_spans": [], "ref_spans": [ { "start": 203, "end": 214, "text": "Figure 5(a)", "ref_id": null } ], "eq_spans": [], "section": "IMPLF.aMENTATION 8ml Fu'ruRg DIMF.CrlONS", "sec_num": "4." }, { "text": "Recent linguistic theories have attempted to induce general principles (e.g., CNPC. Subjacency, and the Structure Preserving Hypothesis) from the detailed structural descriptions of earlier transformational theories (Chomsky, 1981) , Our research can be viewed as an attempt tu induce the machine that embodies theae principles. In this paper, we have described a class of candidate machine~, called active production networks, and outlined how they handle movement as a natural way in which machine and grammar interact.", "cite_spans": [ { "start": 216, "end": 231, "text": "(Chomsky, 1981)", "ref_id": "BIBREF1" } ], "ref_spans": [], "eq_spans": [], "section": "S. SUI~ARY", "sec_num": null }, { "text": "The APN framework was initially developed as a plausible cognitive model for language processing, which would have real-time processing behavior, and extensive contextual processing and learning capabilities based on a formal notion of expectations. 
That movement also seems naturally expressible in a way that is consistent with current linguistic theories is quite intriguing.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "S. SUI~ARY", "sec_num": null } ], "back_matter": [ { "text": "VP", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "SO(2)", "sec_num": null } ], "bib_entries": { "BIBREF0": { "ref_id": "b0", "title": "The Architecture of Cognition", "authors": [ { "first": "J", "middle": [ "R" ], "last": "Anderson", "suffix": "" } ], "year": 1983, "venue": "", "volume": "", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "Anderson, J. R. (1983). The Architecture of Cognition, Harvard University Press, Cambridge.", "links": null }, "BIBREF1": { "ref_id": "b1", "title": "Lectures on Government and Binding", "authors": [ { "first": "", "middle": [ "N" ], "last": "Chomsky", "suffix": "" } ], "year": 1981, "venue": "", "volume": "", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "Chomsky. N. (1981). Lectures on Government and Bind- ing. Foris Publications, Dordrecht.", "links": null }, "BIBREF2": { "ref_id": "b2", "title": "Filters and Control", "authors": [ { "first": "N", "middle": [], "last": "Chomsky", "suffix": "" }, { "first": "H", "middle": [], "last": "Lasnik", "suffix": "" } ], "year": 1977, "venue": "Linguistic Inquiry", "volume": "", "issue": "", "pages": "425--504", "other_ids": {}, "num": null, "urls": [], "raw_text": "Chomsky, N. and Lasnik, H. (1977). \"Filters and Con- trol,\" Linguistic Inquiry g, 425-504.", "links": null }, "BIBREF3": { "ref_id": "b3", "title": "NETL\" A System for Representing and Using Real-World Knowledge", "authors": [ { "first": "S", "middle": [ "E" ], "last": "Fahlman", "suffix": "" } ], "year": 1979, "venue": "", "volume": "", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "Fahlman, S. E. (1979). NETL\" A System for Represent- ing and Using Real-World Knowledge. MIT Press, Cam- bridge.", "links": null }, "BIBREF4": { "ref_id": "b4", "title": "Massively Parallel Architectures for Ah NFTL, Thistle, and Boltzmann Machines,\" AAAI.83 Conference Proceedings", "authors": [ { "first": "S", "middle": [ "E" ], "last": "Fahlman", "suffix": "" }, { "first": "G", "middle": [ "E" ], "last": "Hinton", "suffix": "" }, { "first": "T", "middle": [ "J" ], "last": "Sejnowski", "suffix": "" } ], "year": 1983, "venue": "", "volume": "", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "Fahlman, S. E., Hinton, G. E. and Sejnowski, T. J. (1983). \"Massively Parallel Architectures for Ah NFTL, Thistle, and Boltzmann Machines,\" AAAI.83 Conference Proceedings.", "links": null }, "BIBREF5": { "ref_id": "b5", "title": "Connectionist Models and Their Properties", "authors": [ { "first": "A", "middle": [ "J" ], "last": "Feldman", "suffix": "" }, { "first": "D", "middle": [], "last": "Ballard", "suffix": "" }, { "first": "", "middle": [], "last": "It", "suffix": "" } ], "year": 1982, "venue": "Cognitive Science", "volume": "6", "issue": "", "pages": "205--254", "other_ids": {}, "num": null, "urls": [], "raw_text": "Feldman. J, A. and Ballard, D. It. (1982). 
\"Connection- ist Models and Their Properties,\" Cognitive Science 6, 205-254.", "links": null }, "BIBREF6": { "ref_id": "b6", "title": "The Nature of Syntactic Representation", "authors": [ { "first": "G", "middle": [], "last": "Gazdar", "suffix": "" } ], "year": 1982, "venue": "Reidel", "volume": "", "issue": "", "pages": "131--186", "other_ids": {}, "num": null, "urls": [], "raw_text": "Gazdar, G. (1982). \"Phrase Structure Grammar,\" The Nature of Syntactic Representation, Jacubson and Pullum, eds., Reidel, Boston, 131 -186.", "links": null }, "BIBREF7": { "ref_id": "b7", "title": "Activation-Based Parsi.g", "authors": [ { "first": "", "middle": [ "M A" ], "last": "Jones", "suffix": "" } ], "year": 1983, "venue": "8th IJCAI", "volume": "", "issue": "", "pages": "678--682", "other_ids": {}, "num": null, "urls": [], "raw_text": "Jones. M. A.. (1983). \"Activation-Based Parsi.g.\" 8th IJCAI, Karlsruhe, W. Germany, 678-682.", "links": null }, "BIBREF8": { "ref_id": "b8", "title": "A. (forthcoming). submitted for publication", "authors": [ { "first": "M", "middle": [], "last": "Jones", "suffix": "" } ], "year": null, "venue": "", "volume": "", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "Jones, M.A. (forthcoming). submitted for publication.", "links": null }, "BIBREF9": { "ref_id": "b9", "title": "Logic for Natural Language Analysis", "authors": [ { "first": "", "middle": [ "M P" ], "last": "Marcus", "suffix": "" } ], "year": 1980, "venue": "A Theory of S),ntactic Recogni. lion for Natural L,znguage, M IT Press", "volume": "", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "Marcus. M. P. (1980). A Theory of S),ntactic Recogni. lion for Natural L,znguage, M IT Press, Cambridge. Pereira. F. (1983). \"Logic for Natural Language Analysis,\" technical report 275, SRI International. Menlo Park.", "links": null }, "BIBREF10": { "ref_id": "b10", "title": "Constraints on Variables.in Syntax", "authors": [ { "first": "J", "middle": [ "R" ], "last": "Ross", "suffix": "" } ], "year": 1967, "venue": "", "volume": "", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "Ross, J. R. (1967). Constraints on Variables.in Syntax, unpublished Ph.D. thesis, MIT, Cambridge.", "links": null }, "BIBREF11": { "ref_id": "b11", "title": "Massively Parallel Parsing: A Strongly Interactive Model of Natural Language Interpretation", "authors": [ { "first": "", "middle": [ "D L" ], "last": "Waltz", "suffix": "" }, { "first": "J", "middle": [ "B" ], "last": "Pollack", "suffix": "" } ], "year": 1985, "venue": "Cognitive Science", "volume": "9", "issue": "", "pages": "51--74", "other_ids": {}, "num": null, "urls": [], "raw_text": "Waltz. D. L. and Pollack, J. B. (1985). \"Massively Parallel Parsing: A Strongly Interactive Model of Natural Language Interpretation,\" Cognitive Science, 9, 51-74.", "links": null } }, "ref_entries": { "FIGREF0": { "num": null, "type_str": "figure", "text": "Messages are relatively simple. They have an associated time, strength, and purpose (e.g., to post an expectation). They do not encode complex structures such as entire binding lists, parse trees, feature lists, or meaning representations, z Consequently, no structure is explicitly built; the \"result\" of a computation consists entirely of the activation trace and the new state of the network.", "uris": null }, "FIGREF1": { "num": null, "type_str": "figure", "text": "an artificial', but comprehensive example of an APN grammar in graphical form. 
The grammar generates the strings--a, b. acd. ace. bed. bee. fg and gland illustrates mapy of the pattern language features and grammar writing paradigms. The network responds to $ourcex which activate the network at its leaves. Activation messages spread '*upward\" through the network. At conjunctive nodes (seq and and), expectation messages are posted for the legal continuations of the pattern; inhibition messages are sent down previous links when new activations are recorded. P J~ Figure i. A Sample APN In parsing applications, partially instantiatcd nodes are viewed as phrase structure rules whose next constituent is expected. The sources primarily arise from exogenous 2. For \u2022 sit'tatar \u00a2oaaectioaett vnew, ~ F\u00a2ldman sad B#llard (1982) or Waltz ted Pollack (198S). A compemoa or markor patuns, value Imaan I \u2022ad uoreltricted melmzle pinball =yttm=t= i= ipvea ia Fahlmnm, Hlalal lad Scjnowl~ (IgS)).", "uris": null }, "FIGREF2": { "num": null, "type_str": "figure", "text": "pattern ::--binding-site (seq pattern ...) (and pattern ...) (or pattern ...) (? pattern) (+ binding.site) (. binding-site) binding-site ::--($ vat binding-pattern) binding.pauern ::--node I (and binding-pattern ...) I (or binding-pattern ...)Figure 7.. The APN Pattern Language An important distinction that the pattern language makes is in the synchronicity* of activation signals. The pattern (and ($ vl X) ($ v2 ]'3) require= that the activation from X and F emanate from distinct network sources, while the pattern ($ v (and X I\"3) insists that instances of X and Y are activated from the same source. In the ]. The enact chore= o( cq~s'acors in the pattern tan|up it t matewhat ~at= mine from the =!~=m~attma of the APN maciaa~.", "uris": null }, "FIGREF3": { "num": null, "type_str": "figure", "text": "(a). The labels on each node indicate the current activation level, repreu:nted as an integer between 0 and 9, inclusive. trace structzure after acd [~ple 3, Stalp=l in Parsing acd", "uris": null }, "FIGREF4": { "num": null, "type_str": "figure", "text": "A Grammar for Relative ClausesFigure 4defines a grammar with an NP hoic in a relative clause; other type, s of [eft-extraposition are handled analogously. Our treatment of relatives is adapted from C'homsky and Lasnik(1977). The movement rule for S is: (seq ($ vl (and Cutup Re/ (or Exog.src PRO-src)) ($ v2 (and Rel Rel.src S))). The rule restricts the first instance of Re/ to arise either from an exogenous relative pronoun such as which or from an endogenously generated (phonologically null) pronoun PRO. The second variable is satisfied when Rei,src simultaneously reactivates a trace of the Rel instance and inserts an NP-tracc into an S.It is instructive to consider how phonologically null pronouns are inserted before we discuss how movement occurs by trace insertion. The phrase, [NP the mouse [~ PRO=\" that ...]], illustrates how a relative pronoun PRO is inserted.Figure 5(a) shows the network after parsing the cat. When the complementizer that appears next in the input, PRO-src receives inhibition (marked by downward arrows in Figure 5(b)) from Rel.CompO. Non-exogenous", "uris": null }, "FIGREF5": { "num": null, "type_str": "figure", "text": "-(d) illustrate various stages in parsing the phraso, [/vp the cat [~\" whichi [$ tl ranll], in", "uris": null }, "FIGREF6": { "num": null, "type_str": "figure", "text": "Uhle tO r~-.~,-~.--i \u00a2oeskJs~ttmsJ wb~t rg~lto tO e.lp~t~om q~t~nfftb~ tJr\u2022~ tm heud ia s ~.tr tlmt ~ nemmtg. 
trace structure after the cat PRO that. Figure 5. Relative Pronoun Insertion", "uris": null }, "TABREF0": { "content": "
4. The current APN model allocates sources sequentially. The term synchronicity reflects the fact that the identity of the sources can be locally compared from their time of origin [...]. For relatively independent inputs, overlap may not pose a problem.
", "text": "the ,ctJvuua= pmau= rims [=== a~ugb to coacli~aa the network bmmi, mai my, m0 scuvatmm. Alua'aaUvety, a,:Uvalma melal~ covid emV tl~ mmr\u00a2~ ideatiW =t as a4di,t*...-t l~ram, et~n. ia tl~ csm. m=Jme", "num": null, "type_str": "table", "html": null }, "TABREF2": { "content": "
The self-inhibition of a source occurs in other contexts in the APN framework, even for exogenous sources. In networks that contain left-recursive cycles or ambiguous attachments (e.g., PP attachment), self-inhibition can arise naturally as the result of non-deterministic re-activation of a self-inhibited source, which effectively preserves the non-synchronicity that is present.
7. The work of
", "text": "Margin (1980) iain tJ~tm~&l~t. sentence dominated by an NP with a lexLcal head noun may be moved out of that NP by a transformation.\" To see why this constraint is enforced, consider the two kinds of sentences that an NP with a lexical head noun might dominate. If the embedded sentence is a relative clause, as in. [pip the rat [~\" whichl [$ the cat [~\" whichj [S fj chased/I]] likes fish]J], then Rel.src cannot support both traces. If the embedded sentence is a noun complement (not shown in", "num": null, "type_str": "table", "html": null } } } }