{ "paper_id": "P93-1027", "header": { "generated_with": "S2ORC 1.0.0", "date_generated": "2023-01-19T08:52:19.514839Z" }, "title": "ON THE DECIDABILITY OF FUNCTIONAL UNCERTAINTY*", "authors": [ { "first": "Rolf", "middle": [], "last": "Backofen", "suffix": "", "affiliation": {}, "email": "backofen@dfki.uni-sb.de" } ], "year": "", "venue": null, "identifiers": {}, "abstract": "We show that feature logic extended by functional uncertainty is decidable, even if one admits cyclic descriptions. We present an algorithm that solves feature descriptions containing functional uncertainty in two phases, both phases using a set of deterministic and non-deterministic rewrite rules. We then compare our algorithm with that of Kaplan and Maxwell, which does not cover cyclic feature descriptions.", "pdf_parse": { "paper_id": "P93-1027", "_pdf_hash": "", "abstract": [ { "text": "We show that feature logic extended by functional uncertainty is decidable, even if one admits cyclic descriptions. We present an algorithm that solves feature descriptions containing functional uncertainty in two phases, both phases using a set of deterministic and non-deterministic rewrite rules. We then compare our algorithm with that of Kaplan and Maxwell, which does not cover cyclic feature descriptions.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Abstract", "sec_num": null } ], "body_text": [ { "text": "Feature logic is the main device of unification grammars, the currently predominant paradigm in computational linguistics. More recently, feature descriptions have been proposed as a constraint system for logic programming (e.g. see [11]). They provide for partial descriptions of abstract objects by means of functional attributes called features.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Introduction", "sec_num": null }, { "text": "Formalizations of feature logic have been proposed in various forms (for more details see [3] in this volume). 
We will follow the logical approach introduced by Smolka [9, 10] , where feature descriptions are standard first order formulae interpreted in first order structures. In this formalization features are considered as functional relations. Atomic formulae (which we will call atomic constraints) are either of the form A(x) or xfy, where x, y are first order variables, A is some sort predicate and f is a feature (written in infix notation). The constraints of the form xfy can be generalized to constraints of the form xwy, where w = f1...fn is a finite feature path. This does not affect the computational properties.", "cite_spans": [ { "start": 90, "end": 93, "text": "[3]", "ref_id": "BIBREF2" }, { "start": 168, "end": 171, "text": "[9,", "ref_id": "BIBREF8" }, { "start": 172, "end": 175, "text": "10]", "ref_id": "BIBREF9" } ], "ref_spans": [], "eq_spans": [], "section": "Introduction", "sec_num": null }, { "text": "In this paper we will be concerned with an extension to feature descriptions which has been introduced as \"functional uncertainty\" by Kaplan and Zaenen [7] and Kaplan and Maxwell [5] . This formal device plays an important role in the framework of LFG in modelling so-called long distance dependencies and constituent coordination. 
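The functional reading of features can be made concrete. Below is a minimal sketch (our own illustration, not notation from the paper): features are modelled as nested dicts, and a path constraint xwy is checked by following the features of w one by one. The graph and helper name are hypothetical.

```python
# Features as functional attributes, modelled as nested dicts; a constraint
# x w y is checked by following the features of the path w from x.

def follow(node, path):
    """Follow a feature path (a tuple of feature names) from a node;
    returns the reached value, or None if some feature is undefined."""
    for f in path:
        if not isinstance(node, dict) or f not in node:
            return None
        node = node[f]
    return node

# Hypothetical feature graph for illustration only.
x = {"subj": {"num": "sg", "pers": "3"}, "obj": {"num": "pl"}}

assert follow(x, ("subj", "num")) == "sg"   # x subj.num y holds with y = "sg"
assert follow(x, ("obj", "pers")) is None   # undefined path: constraint fails
```

Because each feature is functional, following a defined path always yields exactly one value, which is what makes the constraints xwy deterministic.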
For a detailed linguistic motivation see [7] , [6] and [5] ; a more general use of functional uncertainty can be found in [8] .", "cite_spans": [ { "start": 153, "end": 156, "text": "[7]", "ref_id": "BIBREF6" }, { "start": 180, "end": 183, "text": "[5]", "ref_id": "BIBREF4" }, { "start": 374, "end": 377, "text": "[7]", "ref_id": "BIBREF6" }, { "start": 380, "end": 383, "text": "[6]", "ref_id": "BIBREF5" }, { "start": 388, "end": 391, "text": "[5]", "ref_id": "BIBREF4" }, { "start": 455, "end": 458, "text": "[8]", "ref_id": "BIBREF7" } ], "ref_spans": [], "eq_spans": [], "section": "Introduction", "sec_num": null }, { "text": "Functional uncertainty consists of constraints of the form xLy, where L is a finite description of a regular language of feature paths. (*This work was supported by a research grant, ITW 9002 0, from the German Bundesministerium für Forschung und Technologie to the DFKI project DISCO. I would like to thank Jochen Dörre, Joachim Niehren and Ralf Treinen for reading draft versions of this paper. For space limitations most of the proofs are omitted; they can be found in the complete paper [2].) A constraint xLy holds if there is a path w ∈ L such that xwy holds. Under this existential interpretation, a constraint xLy can be seen as the disjunction ∨{xwy | w ∈ L}.", "cite_spans": [ { "start": 404, "end": 407, "text": "[2]", "ref_id": "BIBREF1" } ], "ref_spans": [], "eq_spans": [], "section": "Introduction", "sec_num": null }, { "text": "Certainly, this disjunction may be infinite; thus functional uncertainty yields additional expressivity. 
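The existential interpretation can be sketched operationally (our own illustration, with hypothetical names): search for some path w from x to y whose feature string lies in the regular language L, here given as a Python regex over one-letter features. The bounded enumeration only approximates the infinite disjunction, but it shows the intended semantics.

```python
import re

def holds_xLy(x, y, L_regex, max_depth=8):
    """Existential check for a functional-uncertainty constraint x L y:
    is there a feature path w in L with x w y?  Paths are enumerated
    up to max_depth, so this is only a sketch, not a decision procedure."""
    lang = re.compile(L_regex)
    stack = [(x, "")]
    while stack:
        node, w = stack.pop()
        if node is y and w and lang.fullmatch(w):
            return True
        if isinstance(node, dict) and len(w) < max_depth:
            for f, child in node.items():
                stack.append((child, w + f))
    return False

# Hypothetical feature graph: y is reachable from x only via the path ffg.
y = object()
x = {"f": {"f": {"g": y}}}

assert holds_xLy(x, y, "f*g")     # ffg is in f*g, so x (f*g) y holds
assert not holds_xLy(x, y, "g")   # no path in {g} reaches y
```

The depth bound also keeps the search finite on cyclic graphs, which is exactly the situation where the naive disjunctive expansion of xLy breaks down.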
Note that a constraint xwy is a special case of a functional uncertainty constraint.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Introduction", "sec_num": null }, { "text": "To see some possible applications of functional uncertainty we briefly recall an example that is given in Kaplan and Maxwell [5]. Kaplan and Maxwell [5] have shown that consistency of feature descriptions is decidable, provided that a certain acyclicity condition is met. More recently, Baader et al. [1] have proven that consistency is not decidable if we add negation. But it is an open problem whether consistency of feature descriptions without negation and without additional restrictions (such as acyclicity) is decidable. In the work presented here we show that it indeed is decidable.", "cite_spans": [ { "start": 105, "end": 111, "text": "Kaplan", "ref_id": null }, { "start": 131, "end": 134, "text": "[5]", "ref_id": "BIBREF4" }, { "start": 283, "end": 286, "text": "[1]", "ref_id": "BIBREF0" } ], "ref_spans": [], "eq_spans": [], "section": "Introduction", "sec_num": null }, { "text": "We will first briefly describe the main part of solving standard feature terms and then turn to their extension with functional uncertainty.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "The Method", "sec_num": "2" }, { "text": "(from now on we will refer to pure conjunctive formulae as clauses). A standard method for solving feature terms would rewrite φ in order to achieve a solved form. This rewriting depends on the paths p1 and p2. If p1 equals p2, we know that y1 and y2 must be equal. This implies that φ is equivalent to xp1y1 ∧ y1 ≐ y2. If p1 is a prefix of p2, and hence p2 = p1p′, we can transform φ equivalently into the formula xp1y1 ∧ y1p′y2.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Consider a clause φ = xp1y1 ∧ xp2y2", "sec_num": null }, { "text": "The reverse case is treated in a similar fashion. 
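The rewriting step just described can be sketched as code (our own illustration; the helper names and the tuple encoding of constraints are hypothetical): classify the relation between two concrete paths, then rewrite the pair of constraints accordingly.

```python
# Sketch of the standard solving step for {x p1 y1, x p2 y2}, with paths
# written as strings of one-letter features.

def path_relation(u, v):
    """Trichotomy on paths: 'equal', 'prefix' (u is a proper prefix of v),
    'suffix' (v is a proper prefix of u), or 'diverge'."""
    if u == v:
        return "equal"
    if v.startswith(u):
        return "prefix"
    if u.startswith(v):
        return "suffix"
    return "diverge"

def rewrite(p1, y1, p2, y2):
    """Rewrite the two constraints according to the relation of p1 and p2."""
    rel = path_relation(p1, p2)
    if rel == "equal":
        return [("x", p1, y1), ("eq", y1, y2)]          # y1 and y2 must be equal
    if rel == "prefix":
        return [("x", p1, y1), (y1, p2[len(p1):], y2)]  # p2 = p1 p': add y1 p' y2
    if rel == "suffix":
        return [("x", p2, y2), (y2, p1[len(p2):], y1)]  # the reverse case
    return [("x", p1, y1), ("x", p2, y2)]               # diverging paths: solved

assert path_relation("fg", "fh") == "diverge"
assert rewrite("f", "y1", "fg", "y2") == [("x", "f", "y1"), ("y1", "g", "y2")]
```

For concrete paths this relation is always uniquely determined; the whole difficulty of functional uncertainty, discussed next, is that for regular languages of paths it is not.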
If neither prefix nor equality holds between the paths, there is nothing to be done. By and large, clauses where this holds for every x and every pair of different constraints xp1y and xp2z are the solved forms in Smolka [9] , which are consistent.", "cite_spans": [ { "start": 270, "end": 273, "text": "[9]", "ref_id": "BIBREF8" } ], "ref_spans": [], "eq_spans": [], "section": "Consider a clause φ = xp1y1 ∧ xp2y2", "sec_num": null }, { "text": "If we consider a clause of the form φ = xL1y1 ∧ xL2y2, then we again have to check the relation between y1 and y2. But now there is in general no unique relation determined by φ, since this depends on which paths p1 and p2 we choose out of L1 and L2. Hence, we have to guess the relation between p1 and p2 before we can calculate the relation between y1 and y2. However, there is a problem with the original syntax, namely that it does not allow one to express any relation between the chosen paths (in a later section we will compare our algorithm to the one of Kaplan/Maxwell, thus showing where exactly the problem occurs in their syntax). Therefore, we extend the syntax by introducing so-called path variables (written α, β, α′, ...), which are interpreted as feature paths (we will call the other variables first order variables). Hence, in the modified syntax a constraint xLy is written as {xαy, α ∈ L}, where α is a new path variable. By using this extended (two-sorted) syntax we are now able to reason about the relations between different path variables. In doing so, we introduce additional constraints α ≐ β (equality), α ≺ β (prefix) and α ∥ β (divergence). Divergence holds if neither equality nor prefix holds. Now we can describe a normal form equivalent to the solved clauses in Smolka's work, which we will call pre-solved clauses. 
A clause φ is pre-solved iff for each pair of different constraints xαy1 and xβy2 in φ there is a constraint α ∥ β in φ.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Consider a clause φ = xp1y1 ∧ xp2y2", "sec_num": null }, { "text": "We call these clauses pre-solved, since such clauses are not necessarily consistent. It may happen that the divergence constraints together with the restrictions of the form α ∈ L are inconsistent (e.g. think of the clause α ∈ f+ ∧ β ∈ ff+ ∧ α ∥ β). But pre-solved clauses have the property that if we find a valuation for the path variables, then the clause is consistent.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Consider a clause φ = xp1y1 ∧ xp2y2", "sec_num": null }, { "text": "Our algorithm first transforms a clause into a set of pre-solved clauses, which is (seen as a disjunction) equivalent to the initial clause. In a second phase the pre-solved clauses are checked for consistency with respect to the path variables. In this paper we will concentrate on the first phase, since it is the more difficult one.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Consider a clause φ = xp1y1 ∧ xp2y2", "sec_num": null }, { "text": "Before looking at the technical part we will illustrate the first phase. For the rest of the paper we will write clauses as sets of atomic constraints. Now consider the clause γ = {xαy, α ∈ L1, xβz, β ∈ L2}, and let γ1, γ2, γ3 and γ4 be the clauses obtained from γ by adding α ∥ β, α ≐ β, α ≺ β and β ≺ α, respectively. The clause γ1 is pre-solved. For the others we have to evaluate the relation between α and β, which is done as follows. For γ2 we substitute β by α and z by y, which yields {y ≐ z, xαy, α ∈ L1, α ∈ L2}.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Consider a clause φ = xp1y1 ∧ xp2y2", "sec_num": null }, { "text": "We keep only the equality constraint for the first order variables, since we are only interested in their valuation. 
Combining {α ∈ L1, α ∈ L2} into {α ∈ (L1 ∩ L2)} then gives us the equivalent pre-solved clause", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Consider a clause φ = xp1y1 ∧ xp2y2", "sec_num": null }, { "text": "For γ3 we know that the variable β can be split into two parts, the first of them covered by α. We can use concatenation of path variables to express this, i.e. we can replace β by the term α.β′, where β′ is new. Thus we get the clause", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Consider a clause φ = xp1y1 ∧ xp2y2", "sec_num": null }, { "text": "The only thing that we have to do additionally in order to achieve a pre-solved clause is to resolve the constraint α.β′ ∈ L2. To do this we have to guess a so-called decomposition P, S of L2 with P.S ⊆ L2", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "γ3′ = {xαy, α ∈ L1, yβ′z, α.β′ ∈ L2},", "sec_num": null }, { "text": "such that α ∈ P and β′ ∈ S. In general, there can be an infinite number of decompositions (think of the possible decompositions of the language f*g). But as we use regular languages, there is a finite set of regular decompositions covering all possibilities. Finally, reducing {α ∈ L1, α ∈ P} to {α ∈ (L1 ∩ P)} will yield a pre-solved clause.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "γ3′ = {xαy, α ∈ L1, yβ′z, α.β′ ∈ L2},", "sec_num": null }, { "text": "Note that the evaluation of the prefix relation in γ3 has the additional effect of introducing a new constraint yβ′z. This implies that there again may be some path variables whose relation is unknown. Hence, after reducing the constraints of the form α ≐ β or α ≺ β we may have to repeat the non-deterministic choice of relations between path variables. 
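One standard way to obtain the finite set of regular decompositions (our own sketch under stated assumptions, not the paper's construction) is to take, for each state q of an automaton for L, the pair P_q (non-empty words leading from the start state to q) and S_q (non-empty words leading from q to a final state); then P_q·S_q ⊆ L, and every split of a word in L is covered by one such pair. The automaton below for the example language f*g is our own.

```python
# DFA for the example language f*g: state 0 loops on f, reads g into state 1.
DELTA = {(0, "f"): 0, (0, "g"): 1}
START, FINALS = 0, {1}

def run(state, w):
    """Run the automaton on word w from the given state (None = stuck)."""
    for c in w:
        state = DELTA.get((state, c))
        if state is None:
            return None
    return state

def in_P(q, w):  # membership in P_q: w is non-empty and leads START -> q
    return w != "" and run(START, w) == q

def in_S(q, w):  # membership in S_q: w is non-empty and leads q -> final
    return w != "" and run(q, w) in FINALS

# Any non-trivial split w1 w2 of a word in f*g is covered by the pair
# (P_q, S_q) for the state q reached after reading w1:
w1, w2 = "ff", "fg"
q = run(START, w1)
assert run(START, w1 + w2) in FINALS   # fffg is in f*g
assert in_P(q, w1) and in_S(q, w2)     # so (P_q, S_q) covers this split
```

Since the automaton has only finitely many states, this yields only finitely many pairs (P_q, S_q), which is why a finite set of regular decompositions can cover the infinitely many word-level splits.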
In the end, the only remaining constraints between path variables will be of the form α ∥ β.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "γ3′ = {xαy, α ∈ L1, yβ′z, α.β′ ∈ L2},", "sec_num": null }, { "text": "We have to consider an additional point, namely that the rules we present will (naturally) loop in some cases. Roughly speaking, one can say that this always occurs if a cycle in the graph coincides with a cycle in the regular language. To see this, let us vary the above example by turning xαy into the cyclic constraint xαx. However, we will prove that the rule system is quasi-terminating, which means that the rule system may cycle, but produces only finitely many different clauses (see [4] ). This means that checking for cyclic derivations will give us an effective algorithm.", "cite_spans": [ { "start": 474, "end": 477, "text": "[4]", "ref_id": "BIBREF3" } ], "ref_spans": [], "eq_spans": [], "section": "γ3′ = {xαy, α ∈ L1, yβ′z, α.β′ ∈ L2},", "sec_num": null }, { "text": "Quasi-termination is achieved by the following measures: first, we will guarantee that the rules do not introduce additional variables; second, we restrict concatenation to length 2; and third, we will show that the rule system produces only finitely many regular languages. In order to show that our rewrite system is complete, we also have to show that every solution can be found in a pre-solved clause.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "γ3′ = {xαy, α ∈ L1, yβ′z, α.β′ ∈ L2},", "sec_num": null }, { "text": "Our signature consists of a set of sorts S (A, B, ...), first order variables X (x, y, ...), path variables P (α, β, ...) and features F (f, g, ...). We will assume a finite set of features and infinite sets of variables and sorts. A path is a finite string of features. A path u is a prefix of a path v (written u ≺ v) if there is a non-empty path w such that v = uw. Note that ≺ is neither symmetric nor reflexive. 
Two paths u, v diverge (written u ∥ v) if there are features f, g with f ≠ g and possibly empty paths w, w1, w2 such that u = wfw1 and v = wgw2. Clearly, ∥ is a symmetric relation.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Preliminaries", "sec_num": "3" }, { "text": "= v, u ≺ v, v ≺ u or u ∥ v holds.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Proposition 3.1 Given two paths u and v, then exactly one of the relations u", "sec_num": null }, { "text": "A path term (p, q, ...) is either a path variable α or a concatenation of path variables α.β. We will allow complex path terms only in divergence constraints and not in prefix or equality constraints. Hence, the set of atomic constraints is given by A(x), x ≐ y, xpy, p ∈ L, p ∥ q, α ≺ β and α ≐ β. We exclude empty paths in subterm agreement since xεy is equivalent to x ≐ y. Therefore, we require f1...fn ∈ F+ and L ⊆ F+. A clause is a finite set of atomic constraints denoting their conjunction. We will say that a path term α.β is contained (or used) in some clause φ if φ contains either a constraint α.β ∈ L or a constraint α.β ∥ q. Constraints of the form p ∈ L, p ∥ q, α ≺ β and α ≐ β will be called path constraints.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Proposition 3.1 Given two paths u and v, then exactly one of the relations u", "sec_num": null }, { "text": "An interpretation I is a standard first order structure, where every feature f ∈ F is interpreted as a binary, functional relation f^I and where sort symbols", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Proposition 3.1 Given two paths u and v, then exactly one of the relations u", "sec_num": null }, { "text": "We will not differentiate between p ∥ q and q ∥ p.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Proposition 3.1 Given two paths u and v, then exactly one of the relations u", "sec_num": null }, { "text": "are interpreted as unary, disjoint predicates (hence A^I ∩ B^I = ∅ for A 
≠ B). A valuation is a pair (VX, VP), where VX is a standard first order valuation of the variables in X and VP is a function VP : P → F+. We define VP(α.β) to be VP(α)VP(β),", "cite_spans": [], "ref_spans": [ { "start": 98, "end": 107, "text": "(VX, VP)", "ref_id": null } ], "eq_spans": [], "section": "Proposition 3.1 Given two paths u and v, then exactly one of the relations u", "sec_num": null }, { "text": "The validity of an atomic constraint in an interpretation I under a valuation (VX, VP) is defined as follows: For a set χ ⊆ X we define =χ to be the following relation on first order valuations:", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Proposition 3.1 Given two paths u and v, then exactly one of the relations u", "sec_num": null }, { "text": "Similarly, we define =π with π ⊆ P for path valuations. Let Θ ⊆ X ∪ P be a set of variables. Recall that we have switched from the original syntax to a (two-sorted) syntax by translating constraints xLy into {xαy, α ∈ L}, where α is new. The result of the translation constitutes a special class of clauses, namely the class of prime clauses, which will be defined below. Hence, it suffices to show decidability of consistency of prime clauses. They are the input clauses for the first phase. Let φ be some clause and x, y be different variables. We say that φ binds y to x if x ≐ y ∈ φ and y occurs only once in φ. Here it is important that we consider equations as directed, i.e. we assume that x ≐ y is different from y ≐ x. We say that φ eliminates y if φ binds y to some variable x. A clause is called basic if 1.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "VX =χ V′X iff ∀x ∈ χ : VX(x) = V′X(x).", "sec_num": null }, { "text": "x ≐ y appears in φ iff φ eliminates y, 2. For every path variable α used in φ there is at most one constraint xαy ∈ φ. 
Now let's turn to the rule system. As we have explained informally, the first rule non-deterministically adds relational constraints between path variables. In one step we will add the relations between one fixed variable α and all other path variables β which are used under the same node x as α. Furthermore, we will consider only the constraints α ≐ β, α ∥ β and α ≺ β, and not additionally the constraint α ≻ β.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "VX =χ V′X iff ∀x ∈ χ : VX(x) = V′X(x).", "sec_num": null }, { "text": "EQUATION", "cite_spans": [], "ref_spans": [], "eq_spans": [ { "start": 0, "end": 8, "text": "EQUATION", "ref_id": "EQREF", "raw_str": "(Join) {α ∈ L, α ∈ L′} ∪ φ ⟶ {α ∈ (L ∩ L′)} ∪ φ if L ≠ L′", "eq_num": "(" } ], "section": "VX =χ V′X iff ∀x ∈ χ : VX(x) = V′X(x).", "sec_num": null }, { "text": "For better readability we will use pseudo-code for describing this rule (using the usual don't care/don't know distinction for non-determinism):", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "VX =χ V′X iff ∀x ∈ χ : VX(x) = V′X(x).", "sec_num": null }, { "text": "(PathRel) Choose x ∈ VarsX(φ) (don't care)", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "VX =χ V′X iff ∀x ∈ χ : VX(x) = V′X(x).", "sec_num": null }, { "text": "Choose xαy ∈ φ (don't know)", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "VX =χ V′X iff ∀x ∈ χ : VX(x) = V′X(x).", "sec_num": null }, { "text": "For each xβz ∈ φ with α ≠ β and α ∥ β ∉ φ add α δ β with δ ∈ {≐, ≺, ∥} (don't know)", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "VX =χ V′X iff ∀x ∈ χ : VX(x) = V′X(x).", "sec_num": null }, { "text": "\"don't care non-determinism\" means that one is free to choose an arbitrary alternative at this choice point, whereas \"don't know\" means that one has to consider every alternative in parallel (i.e. 
for every alternative of the don't care non-determinism a clause φ is equivalent to the set of all don't know alternatives that can be generated by applying the rule to φ). Note that the order of rule application is another example of don't care non-determinism in our rule system.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "VX =χ V′X iff ∀x ∈ χ : VX(x) = V′X(x).", "sec_num": null }, { "text": "Although we have restricted the relations δ to {≐, ≺, ∥}, this rule is globally preserving since we have non-deterministically chosen xαy. To see this let φ be a clause, I be an interpretation and (VX, VP) be a valuation in I with (VX, VP) ⊨I φ. To find an instance of (PathRel) such that (VX, VP) ⊨I γ, where γ is the result of applying this instance, we choose", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "VX =χ V′X iff ∀x ∈ χ : VX(x) = V′X(x).", "sec_num": null }, { "text": "xαy ∈ φ such that VP(α) is prefix-minimal in {VP(β) | xβz ∈ φ}.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "VX =χ V′X iff ∀x ∈ χ : VX(x) = V′X(x).", "sec_num": null }, { "text": "Then for each xβz ∈ φ with α ≠ β and α ∥ β ∉ φ we add α δ β where VP(α) δ VP(β) holds. Note that δ equal to ≻ will not occur, since we have chosen a path variable α whose interpretation is prefix-minimal. Therefore, the restriction δ ∈ {≐, ≺, ∥} is satisfied.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "VX =χ V′X iff ∀x ∈ χ : VX(x) = V′X(x).", "sec_num": null }, { "text": "We have defined (PathRel) in a very special way. The reason is that only by using this special definition can we maintain the condition that concatenation of path variables is restricted to binary concatenation. E.g. assume that we had added both β1 ≺ α and α ≺ β2 to a clause γ. Then first splitting up the variable α into β1.α′ and then β2 into α.β2′ would result in a substitution of β2 in γ by β1.α′.β2′. 
By the definition of (PathRel) we have ensured that this does not occur.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "VX =χ V′X iff ∀x ∈ χ : VX(x) = V′X(x).", "sec_num": null }, { "text": "The second non-deterministic rule is used in the decomposition of regular languages. For decomposition we have the following rules: The clash rule is needed since we require regular languages not to contain the empty path. The remaining rules are listed in Figure 1 .", "cite_spans": [], "ref_spans": [ { "start": 257, "end": 265, "text": "Figure 1", "ref_id": "FIGREF4" } ], "eq_spans": [], "section": "VX =χ V′X iff ∀x ∈ χ : VX(x) = V′X(x).", "sec_num": null }, { "text": "We use Λ in (LangDecΛ) as a global restriction, i.e. for every Λ we get a different rule (LangDecΛ) (and hence a different rule system RΛ). This is done to make the rule system quasi-terminating: by restricting (LangDecΛ) we can guarantee that only finitely many regular languages are produced.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "VX =χ V′X iff ∀x ∈ χ : VX(x) = V′X(x).", "sec_num": null }, { "text": "For (LangDecΛ) to be globally preserving we need to find a suitable pair P, S in Λ for every possible valuation of α and β. Therefore, we require Λ to satisfy ∀L ∈ Λ, ∀w1, w2 ≠ ε : [w1w2 ∈ L ⟹", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "VX =χ V′X iff ∀x ∈ χ : VX(x) = V′X(x).", "sec_num": null }, { "text": "We will call Λ closed under decomposition if it satisfies this condition. Additionally we have to ensure that L ∈ Λ for every L that is contained in some clause φ. We will call such a set Λ φ-closed. Surely, we will not find a finite Λ that is closed under decomposition and φ-closed for arbitrary φ. But the next lemma states a weaker condition that suffices. We say that γ is a (φ, RΛ)-derivative if γ is derivable from φ by using only rules from RΛ. If RΛ is clear from the context, we will just say that γ is a φ-derivative. 
", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "∃P, S ∈ Λ : (P.S ⊆ L ∧ w1 ∈ P ∧ w2 ∈ S)].", "sec_num": null }, { "text": "then Λ is γ-closed for all (φ, RΛ)-derivatives γ.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "If Λ is φ-closed and closed under intersection,", "sec_num": "1." }, { "text": "The proof of this lemma (containing the construction of the set Λ) can be found in the appendix.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "For every prime clause φ there is a finite Λ such that Λ is φ-closed and closed under intersection and decomposition.", "sec_num": "2." }, { "text": "The rule system serves as an algorithm to transform a prime clause into an equivalent set of pre-solved clauses. The rules are applied in arbitrary order until a pre-solved clause has been derived. If one of the non-deterministic rules is applied, a clause is substituted by a whole set of clauses, one for each of the don't know alternatives. Since the rule system is quasi-terminating, we may encounter cycles during the application of the rules. In this case we skip the corresponding alternative, since every pre-solved clause that can be produced via a cyclic derivation can also be produced via a derivation that does not contain a cycle. To prove this theorem we have to show that the rule system is sound and complete. Sound means that we do not add new solutions during the processing, whereas complete means that we find all solutions in the set of pre-solved derivatives.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Completeness and Quasi-Termination", "sec_num": "4.2" }, { "text": "For the completeness it normally suffices to show that (1) every rule preserves (or globally preserves) the initial solutions and (2) the pre-solved clauses are exactly the RΛ-irreducible clauses (i.e. if a clause is not pre-solved, then one rule applies). 
But in our case this is not sufficient, as the rule system is quasi-terminating. A prime clause φ may have a solution VX which is a solution of all (φ, RΛ)-derivatives in some cyclic derivation, but cannot be found in any pre-solved (φ, RΛ)-derivative. We have to show that this cannot happen. Since this part of the proof is unusual, we will explain the main idea (see the appendix for a more detailed outline of the proofs).", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Completeness and Quasi-Termination", "sec_num": "4.2" }, { "text": "Let φ be some (consistent) prime clause and let VX ∈ [φ]I for some I. Then there exists a path valuation VP such that (VX, VP) ⊨I φ. We will find a pre-solved φ-derivative that has VX as a solution by", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Completeness and Quasi-Termination", "sec_num": "4.2" }, { "text": "imposing an additional control that depends on VP. This control will guarantee (1) finiteness of derivations, (2) that each derivation ends with a pre-solved clause, and (3) that the initial solution is a solution of every clause that is derivable under this control. Since the (Pre) rule does not preserve the initial path valuation VP (recall that the variable β is substituted by the term α.β′), we have to change the path valuation VP every time (Pre) is applied. It is important to notice that this control is only used for proof purposes and is not part of the algorithm. For the algorithm it suffices to encounter all pre-solved φ-derivatives.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Completeness and Quasi-Termination", "sec_num": "4.2" }, { "text": "To understand this control, we will compare derivations in our syntax to derivations in standard feature logic. Recall that we have a two-level interpretation. Every time (Eq) or (Pre) is applied, we also have to rewrite φVP, the clause obtained by instantiating the path variables according to VP. 
Taking the above example, we are only allowed to add α ∥ β to φ (using (PathRel)), since φVP is already in pre-solved form. Now let's vary the example and let VP be a path valuation with VP(α) = f and VP(β) = ffg. Then we have to add α ≺ β in the first step, since this relation holds between α and β. The next step is to apply (Pre) on α ≺ β. Here we have to rewrite both φ and φVP. Hence, the new clauses φ1 and φVP are {xαx, α ∈ f, xβ′y, α.β′ ∈ f*g} and {xfx, xfgy} respectively. Note that the constraint xffgy has been reduced to xfgy by the application of (Pre).", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Completeness and Quasi-Termination", "sec_num": "4.2" }, { "text": "Since infinite derivations must use (Pre) infinitely often, this control guarantees that we find a pre-solved clause that has VX as a solution.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "Completeness and Quasi-Termination", "sec_num": "4.2" }, { "text": "The Second Phase Additionally, one has to consider the effects of introducing the path terms β.α′. The main part of this task is to resolve constraints of the form β.α′ ∥ α. There are two possibilities: Either α also has β as a prefix, in which case we have to add β ≺ α; or β is not a prefix of α, which means that we have to add α ∥ β. After doing this, the introduced prefix constraints have to be evaluated using (Pre). (In the appendix we present a solution which is more appropriate for proving termination.) 6 Kaplan and Maxwell's Method", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "5", "sec_num": null }, { "text": "We are now able to compare our method with the one used by Kaplan and Maxwell. Consider again the example we have used on page 2. The first three cases correspond exactly to the result of the (2) This is not the way their algorithm was originally described in [5] as they use a slightly different syntax. 
Furthermore, they don't use non-deterministic rules, but a single rule that produces a disjunction. However, the way we describe their method seems more appropriate for comparing both approaches. derivations that have been described for γ2, γ3 and γ4. By and large, the last case is achieved if we first add α ∥ β to γ and then turn over to the second phase as described in the last section.", "cite_spans": [ { "start": 59, "end": 65, "text": "Kaplan", "ref_id": null }, { "start": 232, "end": 235, "text": "[5]", "ref_id": "BIBREF4" } ], "ref_spans": [], "eq_spans": [], "section": "5", "sec_num": null }, { "text": "The problem with Kaplan/Maxwell's algorithm is that one has to introduce a new variable u in the last case, since there is no other possibility to express divergence. If their rule system is applied to a cyclic description, it will not terminate, as the last part introduces new variables. Hence it cannot be used for an algorithm in case of cyclic descriptions.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "5", "sec_num": null }, { "text": "The delaying of the evaluation of divergence constraints may not only be useful when applied to cyclic feature descriptions. As Kaplan and Maxwell pointed out, it is in general useful to postpone the consistency check for functional uncertainty. With the algorithm we have described it is also possible to delay single parts of the evaluation of constraints containing functional uncertainty. We will prove that Λ is also closed under decomposition. Given some L ∈ Λ and a word w = w1w2 ∈ L, we have to find an appropriate decomposition P, S in Λ. Since each L in Λ can be written as a finite intersection of languages from Λ0, we know that w = w1w2 is in each of them. The rule (PathRel) is X ∪ P-sound and globally X ∪ P-preserving. If Λ is closed under decomposition, then (LangDecΛ) is X ∪ P-sound and globally X ∪ P-preserving. The (Pre) rule is X-sound and X-preserving. 
All other rules are X ∪ P-sound and X ∪ P-preserving.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "5", "sec_num": null } ], "back_matter": [ { "text": "Next we will prove some syntactic properties of the clauses derivable by the rule system. For the rest of the paper we will call clauses that are derivable from prime clauses admissible. Proposition A.2 Every admissible clause is basic. If α ≺ β, α ≐ β or α ∥ β is contained in some admissible clause φ, then there is a variable x such that xαy and xβz is in φ. Note that (by this proposition) (Pre) (resp. (Eq)) can always be applied if a constraint α ≺ β (resp. α ≐ β) is contained in some admissible clause. The next lemma will show that different applications of (Pre) or (Eq) will not interact. This means the application of one of these rules to some prefix or path equality constraint will not change any other prefix or path equality constraint contained in the same clause. This is a direct consequence of the way (PathRel) was defined. Lemma A.3 Given two admissible clauses γ, γ′ with γ →r γ′ and r different from (PathRel). Then α ≐ β ∈ γ′ (resp. α ≺ β ∈ γ′) implies α ≐ β ∈ γ (resp. α ≺ β ∈ γ). Furthermore, if α.β is contained in γ′, then either α.β or α ≺ β is contained in γ. Note that this lemma implies that new path equality or prefix constraints are only introduced by (PathRel). We can derive from this lemma some syntactic properties of admissible clauses which are needed for proving completeness and quasi-termination. The first property will guarantee that concatenation does not occur in prefix or equality constraints and that the length of path concatenation is restricted to 2. The second property ensures that a constraint α.β ∥ β′ is always reducible. Theorem A.5 For every finite Λ the rule system RΛ is quasi-terminating. Proof. 
The rule system produces only finitely many different clauses since the rules introduce no additional variables or sort symbols and the set of used languages is finite. Additionally, the length of concatenation is restricted to 2. □

Lemma A.6 There are no infinite derivations using only finitely many instances of (Pre).

Since the rule system is quasi-terminating, the completeness proof consists of two parts. In the first part we will prove that the pre-solved clauses are exactly the irreducible clauses. In the second part we will show that one finds for each solution V_X of a prime clause φ a pre-solved derivative γ such that V_X is also a solution of γ, where pre-solved(φ, R_Λ) is the set of pre-solved (φ, R_Λ)-derivatives.

Proof (Sketch) We have to show that for each prime clause φ and each V_X, V_P, Z with (V_X, V_P) ⊨_Z φ there is a pre-solved (φ, R_Λ)-derivative γ such that V_X ∈ [γ]_Z. We will do this by controlling the derivation using the valuation (V_X, V_P). The control will guarantee finiteness of derivations and will maintain the first completeness property, namely that the irreducible clauses are exactly the pre-solved clauses. We allow only those instances of the non-deterministic rules (PathRel) and (LangDec_Λ) which preserve exactly the valuation (V_X, V_P). That means if (V_X, V_P) ⊨_Z φ and φ →_r γ for one of these rules, then (V_X, V_P) ⊨_Z γ must hold. Note that the control depends only on V_P. E.g. for the clause φ = {xαy, α ∈ L1, xβz, β ∈ L2} and arbitrary Z, V_X this means that if V_P(α) = f, V_P(β) = g and (V_X, V_P) ⊨_Z φ, the rule (PathRel) can transform φ only into {α ⋔ β} ∪ φ. If V_P satisfies V_P(α) ⋔ V_P(β) for α different from β with xαy ∈ φ and xβz ∈ φ, we cannot add any prefix constraint using this control. Hence, (Pre) cannot be applied, which implies (by Lemma A.6) that in this case there is no infinite controlled derivation.
We will call such path valuations prefix-free with respect to \u00a2.If V~, is not prefix-free, then (Pre) will be applied during the derivations. In this case we have to change the path valuation, since (Pre) is not P-preserving. If (Vx, V~) ~z \u00a2 = {a k 13} U \u00a2 and we apply (Pre) on cr -~ fl yielding 7, then the valuation V\u00a2 with v (13) = and = for # will satisfy (Vx, pz % We will use for controlling the further derivations.If we change the path valuation in this way, there will again be only finite derivations. To see this, note that every time (Pre) is applied and the path valuation is changed, the valuation of one variable is shortened by a non-empty path. As the number of variables used in clauses does not increase, this shortening can only be done finitely many times. This implies, that (Pre) can only finitely often be applied under this control. Hence (by lemma A.6), there are again only finite controlled derivations.1:3", "cite_spans": [ { "start": 253, "end": 267, "text": "[3 or c~ (I 13", "ref_id": null }, { "start": 463, "end": 472, "text": "[3 (resp.", "ref_id": null }, { "start": 1015, "end": 1022, "text": "[3 E 7)", "ref_id": null } ], "ref_spans": [], "eq_spans": [], "section": "annex", "sec_num": null }, { "text": "Consistency of Pre-Solved Clauses We will first do a minor redefinition of divergence. We say that two paths u, v are directly diverging (written u u0 v) if there are features f ~ g such that u E f/'* and v 6 g/'*. Then u n v holds if there are a possible empty prefix w and paths u', v' such that u = wu' and v = wC and u' n0 v'.We will reformulate the reduction of divergence constraints in order to avoid constraints of form a.fl fi fl'. Handling such constraints would make the termination proof somewhat complicated. For the reformulation we use a special property of pre-solved clauses, namely that a fi fl is in a pre-solved clause \u00a2 iff zay and zflz is in \u00a2. Hence, if a fi/? 
and ~ fi df is in \u00a2, then a Ii df is also in \u00a2. This implies, that we can write ep as fi together with the rules (LangDech), (Join) and (Empty) completes the rule system 7~ \u00b0iv. (Reds) is needed as path variables always denote non-empty paths. We will view (Redz) and (Red2) as one single rule (Reduce). A clause ~ is said to be solved if (1) a.fl ~ L and ot~0 is not in ep; (2) a~L1 in ep and a~L~ in ep implies Lz = L2; (3) \u00a2 does not contain constraints of form afl fl, a Ii0 fl, oL :< fl, or a -\" fl; and (4) for every {xay, z/~z} _C ~ with a \u00a2/? there are features f #g with {a~fLs,fl~gL2} _C \u00a2. It is easy to see that every solved clause is consistent. Note that every solved clause is also prime. Lemma A.9 The rules (Reduce) = (Redt) + (Reds) and (Solv) are X-sound and globally Xpreserving. Furthermore, 7~ s\u00b0lv is terminating. Lemma A.10 Let \u00a2 be a pre-soived clause. If A is e-closed, closed under intersection and decomposition, then a (\u00a2, TiS\u00b0lv)-derivative different from 1 is irreducible if and only if it is solved. Finally we can combine both phases of the algorithm. Theorem A.11 Consistency of prime clauses is decidable.", "cite_spans": [], "ref_spans": [], "eq_spans": [], "section": "A.2", "sec_num": null } ], "bib_entries": { "BIBREF0": { "ref_id": "b0", "title": "On the expressivity of feature logics with negation, functional uncertainity, and sort equations", "authors": [ { "first": "F", "middle": [], "last": "Baader", "suffix": "" }, { "first": "H.-J", "middle": [], "last": "Bfirckert", "suffix": "" }, { "first": "B", "middle": [], "last": "Nebel", "suffix": "" }, { "first": "W", "middle": [], "last": "Nutt", "suffix": "" }, { "first": "G", "middle": [], "last": "Smolka", "suffix": "" } ], "year": 1991, "venue": "", "volume": "", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "F. Baader, H.-J. Bfirckert, B. Nebel, W. Nutt, and G. Smolka. 
On the expressivity of feature logics with negation, functional uncertainity, and sort equa- tions. Research Report RR-91-01, DFKI, 1991.", "links": null }, "BIBREF1": { "ref_id": "b1", "title": "Regular path expressions in feature logic", "authors": [ { "first": "R", "middle": [], "last": "Backofen", "suffix": "" } ], "year": 1993, "venue": "", "volume": "", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "R. Backofen. Regular path expressions in feature logic. Research Report RR-93-17, DFKI, 1993.", "links": null }, "BIBREF2": { "ref_id": "b2", "title": "A complete and recursire feature theory", "authors": [ { "first": "R", "middle": [], "last": "Backofen", "suffix": "" }, { "first": "G", "middle": [], "last": "Smolka", "suffix": "" } ], "year": 1993, "venue": "Proc. of the 31 th ACL", "volume": "", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "R. Backofen and G. Smolka. A complete and recur- sire feature theory. In Proc. of the 31 th ACL, 1993. this volume.", "links": null }, "BIBREF3": { "ref_id": "b3", "title": "Termination of rewriting", "authors": [ { "first": "N", "middle": [], "last": "Dershowitz", "suffix": "" } ], "year": 1987, "venue": "Journal of Symbolic Computation", "volume": "3", "issue": "", "pages": "69--116", "other_ids": {}, "num": null, "urls": [], "raw_text": "N. Dershowitz. Termination of rewriting. Journal of Symbolic Computation, 3:69-116, 1987.", "links": null }, "BIBREF4": { "ref_id": "b4", "title": "An algorithm for functional uncertainty", "authors": [ { "first": "R", "middle": [ "M" ], "last": "Kaplan", "suffix": "" }, { "first": "J", "middle": [ "T" ], "last": "Maxwell", "suffix": "" } ], "year": 1988, "venue": "Proc. of the 12th COL-ING", "volume": "", "issue": "", "pages": "297--302", "other_ids": {}, "num": null, "urls": [], "raw_text": "R. M. Kaplan and J. T. Maxwell III. An algorithm for functional uncertainty. In Proc. 
of the 12th COL- ING, pages 297-302, Budapest, Hungary, 1988.", "links": null }, "BIBREF5": { "ref_id": "b5", "title": "Functional uncertainty and functional precedence in continental west germanic", "authors": [ { "first": "R", "middle": [ "M" ], "last": "Kaplan", "suffix": "" }, { "first": "A", "middle": [], "last": "Zaenen", "suffix": "" } ], "year": 1988, "venue": "editor, 4-(gsterreichische Artificial-lnteiligence-Tagung: Wiener Workshop-Wissensbasierte Sprachverarbeitung", "volume": "", "issue": "", "pages": "114--123", "other_ids": {}, "num": null, "urls": [], "raw_text": "R. M. Kaplan and A. Zaenen. Functional uncer- tainty and functional precedence in continental west germanic. In H. Trost, editor, 4-(gsterreichische Artificial-lnteiligence-Tagung: Wiener Workshop- Wissensbasierte Sprachverarbeitung, pages 114-123. Springer, Berlin, Heidelberg, 1988.", "links": null }, "BIBREF6": { "ref_id": "b6", "title": "Long-distance dependencies, constituent structure, and functional uncertainty", "authors": [ { "first": "R", "middle": [ "M" ], "last": "Kaplan", "suffix": "" }, { "first": "A", "middle": [], "last": "Zaenen", "suffix": "" } ], "year": 1988, "venue": "Alternative Conceptions of Phrase Structure", "volume": "", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "R. M. Kaplan and A. Zaenen. Long-distance depen- dencies, constituent structure, and functional uncer- tainty. In M. Baltin and A. Kroch, editors, Alterna- tive Conceptions of Phrase Structure. University of Chicago Press, Chicago, 1988.", "links": null }, "BIBREF7": { "ref_id": "b7", "title": "Feature logics, infinitary descriptions and the logical treatment of grammar", "authors": [ { "first": "B", "middle": [], "last": "Keller", "suffix": "" } ], "year": 1991, "venue": "Cognitive Science Research Report", "volume": "205", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "B. Keller. 
Feature logics, infinitary descriptions and the logical treatment of grammar. Cognitive Science Research Report 205, Univerity of Sussex, School of Cognitive and Computing Sciences, 1991.", "links": null }, "BIBREF8": { "ref_id": "b8", "title": "A feature logic with subsorts", "authors": [ { "first": "G", "middle": [], "last": "Smolka", "suffix": "" } ], "year": 1988, "venue": "", "volume": "33", "issue": "", "pages": "", "other_ids": {}, "num": null, "urls": [], "raw_text": "G. Smolka. A feature logic with subsorts. LILOG- Report 33, IBM Deutschland, Stuttgart, 1988.", "links": null }, "BIBREF9": { "ref_id": "b9", "title": "Feature constraint logics for unification grammars", "authors": [ { "first": "G", "middle": [], "last": "Smolka", "suffix": "" } ], "year": 1992, "venue": "Journal of Logic Programming", "volume": "12", "issue": "", "pages": "51--87", "other_ids": {}, "num": null, "urls": [], "raw_text": "G. Smolka. Feature constraint logics for unification grammars. Journal of Logic Programming, 12:51- 87, 1992.", "links": null }, "BIBREF10": { "ref_id": "b10", "title": "Records for logic programming", "authors": [ { "first": "G", "middle": [], "last": "Smolka", "suffix": "" }, { "first": "R", "middle": [], "last": "Treinen", "suffix": "" } ], "year": 1992, "venue": "Proceedings of the 1992 Joint International Conference and Symposium on Logic Programming", "volume": "", "issue": "", "pages": "240--254", "other_ids": {}, "num": null, "urls": [], "raw_text": "G. Smolka and R. Treinen. Records for logic pro- gramming. In Proceedings of the 1992 Joint Inter- national Conference and Symposium on Logic Pro- gramming, pages 240-254, Washington, DC, 1992.", "links": null } }, "ref_entries": { "FIGREF0": { "text": "term relation xo~y and a restriction constraint o~ ~ L, a constraint xLy can equivalently be expressed as xay A a ~ L (4 new). The interpretation of xay is done in two steps. 
Given a valuation V~, of the path variables as feature paths, a constraint =c~y in \u00a2 is substituted by xV~,(cQy. This constraint is then interpreted using the valuation for the first order variables in the way such constraints are usually interpreted.", "num": null, "uris": null, "type_str": "figure" }, "FIGREF1": { "text": "{xay, al ~ L1, xflz, fl~ L2}. The first step is to guess the relation between the path variables c~ and ft. Therefore, 7 can be expressed equivalently by the set of clauses 71 = {4 ,) ~} u 7 73 = {~ ~ ~} u 7 72 = {,~ -~} u 7 74 = {~ -~ ,~} u 7", "num": null, "uris": null, "type_str": "figure" }, "FIGREF2": { "text": "{xax, c~ ~ f, xflz, fl ~ f'g}. Then a possible looping derivation could be 1. adda4]~: {4 4 fl, xax, a~f, xflz, fl~f*g} 2. split fl into a-f~': 3. decompose c~-/~ I~ f'g: {=~, ~f, ~f~'~, a~f*, Z'~f*g} 4. join a-restrictions: {=~z, ~I, ~/~'z, ~'~y*g}", "num": null, "uris": null, "type_str": "figure" }, "FIGREF3": { "text": "Vx, V~,) ~z Ax :\u00a2=:~ Vx(x) e A z (Vx, Vr) Pz = -Y :\u00a2=~ Vx(=) = Vx(U) (vx,vr) ~z zpy (vx, vv) ~z =.u (vx, vT~) ~z p e L (Vx, VT~) ~z pbq :\u00a2=, vx(~) F? 0...o F, ~ Vx(y) :\u00a2:=~ (Vx, Vv) ~z \u2022 Vv(a) y :\u00a2==~ V~,(p) C L :\u00a2::~ Vp(p) o VT~(q) for aC {u,k,--\" }, where p is the path fl\"...'f, and F/z are the interpretations of fi in Z.", "num": null, "uris": null, "type_str": "figure" }, "FIGREF4": { "text": "(Vx, V~) is a O-solution of a clause \u00a2 in 2\" if there is a valuation (V~, V~) in 2\" such that Vx =a'ne V~:, Vp =~,no V~ and (V~:, V~) ~z \u00a2. The set of all 0-solutions of \u00a2 in 2: is denoted by [\u00a2]~. We will call X-solutions just solutions and write [\u00a2]z instead of [\u00a2],~. For checking satisfiability we will use transformation rules. A rule R is O-sound \u00a2 --*n 7 [\u00a2]z D [7]~ z for every interpretation 2\". R is called O-preserving if \u00a2 \"+R 3' :\u00a2\" [\u00a2]Z C [7]~. 
R is globally O-preservingif [\u00a2]z C_ U [7Set of Rules", "num": null, "uris": null, "type_str": "figure" }, "FIGREF5": { "text": "z~y, x/3z) U \u00a2 {x~,v} u {v/3z} u \u00a2[/3.-~./3]", "num": null, "uris": null, "type_str": "figure" }, "FIGREF6": { "text": "DecClash) {a./3~L} O \u00a2 {w e L llwl > 1} = g _L (LangDecn) {a.fl ~ L) U \u00a2 P.S C L {o~P} U {/3~S} U\u00a2 where P, S, L C F + and A is a finite set of reg. languages with L, P, S 6 A. L must contain a word w with [w[ > 1.", "num": null, "uris": null, "type_str": "figure" }, "FIGREF7": { "text": "", "num": null, "uris": null, "type_str": "figure" }, "FIGREF8": { "text": "Let \u00a2 be a prime clause. If A is Cclosed, closed under intersection and decomposition, then [[C] z = U.y~ [[7] z for every interpretation Z, where \u00a2b is the set of pre-solved (C, T~^)-derivatives. The set (9 is finite and effectively computable.", "num": null, "uris": null, "type_str": "figure" }, "FIGREF9": { "text": "and Maxwell. In our method, the non-deterministic addition of path relation and the evaluation of these relations are done at different times. The evaluation of the introduced constraints c~ -fl and o~ :\u00a2 fl are done after (PathRel) in the first phase of the algorithm, whereas the evaluation of the divergence constraints is done in a separate second phase.InKaplan and Maxwell's algorithm all these steps are combined into one single rule. Roughly, they substitute a clause {xL~y, xL2z, } O \u00a2 nondeterministicly by one of the following clauses: ~ { x(L~f3L~)y, x-y}U\u00a2 { x(L~f3P)y, ySz }U\u00a2 P.SC_L~ { x(L~NP)z, zSy }U\u00a2 P.S C L1 { x(P1NP2)u, u(f.S1)y, u(g.S2)z }U\u00a2 with PI'f'S~ C_ L~, P2\"g'S~ C_ L~, f # g, u new Recall that {XLly, xL2z} is expressed in our syntax by the clause 3' = {xay, o~ ~ L1, x~z, j~ ~ L2}, which", "num": null, "uris": null, "type_str": "figure" }, "FIGREF10": { "text": "Proof of Lemma 4.2. The first claim is easy to prove. 
For the second claim let {L1,...,Ln} C P(~+) be the set of regular languages used in \u00a2 and let .Ai = (Q.4~, i.4~, cr a~, Fin.4~) be finite, deterministic automatons such that .A i recognizes Li. For each .Ai we define dec(.Ai) to be the set dee(A/) = {L~ ]p,q E QJt,}, whereL~ = {w E 2 \"+ I a~,(p,w) = q}. It is easy to show that dec(.Ai) is a set of regular languages that contains Li and is closed under decomposition. Hence, the set A0 = [.Jinx dec (Ai) contains each Li and is closed under decomposition. Let A = fi (A0) be the least set that contains A0 and is closed under intersection. Then A is finite and e-closed, since it contains each Li.", "num": null, "uris": null, "type_str": "figure" }, "FIGREF11": { "text": "for 1..m. As A0 is closed under decomposition, there are languages Pi~ and Si~ for k = 1..m with wl E Pi~, w2 E Si~ and Pik'Sik C Li~. Let P = M~n=l Pik and S = s,~. Clearly, wl 6 P, w2 6 S and P.S C L. Furthermore, P, S 6 A as A is closed under intersection. This implies that P, S is an appropriate decomposition for", "num": null, "uris": null, "type_str": "figure" }, "TABREF0": { "content": "", "num": null, "type_str": "table", "text": "and Maxwell [5, page 1]. Consider the topicalized sentence Mary John telephoned yesterday. Using s as a variable denoting the whole sentence, the LFG-like clause s topic x A s obj x specifies that in s Mary should be interpreted as the object of the relation telephoned. The sentence could be extended by introducing additional complement predicates, as e.g. in sentences like Mary John claimed thai Bill telephoned; Mary John claimed thai Bill said that ...Henry telephoned yesterday; .... For this family of sentences the clauses s topic x A s comp obj x, s topic xAs comp cornp obj x and so on would be appropriate; specifying all possibilities would yield an infinite disjunction. 
This changes if we make use of functional uncertainty allowing to specify the above as the single clause s topic x A s comp* obj x.", "html": null }, "TABREF3": { "content": "
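The effect of such a functional-uncertainty constraint can be illustrated with a small sketch. The nested-dict encoding of the f-structure and the helper name `uncertain_values` below are our own illustrative choices, not part of the paper; the sketch only shows why the single constraint s topic x ∧ s comp* obj x covers every depth of embedding.

```python
# Toy f-structure for "Mary John claimed that Bill telephoned",
# encoded as nested dicts (hypothetical encoding, not from the paper).
s = {
    "topic": "Mary",
    "pred": "claim",
    "comp": {
        "pred": "telephone",
        "obj": "Mary",
    },
}

def uncertain_values(fs, star, final):
    """Collect all values reachable via a path in star* final,
    i.e. zero or more `star` features followed by one `final` feature."""
    found = []
    node = fs
    while isinstance(node, dict):
        if final in node:          # matched star^k final for the current k
            found.append(node[final])
        node = node.get(star)      # descend along one more `star` feature
    return found

# s topic x  together with  s comp* obj x  is satisfiable: x = "Mary".
assert s["topic"] in uncertain_values(s, "comp", "obj")
```

Deeper embeddings (s comp comp* obj x and so on) are matched by the same loop, which is why a single constraint with comp* replaces the infinite disjunction of fixed paths.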
", "num": null, "type_str": "table", "text": "pretation. A constraint xay is valid under Vx and V~ if xV~(c~)y is valid under Vx. Hence, for each clause \u00a2 and each valuation Vx, Vp with C valid under Vx and Vp there is a clause Cv~ in standard feature logic syntax (not containing functional uncertainty) such that \u00a2v~ is valid under Vx. E.g. for the clause {xax, a~f, xflz, fl~f*g} and a path valuation V~, with VT,(a) = f and V~,(j3) = g the clause Cv~, is {xfx, xgy}. The control we have mentioned requires (by and large) that only those rewrite rules will be applied, that are compatible to the clause Cv~ and thus preserve Vx. If one of the rules (Eq)", "html": null }, "TABREF4": { "content": "
In the second phase we have to check consistency of pre-solved clauses. As we have mentioned, a pre-solved clause is consistent if we find some appropriate path valuation. This means that we have to check the consistency of divergence constraints of the form α1 ⋔ α2 together with path restrictions α1 ∈ L1 and α2 ∈ L2. A constraint α1 ⋔ α2 is valid under some valuation V_P if there are (possibly empty) words w, w1, w2 and features f ≠ g such that V_P(α1) = wfw1 and V_P(α2) = wgw2. This definition could directly be used for a rewrite rule that solves a single divergence constraint, which gives us

  {α1 ⋔ α2} ∪ φ  →  {α'1 ∈ fF*, α'2 ∈ gF*} ∪ φ'    (f ≠ g; β, α'1, α'2 new)

where φ' = φ[α1 ← β.α'1, α2 ← β.α'2]. By the application of this rule we will get constraints of the form β.α'1 ∈ L1 and β.α'2 ∈ L2. Decomposing these restriction constraints and joining the corresponding path restrictions for α'1 and α'2 will result in

  {β ∈ (P1 ∩ P2), α'1 ∈ (fF* ∩ S1), α'2 ∈ (gF* ∩ S2)}

with P1·S1 ⊆ L1 and P2·S2 ⊆ L2, which completes the consistency check.
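The divergence relation itself — two paths diverge iff, after their longest common prefix w, both continue and do so with distinct features f ≠ g — is easy to state operationally. The following sketch (function name and tuple encoding of paths are ours, not the paper's) tests it for concrete feature paths; consistency over regular path restrictions requires the decomposition machinery above instead.

```python
def diverges(u, v):
    """u diverges from v: after stripping the longest common prefix w,
    both paths must continue, with distinct first features f != g."""
    i = 0
    while i < len(u) and i < len(v) and u[i] == v[i]:
        i += 1                      # strip the common prefix w
    return i < len(u) and i < len(v) and u[i] != v[i]

# wfw1 = comp.obj and wgw2 = comp.subj diverge (w = comp, f = obj, g = subj).
assert diverges(("comp", "obj"), ("comp", "subj"))
# A proper prefix does not diverge -- that is a prefix constraint instead.
assert not diverges(("comp",), ("comp", "obj"))
# Equal paths do not diverge either -- that is a path equality.
assert not diverges(("comp", "obj"), ("comp", "obj"))
```

This trichotomy (prefix, equality, divergence) is exactly the case split that (PathRel) guesses non-deterministically for each pair of path variables on the same node.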
", "num": null, "type_str": "table", "text": "WfWl and V~,(c~2) = wgw2. This definition could directly be used for a rewrite rule that solves a single divergence constraint, which gives us", "html": null } } } }