1.6.1 Summary of Differences between DBS and Other Systems
1. Derivation Order:
The parsing algorithm of DBS, i.e., LA-grammar, uses a strictly time-linear
derivation order to compute possible continuations. The derivations
of Phrase Structure Grammars and Categorial Grammars, in contrast, are
partially ordered and compute possible substitutions. As a consequence,
the application of LA-grammar to natural language has been shown to
be of linear complexity, while the complexity of the other grammar
formalisms is either polynomial but empirically inadequate (context-free)
or computationally intractable, ranging from context-sensitive (exponential)
to recursively enumerable (undecidable; cf. FoCL'99, Chaps. 9-12).^28
2. Ontology:
In DBS, the model of natural language communication is located inside
the speaker-hearer as a software machine attached to the agent's external
and internal interfaces (cf. FoCL'99, Sect. 2.4). In Truth-Conditional Se-
mantics, in contrast, the speaker-hearer is part of a set-theoretic model, and
nonprocedural metalanguage definitions are used to connect the language
expressions directly to referents in the model. As a consequence, DBS is
designed for a talking robot, while Truth-Conditional Semantics is not^29
(cf. FoCL'99, Chaps. 19-21).
3. Elementary Meaning:
In DBS, the agent's basic recognition and action procedures are reused
as the elementary meanings of language. In Truth-Conditional Semantics,
in contrast, elementary meanings are defined in terms of their
truth-conditions.
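The strictly time-linear derivation order of difference 1 can be sketched in code. The following is not Hausser's actual LA-grammar rule system but a minimal left-associative recognizer (names and encoding are illustrative) for the formal language a^k b^k: the input is consumed strictly left to right, each step combining a summary of the sentence start with the next word and checking whether it is a possible continuation.

```python
# A time-linear (left-associative) recognizer in the spirit of LA-grammar,
# shown for the formal language a^k b^k (k >= 1). The rule names and the
# category encoding are illustrative assumptions, not Hausser's definitions.

def la_recognize(surface: str) -> bool:
    """Process the input strictly left to right, one word per step.

    The sentence start is summarized by a category (here: a counter of
    uncancelled a's); each step combines it with the next word, so the
    whole derivation takes linear time.
    """
    cat = 0          # category of the sentence start: pending a's
    seen_b = False   # once b's begin, a further 'a' is not a continuation
    for word in surface:
        if word == "a" and not seen_b:
            cat += 1             # "rule 1": add an 'a' to the category
        elif word == "b" and cat > 0:
            seen_b = True
            cat -= 1             # "rule 2": cancel one 'a' with a 'b'
        else:
            return False         # no rule applies: not a continuation
    return cat == 0 and seen_b   # final state: all a's cancelled

print(la_recognize("aaabbb"))    # True
print(la_recognize("aab"))       # False
```

Because each step does a constant amount of work on the next word, the recognizer runs in time proportional to the input length, in contrast to substitution-based derivations over the whole string.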
^27 The acronym SLIM abbreviates Surface compositional, time-Linear, Internal Matching, which are the
methodological, empirical, ontological, and functional principles of the approach. When the acronym
is viewed as a word, SLIM indicates low mathematical complexity, resulting in the computational
efficiency required for real-time processing.
^28 Tree Adjoining Grammar (TAG, Joshi 1969) parses "mildly" context-sensitive languages in polynomial
time. For example, the formal languages a^k b^k c^k (three terms), a^k b^k c^k d^k (four terms), and
a^k b^k c^k d^k e^k (five terms) are called mildly context-sensitive, but anything longer than five terms, e.g.,
a^k b^k c^k d^k e^k f^k, is called "strongly" context-sensitive and is of exponential complexity in TAG. The
algorithm of LA-grammar, in contrast, parses all these formal languages in linear time, regardless of
the number of terms.
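The linear-time claim of this footnote can be made concrete with a single-pass check. The sketch below (the function `equal_blocks` and its interface are my own, not part of LA-grammar) verifies that a string has the form t1^k t2^k ... tn^k for any number of terms, doing constant work per symbol:

```python
# A one-pass, linear-time check that a string has the form
# t1^k t2^k ... tn^k for a given sequence of distinct terminals.
# Illustrative sketch only; this is not the LA-grammar rule system.

from itertools import groupby

def equal_blocks(surface: str, terminals: str) -> bool:
    # Collapse the input into maximal runs of equal symbols: one pass.
    runs = [(ch, sum(1 for _ in grp)) for ch, grp in groupby(surface)]
    if len(runs) != len(terminals):
        return False
    chars = [ch for ch, _ in runs]          # the run symbols, in order
    counts = {n for _, n in runs}           # the set of run lengths
    # Accept iff the runs spell the terminal sequence and share one length k.
    return chars == list(terminals) and len(counts) == 1

print(equal_blocks("aaabbbccc", "abc"))        # True  (a^3 b^3 c^3)
print(equal_blocks("aabbccddeeff", "abcdef"))  # True  (six terms)
print(equal_blocks("aabbbcc", "abc"))          # False (unequal exponents)
```

Adding a term to the language merely lengthens the terminal sequence; the recognition cost stays linear in the input, regardless of the number of terms.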
^29 We are referring here to the classical approach of Truth-Conditional Semantics based on set theory,
as in Montague grammar. This is in contradistinction to adaptations of Truth-Conditional Semantics
to applications. For example, the robot STRIPS (Fikes and Nilsson 1971) uses the GPS (General
Problem Solver) strategy for the motion planning of an artificial agent. The reasoning for this is
formulated in terms of higher-order predicate calculus and "theorem proving," based on large numbers
of "well-formed formulas" (wffs). To plan the motion of a robot like STRIPS, however, "certain
modifications must be made to theorem-proving programs" (op. cit., p. 192).
For a comparison of logical semantics and the semantics of programming languages, see FoCL'99,
Sect. 19.2. Our alternative to the theorem-proving method in AI is DBS inferencing. See 11.2.4 and
11.2.5 for a listing of the inferences defined in this book.