LINGUIST List 23.2548
Wed May 30 2012
Review: Syntax; Linguistic Theories; Language Acquisition: Borsley & Börjars (2011)
Editor for this issue: Rajiv Rao
This LINGUIST List issue is a review of a book published by one of our supporting publishers, commissioned by our book review editorial staff. We welcome discussion of this book review on the list, and particularly invite the author(s) or editor(s) of this book to join in. If you are interested in reviewing a book for LINGUIST, look for the most recent posting with the subject "Reviews: AVAILABLE FOR REVIEW", and follow the instructions at the top of the message. You can also contact the book review staff directly.
From: Yusuke Kubota <ykphiz.c.u-tokyo.ac.jp>
Subject: Non-Transformational Syntax
Announced at http://linguistlist.org/issues/23/23-278.html
EDITORS: Robert D. Borsley and Kersti Börjars
TITLE: Non-Transformational Syntax
SUBTITLE: Formal and Explicit Models of Grammar
Yusuke Kubota, Department of Language and Information Sciences, University of Tokyo
This book presents a state-of-the-art overview of the three major variants of
non-transformational syntactic theories: Head-Driven Phrase Structure Grammar
(HPSG), Lexical-Functional Grammar (LFG) and Categorial Grammar (CG). The book
is divided into two parts: the first consists of six chapters (two for each
theory) that provide thorough overviews of these theories, while the remaining
six chapters deal with somewhat broader issues such as sentence processing and
language acquisition, as well as general theoretical questions such as the role
of features in grammatical theory and the notion of lexicalism.
In the first chapter, ''Elementary Principles of Head-Driven Phrase Structure
Grammar'', Georgia M. Green presents the basics of HPSG. Green begins with a
discussion of general architectural considerations that have informed the
formulation of HPSG, such as the notion of grammar as a set of constraints and
the organization of constraints and grammar rules in terms of typed feature
structures. The chapter then sketches a simple grammar of English, explaining
how basic syntactic notions and phenomena such as subcategorization, agreement,
binding and long-distance dependencies are treated in HPSG. In HPSG, recursive
objects called feature structures, which encode feature-value pairs (where the
values of certain features can themselves be feature-value pairs), play a
central role in grammatical description. Green illustrates how identity
conditions (called structure sharing) imposed on such complex feature structures
enable explicit and precise analyses of linguistic phenomena without recourse to
the notion of syntactic transformation.
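The mechanics of structure sharing can be sketched in a few lines of Python, modeling feature structures as nested dictionaries and shared values as token-identical objects. The feature names below (HEAD, SUBJ, AGR, PER, NUM) are illustrative placeholders, not a fragment of Green's actual grammar:

```python
# A minimal sketch of HPSG-style feature structures as nested Python dicts.
# Structure sharing is modeled by letting two feature paths point to the
# very same mutable object, so a constraint stated once holds at both paths.

agr = {"PER": "3rd", "NUM": "sg"}                     # a single AGR value ...
verb = {"HEAD": {"AGR": agr}, "SUBJ": {"AGR": agr}}   # ... shared at two paths

# Because the two paths lead to a token-identical object, information
# added along one path is automatically visible along the other --
# the essence of structure sharing.
verb["HEAD"]["AGR"]["NUM"] = "pl"
assert verb["SUBJ"]["AGR"]["NUM"] == "pl"
assert verb["HEAD"]["AGR"] is verb["SUBJ"]["AGR"]
```

In a real HPSG grammar such identities are stated declaratively as constraints rather than imposed by destructive update, but the effect on the resulting description is the same.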
Building on Green's introductory chapter, ''Advanced Topics in Head-Driven Phrase
Structure Grammar'', by Andreas Kathol, Adam Przepiórkowski and Jesse Tseng,
discusses broader and more advanced issues in HPSG. The chapter concisely covers
a wide range of topics, including: a lexicalist treatment of complex predicates
in terms of argument composition; the linearization-based approach to word
order-related phenomena (an extension of HPSG that decouples linear order from
hierarchical constituency); the Minimal Recursion Semantics framework of
underspecified semantics; sophisticated treatments of morpho-syntactic issues
such as clitics, case assignment and agreement; and an approach to integrating
HPSG with the ideas from Construction Grammar. The interconnections (and
possible tensions) between different analytic techniques and lines of
research---such as the opposition between the argument composition-based vs.
linearization-based analyses of complex predicates---are addressed carefully.
The chapter is essentially a snapshot of cutting-edge HPSG research around the
end of the 1990s, but many of the issues discussed are still relevant and have
wider cross-theoretical implications.
The next two chapters deal with LFG. In ''Lexical-Functional Grammar:
Interactions between Morphology and Syntax'', Rachel Nordlinger and Joan Bresnan
describe the morpho-syntactic component of LFG. The idea behind LFG is that
making phrase structure representations (called c-structure) maximally simple
and representing notions such as grammatical relations in a separate component
(called f-structure) leads to an overall simplification of grammar. The authors
demonstrate how this multi-component architecture enables a simple
characterization of the difference between configurational and
non-configurational languages, where different parts of grammar (syntactic rules
vs. lexicon) are made primarily responsible for building up (nearly identical)
f-structures in the two language types. This is followed by an analysis of a
somewhat more complicated case, where multiple parts of c-structure add up to
specify one unitary component in f-structure, which is found in Welsh verb order
and multiple tense marking in Australian languages. Here, the notion of
'co-head' plays a central role in dispensing with the notion of head movement.
In ''Lexical-Functional Grammar: Functional Structure'', Helge Lødrup explains the
role of f-structure in LFG. The first half of the chapter is an exposition of
Lexical Mapping Theory (LMT), a theory of argument realization developed in LFG.
Here, Lødrup first explains how the mapping between semantic roles and
grammatical relations is generally mediated via a small set of principles making
reference to two binary features, +/-o(bjective) and +/-r(estrictive). Then, the
author illustrates how this general theory of argument realization is employed
in the analyses of argument alternation phenomena such as passive and locative
inversion. The second half consists of analyses of syntactic phenomena such as
raising, control, long-distance dependencies and binding. In the analyses of
these phenomena, imposing various kinds of identity conditions between different
parts of f-structure plays a crucial role. Lødrup explains how the key notions
of functional control, anaphoric control and functional uncertainty are
formulated in LFG and are employed in the analyses of these phenomena (i.e.
raising, control, etc.).
The last two chapters in the theory part deal with CG. Unlike the chapters for
HPSG and LFG, the two chapters for CG each independently introduce different
variants of CG. ''Combinatory Categorial Grammar'', by Mark Steedman and Jason
Baldridge, presents the theory of Combinatory Categorial Grammar (CCG). The
chapter starts with a simple CG grammar, consisting of function application
alone, and motivates an extension to CCG that has more flexible rules such as
type-raising and function composition. The rules introduced are explained
alongside relevant linguistic examples. This is followed by analyses of major
syntactic phenomena, including binding, control, raising, long-distance
dependencies and coordination. The notion of modal control, a major theoretical
revision to CCG introduced in Baldridge (2002), is explained along the way.
This innovation, building on a technique originally developed in Type-Logical
Grammar (TLG), enables CCG to maintain a fully universal rule component
cross-linguistically. The chapter ends by briefly touching on implications for
human sentence processing and computational implementation.
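The two rule types at the heart of the chapter can be sketched in Python; the tuple encoding of categories below is a deliberate simplification (forward slashes only, no directionality contrasts or slash modalities), and the example categories are illustrative rather than taken from the chapter:

```python
# A toy encoding of CCG categories as tuples: ("/", result, argument).
# Only the two forward rules are sketched; full CCG also has backward
# variants, type-raising, and (after Baldridge 2002) modalized slashes.

def forward_apply(f, a):
    """Forward application: X/Y combines with Y to yield X."""
    slash, result, argument = f
    return result if slash == "/" and argument == a else None

def forward_compose(f, g):
    """Forward composition: X/Y combines with Y/Z to yield X/Z."""
    _, x, y1 = f
    _, y2, z = g
    return ("/", x, z) if y1 == y2 else None

# Illustrative categories:
vp = ("/", "S", "NP")    # something seeking an NP to its right to form an S
det = ("/", "NP", "N")   # a determiner seeking an N to form an NP

assert forward_apply(vp, "NP") == "S"
assert forward_compose(vp, det) == ("/", "S", "N")   # now seeks a bare N
```

The point of composition, as the chapter explains with linguistic examples, is precisely that it lets two incomplete expressions combine before their shared argument is found, which underlies CCG's analyses of coordination and extraction.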
''Multi-Modal Type-Logical Grammar'', by Richard T. Oehrle, presents an overview
of TLG. TLG differs from CCG in that it literally identifies grammar (of natural
language) as a kind of logic. Thus, in TLG, operations such as type-raising and
function composition are not recognized as primitive rules but rather as derived
theorems. Oehrle starts by laying out the basic theoretical setup of TLG, which
is followed by a couple of linguistic applications. Among these, the interaction
between raising and quantifier scope illustrates the flexibility of the theory
(especially its syntax-semantics interface). The provided fragment of Dutch
illustrates another important aspect of Multi-Modal TLG, namely, the notion of
modal control. Here, the mismatch between surface word order and
predicate-argument structure exhibited by cross-serial dependencies is mediated
by a class of rules called 'structural rules', which govern the way in which
syntactic proof is conducted. This, in effect, allows for modeling the notion of
verb raising in transformational grammar in a logically precise setup. An
extensive appendix at the end situates TLG in the larger context of logic-based
approaches to linguistic theory and provides pointers to original sources and
further linguistic applications.
The rest of the book deals with somewhat broader issues. In ''Alternative
Minimalist Visions of Language'', Ray Jackendoff compares current minimalist
theory with the Simpler Syntax approach that he endorses, which is closely
related to HPSG and LFG. The discussion centers on the fact that mainstream
generative syntax has so far relied on an unwarranted distinction between 'core'
and 'peripheral' phenomena, and has failed to attain descriptive adequacy by
simply ignoring the latter. Jackendoff takes up some representative cases of
such 'peripheral' phenomena, and demonstrates that they exhibit properties that
are strikingly similar to 'core' phenomena. In an approach that draws a
categorical distinction between the 'core' and the 'periphery', such
similarities cannot be anything other than a pure accident. Jackendoff argues
that such a treatment misses an important generalization and concludes that
certain constraint-based approaches to syntax, including Simpler Syntax, where
the commonality between the 'core' and the 'periphery' can be seamlessly
captured by the notion of constructions, embody a more adequate architecture of
grammar.
In ''Feature-Based Grammar'', James P. Blevins discusses the problem of syncretism
in the context of feature-based theories such as HPSG and LFG. In these
theories, agreement is typically handled via unification. That is, the governing
verb and the subcategorized element each contribute their own specifications for
agreement features such as case and gender, and agreement is enforced by
unifying the (often partial) information contributed by each element to yield a
complete description. If no coherent description is obtained via unification,
agreement fails. Syncretic forms are problematic for this type of approach,
since, cross-linguistically, such forms can often simultaneously satisfy
conflicting morphological requirements (typically, as a shared argument of
coordinated functors) by virtue of the fact that they happen to have identical
phonological forms for the conflicting specifications. A simple
unification-based approach incorrectly predicts that such cases lead to
agreement failure. Blevins suggests that this problem can be avoided by
replacing the notion of unification by the notion of subsumption, which merely
checks whether the specifications of subcategorizing and subcategorized elements
are consistent. The chapter ends by briefly discussing whether such a change can
be readily implemented in HPSG and LFG, and concludes that the way
subcategorization is handled in HPSG, in terms of cancellation of list-valued
specifications of subcategorized elements, poses a problem for a straightforward
implementation of the subsumption-based approach.
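The contrast at issue can be illustrated with flat Python dictionaries. This toy model captures only the effect Blevins describes (a syncretic form satisfying conflicting demands that strict unification would rule out), not his actual subsumption-based formalization:

```python
# Toy contrast between unification and a weaker per-requirement check.
# A syncretic form is modeled as realizing a *set* of case values with
# one phonological shape. (Illustrative only; real HPSG/LFG feature
# logics are far richer.)

def unify(a, b):
    """Merge two partial descriptions; fail (None) on any value clash."""
    out = dict(a)
    for k, v in b.items():
        if k in out and out[k] != v:
            return None   # conflicting atomic values: unification fails
        out[k] = v
    return out

def satisfiable(form, req):
    """Weaker check: test each requirement against the set of values the
    form can realize, without merging the requirements into one total
    description."""
    return all(v in form.get(k, set()) for k, v in req.items())

# A syncretic form ambiguous between nominative and accusative:
syncretic = {"CASE": {"nom", "acc"}}
req1 = {"CASE": "nom"}   # demanded by the first coordinated functor
req2 = {"CASE": "acc"}   # demanded by the second coordinated functor

# Unifying the two demands into a single description fails ...
assert unify(req1, req2) is None
# ... yet the syncretic form meets each demand on its own:
assert satisfiable(syncretic, req1) and satisfiable(syncretic, req2)
```

The failure of `unify` above is exactly the incorrect prediction of agreement failure that Blevins attributes to the simple unification-based approach.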
In ''Lexicalism, Periphrasis, and Implicative Morphology'', Farrell Ackerman,
Gregory T. Stump and Gert Webelhuth provide a detailed review of the notion of
lexicalism. They identify four principles which may plausibly be taken to
constitute the notion of lexicalism. A widely adopted approach to complex
predicates in HPSG and LFG, known as argument composition, violates one of these
principles, which states that syntactic operations cannot alter lexical
properties encoded in words, where argument structure is taken to be part of
lexical properties. The authors then suggest an alternative possibility in which
a different principle is abandoned; one which dictates that lexemes be
syntactically realized as a single word (expressed as a continuous string).
This, in effect, introduces discontinuous constituency, and, as such, the
authors illustrate an approach to the morphology-syntax interface building on
the realizational model of morphology, which implements this analytic option.
The framework is illustrated with analyses of two phenomena exhibiting
(potentially) discontinuously expressed complex morphological words: compound
tense in Slavic languages in the inflectional domain; and phrasal predicates in
Hungarian in the derivational domain.
''Performance-Compatible Competence Grammar'', by Ivan A. Sag and Thomas Wasow,
discusses how the surface-oriented and constraint-based architecture that many
non-transformational theories share bears on the question of constructing a
realistic model of human sentence processing. The chapter discusses some recent
experimental results showing that human sentence processing is incremental and
parallel, and exploits different levels of grammatical information as soon as
they become available. Constraint-based grammars, the authors argue, provide a
more natural fit to these experimental results, since the grammar is free from
the notion of 'syntactic derivation', which, without a highly abstract
characterization of the relationship between competence grammar and performance,
is inconsistent with such experimental results. The authors provide a brief
comparison between their model and Phillips's (1996) strictly incremental model
based on minimalist syntax, speculating that once Phillips's model is completely
formalized, it might result in a constraint-based reformulation of the
minimalist theory. They ultimately reject Phillips's proposal, however,
commenting that too much detail is left unresolved in it.
The final two chapters deal with language acquisition. The two chapters address
this question from entirely different perspectives. In ''Modeling Grammar Growth:
Universal Grammar without Innate Principles or Parameters'', Georgia M. Green
sketches an outline of a theory of language acquisition where the knowledge of
grammar is acquired in an incremental manner, without presupposing any innate
language acquisition faculty. The key idea that Green puts forward is that many
(or most) aspects of language acquisition can be thought of as instances of more
general cognitive capacities that the infant is developing at the same time as
(s)he is learning language. Green sketches how the development from the one-word
utterance stage to the multi-word utterance stage, and the subsequent
acquisition of polar and constituent questions, can be modeled as incremental
grammar development. The discussion touches on several fundamental issues in
language acquisition that are simply shielded from scrutiny in approaches to
language acquisition that start from the innateness premise.
In contrast to the emergent view of Green, in ''Language Acquisition with
Feature-Based Theories'', Aline Villavicencio assumes the innateness view and
justifies this choice by pointing out the lack of any adequate and explicit
model of language acquisition without an innate component. Villavicencio then
lists five elements that need to be specified in detail in any explicit model of
language acquisition: the object being learned; the learning data or
environment; the hypothesis space; what counts as successful learning; and the
procedure that updates the learner’s hypothesis. The chapter reviews previous
research addressing each of these issues, focusing on work that is consistent
with the assumptions of constraint-based and feature-based grammatical
frameworks. As part of this literature review, a relatively detailed sketch of a
word order acquisition model is provided. In this model, Universal Grammar is
formalized as a set of grammar rules in a unification-based CG organized in a
typed inheritance hierarchy. The problem of word order acquisition is modeled as
a problem of parameter setting, where the interdependence between the parameters
is captured by means of default inheritance. Villavicencio argues that this use
of default inheritance leads to a plausible model of language acquisition, since
the organization of information in terms of default inheritance hierarchies
reduces the amount of information that a learner needs to be exposed to before
(s)he arrives at the target grammar.
This book is of great value to researchers and students in syntax and related
fields such as psycholinguistics, computational linguistics and formal
semantics. The first six chapters explain the general theoretical motivations of
each theory succinctly, illustrate their linguistic application clearly, and
provide pointers to relevant literature. The other six chapters are also useful
in situating these theories within a larger context. I am thoroughly impressed
by the breadth and depth covered in this volume. The book is packed with
useful information and thought-provoking ideas, crystallizing the insights
resulting from research on non-transformational syntax in the past 30 years or
so. The chapter by Kathol et al. on advanced topics in HPSG and the one by
Oehrle on TLG are especially valuable. The former illuminates the open-ended and
dynamic nature of the inquiry in theoretical linguistics, where linguistic
theories develop through a communal effort by researchers who propose competing
hypotheses on the basis of a shared set of explicitly formulated assumptions.
The latter chapter is important in that it provides a highly accessible
introduction to TLG, which, despite its potential for linguistic application,
has been largely ignored in the linguistic community due to the highly technical
nature of its underlying mathematical formalisms.
I would nevertheless like to point out two ways in which the book could have
been made even better. The first concerns the treatment of the syntax-semantics
interface. In many non-transformational syntactic theories, providing an
explicit syntax-semantics interface has always been of central concern, and
there are some important recent developments in this domain in each of the three
theories: in LFG, the development of glue semantics (e.g. Dalrymple 2001) has
changed the landscape of the syntax-semantics interface radically; in HPSG, a
new approach called Lexical Resources Semantics (Richter and Sailer 2004) is
currently being developed as the first serious theory of syntax-semantics
interface grounded in explicit model-theoretic semantics; and in CG, two recent
proposals are attracting attention as promising approaches to the
syntax-semantics interface, with one of them facilitating the modeling of both
semantic and phonological components in terms of the lambda calculus (de Groote
2001, Muskens 2003), and the other employing the notion of ‘continuations’ from
computer science in characterizing the syntax-semantics interface (Shan and
Barker 2006). The architecture of the syntax-semantics interface bears directly
on several important issues that recurrently come up in the present volume, such
as the plausibility of a parallel architecture of grammar, in which syntactic
and semantic representations are built in tandem. In view of these connections,
a somewhat more detailed treatment of the syntax-semantics interface would have
been desirable.
Another area where more extensive discussion would have been useful is the
comparison of the three theories. The chapters in this book are more or less
stand-alone readings, and cross-references among chapters are scarce. This is
somewhat disappointing since, given the nature of the present book, there are a
lot of connections and points of contrast that would have been worth mentioning
or elaborating on. To take just one example, LFG treats auxiliaries as purely
inflectional elements occupying the head of a functional projection (in a way
more in line with GB/minimalist literature), whereas in HPSG and CG (where such
functional heads are dispensed with), they are simply treated as a kind of
raising verb. Does such a difference have any empirical consequences? To what
extent do such differences reflect the built-in architectures of the respective
theories? Blevins's chapter is exceptional in touching on these sorts of issues,
but one or two additional chapters focusing solely on such questions and
exploring them in detail with respect to some major grammatical phenomenon would
have been interesting to include. This is important, since considerations of
such issues are likely to be of central concern in research on
non-transformational syntax, and syntax in general, in the next era.
Notwithstanding the above desiderata, the book is very readable, and represents
an excellent introduction to the major variants of non-transformational syntax.
It is highly recommended as an essential source of reference for both working
syntacticians and researchers in related (sub)fields.
Baldridge, Jason. 2002. Lexically Specified Derivational Control in Combinatory
Categorial Grammar. Ph.D. thesis, University of Edinburgh.
Dalrymple, Mary. 2001. Lexical Functional Grammar. New York: Academic Press.
de Groote, Philippe. 2001. Towards abstract categorial grammars. In Proceedings
of ACL 39, 148-155.
Muskens, Reinhard. 2003. Language, lambdas, and logic. In G.-J. Kruijff and R.
Oehrle, eds., Resource Sensitivity in Binding and Anaphora, 23-54. Kluwer.
Phillips, Colin. 1996. Order and Structure. Ph.D. thesis, MIT.
Richter, Frank and Manfred Sailer. 2004. Basic concepts of lexical resource
semantics. In A. Beckmann and N. Preining, eds., ESSLLI 2003 -- Course Material
I, vol. 5 of Collegium Logicum, 87-143. Vienna: Kurt Gödel Society.
Shan, Chung-chieh and Chris Barker. 2006. Explaining Crossover and Superiority
as Left-to-Right Evaluation. Linguistics and Philosophy. 29. 91-134.
ABOUT THE REVIEWER
Yusuke Kubota is a postdoctoral fellow of the Japan Society for the Promotion of Science at the University of Tokyo. He received his PhD in Linguistics at the Ohio State University. His recent work focuses on developing a linguistically adequate model of the syntax-semantics interface based on categorial grammar, exploring phenomena such as coordination and complex predicates.