AUTHOR: Taylor, John R.
TITLE: The Mental Corpus
SUBTITLE: How Language is Represented in the Mind
PUBLISHER: Oxford University Press
YEAR: 2012
Stefan Hartmann, German Department (Deutsches Institut), University of Mainz
In this book, John R. Taylor defends the hypothesis that linguistic knowledge can be conceived of as a repository of memories of exposure to actual usage events. According to Taylor, language users keep track of the utterances they encounter, thereby compiling a “mental corpus” of constructions at various levels of abstraction. This hypothesis is elaborated on and substantiated by a great number of examples in the thirteen chapters of this book. In line with his plea to focus linguistic attention on actual language use, most of Taylor’s examples are derived from corpora (mainly from the British National Corpus and the Corpus of Contemporary American English).
The first two chapters of the book are concerned with the generative model of linguistic knowledge, which Taylor rejects. Chapter 1, “Conceptualizing language”, introduces Chomsky’s (e.g. 1986) distinction between E-language, i.e. ‘external’ language as encountered in the world, and I-language, i.e. the ‘internal’ system of linguistic knowledge in the speaker’s mind. In contrast to the generative model, which treats E-language as an “epiphenomenon” (Chomsky 1986: 25), Taylor argues that I-language emerges from exposure to E-language events. He then discusses to what extent E-language can be studied with the help of corpora, especially focusing on the pivotal question of representativeness. He also addresses the use of the World Wide Web as a corpus, reviewing research that shows data obtained from the Internet to correlate highly both with data from established corpora and with informants’ acceptability judgments.
The title of Chapter 2, “The dictionary and the grammar book”, refers to the generative model of linguistic knowledge, which sees language as consisting of the lexicon on the one hand, i.e. “a finite list of structural elements” (Jackendoff 2002: 39), and the grammar on the other, i.e. “a finite set of combinatorial principles” (ibid.). While this model accounts neatly for the phenomenon of linguistic creativity, i.e. the ability of speakers to create a potentially infinite set of sentences from a finite set of elements, it entails a rule-based approach to linguistic knowledge that proves problematic in various respects. In evaluating the model against actual language use as attested by corpus data, Taylor argues that grammaticality intuitions do not in all cases reliably reflect the contents of the mental grammar. He illustrates this with the use of “explain” in the double object construction, e.g. “explain me this”. While this construction strikes most native speakers of English as ungrammatical, Taylor’s corpus data suggest that in the mental grammar of a fair number of language users, “explain” can be used analogically to semantically related words such as “tell”. The critical assessment of the generative model leads Taylor to question both the classical definition of the lexicon as “a list of ‘exceptions’” (Chomsky 1995: 235) and the principle of compositionality, which “requires that each component of a complex expression contributes a fixed and stable chunk of semantic [...] material to the expressions in which it occurs” (p. 41). In sum, he concludes that “[w]e will need to abandon the model, its assumptions, and all that it entails” (p. 43).
The subsequent chapters deal with various aspects of actual language use. Chapter 3, “Words and their behaviour”, questions the generative model’s assumption that all items in the lexicon can be assigned to a small number of categories and that all members of a (lexical) category behave identically with respect to syntactic rules. Taylor shows that a number of words -- e.g. “fun”, “much”, or so-called defective verbs such as “beware”, cf. “*I always beware of dogs” -- exhibit a unique distribution that cannot be predicted from their membership in a syntactic category. These findings suggest “that knowledge of a language can only be attained through exposure to actual usage events, whose particularities are noted and laid down in memory” (p. 68).
Chapter 4, “Idioms”, and Chapter 5, “Speaking idiomatically”, discuss at length the pervasiveness of idiomatic language use. While the generative model treats idioms as peripheral exceptions to the otherwise rule-governed use of language, “[t]here comes a point [...] when the periphery looms so large that it may no longer be an option to regard it as peripheral” (p. 43). In Chapter 4, Taylor presents corpus findings on semantic (e.g. “kick the bucket”), syntactic (e.g. “by and large”), lexical (e.g. the usage range of “fun”), and phrasal idioms (e.g. “of course”, “in fact”, “under way”). In Chapter 5, Taylor proposes to understand idiomaticity in terms of conformity with usage norms. On the one hand, this concerns the appropriateness of linguistic expressions to the respective context of use (e.g. the formality of a situation as well as the medium of communication). On the other hand, it concerns language-internal relations as manifested by preferred collocations (e.g. “merry Christmas”, but “*merry birthday”). His case study of “X-minded” then shows that such usage norms, which are governed by “language-internal statistics” (p. 114), are subject to diachronic change. While this compound pattern enjoyed a peak of popularity in the middle of the 20th century, its use declined afterwards. He concludes that such usage norms are not categorical, but statistical in nature.
Chapter 6 is dedicated to the controversial notion of “constructions”. After a brief overview of Langacker’s (1987, 1991, 2008a) Cognitive Grammar, Taylor contrasts two approaches towards defining a construction and then proposes a third definition incorporating elements from both approaches. On the one hand, constructions can be conceived of as internally complex entities, i.e. “any linguistic form which can be analysed into its parts” (p. 124). On the other hand, constructions can be defined as pairings of form and meaning. In narrower approaches (e.g. Goldberg 1995), however, only those form-meaning pairings that have unit status or even only those whose properties cannot be derived from the properties of any other construction are regarded as constructions proper. Taylor’s own proposal conceives of constructions as basically synonymous with the notion of “unit” in Cognitive Grammar, i.e. “any element of a language that has been learned and that forms part of a speaker’s linguistic knowledge” (p. 126). He then discusses the differences between rule-based and construction-based approaches. Crucially, a construction-based approach tolerates a certain amount of redundancy, predicting that even regularly formed expressions may be stored in memory rather than being generated in every single case (cf. Langacker’s discussion of the “rule/list fallacy”, e.g. Langacker 1987: 29f., 492). This view is substantiated by the results of experimental studies such as lexical decision tasks. Furthermore, Taylor contrasts the autonomy of syntax hypothesis with constructional and collostructional (e.g. Stefanowitsch and Gries 2003) analyses, concluding that “even the most general syntactic patterns of a language [...] need to be regarded as constructions” (p. 145).
Chapter 7 deals with the key notion of “Frequency”. Taylor reviews a broad variety of research indicating that speakers know, at least implicitly, the relative frequency of all elements of their language. His discussion of Chomsky’s well-known “Dayton Ohio argument” reveals that the alleged correspondence between corpus attestations and real-world facts only pertains to a relatively small number of cases. “I live in New York” is indeed not only more frequent than “I live in Dayton, Ohio”; the ratio of the two frequencies even corresponds almost exactly to the ratio of the two cities’ populations (cf. Stefanowitsch 2005). The same does not hold, however, for the relative frequencies of “He lives in New York” and “She lives in New York”: while it can reasonably be assumed that both genders are roughly equally distributed among the city’s inhabitants, the former sentence is about twice as frequent as the latter. Further frequency-related issues that Taylor discusses include collocation patterns (for example, the word “unmitigated” is very likely to be followed by “disaster”, but not vice versa), prevalent phonological patterns, and ambiguity resolution. His discussion of these matters leads him to conceive of “Skewed frequencies as a design feature of language”, which is the title of Chapter 8. Rather than being a mere side-effect emerging in any language-like system, skewed frequencies contribute to the learnability of language in that they facilitate linguistic categorization at all levels of language structure and use.
The question of language acquisition is further elaborated on in Chapter 9, “Learning from input”. Focusing on phoneme acquisition, Taylor argues that language learners record the statistical properties of the input, which leads to the emergence of phonetic categories. He cites the so-called “recency effect” as evidence that speakers register the properties of the language they encounter: as both experimental and corpus studies show, the choice between linguistic alternatives is often determined by the immediately preceding context. For example, if a speaker uses a comparative construction with “more”, her interlocutor is likely to do so as well, even if forming the comparative with “-er” would be the more natural choice in other contexts.
Chapter 10 addresses one of the most widely-discussed topics of Cognitive Linguistics, namely, polysemy. Contrary to most accounts, Taylor suggests that each word should not be associated with a fixed number of discrete meanings, but that word meaning should be thought of in terms of the ways in which a linguistic form can be used, i.e. seeing words in their respective “contextual profile”. On this account, the variability and dynamicity of meaning that constitute polysemy can be seen as a consequence of its embeddedness in language use. This hypothesis is substantiated by the finding that the acquisition of prepositions rarely follows the pattern predicted by traditional network accounts of polysemy: Irrespective of their most central sense, prepositions are mostly acquired as parts of fixed expressions, which may vary from individual to individual (cf. Rice 2003). This lends support to Taylor’s overall hypothesis in that it demonstrates that a word cannot be conceived of as a fixed entry in the mental lexicon but rather “provides access, not only to the conceptual domains against which it is understood, but also to the linguistic contexts in which it has been used” (p. 244).
Chapter 11 deals with the notions of “Creativity and innovation”, which play a central role in the classic generative model. While creativity, in the generative sense, refers to the application of grammatical rules to lexical items, the notion of innovation refers to the coinage of new words or expressions. Taylor shows this distinction to be problematic, as it proves difficult to apply in many cases. In the case of reanalysis, for example, language change comes about not through innovation by an individual speaker, but through the hearer’s interpretation of the speaker’s utterance. Case studies of progressive “busy” as attested primarily in South African English (“My essay is busy being typed”), ditransitive “explain” (“explain me it”), and constructions involving “all over” illustrate that “innovation is incremental” (p. 262). Patterns that emerge as minor variants, deemed ungrammatical by most language users, gain acceptance when they become more frequent.
As a model to account for manifestations of linguistic creativity and innovation, Taylor proposes “Blending” (cf. Fauconnier and Turner 2002), which is the topic of Chapter 12. After a brief outline of blending theory, Taylor addresses the phenomena of word blending and phrasal blending both in speech errors (e.g. “that’s torrible”) and in creative language use (e.g. “glitterati”, a blend of “glitter” and “literati”, cf. Kemmer 2003). As the emergence of the plural form “process[i:]s” (in analogy to the plural forms of some words of Greek origin such as “thesis”) demonstrates, blending can also affect inflectional morphology. In the domain of phrasal blending, idioms such as “time and again” can be analyzed as blends of two (or more) distinct phrases (e.g. “time after time” + “again and again”). He concludes that blending not only “is able to do the work traditionally assigned to generative rules” (p. 279) but also accounts for linguistic creativity and innovation.
The final chapter, “The mental corpus”, summarizes the main hypotheses of the book:
1) Linguistic knowledge emerges from exposure to actual language use.
2) Knowing a language means much more than just knowing the words of a language. Instead, language is organized in terms of constructions: “Knowledge of words and knowledge of constructions cannot easily be teased apart” (p. 282). A large part of what constitutes a language has to be considered idiomatic.
3) Frequency of occurrence is highly important with regard to the emergence of usage norms in a language. Speakers know, at least implicitly, the frequency profile of linguistic items; moreover, skewed frequencies can be considered a design feature of language.
4) The mental corpus does not (only) consist of a number of fixed words and expressions. Instead, speakers are capable of generalizing over the data they encounter. This facilitates language comprehension in that it provides schemas against which utterances can be understood. Furthermore, it lays the ground for linguistic creativity and innovation.
5) This has important bearings on a theory of language acquisition, which has to be conceptualized as an input-driven, bottom-up process. Given the dialectic relation between I-language and E-language, acquisition can also be seen as a life-long process.
6) Importantly, these hypotheses are substantiated by both corpus studies and experimental findings.
In the tradition of approaches to language that highlight the importance of usage frequency (e.g. Langacker 1988, 2000; Bybee 2007), Taylor makes a convincing case for a radically usage-based account of linguistic knowledge. While the foundational works of Cognitive Grammar often relied on invented examples, almost all the examples Taylor presents are derived from corpora. Furthermore, Taylor takes into account a broad variety of experimental studies. This makes his book a major step towards an empirically grounded cognitive-linguistic theory of language.
Taylor maintains the highly readable style that characterizes his textbooks on Cognitive Grammar (2002) and linguistic categorization (2003) without ever oversimplifying matters. His book is therefore equally recommendable to students of linguistics, advanced scholars, and even interested laypersons.
What is more, his “mental corpus” hypothesis ties in neatly with current approaches in cognitive linguistics and also has important implications for some issues that lie outside of the scope of Taylor’s monograph. For example, his hypothesis is highly compatible with approaches that view language as a complex adaptive system (CAS, e.g. Beckner et al. 2009), a framework that has also been proposed for research in historical cognitive linguistics (Frank and Gontier 2010). As both the “mental corpus” hypothesis and the CAS approach draw attention to the “intrinsically diachronic aspect of language as a system” (Frank and Gontier 2010: 48), his work also has implications for historical linguistics in that it highlights the role of usage and frequency in language change. Moreover, Taylor’s observations, especially those concerning the paramount importance of idiomatic knowledge for a speaker’s language proficiency, can prove fruitful for foreign language teaching (cf. also Langacker 2008b: 84).
All in all, Taylor’s book is highly recommended for anyone interested in usage-based theories of language and linguistic knowledge. It offers valuable insights not only to cognitive linguists and corpus linguists, but also to historical linguists and second language teachers.
Beckner, Clay; Blythe, Richard; Bybee, Joan; Christiansen, Morten H.; Croft, William; Ellis, Nick C.; Holland, John; Ke, Jinyun; Larsen-Freeman, Diane; Schoenemann, Tom. 2009. Language is a Complex Adaptive System. Position Paper. Language Learning 59 Suppl. 1, 1–26.
Bybee, Joan. 2007. Frequency of Use and the Organization of Language. Oxford: Oxford University Press.
Chomsky, Noam. 1986. Knowledge of Language. Its Nature, Origin, and Use. New York: Praeger.
Chomsky, Noam. 1995. The Minimalist Program. Cambridge, MA: MIT Press.
Fauconnier, Gilles; Turner, Mark. 2002. The Way We Think. Conceptual Blending and the Mind’s Hidden Complexities. New York: Basic Books.
Frank, Roslyn M.; Gontier, Nathalie. 2010. On Constructing a Research Model for Historical Cognitive Linguistics (HCL). Some Theoretical Considerations. In: Winters, Margaret E.; Tissari, Heli; Allan, Kathryn (eds.): Historical Cognitive Linguistics. Berlin, New York: De Gruyter (Cognitive Linguistics Research, 47), 31–69.
Goldberg, Adele E. 1995. Constructions. A Construction Grammar Approach to Argument Structure. Chicago, London: The University of Chicago Press.
Jackendoff, Ray. 2002. Foundations of Language: Brain, Meaning, Grammar, Evolution. Oxford: Oxford University Press.
Kemmer, Suzanne. 2003. Schemas and Lexical Blends. In: Cuyckens, Hubert; Berg, Thomas; Dirven, René; Panther, Klaus-Uwe (eds.): Motivation in Language. Amsterdam: John Benjamins, 69-97.
Langacker, Ronald W. 1987. Foundations of Cognitive Grammar. Vol. 1: Theoretical Prerequisites. Stanford: Stanford University Press.
Langacker, Ronald W. 1988. A Usage-Based Model. In: Rudzka-Ostyn, Brygida (ed.): Topics in Cognitive Linguistics. Amsterdam; Philadelphia: John Benjamins (Amsterdam Studies in the Theory and History of Linguistic Science, 50), 127–161.
Langacker, Ronald W. 1991. Foundations of Cognitive Grammar. Vol. 2: Descriptive Application. Stanford: Stanford University Press.
Langacker, Ronald W. 2000. A Dynamic Usage-Based Model. In: Barlow, Michael; Kemmer, Suzanne (eds.): Usage-based models of language. Stanford: CSLI Publications, 1-63.
Langacker, Ronald W. 2008a. Cognitive Grammar. A Basic Introduction. Oxford: Oxford University Press.
Langacker, Ronald W. 2008b. Cognitive Grammar as a Basis for Language Instruction. In: Robinson, Peter; Ellis, Nick C. (eds.): Handbook of Cognitive Linguistics and Second Language Acquisition. London: Routledge, 66-88.
Rice, Sally. 2003. Growth of a Lexical Network. Nine English Prepositions in Acquisition. In: Cuyckens, Hubert; Dirven, René; Taylor, John R. (eds.): Cognitive Approaches to Lexical Semantics. Berlin: De Gruyter, 243-280.
Stefanowitsch, Anatol. 2005. New York, Dayton (Ohio), and the raw frequency fallacy. Corpus Linguistics and Linguistic Theory 1, 295-301.
Stefanowitsch, Anatol; Gries, Stephan Th. 2003. Collostructions: Investigating the Interaction of Words and Constructions. International Journal of Corpus Linguistics 8, 209-243.
Taylor, John R. 2002. Cognitive Grammar. Oxford: Oxford University Press.
Taylor, John R. 2003. Linguistic Categorization. 3rd ed. Oxford: Oxford University Press.
ABOUT THE REVIEWER
Stefan Hartmann is a PhD student in historical linguistics at the University of Mainz, Germany. He is currently conducting a corpus-based study on the diachronic change of German nominalization patterns. Apart from historical and corpus linguistics, his research interests include Cognitive Linguistics, sociolinguistics, and psycholinguistics.