The need for a book like “Translation and Web Localization” emerged as a result of the deep changes the Internet has brought to our lives. Web localization is now entrenched as a field of study beyond translation and localization, one which embraces “cognitive, pragmatic, discourse, communicative and technological perspectives” (p. 2) in the process of rendering web texts suitable for readers who speak different languages and live in different socio-economic contexts. The book is organized into a brief introduction and three parts. The first part defines the main terms (localization and web localization); the second deals with current issues in web localization research; and the final part looks at the future of web localization.
PART I: TECHNOLOGY, LOCALIZATION AND TRANSLATION: EVOLVING CONCEPTUALIZATIONS
1. The emergence of localization
Chapter 1 delineates the field of localization, which has evolved over the last two decades either as a distinct process or as a technological extension of translation. Localization of digital texts emerged as a need with the continuing growth of the Internet, and spread as a separate branch within Translation Studies during the late 1970s and early 1980s. Localization initially proceeded from English into other languages; later the procedure was also reversed, with web pages localized into the lingua franca to address global audiences. Early on, it became obvious that localization involves more than the translation process carried out by linguists.
In the rest of the chapter, Jiménez-Crespo gives an overview of existing definitions of localization. Depending on their starting point and perspective, scholars from various disciplines define localization either as the process of rendering a digital text usable by audiences in different sociocultural regions and languages, or as the product of that process.
For the industry, localization is a translation process with additional components: through a process of adaptation, it produces texts that have the ‘look and feel’ of locally made products.
Definitions of localization from the perspective of Translation Studies distinguish two stages: the stage of translation and the stage of technical adaptation; the latter includes the processes of making the content accessible, usable and culturally suitable for the target audience.
Finally, the author develops his definition of web localization and its place within translation studies. This definition includes textual, linguistic, cognitive, communicative, technological, sociological and target-oriented perspectives and requires cooperation among different agents beyond translators.
2. The web localization process: From GILT to web usability
Chapter 2 records the global cycle of web localization, which is seen as part of the GILT (Globalization, Internationalization, Localization and Translation) process. Within the GILT cycle, the actual localization process is just one stage affected by other processes taking place well before it starts and long after it ends. Various localization types (e.g. videogame localization, software localization, web localization etc.) are distinguished; they share several characteristics and, at the same time, show stark differences.
The localization process is broken down into levels; different agents, including business managers, localizers, translators, QA (Quality Assurance) operators and many others are involved in the process at various stages. Cultural adaptation plays a crucial role since various culture-dependent issues should be taken into consideration. Last but not least, web usability has to do with how the localized product is received by the target audience; the localization process is successful if the localized site is as clear, concise and efficient as possible.
PART II: CURRENT ISSUES IN LOCALIZATION RESEARCH
3. Web localization and text
The purpose of chapter 3 is to provide a new definition of ‘text’ in Translation Studies from an interdisciplinary perspective that includes Text Linguistics, Applied Linguistics and Translation Studies. Such an analysis is long overdue since the technological revolution and the emergence of new forms of hypertextuality, textual segmentation and reuse have challenged all existing notions of text. Texts have become hypertexts, with different requirements in terms of cohesion and coherence.
More particularly, the chapter starts with a brief introduction to the definition of text in linguistics since the structuralists of the 1960s, moves on to text in Translation Studies, and arrives at text in web localization. The notion of text is ultimately defined as “a digital interactive activity that is coherently developed as a unit and presented to users as such” (p. 51).
Jiménez-Crespo explores further the impact of technologies on translation, mainly focusing on Content Management Systems (CMS) and Globalization Management Systems (GMS), and how they reshape web localization processes. He argues that instead of having traditional source and target texts, we have internationalized texts from which all language- and culture-dependent features have been removed.
The chapter also explores hypertext theory as it is relevant to web localization. Unlike printed texts, which have a linear structure, hypertexts are read in different orders according to readers’ preferences; web localizers, moreover, adopt a non-linear reading based on programming criteria, different from that of end users. The extensive use of hyperlinks makes the structure of hypertexts different from that of printed texts. Even the notions of cohesion and coherence need to be adapted when talking about hypertexts, since, as the author points out, “coherence building in hypertext depends more on forward-looking mechanisms rather than on classical cohesive ties between textual elements” (p. 61). This shift from text to hypertext, the author argues, and especially the openness and dynamic nature of hypertext, affects the translation process.
4. Web localization and digital genres
In chapter 4, Jiménez-Crespo reviews genre theory and its significance for localization. With the development of internet-mediated communication over the last two decades or so, new digital genres have emerged and proliferated. Web localizers need to know the prototypical features of each of them in both the source and the target context in order to follow the conventions receivers are accustomed to. Since genres differ across cultures, genre theory plays an important role in Translation Studies. The notion of genre has been used alongside that of text type since the 1960s in Linguistics and Discourse Analysis. Texts are classified into various text types according to their function; they can also be multifunctional, meaning that a text can serve more than one rhetorical purpose.
According to the author, digital genres have been evolving faster than other genres due, in part, to the constant evolution of the functionalities of the web. The distinction between novel and extant digital genres is particularly interesting: the former are genres with no printed counterpart that emerged on the web, whereas the latter have been transferred to the web from print.
In the rest of the chapter the author develops a framework for analyzing digital genres in Translation Studies and localization research. The new parameter that he adds to the ESP (English for Specific Purposes)-inspired model of genre analysis for web localization is that of interactivity and functionality of digital genres. The taxonomy of web genres proposed in empirical research on localization is based on three main criteria: (a) the purpose of the genre (to advertise, to inform, to entertain, etc.); (b) the communicative function (expositive, argumentative, persuasive-exhortative), and (c) the type of communicative process established (community to community, individual to individual, etc.). All the proposed genres should be treated as prototypical, embodying the core features, and each one having degrees of variation.
5. Web localization and translation quality
In chapter 5, Jiménez-Crespo discusses the issues of translation quality and evaluation, which are inherently controversial for web localization. Being a relatively new phenomenon, the web does not have a set of canonized criteria for its evaluation, and it is still arguable how much theory and empirical research is needed to formulate a theory of web localization quality. From the point of view of the industry, various contextual and procedural constraints make translation dependent upon internal and external quality parameters, e.g., clients’ and end-users’ goals, intended purpose, text type or genre, cost, time constraints, etc. The author then presents various theoretical models of QA that have been used so far. The first such model, which is error-based, is the LISA (Localization Industry Standards Association) QA model, the most widely used for both translation and localization quality evaluation. Theoretical issues covered in error-based approaches include, among others, the definition of the notion of error, error taxonomies, the impact of errors, and quality thresholds. Holistic, textual, pragmatic and corpus-based approaches are also discussed as alternatives to error-based ones.
The evaluation framework proposed by the author combines and adapts existing trends to the specifics of web localization, aiming to bridge the gap between industrial and Translation Studies (TS) perspectives. It is meant to be used as a template from which customized evaluation frameworks can be developed, rather than as a complete evaluation method per se. The quality of a localized website is relative to three properties: adequacy, accuracy and effectiveness. The framework incorporates two possible evaluation methodologies, the error-based and the holistic one.
6. Web localization and empirical research
The purpose of chapter 6 is to review the main paradigms, models and methods used in Translation Studies (TS) which can serve as a basic introduction to web localization research. It starts from the premise that the lack of theoretical research on web localization still hinders the development of empirical research. Within the interdisciplinary context of TS, web localization emerges as interdisciplinary as well, drawing on fields like foreign languages, linguistics, computational linguistics, translation, computer science, graphic design, information management, etc.
Starting from Holmes’ (1988/2000) framework of research in TS, which subdivided the discipline into Pure and Applied TS, the author reviews its impact upon Localization Studies (LS), the newly developed field. LS is divided into Pure and Applied, the former including a theoretical and a descriptive branch. Applied LS covers most of the existing applied research on localization, including cases that do not share TS models, methodologies or theories.
In the rest of the chapter, Jiménez-Crespo gives guidelines on how to apply research models and paradigms of TS in web localization. Paradigms are defined as “sets of principles that underline different theories” (p. 143), whereas models are the “interrelated networks of concepts that we use to discuss translation or localization” (p. 144). Alternative classifications of models are discussed.
The chapter continues with a description of the research methodologies and research design that should be adopted before embarking on research in web localization; this is summarized in the form of a useful checklist of issues to be covered while planning for research.
At the end of the chapter, the author discusses the main challenges of research in the area of web localization with the emphasis placed on corpus-based studies.
PART III: LOCALIZATION AND THE FUTURE
7. Web localization and training
The chapter starts from the premise that web localization can no longer be treated as a peripheral task for translators and should be a core component of their training. It is therefore essential to build a model of localization competence which should be seen as a special subset of general translation competence.
Localization training sits in the middle of a conflict between the industry, which highlights the technological component, and TS academics, who pay more attention to theoretical training and are consequently accused of producing localizers inadequately prepared for the real-world market. According to Jiménez-Crespo, a professional localizer should possess a balance of advanced translation competence and technological and management skills. In his attempt to fit localization competence within current translation competence models, the author explores various issues in translation competence and training research. The model on which he places significant emphasis is that of the PACTE (Process of Acquisition of Translation Competence and Evaluation) research group. It has been in development since 1997 and has been tested with professional translators and foreign language teachers. It consists of five interrelated subcompetences: strategic, bilingual, extralinguistic, knowledge about translation, and instrumental subcompetence.
Jiménez-Crespo builds his model of localization competence on this model, arguing that the emphasis the industry places on technological or engineering skills can be accommodated within the instrumental and knowledge-about-translation subcompetences.
The model then divides localization competence into the following subcompetences:
a. instrumental-technological subcompetence
b. knowledge-about-translation/localization subcompetence
c. specialized bilingual and extralinguistic subcompetences
d. strategic subcompetence
Each one of them encompasses various skills, abilities and knowledge. The chapter ends with a discussion of how this model can be turned into a translation training program and a teaching methodology that accounts for different settings. In most cases, localization training is conceptualized as a specialization within the larger framework of translation training, which means combining translation competence with computational engineering. The important characteristic of this framework is that it aims to account for the fuzzy area between the translator and the multilingual developer or engineer; despite their different starting points, both can develop the additional necessary skills and acquire localization competence. The chapter concludes by arguing that “A shared localization competence model informed by theoretical and empirical research in TS and related disciplines and informed by industry experts can represent a common base for localization training acquisition resulting in better outcomes” (p. 185).
8. Future perspectives in localization
The aim of this chapter is to discuss the evolution and future perspectives of web localization in the context of constant technological development. Jiménez-Crespo distinguishes two main areas of interest: the first has to do with the impact of technology on the process of web localization, and the second with the effect of web localization on TS. One of the key issues examined is professionalization, especially since Machine Translation and crowdsourcing seem to threaten the status of the translator-localizer as a highly skilled professional. Crowdsourcing emerged and developed alongside Web 2.0 as volunteer translation produced collaboratively by communities of internet users, often on dedicated platforms. According to the author, this practice does not constitute a threat to the localization profession.
Finally, Jiménez-Crespo discusses the role of Machine Translation (MT) in web localization and argues that we are moving away from using MT engines to localize entire websites automatically and towards a Human-Assisted Machine Translation (HAMT) model, in which experts post-edit the output of MT systems.
The goals of the book and the intended audience are clearly and specifically described by Jiménez-Crespo. The book fills an important gap in the interdisciplinary field of web localization, especially since it moves beyond both the translation-studies perspective and the procedural description of best professional practices.
As a pedagogical tool, the book’s structure is very user-friendly: each chapter opens with a short introduction that prepares the reader for its content and closes with a brief summary. The sections on further reading are also interesting and helpful.
The volume makes an important contribution to the field of web localization research for several reasons. First, and most significantly, it argues for the need to treat web localization as a field different from translation. In addition to that, it places the localization process in the context of the global GILT cycle. In this reviewer’s opinion, chapter 2 is one of the most important ones in the book since it takes into account all the agents involved in the process arguing for the need to consider not only the product but the process as well.
Another interesting point raised in the book is that of training in web localization. The relevant chapter is very helpful: it builds a model of subcompetences that derives elements from both theoretical and empirical research. For this reason, the book is of great value to all those working in translation studies, once they recognize the need to offer separate training for web localization.
Holmes, J. S. 1988/2000. The Name and Nature of Translation Studies. In L. Venuti (ed.), The Translation Studies Reader. London: Routledge, pp. 172-185.
ABOUT THE REVIEWER:
Nadia Economou has been working at the Institute for Language and Speech Processing (ILSP) / R.C. “Athena” since 1994, where she currently holds the position of Principal Researcher in the area of Modern Greek language teaching with multimedia. She holds a B.A. in Linguistics from the University of Athens, an M.A. in Language Studies from the University of Lancaster, U.K., and a Ph.D. from the same university, with a specialisation in Educational Linguistics. In the Department of Educational Technology at ILSP / R.C. “Athena” she has been involved in the design and development of educational multimedia software, and she has worked on various projects as a researcher and/or coordinator. She has published research papers in the areas of language teaching and learning, CALL and discourse analysis. Her current research interests include multimedia technologies in education, language teaching and learning, and discourse analysis.