M1: Unit 1: Chapter 6 – Beyond Descriptivism


Chapter 6: Beyond Descriptivism

6.1 Contextualizing translation in the way descriptivism tends to do has at least two distinct advantages. Firstly, it casts the translator as an active and thinking social being, not just a linguistic decoding machine or a drudge with a good dictionary. Secondly, translation norms ultimately point up larger social and ideological structures. This makes translation a more interesting object of study, but there is an extra dimension. A culture normally builds an image of itself in relation to that which lies outside it, i.e. in relation to what is ‘other’. Translation is then the privileged domain of study where we can observe a culture confronting otherness and, in the same gesture, transforming it into its own modes and categories. In that sense translation provides us with a window on cultural self-definition and identity. The norms that govern translation determine to a large extent how other cultures will be presented to the receptor audience. Globally, the norms of translation prescribe what is to be selected for translation, how the material is to be handled, and how it is likely to be received.

6.2 If the process of translation is governed by norms, whose norms are they? And if the translator follows conflicting norms and delivers a hybrid product, can we allow that? The descriptivist’s answer is yes, because the translator’s norms are not necessarily the researcher’s. While the researcher’s work is governed by the scholarly norms and rules of his or her discipline, the translation norms which the researcher is trying to identify and analyze operate on the level of the object of study. The difference is that between object-level and meta-level.

6.3 This has problematic implications, though. It means that translations made on the basis of translation norms very different from those which prevail in our world today (but what does ‘our’ mean here?), and which the researcher may well take for granted, are nevertheless translations. The point is that, if we want to give a non-normative definition of what translation is, we cannot base that definition on immanent or essentialist features. It is hard to delimit the notion of translation in an absolute or a-temporal sense without becoming normative. From a semiotic perspective there is no clear dividing line between translation in the conventional interlingual sense and intralingual or intersemiotic operations which involve cross-overs from one sign system to another. The solution adopted in empirical studies is to say that the category ‘translation’ is culture-dependent and therefore relative. Different cultures delineate and organize the field of translation in different ways. Norms are among the prime instruments that cultures use to define and delimit the field of translation, because they mark the boundary between what is accepted as legitimate (or ‘proper’) translation and what is not. It is the researcher’s task to explore the extent and internal organization of that field. Norms make translation into a cultural product rather than a fixed concept.

6.4 Where does all this leave equivalence? In the traditional approach, as we saw in Chapter 3, equivalence defined translation. Only a target text which possessed a sufficient amount of equivalence, of the right kind, could be called a translation. In the last few decades the notion has been progressively hollowed out. ‘Dynamic’ or ‘communicative’ or ‘functional’ equivalence was contrasted with ‘formal’ equivalence, different kinds of equivalence were distinguished as being appropriate in different circumstances, etc. (see Chapter 4). In Gideon Toury’s work, for example, the concept has been reduced to an empty label. His reasoning is that if in a given community text A is regarded as a translation of text B, then we agree to call the relation between them a relation of equivalence. What that relation actually amounts to has to be explored from case to case. In that way the researcher can colour in the blank label and determine what kind of ‘equivalence’ we have. Put differently, equivalence is now seen as a consequence of translation, not its precondition. That is the exact opposite of the traditional view.

6.5 A very different line of development that builds both on descriptive/empirical studies and on linguistics is that based on corpus studies. A corpus is a computerized collection of documents. Computer technology allows us to scan in, say, 50 novels (and if each novel is 200,000 words, that means 10 million words), store their texts, and tag them in a variety of ways. There are now corpora that contain original texts and matching translations, and there is software that allows us to compare individual paragraphs or sentences in the original and in translation.
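To make the idea of a parallel corpus concrete, here is a minimal sketch in Python. The sentence pairs are invented placeholders, not taken from any real corpus, and real corpus software does far more (tagging, indexing, fuzzy alignment); the sketch only illustrates the basic data structure of aligned originals and translations and one trivial comparison over it.

    # A toy parallel corpus: each entry pairs a source sentence with its translation.
    # The sentences are invented examples, not drawn from any real corpus.
    parallel_corpus = [
        ("De kat zat op de mat.", "The cat sat on the mat."),
        ("Het regende de hele dag.", "It rained all day."),
    ]

    def word_count(sentence):
        """Crude tokenization: split on whitespace and strip surrounding punctuation."""
        return len([w.strip(".,;:!?") for w in sentence.split() if w.strip(".,;:!?")])

    # Compare each aligned pair, here simply by sentence length in words.
    for source, target in parallel_corpus:
        ratio = word_count(target) / word_count(source)
        print(f"{word_count(source):2d} -> {word_count(target):2d} words (ratio {ratio:.2f})")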

6.6 Among the most rudimentary things a corpus study might investigate is the ratio of types to tokens in a text. A token is each word as it occurs. For example, the sentence ‘The cat sat on the mat’ contains 6 words, i.e. 6 tokens. A type is each different word, which means you count a word only the first time it appears and ignore repetitions. ‘The cat sat on the mat’ contains only 5 types: ‘the’ (which occurs twice but is counted only once), ‘cat’, ‘sat’, ‘on’, and ‘mat’. A text that repeats the same words many times has a low type-token ratio and will come across as lexically repetitive. The type-token ratio is thus a simple measure of a text’s lexical variety.
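A minimal sketch of the calculation in Python, using the example sentence above (a real study would of course use proper tokenization and much longer texts):

    def type_token_ratio(text):
        """Return the ratio of distinct words (types) to running words (tokens)."""
        tokens = [w.strip(".,;:!?'\"").lower() for w in text.split()]
        tokens = [w for w in tokens if w]   # drop empty strings left over from punctuation
        types = set(tokens)                 # each different word is counted only once
        return len(types) / len(tokens)

    print(type_token_ratio("The cat sat on the mat"))   # 5 types / 6 tokens = 0.83...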

6.7 Broadly speaking, two kinds of research are currently being conducted using corpora.

6.8 One is geared to finding general patterns in translated texts, with a view to determining if there are certain features which recur in all translations, regardless of genre, translator or language. Among the features being tested in this way are the stylistic flattening sometimes associated with translation (which may be shown by a consistently broader range of vocabulary in originals compared with translations) and the greater degree of explicitation in translated texts (apparently translators across the board tend to link statements more explicitly, e.g. through the use of connectives, than writers of original texts).
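By way of illustration, here is a hedged sketch of how the explicitation hypothesis might be probed in Python. The two ‘texts’ and the list of connectives are invented placeholders; a real study would compare large tagged corpora and apply proper statistical tests rather than eyeball two short strings.

    # Toy explicitation test: how often do connectives occur per 100 words?
    CONNECTIVES = {"however", "therefore", "moreover", "because", "thus", "although"}

    def tokens(text):
        return [w.strip(".,;:!?").lower() for w in text.split() if w.strip(".,;:!?")]

    def connectives_per_100_words(text):
        toks = tokens(text)
        hits = sum(1 for t in toks if t in CONNECTIVES)
        return 100 * hits / len(toks)

    # Invented stand-ins for an original corpus and a translated corpus.
    original_corpus = "She left early. The meeting ran late and nobody noticed."
    translated_corpus = "She left early because the meeting ran late; therefore nobody noticed."

    print(connectives_per_100_words(original_corpus))    # lower score: links left implicit
    print(connectives_per_100_words(translated_corpus))  # higher score: more explicit linking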

6.9 Another line of research focuses on ‘forensic stylistics’, i.e. on detecting linguistic features that individual translators use frequently, usually without realizing it, as a sort of ‘tic’. This type of analysis was used some years ago, for example, to identify the author of the anonymously published novel Primary Colors, a roman à clef based on Bill Clinton’s 1992 presidential campaign; the analysis succeeded by using a computer to locate similarities in linguistic usage between the novel and the assorted writings of a large number of people belonging to the Clinton circle. In the same way it has been found, for example, that certain translators use certain words or constructions in all their translations, regardless of the source text or language, with a degree of frequency that is statistically significant (Baker 2000).
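In the same hedged spirit, the sketch below shows the simplest possible version of such a frequency comparison in Python: it computes how often each word occurs, per 1,000 words, in a set of texts by one translator and in a reference set, and reports the words that are strikingly more frequent in the former. The ‘corpora’ are placeholders, and a genuine study (such as Baker 2000) would work with full translated novels and proper significance tests rather than a raw frequency ratio.

    from collections import Counter

    def freq_per_1000(texts):
        """Relative word frequencies (per 1,000 words) over a list of texts."""
        words = [w.strip(".,;:!?").lower() for t in texts for w in t.split() if w.strip(".,;:!?")]
        counts = Counter(words)
        total = len(words)
        return {w: 1000 * c / total for w, c in counts.items()}

    def candidate_tics(translator_texts, reference_texts, factor=3.0):
        """Words at least `factor` times more frequent in the translator's texts."""
        trans = freq_per_1000(translator_texts)
        ref = freq_per_1000(reference_texts)
        return {w: f for w, f in trans.items() if f > factor * ref.get(w, 0.1)}

    # Placeholder 'corpora'; real data would be whole translated books.
    translator = ["Yet she went on. Yet the rain fell. Yet nothing changed."]
    reference = ["She went on. The rain fell. But nothing changed at all."]
    print(candidate_tics(translator, reference))   # flags 'yet' as a possible tic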

Further Reading (see Chapter 11): Baker 2000; Hermans 1999; Olohan 2004

Questions

Tasks

Translations