Translation, Theory and Technology


Barker Lecture

Does Mainstream Linguistic Theory Come to the Rescue?

Mainstream linguistic theory emphasizes grammatical relations in a sentence. It is essentially a sophisticated form of sentence diagramming. Depending on when and where you went to high school, you may have encountered sentence diagramming or you may have missed it entirely. A sentence diagram shows all the words of a sentence and how they fit together. Mainstream linguistic theory has added a new dimension to sentence diagramming: Universal Grammar. According to Universal Grammar, there is only one method of diagramming sentences, this method applies to all the languages of the world, and it is universal because it is genetically encoded in the brain of every human child. This is a bold thesis, and a large number of linguists around the world are working within this approach. Unfortunately, whether Universal Grammar is indeed universal or not, it says very little about the meaning of an individual word. It classifies words only according to the grammatical categories of nouns, verbs, adjectives, adverbs, and prepositions.

Not surprisingly, given the way it ignores word meanings, mainstream linguistics does not stack up very well when presented with the three types of translation difficulty we have discussed. It makes no mention of the distinction between general vocabulary and specialized terminology. This is because mainstream linguistics does not really deal with language in its entirety. It deals only with relatively uninteresting sentences that can be analyzed in isolation. Essentially, it deals with one very narrow slice of the pie of language, a slice that only appears to include the general vocabulary, and then calls that slice the whole pie. If it is true that mainstream linguistics does not really deal with the general vocabulary in all its richness, then it should be no shock to learn that it ignores the basic fact we have been exploring, namely, that a word can have several meanings, even within the same grammatical category. And if mainstream linguistics ignores the meanings of words, it has no need to take into account the context of a sentence. In fact, it has been a firm principle of mainstream linguistics for many years that the proper object of study is a single sentence in isolation, stripped of its context, its purpose, and its audience. This treatment of language on a local level (sentence by sentence) rather than on a global level has influenced the design of machine translation systems, and we have seen the results in the telescope example.

It is a big job to take on the mainstream approach in any field. To be clear, I am not saying that the mainstream is totally wrong. It does have many interesting things to say about grammar. Instead, I am saying that grammar, no matter how interesting it may be, is far from sufficient to teach a computer how to translate more like a person. Although none of the past three Barker Lectures has dealt directly with translation, I detect in them considerable support for my thesis about the insufficiency of mainstream linguistics in dealing with meaning, which I have shown to be highly relevant to translation. I trust my three colleagues would agree that mainstream linguistics does not treat meaning adequately.

Taking the past three Barker Lectures in the order they were presented, we will begin with John Robertson, who warned us against the dangers of unwarranted reductionism. Robertson uses the term reductionism to describe an unwarranted oversimplification of a problem that leaves out an essential element. Reductionism, in the broader sense that I will use in this paper, is an approach commonly used in science. Reductionism, as suggested by the name, reduces a complex phenomenon to simple underlying components. In some areas it has been spectacularly successful, such as the reduction of visible light, infrared heat, radio waves, and X-rays to variations of a single phenomenon called electromagnetic radiation. But as Robertson points out, reductionism can go too far. In linguistics, the reduction of language to grammar separated from meaning is a highly unwarranted instance of reductionism. It may give the appearance of allowing a scientific study of grammar, but ultimately it is a dead-end approach that will not form a solid basis for studying other aspects of language beyond grammar and will not even allow a fully satisfying explanation of grammar itself.

My second colleague and Barker Lecturer, Cheryl Brown, argued for the importance of words over grammar. Mainstream linguistics does not study real language as spoken by real people. Instead, it studies an "idealized, homogeneous speaker-hearer community." That is, it assumes that everyone has exactly the same internal grammar and vocabulary, that everyone is a carbon copy of everyone else. Brown ably shows through careful empirical studies that this idealization is not at all justified. She shows significant differences in the way men and women react to certain words. She gives examples of regional differences in the way certain words are used. And she shows that very advanced students of English in China are influenced by their culture in the connotations they give certain words. She illustrates the flexibility of humans in dealing with language, a flexibility which is not predicted by mainstream linguistics.

My third colleague and Barker Lecturer, Jerry Larson, described the state of the art in technology for language learning. He described many new developments that allow more sophisticated access to information, from text to sound to pictures to motion video. But he acknowledged that for a computer to evaluate the appropriateness of a student's speaking and writing, when there is not just one predetermined response, we will need software that is "far more sophisticated than any currently available." He rightly points out that such software would have to recognize not just grammar but meaning, take into account the context of what is said, and adjust for cultural factors.

This section of the paper was supposed to explore whether mainstream linguistics adequately addresses the types of translation difficulty I identified in the first section. I can now answer, with the support of my colleagues, in the negative. All three types of difficulty require a sensitivity to meaning, not just a mechanical attention to which words are used and how they are related grammatically. If mainstream linguistics cannot come to the rescue of those who want to program a computer to translate more like a person, then what kind of linguistics would it take? It is clear that it would take some approach to language that deals directly, not peripherally, with meaning. It is less clear what that approach should be. When you start working with meaning and try to pin it down so that it can be programmed into a computer, you begin to sympathize with the reluctance of mainstream linguistics to deal with meaning. And you come up against some pretty big issues in philosophy. For example, you eventually have to deal with the question of where meaning comes from. Are meanings already out there somewhere before we even make up words for them? Or do we create meanings out of nothing? How do we manage to communicate with others?

Some approaches to meaning assume that there is one basic set of universal concepts on which all other concepts are based. In this approach, which is sometimes called objectivism and dates back at least to Descartes, everyone, good or evil, must deal with these same starting concepts. I do not make that assumption. I begin instead with the assumption that meanings are not absolutely imposed on us by the nature of the universe but that they are not entirely arbitrary either. Then where does meaning come from? I will now discuss a key factor that I believe to be missing from current theories of language. An approach to language that incorporates this factor should bring us closer to dealing adequately with meaning. Such an approach should guide us in the design of a computer that could translate like a person.

