10 Mar '16

Lost in Translation

There is a popular story among translators that during the height of the Cold War, the Pentagon attempted to use computers to translate the reams of Russian intelligence being gathered. During these experiments, the phrase "out of sight, out of mind" was fed into the computer, translated into Russian and then retranslated back to English. The result: "Invisible Insanity". Whether this tale is fact or urban legend hardly matters; it neatly encapsulates the problem with machine translation (MT).


Beginning in the early 1950s, the approach to the translation problem was to build computerized rule-based systems. The idea was that, knowing the rules of the source language and the target language, one could be reliably transformed into the other. Although some progress was made using this approach, limited computing power and the sheer number of linguistic rules proved to be formidable obstacles to attaining the consistency and quality demanded by potential consumers of the service. The first major breakthrough came in the 1990s, when IBM abandoned the rules-based approach and instead tackled the issue statistically. Rather than attempting to have the computer understand what was being input, engineers fed their systems as much parallel translated text as possible and had them compute the most probable meanings of words and phrases based on statistical precedent.
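To make the statistical idea concrete, here is a deliberately tiny Python sketch of the principle, not a description of how IBM's or Google's systems actually work: a hand-built "phrase table" (every entry and probability below is invented for illustration) maps each source phrase to candidate translations supposedly observed in a parallel corpus, and the translator simply picks the most probable candidate.

    # Toy illustration of phrase-based statistical translation: every phrase
    # pair "seen" in a parallel corpus gets a probability, and translation
    # simply picks the most probable target phrase. The phrase table and
    # probabilities below are invented for the example.

    phrase_table = {
        "out of sight": [("fuera de la vista", 0.7), ("fuera de vista", 0.3)],
        "out of mind": [("olvidado", 0.6), ("fuera de la mente", 0.4)],
    }

    def translate_phrase(phrase: str) -> str:
        """Return the most probable translation observed in the (toy) corpus."""
        candidates = phrase_table.get(phrase)
        if not candidates:
            return phrase  # unseen phrase: no statistics to fall back on
        best_translation, _probability = max(candidates, key=lambda pair: pair[1])
        return best_translation

    if __name__ == "__main__":
        for phrase in ("out of sight", "out of mind"):
            print(phrase, "->", translate_phrase(phrase))

Real systems learn millions of such entries automatically and also score whole sentences for fluency, which is exactly where the corpus and storage limitations described below come into play.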


While the complexity of rules-based systems limits their effectiveness, consistency and quality, the statistically based systems are running up against the limitations of available corpora as well as enormous demands on storage. Accurate translation of less widely used languages presents a particular problem, since there is comparatively little text from which to "learn". Google Translate, Babelfish (powered by Systran) and Globalink's Comprende have made tremendous advances in the last decade, but the marginal improvement in the services being offered is shrinking. To enhance translation quality and consistency, the industry is turning to hybrid models that combine statistical and rules-based approaches. Even so, nuance, colloquialism and idiomatic vernacular remain formidable obstacles to further refinement of the software.


The nascent industry has its share of proponents and detractors. Nicholas Ostler, chairman of the Foundation for Endangered Languages, believes that mechanical translation will eventually liberate the world from the necessity of learning dominant languages and will contribute to linguistic diversity. With regard to the current plateau in MT advancement, Ostler suggests that even if you don't like what a mechanically translated document says, you can still immediately make sense of the translation and compare it with what you know about the subject at hand. MT still demands a degree of familiarity from the user, but it delivers a much better product than before, and there is little doubt it will continue to improve. Andreas Zollmann, a researcher in the field, states: "We are now at this limit where there isn't that much more data in the world that we can use, so now it is much more important again to add on different approaches and rules-based models."


Douglas Hofstadter, author of the seminal book on consciousness and machine intelligence, Gödel, Escher, Bach: An Eternal Golden Braid, is particularly critical of the effects of mechanical translation. Hofstadter believes that MT starts from the wrong place: "There is no attempt at creating understanding and therefore Google Translate is doomed to the same kind of failure for ever. Of course they get occasional good results, but essentially it is mindless. They are rendering a very low-level service that will always produce something not far above the level of nonsense. I suppose that we will all bow to the pressures to use it at some level, but it will never get the flavour of phrases." This is the most persistent criticism of MT: a computer's inability to recognize nuance, tone, cultural relevance and wordplay.


While MT can contribute significantly to translations of technical manuals, weather reports and other subjects with a relatively small and standardized corpus, documents that incorporate rhetorical inference, polemical issues and unique phraseology will continue to challenge the quality of MT output. Mechanical translation is not going to disappear. Given this fact, it is essential that professional translators embrace the technology and harness its ability to translate huge quantities of data quickly and inexpensively. Those cost and speed advantages, combined with the expertise of professional translators who can refine the MT output for better comprehension, can only advance the industry, making it more efficient while maintaining the level of quality that end users expect and require. As MT reshapes the market, progressive translation firms may need to specialize in certain languages, industries or subject matters. Quality translation will never be optimized without the human touch.


http://www.spotlight-online.de/blogs/the-spotlight-team/invisible-insanity

About The Author

An accomplished author, Jason brings a diverse skill set to MKTG. He originally started at the company as a research assistant and, after spending time overseas, returned to the team in 2008 as Manager of Special Projects. In his current role, Jason oversees MKTG’s special projects, with a particular focus on employee development, training, multimedia translation requests and other large-scale or special-skill opportunities.
