A quadrillion has fifteen zeroes. And the above figure – 250 quadrillion – is a rough average of the number of words written on web pages all over the world on any single day.
Now compare that with the number of words – approximately 1.6 billion – that are translated each day from one language into another. The rest remain untranslated, and hence out of reach of a majority of people around the world.
This is precisely where translation, as a service and now a necessity, comes in. In today’s world economy, dominated by e-commerce companies and international businesses, it is imperative that content be translated from its original language into a number of other languages to ensure it reaches all corners of the world. Maximizing the reach of a message leads to greater understanding, which in turn leads to more interactions among the masses (read: consumers), and ultimately to more sales, more business and better profits.
Translation can broadly be classified into two major categories:
The Human Translation Process: A person first decodes the source text (the text that needs to be translated), then re-encodes (‘translates’, in common parlance) it into the desired target language.
The Machine Translation Process: Specialized software is programmed to comprehend a body of text the way a human being would and to produce a new (translated) body of text in the desired target language, with minimal (or ideally zero) human intervention.
A perpetual debate rages over the two: human translators versus machine translation. The crux of the matter? Translation is, surprisingly, several times more complex than merely writing in a particular language. And since language isn’t just about playing with words, machines have not yet reached the point where they can make sense of a language the way humans can. Superficially, perhaps; but that’s about it.
No doubt, machine translation has a lot going for it. First and foremost, it is an excellent research vehicle. An average human translator is thought to manage at most 2,000 words a day, whereas machine translation can generate thousands of words per minute and hence saves the day when you are pressed for time. Compared to human translation, it is quick, cheap and confidential, since you need not entrust your classified information to a third person who might end up misusing it. Moreover, machine and online translators are generally universal in scope, unlike a professional human translator, who can specialize in only one field or language pair at a time. And since computers originally came into being to boost human productivity, we assume it can only be beneficial to use them to our advantage and get more work done in less time, with the least effort. Or can it?
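To put that speed gap in perspective, here is a quick back-of-the-envelope comparison. The 2,000 words/day human figure comes from the claim above; the 3,000 words/minute machine rate and the eight-hour working day are assumptions for illustration:

```python
# Throughput comparison: human translator vs. machine translation.
HUMAN_WORDS_PER_DAY = 2_000     # upper estimate cited above
MT_WORDS_PER_MINUTE = 3_000     # assumed rate ("thousands of words per minute")
WORK_MINUTES_PER_DAY = 8 * 60   # assumed eight-hour working day

mt_words_per_day = MT_WORDS_PER_MINUTE * WORK_MINUTES_PER_DAY
speedup = mt_words_per_day // HUMAN_WORDS_PER_DAY

print(f"MT output per day: {mt_words_per_day:,} words")  # MT output per day: 1,440,000 words
print(f"Speedup over one human translator: {speedup}x")  # Speedup over one human translator: 720x
```

Even with a far more conservative machine rate, the raw throughput difference remains several orders of magnitude, which is exactly why speed is the argument machine translation always wins.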
Now, let’s look at the other side of the coin. Agreed, machine translation saves time; but what about accuracy? The end product is usually just a mechanically generated group of words assembled according to formal rules: an approximation, or a ‘gist’, of the original content. The popular perception that machine translation is ‘free for all’ is also a misconception; even Google Translate, for instance, is not free beyond a certain usage volume.
One would assume that in today’s world of cutting-edge technology, translation tools would be equally well developed and the process correspondingly streamlined. The translators, though, have a different story to tell. Even today, there is no proper ‘What You See Is What You Get’ tool for in-context translation of websites. As a result, translators inevitably spend more time than necessary working on a different system for each step of the machine translation process: one system to procure the files that need to be translated, another to carry out the actual translation, and a third to deliver the finished copy. With so many steps involved, the probability of errors creeping in rises sharply. Not to forget that our original purpose in using machines as translators was to simplify the entire process. Is this what we would call ‘simplifying things’?
The bottom line, and the most important factor working against machine translation: it lacks accuracy and needs the manual intervention of a translator to make corrections. Machines cannot concentrate on a particular context and analyze it in terms of grammar, semantics or idiom. Translators prioritize quality, whereas machine translation is more about literal renderings of groups of words, irrespective of what they end up meaning. For instance, if you go by what Google Translate says, then “Tom Cruise is Spanish for Heath Ledger”. In reality, the Spanish for Tom Cruise is Tom Cruise. Why? Go figure.
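The ‘literal rendering of groups of words’ problem is easy to demonstrate with a toy sketch. The tiny dictionary below is invented purely for illustration; it stands in for the formal substitution rules a naive rule-based system applies, with no grasp of idiom or context:

```python
# Toy English-to-Spanish lexicon (illustrative only, not a real MT dictionary).
LEXICON = {
    "it": "lo", "is": "está", "raining": "lloviendo",
    "cats": "gatos", "and": "y", "dogs": "perros",
}

def naive_translate(sentence: str) -> str:
    """Word-for-word substitution: the literal approach described above."""
    return " ".join(LEXICON.get(word, word) for word in sentence.lower().split())

# The idiom "raining cats and dogs" (raining heavily) is substituted word by
# word, producing nonsense no human translator would ever write:
print(naive_translate("It is raining cats and dogs"))
# -> lo está lloviendo gatos y perros
```

A human translator would render the idiom’s meaning (“está lloviendo a cántaros”), not its individual words; the substitution approach has no mechanism for even noticing that an idiom is present.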
No doubt, language barriers are being broken down, and the present era is witnessing a revolution in the role of traditional human translators, thanks to what we know as MT + PE (Machine Translation + Post-Editing). Would it be fair to say that machine translation technology is a welcome addition to the erstwhile exclusively manual (read: human) world of translation? Definitely, provided the machines eventually learn to tackle the cognitive aspect of content as well. Till then, human translators will continue to be an indispensable cog in the wheel of the grand scheme of things in the translation world.
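The MT + PE workflow described above can be sketched as a two-stage pipeline. Both stage functions here are hypothetical placeholders: in practice the first would call an MT engine and the second would route the draft to a human post-editor:

```python
from dataclasses import dataclass

@dataclass
class TranslationJob:
    source: str
    draft: str = ""           # raw machine output (fast, cheap first pass)
    final: str = ""           # human post-edited result (accuracy pass)
    post_edited: bool = False

def machine_translate(text: str) -> str:
    """Placeholder for an MT engine call (hypothetical)."""
    return f"[MT draft of: {text}]"

def post_edit(draft: str, editor_fix: str) -> str:
    """Placeholder for a human post-editor's correction pass (hypothetical)."""
    return editor_fix if editor_fix else draft

def run_mt_pe(source: str, editor_fix: str = "") -> TranslationJob:
    job = TranslationJob(source=source)
    job.draft = machine_translate(source)         # machine supplies the speed
    job.final = post_edit(job.draft, editor_fix)  # human supplies the accuracy
    job.post_edited = bool(editor_fix)
    return job

# The Tom Cruise example above: the human editor restores the proper noun.
job = run_mt_pe("Tom Cruise", editor_fix="Tom Cruise")
```

The design point is simply the division of labour: the machine stage is optimized for throughput, while the human stage owns correctness, which is why neither stage alone matches the combined workflow.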