3 Reasons to Use a Human Interpreter

If you follow current news in interpretation technology, you've surely heard about the biggest leap in real-time interpretation in years: Google's Pixel Buds. Unveiled last month, this new pair of wireless headphones can interpret between 40 languages with virtually no lag time. The initial reviews have been very positive, with some claiming that technology like this will revolutionize global communication.

These new Buds are just the most recent development in a rapidly advancing industry. Microsoft recently revealed a real-time Spanish and Chinese voice-over option with its PowerPoint "Presentation Translator" add-on, and two years ago Skype unveiled a live translation feature supporting 50 languages for text and four for audio interpretation. With this technology, two speakers of different languages can now communicate in real time without an intermediary. Does this spell doom for human interpreters?

The short answer: No, it does not. While ever-advancing technology is certainly changing how we communicate, human interpreters still provide things that cannot be substituted (at least, not yet). Here are three reasons why people still can't be beat:

  1. Accessibility
    So far, the real-time interpretation tools on the market cover only a limited number of languages; even the Pixel Buds handle just 40. This limited selection leaves out large swaths of the population and makes the tools useless in certain areas of the world. Additionally, the technology has practical limits: machines require battery power, translation tools may require an internet connection, and equipment can be expensive. This means that, no matter how revolutionary real-time earbuds might be, there are still very real logistical limitations.
  2. The evolving nature of language
    Humans can pick up on ever-changing lingo, whereas machines must have the language programmed in ahead of time. Regional slang, pop-culture terms and even jokes that rely on wordplay all require a touch of human creativity to create and to understand. Even for advanced AI, this kind of adaptation and quick learning is hard to get right, and it can be especially difficult for dialects or rarer languages.
  3. Non-verbal communication
    Considering that the majority of communication is actually non-verbal, this is a huge win for human interpreters. While artificial intelligence is continually improving its facial-recognition capabilities, machines still have a long way to go before they can capture the essence of communication the way a human can. Body language and tone of voice add significance to the spoken information; think of a sarcastic sentence delivered in a monotone, along with an eye roll or a wink. This completely changes the message delivered verbally, and it most likely will not be picked up by an earbud interpretation tool.

Finally, there's a key element that shapes the way humans interact and communicate: trust. Looking someone in the eye and having a face-to-face conversation creates a certain relationship and forges a very human bond that affects how we view the information being exchanged. Where technology might glitch or wires might get crossed, there's still no real substitute for a uniquely human exchange.

If you have a business meeting, conference or other event that could benefit from live interpreters, contact an Account Manager today!