Google Translate switches to Neural Machine Translation framework

Back in September, Google announced a new system called Google Neural Machine Translation (GNMT), an end-to-end learning framework that learns from millions of examples and delivers significant improvements to the quality of Google Translate.

While Google Translate was already quite accurate, thanks to an engine built to translate between any two supported languages, the improved system is more flexible still: it can translate from one language to another even when those two languages were never explicitly paired during training.

It uses a “token” at the beginning of the input sentence to specify the required target language. Besides improving translation quality, this enables “Zero-Shot Translation”: translation between language pairs the system has never seen before.
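
As a rough illustration of how such a token works in practice, the sketch below prepends an artificial target-language marker to each source sentence before it is fed to the translation model. The “<2xx>” token format and the add_target_token helper are illustrative assumptions for this article, not Google's actual pipeline; the point is simply that the model remains a single ordinary sequence-to-sequence network, and only the input preparation changes.

```python
# Illustrative sketch of the target-language token trick (not Google's code).
# The "<2xx>" prefix tells the model which language the output should be in.

def add_target_token(source_sentence: str, target_lang: str) -> str:
    """Prepend an artificial token naming the desired output language."""
    return f"<2{target_lang}> {source_sentence}"

# Ordinary supervised training examples, each tagged with its target language:
training_examples = [
    (add_target_token("Good morning", "ja"), "おはようございます"),  # English -> Japanese
    (add_target_token("おはようございます", "en"), "Good morning"),  # Japanese -> English
    (add_target_token("Hello", "ko"), "안녕하세요"),                  # English -> Korean
    (add_target_token("안녕하세요", "en"), "Hello"),                  # Korean -> English
]

for source, target in training_examples:
    print(f"{source!r}  ->  {target!r}")
```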

For instance, a single multilingual system trained on Japanese-to-English and Korean-to-English examples can also translate directly between Japanese and Korean, even though it was never explicitly taught that language pair.
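
Continuing the illustrative sketch above, the same token is what makes this zero-shot case possible: at inference time nothing stops us from requesting a target language that was never paired with the source language during training.

```python
# Zero-shot request (illustrative): Japanese source, Korean target.
# The system sketched above saw Japanese<->English and Korean<->English
# examples, but never a Japanese->Korean pair; only the prefix token changes.
japanese_sentence = "おはようございます"           # "Good morning"
zero_shot_input = f"<2ko> {japanese_sentence}"     # ask the model for Korean output
print(zero_shot_input)                             # "<2ko> おはようございます"
```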

And the need to translate between multiple languages at the same time pushes the system to share what it learns across those languages, making better use of its modeling capacity.

The Multilingual Google Neural Machine Translation system is already running for all users, and according to Google, this is the first time this type of transfer learning has worked in machine translation.