AI Case Study
Google Translate reduced translation errors by more than 55% in its translation services using Recurrent Neural Networks
Google Translate has implemented a Neural Machine Translation system that improves on the accuracy of older phrase-based translation methods and on the computational efficiency of earlier neural approaches, allowing it to be deployed on the web and in mobile applications.
Internet Services Consumer
The Google Brain research team "overcame the many challenges to make NMT work on very large data sets and built a system that is sufficiently fast and accurate enough to provide better translations for Google’s users and services. The Google Translate mobile and web apps are now using GNMT for 100% of machine translations from Chinese to English—about 18 million translations per day. The production deployment of GNMT was made possible by use of our publicly available machine learning toolkit TensorFlow and our Tensor Processing Units (TPUs), which provide sufficient computational power to deploy these powerful GNMT models while meeting the stringent latency requirements of the Google Translate product." (Google blog)
R&D
According to the Google AI blog: "Ten years ago, we announced the launch of Google Translate, together with the use of Phrase-Based Machine Translation as the key algorithm behind this service. Since then, rapid advances in machine intelligence have improved our speech recognition and image recognition capabilities, but improving machine translation remains a challenging goal. Whereas Phrase-Based Machine Translation (PBMT) breaks an input sentence into words and phrases to be translated largely independently, Neural Machine Translation (NMT) considers the entire input sentence as a unit for translation. The advantage of this approach is that it requires fewer engineering design choices than previous Phrase-Based translation systems. When it first came out, NMT showed equivalent accuracy with existing Phrase-Based translation systems on modest-sized public benchmark data sets. Despite these improvements, NMT wasn't fast or accurate enough to be used in a production system, such as Google Translate."
The Google AI blog states that "GNMT reduces translation errors by more than 55%-85% on several major language pairs measured on sampled sentences from Wikipedia and news websites with the help of bilingual human raters".
RNNs are used "to directly learn the mapping between an input sequence (e.g. a sentence in one language) to an output sequence (that same sentence in another language).... Neural Machine Translation (NMT) considers the entire input sentence as a unit for translation". (Google AI blog)
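To make the encoder-decoder idea in the quote concrete, the following is a minimal, untrained sketch in plain Python: a recurrent encoder folds the whole source sentence into a single context vector, and a recurrent decoder emits output tokens from that vector. All names here (`TinySeq2Seq`, the weight matrices, the vocabulary sizes) are illustrative assumptions, not Google's implementation; production GNMT uses deep LSTM stacks with attention, trained in TensorFlow and served on TPUs.

```python
import math
import random

random.seed(0)

def rand_matrix(rows, cols):
    # Small random weights; a real system would learn these by training.
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]

def matvec(matrix, vec):
    return [sum(w * x for w, x in zip(row, vec)) for row in matrix]

def add(a, b):
    return [x + y for x, y in zip(a, b)]

def tanh(vec):
    return [math.tanh(x) for x in vec]

class TinySeq2Seq:
    """Toy encoder-decoder with vanilla RNN cells (untrained, illustrative only)."""

    def __init__(self, vocab_in, vocab_out, hidden=8):
        self.hidden = hidden
        self.vocab_out = vocab_out
        self.embed_in = rand_matrix(vocab_in, hidden)   # source token embeddings
        self.w_enc = rand_matrix(hidden, hidden)        # encoder recurrence
        self.w_dec = rand_matrix(hidden, hidden)        # decoder recurrence
        self.w_out = rand_matrix(vocab_out, hidden)     # hidden state -> logits

    def encode(self, tokens):
        # Fold the ENTIRE input sentence into one hidden vector, so the
        # whole sequence is treated as a single unit of translation.
        h = [0.0] * self.hidden
        for t in tokens:
            h = tanh(add(self.embed_in[t], matvec(self.w_enc, h)))
        return h

    def decode(self, h, max_len):
        # Unroll the decoder, greedily picking the highest-scoring token.
        out = []
        for _ in range(max_len):
            h = tanh(matvec(self.w_dec, h))
            logits = matvec(self.w_out, h)
            out.append(max(range(self.vocab_out), key=logits.__getitem__))
        return out

model = TinySeq2Seq(vocab_in=10, vocab_out=12)
source = [3, 1, 4, 1, 5]                   # a "sentence" as token ids
context = model.encode(source)             # one vector for the whole sentence
translation = model.decode(context, max_len=4)
```

With random weights the output tokens are meaningless; the point is only the data flow: input sequence in, fixed-size context vector, output sequence out, which is the mapping the quote describes RNNs learning end to end.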
For the research paper, which used English-French and English-German language pairs, 36M sentence pairs were used as the training set for the former and 5M sentence pairs for the latter. Google-internal datasets were used for evaluating other languages (and presumably for the production Google Translate deployment).