Sundar Pichai, CEO of Alphabet, demonstrated a breakthrough conversational AI technology, LaMDA (Language Model for Dialogue Applications). Like other language models, LaMDA is built on the Transformer neural network architecture, but it is trained on dialogue. Since Google Research released the Transformer in 2017, several large-scale language models such as GPT-3, DeBERTa, and RoBERTa have revolutionized the artificial intelligence industry. Today, language models can generate code, summarize articles, and more.
However, Transformer-based models are often limited to specific tasks, or require fine-tuning the pre-trained models on new data before they can perform effectively across a wide range of functions.
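The same pattern shows up with openly available models: a general-purpose pre-trained Transformer typically has to be fine-tuned on task-specific data, such as dialogue, before it handles that task well. The sketch below illustrates the idea using GPT-2 from the Hugging Face transformers library and a tiny made-up dialogue corpus; the model choice, hyperparameters, and data are illustrative assumptions, not anything LaMDA itself uses.

```python
import torch
from torch.utils.data import Dataset, DataLoader
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Toy context/response pairs standing in for a real dialogue corpus.
dialogues = [
    ("What's the weather like today?", "It's sunny with a light breeze."),
    ("Can you recommend a book?", "You might enjoy 'The Pragmatic Programmer'."),
]

class DialogueDataset(Dataset):
    """Packs each (context, response) pair into one causal-LM training example."""
    def __init__(self, pairs):
        self.encodings = [
            tokenizer(context + tokenizer.eos_token + response + tokenizer.eos_token,
                      truncation=True, max_length=128,
                      padding="max_length", return_tensors="pt")
            for context, response in pairs
        ]

    def __len__(self):
        return len(self.encodings)

    def __getitem__(self, idx):
        enc = self.encodings[idx]
        input_ids = enc["input_ids"].squeeze(0)
        attention_mask = enc["attention_mask"].squeeze(0)
        labels = input_ids.clone()
        labels[attention_mask == 0] = -100  # ignore padding in the loss
        return {"input_ids": input_ids, "attention_mask": attention_mask, "labels": labels}

loader = DataLoader(DialogueDataset(dialogues), batch_size=2, shuffle=True)
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

model.train()
for batch in loader:  # a single pass, just to show the fine-tuning loop
    loss = model(**batch).loss  # standard next-token prediction loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```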
To make models topic- and task-agnostic, Google blazed a trail by training LaMDA on dialogue, particularly chatbot conversations. “During its training, it picked up on several of the nuances that distinguish open-ended conversation from other forms of language. One of those nuances is sensibleness. Basically: Does the response to a given conversational context make sense?” notes Google.
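Google evaluates sensibleness with human raters, and LaMDA itself is not publicly available. Purely as a hypothetical illustration of the idea, the sketch below scores a candidate reply by the average log-probability that a pre-trained GPT-2 assigns to it given the conversational context; an off-topic reply should usually score lower than one that fits the conversation. This is a rough automatic proxy, not Google's metric.

```python
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

def response_score(context: str, response: str) -> float:
    """Average log-probability of the response tokens, conditioned on the context."""
    ctx_ids = tokenizer(context + tokenizer.eos_token, return_tensors="pt").input_ids
    resp_ids = tokenizer(response, return_tensors="pt").input_ids
    input_ids = torch.cat([ctx_ids, resp_ids], dim=1)
    with torch.no_grad():
        logits = model(input_ids).logits
    # Logits at position i predict the token at position i + 1,
    # so the response tokens are predicted by this slice of logits.
    resp_logits = logits[:, ctx_ids.size(1) - 1:-1, :]
    log_probs = torch.log_softmax(resp_logits, dim=-1)
    token_scores = log_probs.gather(-1, resp_ids.unsqueeze(-1)).squeeze(-1)
    return token_scores.mean().item()

context = "I just started learning to play the guitar."
print(response_score(context, "That's great, which songs are you practicing?"))
print(response_score(context, "The capital of France is Paris."))
```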
Google showcased LaMDA's performance by asking it a handful of questions. The video below demonstrates LaMDA's capability in open-ended conversation.
LaMDA builds on Google's earlier research, published in 2020, which showed that Transformer-based language models trained on dialogue can improve performance across numerous tasks. Since the research is still at an early stage, Google remains committed to advancing conversational AI technologies.
For now, Google will focus on the sensibleness and satisfyingness of LaMDA's responses in order to improve them further.