How Amazon's Alexa Brain initiative makes the virtual assistant smarter

Amazon has announced an initiative it dubbed Alexa Brain, which is focused on improving Alexa’s ability to track context within and across dialog sessions, and on making it easier to discover and interact with Alexa’s more than 40,000 third-party skills.

This is the first of what Amazon says will be many enhancements scheduled this year to make its virtual assistant more personalized, smarter, and more engaging.

The first of these enhancements is memory: Alexa will be able to remember any information you ask her to store, and retrieve it later. For instance, you can ask Alexa to remember an important date by saying something like, “Alexa, remember that Sean’s birthday is June 20th.” Alexa will then reply, “Okay, I’ll remember that Sean’s birthday is June 20th.”
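Amazon hasn’t published how this feature works internally, but the store-and-recall behavior described above can be sketched with a toy key-value memory. Everything here (the `MemoryStore` class, the regex patterns) is hypothetical and purely illustrative:

```python
import re

class MemoryStore:
    """Toy fact memory for 'remember that X is Y' utterances.
    Hypothetical sketch; not Amazon's implementation."""

    def __init__(self):
        self.facts = {}

    def handle(self, utterance):
        # Store a fact: "remember that Sean's birthday is June 20th"
        m = re.match(r"remember that (.+) is (.+)", utterance, re.IGNORECASE)
        if m:
            subject, value = m.group(1).strip(), m.group(2).strip().rstrip(".")
            self.facts[subject.lower()] = value
            return f"Okay, I'll remember that {subject} is {value}."
        # Recall a fact: "When is Sean's birthday?"
        m = re.match(r"(?:when|what) is (.+)", utterance, re.IGNORECASE)
        if m:
            subject = m.group(1).strip().rstrip("?")
            value = self.facts.get(subject.lower())
            return f"{subject} is {value}." if value else f"I don't know {subject}."
        return "Sorry, I didn't understand."

store = MemoryStore()
print(store.handle("remember that Sean's birthday is June 20th"))
print(store.handle("When is Sean's birthday?"))  # -> Sean's birthday is June 20th.
```

A production system would of course use learned language understanding rather than regular expressions, but the core loop is the same: parse the utterance into a fact, persist it, and resolve later queries against the store.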

It effectively turns Alexa into an information engine, reminiscent of chatbots that were designed to remember anything you told them for later retrieval over SMS or messaging platforms.

Alexa will also be able to have more natural conversations through what the company calls “context carryover”: Alexa will understand follow-up questions and respond appropriately, even when you haven’t addressed her as “Alexa.”

According to Ruhi Sarikaya, the head of the Alexa Brain group, speaking at the World Wide Web Conference in Lyon, France, the feature takes advantage of deep learning models applied to the spoken language understanding pipeline, in order to have conversations that carry customers’ intent and entities within and across domains.
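Carrying intent and entities across turns can be illustrated with a toy dialogue-state tracker. This is a hypothetical, rule-based stand-in for the learned models Sarikaya describes; the intent and slot names are invented for the example:

```python
class ContextTracker:
    """Toy dialogue-state tracker: carries the previous turn's intent and
    entities so elliptical follow-ups can be resolved.
    Hypothetical sketch; Alexa uses deep learning models, not rules."""

    def __init__(self):
        self.state = {"intent": None, "entities": {}}

    def interpret(self, intent, entities):
        # A follow-up with no explicit intent inherits the previous one.
        if intent is None:
            intent = self.state["intent"]
        # Missing slots are filled from the carried-over state.
        merged = {**self.state["entities"], **entities}
        self.state = {"intent": intent, "entities": merged}
        return {"intent": intent, "entities": merged}

tracker = ContextTracker()
# Turn 1: "Alexa, how's the weather in Seattle today?"
tracker.interpret("GetWeather", {"city": "Seattle", "date": "today"})
# Turn 2 (follow-up, no wake word): "What about tomorrow?"
result = tracker.interpret(None, {"date": "tomorrow"})
# result carries the GetWeather intent and the Seattle entity forward,
# updating only the date slot.
```

The hard part in practice is doing this across domains and with noisy speech input, which is where the deep learning models come in.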

Finally, there's a renewed focus on Alexa’s skills, the third-party voice apps that aim to help you do more with Alexa – like checking your credit card account information, playing news radio, ordering an Uber, playing a game, and more.

Because it’s becoming harder to dig through the Alexa Skills Store, users will be able to launch skills using natural phrases, instead of explicit commands like “Alexa, open [skill name]” or “…enable [skill name].”
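Routing a natural phrase to a skill without an explicit skill name can be sketched as a matching problem. The catalog and the keyword-overlap scoring below are invented for illustration; Amazon would use machine-learned ranking rather than anything this simple:

```python
# Hypothetical skill catalog; skill names and keywords are illustrative.
SKILLS = {
    "ride_hail": {"keywords": {"ride", "car", "uber"}},
    "news_radio": {"keywords": {"news", "radio", "headlines"}},
    "trivia_game": {"keywords": {"trivia", "game", "quiz"}},
}

def route(utterance):
    """Pick the skill whose keywords best overlap the utterance's words."""
    words = set(utterance.lower().replace("?", "").split())
    best, best_score = None, 0
    for skill, meta in SKILLS.items():
        score = len(words & meta["keywords"])
        if score > best_score:
            best, best_score = skill, score
    return best

print(route("get me a ride to the airport"))  # -> ride_hail
```

The interesting design question such a router must answer is disambiguation: when several skills score equally well, the assistant has to either ask a clarifying question or fall back on the user's history.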

These capabilities will go live for U.S. users in the coming weeks, with a broader rollout to follow.