Alexa, Tell Me About NBC: Conversational UIs and Drupal
Since the dawn of time, humans have wanted to talk to their computers. The history of conversational interfaces and natural language processing is a story of fits, starts, and stalled attempts. Only within the last few years have true spoken-language interfaces become a practical reality.
But the way that users talk to computers is unlike anything else. Making a spoken request of a computer is neither like human-to-human directed speech nor merely a spoken version of text-based conversational input. Integrating with spoken, conversational UIs like Alexa, Google Home, and Siri isn't simply on the horizon for Drupal-powered content platforms; it's already in front of us.
Inspired by the Alexa codebase developed for SyFy.com, NBC.com came to the team at Four Kitchens to build out the first Amazon Echo “skill” for a broadcast TV network. Web Chef Elliott Foster (along with three other developers on the NBC team) accepted the challenge, producing a full Echo interface for NBC.com in just one week from start to finish. Because the Alexa interface has its own way of processing spoken language, we had to anticipate not just how people search for shows on NBC, but also how they might talk to this weird little black tube that Amazon built. For example, one person might say “Alexa, start NBC”, a straightforward command, while another might instead say “Alexa, tell me about Saturday Night Live”.
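To make that point concrete, here is a minimal, purely illustrative sketch of how several different phrasings can resolve to the same intent-plus-slot structure that a voice platform hands back to your code. The intent names, slot name, and utterance patterns below (LaunchSkill, GetShowInfo, ShowName) are hypothetical and are not taken from the actual NBC skill:

```python
# Illustrative sketch only -- not the NBC/Four Kitchens implementation.
# Intent names, slot name, and patterns here are hypothetical.
import re

# Each intent lists sample utterances; {ShowName} marks a slot that captures
# the spoken show title, loosely mirroring how an Alexa interaction model
# pairs intents with sample phrasings.
INTENT_PATTERNS = {
    "LaunchSkill": ["start NBC", "open NBC"],
    "GetShowInfo": ["tell me about {ShowName}", "what is {ShowName}"],
}

def resolve_intent(utterance: str):
    """Map a spoken utterance to (intent, slots), roughly the shape of data a
    voice platform produces after matching speech against its model."""
    for intent, patterns in INTENT_PATTERNS.items():
        for pattern in patterns:
            # Turn "{SlotName}" placeholders into named capture groups.
            regex = re.sub(r"\{(\w+)\}", r"(?P<\1>.+)", pattern)
            match = re.fullmatch(regex, utterance, flags=re.IGNORECASE)
            if match:
                return intent, match.groupdict()
    return "FallbackIntent", {}

print(resolve_intent("start NBC"))
# -> ('LaunchSkill', {})
print(resolve_intent("tell me about Saturday Night Live"))
# -> ('GetShowInfo', {'ShowName': 'Saturday Night Live'})
```

In a real skill, these mappings live in the Alexa interaction model and the resolved intent is passed to a backend that queries the site's content; the sketch is only meant to show why a skill has to plan for many phrasings of the same request.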
In this session, we will use our NBC/Alexa case study to frame the history of conversational UIs and to suggest where they can go next on Drupal-powered sites.