We wanted to understand how to produce effective interactions for voice-based interfaces and natural language processing, so we decided to build two Alexa skills around content we are passionate about: Star Wars and comics.
Our Star Wars Alexa Skill, Star Archives, provides all kinds of information about the Star Wars universe; users can ask Alexa about galaxies, starships, weapons, and characters. To begin, we created an interaction model defining what users can ask Alexa. The schema covered everything we believed users would be willing to ask out loud and hear spoken back to them. Our Comic Book Guide Alexa Skill is nearly identical in its capabilities, the techniques used, and the challenges we encountered.
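At its core, an interaction model maps named intents (and their slot values) to handler logic that produces a spoken response. A minimal sketch of that dispatch pattern, using hypothetical intent names, slot names, and facts rather than our actual schema, might look like:

```python
# Illustrative sketch of intent dispatch for a lookup-style Alexa skill.
# "CharacterIntent", the "character" slot, and CHARACTER_FACTS are all
# hypothetical names, not our production schema.

CHARACTER_FACTS = {
    "luke skywalker": "Luke Skywalker is a Jedi from Tatooine.",
}

def handle_character_intent(slots):
    """Return a spoken response for a character lookup."""
    name = slots.get("character", "").lower()
    fact = CHARACTER_FACTS.get(name)
    if fact is None:
        return "Sorry, I don't know that character yet."
    return fact

INTENT_HANDLERS = {
    "CharacterIntent": handle_character_intent,
}

def route_request(intent_name, slots):
    """Dispatch an incoming intent to its handler, as a skill backend would."""
    handler = INTENT_HANDLERS.get(intent_name)
    if handler is None:
        return "Sorry, I didn't understand that."
    return handler(slots)
```

The important design point is that every sample utterance in the schema resolves to one of these intents, so the spoken-out-loud phrasing and the code stay in sync.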
We gained insights into how to construct voice interactions, and now Amazon Echo users can ask Alexa about the Star Wars universe (galaxies, starships, weapons, and characters) and the world of comic books, and our Star Archives and Comic Book Guide will provide the answers.
We created both a haptic compass that buzzes according to the heading of an iPhone and a standalone haptic device that detects iBeacons without a smartphone.
We created an Alexa skill that integrates with our beacon- and Slack-enabled Mission Data Office app. Staffers can ask the Amazon Echo to identify the specific location of someone in the office. We also created a skill for Alexa to tell our DC staffers which food trucks are outside the office at any given time.
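The office-locator skill boils down to answering "where is this person?" from the latest beacon sighting. A minimal sketch, with a hypothetical in-memory sightings store standing in for the beacon- and Slack-backed data in the real app:

```python
# Illustrative sketch of the lookup behind "Alexa, where is <name>?".
# BEACON_SIGHTINGS and the staffer names are assumptions for the example;
# the real app reads live beacon and Slack data.

BEACON_SIGHTINGS = {
    "alice": "conference room",
    "bob": "kitchen",
}

def locate_staffer(name):
    """Return a spoken answer for a staffer-location query."""
    location = BEACON_SIGHTINGS.get(name.lower())
    if location is None:
        return f"I haven't seen {name} in the office recently."
    return f"{name} was last seen near the {location}."
```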