We wanted to understand how to produce effective interactions for voice-based interfaces and natural language processing, so we decided to build two Alexa skills around content we are passionate about: Star Wars and comics.
Our Star Wars Alexa Skill, Star Archives, provides all kinds of information about the Star Wars universe; users can ask Alexa about galaxies, starships, weapons, and characters. To begin, we created an interaction model defining what users can ask Alexa. The schema covered everything we believed users would be willing to ask aloud and to hear back as a spoken response. Our Comic Book Guide Alexa Skill is nearly identical in its capabilities, the techniques we used, and the challenges we encountered.
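The core idea of an interaction model is a mapping from sample utterances (with slot placeholders) to intents. A minimal sketch of that mapping in Python, with hypothetical intent names and utterances (the actual Star Archives schema is not shown here):

```python
import re

# Hypothetical intents and sample utterances, for illustration only.
# In a real Alexa skill this lives in the skill's JSON interaction model.
INTERACTION_MODEL = {
    "GetCharacterIntent": [
        "tell me about {character}",
        "who is {character}",
    ],
    "GetStarshipIntent": [
        "what is the {starship}",
    ],
}

def sample_to_pattern(sample):
    """Turn a sample utterance like 'who is {character}' into a regex
    where each {slot} becomes a named capture group."""
    parts = re.split(r"(\{\w+\})", sample)
    out = []
    for part in parts:
        if part.startswith("{") and part.endswith("}"):
            out.append(f"(?P<{part[1:-1]}>.+)")
        else:
            out.append(re.escape(part))
    return "^" + "".join(out) + "$"

def match_intent(utterance, model=INTERACTION_MODEL):
    """Return (intent_name, slot_values) for the first matching sample
    utterance, or (None, {}) if nothing matches."""
    for intent, samples in model.items():
        for sample in samples:
            m = re.match(sample_to_pattern(sample), utterance.lower())
            if m:
                return intent, m.groupdict()
    return None, {}
```

For example, `match_intent("who is darth vader")` resolves to `GetCharacterIntent` with the slot `character` set to `"darth vader"`. Alexa's real natural-language matching is far more forgiving than regexes, but the intent/slot structure is the same.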
We gained insights into how to construct voice interactions, and now Amazon Echo users can ask Alexa about the Star Wars universe (galaxies, starships, weapons, and characters) and the world of comic books; our Star Archives and Comic Book Guide provide the answers.
We created both a haptic compass that buzzes according to the iPhone's heading and a standalone haptic device that detects iBeacons without the use of a smartphone.
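The heart of a haptic compass is mapping the angular error between the device's current heading and a target bearing onto a vibration strength. A minimal sketch of that mapping (the function name and the linear falloff are our illustration, not the device's actual firmware logic):

```python
def buzz_intensity(heading_deg, target_deg):
    """Map the angle between the current heading and the target bearing
    to a vibration intensity in [0, 1]: 1.0 when pointing straight at
    the target, 0.0 when pointing directly away from it.
    Illustrative sketch only; real devices often use pulsed patterns."""
    # Smallest angular difference, folded into [0, 180] degrees.
    diff = abs((target_deg - heading_deg + 180.0) % 360.0 - 180.0)
    return 1.0 - diff / 180.0
```

Folding the difference through `% 360` handles wraparound, so a heading of 350° and a target of 10° correctly read as 20° apart rather than 340°.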
We created a prototype AR app for the iPhone that shows where the nearest Capital Bikeshare station is relative to the user's location and heading. The AR overlay shows each station's name and distance.
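Behind an overlay like this sits plain geodesy: compute each station's distance and bearing from the user, then pick the nearest. A sketch of that logic, with made-up station coordinates (the real app draws on actual Capital Bikeshare station data):

```python
import math

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from point 1 to point 2, in degrees clockwise
    from north; compared against the phone's heading to place the
    overlay marker."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def nearest_station(user_lat, user_lon, stations):
    """Return the (name, lat, lon) tuple closest to the user."""
    return min(stations, key=lambda s: haversine_m(user_lat, user_lon, s[1], s[2]))
```

Each station's distance feeds the label text, and the difference between `bearing_deg(...)` and the phone's compass heading determines where the marker sits on screen.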