Programming Voice Interfaces with Jibo

Glad to hear that Jibo is finally shipping; I can’t wait to get my hands on one! You can read more on that here: https://developers.jibo.com/blog/first-jibo-experiences%E2%80%A6out-of-the-box

Bob and I mention Jibo in our book, “Programming Voice Interfaces: Giving Connected Devices a Voice.” While we only touch on Jibo briefly, the book gives you a high-level understanding of the current voice landscape and how to get started playing in the field.

Here’s an excerpt…

“In addition to Wit.ai and API.AI, you will want to check out IBM Watson and Watson Virtual Agent, as well as tools such as Jasper, PocketSphinx, Houndify, and Festival. You should also check out the latest offerings from Nuance and be on the lookout for startups such as Jibo. Jibo is an interesting offering in that it’s an actual physical robot that moves, blinks, and reacts physically to voice input and output.

While at the time of this writing Jibo isn’t publicly available, there are tools developers can download such as the Jibo SDK, which has Atom IDE integration, as well as a Jibo Simulator (shown in Figure 2-5), which is great for visualizing how your code would affect Jibo and how users can engage with the robot.”
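To make the excerpt a bit more concrete, here is a minimal, self-contained sketch of the listen → interpret → respond loop that a voice skill typically implements. None of these names come from the Jibo SDK; `Intent`, `interpret`, and `respond` are hypothetical stand-ins, and a real skill would delegate recognition to a service such as Wit.ai or API.AI rather than a regular expression.

```typescript
// Hypothetical shape of a parsed result from an ASR/NLU step.
interface Intent {
  name: string;       // e.g. "greet" or "unknown"
  confidence: number; // 0..1 score from the recognizer
}

// Stand-in for calling an NLU service: maps raw text to an intent.
function interpret(utterance: string): Intent {
  if (/\b(hi|hello|hey|morning)\b/i.test(utterance)) {
    return { name: "greet", confidence: 0.9 };
  }
  return { name: "unknown", confidence: 0.0 };
}

// Stand-in for the output step: a physical robot like Jibo would pair the
// spoken reply with movement; here we just print the text to be spoken.
function respond(intent: Intent): void {
  if (intent.name === "greet" && intent.confidence > 0.5) {
    console.log("Speak: Hello there!");
  } else {
    console.log("Speak: Sorry, I didn't catch that.");
  }
}

// One turn of the conversation loop.
respond(interpret("Hey Jibo, good morning"));
```

The point of the sketch is the separation of concerns: recognition, intent matching, and response are independent steps, which is what lets you swap in different NLU back ends or output devices without rewriting the skill.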

For more on the book, go to http://qnovations.com/programming-voice-interfaces

For more information on developing for the Jibo platform, check out https://developers.jibo.com.
