More Voice First With Alexa - Boldstart Technology



More Voice First With Alexa

Two months ago I wrote about experimenting with Alexa and my newfound Python skills. This week I’m pleased to say that experiment has progressed into a fully fledged, published skill for the Badminton Horse Trials (my third so far).

What started as a few hundred lines of Python has now been translated into a production-ready PHP application, thanks to a joint venture with an old colleague of mine, Nik Roberts. With Nik on the application development side and myself running point on the natural-language side, we’ve delivered a ‘voice first’ application for Alexa that answers a wide variety of queries about this well-attended sporting event.

It also handles a complex set of results and competition timetables that would have most statisticians quivering in the corner. Moreover, it handles all of this with the conversational grace of a seasoned virtual assistant, the likes of which I’ve yet to see on Alexa or Google Home.

We had to build all of the conversational elements ourselves, as the Alexa SDK is very barebones. The skill now handles topic switching, say from results to ticket or travel information, whilst maintaining context, and it can sustain back-and-forth dialogue, learning and remembering pieces of information that can be used later in the conversation, or for that matter in a subsequent conversation. I still maintain that Amazon is missing a trick by not giving developers access to actual user utterances, as IBM Watson, Google Home et al. do, but for now we’ll work with what we have.
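As a rough illustration of the kind of context-keeping described above (the actual skill is a PHP application; the intent and attribute names here are my own invented examples), the idea is to stash entities from one turn in the session state so a follow-up turn can omit them:

```python
# Minimal sketch of carrying context between turns via session
# attributes. Intent names, slot names and replies are illustrative
# assumptions, not the production skill's actual code.

def handle_request(intent, slots, session_attributes):
    """Answer a query, remembering entities for follow-up turns."""
    # Use the rider spoken in this turn, or fall back to the one
    # mentioned earlier in the conversation.
    rider = slots.get("SearchRider") or session_attributes.get("last_rider")
    if rider is None:
        return "Which rider are you asking about?", session_attributes

    session_attributes["last_rider"] = rider  # persist for later turns

    if intent == "RiderTimeIntent":
        date = slots.get("Date", "today")
        return f"{rider} is on course on {date}.", session_attributes
    if intent == "TicketInfoIntent":
        # Topic switch: the remembered rider survives even though the
        # user has moved from results to ticket information.
        return "Tickets are available on the gate.", session_attributes
    return "Sorry, I didn't catch that.", session_attributes


# First turn names the rider explicitly...
attrs = {}
answer, attrs = handle_request(
    "RiderTimeIntent",
    {"SearchRider": "Andrew Nicholson", "Date": "2017-05-06"},
    attrs,
)
# ...a follow-up turn can leave the rider out and still be understood.
answer2, attrs = handle_request("RiderTimeIntent", {"Date": "2017-05-07"}, attrs)
```

Persisting the same attributes to a data store between sessions is what lets information learned in one conversation resurface in a subsequent one.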

In the video below you’ll notice that the language used is very ‘conversational’, requiring contextual memory to deliver some of the answers and eliminating the need to be explicit with every request.

The application makes use of a number of APIs to answer user-specific questions, in addition to a variety of relatively static information extracted from a PHP CMS we adapted for this purpose.

Whilst the Alexa SDK is very ‘barebones’, it does provide a number of building blocks for constructing voice applications, most of which are surfaced as ‘slots’ that you use as parameters to build query strings to send to your application. For example, Amazon provides a time-recognition slot that converts expressions like “tomorrow” into tomorrow’s date. You can also create custom slots of your own and pass these as parameters. In the following sample utterance we have a custom slot called SearchRider, which matches against a prebuilt list of the names of riders at Badminton, and we use Alexa’s Date slot to convert “tomorrow” into a date (e.g. 2017-05-06). This allows the application to construct a response with the rider’s time for the appropriate date.

“What time is {SearchRider} on course {Date}”
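When a user speaks that utterance, Alexa resolves the slots and POSTs a JSON request to the skill endpoint; the filled slot values sit under `request.intent.slots`. A small sketch of pulling them out (the intent name and values are illustrative):

```python
# Sketch of extracting slot values from the JSON body Alexa sends to
# a skill endpoint. The intent name "RiderTimeIntent" and the sample
# values are assumptions for illustration.

import json

request_json = json.loads("""{
  "request": {
    "type": "IntentRequest",
    "intent": {
      "name": "RiderTimeIntent",
      "slots": {
        "SearchRider": {"name": "SearchRider", "value": "Andrew Nicholson"},
        "Date": {"name": "Date", "value": "2017-05-06"}
      }
    }
  }
}""")

def slot_value(req, slot_name):
    """Return a slot's resolved value, or None if it wasn't filled."""
    slots = req["request"]["intent"].get("slots", {})
    return slots.get(slot_name, {}).get("value")

rider = slot_value(request_json, "SearchRider")  # "Andrew Nicholson"
date = slot_value(request_json, "Date")          # "2017-05-06"
```

Note that a slot may arrive unfilled if the user’s phrase didn’t match, which is why the lookup falls back to `None` rather than assuming a value is present.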

Like any development against an SDK, you encounter limitations along the way and have to adapt your information architecture or logic to what’s possible. I’ll detail what we’ve found in a separate post, but for now please marvel at the wonderful skill Nik and I have built. If you’d like your own Alexa skill, or a Google Home one for that matter, please get in touch with either of us; we’d be happy to assist.

The Mitsubishi Motors Badminton Horse Trials has started today and finishes on Sunday 7th May, so it’s the perfect opportunity to test out the skill. Please do give it a go, and try some of the conversational elements we’ve hand-crafted, like those in the video above; they really demonstrate the usefulness of voice applications.

More Details: Badminton Horse Trials Alexa Skill
