Christmas is a time for family, so when I was plunged into a scene involving my family and two of my favourite topics right now, conversation UX and bias in machine learning data sets, I couldn’t help but get involved.

Like so many people, I got my parents an Amazon Echo for Christmas last year. I set it up for them before I left, but I was totally shocked to return home this Christmas to find Alexa unplugged and hiding in a corner, gathering dust. Whether it's the Apple Watch or the Oculus Rift, we in tech seem to be very good at making toys that don't address real consumer needs. But I was sure, based on the short time we spent together post-setup last January, that Alexa was doing some good in the homestead, so what went wrong?

Well, let’s start with the fact that my parents are Irish. For those among us who haven’t had the pleasure of hearing the lilting and melodic Cork accent, here is a little sample:

A pretty good representation

Looking through their usage history on the Alexa app, I can see that searches for the weather in Gullane (Scotland) yielded results from Preston (Lancashire), Cobham (Surrey), Golan (Israel) and even the exotic and distant Gallon (India). Needless to say, this probably started to get a bit frustrating.

English's status as a lingua franca is both a blessing and a curse for the conversation designer: almost universal coverage, but a maddening variety of accents and colloquialisms. While not as widely spoken as Chinese or Spanish, English must surely be the language with the greatest proportion of speakers with limited proficiency in the world [insert citation here 😅]. Living in London, you're confronted all the time by migrants whose gumption and 'little bit of English' have taken them a long way. But if my parents, who have lived in the UK for 30 years and have English as their first language, are struggling, who are we shutting out of the conversation entirely in this world of voice assistants? I don't imagine having a tough time with voice assistants is going to be much of a disadvantage for many in 2018, but will that still be true in 2020? Or 2025?

Who are we shutting out of the conversation entirely in this world of voice assistants?

At Filament, we think a lot about bias in the data sets that power machine learning models. And while we are typically able to sit down with a client at the start of a project to make sure the data we are feeding in is representative, consistent and significant enough to avoid the dreaded overfitting, the speech-to-text models that power Google, Apple and Amazon's voice assistants are a bit out of our control. This is something people are talking about and even writing academic papers on, and it's something we as builders and practitioners of applied AI have a responsibility to be cognisant of and to push back against.
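To make "representative" a bit more concrete, here's a minimal sketch of the kind of sanity check you can run before training: tally how each group (accents, in this case) is represented in your corpus and flag anything below a chosen share. The accent labels, corpus and 5% threshold are all invented for illustration.

```python
from collections import Counter

def representation_report(samples, group_key="accent", threshold=0.05):
    """Flag groups that make up less than `threshold` of a data set.

    `samples` is a list of dicts; `group_key` picks the attribute to audit.
    """
    counts = Counter(s[group_key] for s in samples)
    total = sum(counts.values())
    return {
        group: {
            "share": count / total,
            "underrepresented": count / total < threshold,
        }
        for group, count in counts.items()
    }

# Toy speech corpus tagged by speaker accent, skewed towards Received Pronunciation.
corpus = (
    [{"accent": "RP"}] * 80
    + [{"accent": "Cork"}] * 3
    + [{"accent": "Glaswegian"}] * 17
)
report = representation_report(corpus)
# Cork speakers make up only 3% of this corpus, under the 5% threshold,
# so report["Cork"]["underrepresented"] is True.
```

It's a crude check, but even this level of auditing would catch a model about to be trained on almost no Cork voices.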

The reason Scotland will have to abandon multi-storey structures

The next issue was very simple. Last Christmas I created a Spotify family account and set my parents up on it so they could play music. Those who have an Amazon Echo but not Amazon Prime can probably see where this is going: when asking Alexa to play music, they were neglecting the '…on Spotify' suffix. So instead of the deep and magical ocean of Spotify, they were dipping into the paddling pool of Amazon's free music selection. I would get pretty annoyed as well if I wanted the whole of Ed Sheeran's 'Perfect', but was stuck listening to a sample.

Access to Ed Sheeran should be a universal human right

How were they meant to discover their mistake? Well, if you've never mapped an intent or extracted an entity, and you don't have the problem-solving toolkit to google your way to the solution, then it's tough to see where you're going wrong.
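For readers who haven't mapped an intent before, here's a toy illustration of the routing that bit my parents. A real assistant uses trained NLU models rather than a regex, and the intent and service names here are invented, but the shape of the problem is the same: no service entity in the utterance means a silent fallback to the platform default.

```python
import re

def route_play_request(utterance):
    """Map a 'play …' utterance to a toy PlayMusic intent.

    The optional ' on <service>' suffix is the entity my parents kept
    leaving out; without it, we fall back to a platform default.
    """
    match = re.match(
        r"play (?P<track>.+?)(?: on (?P<service>\w+))?$",
        utterance.lower(),
    )
    if not match:
        return None
    # No explicit service entity: the platform quietly picks its own default.
    service = match.group("service") or "amazon music"
    return {
        "intent": "PlayMusic",
        "track": match.group("track"),
        "service": service,
    }

route_play_request("Play Perfect on Spotify")  # service: 'spotify'
route_play_request("Play Perfect")             # service: 'amazon music'
```

The two utterances differ by one phrase, yet land the user in completely different catalogues, and nothing in the spoken interaction surfaces that difference.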

An example of a Facebook persistent menu from an in-production Filament chatbot

AI still needs scaffolding to be successful, and a lot of the Facebook Messenger bots I use (my favourite chat channel to develop for right now) make great use of persistent menus. It's something we build into our Filament bots as well: harnessing the power of machine learning to parse user utterances and extract meaning, while keeping the familiar smell of home that users can safely run back to.
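For the curious, a Messenger persistent menu is just a bit of configuration sent to the platform's Messenger Profile API. Here's a minimal sketch of one; the menu titles, payloads and Spotify link are invented for illustration, not taken from an actual Filament bot.

```python
import json

def build_persistent_menu():
    """Build a Messenger Profile API persistent menu payload.

    Set by POSTing this JSON to /me/messenger_profile with a page
    access token; the button titles and payloads here are made up.
    """
    return {
        "persistent_menu": [
            {
                "locale": "default",
                # Leave the composer enabled so free-text utterances still work.
                "composer_input_disabled": False,
                "call_to_actions": [
                    {"type": "postback", "title": "Play music", "payload": "PLAY_MUSIC"},
                    {"type": "postback", "title": "Help", "payload": "SHOW_HELP"},
                    {
                        "type": "web_url",
                        "title": "Open Spotify",
                        "url": "https://open.spotify.com",
                    },
                ],
            }
        ]
    }

payload = json.dumps(build_persistent_menu())
# To apply it, POST `payload` to
# https://graph.facebook.com/me/messenger_profile?access_token=<PAGE_TOKEN>
```

The nice property is exactly the scaffolding point above: the menu is always there, so a user who doesn't know the magic words still has somewhere safe to click.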

The ‘persistent menu’ I slapped together, printed out and stuck up in the kitchen might be pretty focused on one use case, but if I can come back in 6 months and my mum is able to listen to a whole George Michael album and not just part of a song then I’m going to chalk that up as a massive success.

The persistent menu I created for my parents’ kitchen

Through the red wine haze and a few hands of Uno, these musings coalesced into a kind of New Year’s Resolution. We as conversation designers need to get out of our little bubble to spend time with all of the people who use our products, not just our friends and colleagues. If we’re going to start delivering banking and health services through conversational interfaces, we need to make sure the good work being done on web accessibility is carried over into the chat space, and that we’re not contributing to the widening gap between those with easy access to services and those without.