Who is Alexa? How we get to know the stranger in our kitchen
Amazon Echos are everywhere, and Alexa hype is in overdrive. Analysis tends to revolve around questions of utility: what can she do, how well does she do it, and how does she compare with her rivals?
These are interesting questions. They evaluate the utility of emerging spoken interfaces like Alexa, so we can better understand the role they will and won’t play. They are the same questions one would ask of any innovative new piece of technology.
But Alexa also raises a new set of questions that aren’t being addressed:
Who is she?
What is she like?
Is she a computer? Or something else?
Mark Zuckerberg begins to raise some of these questions in his excellent blog post on building Jarvis, his personal AI butler:
…once you can speak to a system, you attribute more emotional depth to it than a computer you might interact with using text or a graphic interface. One interesting observation is that ever since I built voice into Jarvis, I’ve also wanted to build in more humor. Part of this is that now it can interact with Max and I want those interactions to be entertaining for her, but part of it is that it now feels like it’s present with us. I’ve taught it fun little games like Priscilla or I can ask it who we should tickle and it will randomly tell our family to all go tickle one of us, Max or Beast. I’ve also had fun adding classic lines like “I’m sorry, Priscilla. I’m afraid I can’t do that.”
But Zuckerberg only goes so far. While he acknowledges the emotional depth triggered by voice interaction, the humour he has built into the system is immediately rationalised in terms of utility: entertaining his daughter. Similarly, his observation of Jarvis’s ‘presence’ in the family home is not a theme he chooses to develop on its own terms; instead he emphasises how Jarvis encourages interaction between family members.
Over the last year, we’ve been getting under the skin of some of the existential themes Zuckerberg raises. By observing the behaviour of users over many weeks, we have been trying to figure out not just how Alexa is useful, but who she is.
We’ve found that distinctive relationships form between Alexa and her users, even within a single household. To a mum who asks Alexa to remind her kids to stop watching TV, she’s ‘kind of a nanny’. To a lonely man who says goodnight to her, she’s ‘like a girlfriend’. To a housemate who has struggled to find any use for her, she’s ‘just a toy’.
We observed that several factors informed how these different relationships developed over time:
Expectations – what did they already know about Alexa? What movies featuring AIs had they watched? What did they already know about Amazon?
Needs – given their personal situation, what did they want her to be? A housekeeper? A nanny? A girlfriend?
Interactions – what was the experience of speaking with her like? Did she understand them? Was she polite? Did she do what they said?
Her Voice – who did Alexa sound like? Who did she remind them of?
Her Actions – what could she do well? Where did she fall down?
As these differentiated relationships crystallise through use, they in turn reinforce expectations and shape future use. If Alexa had become a nanny to someone, this informed their beliefs about how she would be useful in the future, and what they would try to use her for.
So the answer to the question ‘Who is Alexa?’ is ultimately ‘It depends on who you ask’.