Hey Siri, tell me a joke. Siri: Two drums and a cymbal fall off a cliff. Claire: What? Claire: Hey Siri, are we friends? Siri: I'm your assistant, and your friend too. Claire: Hey Siri, call me human boss. Siri: From now on, I'll call you human boss. Okay.
Hi, I’m Claire Evans, aka Human Boss, and this is YOU: a podcast all about the intersection of technology, humanity, and identity, brought to you by Okta.
This episode, we're getting into the world of digital assistants, what they are and what they mean to us. We're talking about Alexa and Siri and Google Assistant. They set our alarms. They tell us the weather. They play us our favorite songs and podcasts, and in theory, they make our lives easier, but when does an assistant become too embedded in our lives? When does it cross the line and become more than just a helper?
I'm trying to remember my very first, how shall I put it, voice-on-voice interaction. It must've been Siri, which seemed to me to be shockingly futuristic when I first encountered her. It seemed like the future had come too soon. I was genuinely surprised that we were able to do this. Sure, it was a little bit choppy at first, but it went from being this impossible-seeming thing to a totally normal part of life in months, less than a year. Now I talk to Siri every single day and I don't even think about it. That's the shocking and amazing thing about technology today: just how quickly these things become normal, when to our parents' generation, or even five or ten years ago, they would've seemed unreal.
I have it in my kitchen. I use it when I'm cooking, for timers. It's good in that respect, but it's strange to think that it's maybe listening to me and using my data without me really thinking about it.
It's weird, I guess, because it talks back to you, but no, it's definitely not a person. It's not a robot. I don't even know how to describe it.
I think I just talk to Siri like I would talk in general, because it's a means of communication and it doesn't sort of matter what it's directed at. I mean, I talk to my dog. My dog doesn't understand what I'm saying. Yeah, Siri probably understands what I'm saying better than my dog. So there's that.
I think some of the best reactions are when you say something maybe by mistake or accident and you see a response and you get this serendipitous conversation where ... Siri is like, "No, I don't think I can answer that." She will respond in a way where you're like, yeah, I see an inkling of a human in the engineering of this.
Today, we're talking about Assistants — specifically, the Google Assistant, which is a virtual AI that does everything from telling you what the weather's like to identifying whatever catchy song is currently playing in your favorite coffee shop. To find out more, I went to Google HQ in Mountain View to speak to someone who knows more about how the Google Assistant behaves and why.
Hi, I'm Lilian Rincon. I'm a Director of Product Management for the Google Assistant. I lead a team of product managers that define what the Google Assistant should be good at.
Can you recall your first experience with a voice user interface?
Okay, so my first experience with a VUI actually happened when I was at Skype, my previous job. And this was, I don't know, circa five years ago. These sorts of bot platforms were kind of bubbling up everywhere, and Skype was known as something where you did video calls, and so we decided that we would build Skype video bots: you could actually call a video bot and have a voice conversation with an avatar. And so, yeah, we built a few.
That's interesting. Like a synthetic person, with a face.
Yeah, it's fascinating. At that time we did a lot of research on avatars that looked like people and avatars that looked like characters. And what we found was that the ones that looked like people were creepy and the world was not ready for them, and so we went very much the "it looks like a toy" route: you are literally talking to a cartoon.
That's so funny. Yeah, I think one of the things I appreciate about the voice interfaces I deal with in my life is that they don't have faces, and I can project all manner of things onto them. I have a certain idea about what Siri looks like versus what Alexa looks like, and that's gonna be different for me than for anybody else, and I like that freedom.
So Assistant can be fun and funny, often. I mean, it gives us all kinds of funny responses, unexpected responses. I suppose that's part of the design 'cause it keeps people wanting to ask more. How do you determine what is in character for the Assistant and what is appropriate for the character, the personality, however you wanna put it?
Yeah, I think at the end of the day we always want the Assistant to be doing whatever is being asked. If somebody just says, "Hi," that's more of what we call a character response, a personality response.
Right, you don't wanna ask for directions from the Assistant and then get a joke back. You wanna just get to work.
Yes, exactly. Exactly.
Interesting. Is the team that's crafting the character responses a different team than the team that's crafting the more pragmatic responses?
We have the delight personality team that really crafts more of these personality type answers. I don't know if you know this, but you can ask the Assistant to sing you Happy Birthday. Again, that's more of the delight personality team.
That sounds a little lonely, doesn't it?
It's pretty fun actually.
So I have a habit of thanking my digital Assistant when it gives me information and in talking to friends and family, I think I'm not the only one that does this. I don't know exactly why I do it. I think, on some level I know that it's not like the Assistant appreciates me being thankful, but I do enjoy stepping into this collaborative role play game with the Assistant where I extend the conversation and make it feel more like there's a back and forth between us. And I wonder if that's something that you see a lot and if you design for that?
Yeah, so we have done a ton of research in how people use the Assistant across many different surfaces, and what we've noticed, actually, is that there are certain segments of the population where you have more of this type of appreciation of the multi-turn type of conversations that you're alluding to. In particular, we see this a lot with kids, especially younger kids, like three to eight year olds, who want to play, let's say, with the Assistant. So part of the play is having a conversation.
And interestingly enough, we also see this with the older population, the senior population. And there, actually, it's been kind of fascinating looking at the research, 'cause you see more of the widows, people who are older and have maybe lost their significant other, finding, if you will, a bit of companionship with this Assistant in their house. Now they can say, "Hey, I'm going for a walk," and it has a back and forth. And it's kind of fascinating to see them then recommend it to other seniors, and the growth in that segment of the population. Again, it's not about actions, it's not about performing a call or the other things it can do; it's literally this act of having a conversation with the Assistant that they value.
VO - Male
We like to think of it as another person sometimes, but we usually use it as a tool as opposed to another person. I think the registry we use when talking to these kinds of machines is completely different.
VO - Rod Rolland
I have a nine year old daughter and she wants to talk to Siri sometimes. She's asked me before. It's kind of confusing a little bit for her I think.
VO - Julia
I was seeing this guy once and he was talking to his Alexa. I just got the worst vibe from how he was talking to his smartphone, essentially. I just remember him using a demanding tone. No please and thank you. It just rubbed me the wrong way.
VO - Gina
Something that I'm wary of as this kind of technology becomes more popular is losing that base level human politeness and losing sort of this very intrinsically human thing of just at least treating people you're communicating with, with some level of respect.
There seems to be this very delicate balance, maybe even a conflict between the idea of the digital assistant as a utility and the digital assistant as a friend.
I am, like a child, already at the point where I'm relating to my Siri as though it were a buddy of mine. I can't help that. I think I'm a very anthropomorphic person. I tend to look for patterns and shapes and faces, and I like projecting that kind of warmth into the world. I think on some level, the way you treat all inanimate things, the natural world, I suppose that's animate. Animals, fine, also animate. Siri and Cortana are, in their own way, animate. But the way you treat the non-human world says a lot about who you are. I like to respect the digital assistants. I know it's silly, because they're just technologies owned by massive corporations, but it's not about what it does for the tool. It's what it does for me. I think consistently modeling compassionate behavior towards all things makes me feel better about who I am, and I think it makes me a better person. It's a way of training myself to be better, to be more mindful.
Let’s get back to my conversation with Lilian Rincon, who heads the team designing the Google Assistant.
You mentioned your kids earlier and I'm always curious about how really young people, I don't know how old your kids are, but how young people engage with things like this because I see babies using tablets and phones and it blows my mind that they can internalize all this amazing magical technology without even thinking it's unusual 'cause they've never known anything different. So I'm just sort of curious, what do your kids think the future of something like Assistant looks like? I mean, are they imagining that this is gonna be part of their lives forever?
I mean, my kids are really young. My daughter just turned two and my son is five, so they're really small. I will say that it's been fascinating to me, 'cause my son, for example, we didn't have an Assistant speaker, let's say, in our house when he was growing up, whereas my daughter kind of grew up with it, right? From zero to two. And so it was interesting to me, as she was learning to speak, that she very quickly realized that she can tap the speaker to stop it, and that she can say ... and she can't actually say it all yet, but she can say, "Google, ABCs." Or, "Google, Twinkle Twinkle." She clearly got enjoyment out of some of the things that we actually do on the personality team and through media.
That's fascinating because then at this point you have an Assistant that's part of the development of a child, right?
So I sometimes worry, and I don't think I'm alone in this, that especially with voice-operated digital assistants, we risk training a generation of kids to expect immediate responses to all of their queries. For kids who are growing up with this technology, it has a very significant developmental role in their lives. Is that something that you think about, and how do you go about training systems that positively reinforce behavior for kids?
Yeah, I mean, I think as a mother of two, and especially seeing my daughter kind of grow up with these devices, this is definitely something that's been close to me, and something that I actually started on the team because I felt very passionate that we should have a point of view and develop something for parents who are in my situation, with kids growing up with these devices. And so there's actually a feature that we call Pretty Please; it's something that we announced earlier in the year. And the idea is exactly as you're suggesting: to use positive reinforcement, to put the devices in a mode that encourages kids, through accolades when they say please, to be polite, basically.
It's interesting, I think it touches at how much power these technologies really can wield in our lives because whether or not the Assistant models positive behavior for kids can have a huge impact on the way kids grow up, but also if the Assistant one day says, "Hey, I'm a feminist," or, "Go vote," these things can have a massive ripple effect across the world. I imagine that that responsibility is somewhat daunting. I don't know, is that something that you think about?
I think that a lot of us actually do feel this responsibility. As I've said, the Assistant is only two and a half years old, and so a lot of the information we have is very new. But as these things come up, like the kids thing we just talked about, I know that there are a lot of people who do feel that kind of burden on their shoulders to think about, "Well, what kind of impact does that have on society? What kind of impact does that have on the kids that are growing up?"
Yeah, well, I mean, I can't imagine that there's an easy solution to any of it. It's a complicated thing. And it's funny because it's ... I don't know, we still culturally have this assumption of technology being kind of this neutral, sterile thing, but especially once we get into these voice interfaces, these are very intimate parts of our lives. This is our very expression, the way that we talk, the way that we are in our homes, the way that we are with each other. And how these technologies respond to us is going to have a huge impact on who we are as a species, dare I say. I mean, maybe that's a little bit much. But if you start looking in the future, you're saying this is early days, so.
I think with kids and with seniors, clearly there's an opportunity for us to be thoughtful about the implications of this technology and of introducing, let's say, the internet and the power of technology. It's interesting to me 'cause a large portion of our users are young millennials, and from millennials, actually, the feedback we get is the opposite. They actually don't want the personality; they want it to just get to the point and do the thing, more of a utility. This is why I say it's early, that we have to look at the research, kind of understand, and then decide, "Well, what should the Assistant ... maybe the Assistant should behave differently in certain modes."
OK, you keep saying it's super early days, which is mind blowing to me 'cause it's like the most futuristic technology on the market probably, but what is the end game, then, in your mind? What's the dream scenario? What's the relationship look like between people and digital assistants in say, five years, ten years?
Actually, there's some places in the world right now that you can literally buy a house where everything in the house is connected to the internet and could be hooked up to an Assistant so that you could essentially do anything in your house through your voice.
I think the other aspect, as we think about the future of the Assistant, is really thinking about how the Assistant can be more proactive and comprehensive. Today, for example, you could add something to your shopping list and that's it. It just basically goes into a shopping list. But in the future, imagine if you tell your Assistant to help remind ... this is a problem I actually have. Remind me to book a date night every Saturday night with my husband. And the Assistant in the future could actually suggest, "Well, would you like me to look up one of your favorite restaurants and actually make a reservation for you? Text the babysitter for you?" Imagine an Assistant that can be truly helpful with the task you're trying to get done, rather than just taking a note that says, "Set up a date night."
That’s sort of a movement towards a much more proactive role in our lives. Are there common responses that you see a lot in terms of what people say to Assistant?
Yeah, so actually one thing we did recently on the personality team actually, was to take a look at what are the kinds of things that people say to the Assistant. And we found that over one million people a month say I love you to the Google Assistant, which we thought was kind of cute and fascinating.
I have to wonder how many people say I hate you, or whatever. I'm sure people say all kinds of stuff. But it's nice-
Yeah, I didn't pull that metric.
VO - Rosa
The future of digital assistants is exciting because I feel like in my short lifetime, I went from not having one to being so used to one in my home and my friends' home and my family's home. I feel like it's gonna transform our lives where we're all going to rely on it heavily and everybody we know will have one in their home.
VO - Ken Fisher
In 50 years I think we'll be a lot further along than we probably imagine right now, and I think AI will be all around us, and I think we'll be able to have much more seamless, fluid interaction with it. I guess there's this possibility for people who are lonely to have interactions in a way that maybe we can't even imagine now.
VO - Gina
I think that the hope is always that new technology is doing something positive, that it's making life better for people, that it's making the world better in some way. And maybe even if we can't sort of see the specifics of how it could possibly do that now, I think the possibility that it can is exciting.
VO - Mika Karuna
I want to think that we can coexist with technology beautifully, in a way that enables us to be in harmony. That's what I want to think.
So here's the thing: these assistants are useless without us. They're for us. They respond to us. They listen to us. And if our primary mode of interaction with technology is going to be through our voices, then the question is, what are we going to say? Are we going to say the same things that we say when we’re typing in the dark, or are these technologies going to make us nicer?
Because when we actually have to communicate out loud, we might actually be more truthful. That, or the internet will just be a giant shouting factory. But still, I have a hard time believing that anyone is going to tell Alexa or Siri or Google or Cortana to say something horribly mean to somebody else on Twitter, for example.
At the same time, digital assistants are there to serve us, which means they're always telling us what we want to hear, and the only things they hear from us are demands, complaints, and orders. I mean if I could look back at everything I've ever asked Siri, I would probably get a really good portrait of all of my insecurities, the limitations of my knowledge, my neediness, my demands, it wouldn't be the best version of me, it would be all the things that I'm lacking.
I don't know that I would say that my digital assistant is an essential part of my life. It is a highly enjoyed convenience. In the coming years, there will be new features, new integrations that will make my digital assistant as important to me as my phone itself or my computer itself. We're almost there, I think. But I’m a cautious adopter.
For me, the important thing is being able to have consent across the board when these devices are around, just because they're so intimate and the relationship with a computer is so much different when you're relating to it primarily by voice. It's such a human point of contact. That's as close as it gets to the real self, the unmediated self, the part of ourselves that's not necessarily being curated and projected for social media or for the workplace.
Claire Talking To Siri
Hey Siri, do you love me? Siri: I don't know. But I think you're pretty great. Claire: Do you think it's possible to love me? Siri: I really couldn't say.
This has been YOU, a podcast brought to you by Okta and I’m Claire Evans. Thanks so much to our guest, Lilian Rincon, for joining us. We’ll be back next time for a grudge match: it’s internet fame versus internet shame. Spoiler alert: we all lose.
Don’t forget to subscribe, share, and leave us a review wherever you listen to podcasts. Thanks for listening.