Alexa, Siri, Google: which is the smartest? Dumbest? Most useful? Growing the fastest?
On the one hand … Siri, Alexa, and Google Assistant are amazing technology … on the other, they face-palm on some ridiculously simple tasks. We chat about:
- Which are leading
- Which are smartest
- Google Duplex and assistants that can phone businesses for you
- Alexa and smart home
- Privacy (of course!)
- Why Apple’s challenged to make Siri truly smart
- Smartglasses … how they’ll work with AI assistants
Listen: the future of smart assistants
Don’t forget to subscribe to TechFirst with John Koetsier wherever podcasts are published:
- Apple podcasts
- Google podcasts
- Spotify podcasts
- And multiple other platforms … see them all on Anchor
Watch: the future of smart assistants
Subscribe to my YouTube channel so you’ll get notified when I go live with future guests, or see the videos later.
Read: the future of smart assistants
John Koetsier: How smart is Siri? Alexa? Google? Welcome to TechFirst with John Koetsier. So we’ve had AI assistants for years now. On the one hand, they’re amazing technology. On the other hand, they kind of face-palm on some pretty simple, ridiculously simple tasks even.
So what’s the future of smart assistants? To learn more, we’re joined by Brian Jackson who’s from the Info-Tech Research Group. Brian, welcome! How are you?
Brian Jackson: Hi, John. I’m great. Thanks for having me here.
John Koetsier: Excellent, glad to have you. Let’s start here, Brian, what’s the smartest AI assistant right now?
… it’s Google Assistant, just by a nose probably over Amazon Alexa
Brian Jackson: Yeah, I think it’s Google Assistant, just by a nose probably over Amazon Alexa right now. The difference being that it’s just a little bit more conversational. Now I think a user could use either of those assistants and be happy with it and have a good experience, right? So it’s very close, but in testing and every sort of comparison I’ve seen, Google Assistant gets that edge for that conversational capability.
So you don’t have to know exactly the commands of what you want to say to your assistant. You can just ask a question or give it some sort of vague command about what you want and it’ll try and do something for you.
John Koetsier: Yeah, yeah, and Google obviously has a lot of information. So we're going to go through Alexa. We're going to go through Siri a little bit. We're going to talk about Amazon's recent announcement. We'll talk about where you think smart assistants are going, where AI is going. We'll talk about Google Duplex, a few other things, and maybe even get into some international competitors, because these are mostly U.S.-specific assistants here.
But also a few other things like smart glasses, that sort of thing. Let’s, just on the topic of all of the assistants right now, which do you think is getting smarter faster?
Brian Jackson: Well, according to Amazon, Alexa’s getting smarter faster.
John Koetsier: Haha, what a shock, I can’t believe they believe that.
… at the Alexa Live conference they just hosted, they debuted this deep neural net feature where Alexa's going to get more conversational
Brian Jackson: Yeah, that's what they're leading with. So at the Alexa Live conference they just hosted, they debuted this deep neural net feature where Alexa's going to get more conversational. So I was just talking about how Google Assistant is more conversational and you don't have to remember those specific commands you can issue to it. Well, in developer preview now, this deep neural net feature is going to open up the gates for more conversational experiences with third-party apps.
And to be fair to Apple, they’re not sleeping on this either. Just recently they’ve hired away a couple of big AI names from Google and they’re leaning heavily into improving Siri, and now that they’ve got the HomePod out there, you know that they’re taking voice-first more seriously.
John Koetsier: Yeah, absolutely. Let's dive into Alexa a little bit then. One of the reasons Alexa is interesting to me is that it's perhaps the assistant most associated with the smart home, some IoT stuff. And I guess that was kind of something Amazon had to do. I mean, there are Fire phones, Fire tablets, that sort of thing, but they didn't own a mobile platform that was popular, so they kind of went towards the smart home.
Talk a little bit about where an AI assistant should live and be available from ideally.
… I mean, you could buy an Echo in 2013
Brian Jackson: Yeah, well, you’re right that Alexa is most associated with the smart home and I really think that it speaks to Amazon’s excellent developer strategy with Alexa. Let’s keep in mind that Alexa is the voice assistant that’s been available for years. I mean, you could buy an Echo in 2013 in the U.S., right? So …
John Koetsier: Seven years ago, it’s hard to believe almost.
Brian Jackson: I know, and then it sort of exploded in popularity around 2018 when the Super Bowl commercial came out. But yeah, they've been pushing this for a while. Amazon has been developing that vision of making Alexa a voice-first device that would control your smart home, be a hub connecting all these other devices, and it's cheap to get Alexa integrated into your smart home gear.
So you can pay as little as $4 now for this module that you just put into your dishwasher, or your microwave, or your oven, or whatever it is that you’re making, and suddenly Alexa can talk to it and give it commands.
John Koetsier: 'Alexa, wash the dishes,' now I just need a robot to actually put them in the dishwasher. But you know, it is interesting when you think about where an assistant should live, right?
Brian Jackson: Yeah.
John Koetsier: Because the nice thing about having your assistant on your phone is it’s always with you, it’s in your pocket. You can just say something, there you go, you’ve got that. But that also means it’s tied to me and so if we have smart tech in the home, then how does my wife control it? How do our kids control it? Those sorts of things, right?
So Alexa has kind of nailed that piece but there would seem to be ideally some sort of merging of the minds there.
Brian Jackson: Right, I see what you’re saying. Well, I think a smart assistant should live in the cloud and be available to the user wherever they want to talk to it. So you think about Alexa, Google Assistant, Siri, where do they live? Well, really they reside on some server in California somewhere, right?
… a smart assistant should live in the cloud and be available to the user wherever they want
John Koetsier: Yes.
Brian Jackson: And all of these access points that we have in our homes, whether it’s an Echo speaker or our smartphone, that’s the end point that allows us to relay a message to that server, right?
John Koetsier: Exactly.
Brian Jackson: So you can get that internet connection, that cloud connection, of course, that’s how it works. But you raise this point that Amazon doesn’t have a smartphone platform, right? And that’s definitely been a hindrance to it in trying to collect data, because when you think about the power of Google Assistant, that comes from people using voice search more often, right?
Every time you’re talking to your phone to try and locate the nearest pizza restaurant, when you’re on the go, Google Assistant gets to learn a little bit more about how people talk and what commands will sound like. So that’s a huge advantage in terms of collecting the data they need to feed to their AI algorithms.
John Koetsier: Let's turn our attention towards Apple and Siri, which of course has huge promise, always has. But honestly, in the rankings that have come out when people have asked all the major assistants a thousand questions or so, Siri seems to rank last. And in just general usability, what it understands, what it can do without just kicking off a web search, 'Let me search for that for you' … it seems to lag, and it seems to be growing and getting smarter more slowly than the others. Is that correct?
Brian Jackson: Yeah, I agree with you. I think that’s a fair assessment and it’s sort of interesting when you realize that when Siri came out it was the first smart assistant that we could really talk to on our phones.
And I remember a time when my friends were using Siri just to hear the different jokes that it would tell you.
And you’d think that that would lead to a really rich voice assistant that would get smarter, learn from that data, learn from that interaction, and be able to do more things for us now. But it’s possible that Apple’s focus on user privacy is what’s slowed down its development of voice assistants here. Because Google and Amazon, I think we all know they’re as willing as anything to take all the user data they can get their hands on and teach their algorithms how to know us even better. That’s their game, right?
Amazon has to sell us things. Google has to deliver ads to us. What does Apple have to do? They have to sell us things, right?
They have to sell us their products, not other products, so they want us to buy that hardware and that perhaps makes them come at this from a bit of a different angle. And when they released the HomePod, you saw the focus was not on talking to your voice assistant to get it to do everything and anything. It was about making the music experience as good as possible, getting the sound really engineered, and that was the focus in the marketing, not on the intelligent assistant aspect.
John Koetsier: Yeah, yeah, it is interesting. So Amazon just announced something that was pretty cool. Can you talk about that?
Brian Jackson: Yeah. So at the Alexa Live conference last week, it announced a slew of things, as is Amazon’s way — it’s never just one thing, it’s always like a hundred things with Amazon — but I think what’s most significant here is the announcements for developers, and this could really change the way that we all talk to Alexa.
So, right now if you want to use a third-party skill on Alexa, it’s a bit awkward. You have to say, ‘Alexa, talk to … skill name,’ right? So what skills, do you use any skills on your Alexa, John?
John Koetsier: I actually got rid of all my Amazon Echos, so I still …
Brian Jackson: Oh really, okay.
John Koetsier: … have the Alexa app, but I don’t use it that frequently.
Brian Jackson: Yeah. Well, there you go, so you’re definitely not using any. Let’s say you wanted to talk to Domino’s Pizza, right, a well-used voice application that delivers pizza to people. And you would have to say, ‘Alexa, I want to talk to Domino’s’ or ‘Alexa, ask Domino’s to order a pizza for me.’ So the difficulty there …
John Koetsier: A little game of telephone there.
Brian Jackson: Yeah, right. It’s not very fluid, right?
There's a lot of friction in trying to remember the specific skills you want to talk to, and what Amazon is doing to address that is introducing name-free interactions.
So it's going to be more conversational, where you'll be able to say, 'Alexa, I'm hungry' or 'Alexa, it's lunchtime.' And Alexa will say, 'Oh, well, he probably wants to order a pizza, so how about …' and you can start talking to that, right? Another thing they introduced was skill resumption. The annoying thing about having to name all the skills you want to talk to on Alexa is that you'd have to say that over and over again, every time.
John Koetsier: Yes.
Brian Jackson: ‘Alexa, talk to Domino’s. How close is that pizza?’ Well, now you’ll say, ‘Alexa, how soon will my pizza be here?’ It’ll remember that you ordered that pizza from Domino’s 10 minutes earlier. It’ll give you an update, connect back to that skill.
So it’s getting a bit smarter in terms of opening up the doors to developers and allowing them to access their users in a more fluid and frictionless experience.
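The name-free routing and skill resumption Brian describes can be pictured as a tiny session store: the assistant maps vague intents to skills, and remembers which skill handled the last request so a follow-up question routes back to it. Here's a minimal sketch in Python, where the skill names and the keyword matching are entirely hypothetical stand-ins for Amazon's real intent models:

```python
# Hypothetical sketch of "name-free interactions" plus "skill resumption".
# The skill names and keyword matching are illustrative assumptions,
# not Amazon's actual Alexa Skills Kit API.

class Assistant:
    def __init__(self):
        # Map vague intents to skills, so users needn't name the skill.
        self.intent_to_skill = {
            "hungry": "Dominos",
            "lunchtime": "Dominos",
        }
        self.last_skill = None  # short-term memory for resumption

    def handle(self, utterance):
        text = utterance.lower()
        for keyword, skill in self.intent_to_skill.items():
            if keyword in text:
                self.last_skill = skill
                return f"Routing to {skill}: want to place an order?"
        # No new intent matched: resume the previous skill if one exists.
        if self.last_skill:
            return f"Resuming {self.last_skill}: checking on your order ..."
        return "Sorry, I'm not sure which skill can help with that."

assistant = Assistant()
print(assistant.handle("Alexa, I'm hungry"))                # routes to Dominos
print(assistant.handle("How soon will my pizza be here?"))  # resumes Dominos
```

The real system does this with deep neural nets over far richer context; the sketch just isolates the two pieces of state involved, an intent-to-skill mapping and a memory of the last active skill.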
John Koetsier: Yeah. It's almost getting a short-term memory in addition to its long-term memory, right? Remembering state, remembering context, and being more like a human: you were just talking to me about X, this is probably about X, and there we go. That is a great thing, and also, just remembering the names of all those different skills can be really challenging, so that is good.
Let’s talk about the future a little bit. Where do you see a smart assistant in 5 to 10 years?
We saw some interesting technology, probably almost a year ago now, when Google released Google Duplex, right? And that was pretty cool. It wasn't built into the Google Assistant, but it was sort of accessible from there, and it would just call a restaurant for you, book a reservation, those sorts of things. And I'm wondering, do you see that as part of the future of an assistant, where I can just tell Siri, Google, or Alexa, 'Hey, book a trip for me. I want a hotel in Chicago, somewhere near the Miracle Mile. I want a flight that's no longer than two hops,' you know, that sort of thing: give it some details and let it go off and build your itinerary?
Brian Jackson: Yeah. I think that it’s possible that we could see voice assistants start to take on some of those types of administrative tasks that really we don’t like doing. They take up a lot of our time.
John Koetsier: Yes.
Brian Jackson: We tend to forget to do them because you say you'll do something during a meeting, or you get an action item on a phone call, you don't write it down, and it's lost, right? You realize two weeks later that you never did it, and then you have to go back and do it all over again. Well, Duplex has been rolling out in the U.S. only. Being in Canada, I haven't been able to have that experience myself yet, but it's available in 48 states now, and it's available in New Zealand too.
But Google is being pretty tight-lipped about adoption, and they're working through some privacy laws around when phone calls can be recorded in the U.S. I think they're trying to figure all that out, but the suspicion is that they'll expand the service soon and we'll start to have access to it in the U.K., Canada, and Australia, and it'll be interesting to see how people respond to that.
Will restaurants like the fact that suddenly Google Assistant is calling them to place orders? Will that work as seamlessly as Google says that it will?
John Koetsier: Yeah.
Brian Jackson: I mean, the demo seemed great when they put it on stage in 2018, right? But how many real-life experiences will be just as good as that? We'll see if it's really as frictionless as they say it will be.
John Koetsier: Yeah. I mean, what I liked about it, there was obviously a massive pushback, right? There was a privacy pushback, there was a like, ‘You should let me know if I’m talking to a machine’ type of pushback. People felt creeped out by it, and I understand that and that’s a valid response. I didn’t personally feel that way.
I personally thought that Google Duplex was amazing and is amazing, and I would love to be able to use that capability. And frankly, if I was working in a restaurant and doing bookings, it wouldn’t really matter to me if a computer was calling me or if somebody was calling me …
Brian Jackson: As long as you’re getting the business who cares, right?
John Koetsier: Exactly. Realistically, APIs are going to take care of most of this stuff anyways, right? So if Google Assistant has an API into something that can book a table at a restaurant or order a pizza, they’ll do it that way and it’s machine to machine.
Brian Jackson: Right.
John Koetsier: But in situations where there’s a human that you need to talk to, I think that’d be an amazing thing.
Brian Jackson: Yeah, I could see that happening. And I could see that kind of capability, figuring out when a user wants to take care of an administrative task, being integrated with other experiences we have. So we're all doing a lot of video conferencing right now.
John Koetsier: You think? Hahaha.
Brian Jackson: And I think what we will see … like we are in this podcast, exactly. So, you know, I think what will happen is we’ll have machine learning listening to our meeting conversations and when we say, ‘Hey, when’s the next time we should meet?’ oh, ‘2:00 PM, two weeks from now.’ Then usually you would have to go into your calendar, book that, email the person.
Well, there’s no reason that a smart assistant couldn’t listen to that, do the natural language processing, figure out what you’re talking about and what you want to happen, and create the calendar invitation for you, and email it out to everybody, right?
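The flow Brian sketches here, listening for a proposed meeting time and turning it into a calendar event, could be prototyped with even crude pattern matching. A minimal Python sketch, in which a regular expression stands in for real natural language understanding and the calendar API is assumed away entirely:

```python
import re

def extract_meeting(transcript_line):
    """Pull a proposed meeting time out of one line of a meeting transcript.

    A real assistant would use full natural language processing and a
    calendar API; this regex-based version is purely illustrative.
    """
    match = re.search(r"(\d{1,2}:\d{2}\s*[AP]M),?\s*(.*)", transcript_line,
                      re.IGNORECASE)
    if not match:
        return None  # no time expression found in this line
    return {
        "time": match.group(1),                        # e.g. '2:00 PM'
        "when": match.group(2).strip().rstrip("?.!"),  # e.g. 'two weeks from now'
    }

# From the example in the conversation above:
event = extract_meeting("How about 2:00 PM, two weeks from now?")
print(event)  # {'time': '2:00 PM', 'when': 'two weeks from now'}
```

The hard part in practice is everything the sketch skips: resolving a relative phrase like "two weeks from now" to an actual date, knowing who the attendees are, and sending the invitation.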
John Koetsier: That is interesting, and that also gets into the topic of data and privacy and other things like that. And one of the other topics that I wanted to talk to you about, which is there are some movements, OpenAI and other things like that, which are aiming to give you the capabilities that an AI assistant has, but one that you control, one that you own.
And we’ve seen that, maybe that’s science fiction still, but you know, where you can control and own what data goes, what happens is not a third party, whether it’s Apple, whether it’s Google, whether it’s Amazon or anybody, it’s your AI and it works for you.
Do you see something like that happening in the future?
Brian Jackson: Yeah. I think that people are becoming very aware of their privacy. And we learned recently, I think it was in 2019, that all of these different smart assistants are recording us and then sending those recordings to be listened to by people.
John Koetsier: Yes.
Brian Jackson: Now, this isn't every recording. To be clear, it's the odd recording that's sent to be reviewed by humans, for quality review purposes, which is the reason all of these companies gave. Amazon, Apple, and Google were all found to be practicing this, and that really got to the heart of a lot of people's fears about these smart assistants: that they're spying on you, listening to you when you don't want to be listened to. And the fact is, people were listening to certain commands you gave your smart assistant without you realizing it, and apparently they heard some pretty sensitive material.
John Koetsier: Mm-hmm.
Brian Jackson: So that was a big problem and we can’t have things like that happening.
And it shows that these big tech giants perhaps aren't taking our privacy seriously enough, right?
That’s why we see these sorts of movements where we say, ‘Well, why should the tech giants get to own all of the AI? Why do they get control over the algorithms?’ They get to hire up all the talent. What if we had an open source type of system where we could all see what’s happening with those algorithms, some transparency into how they work so it’s not a black box, and then I would feel a lot more comfortable about my privacy because I can understand exactly what’s happening to my data.
John Koetsier: Yeah, yeah. One thing that’s near and dear to my heart in some way is the evolution of smart glasses. And so I’ve written about it multiple times and follow it pretty closely. I’ve owned a pair of smart glasses, several pairs actually …
Brian Jackson: Which ones?
John Koetsier: … which are not very smart by the way. From Snap and from Vuzix, and …
Brian Jackson: Yeah.
John Koetsier: … not that capable, not that smart, sort of the way Steve Jobs talked about smartphones before the iPhone was launched. But I look forward to that and there’s increasing momentum around that, and you know that all the big tech companies are working on that.
How do you see the integration of AI assistants and smart glasses?
Brian Jackson: Yeah, it makes total sense. I haven’t really seen a smart glasses model that’s broken through in the consumer sense yet.
John Koetsier: No, no, no, and it will be some years.
Brian Jackson: Yeah, but in the enterprise space they're being used today. Companies like General Electric have them out there, and there's a real use case for field workers who are busy with their hands, repairing things, working with their hands, but who need some sort of computer interface to look up information relevant to what they're doing, or to be able to talk to somebody as they're trying to complete a task. There is a real need for that. So there's this whole ecosystem popping up around visual interfaces.
Google Glass still exists, right? Everybody says that it’s dead, but it’s not dead.
John Koetsier: No.
Brian Jackson: It’s an enterprise product now and it’s out there, just like Vuzix is, and they’re targeting the enterprise too. But if you’re a geek like us, you can go and buy one and start using it. But I think that it could get better, right, these things always continue to improve. And I know that Apple has filed some patents in this space recently.
So maybe, you know, Apple has this way of watching a sort of nascent tech space and then releasing that killer product that consumers suddenly love, like they did with the iPod, right?
So we could see that happen with smart glasses, it’s possible. And if they did that, it would just be a perfect integration for Siri, right?
John Koetsier: Yes.
Brian Jackson: Because you could have smart glasses and you don’t really want to be touching them that much and you can’t have that many controls on them, so to be able to talk to them and give commands to Siri, to be able to control different interfaces on it, or relay commands elsewhere, that would be useful.
John Koetsier: It would be. One of the challenges for smart assistants is what is also its best thing, the voice interface, right? Which is, you know, I can quickly tell my lights to turn on. I can quickly arm my security system, other things like that, open my garage doors.
But also, when I’m in public, it feels odd to speak to my device. It feels odd, and that’s something that I’m not sure we’ll get a … maybe social norms will change and we’ll be more accustomed to people shouting at random times at their technology.
Brian Jackson: Yeah, that’s a tough one because if you want smart glasses to succeed, they’re going to have to look cool, right? You want to make them look as close to a pair of sunglasses as possible, and you don’t want to make people feel like they’re being spied on by these people that are wearing smart glasses around.
I mean, I remember when Google Glass had its consumer pilot out there, and people had them out in the wild, this term “glassholes” was invented to describe people wearing …
John Koetsier: Yes, yes, I know the original glasshole, Robert Scoble, yes haha.
Brian Jackson: Right. And so that shows there's sort of this creepy feeling people have when they're around technology they feel is surveilling them. And frankly, I can see the point. We don't walk around with the record function on our smartphones on all the time, right? So that's sort of a step up in encroaching on user privacy.
So that’s going to be a tough one. Maybe these manufacturers could create some sort of light on the device that shows when it’s recording something and people could see when it’s on or off. They’re going to have to think through that, and normalization will be tough.
John Koetsier: It will be tough and social norms will change of course, but slowly. Other interesting things are definitely going to happen and there’s going to be challenges as they come out. Brian, I want to thank you for taking this time.
Brian Jackson: Thanks, John. It was a lot of fun.
John Koetsier: Excellent. For everybody else, thank you for joining us on TechFirst. My name, of course, is John Koetsier. I appreciate you being along for the ride. Hey, whatever platform you're watching on, like, subscribe, share. And if you're listening to the podcast, please rate it and review it, that'd be a massive help.
Thank you, until next time … this is John Koetsier with TechFirst.