Is artificial intelligence out of control? Is technology and AI already beyond our ability to manage and monitor?
In this TechFirst, we chat with David B. Auerbach. He’s a software engineer who has worked for Google and Microsoft and recently wrote a book. The title is a bit on the scary side … MEGANETS: How Digital Forces Beyond Our Control Commandeer Our Daily Lives and Inner Realities. It’s available on Amazon and other stores.
He says technology is already way out of control … just like weather.
“Rather than treating these technological systems as based on algorithmic recipes that we can just debug, it’s better to think of them like the weather. The weather is a classically chaotic system that we cope with, but nobody thinks that we can actually control …”
Check out our conversation:
(subscribe to my YouTube channel)
Or of course you could listen to the audio version in the TechFirst podcast …
Is artificial intelligence out of control?
Note, this is an AI-generated transcript. It probably contains some errors.
John Koetsier:
Are we losing control of our technology? Welcome to TechFirst. My name is John Koetsier. Our guest today is a writer, technologist, software engineer, and recent author who has previously worked at Google and Microsoft. His name is David B. Auerbach. He's the author of Meganets: How Digital Forces Beyond Our Control Commandeer Our Daily Lives and Inner Realities. That's a mouthful. Welcome, David.
David Auerbach:
Thanks, thanks for having me.
John Koetsier:
Hey, super happy to have you. Let's start with the question the title raises: are we out of control with our technology?
David Auerbach:
I think we are, to a much greater degree than we expect, or than our experience has led us to believe. You see this in the constant scandals and mishaps that occur across the board in highly networked systems, whether it's the cryptocurrency crash, the unpleasantness of social media, or the recent AI fiascos that we've seen.
My thesis is that there's actually a common root to all of these things. It's not individual technologies running out of control; there is something deeper going on. What we are seeing is a broader loss of control stemming from hundreds of millions of people interacting with huge networked systems constantly, in a way that's faster than any of us can keep track of.
John Koetsier:
So I want to dive into lots of different parts of that, including AI systems that are black boxes, where we don't understand how they work, how they come up with an answer, or why they fail in some cases and work in others. But maybe let's situate ourselves historically first. How is this qualitatively different than in the past? Let's say a pre-digital past where everything is kind of always out of control, right? No nations and companies and people are really [in control], and somehow this teeming mass all kind of works or fails, right?
How is it more out of control now?
David Auerbach:
Well, we can say that something new has been added, and it is out of control.
And in fact, yeah, the analogy I make is that rather than treating these technological systems as based on algorithmic recipes that we can just debug, it's better to think of them like the weather. The weather is a classically chaotic system that we cope with, but nobody thinks that we can actually control. But in some ways, so is the global economy: a laissez-faire economy entails ceding a certain amount of control, and, you know, self-organization magically seems to happen.
And that’s true of these systems as well, but that doesn’t necessarily mean just as with capitalism, that doesn’t actually, doesn’t guarantee you that the self-organizing principles are necessarily going to be beneficial. Some of them may be, but some of them aren’t. So I think what we’re looking at is we have in a way somewhat similar to other realms that we already are accustomed to a lack of control in, but we have yet to realize it and we have yet to understand exactly the ways in which we don’t have control over it, because we are still trying to play whack-a-mole and fix things one bit by bit.
So that’s a very good question, and it’s, again, I think we used to have more control over computer technology. Eventually, it was a bit of a pain in the butt, but the possibility still exists to track these bugs down and squash them one by one. And Windows did indeed get better over time. It did get more stable. If you compare Windows 3.1 to Windows 95 to, I guess, NT and XP were, I think, one sort of actually approached something stable.
There was that progression. With networked technologies, you aren't seeing a progression toward stability. And that's the difference, we can say, even within the digital realm.
John Koetsier:
In fact, are you seeing a progression towards instability?
David Auerbach:
I think we’re seeing a progression towards greater complexity, and that does bring a greater degree of instability with it. Now, that’s not to say that these systems may not stabilize, but they may not stabilize in ways that we particularly like. If we want to see a stable version of a toxic online media ecosphere, nobody wants that stability.
So I do think that even in something like the traditional stock market, there's a certain amount of instability baked in, because you have the latitude for sudden viral floods of action in one direction or the other. You're seeing this when people buy up GameStop and send it soaring, and there's no explicit collusion or anything. It's just that people are able to coordinate on an exchange in a networked, feedback-driven way that wasn't possible before.
John Koetsier:
That is a fascinating example. I'm in Wall Street Bets, the subreddit that drove that.
David Auerbach:
You were at the heart of it! Who is the GameStop guy? I forget his name …
I write about him in the book, actually. So while you're talking, I'm going to find this, because it's an amazing story, yeah. Not that you're part of some conspiracy to artificially inflate stock prices, or maybe you can correct me that you are, I don't know.
John Koetsier:
Or maybe it is, but it’s a different kind of conspiracy. It’s just an agreement out in the open. Anybody can see it. Hey, let’s do this.
David Auerbach:
I would say that actually makes it a non-conspiracy.
John Koetsier:
Yeah, I guess so.
David Auerbach:
One of the people who endorsed the book called me an anti-conspiracy theorist, and I think that's it: there's nothing too nefarious going on, or at least there's much less nefarious stuff going on than we'd like to think. What's actually happening is there's just too much chaos. And this is what the SEC ran up against. People aren't actually doing anything illegal. But you can see that they're unhappy about it. It's like, we need to stop this participation.
John Koetsier:
How dare the little guys win. We have a system here. We have rules. We have big investment banks. They’re supposed to win!
David Auerbach:
Very much so. Yeah, I think the attitude is, they're messing with the system. Even if you don't win, you're messing with it. You know, it's like, stay out of it, stay off our turf.
John Koetsier:
Wall Street has a long history of messing with the system in order to make money off of things that are not socially beneficial to anybody except the people who take the tens of millions of dollars of bonuses.
David Auerbach:
Indeed. But they are doing God's work, as some people have said. Keith Gill, that was his name.
Keith Gill, also known as Roaring Kitty on YouTube … I write about him in the book, because he testified before Congress as this aw-shucks guy and just said, well, I just like the stock. And it didn't seem like a performance to me. He's just a guy, like an insurance adjuster or whatever, doing his thing. And what's crazy is, you know, it's not like he's going to be the center of things in the future.
It’s just that these phenomena can now pop up, become viral, and then fade back. And the next one, you don’t even know where it’s going to be. So by the time you close the barn door on one, the next one’s already happened. And that’s the sort of lack of control that I’m talking about.
John Koetsier:
From 15 minutes of fame to 15 seconds of fame, perhaps, in the internet age, who knows. So that's perhaps one example. Perhaps Silicon Valley Bank failing, the biggest bank failure in a decade at least, is another example; I'm not entirely certain. Let's get into some specifics of what you see as impacts of us losing control of technology. What are some other scenarios that have happened?
David Auerbach:
Well, the cryptocurrency one is a big one. To some extent, a lack of control is baked into it, but some of the phenomena we're seeing are certainly not what people intended. In some ways it does serve as an analog for these systems more generally, in that they are consensus-based mechanisms, at least unless someone controls a major amount of the currency or processing power. But I don't think that cryptocurrency is so far off from how these systems work more generally.
And you’re getting to this point where you could see a crash, something like in 2008, except you’re not, you know, a bailout isn’t really going to be possible because the money is just, well, I don’t know, maybe it’ll be socialism today because you’ll just have to give money to all sorts of random players. I don’t know. But, you know, it actually poses bigger problems than what we saw with Lehman in 2008. And that wasn’t exactly popular with everybody either.
John Koetsier:
No.
David Auerbach:
But now, last year, obviously, there were a number of crypto crashes and scandals. You had the play-to-earn cryptocurrencies in the Philippines, where store owners during COVID were just sitting around playing these games and hopefully cashing out, because it's completely crashed now. You saw this complete crash of lots of currencies and NFTs. You saw the particular collapse of those so-called stablecoins, Terra and Luna, and then even more so the FTX, Sam Bankman-Fried blow-up, which was especially interesting to old-school elite economists.
John Koetsier:
Yes.
David Auerbach:
So that should tell you what direction we're moving in. There's too much money there for banks and investors to ignore, so the crypto economy is being integrated into the larger economy, and it's going to bring that level of destabilization to things more broadly.
John Koetsier:
So…
David Auerbach:
I can give you another example too, but that’s another big one.
John Koetsier:
Let’s go to the other one because I’m wondering about crypto if that isn’t just a, first of all, hype cycle and a bubble, a tulip bulb craze plus a ton of grift and a ton of just financial malfeasance and cheating and everything like that. But I see that if you integrate that into the ordinary economy and you get some of that chaos there as well, that can be a challenge. What’s the second example you want to bring out?
David Auerbach:
The second one was actually the recent AIs and how they are talking about releasing nuclear codes and how they love you and how they’re having nervous breakdowns.
John Koetsier:
Or trying to get you to divorce your wife.
David Auerbach:
Yeah, that’s right. And in that case, what I wanna talk, it’s because of how they act, it’s tempting to think of them as somewhat autonomous discrete entities.
They aren’t, they don’t mean any of this stuff. They don’t even understand what they’re saying. They’re very convincing, but they don’t mean it. They don’t understand any of it. actually doing is reflecting the collective dreams and nightmares of what’s fed into them.
And so that is why it's similar to these other systems: it is actually the uncoordinated, and sometimes coordinated, actions of many, many people that determine what these AIs put out, not the programmers, and not some brain inside the AI, because that doesn't exist. What these AIs tell us is a reflection of us collectively. And why does it spin out such scary fantasies? Because if you ask it for scary AI stuff, it's going to draw on what it's been fed.
John Koetsier:
Yes.
David Auerbach:
And that, in turn, will cause people to write articles about scary AI fantasies, which will get fed into the next iteration of these chatbots.
John Koetsier:
Yes.
David Auerbach:
And that is a similar feedback loop to what you see with these other things, where you get these feedback cycles of, okay, everybody's getting in on GameStop, everybody is suddenly talking about Elon Musk. It is a similar feedback-driven effect. It's just that with AI it's not as obvious, because it appears to be generating this out of whole cloth, but it's not.
John Koetsier:
What interests me about large language models like ChatGPT and others is, obviously, they're inherently interesting, and what they can do, what they appear to be able to do, is frankly in some cases amazing. In some cases game-changing. I've seen, for instance, the CEO of a drone delivery startup actually ask ChatGPT to create code which was then used in some of the systems. No engineer, right? I mean, insane stuff that you could not have imagined maybe 20 years ago actually happening today. What interests me is that it's getting easier to train these large language models. And we can't assume that Google, Microsoft, maybe Apple, and others, OpenAI obviously, are the only ones who have them.
We see Yandex is working on them, others.
There’s so much malfeasance on the internet, ad fraud and other things like that, very high level, high tech criminality. What is going to happen when some criminal organization trains one of these and actually gives it arms and legs in the real world, so to speak, of the internet, gives it free reign.
David Auerbach:
I don’t think there’s any stopping that. I mean, if I run a cult and I want to set up a chatbot that is fed on, that’s trained on particular things that may draw people to my cult, I can do that. And technology …I don’t see any way you’re going to keep this out of hands of just anybody, not just like states or other states or shady corporations, I think. Anybody’s going to be able to train these things.
John Koetsier:
Your cult example is interesting, because we've seen how emotionally engaged people get with AI systems. There's an example, I forget the company name, but they created an AI companion. And they recently took away its ability to talk dirty. I don't know exactly the details, but basically to get sexual, essentially.
David Auerbach:
Right.
John Koetsier:
That caused huge angst among a subset of their customers who are …
David Auerbach:
Yeah, they’re traumatized. Yeah, they feel abandoned. Replika, right?
John Koetsier:
Yes, I think you’re right.
David Auerbach:
Yeah, they even named themselves Replika. I mean, you'd think that would scare people off, but apparently not.
John Koetsier:
Yeah.
David Auerbach:
I mean, people got attached to their Tamagotchis, those little virtual pets, and those things were incredibly primitive. So from my perspective, there's absolutely no limit to how attached humans can get. And there's a demand for it. I mean, I think people will want that, to feel the way it makes them feel, to such an extent that no one's going to ban it …
…and it’s going to change, because as I said, they do not feel in the way that we feel. But if it makes people feel good anyway, well, maybe we don’t need the human emotions we feel. Maybe our concept of love is [out of date.]
Did you ever play Farmville back when that was a big thing?
John Koetsier:
My kids did, yes.
David Auerbach:
Yeah, those things were great at sucking money out of people, because you didn't want to see your crops wither. Well, that's a pretty primitive way of pulling on people's heartstrings. With image generation and body generation, you're going to see fairly convincing virtual people in a fairly short time frame, not just in text but in voice and appearance.
John Koetsier:
Imagine that in the hands of the types of people who scam lonely individuals out of thousands, hundreds of thousands, millions of dollars, whether that’s elderly people or whether that’s people who are looking for love in all the wrong places and will get scammed. Imagine that technology, being able to do that at scale with a unique face, a unique body.
David Auerbach:
Like Glengarry Glen Ross, except with robots, you know.
John Koetsier:
I can also see, in terms of being out of control, that some of the technological systems we're building are too complex for any one individual to understand and comprehend, and in many cases for teams of individuals as well. And then we're actually setting those up, and they're colliding in real time on the internet and engaging with each other. Who knows what might come of that. What are some safeguards? What should we be doing about this?
David Auerbach:
Well, as I said, trying to fix things in a targeted way, you’re always too late. By the time you’re aware of it, it’s already gone.
So what I advocate in my book, Meganets, is softer, non-targeted fixes. Not to censor, but to do things like rate-limit and enforce heterogeneity. Because what you're also seeing from recommendation engines, these incredibly complex algorithms that are evolving constantly and aren't guaranteed to do the same thing a second later, because they keep getting new inputs from all the people using them, is that they tend to cluster like with like.
They’re giving people what they want.
Well, sometimes giving people what they want a little less may actually be good. There are three qualities that I talk about: volume, the sheer size; velocity, the sheer speed; and virality, by which I mean the sort of feedback loop where things build on themselves. And you're not going to be able to stamp out the bad instances one by one; even Facebook admits they can't. What did they do in the run-up to the 2020 election? They banned all political advertising. That's not a company that [has its data under control.]
This is something that has been acknowledged, right? And that's the sort of less targeted mechanism I mean. They also said, oh, you can't forward a link to more than five people. Well, that will certainly do it. Now, I don't know if those two particular mechanisms are good ones, but I think that's the sort of direction we should be looking in: limits on how much things can be forwarded. We can try to get more disparate content mixed together. I think TikTok is doing something like that; they're trying not to recommend such tight matches, because teen girls were getting too many pro-anorexia videos, things like that.
So that’s the direction I’d like to see us go in. And I think there’s a lot of room for experimentation there. key to work. But what I can’t say is that what people are complaining about and asking for now isn’t going to happen and it hasn’t been working because I don’t think most people would say that things are getting better. So we need to change the question we ask, which is not how to eliminate the bad stuff, but how can we make it so that the potential impact of it is slower and more moderate.
John Koetsier:
Two things come to mind as you're talking. One: there's a small town in California that has effectively banned people from coming, because it has amazing hills and fields of wildflowers, and it got popular online. They were literally inundated with hundreds of thousands of visitors last summer. Tourism is great, dollars are great, but it was totally destroying the town, and destroying the very thing people came to see, with visitors walking out to be surrounded by the wildflowers. So they severely clamped down on how many people could come, by limiting parking and all this stuff, right?
That’s one thing, that’s sort of a chaotic outcome of algorithms that are feeding off human behavior.
The other is, and I forget which country this was, but there was literally a genocide in the past few years that was driven by WhatsApp or Messenger and Facebook. I believe it was in Africa or Asia; I'll find out later and put it on screen. I don't know if it's ringing a bell for you, but it was literally something forwarded along, chain after chain, via WhatsApp, I believe. And so Facebook has tried to do some things to limit that as well.
But it’s amazing how the algorithms that we’ve created to find the best have really limited the range of experience … you search for best restaurants here and you go there you have a horrible experience because it’s crazy busy and everybody is there. It’s like we’re afraid to you know what let’s just try something. See something, go somewhere.
David Auerbach:
I mean, it’s the, it’s that that’s the thing is that is that are the systems I mean, what are algorithms going to do by definition they’re going to try to to make matches. So unless you’re actually explicitly putting things in to actually degrade them. And I guess what we can talk about is an algorithmic degradation. Yeah, you’re going to see it try to say like, okay, here’s more of the same, here’s more of what I think you want, because what, you know, these they can do, even in AI, what’s it going to do? It’s going to give you what you want. And I think what we’re seeing is that giving people what they want as our capacity to do so increases the problem of giving people what they want becomes more apparent in many, many contexts.
John Koetsier:
I don’t even know if it’s what people want, but it’s what we’re kind of hardwired for. We’re kind of hardwired for newness. We’re kind of hardwired for controversy. We’re kind of hardwired for, you know, something to get passionate about.
And that can be exploited in certain social systems, search systems, recommendation engines, and stuff like that. And when I look forward a few years, I'm not confident that we're going to find ways to apply the brakes that you're suggesting we put on these systems.
What’s the end state here? What’s the end game? What do you think will come crashing down?
David Auerbach:
Crashing down, or just changing? Because crashing down … I think you will see more instability, economically in particular, because you're going to see more things like the GameStop, Wall Street Bets episode, except on a bigger and weirder level.
I think cryptocurrency will have a greater impact there as well, so you're going to see more things happening, with or without the outright dishonesty of something like FTX. The other thing you're going to see is, again, the creation of narrative bunkers, in which people who share the same opinions, the same way of thinking about things, are looped together, and instead of fighting with other groups, they just aren't even aware of them.
So you are going to get an increasing number of small microcultures that are going to have a hard time even understanding each other.
And if you look online at the amount of jargon that every subgroup has, and the performative role that it plays: we identify you because, oh, you're using the right words. Sometimes these words are obnoxious to other groups, but other times it's just, I don't know how to talk like this; I literally don't understand what these people are saying.
And even in talking about my book to a wide variety of people, I've found people have very different takes on technology. Some people see it in terms that are almost incomprehensible to me, or in ways so different from what other interviewers say to me that it's as though we're talking about different issues. That's the lack of fundamental agreement, even on the framing of these issues.
This is something that is fairly new. We used to have a fairly unified, top-down media environment that would set narrative parameters. There was still some room for argument, but everybody was speaking the same language. That's disappearing now.
John Koetsier:
That does not bode well for peace and living together. We’ve seen an increase in violence, not just in language and in what people do online, but also in the real world as we live in greater and greater different worlds, essentially, and those clash and we can’t seem to find a way to get along. That is worrisome.
David Auerbach:
Well, if there’s any consolation in this, I do think that it will produce less clashing and simply incomprehension. People will simply ignore it. There are huge, I’m also amazed that I can talk about a particular meme or a particular phenomena. And there are a lot of people who’ve never even heard of it. There’s just too much out there.
So there's no longer a set of canonical, essential, or central signifiers to the culture. It's the atomization of culture into a lot of narrative bunkers and microcultures. And the effects of that may ultimately yield less conflict, and more simply a sense of the world being incomprehensible as a whole, of it being impossible to get a detached enough perspective to actually understand what everybody's doing.
John Koetsier:
Which is a political tactic in and of itself, because if everything is incomprehensible, if everything is too complicated, then we just throw our hands up and say whatever, and then powers that be can do what they will and do what they wish. But also the challenge I think is that it’s not just in the digital realm that we have this atomization of culture, eventually it hits the ground. Eventually it’s in the real world.
David Auerbach:
In that regard, and this is an example that I cite in the book as well, it's not that the powers that be have lost control entirely; it's that power has devolved, and a little bit of it has fallen on each of us. We can't do much with it, because we're only feeding into these massive machines. But the powers that be can't set narratives the way they used to. Something like Brexit was a lot less a matter of the powers that be than some people may have thought. It's not that the attitudes weren't there, but these things tend to run out of control. If the powers that be were running it, they would achieve their objectives through slightly less messy and slightly more effective means.
So I think you're looking at the fact that the lunatics, and everybody's a lunatic, are running the asylum, and that it is harder for the powers that be to actually take advantage of this. In the New York mayoral race, there was actually, I think, less positioning, and effectively the least unacceptable candidate won. Because there was so much chaos and so little ability to do micro-targeting, since we had moved to ranked-choice voting rather than just voting for your favorite candidate, polling became impossible. So the candidates themselves were at a loss and just sort of said their spiel. I don't know if you want to take heart from that …
John Koetsier:
I don’t.
David Auerbach:
Well, the powers that be are losing control as well.
John Koetsier:
You know, nobody wins in chaos, or maybe the strongest survive, but it's a nasty, brutish, and short world.
There are real challenges there. You know what we never did … we never defined a meganet. So I'm going to read your definition here.
A meganet is a persistent, evolving, and opaque data network that controls, or at least heavily influences, how we see the world. It contains both algorithm- and AI-driven servers, as well as the millions upon millions of users that are frequently or always active on those servers. The result is an ongoing feedback loop that causes meganets to change rapidly and unpredictably in proportion to the three qualities of meganet content: massive volume, high velocity, explosive virality. That means quick change, unpredictable change, sometimes good things, sometimes awful things, increasingly out of control.
David Auerbach:
Yeah, that’s per the book, and I hope that that speaks to, that sums up the things we’ve been talking about. That is my view of it. And that was, you know, when I came to understand it and that way I felt that a lot more things made sense.
I have, yeah, I’ve got, you know, I at least have a framework so many other people are feeling this loss of control. And that understanding is at least the first step to taking what action can be taken.
John Koetsier:
Yeah, it’s interesting as well because we talked about it impacting the real world. It, it, it, it’s making changes in the real world at a speed that you would never, never, ever see. Uh, just one example in the past six months, uh, an obscure government report comes out that gas stoves, uh, may cause cancer in young children, may be unhealthy. Boom. A month later, you have a law on the books in Florida. Tax-free.
David Auerbach:
Yeah, exactly, yeah. And, you know, there are people who definitely try to take advantage of these things, but even they are limited to the extent in which they can push one thing or another.
That one hit big; other things did not. There have definitely been other things that people have tried to push, like the, what was it called, the one where supposedly no one could leave their town. Everything would be local.
John Koetsier:
Yes, it’s the 15 minute town concept. I think it’s from Europe somewhere or something like that, which is basically everything you need is close. So you don’t have to travel super far for work, for groceries, for entertainment, for blah blah. It was a really good thing in a sense. And of course, some conspiracy theorists said, hey, well, they’re making a new law. You can’t leave your town.
David Auerbach:
Right, exactly. So that got spun up into a conspiracy. That one didn't seem to hit as much as the gas stoves, though.
John Koetsier:
It was too stupid.
David Auerbach:
But it wasn’t for lack of trying, though.
And that’s the thing is that even if they don’t blow up to that extent, they still have their adherents . And I still see people mentioning the 15 minute town. And that’s what’s going to happen is that you’re seeing all these more subtle changes that are affecting people. of people who hang on to that 15 minute town concept. There’s no sort of ecological and pruning and natural selection of ideas. People can hang on to them and reinforce them.
And that contributes to the atomization as well.
John Koetsier:
Exactly. And here’s the problem as you have that continued, just entirely different sense of what the world is, what reality is, what the baseline principles and foundations of our lives are. As that totally shifts, you get people who believe that something that somebody thinks is innocuous or not a big deal is an existential threat to their lives. but places with guns. And the US has a lot of those. And there you get problems. We have to bring this to a close. I really do appreciate this conversation. I don’t know if we understand more, but maybe we understand a little bit more about power.
David Auerbach:
I hope so. Understanding is good. Some understanding.
John Koetsier:
Exactly.
David Auerbach:
Even the limits of our understanding are good to know as well.
John Koetsier:
We’ll have to design a super AI to understand it all for us. Thank you so much for your time, David.
David Auerbach:
Thanks for having me, John. It was great.
TechFirst is about smart matter … drones, AI, robots, and other cutting-edge tech
Made it all the way down here? Wow!
The TechFirst with John Koetsier podcast is about tech that is changing the world, including wearable tech, and innovators who are shaping the future. Guests include former Apple CEO John Sculley, the head of Facebook gaming, Amazon's head of robotics, GitHub's CTO, Twitter's chief information security officer, scientists inventing smart contact lenses, startup entrepreneurs, Google executives, former Microsoft CTO Nathan Myhrvold, and much, much more.
Subscribe to my YouTube channel, and connect on your podcast platform of choice: