Where does Boston Dynamics fit in the emerging golden age of robots?
In a lot of ways we’re entering a golden age of robots. We’re seeing prototype humanoid robots from Apptronik, Amazon, Sanctuary AI, Figure.ai, Tesla, Fourier Intelligence, and probably more. And of course Boston Dynamics is the OG of humanoid bipedal robots with Atlas.
But there are a lot of form factors out there. And none of them is perfect for everything.
(Subscribe to my YouTube channel)
In this episode of TechFirst, I dive into the current state and future of robotics with experts from Boston Dynamics: Alex Perkins, Mike Murphy, and Marco da Silva.
We talk about various form factors, including the bipedal Atlas, the quadruped Spot, and the wheeled Stretch, highlighting their unique capabilities and applications. We also chat about the complexities of robotic design, from gripping mechanisms to AI in training and sensing, and speculate about the future of humanoid robots and their potential societal impacts.
Get the TechFirst podcast
Transcript: Boston Dynamics and the golden age of robotics
Note: this is AI-generated, and so not perfect.
Mike Murphy: We’re at a stage where you see a lot of excitement and a lot of people, uh, for lack of a better phrase, throwing mud at the wall to see what sticks
John Koetsier: Where does Boston Dynamics fit in the emerging golden age of robotics? Hello and welcome to TechFirst. My name is John Koetsier. In a lot of ways, this is the golden age of robotics: we're passing from an era in which most robots were static (a platform, an arm, a complex but stationary machine) to an era of mobile,
active, working robots. They might be bipedal, they might be quadruped, they might be wheeled or otherwise mobility-enabled. There are other form factors, but the absolute OG of mobile and humanoid robotics is Boston Dynamics, of course, in an age of humanoid robots from the likes of Apptronik, Amazon, Sanctuary AI, Figure.ai, Tesla, Fourier Intelligence, and more.
Boston Dynamics is productizing dog-like robots with Spot, it's attacking the warehouse and logistics market with the wheeled robot Stretch, and it continues to develop humanoid robots with its renewed Atlas platform. Today we have, wow, we have a bonanza. It's pretty cool. We're chatting with three pretty cool people at Boston Dynamics.
One is Alex Perkins. He's a senior technical director and chief roboticist for Stretch. Another is Mike Murphy. He's senior technical director and chief systems engineer. And we also have Marco da Silva, who's VP and GM for Spot. Welcome, all of you. Thank you so much for having us.
Mike Murphy: We’re excited to be here.
John Koetsier: This is very cool. I'm excited to have you guys. I'm going to start off with a real general question. I mentioned some names in robotics; there's so much investment going on here. Are we in a golden age of robotics?
Marco da Silva: I've been thinking a little bit about that. I feel like we're right on the cusp of the golden age of robotics.
I still think the kinds of robots that we dream of here at Boston Dynamics are on the verge of having the impact that we're dreaming of. We're just starting to see that adoption, and that's really primed to take off right about now. But there's also a lot of hype going on.
So we have to live up to it.
John Koetsier: So we're in that hype cycle. It's gonna be amazing, you know, as if we've invented the three laws of robotics already, and when will a robot servant come into my home and clean and wash and do everything? So we're not quite there. Any other thoughts from the other guys?
Mike Murphy: I agree with Marco. We're at a stage where you see a lot of excitement and a lot of people, for lack of a better phrase, throwing mud at the wall to see what sticks.
It reminds me of the early days of the automobile or the airplane, where you had all sorts of crazy different designs and ideas. Eventually, I think you really hit the golden age when some of those really start to stick
and actually really start to transform society. We're at the excitement stage, but we're at the pre-transformation stage.
John Koetsier: Wow. Cool. Let's pull on that thread, actually. You mentioned we're developing all kinds of cool, crazy things: throwing stuff at the wall, seeing what will stick, different form factors, all that stuff.
Let's talk about form factors. You obviously have made a name for yourselves with a bipedal form factor, a humanoid robot in the Atlas platform doing amazing, cool things that are in continuing development. We'll chat about that. But you've also got a quadruped form factor, which is pretty successful in nature too, by the way.
There are a lot of quadrupeds in nature; we think of humans on two legs and everything. But you also have a wheeled robot. What is the right form factor? Is there a right form factor?
Alex Perkins: The right form factor is whatever's the right form factor for the application you're really targeting.
We tend to gravitate towards forms we know: human form, animal form. There's great utility and flexibility in those forms, and as you mentioned, they're very successful on Earth; humans can do a lot of stuff. But there are some areas where, if you know what the application is, you can design a robot for that.
It could be something very specific, like a dishwashing machine, which you can consider to be a very narrowly targeted robot. Then there are general box-moving robots in warehouses, which is the shape that Stretch is taking. And then there's the very general, could-do-anything humanoid form.
And I think they're all appropriate for the right use case.
Mike Murphy: And to expand on that, a little-known piece of history: Stretch actually evolved out of a humanoid form.
The first box moving we ever did was with Atlas, and as we started to understand that market more, we transitioned away from this walking robot. Warehouses are mostly flat concrete: you don't need feet,
let's use wheels. So we moved on to Handle, still two legs, but wheels. And then, hey, do we really need all this dynamic balance? And we get to Stretch, where we have static balance. So the evolution of form to suit the application is really one of the cool parts of this cusp of the golden age of robotics, as it were.
John Koetsier: That's a lot of the art and the genius there, right? Fitting the machine to the task. And you're right about warehouses: they're flat concrete, very smooth in most cases. And they're also designed for stuff like forklifts, narrow ones in some cases, but forklifts to get in and out and all that stuff, right?
So if you have a wheeled robot, the savings are immense, right? In terms of energy, you can take a lot more energy with you because you can carry a heavier battery pack, and you have a lot more stability because you can have a wider form, and all that stuff. So it makes perfect sense. How do you do the calculus, though, of building a general form factor, let's say humanoid, versus a specific form factor?
How do you do the calculus of what is right for each task?
Marco da Silva: That's a hard question to answer. And in general, I think no one has really figured out a proper way to make those trade-offs.
If you look at something like a humanoid robot, it's easy for people and investors to imagine that it can do anything a person can do.
But then, when you get into a specific application and understand the market better, you start to see that there are some trade-offs you can make to make the machine more reliable, to make it more performant, to make it cheaper.
And that's kind of the journey we're on even with Spot, where we're seeing opportunities to specialize. The legs are important; a lot of our industrial spaces were designed for people to get around and do jobs that only people can do.
So having the legs is important, but integrating the proper sensing that you need to do the jobs that people are doing on these sites is something we've had to learn as we've deployed Spot, and figure out how to make it more reliable, more purpose-built for that application.
John Koetsier: It's a super interesting question. I just recently interviewed the CEO of Dexory, a UK startup in robotics (I haven't released the interview yet; it'll probably come out today). They have what they bill as the tallest autonomous robot on the planet. It's designed for a warehouse, for a logistics facility, and it goes up 15 meters high to discover: what stock do we have, what do we say we have, what do we actually have, what condition is it in, all that stuff.
And for that purpose, you need that kind of form factor. You need the ability to see up high and get different angles and everything like that.
Let's talk about Spot a little bit. Quadruped form factor, very stable, can go across different terrain, right? It can probably achieve faster speeds than the current bipedal mobility form factors we have available today, and it works outdoors quite well.
Talk about that form factor and why it is what it is.
Marco da Silva: Yeah, I got a healthy appreciation of the advantage of quadrupeds. Several years ago we were doing a VR video at Marc Raibert's house, here in Massachusetts.
And the idea was to show Spot and Atlas walking together in the woods. And Spot was basically just doing laps around Atlas. I was operating Atlas at the time, and it was pretty frustrating trying to keep up.
Our robot was walking on the path and it was doing an admirable job, but Spot was just scampering around the woods, climbing up hills, just doing laps.
So you get a real appreciation for how the simplicity of that quadruped form factor is really advantageous for mobility. And so if your chief goal is to get around, and get around quickly, the quadruped has a lot of advantages there.
John Koetsier: I gain an appreciation for the quadruped form factor every time I go hiking.
There's a mountain grind near where I live that, before I injured my knee (which is totally getting better), I used to do quite regularly, and the dogs would just go nuts, right? They're going up this steep hill and they've got it, they've got a handle on it. It works. It's perfect. Two legs: a little more challenging.
Mike Murphy: I think another thing worth noting about a quadruped form factor when it comes to robots is that you have three actuators times four. You're talking about a robot, so you're talking about something that has to be produced and manufactured. Cost matters, simplicity matters.
There's a huge advantage for a quadruped over a biped, where you might have 15 actuators times two versus three actuators times four.
Hey, one of 'em is a heck of a lot easier to produce and make actually impactful in the world.
John Koetsier: That's huge. That's absolutely huge. I wanna bring Stretch into it as well. Stretch is, of course, the wheeled manufacturing and warehouse robot that you have.
Because we see all these demos from humanoid robotics platforms, and don't get me wrong, I love it and I'm excited about them, and yeah, that's the dream, the Asimov robot; I absolutely wanna see that. But we see these demos, and teams have invested so much time, so much money, so much training in getting this bipedal locomotion down, right? Making that happen, making that work.
And then we see it and it's walking like a geriatric 85-year-old, somebody who can't go very quickly, right?
And then you think about, okay, this one is going in a warehouse. Like Tesla said they've got two of their Optimus units working in the warehouse right now, or in the factory right now.
Now I'm on two legs and I want to grab something. Let's say it's a 50-pound box. For a human that's fit, that's not that big a deal, it's a decent weight. But now for a robot, all of a sudden you have this off-axis weight, and all of a sudden you've got to balance for that, and there are all kinds of challenges there.
And you're moving like this, right? I mean, there's a real challenge. Talk about Stretch and speed.
Alex Perkins: Yeah, maybe I'll take this one, and I'll speak out of both sides of my mouth a little bit, because we have a history of a slower, older Atlas robot where we were doing early manipulation with boxes.
So we experienced that case rate, for example. But we also have an active program where we're overcoming some of those speed challenges. We won't speak too much to that, but I won't discount it: there's a lot we can do and there's a lot we're doing. But going back to our history here:
we did exactly what you just described. We got excited about humanoids moving boxes. Our customers loved to see that in videos; they asked, you know, can we have these in our warehouse? So we started doing some experiments internally in our labs, and you notice the kinds of limitations.
You mentioned weight. You can make robots pretty strong. For a human form factor, actually, the reach compared to industrial robot arms is pretty challenging, especially when it comes to needing both hands and then needing reach; you're limited by your farther arm. There are some constraints there that just start to compound into performance limitations.
We then moved to this wheels-plus-legs robot we call Handle. You may have seen it; it looks like an ostrich. We were doing some applications in the warehouse with that machine and it was doing a lot better: really good straight-line speed, good efficiency. But like you mentioned, you can't carry that much battery with a system that's carrying its weight like that.
And it had to balance all the time, and it could only look and manipulate where it was pointing, so there's a lot of turning around. We then started to explore the current form factor for the Stretch robot, which is the size of a pallet footprint, 1.2 by 1 meters, with an integrated base with batteries and safety systems, and an arm on top with the perception system.
And that thing, when you compare it against the ostrich machine, was five times faster immediately, as soon as we produced the first prototype, and we were like, oh yeah, this is the thing for this application. And we've been moving cases much more quickly ever since.
John Koetsier: First prototype was five times faster?
Yeah. Wow.
Alex Perkins: Yeah. I mean, you mentioned before the turning slowly and the spinning and maintaining balance. You don't need to do that. If your arm can pivot and you have the reach and you have the strength, you can just cycle very quickly.
John Koetsier: Now, one of the things about manufacturing a humanoid type of robotic platform is that you can assume it can kind of do it all, if you train it, right?
Most jobs that we have are not just "move this thing from here to here," right? There are more complexities involved. And so having a general form factor enables you, if you can train for it and if it works with the constraints of your equipment, to do those multiple things. If you build something like Stretch, then I assume (and this is a question) you've gotta put a lot of thought into how this works with humans.
Where are the handoffs, and how does the integration work? Talk about that a little bit.
Alex Perkins: Yeah, it's a great insight. In the warehouse and manufacturing space, there are robot arms called cobots that are intended to be near people, and you have a little more interaction.
Stretch, and machines like it that move significant amounts of material very quickly, really aren't suitable to be working in the exact same space as the people around them.
So we're in the same environment, but we shouldn't be sharing the same space, really collaborating on activities. There's more of a separation and handoff of the work stream.
But you can't always assume that's gonna be the case, because people are needed in all sorts of applications, and they have their own jobs in the warehouse, and they're gonna cross paths now and again; the robot might need help because shrink wrap is blowing in the wind. There are gonna be some interactions, and we spend a lot of our time designing systems that allow that kind of interplay in a safe capacity, where the robot can do what it needs to do when it knows it's in free space, but it can always detect that someone's coming in, do the right actions like slow down or stop, allow that interaction to occur, and then resume when it makes sense to do so.
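To make that detect, slow down, stop, and resume interplay concrete, here's a minimal sketch of that kind of safety logic in Python. The thresholds, resume delay, and person-distance input are hypothetical illustrations, not Boston Dynamics' actual safety system.

```python
from enum import Enum, auto

class Mode(Enum):
    FULL_SPEED = auto()   # robot believes it is in free space
    SLOW = auto()         # person detected nearby: reduce speed
    STOPPED = auto()      # person inside the keep-out zone: halt motion

# Hypothetical thresholds, in meters and seconds
SLOW_RADIUS = 5.0
STOP_RADIUS = 2.0
RESUME_CLEAR_TIME_S = 3.0  # how long the area must stay clear before resuming

def next_mode(mode: Mode, nearest_person_m: float, clear_for_s: float) -> Mode:
    """Pick the speed mode from the distance to the nearest detected person.

    nearest_person_m: distance to the closest detected person (inf if none).
    clear_for_s: how long the workspace has been clear of people.
    """
    if nearest_person_m <= STOP_RADIUS:
        return Mode.STOPPED
    if nearest_person_m <= SLOW_RADIUS:
        return Mode.SLOW
    # Only resume full speed after the space has stayed clear for a while.
    if clear_for_s >= RESUME_CLEAR_TIME_S:
        return Mode.FULL_SPEED
    return Mode.FULL_SPEED if mode == Mode.FULL_SPEED else Mode.SLOW

# Example: someone walks within 4 m, steps closer, then leaves.
mode = Mode.FULL_SPEED
mode = next_mode(mode, 4.0, 0.0)            # -> Mode.SLOW
mode = next_mode(mode, 1.5, 0.0)            # -> Mode.STOPPED
mode = next_mode(mode, float("inf"), 4.0)   # -> Mode.FULL_SPEED
```

The key property is the asymmetry: the robot slows or stops the instant a person is detected, but only resumes full speed after the space has stayed clear for a while.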
John Koetsier: Cool. That makes sense. While we're on that topic, I want to talk about gripping, because I did see that Stretch looks like it has a suction gripper. I wanna talk about why you chose to do that, and what the trade-offs are here and there.
It's funny, I was chatting with Sanctuary's CEO Geordie Rose, and he told me more than half the complexity of a robot is in the hands,
if you're talking about a humanoid robot, right? So many digits, articulation, motors, connectors, all that stuff that's required. You chose a simpler methodology. Why?
Alex Perkins: Well, in some ways we did, in some ways we didn't. The gripper for Stretch, we call it the smart gripper, and it actually has individual control of every cup.
It senses and can control the flow into each of the cups. So if there's a damaged box, we can turn off flow through the cups that don't have good purchase, but maintain and maximize the suction on the cups that do. And we can also do things like multi-pick, where you can grab multiple boxes at once and then treat them individually as you place them.
And this is great. It also means the gripper actually has the most degrees of freedom of the whole system, so it's kind of similar to a human hand, right? But you did ask about physical purchase versus suction, vacuum suction.
It turns out we've tried a variety of things, especially for moving boxes, and for enough boxes in the world, you can move them very successfully with suction.
If you get a good grasp on a box, you can hold hundreds of pounds, and we found effective ways to do that even in a mobile system, which is great. It does mean that Stretch can't currently pick up every object that you'll find in any warehouse, but there are enough, let's say, cardboard brown boxes in the world that there's plenty for Stretch to do now, and we can improve on it as we go forward into the future.
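As a rough illustration of that per-cup logic (shutting off flow to cups that aren't sealing, keeping suction on the ones that are), here's a simplified Python sketch. The cup count, vacuum threshold, and interface are made up for illustration and are not Stretch's actual control code.

```python
# Simplified sketch of per-cup suction management: keep vacuum on cups that
# have sealed against the box, shut off flow through cups that are leaking.

SEAL_THRESHOLD_KPA = 40.0  # hypothetical vacuum level indicating a good seal
MIN_SEALED_CUPS = 3        # hypothetical minimum before the grasp is trusted

def update_cups(vacuum_kpa: list[float]) -> list[bool]:
    """Return a valve state (True = keep flow on) for each suction cup.

    vacuum_kpa: measured vacuum at each cup; low readings mean the cup is
    leaking, for example over a tear or past the edge of the box.
    """
    return [reading >= SEAL_THRESHOLD_KPA for reading in vacuum_kpa]

def grasp_is_secure(valve_states: list[bool]) -> bool:
    """Trust the grasp only if enough cups report a good seal."""
    return sum(valve_states) >= MIN_SEALED_CUPS

# Example: a damaged corner leaves two cups leaking; the rest hold the box.
readings = [72.0, 68.0, 12.0, 70.0, 8.0, 65.0]
valves = update_cups(readings)   # [True, True, False, True, False, True]
print(grasp_is_secure(valves))   # True: four sealed cups carry the load
```

The same per-cup state is what makes multi-pick possible in principle: each box is held by its own subset of cups, so they can be released independently at place time.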
Mike Murphy: Cool. This is one of those places where I'd point out, you said people make a humanoid robot and they assume it can do what a human can do with enough training, et cetera.
When I pick up a big box, I'm actually using a lot of sensing very far away from my hand to figure out: where is that box touching me?
How am I holding it? That makes bipedal or two-armed control of an object very complicated. I would argue that suction in the warehouse environment is actually a superhuman capability that machines can have and people don't, just like wheels, I would argue, are a superhuman capability on smooth concrete over legs.
John Koetsier: Yeah, that's a super interesting insight. It also brings to mind that when we as humans go and lift something, we make a thousand tiny decisions about what we think it is, how heavy we think it is, where we think we can lift from, where we can apply pressure, where it wouldn't be safe to apply pressure, all that stuff.
If you are using appendages, whether they're hands or something like that, you have a lot of decisions to make. If you're going to use suction, you know you're going for the top and you're getting pretty much full top coverage, so you're grasping a lot at once. Correct?
Alex Perkins: Yeah. And there's a challenge in the situation you described with using physical appendages here. Oftentimes boxes are stacked very tightly together so that you can fit the most of them in a pallet or in a truck. And in order to get your hands around the object, you have to find those creases and move in, shift things.
Humans do it very naturally; it's quite a difficult problem for manipulating robots at this point. If you have an exposed face, whether it's the top or the front, and you can get suction against that face, your job's mostly done.
John Koetsier: I like that. It's kind of interesting. I did work in a warehouse when I was in university, and I had to move stuff, ship stuff, build stuff, all that stuff.
And it is not always easy, right? You come to a wall of boxes: what do you do? Okay, lift the top one a little bit, grab a corner underneath, and then pull. You don't have full purchase, and you shift it as it comes farther out. It is a complex process. Talk about Spot. I don't think there's any gripping on Spot.
Am I wrong on that? Have you considered putting a mouth on Spot, like a dog, to carry stuff?
Marco da Silva: Yeah, we actually do sell a manipulator arm for Spot. It's seven degrees of freedom if you include the gripper, and it has what we call an oven-mitt gripping capability.
But it has a color camera in there and a depth sensor in there to try to make it possible to do things like click on an object and pick it up autonomously. It's seen the most adoption in our public safety use cases, where they wanna investigate a suspicious package, pick it up, shake it around, open doors, that sort of thing.
But yeah, it's a very simple gripper. It's kind of interesting how much you can do with even that; there's this toy, right, the gripper-claw thing that you can buy at a fair or something, and if you just walk around and pick things up in your house, you're fairly capable with that. And so that's what we went after on Spot.
John Koetsier: Where does that mount on Spot?
Marco da Silva: It's kind of on the front of the robot, so right above the front hips.
John Koetsier: Interesting. Very, very cool. I wanna talk a little bit about training, and we're gonna get into sensing and movement and safety, because a lot of the challenges with robotics are related to training and learning and AI.
Talk about some of what you're doing in that area in terms of the robot understanding where it is in the world, what's around it, what to do, and how to do it.
Marco da Silva: Yeah. You were talking about the golden age of robotics; I think this is actually gonna be one of the enablers for that.
There's been a tremendous amount of progress on these general-purpose generative AI models, foundation models, that bake in a lot of semantic understanding that was really difficult to do with ML before. And I think you're gonna start seeing a lot more use of that in the future for the robot to understand where it's appropriate to step, where it's not appropriate to step, that sort of thing.
I think when we first started developing Spot, it was a very simple decision: this is a bad place to step or this is a good place to step, or it's okay for the body to go there or it's not. And now that decision-making can be a lot more nuanced thanks to these improvements.
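To illustrate how semantic understanding can make that stepping decision more nuanced than a purely geometric good-place/bad-place check, here's a hypothetical Python sketch that blends a geometry score with a semantic label from a vision model. The classes, weights, and thresholds are invented for illustration and don't describe Spot's actual planner.

```python
# Hypothetical footstep scoring: combine a geometric check (is this patch flat
# and smooth enough?) with a semantic label (what is this patch, actually?).

# Penalty for stepping on each semantic class (made-up values).
SEMANTIC_PENALTY = {
    "concrete": 0.0,
    "gravel": 0.2,
    "grate": 0.4,
    "puddle": 0.8,
    "cable_bundle": 1.0,  # never worth stepping on, regardless of geometry
}

def step_cost(slope_deg: float, roughness: float, label: str) -> float:
    """Lower is better; return infinity for places the robot refuses to step."""
    geometric = slope_deg / 45.0 + roughness        # crude geometry-only term
    semantic = SEMANTIC_PENALTY.get(label, 0.5)     # unknown classes: be cautious
    if semantic >= 1.0 or slope_deg > 40.0:
        return float("inf")                         # hard veto
    return geometric + 2.0 * semantic               # weight semantics heavily

# Example: a flat steel grate vs. slightly sloped concrete.
print(step_cost(2.0, 0.05, "grate"))      # ~0.89
print(step_cost(10.0, 0.05, "concrete"))  # ~0.27 -> prefer the concrete
```

A geometry-only planner would rate the flat grate as the better foothold; adding the semantic label is what lets the more nuanced preference emerge.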
John Koetsier: I was chatting with Apptronik's CEO a couple of months ago, and we were chatting about LLMs and training, and we were talking about starting to adapt them for building the stages of learning. The example that we used was: suppose I have a robot, and it's in my home and it's in the kitchen, and I say, make a sandwich.
Or cut a tomato, let's say, right? And for a human, that one instruction works, because we have this whole worldview. We understand where knives can be found, that you need a cutting board, that you've gotta go in the fridge to get the tomato. The cutting board goes on the counter, the tomato goes on there, the knife goes in your hand, and so on.
But for a robot, that's tens of thousands of instructions if you were gonna program it.
Right. And so he was looking at potentially using LLMs for translating those instructions into, essentially, programming to do different things. Have you looked at stuff like that? How are you approaching complex actions?
Marco da Silva: Yeah, I mean, we've definitely played around with that, and we have a video on our YouTube channel that's kind of a simple version of that, where you can have a natural conversation with the robot, and it speaks back to you and describes what it sees. But you can also command it to do things like "show me the kitchen," and it kind of knows how to translate that text back into its map
and then plan a route that way. And we've had lots of customers of ours who have leveraged Spot's API to do similar work. So that's definitely an active area of development out there.
Mike Murphy: I'll also point out that in the warehousing space, lots of warehouse management systems basically use voice commands to a picker to say,
go to this bay and get me five boxes. The level of software integration that would have to go in, and the number of APIs you would have to cut in to integrate with all these different warehouse management systems, could be immense. Or you can simply take that very simple command that a human understands instantly and, poof, you're done.
John Koetsier: So you've got that right now? You can do that right now, or you're working on that, so that you can basically tell Stretch: hey, go to such-and-such a bay, grab these boxes, and come back?
Mike Murphy: Yeah. Go to such-and-such a bay, grab these boxes, tell me how much inventory is left on the pallet. These are pretty simple commands, and we're working towards fully integrating those so that Stretch can just take that level of input and execute a task.
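As a sketch of what turning that level of input into an executable task might look like in software, here's a small hypothetical Python parser for picker-style commands. The command grammar, task structure, and field names are invented for illustration; this isn't the actual Stretch or warehouse-management-system integration.

```python
import re
from dataclasses import dataclass
from typing import Optional

@dataclass
class WarehouseTask:
    action: str                 # e.g. "pick" or "count"
    bay: str                    # e.g. "B12"
    quantity: Optional[int] = None

def parse_command(text: str) -> Optional[WarehouseTask]:
    """Turn a simple spoken-style command into a structured task.

    Handles patterns like:
      "go to bay B12 and grab 5 boxes"
      "go to bay A3 and tell me how much inventory is left"
    Returns None if the command isn't understood.
    """
    text = text.lower()
    bay = re.search(r"bay\s+([a-z]?\d+)", text)
    if not bay:
        return None
    if "inventory" in text or "how much" in text:
        return WarehouseTask(action="count", bay=bay.group(1).upper())
    if "grab" in text or "get" in text or "pick" in text:
        qty = re.search(r"(\d+)\s+box", text)
        return WarehouseTask(
            action="pick",
            bay=bay.group(1).upper(),
            quantity=int(qty.group(1)) if qty else None,
        )
    return None

print(parse_command("Go to bay B12 and grab 5 boxes"))
# WarehouseTask(action='pick', bay='B12', quantity=5)
print(parse_command("Go to bay A3 and tell me how much inventory is left"))
# WarehouseTask(action='count', bay='A3', quantity=None)
```

In practice, the natural-language front end could just as easily be an LLM constrained to emit this kind of structured task; the point is that the robot executes a small, well-defined vocabulary of tasks rather than raw free text.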
John Koetsier: Super, super interesting. Well, I wanna talk a little bit about humanoid robots. I mentioned you guys were the OG in the robotics space. We've seen Atlas do crazy things for a long time; it's been amazing. You've changed the platform, moving it from hydraulics to electric.
You're still developing and innovating there. I don't think you're actually selling a product there. I'm not sure anybody is selling a product right now for a humanoid robot. I think a few people have sort of announced that, and maybe there are a couple of test customers, but I don't think anybody's got, like, here's your robot store:
go get your humanoid robot, ten thousand or a hundred thousand dollars a pop, walk them out the door and put them to work in a warehouse. I don't think anybody has that right now. And I realize you guys aren't the ones who are working directly on that. Talk to me about that development, that effort, where you see that going and where you see that type of robot being useful.
Mike Murphy: I'll spin it maybe a little differently.
What do you need to make a great humanoid? What are the fundamental building blocks? Man, you're not carrying that much battery, so you need some pretty awesome battery chemistry to allow you to get either the energy or power density that you care about. Your actuators need to be super efficient because you don't have that much battery going around.
Your ability to reason about that 50-pound box: how is that disturbing my dynamics? Because my structure is much lighter, so how do I use a lighter structure to still do a heavier task? All of these technologies, whether they end up in a humanoid robot or not, make Spot better, make Stretch better, make robotics as a whole better.
So whether the end-all, be-all of all form factors is humanoid robots, I don't think anybody here is gonna argue. But anything that is developed in the pursuit of a humanoid robot, I think, represents great technological progress for robotics as a whole.
John Koetsier: Super interesting. Let's talk about robotics. You guys mentioned off the top that we're potentially entering a golden age of robotics, and we know that we have immense challenges.
There's a continual and ongoing labor shortage, certainly in some countries, not all countries. There's a continuing, ongoing crisis in elder care, healthcare, and other things like that. You look at a country like Japan, where the birth rate is crazy low and the population is really, really old.
Who's gonna take care of all those people? How's that gonna work, right? We see robotics coming into its own as a field that can potentially help us here. We also see all kinds of possibilities if labor were free. Okay, that's a really dangerous thought: we all work, we need money, right? So we've gotta figure out a lot of stuff, and we do that.
But if labor were free, what would that enable us to do environmentally? What would that enable us to do for homeless people? What would that enable us to do in so many different places? And of course robots are tremendously expensive right now, but we see the progress and we see the evolution.
We can assume, we can extrapolate from other electronics, that they trend cheaper, right? Where do you see this path going?
Alex Perkins: Maybe I'll start. We have a few avenues right now where our robots are kind of helping humanity, and actually doing so for the people who are currently doing some of these jobs.
So in the Spot world, maybe it's inspecting a nuclear facility that's been decommissioned, where you'd have to expose yourself to radiation. Spot should do that job, right?
And in the warehouse space, you're sending people into a very hot container that's been sitting in the yard to unload these boxes, and it's baking in the sun and the boxes are heavy.
And you said you did this job: nobody wants to go in and do that. It's unhealthy in the moment,
from the heat effects, and it's unhealthy over time as you have this ergonomic challenge and you move all this weight. These are jobs that our robots can currently go and do. Now, I think we'll move beyond that and start to get into freeing up other spaces for people to figure out: how do I wanna enjoy my space on this planet and improve the planet?
Right now we can just directly attack those dull, dirty, dangerous activities with our products.
John Koetsier: Mm-hmm. It's interesting if you think of something like Stretch, right? Obviously you're positioning it for warehouses and everything like that; it makes perfect sense. But you can imagine a form factor that's not terribly different from that, with a few different actuators and arms on it.
In healthcare, a continual problem is moving people from bed to bed, getting people up, helping somebody have a shower when they're not mobile, right? And a hospital is designed to let things the size of a stretcher on wheels go through, right? So you can imagine similar form factors.
Alex Perkins: I think you just got the name for the new robot:
a Stretcher. Not far off from what we have now.
John Koetsier: Trademarked! Yeah. Excellent. Okay, cool. I guess, last hits: what can we expect from Boston Dynamics over the next year? What can we expect to see?
Alex Perkins: Well, let's see. On the warehouse robotics side, we have some initial customers working with early versions of our Stretch robot, and we are working hard to expand globally, essentially. And you'll see more of these in warehouses. I mean, most people don't go into warehouses on a daily basis, but if you work in the industry, our robot will become more commonplace as we roll it out more broadly, which is very exciting for us.
Mike Murphy: And more and more of the packages that you order online will, at some point along their journey, have had their products touched by a Stretch robot.
John Koetsier: Cool, cool. Yeah. Deliver.
Marco da Silva: Yeah, someday maybe. So on the Spot side, we're starting to really see scaled adoption with our industrial inspection customers.
So you should start to see a lot more publicity around big customers seeing value out of Spot inspecting their industrial facilities. We've got a couple already in food and beverage, but we'll start to see it in other industries as well this year.
John Koetsier: Cool. I was gonna wrap up here, but that just begs a bunch of other questions, right? Are you working hard on integrating robots with other robots? You mentioned inspection, right? So that's great: obviously a boots-on-the-ground type of thing. Useful, excellent.
You can imagine that integrating with a drone, a flying system, right, to get a different perspective on stuff. How does that work? Are you building software integration protocols, or even physical integration protocols, that sort of thing? Can Spot eventually launch its own aerial drone component and fly over and report back?
Or am I going too far into the future?
Marco da Silva: It's certainly something we've got our eye on. So, Spot has an API, and Orbit, our fleet management system for Spot, also has an API, and it's kind of purpose-built to manage fleets of robots and the data coming off of those robots.
So we think there's kind of a natural extension there to eventually manage robots in general, right?
John Koetsier: Mm-hmm. Manage and integrate the sensor data from multiple different robots, different form factors, throughout your facility perhaps, and get a sort of holistic sense of awareness.
Almost build a digital twin, if you would, for something like that. Super interesting, guys. I really, really appreciate your time. Thank you for taking this time.
Alex Perkins: Thank you. A pleasure. Thank you.
TechFirst is about smart matter … drones, AI, robots, and other cutting-edge tech
Made it all the way down here? Wow!
The TechFirst with John Koetsier podcast is about tech that is changing the world, including wearable tech, and innovators who are shaping the future. Guests include former Apple CEO John Sculley, the head of Facebook gaming, Amazon's head of robotics, GitHub's CTO, Twitter's chief information security officer, scientists inventing smart contact lenses, startup entrepreneurs, Google executives, former Microsoft CTO Nathan Myhrvold, and much, much more.