Robots in the cloud: How Amazon’s ‘fake’ robots are making real robots smarter and faster


Is the future of robotics in the cloud? In this episode of TechFirst we’re chatting with Amazon exec and AWS Robotics general manager Roger Barga about how Amazon’s “fake” robots are making real robots smarter and faster.

We’re seeing more and more robots in manufacturing, services, hospitality, and almost every other industry … but there are still huge gaps in the software needed to run, manage, and coordinate all our robots and drones. Amazon is reducing the cost to train, test, and deploy robots by a factor of perhaps 1,000 thanks to AWS RoboMaker and WorldForge.

We chat about companies like iRobot that are vastly improving their robots, the future of robotics in both the near and far term, and how Amazon ‘eats its own dog food’ by using AWS RoboMaker to train and deploy robots in its own warehouses.

Scroll down for full audio, video, and a transcript of our conversation …

Subscribe to the TechFirst podcast

 

Watch: Robots in the cloud via AWS RoboMaker

Subscribe to my YouTube channel so you’ll get notified when I go live with future guests, or see the videos later.

Read: Robots in the cloud

(This transcript has been lightly edited).

John Koetsier: Is the future of robotics in the cloud? Welcome to TechFirst with John Koetsier. We’re seeing more and more robots in manufacturing, services, hospitality, and frankly, almost every other industry. But there are still huge gaps in the software to run and manage and coordinate all our robots and our drones. And just maybe, simulated drones will make real robots actually smart enough to be useful.

To dive in and chat more, we’re talking with Roger Barga, who’s the General Manager of AWS Robotics and Autonomous Services at Amazon Web Services. Welcome, Roger! 

Roger Barga: Thank you, John. Thank you for the opportunity to talk about a topic that’s close to my heart.

John Koetsier: It better be close to your heart. It’s your career, you better love it 🙂 

Roger Barga: That’s right.

John Koetsier: Let’s start here. I interviewed somebody recently on drones, and he was talking about drones and robotics, and said that robots are kind of at the iPhone-before-the-app-store stage, right? There’s hardware, there’s amazing hardware in a lot of cases, but not enough software, especially connective software.

Does that make sense? Do you agree with that? 

Roger Barga: I would agree. I mean, if you look at the situation today and what’s been happening, the price of the hardware to build a robot has been falling year after year for the past two or three decades now. You know, a sensor like lidar — which is what a robot might use to sense its environment and localize itself — could have cost $75,000 ten years ago. Today, you can get a really good lidar for around $1,000. There have been algorithmic advances as well. Cameras are now actually allowing a robot to sense and localize itself, and those solutions are under $100.

John Koetsier: Mm-hmm.

Roger Barga: So we’ve seen the price of hardware fall, but the labor cost of actually writing the software to build a robot hasn’t.

So there’s really a stark contrast between the availability of inexpensive hardware and the software that you need to actually build and operate that robot.

But going forward, we’re actually starting to see that change. The future is looking promising. There’s been an effort — if you’re an academic or researcher you would have used something called ROS, the Robot Operating System, which is open source software developed for academic research and education.

But in the last two or three years there’s been an effort called ROS 2, which is to make a commercial-grade, production-ready, secure, tested version of ROS that any developer can pick up and use for free. And there’s a huge ecosystem of open source software, again, coming up to production grade and production ready, as well as dozens of companies actually committing software to ROS. So the landscape is changing fast on the software front.

John Koetsier: That’s really good news. I mean, obviously good things are happening on the hardware side; there’s an explosion of different types of robots and everything like that. And even with lidar, there are probably even cheaper versions now … the iPhone 12 will probably have some sort of baby lidar in there, right? Not the kind that you’d want for your self-driving car maybe, or your autonomous robot, but that’s amazing.

But the software is really essential — and there are two kinds of software I’ve seen the need for as I’ve been looking at this space. There’s the on-device software, obviously — or it could live in the cloud if the robot is always going to be connected — the software that runs what one robot does. And then there’s also this class of software that runs coordination, right? It could be device-to-device, it could be multiple robots of the same kind, it could be different types of robots, but getting them integrated into a sort of coordinated whole.

Where are we in those areas? 

Roger Barga: Yeah, so I agree with the two categories. I’ll talk about a third one that I see as well. But there is that software that runs on the robot, and that’s been plodding along every year as new hardware gets put out and new algorithms for navigation come out. Developers have been building this, again, putting it into open source or commercializing it.

What’s disrupting — the disruption happening there, and you mentioned it — is moving some of that functionality to the cloud. It actually keeps the cost of the robot lower. You don’t have to add complicated processing hardware or algorithms to your robot.

And developers are now starting to think, what needs to run on the robot because of low latency, disconnected operation, and what processing makes more sense to move to the cloud, and may actually be a force amplifier because robots could be sharing information with each other.

John Koetsier: Mm-hmm.

Roger Barga: Or leveraging a service like Amazon Polly or Amazon Lex to get voice capabilities without writing a line of code that gets added to the robot. And that second category is software to manage the robot. We look at this as fleet management: software, often used by the operator of the robot, to command it, control it, and find out where it’s at. That’s how you get value out of a robot.

And today that software is mostly bespoke. The Teslas of the world have stitched things together; Amazon Robotics has built its own software stack for managing its fleets. So that has a long way to go before it’s general purpose.

The third category I’d like to share with you is operations, or DevOps, for robotics. And if you think about it, robots have very special needs. You can’t afford to just push a new version of your application out to a robot untested. It’s operating in the physical world, and if you don’t test it extensively, bad things could happen to your robot.

John Koetsier: Yes.

Roger Barga: You could potentially lose it. So we say, ‘Friends don’t let friends deploy apps to their robots without testing and simulation.’ So this idea of being able to…

John Koetsier: Deploying to production is a really big deal when what you’re deploying is not just a website. It’s an actual physical — could be multi-thousand-pound — robot.

Roger Barga: Can you imagine updating a jet aircraft while it’s in mid-flight? These are the kinds of risks we have to guard against, and often that’s done with simulation. So you do extensive simulation, but you also integrate simulation with your CI/CD pipeline so the code is always being tested before it’s deployed.

You have to check that the robot isn’t performing some function before you attempt to update it. So again, there are a lot of considerations. DevOps for robots is a third category we’re seeing just now start to emerge as robotics grows and becomes a bigger part of the software world.
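To make that ‘simulation in the CI/CD pipeline’ idea concrete, here is a minimal sketch of a deployment gate built on the AWS RoboMaker simulation API via boto3. The ARNs, IAM role, package name, and launch file are placeholders, and treating a ‘Completed’ job as the pass signal is an assumption for illustration; a real pipeline would also inspect the test output itself.

```python
# Hypothetical CI/CD gate: run a RoboMaker simulation job and only
# allow deployment if it completes. All ARNs, the IAM role, and the
# launch config are placeholders, not real resources.
import time
import boto3

robomaker = boto3.client("robomaker")

def simulation_gate(sim_app_arn: str, iam_role_arn: str) -> bool:
    """Launch one simulation job and block until it reaches a terminal state."""
    job = robomaker.create_simulation_job(
        iamRole=iam_role_arn,
        maxJobDurationInSeconds=3600,      # hard cap on test time
        failureBehavior="Fail",            # end the job on failure
        simulationApplications=[{
            "application": sim_app_arn,
            "launchConfig": {
                "packageName": "my_robot_tests",    # placeholder package
                "launchFile": "regression.launch",  # placeholder launch file
            },
        }],
    )
    while True:
        status = robomaker.describe_simulation_job(job=job["arn"])["status"]
        if status in ("Completed", "Failed", "Canceled", "Terminated"):
            return status == "Completed"
        time.sleep(30)  # poll until the job finishes

if __name__ == "__main__":
    ok = simulation_gate(
        "arn:aws:robomaker:us-west-2:123456789012:simulation-application/example",
        "arn:aws:iam::123456789012:role/RoboMakerSimRole",
    )
    raise SystemExit(0 if ok else 1)  # non-zero exit blocks the deploy step
```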

John Koetsier: It’s funny you bring that up, because my son is a third-year mechatronics student at the University of British Columbia, and he got invited by DJI to go to Shanghai, pre-COVID, for what I think they call RoboMaster, where they have to build various types of drones and control software and everything like that. And it is not easy; this is challenging stuff.

Roger Barga: It is. It is. 

John Koetsier: What else do we need? I mean, we’ve talked about software that controls a robot or a drone. We’ve talked about software that integrates them and connects them and even manages them.

But what about connections into maybe enterprise systems, or reporting, or supply chains, or something like that? Because we’re increasingly seeing robots being used to do real work, to create stuff, to manage stuff. And we’re seeing robots start to hand off jobs to other robots, right? Is that yet another class or is that just APIs into existing applications? 

Roger Barga: I would argue it’s a service integration task by and large, because robots are a great source of digital exhaust about what’s going on in your business. You want to know how much product is being used, how fast it’s being moved, where the robot is at any certain point in time.

Often enterprises already have systems with this information, and they want to integrate the data coming off the robot into those existing systems to get a holistic view of what’s going on out on the plant floor or the fulfillment center floor — and to be able to dispatch the robot as orders come in, so it can immediately go find the item and fulfill the order. So there is a service integration task there.

And again, going back to fleet management, there’s functionality we want to see in fleet management to manage these robots. Because together they’re getting something done, you know, overall optimizing some business objective, like moving product out as fast as possible every single day.

John Koetsier: Yeah. Yeah. So, AI is boosting our ability to do one crucial thing, right? I mean, it’s great to have hardware. If it doesn’t know what to do, or where to do it, or how to do it, or anything like that, it’s useless. And so we’ve got robots that are, you know, $50,000, $20,000, or cheaper, or more expensive, and they need to be trained in order to do something that’s useful.

So increasingly we’re seeing that training happen in virtual environments and that can speed it up. Talk about where we are with that and what that looks like, and how that works. 

Roger Barga: Yeah. So it’s an area that has great promise, but it has not been reduced to practice yet. We’re seeing a lot of very early explorations in research labs and in constrained environments. AWS DeepRacer launched in 2018 at re:Invent. It’s a 1/18th-scale autonomous vehicle, and a reinforcement learning agent learns how to drive the car in simulation — it runs on RoboMaker simulation, the service my team has built.

And it runs dozens of simulations of different racetracks until the model generalizes enough that it does a reasonably good job of driving around all these various tracks, and only then is the reinforcement learning agent deployed to a real physical car. It’s fun, and it actually teaches reinforcement learning to students, but it’s far from being able to drive a real car, or even do something like pick an item up off a shelf.
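As a rough illustration of the pattern Barga describes (train across many simulated variations until the behavior generalizes, then deploy), here is a toy tabular Q-learning loop over randomly generated ‘tracks.’ This is not DeepRacer’s actual stack; the environment, states, and rewards are invented purely for the example.

```python
# Toy version of "train across many simulated variations so the policy
# generalizes": tabular Q-learning on randomly generated 1-D "tracks."
# NOT DeepRacer's actual stack; everything here is invented for the example.
import random
from collections import defaultdict

ACTIONS = [-1, 0, 1]  # steer left, go straight, steer right

def make_track(length=20, seed=None):
    """A 'track' is just a target lane position for each timestep."""
    rng = random.Random(seed)
    lane, track = 0, []
    for _ in range(length):
        lane = max(-2, min(2, lane + rng.choice([-1, 0, 1])))
        track.append(lane)
    return track

def run_episode(q, track, epsilon=0.1, alpha=0.2, gamma=0.9):
    """One pass down the track, updating Q-values as we go."""
    pos, total = 0, 0.0
    for i, target in enumerate(track):
        state = (pos, target)
        action = (random.choice(ACTIONS) if random.random() < epsilon
                  else max(ACTIONS, key=lambda a: q[(state, a)]))
        pos = max(-2, min(2, pos + action))
        reward = 1.0 if pos == target else -float(abs(pos - target))
        next_target = track[i + 1] if i + 1 < len(track) else target
        next_best = max(q[((pos, next_target), a)] for a in ACTIONS)
        # Standard Q-learning temporal-difference update.
        q[(state, action)] += alpha * (reward + gamma * next_best - q[(state, action)])
        total += reward
    return total

q = defaultdict(float)
for episode in range(2000):
    # A fresh random track each episode, so the policy can't memorize
    # one layout and must learn behavior that transfers.
    run_episode(q, make_track(seed=episode))

# Evaluate greedily (no exploration, no learning) on unseen tracks.
held_out = [make_track(seed=10_000 + i) for i in range(20)]
scores = [run_episode(q, t, epsilon=0.0, alpha=0.0) for t in held_out]
print(f"mean reward on unseen tracks: {sum(scores) / len(scores):.2f}")
```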

But we’re seeing huge advancements right now.

I mean, if you’d said three to five years ago that this was going to be possible — training a robotic arm with reinforcement learning — everybody would have scoffed, because the complexity of contact mechanics, modeling the hand or a grasping arm of a robot, is so much greater than the dynamics of wheels on a car learning how to drive and turn. But the physics models and simulation engines have gotten so much better that we’re now able to accurately model the dynamics of a complex robotic arm.

Perception is another big challenge.

A few years ago you wouldn’t have been able to model high-quality perception — what a camera would actually see — in a simulator, but today graphics engines from gaming are being added to simulation. So this combination of better physics algorithms, to be more realistic, and better perception rendering, so you can model what the robot will actually see in the real world — those two together are making this achievable. So we should expect to see advances on this in the next few years. But there is a challenge … if I may share with you.

John Koetsier: Go ahead. Please.

Roger Barga: Yeah, in machine learning the only way this will generalize is if you show it lots and lots of examples, all possible conditions under which it’s expected to operate, and only then will the reinforcement learning algorithm generalize.

And to do this you have to create a simulation, but that is incredibly expensive. A single simulation can cost tens of thousands of dollars, and now you want to create hundreds of them to show the diverse conditions under which the robot has to operate. Two months ago, we launched a capability of RoboMaker called WorldForge, and within minutes it can create a simulation of a residential environment — and not just one, but hundreds or thousands of them, varied across all simulations, and do so for about $1.50 per simulated world.

And so these are the kinds of advances we’re going to have to have. The technology is advancing, but we also need to create the simulations so these models will generalize.

John Koetsier: So, repeat that for us maybe a little bit differently. You said to create a simulated world or simulation that you can train a robot, and what did you say that was … it could cost $10,000 or more than that? 

Roger Barga: Or more than that. And weeks of work, because you have to have artists come in and actually create the 3D environment. You have to have robot simulation engineers come in, add the physics, and annotate all the objects in the environment with physics — this is the friction of this road, this is the friction of this surface over here. So again, a lot of editing, creation, and artists to create a single simulated world.

John Koetsier: Wow.

Roger Barga: Think about trying to create a house with six rooms and all the variations and all the furniture in the house and the physics of all those odd objects. That’s what WorldForge has automated. But I think we’re…

John Koetsier: For $1.50.

Roger Barga: Yeah, for a simulated world. Yeah. We’ve done it for residential areas; we’re going to do warehouses and hospitals next. But that’s the missing ingredient: you have to show it lots of examples for it to generalize, and that requires different simulations.
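For developers curious what ‘hundreds of worlds in minutes’ looks like in code, here is a hedged sketch using RoboMaker’s WorldForge calls in boto3. The world template ARN is a placeholder and the counts are illustrative; consult the RoboMaker documentation for current limits and pricing.

```python
# Hedged sketch: generate many varied simulation worlds from a single
# WorldForge template. The template ARN is a placeholder; counts and
# polling cadence are illustrative only.
import time
import boto3

robomaker = boto3.client("robomaker")

TEMPLATE_ARN = "arn:aws:robomaker:us-west-2:123456789012:world-template/example"

# Ask WorldForge for 10 floor plans with 5 interior variations each,
# i.e. 50 distinct residential worlds from one template.
job = robomaker.create_world_generation_job(
    template=TEMPLATE_ARN,
    worldCount={
        "floorplanCount": 10,
        "interiorCountPerFloorplan": 5,
    },
)

# Poll until generation reaches a terminal state, then list the worlds.
while True:
    desc = robomaker.describe_world_generation_job(job=job["arn"])
    if desc["status"] in ("Completed", "Failed", "PartialFailed", "Canceled"):
        break
    time.sleep(15)

worlds = robomaker.list_worlds()["worldSummaries"]
print(f"generation finished with status {desc['status']}; "
      f"{len(worlds)} worlds available")
```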

John Koetsier: Very, very interesting. It reminds me of a self-driving startup I talked to recently: they were using simulated roads, based on lots of examples, to quickly train their AI as well. Are there any challenges or downsides to that? I mean, you mentioned repeatedly that you have to give it lots and lots of examples.

Because I guess if it’s missing something, then there’s a piece of the world that it doesn’t know about, and somewhere someone’s going to find that out in the wild, in production, and so that’s the danger, I suppose. 

Roger Barga: It is indeed, which is why I’m very comfortable working in robotics in constrained environments. And the robots also have a separate logic computer, a safety bubble, which will stop them immediately — from 5 miles an hour to an immediate halt. So it’s a much harder challenge when you’re talking about self-driving cars; an order of magnitude more complex, actually.

John Koetsier: I have to ask, because I think Amazon is probably a big consumer of its own technology here, because Amazon is a massive user of robotics in your warehouses and everything like that.

Do you use that software? Do you eat your own dog food — use that software to build your own services as well? 

Roger Barga: Yes, we absolutely do. Amazon Robotics is a customer of my service, so we’re working with them to support their requirements. We’re working with teams that are adopting ROS 2 for their robots as well. And so, yeah, we do believe in this virtuous cycle of learning from Amazon as a customer and then serving their needs as well.

John Koetsier: Well, that’s been the genius of Amazon for 20 years, right? Everything is built as a service, everything’s available as an API, and everything internally is used that way. So it really worked to go external with that. Very, very interesting.

I want to talk about some of the projects that you’ve done that might’ve been really interesting, the ones that you thought were pretty cool. What are some of the most interesting jobs people have done recently via AWS with robots? 

Roger Barga: Yeah, so iRobot is one of my customers. They build a very high-quality vacuum cleaner, actually several different models, and they have to rigorously test their code before allowing it to go into production.

They actually wanted to expand their suite of tests, but it was already taking them weeks to test the robot app before they could put it out into people’s homes and publish an update. Now they use RoboMaker simulation, and every time a developer checks changed code into the repository, they run thousands of different tests and simulations.

John Koetsier: Amazing.

Roger Barga: Yeah, basically every single day they’re running 5,000 simulated tests — a quarter of a million square feet of vacuuming done every day in simulation — to test their application. And they’ve gone from three weeks of testing down to about an hour and a half, because they can run all these simulations in parallel across all the scenarios they expect to see in a home.

And they’ve mentioned publicly that they’ve caught bugs that would otherwise have gone into production, into people’s homes, as a result of using simulation to ensure quality. So that’s that category of software we talked about: the DevOps, if you will, for keeping your robot healthy and operating correctly.

John Koetsier: That’s a complete game changer. I mean, you just said you took something like three weeks of testing down to, what did you say … 45 minutes, half an hour? 

Roger Barga: A little over an hour, little over an hour. 

John Koetsier: Wow. That’s an utter game changer in what you can deliver to the market.

Roger Barga: Yeah, and it was a game changer for the company as well. As I mentioned, their code base is always kept at a very high quality. Bugs aren’t being propagated, they’re catching things before they go into production, and they’re even identifying additional tests they want to add as well.
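The iRobot numbers come from parallelism: thousands of independent simulation runs at once rather than one after another. Here is a hedged sketch of that fan-out pattern using RoboMaker’s batch API; the ARNs, package names, and scenario list are placeholders, and passing the scenario through an environment variable is just one way a run might be parameterized.

```python
# Hedged sketch of the fan-out pattern: launch many simulation jobs in
# one batch so scenario tests run in parallel instead of serially.
# ARNs, role, package names, and scenarios are placeholders.
import boto3

robomaker = boto3.client("robomaker")

SIM_APP_ARN = "arn:aws:robomaker:us-west-2:123456789012:simulation-application/example"
IAM_ROLE = "arn:aws:iam::123456789012:role/RoboMakerSimRole"

# Each scenario becomes its own simulation job request; RoboMaker runs
# them concurrently, up to the batch policy's limits.
scenarios = ["kitchen_chairs", "pet_bowl", "rug_fringe", "cable_cluster"]

requests = [{
    "iamRole": IAM_ROLE,
    "maxJobDurationInSeconds": 1800,
    "failureBehavior": "Fail",
    "simulationApplications": [{
        "application": SIM_APP_ARN,
        "launchConfig": {
            "packageName": "vacuum_tests",         # placeholder
            "launchFile": "run_scenario.launch",   # placeholder
            "environmentVariables": {"SCENARIO": name},
        },
    }],
} for name in scenarios]

batch = robomaker.start_simulation_job_batch(
    batchPolicy={"timeoutInSeconds": 3600, "maxConcurrency": len(requests)},
    createSimulationJobRequests=requests,
)
print("batch started:", batch["arn"])
```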

John Koetsier: Very interesting. Any other cool customers? 

Roger Barga: Yeah. So, Woodside Energy — they’re in Australia — already had ground mobility robots running ROS. These robots were carrying materials from point A to point B, and Woodside wanted to add a little more intelligence to them without necessarily adding more hardware.

So they integrated our cloud service extensions, which connect their robots with our cloud services: Polly and Lex, so you can have voice interaction with the robot, and Kinesis Video Streams to stream video, video-like data, or telemetry off the robot.

And within a few hours they were able to stream video and create a control center, so they could actually see what the robot was seeing — and record that video as well, which was something they wanted as a business objective for their operations out in the field. But they also realized the robot was going by some very interesting assets, like glass oil reservoirs. And they wanted to build a machine learning model to just snap an image and send an alert if the oil level in a reservoir was below a threshold.

John Koetsier: Oh wow. 

Roger Barga: They used SageMaker to build a machine learning model from labeled, supervised data. They stream snapshots of the oil reservoirs up to the cloud, feed them to the machine learning model they created, and get an alert on the same dashboard where they’re watching the camera video: ‘Oil reservoir 24 needs to be filled, it’s 10% below.’ So, things like that.

And they didn’t add a single piece of hardware to the robot to do that, they just added a little software and connected it to the cloud. 

John Koetsier: I mean, that’s intelligence, right? If you had a human driver going by there, he or she might notice that sort of thing and report it, or might not, who knows — but an intelligent person who is a good worker would do that. And now the robot that’s going past there anyway is doing the same thing. Multitasking, essentially.

Roger Barga: Exactly. Exactly. This is the kind of thing that excites me. Again, it didn’t require bringing all these robots in; it was just a software update to the robot and away they go.
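The Woodside pattern (snapshot, inference, alert) comes down to a few dozen lines once a model is behind a SageMaker endpoint. Here is a hedged sketch; the endpoint name, SNS topic, and JSON response shape are assumptions for illustration, since the real setup depends on how the model was packaged.

```python
# Hedged sketch of the snapshot -> inference -> alert loop. Endpoint
# name, topic ARN, and the JSON response shape are assumptions; a real
# deployment depends on how the model was packaged and served.
import json
import boto3

sm_runtime = boto3.client("sagemaker-runtime")
sns = boto3.client("sns")

ENDPOINT = "oil-level-estimator"  # placeholder endpoint name
TOPIC_ARN = "arn:aws:sns:us-west-2:123456789012:reservoir-alerts"  # placeholder
THRESHOLD = 0.10  # alert when the level is 10% or more below target

def check_reservoir(reservoir_id: str, jpeg_bytes: bytes) -> None:
    """Send one camera snapshot to the model; alert if the level is low."""
    response = sm_runtime.invoke_endpoint(
        EndpointName=ENDPOINT,
        ContentType="image/jpeg",
        Body=jpeg_bytes,
    )
    # Assumed response shape: {"fill_fraction": <float between 0 and 1>}
    result = json.loads(response["Body"].read())
    shortfall = 1.0 - result["fill_fraction"]
    if shortfall >= THRESHOLD:
        sns.publish(
            TopicArn=TOPIC_ARN,
            Subject=f"Reservoir {reservoir_id} needs filling",
            Message=f"Estimated level is {shortfall:.0%} below target.",
        )

# Usage: feed frames captured from the robot's video stream, e.g.
# with open("snapshot.jpg", "rb") as f:
#     check_reservoir("24", f.read())
```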

John Koetsier: So, it feels like we’re at the cusp here. We’re kind of at the beginning. We’re just starting to unlock the potential for robots in manufacturing, in construction — which is a really, really challenging environment — and maintenance. Search and rescue is an area that I’ve seen a lot going on in drones and other things like that. In security, we’ve seen a number of robots coming out there. In fact, in hundreds of different areas.

Maybe prognosticate for us for a few minutes, what do you see coming out over the next 5 to 10 years in terms of robotics? 

Roger Barga: Yeah. So we’re seeing really strong adoption of ground mobility robots just to move material around … warehouses, logistics, fulfillment centers, home delivery such as Amazon Scout.

In 10 years, I fully expect that all mobile material will be moved by autonomous robots. It’s going to be a huge game changer.

I mentioned ROS 2 earlier — having a free, commercial-grade, production-ready version of ROS that developers can build upon lets them focus on just what they want to add to their robot. I think startups are going to blossom because of that; it has just lowered the barrier to entry.

But I’m reminded of Amara’s Law. You know, Roy Amara was a futurist, a Stanford faculty member, and in 2006 he noted that we overestimate the impact of technology in the near term and tend to grossly underestimate it in the long term … and that today is known as Amara’s Law. I believe we’re overestimating the near-term impact of robots today. We’re still in our infancy. It’s still day one.

But I also think that we’re underestimating the impact over the next 7, 10, 15 years. At AWS Robotics, we look to shape how robots are programmed, managed, and operated through cloud services, and the role the cloud will have in the future of robotics. And that’s what really excites me. And I think that’s going to be, again, one of the most powerful tools developers have access to, is cloud and cloud services. 

John Koetsier: I’ve heard that quote. I did not know it was attributed to him, but I love that quote. We’re not good at exponential thinking, are we? 

Roger Barga: No.

John Koetsier: Just … it’s hard for us to see that far.

Roger Barga: Yep.

John Koetsier: Well, I want to thank you for your time, Roger. It’s been a real pleasure. It’s been eye-opening and thank you for taking some time. 

Roger Barga: Thank you, John, for giving me the opportunity to speak with you today.

John Koetsier: Absolutely. For everybody else, thank you for joining us on TechFirst. My name is John Koetsier, of course. I appreciate you being along for the show. You’ll be able to get a full transcript of this podcast in about a week, sometimes three days, at JohnKoetsier.com and the story comes out on Forbes just after that. Plus of course, the full video will be on my YouTube channel. Thanks for joining. Maybe share with a friend. Until next time … this is John Koetsier with TechFirst.