How close are we to autonomous drones? Exyn Technologies says it has achieved pilot-free Level 4 drone autonomy, and in this episode of TechFirst with John Koetsier we chat with company CEO Nader Elm. This is a huge step: it makes drone operations automatic and programmable within a company’s overall operational planning.
Nader and I chat about what drone autonomy means, what level four looks like, and, crucially, what this unlocks for search, rescue, research, security, surveillance, inspection, delivery, and many more drone operations.
According to Exyn, this is no longer point-to-point flight but open-exploration flight at double the speed, with a smoother flight path and higher-quality data. The company calls it the most advanced self-flying drone available today.
Scroll down to subscribe to the audio podcast, watch the video, and get the full transcript …
Or check out the story on Forbes …
TechFirst podcast: level 4 drone autonomy
Watch: Exyn announces self-flying drones
(Subscribe to my YouTube channel so you’ll get notified when I go live with future guests, or see the videos later.)
Read: autonomous drones reach level 4, according to Exyn
(This transcript has been lightly edited for length and clarity.)
John Koetsier: What do autonomous drones unlock? If self-driving cars make travel easy, autonomous drones make transport, inspection, security, research, and so much more … programmatic.
That’s an order of magnitude more useful than drones that have to be flown manually by humans. In this episode of TechFirst with John Koetsier, we chat with Exyn Technologies CEO Nader Elm, who has just announced the achievement of Level 4 drone autonomy.
I’ll start here: you’re announcing Level 4 Autonomy for your drones right now. What does that mean?
Nader Elm: So one of the things we realized very early on as we started talking about our robots is we talk about ‘autonomy’ and the word itself really doesn’t have that much definition.
John Koetsier: Yes.
Nader Elm: I think there’s been some discussion, at least in the driverless car market, so there’s a general understanding over there, and there’s a framework that the SAE put together, but there isn’t anything really outside of that. So we looked around; we couldn’t find anything. So we took it on ourselves to propose at least a framework for that, and not something that we would want to own, because that would be very disingenuous. So we borrowed a lot of the framework from SAE [Society of Automotive Engineers] because that’s already in market, and tried to establish a new way of articulating the different levels of autonomy, at least when it comes to the world of drones, and that’s what we proposed. It’s version one, and we’re looking to evolve that over time.
John Koetsier: So give us a sense of what Level 4 Autonomy means and why that’s such a big step from Level 3.
Nader Elm: So, generally, I think there’s been an evolution in drones anyway. There are unmanned systems, so there’s no person on board, and that offloads a lot of the effort of flying the drone. But generally, over time, there’s still a requirement for certain inputs, whether from an operator or from leveraging infrastructure like GPS. So that’s the first strand.

The other strand is basically how much of the exploration, how much of the mission itself, can be handed over to the system.
So, what we’ve done is essentially with the definition of Level 4, kind of said not only is the operator not flying the system, the operator’s just giving very, very high level kind of mission parameters to the drone and leaving it to the drone to figure out how it’s going to fly itself — not just going from point A to point B, but just figuring out how it’s going to complete the mission from there.
And that’s basically what we have worked on. We call this ‘Scoutonomy’ and we’ve just launched the first iteration of that, and that’s our Level 4A.
John Koetsier: Excellent. So does that also speak to speed of the drone? I mean, like you might have Level 4 at a certain speed like crawling. You might have Level 4 at high speed. Talk about that a little bit.
Nader Elm: Yeah, so concurrent with what we’re doing on autonomy, we’ve also started developing higher levels of speed so you can cover more area for the mission. So, that’s basically another element of what we’ve been working on, which is working at higher speeds.
And that in itself is actually, separately and distinctly, a very, very different challenge, because you’ve got to perceive the world; you’ve got to process the world; you’ve got to anticipate where you’re going to be going to next.
John Koetsier: Yes.
Nader Elm: So, yeah. Crawl before you walk, before you run. So right now we’ve gone from crawl to a walk, and we’ve got aspirations to kind of go into a full sprint in the future.
John Koetsier: Excellent. Excellent. It’ll probably take some hardware to make that happen as well. Let’s talk a little bit about what this unlocks and what clients are going to do with this new capability.
I mean, when I hear about this, I think about drones doing missions. You might have a security drone that is programmed to take a scout around a building every X hours, or a maintenance drone that takes a scout around a bridge every two days, something like that.
What does this unlock?
Nader Elm: So in the first instance, essentially what it does is make it more usable for an operator. You don’t have to work too hard to figure out what the mission is. You talk about bridge inspection, for example.

One way to do it, and a lot of companies are doing this, is basically that the operator directs the drone where to go. That needs a lot of pre-planning and some foresight.
What we’re proposing is that you just need to know roughly where the bridge is. You need to instruct the robot to say, ‘Okay, it’s between here and here, and it’s a cube or a kind of a stretched area,’ and leave the robot to figure out how it’s going to span the bridge, go and cover the bridge completely, and know when it’s done and return to base.
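To make the contrast concrete, here is a minimal sketch of what such a high-level mission specification could look like. This is purely illustrative: the class and field names are hypothetical, not Exyn’s actual API. The point is that the operator supplies only a rough bounding volume and a home point; path planning, coverage, and return-to-base are left entirely to the drone.

```python
from dataclasses import dataclass

@dataclass
class BoundingBox:
    """Rough spatial envelope: 'it's between here and here.'"""
    x_min: float
    y_min: float
    z_min: float
    x_max: float
    y_max: float
    z_max: float

    def volume(self) -> float:
        return ((self.x_max - self.x_min)
                * (self.y_max - self.y_min)
                * (self.z_max - self.z_min))

@dataclass
class ExploreMission:
    """Hypothetical Level 4 mission: parameters only, no flight path."""
    region: BoundingBox            # volume to cover completely
    home: tuple                    # where to return when coverage is done
    max_speed_mps: float = 2.0     # a constraint, not a waypoint list

# The operator only needs to know roughly where the bridge is.
mission = ExploreMission(
    region=BoundingBox(0.0, 0.0, 0.0, 120.0, 30.0, 15.0),
    home=(0.0, 0.0, 0.0),
)
print(mission.region.volume())  # 54000.0 cubic meters to explore
```

Everything inside that box (which cells are flyable, what order to cover them in, when the mission is complete) would be decided on board.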
That makes it a lot easier for you as the operator. You don’t have to sweat the details; leave those to the robot, and that makes it that much easier and better. And in some instances it’ll actually make it more effective, because humans make mistakes. The algorithms, hopefully, will be able to remove the kind of human errors that might otherwise be in there.
John Koetsier: So, in other words, it’s kind of like telling an adult to go and do something versus walking a kid through it step by step by step every time you go. Does a drone need to have an internal representation of ‘bridgeness’ and what to do with a bridge to enable this?
Nader Elm: It doesn’t, no. I mean, it’s working on basically the physical geometry of that thing. All it knows is it’s a cube: I’m going to go into that space, I’m going to figure out where I can and can’t fly, and wherever I can, I’ve just got to make it complete. That’s not to say there aren’t opportunities to start adding in more smarts, to say, ‘Okay, now it doesn’t have to be a bridge.’ If there are particular things of interest that you want to zoom in on, you can train the robot, because, at the end of the day, this is edge computing by definition.
You can train it to look for certain things and if it finds them, to act differently. So, for example, scan the bridge. You identify rust or cracks, now zoom in — I need a lot more detail on those areas and then continue with the mission. So you can actually establish entirely new behaviors and new problem-solving mechanisms in real time, on board.
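As a rough sketch of that detect-and-react behavior, here is a tiny illustrative loop. The detector and the mission actions are stand-ins (no real drone SDK or ML model is used); it only shows the control flow Nader describes: scan, flag defects, record where to zoom in for more detail, and continue the mission.

```python
# Hypothetical sketch of an on-board "detect and react" pass.
# detect_defects() stands in for an onboard ML model; frames are
# simplified dicts rather than real sensor data.

def detect_defects(frame):
    """Pretend model: return any defect labels we were trained to find."""
    return [d for d in frame.get("defects", []) if d in ("rust", "crack")]

def run_scan(frames, zoom_log):
    """Scan frames in mission order; note where high-detail capture is needed."""
    for frame in frames:
        hits = detect_defects(frame)
        if hits:
            # Interrupt the coverage pattern here: capture close-up
            # imagery of the flagged area, then resume the mission.
            zoom_log.append((frame["id"], hits))
    return zoom_log

frames = [
    {"id": 1, "defects": []},
    {"id": 2, "defects": ["rust"]},
    {"id": 3, "defects": ["crack", "bird nest"]},  # nest isn't a trained class
]
print(run_scan(frames, []))  # [(2, ['rust']), (3, ['crack'])]
```

The interesting part is that both the detection and the changed behavior run on board, in real time, rather than on a ground station.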
John Koetsier: That’s really interesting, and when you say, ‘You can train the robot,’ who is the ‘you’ here? Is that a generalized you? Or is that on a per-customer basis? Or is that you as your company training the robot when you see something like this zoom in — who’s training that? How’s that work?
Nader Elm: So initially, it’ll be us. I mean we’re putting on the machine learning models and the various training elements, but our goal is to open it up so other people can put on their machine learning software and their own training models as well, so that it can be more generalized; you can send it out to different applications. We don’t profess to be the be-all and end-all for all things. However, we want to be the enabler for a lot of other applications that other people have capabilities for.
John Koetsier: That sounds really interesting, because I might be a security company and I might have some expertise in development and programming and maybe even AI or something like that, but I don’t have a drone and I don’t have autonomy.
But if I can take this drone, I can take its autonomous capabilities, I can say, ‘Hey, take a little fly around this stadium every 30 minutes or so, come back to base to recharge if you need to, right, but take a fly around. If you see a human where there isn’t supposed to be one — and define that, right — then zoom in and give me some more shots and maybe alert me or something like that, right?
Nader Elm: That’s exactly right. And that’s the thing: we’re not smart enough to imagine every scenario and every kind of interesting thing that might happen in different industry verticals. In security you’re right, it might be, ‘Go fly, and if there’s something specific like a person that shouldn’t be there’ … and that’s another interesting thing: you can identify people, but now you’ve got to have enough contextual information to understand that they’re not supposed to be there.
John Koetsier: Yes.
Nader Elm: But there could be other applications as well that we just haven’t thought of. In the context of warehouses, for example, if you’re flying through a warehouse and you see a particular object that’s supposed to be there, say a forklift truck, you can figure out, okay, I’m going to work around that. But if it’s a person that’s not supposed to be in the aisle, then for safety reasons you can say, ‘I want you to back off. I want you to return to base.’ So these are all interesting things, and we want to enable experts and other companies to do that.
One way we look at it is basically we want to be the smartphone on which other people can develop their applications.
John Koetsier: Excellent.
Nader Elm: We just want to be the standard for that.
John Koetsier: Now, will you have an app store? So somebody who comes in with an application or a use model, whether it’s security or maybe a maintenance flight or something like that, can pick a pre-trained model and apply it, on an app store that you run?
Nader Elm: Well, the answer is yes. And the app store … where I’d like to get to is that we are not even in the loop. We just publish the APIs and people can plug into them. They’ll obviously get support from us, but more like a value-added reseller would: they would develop the applications, put them on, and then put them out. So we want to enable that opportunity to flourish, because I think there are going to be so many applications that we haven’t even dreamed of yet.
John Koetsier: Yes, absolutely.
Nader Elm: I think we want to just enable that.
John Koetsier: Absolutely. Talk to me about autonomy in drones in general. You’ve unlocked Level 4 as you’ve defined it. Who else is working on this? How close are they? Is this first-ever global?
Nader Elm: Uh, well I’m not gonna say— [crosstalk]
John Koetsier: Hard to define right? [laughter]
Nader Elm: Exactly, yeah. So I won’t say that we’re the only people working on it. And again, we were very, very thoughtful when we came up with a framework to make sure we weren’t bending it so that we could be the first or the best or anything like that. We want to be very honest about it.
So within the context of that, we believe that we’ve got the most advanced.
However, I’m sure other people are working on something similar; if they haven’t released it yet, they probably will. But there are very few companies working in the industrial space that have got something as robust as what we’ve developed. I’d like to think that we’re at least the first, but I’m sure other folks will be releasing something soon.
John Koetsier: There are a lot of people working on drone delivery programs, and those sometimes seem like the shiny objects where everybody gets a splashy press release out: ‘Delivered the pizza in Missouri with a drone,’ or something like that, right? And yet that’s an interesting scenario for the medium-term future. I’ve interviewed a few people who’ve done stuff like in-flight recharging and other things like that.
Nader Elm: That’s right.
John Koetsier: One would think this level of autonomy and awareness of your surroundings matters there, because in an outdoor setting things change. You can’t fly a predetermined path, because guess what? Trees grow, cars move, the wind comes up, clouds and fog happen, that sort of thing. How do you see this type of functionality impacting drone delivery?
Nader Elm: Specifically drone delivery, I mean, all of that regulation and FAA thing aside, just looking at the pure technical aspect of it … yeah, I mean, it is a definite interesting challenge because, to your point, you might’ve had that journey once before, but you can’t assume that the map is the same map every time you come back. So there’s got to be enough intelligence onboard the robot to be able to adapt to whatever the environment may be every time.
John Koetsier: Yeah.
Nader Elm: And that’s a principle we use in whatever situation we’re in. So even if there’s a repetitive flight in an environment — think of a warehouse — you can’t make the assumption that the environment hasn’t changed, but the mission is still the same. So we can adapt to that kind of an environment, and I think with delivery mechanisms it’s not just the tree or the new telephone pole that’s gone up — if people are still putting up telephone poles — but it’s going to be other extraneous things.
Like, for example, the dog that’s being set loose that’s chasing after the drone. You’ve got to be able to adapt safely to those kinds of situations as well. So, you know, there’s going to be how to get from the depot or the delivery truck, do the last mile, and then there’s going to be the landing site selection which is actually going to be very, very interesting. So that’s a whole other kind of thing.
So you get to the proximity where you’re supposed to deliver, and now it’s the last mile, and the final 50 feet is going to be the most treacherous. That’s going to be the most complex, and that’s where you need the most intelligence.
John Koetsier: I could totally buy that. I could totally see that. That’s always the most challenging. It’s interesting because some people that I’ve talked to said, ‘We’re not going to land. Because when you land that’s an FAA event,’ right? Landing and taking off is an FAA event and it requires a variety of bureaucracy and maybe checklists and start-up procedures or shut down procedures, that sort of thing, ‘so we’re going to drop and go.’
Nader Elm: Yeah. How would you feel about ordering your iPhone for dropship delivery literally?
John Koetsier: Literally dropship [laughter].
Nader Elm: Literally dropship it, yes.
John Koetsier: Guaranteed two feet or lower. It’s not… [laughter]
Nader Elm: Sure, exactly. And of course, knowing my luck, it would land on the concrete part of my patio.
John Koetsier: Shockingly, I’d be okay with it because, frankly, in the way they pack boxes these days, I’m pretty sure it’d be okay. But what’s interesting, you mentioned that environments are unpredictable.
We have two western red-tailed hawks nesting in a tree just outside my office here, and the crows are always coming and attacking them. I can imagine that happening to some drones as well, so the world is a very changeable place.
Nader Elm: Yeah, that’s a whole other level … avoiding attacking hawks. We’ve seen YouTube videos of hawks attacking drones. That’s a whole level of thing that I think we haven’t really started looking into yet; collectively, we need to.
John Koetsier: Yep. Yep. Let’s cast our eye forward a little bit and look out two, three years or so. You’ve come pretty far in just six months, because you unveiled a set of technologies that allow drones to go into mines, for example, and autonomously — semi-autonomously at that point — map out what’s going on, what’s happening, has the new shaft been driven through yet, other things like that, and return back this 3D map.
And now you’re announcing Level 4 Autonomy. So if we look three to five years out, what kinds of capabilities do you think you’ll be shipping at that point?
Nader Elm: So for our part, I mean, there’s a lot more to do in terms of making this more usable, more adaptable to different kinds of environments. So, the underground mining, we’ve actually seen some pretty incredible results where it’ll go into areas that you didn’t even think you needed to go into. That was actually one of the highlights of the video that we put out.
So, over time, I think there’s things that we want to do to make it faster, make it higher resolution, make it more accurate. We’re seeing a lot of interest, certainly now in terms of what we’re doing, but also requests to see more of the same and just better performance over time.
But the other thing we were kind of contemplating is basically the ability to have multiple robots collaborate with each other so you can scale the problem, both in terms of scale and scope. So you can have multiple identical robots on a mission, so you can actually now cover a larger area, but also have specialized robots that might be different. So, heterogeneous swarms so they can actually now have specialized tasks and collaborate with each other on a mission.
So we’re seeing a lot of that, plus there’s other capabilities that we’re going to be putting on board. When we talk about mapping, right now we’re doing geometric mapping. We’re going to be incorporating the video, the RGB data, but we’re also seeing requests for other kinds of data, other sensors to be carried, so now you’ve got a much more holistic view of the environment.
So imagine a photo-realistic, three dimensional representation onto which you can now put heat, humidity, gas readings, radiological readings, and so on. So depending on the sensors you’re carrying, you can get a much richer understanding of your environment.
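One way to picture that richer map is as a point cloud where each geometric point carries extra sensor channels alongside position and color. This is a minimal, hypothetical sketch; the field names and readings are illustrative, not from any real mapping format or Exyn product.

```python
# Hypothetical sketch: a geometric map augmented with sensor layers
# (heat, gas readings) as described above. Each point pairs position
# and color with whatever channels the carried sensors provide.
points = [
    {"xyz": (1.0, 2.0, 0.5), "rgb": (90, 90, 95), "temp_c": 21.5, "gas_ppm": 3},
    {"xyz": (1.2, 2.0, 0.5), "rgb": (88, 91, 94), "temp_c": 35.0, "gas_ppm": 48},
    {"xyz": (1.4, 2.1, 0.6), "rgb": (92, 89, 93), "temp_c": 22.0, "gas_ppm": 5},
]

# Query the fused map: flag locations where a non-visual channel is anomalous.
hot_spots = [p["xyz"] for p in points if p["temp_c"] > 30 or p["gas_ppm"] > 25]
print(hot_spots)  # [(1.2, 2.0, 0.5)]
```

The same query pattern extends to any added layer (humidity, radiological readings, and so on): the geometry stays constant while the understanding of the environment gets richer.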
John Koetsier: That is really, really interesting. I’m also super interested in the possibility of drones working together, and drones working together with robots. You can imagine a search and rescue scenario where you’ve got a fixed-wing drone which can go long distances, for a long period of time, perhaps at a higher rate of speed, and it identifies, ‘Hey, there’s a heat signature here. Go check that out in more detail at a lower pace. You can go down underneath the tree canopy; I can’t, right?
So, send those drones there.’ That gets really, really interesting because then you look at the possibility of having some kind of control software for your drone fleet or your robotic integrated fleet … do X, achieve Y, you know, that sort of thing. And the system sort of figures out the best way to make it happen.
Nader Elm: It goes back to exactly what you said in that you can instruct individuals to do one and then two and then three, or an objective for a team where you’re basically saying, ‘Here is what I’m trying to achieve … now leave it to the rest of the team to figure out how that’s going to be done, and break down the problem.’ And that is exactly what we’re trying to do.
John Koetsier: Excellent. Excellent. Well, thank you so much. This has been interesting and informative as per usual. Thank you for your time, Nader.
Nader Elm: No, pleasure to speak with you again, John.
Big reader, huh? Subscribe to the podcast already
Made it all the way down here? Who are you?!? 🙂
The TechFirst with John Koetsier podcast is about tech that is changing the world, and innovators who are shaping the future. Guests include former Apple CEO John Sculley, the head of Facebook gaming, Amazon’s head of robotics, GitHub’s CTO, Twitter’s chief information security officer, scientists inventing smart contact lenses, startup entrepreneurs, Google executives, former Microsoft CTO Nathan Myhrvold, and many more.
Subscribe on your podcast platform of choice: