NVIDIA researchers have released technology that will “enable developers to create autonomous drones that can navigate complex, unmapped places without GPS,” says the company.
“One of the challenges that many companies have encountered is in flying in environments that are GPS denied,” Jesse Clayton, NVIDIA’s Senior Manager of Product for Intelligent Machines, tells DRONELIFE. “But AI can be used to navigate in these types of environments.”
It’s a significant step forward for the drone industry. Deep learning running onboard a drone, enabling precise navigation in cluttered environments without GPS, could have big implications for many applications, like package delivery, disaster response, and industrial uses such as mining and maritime operations.
NVIDIA’s Redtail demonstrates the technology by flying along a trail through a forest. “We chose forests as a proving ground because it’s one of the most difficult places to navigate,” said Nikolai Smolyanskiy, the team’s technical lead. “We figured if we could use deep learning to navigate in that environment, we could navigate anywhere.”
“Unlike a more urban setting, where there’s generally uniformity to the height of curbs, shape of mailboxes, and width of sidewalks, a forest is relatively chaotic. Trails in the woods often contain no markings. Light filtered through the leaves can range from bright sunlight to dark shadows, and trees vary in height, width, angle, and branching.”
All of this is done through deep learning and computer vision powered by NVIDIA Jetson TX1/TX2 embedded AI supercomputers. The Jetson, only the size of a credit card, is a powerful computing system that is lightweight and compact enough to be ideal for edge AI applications.
The Redtail can fly along forest trails autonomously, achieving record-breaking long-range flights of more than one kilometer (about six-tenths of a mile) in the lower forest canopy. The drone successfully avoids obstacles while maintaining a steady position in the center of the trail.
The team has released the deep learning models and code on GitHub as an open source project, so that the robotics community can use them to build smarter mobile robots. The technology can turn any drone into an autonomous one, capable of navigating along roads, forest trails, tunnels, under bridges, and inside buildings by relying only on visual sensors. All that’s needed is a path the drone can recognize visually.
The framework consists of Robot Operating System (ROS) nodes and includes a deep neural network (DNN) called TrailNet, which estimates the drone’s orientation and lateral offset with respect to the navigation path. The provided control system uses that estimated pose to fly the drone along the path.
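To make that division of labor concrete, here is a minimal sketch, in plain Python, of how a controller might turn a TrailNet-style pose estimate, the probability that the drone is rotated left/center/right relative to the trail and offset left/center/right from its center line, into a single steering command. The three-class layout, gain values, and function names are illustrative assumptions for this article, not code from NVIDIA’s released project.

```python
# Hypothetical illustration of the TrailNet -> controller hand-off; the
# class layout and gains below are assumptions, not the Redtail code.

def steering_command(rotation_probs, offset_probs,
                     k_rotation=1.0, k_offset=0.5):
    """Convert pose estimates into a turn command (positive = turn right).

    rotation_probs: probabilities that the drone is facing
                    (left, straight, right) relative to the trail.
    offset_probs:   probabilities that the drone has drifted
                    (left, center, right) of the trail's center line.
    """
    p_face_left, _, p_face_right = rotation_probs
    p_off_left, _, p_off_right = offset_probs

    # If the drone is facing left of the trail direction, turn right to
    # realign; if it has drifted to the left side, steer right toward
    # the center line, and vice versa.
    return (k_rotation * (p_face_left - p_face_right)
            + k_offset * (p_off_left - p_off_right))


if __name__ == "__main__":
    # Example: the network believes the drone is rotated slightly left
    # and has drifted left of center, so the command is a right turn.
    print(steering_command((0.6, 0.3, 0.1), (0.5, 0.4, 0.1)))  # 0.7
```

In the actual system a command like this would be published as a ROS message to the flight controller; the gains and sign conventions above are purely for illustration.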
“The use of special training techniques has allowed us to achieve smooth and stable autonomous flights without sudden movements that would make it wobble,” said NVIDIA deep learning expert Alexey Kamenev.
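Kamenev doesn’t spell out those training techniques, but one common way to avoid the over-confident, all-or-nothing class predictions that translate into jerky steering is to soften the training targets (label smoothing). The short sketch below illustrates that general idea in plain Python; it is an assumption offered for illustration, not the Redtail training code.

```python
import math

def smooth_labels(one_hot, epsilon=0.1):
    """Spread a small fraction of the probability mass over all classes."""
    n = len(one_hot)
    return [(1.0 - epsilon) * t + epsilon / n for t in one_hot]

def cross_entropy(target_probs, predicted_probs):
    """Cross-entropy between a target distribution and a prediction."""
    return -sum(t * math.log(max(p, 1e-12))
                for t, p in zip(target_probs, predicted_probs))

if __name__ == "__main__":
    hard_target = [0.0, 1.0, 0.0]             # e.g. "the trail is straight ahead"
    soft_target = smooth_labels(hard_target)  # ~[0.033, 0.933, 0.033]

    overconfident = [0.01, 0.98, 0.01]
    calibrated    = [0.05, 0.90, 0.05]

    # With hard targets the near-certain prediction has the lower loss;
    # with smoothed targets the more moderate prediction wins, which
    # nudges the network toward outputs that shift gradually from frame
    # to frame instead of flipping abruptly between classes.
    print(cross_entropy(hard_target, overconfident), cross_entropy(hard_target, calibrated))
    print(cross_entropy(soft_target, overconfident), cross_entropy(soft_target, calibrated))
```

Smoother, better-calibrated probabilities matter here because the steering command is computed directly from them, so gradual changes in the network’s output produce gradual corrections in flight.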
Miriam McNabb is the Editor-in-Chief of DRONELIFE and CEO of JobForDrones, a professional drone services marketplace, and a fascinated observer of the emerging drone industry and the regulatory environment for drones. Miriam has penned over 3,000 articles focused on the commercial drone space and is an international speaker and recognized figure in the industry. Miriam has a degree from the University of Chicago and over 20 years of experience in high tech sales and marketing for new technologies.