(Source: sciencedaily.com)
Mexican researcher José Martínez developed an innovative method to estimate the position and orientation of the vehicle, allowing it to recognize its environment and thus replace the GPS location system with low-cost sensors such as accelerometers, gyroscopes and video cameras.
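To picture the kind of low-cost sensor fusion described above, here is a minimal, hypothetical sketch of a complementary filter that blends gyroscope and accelerometer readings into an orientation estimate. The function name and the blending weight are illustrative assumptions, not the researcher's actual algorithm.

```python
import math

def complementary_filter(pitch, roll, gyro, accel, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into a pitch/roll estimate.

    gyro  = (gx, gy) angular rates in rad/s
    accel = (ax, ay, az) accelerations in m/s^2
    alpha weights the gyro integration against the accelerometer tilt reference.
    """
    # Integrate gyro rates: responsive but drifts over time.
    pitch_gyro = pitch + gyro[0] * dt
    roll_gyro = roll + gyro[1] * dt

    # Derive a drift-free but noisy tilt reference from gravity in the accelerometer.
    ax, ay, az = accel
    pitch_acc = math.atan2(-ax, math.sqrt(ay**2 + az**2))
    roll_acc = math.atan2(ay, az)

    # Blend the two sources: gyro for responsiveness, accelerometer to cancel drift.
    pitch = alpha * pitch_gyro + (1 - alpha) * pitch_acc
    roll = alpha * roll_gyro + (1 - alpha) * roll_acc
    return pitch, roll
```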
The main idea was to avoid GPS by using video cameras mounted on board the vehicle to capture visual information, and to apply an algorithm that uses that information to locate and orient the drone during flight. To do this, a function was also adapted that lets the user draw a specific route on an aerial-view map, similar to Google Maps, which the drone then follows autonomously to a particular destination.
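As a rough illustration of what such a drawn route might look like in software, the following hypothetical sketch represents it as a list of waypoints in a local map frame; the `Waypoint` fields and coordinates are invented for the example and are not taken from the project.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Waypoint:
    x: float    # metres east of the take-off point on the aerial-view map
    y: float    # metres north of the take-off point
    alt: float  # metres above the take-off point

# Hypothetical route: each point corresponds to a click on the map widget.
route: List[Waypoint] = [
    Waypoint(0.0, 0.0, 20.0),
    Waypoint(15.0, 5.0, 20.0),
    Waypoint(30.0, 12.0, 20.0),
]

def next_waypoint(route: List[Waypoint], current: int) -> Optional[Waypoint]:
    """Return the next point to fly to, or None when the route is finished."""
    return route[current + 1] if current + 1 < len(route) else None
```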
This knowledge was developed in the projects "Precise Navigation of UAVs in Complex Environments (PUCE)" and "SMART Boomerang," work carried out during his postdoc at the University of Bristol in collaboration with the British company Blue Bear Ltd, which provided the drones and control algorithms. Funding was obtained from Innovate UK and the Defence Science and Technology Laboratory (DSTL), British government agencies that finance technological innovation projects.
“Upon completion of these projects, I returned to Mexico as a full-time researcher at INAOE, where I won a Royal Society-Newton Advanced Fellowship, funding awarded by the Royal Society, Britain's academy of sciences. This will allow me to perform basic science research focused on aerial robotics,” said the researcher, who also holds a master's degree in computer science.
The project for which funding was obtained is called “RAFAGA: Robust Autonomous Flight of unmanned aerial vehicles in GPS-denied outdoor areas.” Its main objective is to investigate methods for autonomous drone flight in outdoor environments, where challenges such as wind currents arise, in areas that have no GPS signal and only limited onboard computational processing capability.
“In the repeating stage, the pilot just makes the drone take off, but once it is in the air, the autonomous flight algorithms kick into action and, by processing the visual information captured by the camera, the vehicle recognizes where it is positioned in the environment,” said the INAOE researcher.
Once the vehicle has recognized its location, its position is estimated from the visual information and sent to the control algorithms responsible for moving the drone, so that it navigates to each of the points in the route recorded during the teaching stage.
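The repeat stage described above can be pictured with a short, hypothetical loop: a placeholder `localize()` stands in for the visual localization and `controller.goto()` for the control algorithms, since the article does not describe the actual interfaces.

```python
def repeat_route(route, localize, controller, tolerance=1.0):
    """Fly a previously taught route using only visual localization.

    localize()            -> estimated (x, y) position from the onboard camera
    controller.goto(p)    -> commands the flight controller toward point p
    Both are placeholders for whatever vision and control stack is in use.
    """
    for target in route:
        while True:
            position = localize()                    # pose estimate from visual information
            dx = target[0] - position[0]
            dy = target[1] - position[1]
            if (dx**2 + dy**2) ** 0.5 < tolerance:   # waypoint reached
                break
            controller.goto(target)                  # hand the estimate to the control layer
```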
Software for a ground control station was also developed, which receives the drone's video transmission in real time through the inspection camera responsible for taking the photos or videos needed to detect fractures or flaws in structures.
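Purely as an illustration, a ground-station viewer that receives the video stream in real time could look like the following sketch using OpenCV; the stream URL and the use of RTSP are assumptions, not details from the project.

```python
import cv2  # OpenCV is one common choice for a simple ground-station viewer

# Hypothetical stream address; the real link depends on the drone's radio and video hardware.
STREAM_URL = "rtsp://192.168.1.10:8554/inspection"

def view_stream(url: str = STREAM_URL) -> None:
    """Display frames from the drone's inspection camera as they arrive."""
    capture = cv2.VideoCapture(url)
    while capture.isOpened():
        ok, frame = capture.read()
        if not ok:
            break                                   # stream dropped or ended
        cv2.imshow("Inspection camera", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break                                   # press 'q' to stop viewing
    capture.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    view_stream()
```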
The INAOE researcher reports that RAFAGA was made possible by the Newton Fund, which aims to foster collaboration between the UK and developing countries in scientific and technological research. In this case, the funding, which ends in February 2017, was awarded in partnership with CONACYT and the Mexican Academy of Sciences.
Continue Reading at sciencedaily.com…
Alan is a serial entrepreneur, active angel investor, and a drone enthusiast. He co-founded DRONELIFE.com to address the emerging commercial market for drones and drone technology. Prior to DRONELIFE.com, Alan co-founded Where.com, ThinkingScreen Media, and Nurse.com. Recently, Alan co-founded Crowditz.com, a leader in Equity Crowdfunding Data, Analytics, and Insights. Alan can be reached at alan(at)dronelife.com