from expressnews.com
With sensors covering his head, University of Texas at San Antonio graduate student Mauricio Merino concentrated hard. A camouflage-colored drone hovered with a soft hum in the middle of a campus research lab.
For now, though, it was fellow graduate student Prasanna Kolar who stood nearby to operate the unmanned aerial vehicle, also called a UAV, with a cellphone app — gently commanding it left and right.
The ultimate goal: Create a process to control the movements of groups of drones with nothing more than a thought, said Daniel Pack, chairman of UTSA’s electrical and computer engineering department.
The newly launched research draws on two separate sources of funding. A team of researchers from the Unmanned Systems Laboratory in the university’s electrical and computer engineering department recently won a $300,000 contract from the Office of the Secretary of Defense to investigate how soldiers could use their brain signals to operate drones for intelligence, surveillance and reconnaissance missions.
A separate $400,000 Defense Department grant allowed the school to buy two high-performance electroencephalogram, or EEG, systems. These provide a noninvasive way to measure brain waves.
Six professors in various departments, including the drone researchers, will use the EEG systems for projects studying brain-machine interaction.
Pack said his research might help the Army lighten an already heavy load for soldiers in the field.
“It becomes more burdensome to ask them to carry more things,” Pack said. “You have to have a computer or a mechanism that you use to control the UAVs. But if you can do this without having them actually carry additional equipment and using brainwaves directly, then you are helping our soldiers.”
Pack envisions drone operators wearing EEG sensors in their helmets and giving commands far more complicated than a simple “move left” or “move right.”
For instance, he wants a soldier in the field to be able to scout for enemies by commanding a group of drones to “go over the hill and see what’s up there.” Then the soldier might receive information back from the drones through something akin to Google Glass.
“Multiple UAVs will autonomously, amongst themselves, say, ‘You do this. I do this.’ And they will either take pictures of it or get a situational awareness” of what lies behind the hill, all because of a command from a single thought, Pack said.
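To give a rough sense of how a swarm might split up such a task on its own, the sketch below is a purely illustrative Python example, not the UTSA team's software: a handful of hypothetical drones each claim the nearest unclaimed lookout point behind the hill. All names, positions and the greedy assignment rule are assumptions made for illustration.

```python
import math

# Hypothetical drone positions and lookout points (x, y in meters);
# these values are invented for illustration, not from the UTSA project.
drones = {"uav1": (0, 0), "uav2": (5, 0), "uav3": (10, 0)}
lookout_points = [(2, 50), (8, 55), (14, 52)]  # spots "over the hill"

def assign_tasks(drones, points):
    """Greedily pair each drone with the nearest unclaimed lookout point."""
    assignments = {}
    remaining = list(points)
    for name, pos in drones.items():
        target = min(remaining, key=lambda p: math.dist(pos, p))
        remaining.remove(target)
        assignments[name] = target
    return assignments

print(assign_tasks(drones, lookout_points))
# e.g. {'uav1': (2, 50), 'uav2': (8, 55), 'uav3': (14, 52)}
```

A real swarm would negotiate these assignments among the drones themselves and weigh battery life, sensor coverage and obstacles, but the basic idea is the same: one high-level command, divided automatically into individual jobs.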
People may have different brain waves for the same command, so researchers will have to “minimize the differences and maximize the similarities” between brain waves and come up with ways to interpret them into machine commands to make the concept work, Pack said.
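In machine-learning terms, that usually means extracting features from the EEG signal and training a classifier that maps them onto a small set of commands. The following is a minimal, hypothetical sketch, not the UTSA team's method: a nearest-centroid rule that matches each new brain-wave feature vector to whichever command's average training pattern it most resembles. The feature values and labels are invented for illustration.

```python
import numpy as np

# Hypothetical training data: rows are EEG feature vectors (e.g. band power
# per sensor), labels are the commands the subject was thinking at the time.
X_train = np.array([[0.9, 0.1], [0.8, 0.2],   # "left"
                    [0.1, 0.9], [0.2, 0.8]])  # "right"
y_train = np.array(["left", "left", "right", "right"])

# "Maximize the similarities": average the training examples for each command.
centroids = {cmd: X_train[y_train == cmd].mean(axis=0)
             for cmd in np.unique(y_train)}

def decode(features):
    """Map a new EEG feature vector to the closest command centroid."""
    return min(centroids,
               key=lambda cmd: np.linalg.norm(features - centroids[cmd]))

print(decode(np.array([0.85, 0.15])))  # -> "left"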
Continue Reading at expressnews.com…
Alan is a serial entrepreneur, active angel investor, and a drone enthusiast. He co-founded DRONELIFE.com to address the emerging commercial market for drones and drone technology. Prior to DRONELIFE.com, Alan co-founded Where.com, ThinkingScreen Media, and Nurse.com. Recently, Alan co-founded Crowditz.com, a leader in Equity Crowdfunding Data, Analytics, and Insights. Alan can be reached at alan(at)dronelife.com