Making Small Unmanned Systems do Big Things
Guidance, navigation, and control challenges for Small Unmanned Aerial Systems (SUASs) are among the most significant barriers to practical use of these systems, whether military or civilian. Consider two examples: (1) the reliability of unmanned aircraft flight control is currently far lower than that of equivalent manned aircraft; (2) the ability of SUASs to autonomously avoid obstacles or other aircraft effectively is currently very limited. Addressing these and related challenges will improve performance and remove barriers to commercial use of SUASs and to military use of guided munitions, SUAS, and UAS in desired scenarios. This presentation will address recent progress in the areas of UAS guidance, navigation, and control, with emphasis on adaptive flight control, vision-aided inertial navigation, and laser-aided inertial navigation. This will include progress in both theory and related flight test validation on unmanned research aircraft at Georgia Tech and elsewhere.
Eric Johnson has a diverse background in guidance, navigation, and control, including applications such as airplanes, helicopters, submarines, and launch vehicles. He received a B.S. degree from the University of Washington, M.S. degrees from MIT and The George Washington University, and a Ph.D. from Georgia Tech, all in Aerospace Engineering. He also has five years of industry experience at Lockheed Martin and Draper Laboratory. He joined the Georgia Tech faculty in 2000, performing research in adaptive flight control, navigation, embedded software, and autonomous systems as the Director of the Unmanned Aerial Vehicle Research Facility. He was the lead system integrator for rotorcraft experiments and demonstrations for the DARPA Software Enabled Control program, which included the first air launch of a hovering aircraft and adaptive flight control of a helicopter with simulated actuator failures. He was the principal investigator of the Active Vision Control Systems AFOSR project, developing methods that use 2-D and 3-D imagery to enable aerial vehicles to operate in uncertain, complex 3-D environments, which included the first purely vision-based autonomous formation flights.