3D Perception for Autonomous Navigation of a Low-Cost MAV using Minimal Landmarks
Abstract
We present an implementation of autonomous navigation for Micro Air Vehicles that is well suited to very inexpensive models: it relies only on a single camera and a few additional on-board sensors to solve the challenges of flight planning and collision avoidance. Artificial landmarks are required only in places where the further flight path is ambiguous, such as corridor crossings or junctions. There they provide topological localization, which enables our system to perform tasks such as waypoint following.
Even without any direct 3D sensor, our system is able to reconstruct metric distances from its monocular camera via two complementary methods: an oscillating motion pattern is superimposed on regular flight to reliably estimate up-to-date 3D positions of sparse image features. Alternatively, a specific flight maneuver can virtually create a vertical stereo camera, providing dense depth information for most pixels at a single point in time. The metric scale, which is unknown when employing a single camera, is determined by evaluating the additional sensors in a robust two-stage approach. We use the results from either method to traverse free space and avoid obstacles.
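To illustrate the virtual vertical stereo idea, the following minimal sketch (not the paper's implementation) recovers metric depth from the vertical disparity of a feature observed before and after a climb maneuver. The function name, the focal length f_px, and the baseline B_m are illustrative assumptions; in practice the baseline would have to be supplied in metric units, e.g. from the additional on-board sensors.

import numpy as np

def depth_from_vertical_stereo(v_top, v_bottom, f_px, B_m):
    """Depth of a feature seen at image row v_top in the upper view and
    v_bottom in the lower view of the virtual vertical stereo pair.
    f_px: focal length in pixels, B_m: vertical baseline in metres
    (both assumed known; illustrative values only)."""
    disparity = v_bottom - v_top      # vertical pixel disparity between the two views
    if disparity <= 0:
        return np.inf                 # no measurable disparity: feature effectively at infinity
    return f_px * B_m / disparity     # standard stereo relation Z = f * B / d

# Example: a 0.30 m climb, 525 px focal length, 15 px disparity -> ~10.5 m
print(depth_from_vertical_stereo(200.0, 215.0, f_px=525.0, B_m=0.30))

The same relation underlies the sparse variant: the oscillating motion supplies many small baselines over time, so each tracked feature accumulates several such depth estimates instead of one dense map per maneuver.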