PhD Thesis

Visual guidance of autonomous micro aerial vehicles


Information

  • Started: 12/04/2012
  • Thesis project read: 20/06/2014

Description

The main motivation of this Ph.D. thesis is the study and development of new perception techniques to model, identify and recognize the scenario, and their use in the guidance of micro aerial vehicles (MAVs). Specifically, some of these techniques will be tested in assembly operations performed by flying robots. These operations consist of autonomously building a structure from several bars and joints that must first be picked up, transported to the assembly area and assembled into the current structure. Our work concentrates on the perception issues pertaining to this task and does not cover the actual grasping and manipulation.
One of the tasks we will address is the self-positioning of the MAV in front of a target: the robot should be able to detect the object and hover precisely in order to grasp or release it. We will contribute new approaches to drive the flying robot towards a target using visual servoing techniques. We thus expect to contribute not only a new robust technique for uncalibrated cameras but also new control law proposals that exploit this visual information.
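As a rough illustration of the kind of control law involved, the sketch below implements a classical image-based visual servoing rule, v = -λ L⁺ (s - s*). It assumes calibrated, normalized image points with approximately known depths, so it is only a baseline for the uncalibrated techniques targeted here, and all function names are illustrative.

```python
import numpy as np

def interaction_matrix(points, depths):
    """Stack the classical 2x6 interaction matrices of normalized image
    points (x, y) with estimated depths Z."""
    rows = []
    for (x, y), Z in zip(points, depths):
        rows.append([-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y])
        rows.append([0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x])
    return np.asarray(rows)

def ibvs_twist(s, s_star, depths, gain=0.5):
    """Camera twist (vx, vy, vz, wx, wy, wz) from v = -gain * pinv(L) * (s - s*)."""
    error = (np.asarray(s) - np.asarray(s_star)).reshape(-1)
    L = interaction_matrix(s, depths)
    return -gain * np.linalg.pinv(L) @ error
```

The resulting twist is expressed in the camera frame; in a typical setup it would be transformed to the body frame and passed to the MAV velocity controller.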
An important aspect to consider when guiding a vehicle autonomously is the estimation of the robot odometry. During the task, the robot should be able to estimate its velocity and position using onboard sensors. Considering the MAV characteristics, this thesis aims to provide new sensor fusion techniques that efficiently estimate the vehicle odometry by filtering the data obtained from the cameras together with other onboard sensors such as an IMU.
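A minimal sketch of such a loosely coupled fusion is given below: a per-axis linear Kalman filter in which the IMU acceleration drives the prediction and a visual velocity estimate provides the correction. The structure and the noise values are illustrative placeholders, not the filter developed in the thesis.

```python
import numpy as np

class OdometryFilter:
    """Loosely coupled per-axis Kalman filter: IMU acceleration drives the
    prediction, visual-odometry velocity corrects it. Noise values are
    placeholders for illustration."""

    def __init__(self, dt, acc_noise=0.5, vel_meas_noise=0.05):
        self.x = np.zeros(2)                        # state: [position, velocity]
        self.P = np.eye(2)                          # state covariance
        self.F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity transition
        self.B = np.array([0.5 * dt**2, dt])        # IMU acceleration input
        self.Q = acc_noise**2 * np.outer(self.B, self.B)
        self.H = np.array([[0.0, 1.0]])             # only velocity is measured
        self.R = np.array([[vel_meas_noise**2]])

    def predict(self, imu_acc):
        self.x = self.F @ self.x + self.B * imu_acc
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, visual_velocity):
        y = visual_velocity - self.H @ self.x        # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P
```

In practice one such filter per axis (or a full 3D state including attitude) would run at the IMU rate, with a correction applied whenever a new image-based velocity estimate becomes available.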
In addition, flying with a suspended load is challenging because the load significantly changes the flight characteristics of the aerial vehicle, and the stability of the vehicle-load system must be preserved. It is therefore essential that the flying robot can minimize the effects of the arm on the flying system during the assigned maneuvers. Specifically, our objective is to research and develop new techniques that minimize these effects while visual guidance is performed, in order to improve flight behaviour.
Considering a task-oriented autonomous guidance, the robot must be localized in the scenario. To this end, two different solutions will be proposed: first, a coarse localization and mapping technique based on range readings provided by radio beacons; second, a fine localization technique using all available onboard sensors together with visual marker detections.
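For the coarse, range-only part, a simple baseline is to estimate the vehicle position by nonlinear least squares over the beacon ranges. The sketch below uses a Gauss-Newton iteration and assumes the beacon positions are already known, which is only a simplification of the joint localization-and-mapping problem stated above.

```python
import numpy as np

def trilaterate(beacons, ranges, x0=None, iters=10):
    """Coarse position estimate from range readings to radio beacons at known
    positions, via Gauss-Newton least squares (beacon map assumed known)."""
    beacons = np.asarray(beacons, dtype=float)   # (N, 3) beacon positions
    ranges = np.asarray(ranges, dtype=float)     # (N,)   measured distances
    x = np.mean(beacons, axis=0) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(iters):
        diff = x - beacons                       # (N, 3) vectors to beacons
        dist = np.linalg.norm(diff, axis=1)      # predicted ranges
        J = diff / dist[:, None]                 # Jacobian of range w.r.t. position
        r = dist - ranges                        # range residuals
        x -= np.linalg.lstsq(J, r, rcond=None)[0]
    return x
```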
Driving a free-flying aerial manipulator autonomously entails many challenging perception aspects. The objective of this work is to provide working solutions to some of them, namely:
- MAV odometry estimation using onboard sensors.
- New visual servoing techniques to drive the flying robot towards a target.
- Robot localization and mapping methods in order to guide the robot in the scenario.
- Control law proposals specifically designed for kinematically augmented MAVs.

The work is under the scope of the following projects:

  • ARCAS: Aerial Robotics Cooperative Assembly System
  • AEROARMS: AErial RObotics System integrating multiple ARMS and advanced manipulation capabilities for inspection and maintenance
  • RobInstruct: Instructing robots using natural communication skills