Motion control using optical flow of sparse image features

Files
Seebacher_bu_0017N_10978.pdf (12.09 MB)
Main dissertation
Seebacher_bu_0017N_429/AR_Drone_Flight_Test_Indoor_01.mov (18.33 MB)
Seebacher_bu_0017N_429/AR_Drone_Flight_Test_Outdoor_01.mov (43.76 MB)
Seebacher_bu_0017N_429/AR_Drone_Flight_Test_Outdoor_02-1_Drone.mov (41.91 MB)
Seebacher_bu_0017N_429/AR_Drone_Flight_Test_Indoor_High_Skew.mov (7.76 MB)
Date
2015
Authors
Seebacher, J. Paul
Abstract
Reactive motion planning and local navigation remain a significant challenge in the motion control of robotic vehicles. This thesis presents new results on vision-guided navigation using optical flow. By detecting key image features, calculating optical flow, and leveraging time-to-transit (tau) as a feedback signal, control architectures can steer a vehicle so as to avoid obstacles while simultaneously using them as navigation beacons. Averaging and balancing tau over multiple image features successfully guides a vehicle along a corridor while avoiding looming objects in the periphery. In addition, the averaging strategy de-emphasizes noise associated with rotationally induced flow fields, mitigating the risk of positive feedback akin to the Larsen effect. A recently developed, biologically inspired binary keypoint description algorithm, FREAK (Fast Retina Keypoint), offers processing speed-ups that make vision-based feedback signals achievable. A Parrot AR.Drone 2.0 has proven to be a reliable platform for testing the architecture and has demonstrated the control law's effectiveness in using time-to-transit calculations for real-time navigation.
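
To make the tau-balancing strategy concrete, the sketch below shows one way such a feedback signal could be computed from sparse-feature optical flow. This is an illustrative reconstruction under stated assumptions, not the thesis's implementation: the feature offsets and image velocities are assumed to come from an upstream tracker (e.g., FREAK keypoints matched across frames), and the function name, gain k, and sign convention are hypothetical.

import numpy as np

def tau_balance_yaw(xs, x_dots, k=0.5, eps=1e-3):
    """Yaw command that balances mean time-to-transit (tau) between
    the left and right halves of the image.

    xs     : feature offsets from the focus of expansion (pixels);
             negative values lie in the left half of the image
    x_dots : horizontal optical-flow rates (pixels/s)
    Returns a yaw-rate command; positive means yaw right here.
    """
    xs = np.asarray(xs, dtype=float)
    x_dots = np.asarray(x_dots, dtype=float)

    # tau = x / x_dot: time until a feature at offset x, moving at
    # rate x_dot, transits out of the field of view (pure translation).
    moving = np.abs(x_dots) > eps            # drop nearly static features
    tau = xs[moving] / x_dots[moving]
    offs = xs[moving]

    looming = tau > 0                        # keep outward-moving features
    tau_left = tau[looming & (offs < 0)]
    tau_right = tau[looming & (offs > 0)]
    if tau_left.size == 0 or tau_right.size == 0:
        return 0.0                           # too little flow to steer on

    # Averaging over many features de-emphasizes per-feature noise;
    # the vehicle yaws away from the side whose features transit sooner.
    return k * (tau_right.mean() - tau_left.mean())

For example, if obstacles on the left are close, left-half features transit quickly (small tau), the command is positive, and the vehicle yaws right, back toward the corridor's center.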