High Speed Event-Driven Robot Vision
(id 17431 MEEC, id 17432 MEAer)
Description
Consumer cameras capture full images at a few tens or hundreds of frames per
second. While this is convenient for recording and playing back videos for
human visualisation, it is not very efficient for robotics. First, these
cameras send massive amounts of information per second to the robot's
processing system, most of which contains no relevant information. Second, if
a relevant event happens at a certain time, the robot can only become aware of
it after the full image is acquired, transmitted and processed by the robot's
brain.
These facts have led to the creation of event-based cameras, also known as
Dynamic Vision Sensors (DVS). These cameras do not send full frames. Instead,
they stream, at a very high rate, the coordinates of pixels as soon as their
brightness changes beyond a threshold. This is closer to the way human vision
works and allows latencies on the order of a microsecond for detecting changes
in the visual field.
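To make the sensing model concrete, the per-pixel behaviour described above can be emulated in a few lines of C++. The event fields (coordinates, microsecond timestamp, polarity) and the threshold value below are illustrative assumptions, not the interface of any particular camera SDK:

```cpp
#include <cmath>
#include <cstdint>
#include <vector>

// A single DVS event: pixel coordinates, timestamp in microseconds and
// polarity (+1 for a brightness increase, -1 for a decrease).
struct Event {
    uint16_t x, y;
    int64_t t_us;
    int8_t polarity;
};

// Emulate the per-pixel rule of an event camera on two successive
// log-intensity images stored row-major: a pixel fires an event whenever
// its log-brightness changes by more than `threshold` (assumed value).
std::vector<Event> emulate_events(const std::vector<double>& prev_log,
                                  const std::vector<double>& curr_log,
                                  int width, int64_t t_us,
                                  double threshold = 0.2) {
    std::vector<Event> events;
    for (std::size_t i = 0; i < curr_log.size(); ++i) {
        double diff = curr_log[i] - prev_log[i];
        if (std::abs(diff) > threshold) {
            events.push_back({static_cast<uint16_t>(i % width),
                              static_cast<uint16_t>(i / width),
                              t_us,
                              static_cast<int8_t>(diff > 0 ? 1 : -1)});
        }
    }
    return events;
}
```

Note that only the changed pixels produce output: a static scene generates no data at all, which is the bandwidth advantage discussed above.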
Although this makes a better sensor for robot vision, having a pixel stream
instead of full frames changes the image-processing paradigm, and new
algorithms are being developed to implement many kinds of robot visual skills.
In this project, we propose to develop pixel-stream algorithms for an
important robot skill: visual odometry. Visual odometry gives robots a notion
of their own instantaneous linear and angular positions and velocities, which
is an important step towards highly dynamic robots that navigate safely in
their environment (e.g. drones).
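A common building block in event-based pipelines of this kind is the "time surface": a frame-like map that scores each pixel by how recently it fired, so that moving edges stand out for downstream feature tracking. The sketch below is one simple variant under assumed parameters (the decay constant tau_us is a tuning choice, not a value from this proposal); it reuses the event fields x, y and t_us assumed earlier:

```cpp
#include <cmath>
#include <cstdint>
#include <vector>

// Same illustrative event layout as before: pixel coordinates,
// microsecond timestamp, polarity.
struct Event {
    uint16_t x, y;
    int64_t t_us;
    int8_t polarity;
};

// Build a time surface at time t_now_us: each pixel gets an exponentially
// decayed score of how recently it last fired. tau_us controls how fast
// old events fade (assumed default for illustration).
std::vector<double> time_surface(const std::vector<Event>& events,
                                 int width, int height,
                                 int64_t t_now_us, double tau_us = 10000.0) {
    std::vector<int64_t> last(static_cast<std::size_t>(width) * height, -1);
    for (const Event& e : events)  // keep the latest timestamp per pixel
        last[static_cast<std::size_t>(e.y) * width + e.x] = e.t_us;

    std::vector<double> surface(last.size(), 0.0);
    for (std::size_t i = 0; i < last.size(); ++i)
        if (last[i] >= 0)  // pixels that never fired stay at 0
            surface[i] = std::exp(-double(t_now_us - last[i]) / tau_us);
    return surface;
}
```

Because the surface is an ordinary dense array, classical frame-based tools can be applied to it while still exploiting the microsecond timing of the raw event stream.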
References:
Introduction to event cameras:
http://rpg.ifi.uzh.ch/research_dvs.html
Requirements (grades, required courses, etc):
Candidates for this work must have a good knowledge of image processing and vision, robotics, and control, and must be comfortable programming in Matlab and C/C++.
Place for conducting the work-proposal:
ISR / IST
More information in:
http://users.isr.tecnico.ulisboa.pt/~jag