Figure: Pioneer P3-AT mobile robot (rightmost photos: config-1 at ISR/IST)

 


MSc dissertation proposal 2010/2011

 

Vision-based Motion Control of a Mobile Robot

 

 

Introduction:

 

"Although a few of the robots of tomorrow may resemble the anthropomorphic devices seen in Star Wars, most will look nothing like the humanoid C-3PO. In fact, as mobile peripheral devices become more and more common, it may be increasingly difficult to say exactly what a robot is. Because the new machines will be so specialized and ubiquitous--and look so little like the two-legged automatons of science fiction--we probably will not even call them robots. But as these devices become affordable to consumers, they could have just as profound an impact on the way we work, communicate, learn and entertain ourselves as the PC has had over the past 30 years."

Excerpt from "A Robot in Every Home", Bill Gates, ScientificAmerican.com, January 2007.

 

Future robots will be more than mere tools doing isolated tasks: they will be quasi-team members whose tasks have to be integrated with those of humans or other robots [HRI2007]. One of the key aspects of this integration is collision-free motion among the various robots and people working in the environment. This work is therefore mainly concerned with designing motion controllers capable of responding efficiently, while considering the robots' own dynamic limitations and the constraints on motion imposed by other robots or people.

 

 

Objectives:

 

The objectives of this work are twofold: (i) visual tracking of patterns by their colours and/or shapes; (ii) controlling a unicycle-type robot so that it reaches the visually tracked target pattern. At the end, the work is expected to demonstrate point-to-point navigation by a mobile robot under the view of vision sensors.

 

 

Detailed description:

 

Controlling the motion of a mobile robot is still a very active research-and-development topic, mainly because of the large variety of applications, which intrinsically implies a large diversity of locomotion methods and sensors. In this work-proposal the robots are of the very common unicycle type, i.e. carts or cars moving in a two-dimensional world, having two parallel driven wheels, one mounted on each side of their centre, and one or two offset castors to maintain balance. These vehicles allow simultaneous arbitrary rotation and translation, being constrained only in the sideways direction (sideways motion cannot be instantaneous). The main sensors to be used are computer vision and odometry.
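
For reference, the sketch below shows the standard kinematic model of such a unicycle-type vehicle, integrated with a simple Euler step. This is only a minimal illustration: the state (x, y, theta), the inputs v (linear velocity) and omega (angular velocity), and the step size dt follow the usual convention and are not tied to any particular robot.

    import numpy as np

    def unicycle_step(state, v, omega, dt):
        """One Euler-integration step of the unicycle kinematics.

        state: (x, y, theta) -- planar position [m] and heading [rad]
        v:     linear velocity command [m/s]
        omega: angular velocity command [rad/s]
        dt:    integration step [s]
        """
        x, y, theta = state
        x += v * np.cos(theta) * dt    # motion only along the current heading:
        y += v * np.sin(theta) * dt    # this is the non-holonomic constraint
        theta += omega * dt
        return np.array([x, y, theta])

Note that no sideways term appears in the model, which is precisely the constraint on sideways motion mentioned above.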

 

More specifically, the work-proposal consists of controlling one unicycle robot to navigate to a target location following a reference path [Carona08]. The target location is defined by an off-board video camera. The path following, or tracking, can be aided by both the off-board and on-board sensors. The controller must consider the dynamic constraints of the mobile robot and re-adjust the motion according to unexpected obstacles (e.g. other robots or people). In other words, the on-board sensors will be allowed, to a certain extent, to redefine the path to follow, based on the characteristics of the robot and the state of the environment. The control methodologies may involve visual path following [Gaspar00], visual servoing [VISP], or dynamic control [Aguiar04], whichever is considered most adequate for the tasks at hand.
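
Purely as an illustration of the kind of control law involved, and not the controller of [Carona08], the sketch below gives a simple proportional go-to-goal law for the unicycle. The gains k_v and k_w and the saturation limits v_max and w_max are hypothetical placeholders that stand in for the robot's dynamic constraints and would have to be tuned on the real vehicle.

    import numpy as np

    def go_to_goal(state, goal, k_v=0.5, k_w=1.5, v_max=0.4, w_max=1.0):
        """Proportional go-to-goal law for a unicycle (illustrative sketch).

        state: (x, y, theta) current pose; goal: (x_g, y_g) target point.
        Returns saturated commands (v, omega); the saturations are a simple
        stand-in for the robot's velocity limits.
        """
        x, y, theta = state
        dx, dy = goal[0] - x, goal[1] - y
        rho = np.hypot(dx, dy)                            # distance to the target
        alpha = np.arctan2(dy, dx) - theta                # heading error
        alpha = np.arctan2(np.sin(alpha), np.cos(alpha))  # wrap to [-pi, pi]
        v = np.clip(k_v * rho * np.cos(alpha), -v_max, v_max)
        omega = np.clip(k_w * alpha, -w_max, w_max)
        return v, omega

The cos(alpha) factor slows the forward motion when the target lies far off the current heading, so the robot first turns towards the goal and then drives to it.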

 

The work is therefore organized in the following main steps:

1) visual tracking of the patterns marking the robot and the target, based on colours and/or shapes (see the colour-segmentation sketch after this list)

2) combining the sensor information in common reference frames, using information fusion filters (see the fusion sketch after this list)

3) controller design for the unicycle-type robot, in order to move it towards the target
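
For step 1, a minimal colour-segmentation sketch using the OpenCV library [OpenCV] is given below. The HSV thresholds correspond to a hypothetical blue-ish marker and would have to be calibrated for the actual pattern colours and lighting; the function and parameter names are illustrative only.

    import cv2
    import numpy as np

    def track_colour_pattern(frame_bgr,
                             hsv_lo=(100, 120, 70), hsv_hi=(130, 255, 255)):
        """Locate a coloured marker in one camera frame (illustrative sketch).

        frame_bgr: image as returned by cv2.VideoCapture.read()
        hsv_lo, hsv_hi: hypothetical HSV bounds for a blue-ish marker
        Returns the (u, v) pixel centroid of the largest blob, or None.
        """
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        # OpenCV 4.x return signature: (contours, hierarchy)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        blob = max(contours, key=cv2.contourArea)
        m = cv2.moments(blob)
        if m["m00"] == 0:
            return None
        return (m["m10"] / m["m00"], m["m01"] / m["m00"])

Shape cues (e.g. the area or circularity of the detected blob) can be added on top of the colour mask to make the tracking more robust.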
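
For step 2, the sketch below fuses odometry displacements with camera-based position fixes using a linear Kalman filter on the planar position, expressed in a common world frame. The covariances Q and R are hypothetical placeholders; a full solution would typically also estimate the heading, e.g. with an extended Kalman filter.

    import numpy as np

    class PositionFusionKF:
        """Toy Kalman filter fusing odometry with off-board camera fixes.

        The state is the planar position (x, y) in the common world frame;
        Q and R are placeholder process/measurement covariances.
        """
        def __init__(self, x0, P0=np.eye(2),
                     Q=0.01 * np.eye(2), R=0.05 * np.eye(2)):
            self.x = np.asarray(x0, dtype=float)   # position estimate
            self.P = P0                            # estimate covariance
            self.Q, self.R = Q, R

        def predict(self, odom_dxdy):
            """Propagate with the displacement reported by odometry."""
            self.x = self.x + np.asarray(odom_dxdy, dtype=float)
            self.P = self.P + self.Q

        def update(self, cam_xy):
            """Correct with a position measurement from the off-board camera."""
            z = np.asarray(cam_xy, dtype=float)
            S = self.P + self.R                    # innovation covariance (H = I)
            K = self.P @ np.linalg.inv(S)          # Kalman gain
            self.x = self.x + K @ (z - self.x)
            self.P = (np.eye(2) - K) @ self.P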

 

References:

 

[Carona08] "Control of Unicycle Type Robots: Tracking, Path Following and Point Stabilization", Ricardo Carona, A. Pedro Aguiar, José Gaspar, in Proc. of IV Jornadas de Engenharia Electrónica e Telecomunicações e de Computadores, pp. 180-185, November 2008, Lisbon, Portugal.

 

[HRI2007] 2nd ACM/IEEE International Conference on Human-Robot Interaction, http://hri2007.org/

 

[Aguiar04] "Pose Estimation of Autonomous Vehicles using Visual Information: A Minimum-Energy Estimator Approach", A. Pedro Aguiar and João P. Hespanha, in Proc. of IAV2004 - 5th IFAC/EURON Symposium on Intelligent Autonomous Vehicles, Lisbon, Portugal, Jul. 2004.

 

[Gaspar00] "Vision-based Navigation and Environmental Representations with an Omnidirectional Camera", José Gaspar, Niall Winters, José Santos-Victor, IEEE Transactions on Robotics and Automation, Vol. 16, No. 6, December 2000.

 

[ISR-galery] Some images of unicycle-type robots at ISR: see labmate, scouts, pioneers, ... in "Mini galeria de fotografias de projectos @ IST/ISR", http://users.isr.ist.utl.pt/~jag/infoforum/isr_galeria/

 

[OpenCV] Open Computer Vision Library, http://sourceforge.net/projects/opencv/

 

[VISP] Visual servoing videos, http://www.irisa.fr/lagadic/visp/video.html

 

 

Expected results:

 

At the end of the work, the students will have enriched their knowledge of:

* Computer vision

* Control of unicycle type vehicles

 

Examples of expected demonstrations in simulated and/or real environments:

* remotely controlling the unicycle-type robot by pointing to a reference target-object to be reached by the robot

* guiding the robot towards a target, while the robot negotiates its own navigation in order to avoid collisions with other robots

 

 

Observations:

--

 

 

More MSc dissertation proposals on Computer and Robot Vision at:

 

http://omni.isr.ist.utl.pt/~jag