[Figure] Pioneer P3-AT mobile robot (rightmost photos: config-1 at ISR/IST)

 

MSc dissertation proposal 2007/2008

 

Visual Control of a Unicycle Type Robot

 

 

Introduction:

 

"Although a few of the robots of tomorrow may resemble the anthropomorphic devices seen in Star Wars, most will look nothing like the humanoid C-3PO. In fact, as mobile peripheral devices become more and more common, it may be increasingly difficult to say exactly what a robot is. Because the new machines will be so specialized and ubiquitous--and look so little like the two-legged automatons of science fiction--we probably will not even call them robots. But as these devices become affordable to consumers, they could have just as profound an impact on the way we work, communicate, learn and entertain ourselves as the PC has had over the past 30 years."

 

Excerpt from "A Robot in Every Home", Bill Gates, ScientificAmerican.com, January 2007.

 

Future robots will be more than mere tools: they will be quasi-team members whose tasks have to be integrated with those of humans (or other robots). For this to happen, robots must coordinate their behaviours with the requirements and expectations of the human team members [HRI2007]. In this work we consider behaviours in which mobile robots follow humans or other robots, thereby forming chained systems. Particular attention is given to the sensing, control and navigation aspects.

 

 

Objectives:

 

The objectives of this work are twofold: (i) visual tracking of a pattern by its colour or shape; (ii) control of a unicycle-type robot so that it follows the visually tracked pattern. At the end, the work is expected to demonstrate chaining capabilities with a mobile robot equipped with a vision sensor.

 

 

Detailed description:

 

Controlling the motion of a mobile robot is still a very active research-and-development topic, mainly because of the large variety of applications, locomotion methods and sensors available. In this work proposal the robots are of the very common unicycle type, i.e. carts moving in a two-dimensional world with two parallel driven wheels, one mounted on each side of the centre, and one or two offset castors to maintain balance. These vehicles allow simultaneous arbitrary rotation and translation, and are constrained only in the sideways direction (sideways motion cannot occur instantaneously). The main sensors used are computer vision and odometry.
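
To make the nonholonomic constraint concrete, the standard discrete-time unicycle kinematic model is x(k+1) = x(k) + v cos(theta(k)) dt, y(k+1) = y(k) + v sin(theta(k)) dt, theta(k+1) = theta(k) + omega dt, where the only inputs are the forward velocity v and the angular velocity omega; no input produces instantaneous sideways motion. The short Python sketch below merely propagates this model (the function name, state layout and numerical values are illustrative choices, not part of the proposal):

    import math

    def unicycle_step(x, y, theta, v, omega, dt):
        """Propagate the standard unicycle kinematics one time step.

        (x, y) is the cart position and theta its heading; v is the forward
        velocity and omega the angular velocity.  There is no lateral velocity
        input: sideways displacement can only appear over time, by combining
        rotation with forward translation."""
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        theta += omega * dt
        return x, y, theta

    # Example: rotate and translate simultaneously (allowed), tracing an arc.
    state = (0.0, 0.0, 0.0)
    for _ in range(100):
        state = unicycle_step(*state, v=0.2, omega=0.5, dt=0.05)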

 

More specifically, the work proposal consists of controlling one unicycle robot so that it follows a visually tracked person or another robot. The visual tracking is simplified to tracking colours, shapes or distinctive features; some of these tools are already available, e.g. in vision toolboxes such as [OpenCV]. The control can be of two types: maintaining a distance to the target or maintaining a pose relative to it. Which of the two is possible depends on the output of the visual tracking, which in some cases provides distances and in others full poses.
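
As an example of the colour-based branch of the tracking, the sketch below thresholds an image in HSV space and returns the centroid and apparent area of the largest blob. It is a minimal sketch assuming the OpenCV 4.x Python bindings (cv2); the HSV bounds and the function name are illustrative only and would be tuned to the actual pattern:

    import cv2
    import numpy as np

    def track_colour_blob(frame_bgr, hsv_low=(100, 120, 60), hsv_high=(130, 255, 255)):
        """Return ((cx, cy), area) of the largest blob inside the given HSV
        range, or None if no blob is found.  The default bounds roughly select
        a saturated blue and are illustrative values only."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, np.array(hsv_low), np.array(hsv_high))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        blob = max(contours, key=cv2.contourArea)
        m = cv2.moments(blob)
        if m["m00"] == 0:
            return None
        return (m["m10"] / m["m00"], m["m01"] / m["m00"]), cv2.contourArea(blob)

The horizontal offset of the centroid gives a bearing to the target, and the blob area is a coarse proxy for distance; with a calibrated camera, or a pattern of known size, this can be turned into a metric range or pose estimate.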

 

The work is therefore organized into the following main steps:

1) visual tracking of patterns based on colours and/or shapes

2) controller design for a unicycle-type robot, in order to follow a target at a constant distance and, when the visual tracking provides it, to imitate the target's pose (a minimal controller sketch in this spirit is given after this list)

3) development of a cable-less joystick by combining the visual tracking running onboard the unicycle-type robot with the controller; demonstration of the cable-less joystick in a mobile-robot simulation environment and/or on a real robot
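
For step 2, a minimal follow-at-constant-distance control law is sketched below. The proportional structure, the gains, the reference distance and the saturation limits are assumptions made here for illustration, not a prescribed design:

    def follow_controller(distance, bearing, d_ref=1.0, k_v=0.8, k_w=1.5,
                          v_max=0.5, w_max=1.0):
        """Proportional law for following a tracked target.

        distance : estimated range to the target [m]
        bearing  : angle of the target w.r.t. the robot heading [rad]
        Returns (v, omega), the forward and angular velocity commands that keep
        the target at d_ref metres and centred in front of the robot.
        Gains and limits are illustrative values only."""
        v = k_v * (distance - d_ref)   # move forward when too far, back up when too close
        omega = k_w * bearing          # turn towards the target
        v = max(-v_max, min(v_max, v))
        omega = max(-w_max, min(w_max, omega))
        return v, omega

Imitating the target's pose replaces the bearing term by a full pose error, provided the visual tracking delivers a pose rather than just a distance.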

 

References:

 

[HRI2007] 2nd ACM/IEEE International Conference on Human-Robot Interaction, http://hri2007.org/

 

[ISR-gallery] Some images of unicycle-type robots at ISR: see labmate, scouts, pioneers, ... in "Mini galeria de fotografias de projectos @ IST/ISR" (mini photo gallery of projects at IST/ISR), http://users.isr.ist.utl.pt/~jag/infoforum/isr_galeria/

 

[VISP] Visual servoing videos, http://www.irisa.fr/lagadic/visp/video.html

 

[OpenCV] Open Source Computer Vision Library, http://sourceforge.net/projects/opencv/

 

[Aguiar] "Position Tracking for a Nonlinear Underactuated Hovercraft: Controller Design and Experimental Results", A. Pedro Aguiar, L. Cremean, and João P. Hespanha, in Proc. of CDC’03 – 42nd IEEE Conference on Decision and Control, Maui, Hawaii, Dec. 2003

 

[GWSV00] José Gaspar, Niall Winters, José Santos-Victor, "Vision-based Navigation and Environmental Representations with an Omnidirectional Camera", IEEE Transactions on Robotics and Automation, Vol. 16, No. 6, December 2000.

 

 

 

Expected results:

 

At the end of the work the students will have enriched their knowledge of:

* Computer vision

* Control of unicycle type vehicles

 

Examples of expected demonstrations in simulated and/or real environments:

* remotely controlling the unicycle-type robot by moving a 3D object that is visually tracked by the robot (a minimal loop combining the tracker and the controller is sketched after this list)

* making a robotic trailer by mounting the tracked 3D object on another mobile robot, i.e. chaining two robots through a visual link
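
As a rough illustration of how these demonstrations combine the earlier sketches, the loop below couples the colour tracker and the proportional controller. The camera index, the pixel-to-bearing and area-to-range conversions, and the send_velocity stub are placeholders for whatever the simulation environment or the real robot driver provides:

    import cv2

    def send_velocity(v, omega):
        # Placeholder for the robot/simulator command interface.
        print(f"v = {v:.2f} m/s, omega = {omega:.2f} rad/s")

    def joystick_loop(camera_index=0, focal_px=500.0, ref_area=4000.0):
        """Track the coloured target and command the robot to follow it.

        focal_px and ref_area are illustrative calibration constants: ref_area
        is the blob area observed at the desired following distance, so the
        coarse range estimate below equals 1.0 (the controller's d_ref) when
        the target is at that distance."""
        cap = cv2.VideoCapture(camera_index)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            result = track_colour_blob(frame)      # sketched earlier
            if result is None:
                send_velocity(0.0, 0.0)            # stop when the target is lost
                continue
            (cx, _), area = result
            bearing = -(cx - frame.shape[1] / 2.0) / focal_px   # small-angle approximation
            distance = (ref_area / max(area, 1.0)) ** 0.5        # coarse area-based range
            v, omega = follow_controller(distance, bearing)      # sketched earlier
            send_velocity(v, omega)
        cap.release()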

 

 

Observations:

--

 

 

More MSc dissertation proposals on Computer and Robot Vision at:

 

http://omni.isr.ist.utl.pt/~jag