Sensor networked mobile robotics project
The traditional mobile robot architecture uses onboard sensors to see its
environment. R2D2, the Terminator, and the majority of examples from science
fiction (as well as non-fiction) all follow this form. These robots are limited
to a first-person field-of-view. They also must solve the difficult problem of
moving data fusion, which includes the correspondence problem and the
self-localization problem. Images or data sensed from one viewpoint must be
fused with data sensed from additional viewpoints as the robot (and hence its
sensors) moves through the environment.
The sensor networked mobile robotics project is researching an alternative
architecture to mobile robotics. In this architecture, the sensors are deployed
as a stationary network distributed throughout the environment. The robot itself
is "blind", and sees by receiving transmissions from the sensor network. An SN
robot enjoys a third-person perspective. This is sometimes referred to in the
literary world as the "God's eye view". By being able to see the entire
environment, the robot should be able to more effectively plan and execute
motion. In addition, the data fusion problem is simplified, because although the
robot moves, all the sensors remain stationary.
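As a rough illustration (not the project's actual software), here is a minimal Python sketch of the fusion step in this architecture: each fixed camera reports occupancy evidence already warped into a common floor-plane grid, and the robot simply combines the reports. The grid size, message format, and confidence weighting are all assumptions made up for this sketch.

```python
import numpy as np

GRID_SHAPE = (100, 100)   # shared floor-plane occupancy grid (size is assumed)

def fuse_camera_reports(reports):
    """Combine per-camera occupancy evidence into one global map.

    Each report is a (probability_grid, confidence_grid) pair that the
    camera has already warped into the common floor-plane frame using its
    fixed, pre-computed calibration.
    """
    fused = np.full(GRID_SHAPE, 0.5)    # 0.5 = unknown everywhere
    weight = np.zeros(GRID_SHAPE)
    for prob, conf in reports:
        fused = (fused * weight + prob * conf) / np.maximum(weight + conf, 1e-6)
        weight += conf
    return fused

if __name__ == "__main__":
    # Two hypothetical camera reports; in the real system these would arrive
    # over the network at frame rate.
    cam_a = (np.random.rand(*GRID_SHAPE), np.full(GRID_SHAPE, 0.8))
    cam_b = (np.random.rand(*GRID_SHAPE), np.full(GRID_SHAPE, 0.6))
    world = fuse_camera_reports([cam_a, cam_b])
    print("fused occupancy map:", world.shape)
```

The point of the sketch is that nothing in it depends on the robot's pose: each camera's calibration is computed once, offline, because the cameras never move.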
SNR "Curious Dog"
A recent master's project used the SNR architecture to
create a "curious dog". The robot literally runs into objects and watches what
happens. From these interactions the robot can determine dynamic properties of
the objects, such as their mass or coefficient of friction with the floor, sort
of like a dog nosing objects in a sandbox.
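As a back-of-the-envelope illustration of how such a property could be recovered, here is a small Python sketch that estimates the coefficient of friction from one ramming interaction, assuming simple Coulomb sliding friction and positions tracked by the camera network. The tracking interface and the numbers are hypothetical; this is not the method from the thesis.

```python
G = 9.81  # gravitational acceleration, m/s^2

def friction_from_slide(initial_speed_mps, slide_distance_m):
    """Coulomb sliding friction: v^2 = 2 * mu * g * d  =>  mu = v^2 / (2 * g * d)."""
    return initial_speed_mps ** 2 / (2.0 * G * slide_distance_m)

def speed_from_track(positions_m, dt_s):
    """Finite-difference speed from the first two tracked (x, y) positions."""
    (x0, y0), (x1, y1) = positions_m[0], positions_m[1]
    return ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt_s

if __name__ == "__main__":
    # Hypothetical numbers: the object leaves the ram at about 0.4 m/s
    # (measured from two frames 40 ms apart) and slides 0.16 m before stopping.
    v = speed_from_track([(2.00, 3.00), (2.016, 3.00)], dt_s=0.04)
    mu = friction_from_slide(v, slide_distance_m=0.16)
    print(f"estimated coefficient of friction: {mu:.2f}")
```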
The robot was modified and fitted with a "ram":

The objects
used for experiments included a Tonka truck, a lightweight plastic ball, a large
empty box and a small heavy box:

Click
here to see
a movie clip demo of the robot moving around its box, hitting objects. This clip
is playing at 10x speed.
Click here to see a
movie clip demo of the "interaction analyzer" system in action. The collisions
in this clip are playing at normal speed, but the rest is playing at 10x speed.
Papers about this project:
- A Real-Time Occupancy Map from Multiple Video Streams, in IEEE Conference on
  Robotics & Automation, May 1999, PDF (1.1 Mbytes)
- Calibrating a Camera Network Using a Domino Grid, Pattern Recognition
  (journal), vol. 34 no. 5, May 2001, PDF (0.7 Mbytes)
- A Spatial-Temporal Occupancy Map from Multiple Video Streams, technical
  report, August 1999, PDF (1.3 Mbytes)
- Path Planning for Mobile Robots Using a Video Camera Network, IEEE/ASME
  International Conference on Advanced Intelligent Mechatronics, Atlanta GA,
  1999, PDF (0.4 Mbytes)
- Mobile Robot Navigation Using an Environment-Based Video Camera Network,
  technical report, May 1999, revised March 2000, PDF (0.7 Mbytes)
- Sensor Network Perception for Mobile Robotics, in IEEE Conference on
  Robotics & Automation, April 2000, PDF (0.5 Mbytes)
- Robot Navigation using a Sensor Network, Master's Thesis, Laboratory of
  Image Analysis, Aalborg University, June 1998.
- Learning the dynamic properties of an unknown object through interactions
  with an autonomous mobile robot, Master's Thesis, Laboratory of Image
  Analysis, Aalborg University, June 2000.
Ongoing efforts:
Our controller stinks. Click here to see a movie
clip of how bad it is. Mostly this is because we are controlling at only 20-30 Hz,
the frame rate of the vision system.
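For a sense of what that limit means in code, here is a hypothetical sketch of a steering loop that can only update when a new pose arrives from the cameras, so each command is held for the full 40-50 ms between frames. The controller structure and gain are assumptions for illustration only.

```python
import math

VISION_RATE_HZ = 25.0            # pose updates arrive from the camera network
DT = 1.0 / VISION_RATE_HZ        # each command is held this long between updates
K_HEADING = 1.5                  # proportional gain on heading error (assumed)

def steering_command(pose, waypoint):
    """One control update per vision frame: turn toward the next waypoint."""
    x, y, theta = pose
    wx, wy = waypoint
    desired = math.atan2(wy - y, wx - x)
    # wrap the heading error into [-pi, pi]
    error = math.atan2(math.sin(desired - theta), math.cos(desired - theta))
    return K_HEADING * error     # turn rate in rad/s, frozen until the next frame

if __name__ == "__main__":
    omega = steering_command((0.0, 0.0, 0.0), (1.0, 1.0))
    print(f"turn {omega:.2f} rad/s, held for {DT * 1000:.0f} ms until the next frame")
```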
Our simple path planner has opened up some interesting directions for
research. The idea is to search a set of polynomial paths for a "good" route.
Click here to see
a movie clip of the polynomial search in action. We want to see how well this
works in the real world.
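A minimal sketch of the polynomial-search idea (with a made-up parameterization and cost, not the project's planner): generate a family of polynomial paths between the robot and the goal by varying a single lateral "bow" coefficient, discard any path that crosses an occupied cell of the fused map, and keep the shortest survivor.

```python
import numpy as np

def poly_path(start, goal, bow, n=50):
    """Straight line from start to goal plus a polynomial lateral bump.

    The bump 4*t*(1-t) vanishes at both endpoints and peaks at 'bow'
    meters of sideways offset halfway along the path.
    """
    t = np.linspace(0.0, 1.0, n)
    straight = np.outer(1 - t, start) + np.outer(t, goal)
    direction = np.asarray(goal, dtype=float) - np.asarray(start, dtype=float)
    normal = np.array([-direction[1], direction[0]])
    normal /= np.linalg.norm(normal) + 1e-9
    offset = (bow * 4.0 * t * (1.0 - t))[:, None] * normal
    return straight + offset

def collides(path, occupancy, cell_size=0.1, threshold=0.5):
    """True if any sample of the path falls in an occupied grid cell."""
    idx = np.floor(path / cell_size).astype(int)
    idx = np.clip(idx, 0, np.array(occupancy.shape) - 1)
    return np.any(occupancy[idx[:, 0], idx[:, 1]] > threshold)

def plan(start, goal, occupancy, bows=np.linspace(-2.0, 2.0, 21)):
    """Pick the shortest collision-free candidate path, or None."""
    best, best_len = None, np.inf
    for bow in bows:
        path = poly_path(start, goal, bow)
        if collides(path, occupancy):
            continue
        length = np.sum(np.linalg.norm(np.diff(path, axis=0), axis=1))
        if length < best_len:
            best, best_len = path, length
    return best

if __name__ == "__main__":
    grid = np.zeros((100, 100))        # 10 m x 10 m floor at 10 cm cells
    grid[40:60, 45:55] = 1.0           # a box in the middle of the room
    route = plan((1.0, 1.0), (9.0, 9.0), grid)
    print("found a path" if route is not None else "no collision-free path")
```

Searching over a single coefficient keeps the candidate set tiny; richer families (higher-order terms, multiple knots) would trade planning time for better routes.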
The vision system remains a hot topic, particularly the data fusion.
Scalability, outdoor deployment, and mobility are all current interests.
Last updated November 2002