Sensor networked mobile robotics project


The traditional mobile robot architecture uses onboard sensors to see its environment. R2D2, the Terminator, and the majority of examples from science fiction (as well as non-fiction) all follow this form. These robots are limited to a first-person field of view. They also must solve the difficult problem of fusing data gathered while moving, which includes the correspondence problem and the self-localization problem: images or data sensed from one viewpoint must be fused with data sensed from additional viewpoints as the robot (and hence its sensors) moves through the environment.

The sensor networked mobile robotics project is researching an alternative architecture for mobile robotics. In this architecture, the sensors are deployed as a stationary network distributed throughout the environment. The robot itself is "blind", and sees by receiving transmissions from the sensor network. An SNR robot enjoys a third-person perspective, sometimes referred to in the literature as the "God's eye view". By seeing the entire environment at once, the robot should be able to plan and execute motion more effectively. In addition, the data fusion problem is simplified, because although the robot moves, all the sensors remain stationary.
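Because the sensors never move, each one's pose in a shared world frame can be calibrated once at deployment, and fusing a detection then reduces to applying a fixed, precomputed transform. The minimal Python sketch below illustrates this point; the sensor names, poses, and simple averaging step are illustrative assumptions, not details of the actual SNR system.

import math

# Hypothetical example: because SNR sensors never move, each sensor's pose in
# the world frame (x, y, heading) can be measured once at deployment.  Fusing a
# detection then only requires applying a fixed transform -- no
# self-localization or moving-viewpoint correspondence search is needed.

SENSOR_POSES = {            # sensor id -> (x, y, theta) in the world frame (assumed values)
    "cam_A": (0.0, 0.0, 0.0),
    "cam_B": (5.0, 0.0, math.pi / 2),
}

def to_world(sensor_id, local_xy):
    """Map a detection from a sensor's local frame into the shared world frame."""
    sx, sy, th = SENSOR_POSES[sensor_id]
    lx, ly = local_xy
    wx = sx + lx * math.cos(th) - ly * math.sin(th)
    wy = sy + lx * math.sin(th) + ly * math.cos(th)
    return (wx, wy)

def fuse(detections):
    """Average world-frame detections of the same target from several sensors."""
    pts = [to_world(sid, xy) for sid, xy in detections]
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

# Two fixed cameras both see the robot; the fused estimate is their average.
print(fuse([("cam_A", (2.0, 1.0)), ("cam_B", (1.0, 3.0))]))   # roughly (2.0, 1.0)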

SNR "Curious Dog"

A recent master's project used the SNR architecture to create a "curious dog". The robot literally runs into objects and watches what happens. From these interactions it can determine dynamic properties of the objects, such as their mass or coefficient of friction with the floor, much like a dog nosing at objects in a sandbox.
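The thesis's exact analysis is not reproduced here, but the flavor of it can be sketched with two textbook relations: sliding friction recovered from how far an object coasts after the hit, and mass recovered from momentum conservation during the impact. The numbers and function names below are made up for illustration.

G = 9.81  # gravitational acceleration, m/s^2

def friction_coefficient(speed_after_hit, slide_distance):
    """Estimate the object's coefficient of sliding friction with the floor.

    If the object leaves the impact at speed v and slides to rest over
    distance d, the energy balance 0.5*v^2 = mu*g*d gives mu = v^2 / (2*g*d).
    """
    return speed_after_hit ** 2 / (2.0 * G * slide_distance)

def object_mass(robot_mass, robot_speed_before, robot_speed_after, object_speed_after):
    """Estimate the object's mass from momentum conservation during a 1-D impact.

    m_robot * (v_before - v_after) = m_object * v_object.
    """
    return robot_mass * (robot_speed_before - robot_speed_after) / object_speed_after

# Assumed numbers for illustration: the ball leaves the ram at 0.4 m/s and slides 0.2 m.
print(friction_coefficient(0.4, 0.2))     # ~0.04
# A 3 kg robot slows from 0.5 to 0.4 m/s while the object leaves at 0.6 m/s.
print(object_mass(3.0, 0.5, 0.4, 0.6))    # 0.5 kg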

The robot was modified and fitted with a "ram":

The objects used for experiments included a Tonka truck, a lightweight plastic ball, a large empty box and a small heavy box:

Click here to see a movie clip demo of the robot moving around its box, hitting objects. This clip is playing at 10x speed.

Click here to see a movie clip demo of the "interaction analyzer" system in action. The collisions in this clip are playing at normal speed, but the rest is playing at 10x speed.

Papers about this project:

Ongoing efforts:

Our controller stinks. Click here to see a movie clip of how bad it is. Mostly this is because we are controlling at 20-30 Hz, the rate of the vision system.

Our simple path planner has opened up some interesting directions for research. The idea is to search a set of polynomial paths for a "good" route. Click here to see a movie clip of the polynomial search in action. We want to see how well this works in the real world.
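As a rough illustration of the idea (not the project's actual planner), the sketch below samples a small family of polynomial detours between the robot and the goal and keeps the candidate with the largest clearance from any obstacle; the bend parameters and the scoring rule are assumptions.

# Illustrative polynomial path search (assumed details).  Each candidate is a
# straight line from start to goal plus a polynomial bump that is zero at both
# ends; the free "bend" parameter swings the path left or right, and the
# candidate whose closest approach to any obstacle is largest wins.

def poly_path(start, goal, bend, steps=50):
    """Sample a polynomial path whose midpoint is offset sideways by 'bend'."""
    (x0, y0), (x1, y1) = start, goal
    pts = []
    for i in range(steps + 1):
        t = i / steps
        x = x0 + t * (x1 - x0)
        y = y0 + t * (y1 - y0) + bend * 4.0 * t * (1.0 - t)
        pts.append((x, y))
    return pts

def clearance(path, obstacles):
    """Minimum distance from any path sample to any obstacle center."""
    return min(((px - ox) ** 2 + (py - oy) ** 2) ** 0.5
               for px, py in path for ox, oy in obstacles)

def best_path(start, goal, obstacles, bends=(-2.0, -1.0, 0.0, 1.0, 2.0)):
    """Search the family of candidate paths and keep the one with the best clearance."""
    candidates = [poly_path(start, goal, b) for b in bends]
    return max(candidates, key=lambda p: clearance(p, obstacles))

# With an obstacle just above the straight-line route, the search picks a
# candidate that bends away from it.
path = best_path(start=(0.0, 0.0), goal=(4.0, 0.0), obstacles=[(2.0, 0.3)])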

The vision system remains a hot topic, particularly the data fusion. Scalability, outdoor deployment, and mobility are all current interests.

Last updated November 2002


SNR Project Page / Clemson / ahoover@clemson.edu