Mobile robots as gateways into wireless sensor networks
Jim Butler (Updated May 2, 2003)
Overview
It is well
known that Intel leads the industry in wireless sensor
network research. What may not be quite as well known is
Intel's recent work in mobile robotics. In particular,
Intel is helping researchers create small, sophisticated
mobile robots that can act as gateways into wireless
sensor networks.
This is a new venture that is
focused on intelligent mobile robots -- robots that are
used in flexible environments, not automated toolsets in
fixed locations. For example, Intel-based mobile robots
will be used at the James Reserve by the Center for
Embedded Networked Sensing (CENS) to map terrain and
monitor habitats. Intel silicon for robotics
applications is also being used by researchers, such as
professor Tucker Balch at the Georgia Institute of
Technology. Professor Balch is exploring how robots can
organize and perform like social insects, such as bees
and ants. Future projects may include building a
ground-based Robonaut, as well as the brains of the 2009
Mars Rover.
Intel's focus is not on
the mechanical aspects of robots -- the wheels, motors,
grasping arms or physical layout. Instead, this venture
is focused on the silicon and software that give a robot
its capabilities and intelligence. Intel's role is to
assist researchers in putting powerful, sophisticated
intelligence into small, standardized packages for
mobile robotics. With wireless technologies now
practical and available, this is a novel area for
research and investigation.
To assist
researchers, Intel is offering inexpensive,
standards-based hardware, an open-source operating
system, and drivers for use in robotics environments.
The open-source package lets researchers take advantage
of leading-edge Intel XScale microprocessors and Intel
Centrino mobile technology, while reducing the overall
costs of developing robotics systems.
What is
a robot?
Robotics is not a new field. It has
been around for decades. In fact, most people have
robots in their own home, even if they don't recognize
the robots as such. For example, a dishwasher
automatically washes and dries your dishes, then grinds
up the rinsed-off food so the organic matter doesn't
clog your drains. A washing machine soaks, soaps,
agitates, and rinses your clothes. Down the street, the
car wash-n-wax cleans, brushes, washes, and waxes your
car, all for a few dollars. One of the better known
home-oriented robots is iRobot's smart vacuum cleaner,
called the Roomba, which has already won the Good
Housekeeping Award for efficiency and ease of use.
More sophisticated robots are used in
manufacturing plants and warehouses. Car makers use
automated machines to position car frames, bolt pieces
together, and even do welds and priming. In semiconductor
wafer fabrication, test systems position themselves along
grids, take measurements, and then correlate the data
into graphs. Robot-assisted heart microsurgery is now
performed routinely in the U.S.
To some extent,
we have become so used to robots that we no longer pay
attention to the automated machines. We look only at the
tasks they complete, and we think of them simply as
tools. It is easy to think this way: most of today's
robots are stationary tools in fixed locations, like a
fruit sorter in a cannery, or an alarm sensor that
triggers a call to security.
Robots growing
in sophistication
Although we are surrounded
by robots that we think of as automated tools, there are
some sophisticated robots already in use (photo below).
A remote telepresence is one of the most common
applications that today's mobile, autonomous robots
provide. Intelligence for these robots is handled via an
embedded microcontroller that manages internal systems,
and by a laptop that is attached to the robot. Humans
control the robot through wireless communications. In
this way, humans can tell the robot to change
directions, shift a camera angle, take measurements,
grasp objects, and so on. For example, mobile robots can
let security personnel stay in a central office and
still check out unsupervised areas in a warehouse or
other remote site.
(Photo: Carnegie Mellon University's TagBots use Intel
boards)
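To make the telepresence scenario concrete, here is a
minimal Python sketch of the kind of command loop the
robot's onboard laptop might run over its 802.11 link. The
JSON command format, the port number, and the
forward_to_microcontroller stub are illustrative
assumptions, not part of Intel's actual robotics package.

    # Hypothetical teleoperation sketch: the laptop riding on the robot
    # accepts simple JSON commands over its 802.11 link and hands them to
    # the embedded microcontroller that manages the robot's internal
    # systems. Command names and the forwarding stub are illustrative only.
    import json
    import socketserver

    def forward_to_microcontroller(command: dict) -> dict:
        # Stand-in for the link to the embedded controller; a real robot
        # would translate the command into motor, camera, or sensor actions.
        print("executing:", command)
        return {"status": "ok", "echo": command}

    class TeleopHandler(socketserver.StreamRequestHandler):
        def handle(self):
            for line in self.rfile:              # one JSON command per line
                try:
                    command = json.loads(line)
                except json.JSONDecodeError:
                    reply = {"status": "error", "reason": "bad JSON"}
                else:
                    reply = forward_to_microcontroller(command)
                self.wfile.write((json.dumps(reply) + "\n").encode())

    if __name__ == "__main__":
        # A remote operator could send, for example:
        #   {"cmd": "drive", "speed": 0.3, "turn": -0.1}
        #   {"cmd": "pan_camera", "degrees": 15}
        with socketserver.TCPServer(("0.0.0.0", 9000), TeleopHandler) as server:
            server.serve_forever()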
With advances in microchip
design, nanotech sciences, software architecture, and
mini-power cells, robot systems can be more than just
another pair of eyes. They are already being tested and
used in a variety of applications. They can traverse
different, even dangerous environments and perform
complex tasks on their own. For example, mil-spec iRobot
Packbots have been used in Afghanistan to detect and map
the locations and contents of caves. Another iRobot
rover was used in the historic exploration of both the
southern and northern shafts that led to the Queen's
Chamber in the Great Pyramid at Giza (Egypt). The rover
was able to illuminate areas beyond the blocking stones
in the shafts, which had last been viewed by human eyes
some 4,500 years ago.
Robot mobility
issues
Regardless of a robot's design or
tasks, there are still three main issues with its
mobility:
- Localization: How does a robot know where it is in
its environment?
- Mapping: How does the robot know the details of
its environment?
- Navigation: How does a robot traverse its
environment?
Intel works closely with
researchers to identify novel ways for a robot to
perform its mobility tasks. Intel is particularly
interested in machine-vision libraries that can be used
to perform localization and mapping based on monocular-
or stereo-vision systems. Right now, most robots navigate
by using infrared or radio waves to avoid objects in their
paths. Intel software researchers, however, have recently
developed several libraries that are directly applicable
to vision-based robotics. Intel's
computer vision library is already used extensively by
vision researchers.
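As a small illustration of how such a vision library can
feed localization and mapping, the Python sketch below
uses the modern bindings of Intel's open-source computer
vision library (OpenCV, which has evolved considerably
since this article was written) to match features between
a stored reference frame and the robot's current camera
frame. The feature type, matcher settings, and file names
are assumptions chosen for brevity, not a prescribed
pipeline.

    # Match features between the robot's current camera frame and a stored
    # reference frame using OpenCV. Frame file names are placeholders.
    import cv2

    reference = cv2.imread("reference_frame.png", cv2.IMREAD_GRAYSCALE)
    current = cv2.imread("current_frame.png", cv2.IMREAD_GRAYSCALE)
    if reference is None or current is None:
        raise SystemExit("supply two camera frames to compare")

    # ORB gives fast binary descriptors suited to a small onboard computer.
    orb = cv2.ORB_create(nfeatures=500)
    kp_ref, des_ref = orb.detectAndCompute(reference, None)
    kp_cur, des_cur = orb.detectAndCompute(current, None)

    # Brute-force Hamming matching with cross-checking to drop weak pairs.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_ref, des_cur), key=lambda m: m.distance)

    # The matched point pairs are what a localization or mapping system
    # would hand to a pose estimator to work out how far the robot has
    # moved between the two frames.
    print(f"{len(matches)} feature matches between reference and current frame")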
Intel has also released a
test version of a technical library for building
Bayesian networks to support machine-learning
activities. Bayesian networks are a form of
probability-based artificial intelligence. Such a
network would let a robot navigate by matching sensor
data to a map stored in its memory.
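The idea can be illustrated with a toy discrete Bayes
filter, a simplified cousin of the full Bayesian-network
approach; the sketch below is generic Python and is not
the API of Intel's library. The robot repeatedly weights
its belief about where it is by how well each sensor
reading matches the map stored in its memory.

    # Toy probability-based localization: a discrete Bayes filter matches
    # door/wall sensor readings against a corridor map stored in memory.

    # Stored map of a corridor: 1 = doorway, 0 = wall, one entry per cell.
    corridor_map = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]

    # Start with a uniform belief: the robot could be in any cell.
    belief = [1.0 / len(corridor_map)] * len(corridor_map)

    P_HIT, P_MISS = 0.8, 0.2   # sensor model: reliability of the detector

    def sense(belief, map_, reading):
        """Weight each cell by how well the reading matches the map."""
        weighted = [b * (P_HIT if map_[i] == reading else P_MISS)
                    for i, b in enumerate(belief)]
        total = sum(weighted)
        return [w / total for w in weighted]

    def move(belief, steps):
        """Shift the belief as the robot moves (circular corridor)."""
        n = len(belief)
        return [belief[(i - steps) % n] for i in range(n)]

    # The robot sees a door, moves one cell, then sees a wall.
    for reading, steps in [(1, 1), (0, 1)]:
        belief = move(sense(belief, corridor_map, reading), steps)

    best = max(range(len(belief)), key=lambda i: belief[i])
    print("most likely cell:", best, "probability:", round(belief[best], 3))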
Gateways
into sensor networks
Two technologies in
particular seem to be moving toward an interesting
convergence: mobile robotics and wireless sensor
networks. The two main questions here are:
- Can a mobile robot act as a gateway into a
wireless sensor network?
- Can sensor networks take advantage of a robot's
mobility and intelligence?
One major issue
with a mobile robot acting as a gateway is the
communication between the robot and the sensor network.
Sensor networks typically communicate using 900 MHz
radio waves. Mobile robots use laptops that communicate
via 802.11, in the 2.4- to 2.483-GHz range. Intel hopes
to prove that a sensor net can be equipped with 802.11
capabilities to bridge the gap between robotics and
wireless networks.
Intel recently demonstrated
how a few motes equipped with 802.11 wireless
capabilities can be added to a sensor network to act as
wireless hubs. Other motes in the network then use each
other as links to reach the 802.11-equipped hubs. The
hubs forward the data packets to the main 802.11-capable
gateway, which is usually a laptop. Using some motes as
hubs cuts down on the number of hops any one data packet
has to make to reach the main gateway. It also reduces
power consumption across the sensor net.
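The benefit is easy to see with a back-of-the-envelope
calculation. The Python sketch below uses a made-up
eight-mote line topology and counts, with a breadth-first
search, how many radio hops each mote needs to reach the
nearest 802.11-capable node: first with only the laptop
gateway, then with one mid-network mote promoted to an
802.11 hub.

    # Count the hops each mote needs to reach the nearest 802.11-capable
    # node in a small, made-up line topology, with and without a hub mote.
    from collections import deque

    def hops_to_nearest(radio_links, sources, nodes):
        """Breadth-first search: hop distance from each node to the
        nearest 802.11-capable source."""
        dist = {n: None for n in nodes}
        queue = deque()
        for s in sources:
            dist[s] = 0
            queue.append((s, 0))
        while queue:
            node, d = queue.popleft()
            for neighbor in radio_links[node]:
                if dist[neighbor] is None:
                    dist[neighbor] = d + 1
                    queue.append((neighbor, d + 1))
        return dist

    # Eight motes in a line, each hearing only its immediate neighbors;
    # mote 0 sits next to the laptop gateway.
    motes = list(range(8))
    links = {m: [n for n in (m - 1, m + 1) if n in motes] for m in motes}

    gateway_only = hops_to_nearest(links, sources=[0], nodes=motes)
    with_hub = hops_to_nearest(links, sources=[0, 4], nodes=motes)

    print("hops with the gateway only:   ", gateway_only)
    print("hops with an 802.11 hub at 4: ", with_hub)

Fewer hops per packet means fewer radio transmissions,
which is where the power savings across the sensor net
come from.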
Intel
believes that one of the most interesting technology
convergences will be in designing mobile robots that can
act as gateways into the wireless sensor networks. For
example, Intel recently installed small sensors in a vineyard in Oregon to monitor
microclimates. The sensors measured temperature,
humidity, and other factors to monitor the growing cycle
of the grapes, then transmitted the data from sensor to
sensor until the data reached a gateway. There, the data
was interpreted and used to help prevent frost damage,
mold, and other agricultural problems.
The
agricultural example shows just how a sensor network
could take advantage of a mobile robot's capabilities.
Over time, sensors need to be recalibrated, just like
any other measuring equipment. If a robot could act as a
gateway to the sensor network, it could automatically
perform tasks such as calibration. For example, a robot
could periodically collect data along the network,
determine which sensors are out of tolerance, move to
the appropriate location, and recalibrate each
out-of-tolerance device.
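In outline, such a calibration sweep is a simple loop. The
Python sketch below is purely illustrative; the tolerance
value and the drive_to and recalibrate helpers are
hypothetical stand-ins, not part of Intel's package.

    # Sketch of the calibration scenario: the robot, acting as a gateway,
    # checks each mote's latest reading against a trusted reference and
    # drives to the ones that have drifted out of tolerance.
    from dataclasses import dataclass

    @dataclass
    class Mote:
        mote_id: int
        location: tuple      # (x, y) position in the field
        reading: float       # latest reported temperature, degrees C
        reference: float     # trusted reference value for that spot

    TOLERANCE = 0.5          # degrees C of allowed drift (assumed value)

    def out_of_tolerance(mote: Mote) -> bool:
        return abs(mote.reading - mote.reference) > TOLERANCE

    def drive_to(location):          # hypothetical navigation call
        print("driving to", location)

    def recalibrate(mote: Mote):     # hypothetical over-the-radio call
        print("recalibrating mote", mote.mote_id)

    def calibration_sweep(motes):
        for mote in motes:
            if out_of_tolerance(mote):
                drive_to(mote.location)
                recalibrate(mote)

    calibration_sweep([
        Mote(1, (0, 0), reading=21.1, reference=21.0),
        Mote(2, (5, 3), reading=23.9, reference=22.5),  # drifted: visited
    ])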
To look into using
mobile robots as gateways to such wireless sensor
networks, Intel is bringing in a Ph.D. candidate from
the University of Southern California, under the
guidance of professor Gaurav Sukhatme. This person will
work with Intel on integrating wireless sensor networks
into robotics research for localization techniques. This
type of collaboration is just one example of how Intel
is promoting the convergence of microelectronics and
robotics.
Numerous collaborations on robotics
projects
Overall, Intel is working with
approximately 20 robotics research groups, including
Carnegie Mellon University (CMU), University of Southern
California (USC), University of Pennsylvania,
Northwestern, and Georgia Tech. Intel is also in
discussions with other universities and with robotics
manufacturers: with Sony about robotic dogs, and with
Honda and Samsung about using Intel silicon to build
humanoid robots. In addition, Intel is in discussions with
NASA and DARPA (the Defense Advanced Research Projects
Agency) on several major projects.
Other pilot
projects include professor Sebastian Thrun's CMU
research into an aerial mapping helicopter (photo
below), which is currently about 4 feet in length and
which has been demonstrated in certain DARPA programs.
Acroname is also using Intel's open-source robotics
package in their latest commercial robot, called Garcia
(see photo at beginning).
(Photo: Sebastian Thrun's aerial mapping helicopter)
In
other collaborations, professor Balch of Georgia Tech is
using Intel technology to develop hundreds of mobile
robots in order to model the swarm behavior of insects.
Professor Vijay Kumar is using Intel's XScale boards
(photo below) and open-source software for off-road
robot investigations. Professor Illah Nourbakhsh is
teaching mobile robot programming using new robotics
systems with Intel XScale boards and the Linux operating
system.
(Photo: Intel boards are being used in a number of
robotics projects)
Robotics task
force
The thrust of Intel's robotics effort
is to reduce the cost and engineering required to build
small, powerful, sophisticated robots. This thrust,
however, requires standards and protocols. Right now,
robotics standards and protocols are in their infancy.
With technology convergence becoming increasingly
important in Intel's areas of interest, Intel is leading
industry efforts for the Robotics Engineering Task Force
(RETF).
The RETF is modeled after the Internet
Engineering Task Force (IETF). RETF allows government
and university researchers to work together to establish
standard software protocols and interfaces for robotics
systems. Currently, government representatives include
researchers from NASA, DARPA, and NIST (National
Institute of Standards and Technology). All told,
approximately 35 government and university researchers
are already participating in the RETF.
The most
pressing issue for the RETF is devising standards for
commanding and controlling the mobile robots. The task
force has already defined a charter to develop standards
for robotics systems. A working draft of the first
framework document is now being reviewed for comments.
The task force has also begun work on standards
for bridging networks, on protocols, and on application
programming interfaces (APIs). Current issues being
discussed include intellectual property rights and
copyright. The task force hopes to begin work on full
specifications as soon as the framework document is
approved. The task force expects to publish its work as
open-source code when the work is complete, something it
hopes to finish in about two years.
Standardized building blocks
As
one of the industry leaders of the RETF, Intel is
devising low-cost reference designs for relatively small
robots. The reference designs are based on silicon for
Intel's XScale microprocessor and Intel Centrino mobile
technology, flash memory, and 802.11 wireless networking
with built-in support for wireless sensor networks. The
designs give researchers an intermediate scale between the
embedded microcontrollers currently used for a robot's
internal systems and the full-size laptops used for mobile
intelligence.
The robotics package also includes
the open-source Linux 2.4.19 operating system, as well
as a multitude of open-source drivers. Drivers include
vision-system drivers for sensing infrared, drivers for
ultrasonic devices that measure the distance from a
robot to objects in the robot's environment, and so on.
The software platform also supports Java applications,
and integrates USC's Player device server for robotics
systems. All elements in the open-source robotics
package are wirelessly connected using 802.11 networks.
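As one hedged example of how a robot program might consume
these drivers, the Python sketch below turns an ultrasonic
echo time into an obstacle distance. The article does not
document the drivers' actual interface, so the device path
and echo-time format here are assumptions; only the
speed-of-sound arithmetic is standard.

    # Read a (hypothetical) ultrasonic range driver and decide whether the
    # robot is too close to an obstacle. The device path is a placeholder.

    SPEED_OF_SOUND_M_S = 343.0    # in air at roughly 20 degrees C
    RANGE_DEVICE = "/sys/class/sonar/sonar0/echo_us"   # hypothetical path
    STOP_DISTANCE_M = 0.30

    def echo_microseconds() -> float:
        """Latest echo round-trip time from the (hypothetical) driver."""
        try:
            with open(RANGE_DEVICE) as f:
                return float(f.read().strip())
        except FileNotFoundError:
            return 2500.0         # simulated echo so the sketch runs anywhere

    def range_meters() -> float:
        # The echo travels out and back, so halve the round-trip distance.
        return SPEED_OF_SOUND_M_S * (echo_microseconds() / 1_000_000.0) / 2.0

    if __name__ == "__main__":
        distance = range_meters()
        print(f"nearest obstacle at {distance:.2f} m")
        if distance < STOP_DISTANCE_M:
            print("too close: stop and replan")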
With internal robot systems standardized,
researchers and developers will not have to redesign the
wheel for each robot's brain. Instead, developers can
spend more time on mobility, visual recognition systems,
and the software for artificial intelligence (AI).
Summary
Having achieved a
reputation for leading-edge work in wireless sensor
networks, Intel is starting a new venture into wireless,
mobile robotics technology. Intel hopes that the two
technologies -- mobile robotics and wireless networks --
can be combined efficiently in, for example, ubiquitous
computing environments. In the Intel vision, mobile
robots act as gateways into wireless sensor networks,
such as into the "Smart Dust" networks of wireless
motes.
Intel's role in this venture is to assist
robotics researchers by providing standardized silicon,
an open-source operating system, and a multitude of
open-source software drivers for robotics applications.
The robotics development package includes silicon for
the Intel XScale microprocessor or Intel Centrino mobile
technology, the Linux 2.4.19 operating system, and a
plethora of robotics software drivers.
Intel has
also released a test version of a technical library for
building Bayesian networks, which will help advance the
ability of robots to navigate their environments. Pilot
systems based on Intel's open-source packages are
already being deployed in a variety of flexible
environments in agricultural, security, and military
applications.
With standardized low-cost and
open-source building blocks, developers won't have to
spend as much time building the brains of their robots.
Developers of embedded systems or wireless sensor
networks will find that the open-source robotics package
makes it much easier to design tomorrow's robotics
systems today.
Note: Subsequent to initially publishing this article,
LinuxDevices.com requested further details regarding the
XScale/Linux-based hardware/software robotics platform
that Intel has developed. Here is the reply we received
from Jim Butler . . .
"We are currently engaged with a
limited number of researchers at CMU, USC, Stanford,
Georgia Tech, etc. to develop a common platform based
on XScale and Linux. We have deployed over 100 XScale
boards to a variety of robotics researchers. The Linux
port is working reasonably well with deployments to
about 15 researchers."
"The boards are not yet
available from a third party but we have plans to
enable a third party shortly. The boards are identical
to those used in our wireless sensor network gateway
where we are collaborating with Crossbow, Inc. in the
development of new systems (info). We are also working with
Acroname on the development of robots using these
XScale boards (info)."
"The Linux source
code is available at Carnegie Mellon University (here) and
Portland State University (here). We have taken the
standard 2.4.19 release for ARM and applied the XScale
PXA250 patches plus one for the Stayton
board."
"We are also engaged in creating
specifications for reusable, interoperable building
blocks for mobile robots. More information is
available at the Robotics Engineering Task Force
website."
References
Information about programs
in robotics and sensor networks at the University of
California at Berkeley can be found on the Berkeley site. The site includes
articles from Pervasive Computing magazine,
related topics, information about pilot projects, and
more.
Developers can also find information about
robotics from USC Robotic Embedded Systems Lab.
Information about the Robotics Engineering Task
Force can be found at the RETF
Web site.
Bio of author: Jim Butler is
a principal investigator for Intel Research and the
Emerging Platforms Lab, part of the Corporate Technology
Group. He has been with Intel 11 years, and has worked
on a variety of high-profile, leading-edge projects. His
work on the Indeo video codec was some of the first
video compression work developed by Intel. Another of
Jim's projects, CNN @ Work, was the first commercial
application for multicast video-over-IP. Butler has also
worked on satellite data broadcasting, focusing on
embedded digital content streams in broadband compressed
video services.
Prior to Intel, Butler developed
accelerator boards and software tools at Tektronix, Inc.
-- tools that were used by Industrial Light & Magic
for the special effects in "Terminator 2". He also
worked for Integrated Systems, Inc., where he tested
module prototypes for the Marshall Space Flight Center,
performed modeling for the Boeing Space Station program,
and also worked in the Advanced Solid Rocket Motor
program. Butler has been technical advisor for the
European Satellite Multimedia Service, and was also on
the technical advisory board for SkyStream for its
broadband systems.
From 1997 to 1999, Butler was
one of the four co-founders of the Pacific Convergence
Corporation. Along with his co-founders, Butler
established a joint venture between Intel and the
Pacific Century Group to address broadband Internet
delivery using satellite and cable redistribution in
China. He has also been an Intel Board Observer for
Skila, Inc. Along with several other engineers, Butler
was a recipient of the Intel Achievement Award in 2000.
In January 2003, he hosted the second Intel Robotics
Workshop and Forum, which drew almost 300 participants.
Butler holds a patent for automated media capturing
system technology. He earned his B.S.C.E. and his
M.S.E.E. from the University of Illinois. Butler is a
member of both IEEE and AAAI.
Copyright ©
Intel Corporation 2003. All rights reserved. Reproduced
by LinuxDevices.com with
permission.