Thursday, February 7, 2008

Wireless sensor networks - Mobile robots as gateways



This is a new venture that is focused on intelligent mobile robots that are used in flexible environments, not automated toolsets in fixed locations. For example, Intel-based mobile robots will be used at the James Reserve by the Center for Embedded Networked Sensing (CENS) to map terrain and monitor habitats.

Intel silicon for robotics applications is also being used by researchers, such as professor Tucker Balch at the Georgia Institute of Technology. Professor Balch is exploring how robots can organize and perform like social insects, such as bees and ants. Future projects may include the possibility of building a ground-based Robonaut, as well as the brains of the 2009 Mars Rover.


Intel's focus is not on the mechanical aspects of robots -- the wheels, motors, grasping arms or physical layout. Instead, this venture is focused on the silicon and software that give a robot its capabilities and intelligence. Intel's role is to assist researchers in putting powerful, sophisticated intelligence into small, standardized packages for mobile robotics. With wireless technologies now practical and available, this is a novel area for research and investigation.


To assist researchers, Intel is offering inexpensive, standards-based hardware, an open-source operating system, and drivers for use in robotics environments. The open-source package lets researchers take advantage of leading-edge Intel XScale microprocessors and Intel Centrino mobile technology, while reducing the overall costs of developing robotics systems.

What is a robot?



Robotics is not a new field. It has been around for decades. In fact, most people have robots in their own home, even if they don't recognize them as such. For example, a dishwasher automatically washes and dries your dishes, then grinds up the rinsed-off food so the organic matter doesn't clog your drains. A washing machine soaks, soaps, agitates, and rinses your clothes. Down the street, the car wash-n-wax cleans, brushes, washes, and waxes your car, all for a few dollars. One of the better-known home-oriented robots is iRobot's smart vacuum cleaner, called the Roomba, which has already won the Good Housekeeping Award for efficiency and ease of use.


More sophisticated robots are used in manufacturing plants and warehouses. Car makers use automated machines to position car frames, bolt pieces together, and even perform welding and priming. In wafer fabrication, test systems position themselves along grids, take measurements, and then correlate the data into graphs. Robot-assisted heart microsurgery is now performed routinely in the U.S.


To some extent, we have become so used to robots that we no longer pay attention to the automated machines. We look only at the tasks they complete, and we think of them simply as tools. It is easy to think this way: most of today's robots are stationary tools in fixed locations, like a fruit sorter in a cannery, or an alarm sensor that triggers a call to security.

Robots growing in sophistication


Although we are surrounded by robots that we think of as automated tools, there are some sophisticated robots already in use (photo below). Remote telepresence is one of the most common applications of today's mobile, autonomous robots. Intelligence for these robots is handled by an embedded microcontroller that manages internal systems, and by a laptop that is attached to the robot. Humans control the robot through wireless communications. In this way, humans can tell the robot to change directions, shift a camera angle, take measurements, grasp objects, and so on. For example, mobile robots can let security personnel stay in a central office and still check out unsupervised areas in a warehouse or other remote site.



Carnegie Mellon University's TagBots use Intel boards



With advances in microchip design, nanotech sciences, software architecture, and mini-power cells, robot systems can be more than just another pair of eyes. They are already being tested and used in a variety of applications. They can traverse varied, even dangerous environments and perform complex tasks on their own. For example, military-specification iRobot PackBots have been used in Afghanistan to detect and map the locations and contents of caves. Another iRobot rover was used in the historic exploration of both the southern and northern shafts that lead to the Queen's Chamber in the Great Pyramid at Giza (Egypt). The rover was able to illuminate areas beyond the blocking stones in the shafts, which had last been viewed by human eyes some 4,500 years ago.

Robot mobility issues

Regardless of a robot's design or tasks, there are still three main issues with its mobility:



  • Localization: How does a robot know where it is in its environment?


  • Mapping: How does the robot know the details of its environment?


  • Navigation: How does a robot traverse its environment?
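To make the navigation question concrete, here is a minimal sketch (not from the article, and not Intel's software) of how a robot might plan a route through a known environment: breadth-first search over an occupancy grid, where each cell is either free or blocked.

```python
from collections import deque

def find_path(grid, start, goal):
    """Breadth-first search over an occupancy grid.
    grid: list of strings, '#' = obstacle, '.' = free cell.
    Returns the shortest list of (row, col) cells from start to goal,
    or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Reconstruct the path by walking parent links back to start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == '.' and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None

grid = ["....",
        ".##.",
        "...."]
path = find_path(grid, (0, 0), (2, 3))
```

Real navigation stacks add sensing, costs, and continuous motion, but the core question is the same: find a traversable route from here to there.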


Intel works closely with researchers to identify novel ways for a robot to perform its mobility tasks. Intel is particularly interested in machine-vision libraries that can be used to perform localization and mapping based on monocular- or stereo-vision systems. For example, right now, most robots navigate by using infrared or radio waves to avoid objects in their paths. However, Intel software researchers recently developed several libraries that are directly applicable to robotics systems. Intel's computer vision library is already used extensively by vision researchers.



Intel has also released a test version of a technical library for building Bayesian networks to support machine-learning activities. Bayesian networks are a form of probability-based artificial intelligence. Such a network would let a robot navigate by matching sensor data to a map stored in its memory.
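As an illustration of the idea (this is a generic textbook sketch, not Intel's library), a discrete Bayes filter lets a robot localize itself by matching a sensor reading against a stored map. Each map cell holds a prior probability that the robot is there; a reading reweights the cells and the result is renormalized.

```python
def bayes_update(belief, world, reading, p_hit=0.9, p_miss=0.1):
    """One measurement update of a discrete Bayes filter.
    belief:  prior probability for each corridor cell.
    world:   the map stored in memory, e.g. ['door', 'wall', ...].
    reading: what the sensor reports right now.
    p_hit/p_miss: assumed sensor accuracy (illustrative values)."""
    posterior = [b * (p_hit if feature == reading else p_miss)
                 for b, feature in zip(belief, world)]
    total = sum(posterior)
    return [p / total for p in posterior]  # normalize to sum to 1

world = ['door', 'wall', 'door', 'wall', 'wall']
belief = [0.2] * 5                       # uniform: no idea where we are
belief = bayes_update(belief, world, 'door')
# Probability mass now concentrates on the two door cells.
```

Repeating the update as the robot moves and senses sharpens the belief, which is exactly the "match sensor data to a stored map" behavior described above.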

Gateways into sensor networks

Two technologies in particular seem to be moving toward an interesting convergence: mobile robotics and wireless sensor networks. The two main questions here are:



  • Can a mobile robot act as a gateway into a wireless sensor network?


  • Can sensor networks take advantage of a robot's mobility and intelligence?


One major issue with a mobile robot acting as a gateway is the communication between the robot and the sensor network. Sensor networks typically communicate using 900 MHz radio waves. Mobile robots use laptops that communicate via 802.11, in the 2.4- to 2.483-GHz range. Intel hopes to prove that a sensor net can be equipped with 802.11 capabilities to bridge the gap between robotics and wireless networks.

Intel recently demonstrated how a few motes equipped with 802.11 wireless capabilities can be added to a sensor network to act as wireless hubs. Other motes in the network then use each other as links to reach the 802.11-equipped hubs. The hubs forward the data packets to the main 802.11-capable gateway, which is usually a laptop. Using some motes as hubs cuts down on the number of hops any one data packet has to make to reach the main gateway. It also reduces power consumption across the sensor net.
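The hop-saving effect of the hubs can be sketched with a multi-source breadth-first search (an illustrative model, not Intel's actual routing code): each mote's cost is its hop count to the nearest 802.11-equipped hub over mote-to-mote radio links.

```python
from collections import deque

def hops_to_nearest_hub(neighbors, hubs):
    """Multi-source BFS: hop count from every mote to its nearest
    802.11-equipped hub, following mote-to-mote radio links.
    neighbors: dict mapping mote id -> list of motes it can hear."""
    dist = {h: 0 for h in hubs}
    frontier = deque(hubs)
    while frontier:
        node = frontier.popleft()
        for nxt in neighbors[node]:
            if nxt not in dist:
                dist[nxt] = dist[node] + 1
                frontier.append(nxt)
    return dist

# A chain of six motes, each hearing only its immediate neighbors.
chain = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
one_hub = hops_to_nearest_hub(chain, hubs=[0])
two_hubs = hops_to_nearest_hub(chain, hubs=[0, 5])
```

With a single hub at one end, the farthest mote needs five hops; promoting the mote at the other end to a hub cuts the worst case to two. Fewer hops per packet means fewer radio transmissions, which is where the power savings come from.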

Intel believes that one of the most interesting technology convergences will be in designing mobile robots that can act as gateways into the wireless sensor networks. For example, Intel recently installed small sensors in a vineyard in Oregon to monitor microclimates. The sensors measured temperature, humidity, and other factors to monitor the growing cycle of the grapes, then transmitted the data from sensor to sensor until the data reached a gateway. There, the data was interpreted and used to help prevent frost damage, mold, and other agricultural problems.

The agricultural example shows just how a sensor network could take advantage of a mobile robot's capabilities. Over time, sensors need to be recalibrated, just like any other measuring equipment. If a robot could act as a gateway to the sensor network, it could automatically perform tasks such as calibration. For example, a robot could periodically collect data along the network, determine which sensors are out of tolerance, move to the appropriate location, and recalibrate each out-of-tolerance device.
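The calibration pass described above reduces to a simple comparison step. The sketch below (hypothetical names and tolerance, not from the article) shows how a gateway robot might decide which sensors to visit: compare each reported value against a trusted on-board reference and flag anything outside tolerance.

```python
def calibration_plan(readings, reference, tolerance=0.5):
    """Compare each sensor's reading against a trusted reference value
    and return the IDs of sensors the robot should visit and recalibrate.
    readings: dict of sensor id -> reported value (e.g. temperature in C)."""
    return [sensor_id for sensor_id, value in sorted(readings.items())
            if abs(value - reference) > tolerance]

# Hypothetical vineyard temperature readings; the robot's own
# calibrated probe reads 21.0 C at the same spot.
readings = {'vine-03': 21.1, 'vine-07': 24.9, 'vine-12': 20.8}
out_of_tolerance = calibration_plan(readings, reference=21.0)
```

In practice the reference would come from the robot's own calibrated instrument at each sensor's location, and the "recalibrate" step would be a physical visit; the flagging logic, though, is this simple.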

To investigate the use of mobile robots as gateways to such wireless sensor networks, Intel is bringing in a Ph.D. candidate from the University of Southern California, under the guidance of professor Gaurav Sukhatme. The candidate will work with Intel on integrating wireless sensor networks into robotics research for localization techniques. This type of collaboration is just one example of how Intel is promoting the convergence of microelectronics and robotics.

Numerous collaborations on robotics projects

Overall, Intel is working with approximately 20 robotics research groups, including Carnegie Mellon University (CMU), University of Southern California (USC), University of Pennsylvania, Northwestern, and Georgia Tech. Intel is also in discussions with universities and robotics manufacturers, such as Sony, about robotic dogs, and Honda and Samsung on using Intel silicon to build robotic humanoids. Intel is also in discussion with NASA and DARPA (the Defense Advanced Research Projects Agency) on several major projects.

Other pilot projects include professor Sebastian Thrun's CMU research into an aerial mapping helicopter (photo below), which is currently about 4 feet in length and which has been demonstrated in certain DARPA programs. Acroname is also using Intel's open-source robotics package in their latest commercial robot, called Garcia (see photo at beginning).




Sebastian Thrun's aerial mapping helicopter




In other collaborations, professor Balch of Georgia Tech is using Intel technology to develop hundreds of mobile robots in order to model the swarm behavior of insects. Professor Vijay Kumar is using Intel's XScale boards (photo below) and open-source software for off-road robot investigations. Professor Illah Nourbakhsh is teaching mobile robot programming using new robotics systems with Intel XScale boards and the Linux operating system.




Intel boards are being used in a number of robotics projects



Robotics task force



The thrust of Intel's robotics effort is to reduce the cost and engineering required to build small, powerful, sophisticated robots. This thrust, however, requires standards and protocols. Right now, robotics standards and protocols are in their infancy. With technology convergence becoming increasingly important in Intel's areas of interest, Intel is leading industry efforts for the Robotics Engineering Task Force (RETF).



The RETF is modeled after the Internet Engineering Task Force (IETF). The RETF allows government and university researchers to work together to establish standard software protocols and interfaces for robotics systems. Currently, government representatives include researchers from NASA, DARPA, and NIST (the National Institute of Standards and Technology). All told, approximately 35 government and university researchers are already participating in the RETF.



The most pressing issue for the RETF is devising standards for commanding and controlling the mobile robots. The task force has already defined a charter to develop standards for robotics systems. A working draft of the first framework document is now being reviewed for comments.



The task force has also begun work on standards for bridging networks, on protocols, and on application programming interfaces (APIs). Current issues being discussed include intellectual property rights and copyright. The task force hopes to begin work on full specifications as soon as the framework document is approved. The task force expects to publish its work as open-source code when the work is complete, something it hopes to finish in about two years.

Standardized building blocks



As one of the industry leaders of the RETF, Intel is devising low-cost reference designs for relatively small robots. The reference designs are based on Intel XScale microprocessors and Intel Centrino mobile technology, flash memory, and 802.11 wireless networking with built-in support for wireless sensor networks. The designs give researchers an intermediate scale between the embedded microcontrollers currently used inside robots and the large-scale laptops used for mobile intelligence.



The robotics package also includes the open-source Linux 2.4.19 operating system, as well as a multitude of open-source drivers. Drivers include vision-system drivers for sensing infrared, drivers for ultrasonic devices that measure the distance from a robot to objects in the robot's environment, and so on. The software platform also supports Java applications, and integrates USC's Player device server for robotics systems. All elements in the open-source robotics package are wirelessly connected using 802.11 networks.
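The ultrasonic distance measurement mentioned above rests on one piece of arithmetic: sound travels out to the object and back, so distance is half the round-trip time multiplied by the speed of sound. A minimal sketch (generic physics, not code from Intel's driver package):

```python
SPEED_OF_SOUND = 343.0  # meters per second in air at roughly 20 C

def echo_to_distance(round_trip_s):
    """Convert an ultrasonic echo's round-trip time (seconds) into the
    distance from the robot to the reflecting object, in meters.
    The pulse travels out and back, hence the division by two."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

def too_close(round_trip_s, stop_distance_m=0.3):
    """Simple obstacle check a client of such a driver might make
    before commanding the robot forward (threshold is illustrative)."""
    return echo_to_distance(round_trip_s) < stop_distance_m
```

A 10 ms echo, for instance, corresponds to an object about 1.7 meters away, comfortably outside a 0.3-meter stopping threshold.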



With internal robot systems standardized, researchers and developers will not have to reinvent the wheel for each robot's brain. Instead, developers can spend more time on mobility, visual recognition systems, and the software for artificial intelligence (AI).
