Intel Pentium M Processors Power DARPA Race Winners
By Robotics Trends Staff - Filed Jan 20, 2006
Three Intel-sponsored robotic vehicles, all packing Intel technology, were the top finishers in the second DARPA Grand Challenge race. The Stanford University racing team took the $2 million prize with its autonomous Volkswagen Touareg named Stanley, using sophisticated artificial-intelligence techniques to cover 132 miles of Nevada desert in just under 6 hours and 54 minutes, an average of just over 19 miles per hour.

Second and third places in the race, sponsored by the Defense Advanced Research Projects Agency (DARPA), went to a pair of vehicles from Carnegie Mellon University’s Red Team. CMU’s Humvee Sandstorm finished just over 10 minutes behind the leader, and its Hummer H1ghlander was 20 minutes behind.

The brains of all three winning vehicles were identical, and were donated to Stanford and Carnegie Mellon by Intel. Several Intel engineers worked separately with the rival teams. Each vehicle used computers with six Intel Pentium M processors, which consumed little enough power to run off the alternator in winner Stanley (Sandstorm and H1ghlander had auxiliary power generators supplied by Caterpillar Inc.). Instead of laptops or large servers, the six Intel Pentium M processors were packaged as blades in a rugged platform designed to be earthquake-proof, with no spinning hard disks and a spike-resistant power supply.

Twenty-three autonomous ground vehicles that had made it to the finals (from 195 original teams) started the race around dawn on October 8, but only five crossed the finish line, and one of those exceeded DARPA’s 10-hour time limit. The vehicles had no drivers and were not operated via remote control, and DARPA scanned the course during the race to make sure there was no human interference.

Even though only five vehicles completed the race, the outcome was still impressive compared with last year’s Grand Challenge, when every entrant either stalled or crashed less than 8 miles into the race. Last year’s starters tried to win with new sensor hardware. This year, the top three finishers concentrated much more on software to digest the sensor readings and also focused on building rugged, reliable vehicles. “We focused on reliability from the start of this effort, and reliability won the race in the end,” said Gary Bradski, manager of the Machine Learning group in Intel Research, who worked closely with the winning Stanford team. Bradski spent part of his sabbatical working on Stanley and recruited Intel engineers Adrian Kaehler, Bob Davies and Ara Nefian to help with the Stanford vehicle.

Rugged Intel Processors Stand Up to Desert Tests
Each winning vehicle used one Intel 5091 chassis and six MPCBL5525 1.6-GHz Intel Pentium M processor modules, which functioned as the brains of the robotic vehicles. They were packaged in a blade form factor and conformed to telecom specifications, making them more rugged and better able to take the shocks experienced during the race.

The platforms with Intel Pentium M processors provided the computing capability, controlling all aspects of the vehicle and significantly decreasing the time between seeing an obstacle and actually hitting the brakes. All the modules delivered superior, reliable performance on an extremely low power budget. Several other components, including the LIDARs (light detection and ranging sensors, which sweep a laser beam back and forth to measure distance), cameras, cables, alternators, fans, drive-by-wire hardware, and GPS (global positioning system) units, had failures, but the Intel Pentium M processor blade servers continued to run through dust, insects, and numerous other challenges.

“Near disasters occurred on this project almost every week,” said Richard Vireday, a senior software engineer with Intel’s Enabling Platforms and Services group who worked with the Carnegie Mellon teams. “There were hard disks failing all the time, vehicle rollovers, quarter-inch-thick dust on the blades, mechanical difficulties, drive-by-wire failures, desert heat, you name it. There are a thousand stories the teams can now tell, and we only witnessed parts of them.”

“The computers proved rock solid for both Stanford and CMU,” Vireday continued. “While Stanford’s vehicle experienced numerous failures and glitches in various components, the team never had problems with the blades. Pentium Ms were chosen for reliability, power consumption and performance—they were the best combination of all three and worked well in both the Stanford and CMU vehicles.”

Intel Technologies Behind Winner Stanley
The Stanford vehicle, Stanley, was based on a stock, Diesel-powered Volkswagen Touareg R5, modified with full body skid plates and a reinforced front bumper. Stanley was actuated via a “drive-by-wire system” (which allows the computer to control the steering, braking, acceleration and suspension of the vehicle) developed by Volkswagen of America’s Electronic Research Lab.

The low-power Intel Pentium M processor-based platforms were critical to the design of the Stanford vehicle, as they allowed operation from an alternator on the Volkswagen Touareg engine. Several other teams required an on-board generator to power their computing systems. Because the systems on Stanley were all powered from the Touareg’s alternator, power consumption had to be kept to a minimum.

Additional components on Stanley included one Intel® NetStructure® MPCHC 5091 chassis (a rugged enclosure that provides efficient side-to-rear cooling), an Intel NetStructure® ZT 7102 3U Chassis Management Module, three redundant Intel NetStructure® ZT 6303 Hot Swap AC Power Supplies, and a ZT 8102 Gigabit Ethernet switch that handled the video camera feed, communication between the blades, and the connection to all external networks.

During the race, Stanley had to gather and process data from a variety of sensors for obstacle detection, sensing, and terrain mapping, then use that data to plan the optimal driving surface and navigate it autonomously through real-time control of the drive-by-wire systems. To conserve power, only three blades were used during the race: two provided the vehicle’s computing capability, while the third collected and logged the video packets used by the vision subsystem for navigation.
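
The article does not include the teams’ code, but the paragraph above describes a classic sense-plan-act cycle. The following Python sketch shows the general shape of such a loop; every class, method, and rate here is a hypothetical illustration, not the Stanford team’s actual software.

```python
import time

CYCLE_HZ = 10                 # illustrative planning rate (the article cites 10 Hz below)
CYCLE_S = 1.0 / CYCLE_HZ

def control_loop(sensors, planner, drive_by_wire):
    """Repeatedly read sensors, plan a path, and command the actuators."""
    while drive_by_wire.mission_active():
        start = time.monotonic()

        scan = sensors.read_lidar()             # obstacle ranges
        pose = sensors.read_gps_imu()           # position and orientation
        grid = planner.update_map(scan, pose)   # fuse readings into a terrain map

        path = planner.best_path(grid, pose)    # choose the safest drivable route
        drive_by_wire.steer(path.steering_angle)
        drive_by_wire.throttle(path.target_speed)

        # Sleep out the rest of the 100 ms cycle to hold a steady planning rate.
        time.sleep(max(0.0, CYCLE_S - (time.monotonic() - start)))
```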

Intel Team Helps Stanley “See”
A team from Intel worked with the Stanford team to develop Stanley’s computer vision software. To “see,” the vehicles all needed to coordinate laser-range sensors and cameras to comprehend the world beyond their bumpers. The robots needed to determine safe paths to travel, set the safest speed for a particular terrain, and process GPS signals to identify location and orientation.

Stanley’s sensors included five LIDARs for low-speed obstacle detection, a GPS receiver and compass for position sensing, a monocular vision system, a six-degrees-of-freedom inertial measurement unit, and a wheel speed sensor.

An array of sensors plus the GPS determined the vehicle’s position to within one meter, and software refined this accuracy down to within one foot. Lasers mounted on the roof continuously scanned the ground in front of Stanley looking for obstacles. All sensors acquired environmental data at rates between 10 and 100 Hertz. Map and pose information were incorporated at 10 Hz, enabling Stanley to avoid collisions with obstacles in real time while advancing along the 2005 race route. All of this data had to be analyzed 10 times a second to weigh the relative safety of many possible routes and pick the optimal path.
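
As an illustration of the path-evaluation step described above, the sketch below scores a handful of candidate paths against a fused obstacle grid and keeps the safest one. The grid layout, cell size, and scoring rule are assumptions made for the example, not the Stanford team’s actual algorithm.

```python
import numpy as np

def pick_path(obstacle_grid, candidate_paths, cell_size_m=0.25):
    """Return the index of the candidate path with the lowest collision risk.

    obstacle_grid: 2-D array of occupancy values (0 = clear, 1 = blocked).
    candidate_paths: list of (N, 2) arrays of (x, y) points in metres, vehicle frame.
    """
    best_idx, best_cost = None, float("inf")
    for i, path in enumerate(candidate_paths):
        # Convert metric points to grid cells and clip them to the map bounds.
        cells = np.clip((path / cell_size_m).astype(int),
                        0, np.array(obstacle_grid.shape) - 1)
        # Accumulate the obstacle risk along the path.
        risk = obstacle_grid[cells[:, 0], cells[:, 1]].sum()
        if risk < best_cost:
            best_idx, best_cost = i, risk
    return best_idx
```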

The vision technologies were based on Intel’s Open Source Computer Vision Library, which includes Intel’s machine learning software. These libraries were instrumental in the video processing and worked with the vehicle’s laser vision system to expand Stanley’s range and depth of vision so the vehicle could drive faster. Data from the LIDARs was fused with images from the vision system to perform more distant look-ahead, while computer algorithms learned the terrain and mapped out an optimal driving surface. If a path of drivable terrain could not be detected for at least 40 meters in front of the vehicle, speed was reduced and the LIDARs were used to locate a safe passage. The acceleration provided by the Intel compiler allowed the path planner to consider double the number of paths and thus improve its safety margins.
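
A rough sense of how look-ahead vision can govern speed is sketched below: near-field pixels already labelled drivable (for example by the LIDARs) seed a simple color model, the model classifies the rest of the image, and the vehicle slows unless drivable terrain is confirmed out to roughly 40 meters. This is a simplified stand-in for the self-supervised approach described above; the function names, thresholds, and speed values are invented for the example.

```python
import numpy as np

LOOKAHEAD_M = 40.0               # look-ahead distance the article cites for full-speed driving
FAST_MPS, SLOW_MPS = 15.0, 5.0   # illustrative speed choices, not Stanley's actual values

def drivable_mask(image_rgb, seed_mask, threshold=3.0):
    """Label pixels whose color is close to the 'known drivable' seed region.

    image_rgb: HxWx3 uint8 camera image.
    seed_mask: HxW bool array of near-field pixels already marked drivable (e.g., by LIDAR).
    """
    pixels = image_rgb.reshape(-1, 3).astype(float)
    seed = pixels[seed_mask.reshape(-1)]
    mean = seed.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(seed, rowvar=False))
    diff = pixels - mean
    # Squared Mahalanobis distance of every pixel from the seed color model.
    dist = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)
    return (dist < threshold ** 2).reshape(image_rgb.shape[:2])

def choose_speed(mask, row_to_range_m):
    """Slow down unless drivable terrain is visible out to LOOKAHEAD_M.

    row_to_range_m maps an image row to its ground distance in metres
    (assumed known from camera calibration).
    """
    rows_with_road = np.nonzero(mask.any(axis=1))[0]
    if len(rows_with_road) == 0:
        return SLOW_MPS
    farthest = row_to_range_m(rows_with_road.min())  # smaller row index = farther away
    return FAST_MPS if farthest >= LOOKAHEAD_M else SLOW_MPS
```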

Intel performance tuning tools and libraries, such as the Intel® Integrated Performance Primitives, enabled the vision software to run faster on the Intel processors, which, combined with leading-edge source code, reduced latency to one-tenth-of-a-second response times, on par with or better than typical human drivers. The Intel C++ Compiler sped up the execution of critical code sections.

Intel Software Used in the CMU Vehicles
The Carnegie Mellon team used more complicated software, resulting in a latency closer to a quarter of a second. “CMU spent more time on mapping, but we made no adjustments to Stanley’s map,” said Bradski. “Instead, we let the planner decide the safest route from the real-time data from the sensors, using the GPS corridor map for general goals.”

Brad Chen, the director of Intel’s Performance Tools Lab, worked closely with the Carnegie Mellon team. “We used much better algorithms this year to connect the dots between the GPS waypoints, since our understanding of the GPS data was in error last year,” said Chen. “Also, during last year’s race, we hit an obstacle that threw us off course. This year we had much more accurate steering, as well as more computer power, which decreased the time between the vehicles seeing an obstacle and actually hitting the brakes.”

Carnegie Mellon used the software development system from Chen’s lab, along with its longer-range gimbal-mounted LIDAR, to enable the H1ghlander and Sandstorm vehicles to run at up to 40 mph (~65 km/h) without having to tap the computer vision system.

“Intel’s performance-tuning tools made a difference between the CMU vehicles being able to run at 40 mph vs. 10 mph,” said Chen. “Also, the embedded software used for the drive-by-wire systems from Caterpillar used a lot of cutting-edge source code. That helped reduce our latency to a quarter-second response time, which is typical of human drivers.”

CMU’s software hierarchy was complicated by a larger complement of sensors, including six fixed LIDARs, two radars, and stereo cameras mounted on the gimbal with a seventh LIDAR. Still, the CMU vehicles trailed winner Stanley by just minutes.

A New Era in Automotive Safety
The race results may signal a new era in automotive safety and military applications. Many auto accidents are caused by inattentive drivers. The Grand Challenge has opened the possibility of new innovations and applications that can promote the development of safer vehicles through automation.

Volkswagen was already experimenting with the drive-by-wire system used in Stanley. VW eventually wants to use the system in commercial vehicles to enable them to automatically avoid collisions. (Carnegie Mellon used a drive-by-wire system designed by Caterpillar, one of the team’s sponsors.) Automating navigation, control, and avoidance technologies in the future could significantly reduce the traffic accident rate.

“The DARPA Grand Challenge may be over, but the age of autonomous vehicles has begun,” said Bradski.

About the DARPA Grand Challenge
Twenty-three finalists from among 195 teams from 36 states and four foreign countries competed in DARPA’s 2005 Grand Challenge, setting off just minutes before sunrise on October 8 at five-minute intervals. For several months prior to the race, these teams advanced to the final event by completing a series of rigorous tests designed to assess their capability of completing the desert course. The results demonstrate that autonomous ground vehicles can travel long distances over difficult terrain at militarily relevant rates of speed.

At 4 AM, two hours before the race began, DARPA gave participants GPS (global positioning system) waypoints defining the course, the corridor width (how far the vehicles could stray from a straight line drawn between consecutive waypoints), and maximum speeds. To navigate the course, the vehicles required ultra-high-performance computers with the most advanced mapping, sensing, and reasoning systems. The course mimicked driving conditions in Iraq and Afghanistan, including winding dirt trails and dry lake beds filled with overhanging brush. Parts of the route forced the robots to zip through three tunnels designed to knock out their GPS signals. Vehicles had a maximum of 10 hours to complete the course.
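
As a concrete illustration of the route definition described above, the sketch below checks whether a vehicle position lies inside the corridor around one waypoint segment. The flat x/y coordinates and function names are assumptions for the example; real systems first convert latitude/longitude to a local metric frame.

```python
import numpy as np

def corridor_ok(position, wp_a, wp_b, corridor_width_m):
    """True if 'position' lies within the corridor around the segment wp_a -> wp_b.

    All arguments are 2-D points in metres in a local ground frame;
    corridor_width_m is the full corridor width.
    """
    p, a, b = map(np.asarray, (position, wp_a, wp_b))
    seg = b - a
    # Project the position onto the segment, clamped to its endpoints.
    t = np.clip(np.dot(p - a, seg) / np.dot(seg, seg), 0.0, 1.0)
    closest = a + t * seg
    return np.linalg.norm(p - closest) <= corridor_width_m / 2.0

# Example: a vehicle 3 m off the segment centerline, inside a 10 m wide corridor.
print(corridor_ok((50.0, 3.0), (0.0, 0.0), (100.0, 0.0), 10.0))  # True
```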

The vehicles had to drive without help from humans and DARPA took precautions to protect against cheating, conducting wireless surveillance of the course and looking for rogue radio signals. For safety, DARPA personnel in pickups followed each vehicle, and all entries were equipped with a “kill switch” to enable the chase vehicles to stop them in emergencies. Spectators tracked race progress via remote broadcast of DARPA transponders in the vehicles.

DARPA created the competition in response to a congressional mandate for one-third of the U.S. Army’s vehicles to be unmanned by 2015 to reduce supply line casualties in combat environments. The Grand Challenge was an open call for contestants to design autonomous robotic ground vehicles with the goal of successfully navigating a desert course. An autonomous vehicle cannot have a driver or be driven via remote control. It needs to think, move, and avoid obstacles on its own while maneuvering through unknown terrain.

Summary
Intel Pentium M processors provided the brains for the first three robotic vehicles to cross the finish line in the DARPA Grand Challenge race across the Nevada desert. All three winning vehicles used identical blade computers, donated to Stanford and Carnegie Mellon by Intel. Each vehicle used six Intel Pentium M processors, packaged as blades in a rugged, earthquake-proof platform with no spinning hard disks and a spike-resistant power supply.

A number of Intel technologies and software were also used in the winning vehicles, including software that enabled the vision technologies.

The completion of the race may signal a new era in automotive safety for civilians and the military alike, setting the stage for new innovations and applications that can promote the development of safer vehicles through automation. 
