It could be a scene straight out of The Jetsons: a car programmed to pick up the kids from soccer practice. No more mom and dad as chauffeurs.
Such a science-fiction scenario is not that far-fetched.
On Oct. 26, robotic vehicles, including one from Stanford University and another developed by a German team led by a Palo Alto woman, will drive 60 miles of roads at the former George Air Force Base in Victorville, Calif. -- without drivers.
Following the qualifying event, 20 of the competitors will move on to the final DARPA Urban Grand Challenge on Nov. 3, also in Victorville. The robotic vehicles will compete for $3.5 million in cash prizes.
In 2001, Congress mandated that by 2015, one-third of ground combat vehicles should be unmanned. To develop new technologies for military applications, the Defense Advanced Research Projects Agency (DARPA) of the U.S. Department of Defense sponsors the robot-car competition.
The Stanford Racing Team won $2 million in the 2005 DARPA Desert Challenge with their Volkswagen Touareg named "Stanley."
In a little more than one week, Stanford hopes to take the top prize again -- this time, in an urban challenge, where the robots must navigate stop signs, intersections, parking spaces and other robotic cars.
Stanley is in the Smithsonian Institution now. "Junior" -- the son of Stanley, named for Leland Stanford Jr. -- will take the road to Victorville.
A 2006 German-import VW Passat, Junior is an ideal car, the researchers said. It has a drive-by-wire control system, which makes it responsive to electrical signals and inexpensive to modify, according to team spokesman David Orenstein.
In the final days before heading to Victorville for the qualifier, Junior's testing team was in a parking lot adjacent to Shoreline Amphitheater in Mountain View, working intensively to fine-tune the system they hope will take the prize.
A look inside the trunk space of the cobalt-blue station wagon reveals two racks of computers with two quad-core servers, GPS navigation, radar systems and an emergency-control system in the event the computer malfunctions.
Laser and radar sensors mounted on the front hood and rear fenders allow the robot to sense the distance and position of trees, curbs and other obstacles. A laser scanner mounted on the roof, spinning 10 times per second, builds three-dimensional images of Junior's surroundings using 64 individual lasers.
The car also has a small diesel engine and can travel many miles on one tank of fuel. There is no fill-up station on the course.
Mike Montemerlo, a senior research engineer at Stanford University's Artificial Intelligence lab, heads Junior's testing team. He also took part in the 2005 Desert Challenge, where brains beat brawn. Stanford's sturdy Touareg looked wimpy compared to the two monster all-terrain vehicles brought in by the competition's big gun, Carnegie Mellon University, the heavy favorite to win. But Stanley pulled an upset, successfully completing a 132-mile desert-terrain course in just 6 hours, 53 minutes and 58 seconds.
The prize money funded a permanent fellowship at Stanford -- the Stanley Scholar -- as well as further research, Montemerlo said.
"We concentrate on the software. We think the DARPA Challenge is a software competition," Montemerlo said.
Pointing to the spinning laser on Junior's roof, he noted that it refreshes the robot's model of the world and makes a new driving decision 10 times per second.
Junior's 360-degree view of the world is captured on a laptop screen situated inside the car. The scans look like record grooves or lines on an elevation map, with people, trees, pieces of concrete or other moving and stationary objects all captured in the imagery. The computers also create "hot" and "cold" areas on other maps, showing the car where it can and cannot go.
Montemerlo downloaded a program containing the route Junior would take. The car would start at the far end of the parking lot and drive north to a virtual intersection. It would encounter stop signs and make U-turns, drive alongside another vehicle, move around it, then park in a parking lane, back out and drive to a final drop-off point near the curb. The computer screen showed a map of the route and tracked the car's movements -- where it would go, and where it was in real time. If the map showed double yellow lines, Junior acted as if those lines were really there, Montemerlo said.
Dirk Langer, a researcher with the Volkswagen of America Electronics Research Lab, which supports Stanford's team and offers some technical assistance, sat in the driver's seat. Dirk Haehnel, a software engineer at Stanford who is responsible for perception development, sat in the front passenger seat, monitoring Junior's every move.
Langer flicked a panel of switches, turning the steering, brakes and power systems over to Junior. Then, he sat back, hands resting in his lap. The high-tech lasers twirled like pinwheels. The steering wheel jerked and spun, as if controlled by a poltergeist. Junior moved forward, making ungainly movements like a teenage driver practicing for a driver's license.
"He's not very smooth, but we hope he is reasonably safe," Langer said.
Junior reached an intersection and braked. Perceiving the path free of other vehicles, the robot negotiated a traffic circle and changed lanes. It made a left at an intersection, turning on the directional, then made a right and pulled into a parking spot. Junior backed up and drove around a parked Touareg.
"He understands he can't go through the parking spot," Montemerlo said.
Montemerlo tracked every move; he has run the drill so many times he knows each maneuver by heart.
"He's actually using the turn signals. Now he stops at the stop sign and waits for oncoming traffic. He does a lane change in order to make the right turn through the cones and then a left turn to the finish line," he said.
Junior had just completed a simulated mission. Next, the team put the robot through a series of simulated bad-road conditions. By its second trial run, Junior's car-sickness-inducing lurches had been smoothed out, and an errant directional -- signaling right when the car was turning left -- had been corrected.
All of these details must be scrupulously refined before race day. The teams do not know what the DARPA Challenge course will look like. Only minutes before the race, they will download the course the robots are expected to follow.
"It's a hard challenge compared to the previous race. There are so many unknown facts," Langer said. But he added that the team is confident it has a good chance of winning.
Twelve days before the challenge, on a Sunday afternoon, the Stanford team was testing Junior and Junior 2. The second car is a twin of the first, but emblazoned with decals of the team's sponsors: Applanix, Google, Intel, Mohr Davidow Ventures, NXP, Red Bull, Stanford Engineering and Volkswagen. The twin is the actual car that will run in the competition, but Junior is the one put through the paces.
A silver Passat entered the adjacent parking lot, separated from Junior by a grassy knoll. Heads turned.
The AnnieWAY had arrived.
Headed up by Palo Altan Annie Lien, the team is a spin-off of the German Research Foundation's Collaborative Research Center on Cognitive Automobiles. Four German research institutions are involved in the project: the University of Karlsruhe, the Technical University of Munich, the Fraunhofer Gesellschaft and the University of the Bundeswehr Munich.
Lien isn't sure why the car is named after her, but she is the only woman on the team. The guys named it, she said, and they smiled. Lien worked at the Robert Bosch Research and Technology Center in Palo Alto as a user-interaction specialist until recently and is about to begin a job at Volkswagen as a user-experience researcher, where her interest is in human/computer interfaces.
The team has no sponsors. All the funding comes from the Collaborative Research Center, she said.
The AnnieWAY has four processors in an off-the-shelf computer and a dSPACE box, which provides low-level control "in case the brain fails," said Soeren Kammel, the technical team leader from the University of Karlsruhe.
A navigation system tracks the car's position and a spinning laser scanner, like the one Stanford uses, is mounted on the roof. AnnieWAY uses color cameras to detect lane markers and has an emergency-braking system in the event the car loses control, he said.
"For the race, the most important thing is that the intelligence can cope with or follow traffic rules, such as how to decide what to do when two cars simultaneously enter an intersection. It moves forward slowly to see what the other car is doing," team member Ben Pitzer, a Ph.D. candidate who works at Bosch, said.
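The precedence behavior Pitzer describes -- proceed if you clearly arrived first, creep forward on a tie to see what the other car does -- can be sketched as a tiny decision rule. Everything here, including the half-second tie window and all names, is invented for illustration and is not the team's code:

```python
TIE_WINDOW = 0.5  # hypothetical: arrivals closer than this count as simultaneous

def next_action(my_arrival, other_arrivals, others_moving):
    """my_arrival / other_arrivals: stop-line arrival times in seconds.
    others_moving: True if another car is already in the intersection."""
    if others_moving:
        return "wait"    # never enter an occupied intersection
    if any(abs(my_arrival - t) < TIE_WINDOW for t in other_arrivals):
        return "creep"   # simultaneous arrival: inch forward, watch the other car
    if all(my_arrival < t for t in other_arrivals):
        return "go"      # clear precedence: we arrived first
    return "wait"        # the other car arrived first

print(next_action(10.0, [12.0], others_moving=False))  # prints go
print(next_action(10.0, [10.1], others_moving=False))  # prints creep
print(next_action(10.0, [9.0], others_moving=True))    # prints wait
```

The "creep" state mirrors what Pitzer describes: the robot resolves the ambiguity by moving slowly and observing whether the other car commits.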
"The system is very modular. All of the processes communicate with each other," he added, demonstrating the various mapping systems monitored from a laptop in the back seat. "It divides the environment into areas it can drive in. Black is where it can drive; red is where it can't drive. (The laser) can scan up to 120 meters away."
He switched to another program. A 3-D image of the AnnieWAY appeared at the center of concentric rings of black lines. Some of the lines rose from the two-dimensional rings to become little blobs of denser lines. The blobs stretched and moved around the car's image. These were people as AnnieWAY perceives them.
The tracking system is precise, but not so sensitive that it picks up leaves, Pitzer said. It is looking for dynamic objects -- such as people, animals and moving cars -- or larger inanimate objects.
Kammel relinquished the car's controls to the robot. The AnnieWAY jolted in short hops. A problem with the safety-braking feature caused the car to lurch, but once the feature was disabled, the car moved smoothly, its steering system sensing the road's nuances.
"Even at night, it will still sense where it needs to go," Pitzer said.
As amazing as autonomous cars appear to be, there is still a steep learning curve, he added.
"There are a lot of situations where you see how good human reason does and how bad the computer does," he said.
In its current configuration, the robot can't distinguish among different kinds of moving objects, although living beings have movement patterns that set them apart. A computer can use the motion of shoulders or necks to differentiate people from trees, Kammel said.
Sebastian Thrun, project leader of the Stanford Racing Team, said making the right choice of action -- the choice of "What shall I do?" -- is more difficult than one might think. It's the nuances of communication -- the wave or nod a person gives or gets when two cars meet at an intersection -- that are distinctly human.
"General-purpose understanding remains unsolved. It's a mystery of human intelligence. ... You can't get an accurate matching of words and images," he said.
Lien agreed that one of the challenges to engineers is the tendency for other people to anthropomorphize machines. The AnnieWAY team never refers to the car as "she." In part, that may be because the team mostly speaks German, and the term for car is gender neutral, she said.
"Humans really treat machines as if they are humans. As technology gets smarter, humans expect them to do things humans do. The machine can do quite a lot. It can calculate numerical value. In a sense it is smarter than us," she said, "but they are still machines."