A thousand spectators held their breath as a Toyota Prius named Little Ben slowed down, activated its right-turn signal and approached a cluttered intersection in the California desert.

At this premier competition of the world’s smartest robotic vehicles, the 2007 DARPA Urban Challenge, the scene awaiting Little Ben looked like an accident waiting to happen. It would have daunted a human driver, requiring steady nerves, navigational skill, and precise peripheral vision and depth perception.

Three other driverless cars – three of the 11 finalists in the Challenge – were blocking the route Little Ben needed to take to reach its goal.

Knight Rider, a Subaru Outback fielded by the University of Central Florida team, had arrived first on the spot. But Knight Rider had inexplicably stopped at the intersection on Ben’s right, and the robot’s human-operated DARPA “chase vehicle” had then stopped two car lengths behind the Outback.

Skynet, a Chevy Tahoe entered by Cornell University, had halted behind Knight Rider’s chase vehicle, pulled around to pass, then stopped again – in the wrong lane.

Further complicating matters, Talos, a Land Rover belonging to the MIT team, had also stopped and was facing Little Ben from across the intersection.

Meanwhile, several other chase vehicles idled nearby, their drivers poised to use remote control if Little Ben careened toward one of the stalled robots in an ill-advised attempt to get through.

Little Ben, product of the Ben Franklin Racing Team of Lehigh, the University of Pennsylvania and Lockheed Martin, began executing its right turn. Shouts of “Oh, no!” rose from the crowd watching the race on a Sony JumboTron.

A single question, says John Spletzer, seemed to be going through everyone’s mind: Could a robotic car evaluate this little slice of chaos and respond as deftly as an experienced human driver would?

“There were already three robotic cars at the intersection that didn’t know what to do,” says Spletzer, an assistant professor of computer science and engineering and lead Lehigh member of the Ben Franklin Racing Team. “Adding a fourth robot was a recipe for disaster.”

But Little Ben, with an elaborate system of laser and camera sensors, had already seen what it needed to make its decision. There was daylight separating Knight Rider, Skynet and the chase vehicles. As the Prius eased between its stationary rivals and continued on its way, the JumboTron viewers stood and cheered.

“That was the highlight of the race,” says Spletzer. “Ben didn’t know why the other robots had stopped. But he did what any human driver would try to do and what we all take for granted. He was the only one of the four robots that knew how to react and do the correct thing in that situation.”

Designing and outfitting a car to drive itself, to size up real-life traffic situations, and to determine and execute the optimum responses to those events – that was the goal of the 89 teams that entered the Urban Challenge. DARPA, the Defense Advanced Research Projects Agency, is seeking to develop driverless, ground-combat vehicles for the U.S. military and to meet a congressional mandate that one-third of those vehicles be unmanned by 2015.

A rigorous, yearlong winnowing process narrowed the field of 89 original entrants to 11 for the final event – the Grand Challenge, which was held Nov. 3 on an urban course at the former George Air Force Base in Victorville, Calif.

The 2007 Challenge was the third DARPA had held since 2004 and the first to require that cars interact with moving as well as stationary objects. (This was the first Challenge for Lehigh; Penn entered a team in the 2005 Grand Challenge but did not make it to the final event.)

To qualify for the 2007 Grand Challenge, cars had to show they could change route plans in the event of a blocked road, stop and wait their turn at a four-way stop, merge into moving traffic, navigate a traffic circle, park in a specific parking spot, and obey all California traffic laws.

To have a shot at winning, the finalists had to complete the 58-mile Victorville course in six hours or less. Cars were also judged by how well they drove – as DARPA director Tony Tether said, “the vehicles must perform as well as someone with a California driver’s license.”

The stakes were high, as DARPA was offering $2 million, $1 million and $500,000 to the teams fielding the first-, second- and third-place cars.

Little Ben did not finish in the money – top honors were taken by Carnegie Mellon, Stanford and Virginia Tech. But the Prius garnered several key honors – it was one of only six cars to complete the race and one of just five to do so in less than six hours.

More significantly, the Ben Franklin Racing Team was the only one of the six finishing teams not to receive $1 million in advance funding from DARPA. Indeed, Little Ben was the only one of the original 78 Track B, or unfunded, cars to finish the race.

The total cost of outfitting Little Ben came to less than $250,000. Much of that was provided by Lockheed and Thales Communications, the team’s other funding source. (Thales’s president and CEO, Mitch Herbets, earned a B.S. in electrical engineering from Lehigh in 1979.)

“For a small team that was not externally funded,” said Dan Lee, associate professor of electrical and systems engineering at Penn and leader of the Ben Franklin Racing Team, “we gave the large engineering schools a run for the money. They had more resources, students and sponsors. We had to make a lot out of very little.

“We were not the fastest, flashiest or showiest car but we were one of the safest and most reliable.”

Necessity, mother of integration

To navigate roads and traffic by itself, says Spletzer, a robotic car must possess the same intelligence and sensing skills as human beings. It has to be able to recognize the lanes, median and shoulder of the road, and to distinguish between approaching vehicles and other obstacles.

The car’s “road-segmentation” and “lane-tracking” abilities must be particularly robust, says Spletzer. It must be able to identify and adhere to the portion of the terrain that is drivable road surface. If no lines are painted on a road, or if the car’s GPS system fails – a likelihood under bridges and overpasses or among tall buildings – the car must continue driving on the paved portion of the road.

“The car must at all times know where the road is, where the car’s half of the road is, and where the edge of the road is,” says Spletzer.

“The car not only needs to stay in its lane and remain the proper distance behind the car in front of it, it also needs to know to stop behind double-parked cars in its lane and wait for traffic ahead to clear before it proceeds.”
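As a rough illustration of the lane-keeping behavior Spletzer describes – an illustration only, not the team’s code – the short Python sketch below assumes an upstream perception step has already estimated the car’s lateral offset from the lane center and its heading error, and shows how a simple proportional correction and a curb-height drivability test might look. All names, gains and thresholds here are invented.

def steering_correction(lateral_offset_m, heading_error_rad,
                        k_offset=0.4, k_heading=1.2):
    """Proportional steering command that nudges the car back toward the
    lane center. Positive offset means the car sits right of center, so
    the command comes back negative: steer left."""
    return -(k_offset * lateral_offset_m + k_heading * heading_error_rad)

def is_drivable(height_above_road_m, max_step_m=0.15):
    """Crude drivability test: anything rising more than about a curb's
    height (~15 cm) above the local road plane is treated as 'not road'."""
    return abs(height_above_road_m) <= max_step_m

if __name__ == "__main__":
    # Car drifting half a meter right of center with a small heading error:
    print(round(steering_correction(0.5, 0.05), 3))   # -0.26, i.e. steer left
    print(is_drivable(0.05), is_drivable(0.30))       # True, False (a curb)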

To equip Little Ben with human-like intelligence and sensing skills, the Ben Franklin Racing Team employed a variety of sensing devices that worked in concert to gather and crunch data. These included GPS, a stereo vision camera and lidar laser systems (lidar is an acronym for light detection and ranging).

The stereo vision camera, located on the front of Little Ben, possessed a 50-degree field of view and functioned like human eyes, says Spletzer.

“The stereo head’s two cameras enabled Ben to estimate depth of field and to know that a traffic line was a line on the ground and not a white pole pointing up into the air. The cameras also tracked the ground plane in front of the car to estimate the car’s pitch, which oscillates when a car brakes, turns or reacts to a bump in the road. Being able to track this plane enabled us to stabilize the camera in software.”
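Spletzer’s description of the stereo head boils down to two textbook relations: depth from disparity (Z = f·B/d, where f is the focal length, B the baseline between the two cameras and d the disparity) and pitch from the tilt of the fitted ground plane. The sketch below illustrates those relations with invented numbers; it is not Little Ben’s vision software.

import math

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Standard stereo relation Z = f * B / d: the smaller the disparity
    between the two camera views, the farther away the point."""
    return focal_px * baseline_m / disparity_px

def pitch_from_ground_normal(normal_y, normal_z):
    """Camera pitch relative to the road, from the normal of the fitted
    ground plane in the camera frame (x right, y down, z forward). A level
    camera sees a normal of (0, 1, 0) and therefore zero pitch."""
    return math.atan2(normal_z, normal_y)

if __name__ == "__main__":
    # A feature with 20 px disparity, 500 px focal length, 0.3 m baseline:
    print(round(depth_from_disparity(500.0, 0.3, 20.0), 2))  # 7.5 meters ahead
    # A ground normal tilted slightly forward implies the car is pitching,
    # as it would while braking or crossing a bump:
    print(round(math.degrees(pitch_from_ground_normal(0.995, 0.10)), 1))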

A series of lidar sensors, said Spletzer, worked with the stereo camera to detect obstacles and to keep Little Ben from drifting out of its lane. Mounted atop the car was a high-definition lidar system developed by a California company called Velodyne, bought for the Prius by Lockheed Martin, and refined by the Penn team members. With 64 embedded lasers, this system provided a 360-degree field of view and worked with two side lidar systems to detect lane markers.

Little Ben also used lidar systems developed by Sick AG, a German company, and purchased by the team with donations from Thales. Two lidars atop the car enhanced observation and road segmentation. Two under the front headlights collaborated to detect merging traffic. A third front-mounted lidar watched for obstacles, while two lidars positioned near the side-view mirrors monitored lanes and curbs.

Jason Derenick, a Ph.D. candidate in computer science and engineering at Lehigh, refined the lidar systems so Little Ben could detect obstacles the size of a curb as well as reflections from the lane markings.

“Jason developed a robust obstacle-detection system using the Sick lidars,” said Spletzer. “The actual lane detection was accomplished by sensor fusion, by merging lidar and stereo-camera data.”
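One standard way to merge two noisy estimates of the same quantity – the kind of sensor fusion Spletzer describes – is an inverse-variance weighted average, in which the more trustworthy sensor gets the larger say. The sketch below applies that general idea to a single lane-offset measurement; the noise figures are made up, and the team’s actual fusion algorithm may well have been more sophisticated.

def fuse_estimates(measurements):
    """Inverse-variance weighted fusion of several noisy estimates of the
    same quantity (here, the car's lateral offset from the lane center).
    Each entry is (value, variance); the noisier sensor gets less weight."""
    weights = [1.0 / var for _, var in measurements]
    fused = sum(w * val for w, (val, _) in zip(weights, measurements)) / sum(weights)
    fused_var = 1.0 / sum(weights)
    return fused, fused_var

if __name__ == "__main__":
    lidar_offset = (0.42, 0.05)    # reflectivity-based estimate: fairly precise
    camera_offset = (0.55, 0.20)   # stereo-camera estimate: noisier in glare
    value, variance = fuse_estimates([lidar_offset, camera_offset])
    print(round(value, 3), round(variance, 3))  # fused value sits closer to the lidar estimate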

In fact, said Spletzer and Lee, integration – the ability to gather information using multiple sources and to combine and interpret that data in real time – was the overall key to Little Ben’s success.

“Much of the work that went into designing and developing Little Ben was not cutting-edge research,” said Spletzer, “but rather integrating various technologies and applying them to specific tasks.

“A lot of research on autonomous cars had already been done. Our challenge was to integrate and extend these results in a very limited timeframe. Previous work on autonomous cars and highways focused on getting cars to stay in lane. But the task at the Grand Challenge was to enable a robotic car to respond dynamically to its environment. Nothing at that level of difficulty had been done before.

“This was a big software engineering problem. You had to merge information from a lot of sensors, calculating the weight of each input before making a determination. That requires algorithms for making intelligent decisions and acting on input.

“The car had to switch smoothly from one mode to another, from four-way stop to U-turn to passing to merging and so on. The goal was to get all the sensors to communicate to Ben’s central [computer] in real time – while the car is traveling 25 mph, or 37 feet per second.”
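The mode switching Spletzer describes is naturally modeled as a finite state machine that hands control from one driving behavior to another as events arrive from the sensors. The toy version below, with invented state names and triggers, only illustrates that structure – it is not Little Ben’s planner – but it conveys why every transition has to resolve quickly when the car is covering roughly 37 feet every second.

TRANSITIONS = {
    ("LANE_FOLLOW", "stopped_car_ahead"):    "PASS",
    ("LANE_FOLLOW", "stop_sign_detected"):   "STOP_AND_WAIT",
    ("LANE_FOLLOW", "road_blocked"):         "U_TURN",
    ("STOP_AND_WAIT", "intersection_clear"): "LANE_FOLLOW",
    ("PASS", "pass_complete"):               "LANE_FOLLOW",
    ("U_TURN", "u_turn_complete"):           "LANE_FOLLOW",
}

def next_mode(mode, event):
    """Return the new driving mode, or stay in the current one if the
    event is irrelevant to it."""
    return TRANSITIONS.get((mode, event), mode)

if __name__ == "__main__":
    mode = "LANE_FOLLOW"
    for event in ("stop_sign_detected", "intersection_clear",
                  "stopped_car_ahead", "pass_complete"):
        mode = next_mode(mode, event)
        print(event, "->", mode)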

The complexity of the challenge provided ample opportunity for failure. Of the 36 cars that made it to the national qualifying event, only 11 were approved to take part in the Grand Challenge; DARPA had expected a final field of 20. And of the final 11 cars, five were disqualified for various infractions during the first two hours of the event.

“I did not think we would do as well as we did,” Spletzer confessed. “A lot of really good teams fell out of the competition. It takes just one mistake.”

A second phone call home

The Grand Challenge had hardly begun when such a mistake almost befell the Ben Franklin Racing Team.

Not 30 minutes into the race, a message flashed on the JumboTron saying that Little Ben had been “retired.” A year’s effort, and countless weeks of working day and night, appeared lost.

“I saw that message and called my wife back in Pennsylvania,” said Spletzer, who was watching the video from bleachers provided for spectators. “I told her, ‘We’re finished, we’re the first one out.’”

Out on the course, Little Ben, after traversing an unpaved stretch of terrain, had slowly negotiated a steep, bumpy, downhill grade, then stopped atop a small berm overlooking the paved road. The car remained on the berm, leading judges to conclude that the Prius was overwhelmed by the transition from unpaved to paved surface.

But the judges reconsidered, and rescinded their decision to disqualify, when Little Ben finally turned right onto the highway.

“Apparently, the steepness of the incline from the berm to the paved highway made Ben’s lidar think that the asphalt was actually a short wall, an obstacle,” said Spletzer. “We had accounted for this contingency in our programming, but Ben needed time to switch from ‘stop’ to ‘proceed’ modes.

“Sure enough, Ben pulled out, signaled and started to turn onto the highway. When he did, everyone in the JumboTron tent broke into applause. The judges decided to give Ben a break, and I called my wife back.”

What lies ahead for the Ben Franklin Racing Team, and for robotic cars?

If DARPA decides to hold a fourth Challenge, Lee and Spletzer say they would be open to renewing their collaboration. As for autonomous cars, Lee notes that, in some respects, the future is already here.

“A lot of new technologies are coming to high-end civilian cars, such as adaptive cruise control, crash-avoidance features and lane-departure warning systems that rely on computer vision or sensors instead of a highway rumble strip.”

Spletzer, meanwhile, is confident engineers can go further. Just as computers have been programmed and trained to beat the world’s chess grandmasters, he says, robotic cars may someday navigate roads more ably than humans can.

“Computers can really do well when you focus them on specific tasks,” he says, citing IBM’s “Deep Blue,” the first computer to defeat a reigning world chess champion in a match.

“Ultimately, we’d like to make robotic cars that can drive better than humans can. That won’t be easy. The driving environment has many complicated aspects. It’s not limited to an eight-by-eight chessboard.

“It’s going to be some time before cars drive themselves – 25 to 50 years, maybe closer to 50.

“I plan to be around then.”

Lehigh graduate student Jason Derenick (left) and Lehigh professor John Spletzer test – and retest – Little Ben’s ability to wait its turn at a four-way stop.

Penn professor Dan Lee, leader of the Ben Franklin Racing Team, inspects the high-definition Velodyne lidar system that gave Little Ben a 360-degree field of view and helped it detect lane markers.

Of 89 robots that entered the yearlong Challenge, only 11 survived the qualifying rounds, and only six, including Little Ben, completed the 58-mile final event.

The task at the Grand Challenge was to enable a robotic car to respond dynamically to its environment. Nothing at that level of difficulty had been done before.
John Spletzer

The team spent much of its time integrating existing technologies and enabling them to interpret streams of information in real time.