TerraMax™, the completely autonomous, unmanned truck built by Team Oshkosh for the 2007 DARPA Urban Challenge
Once seen only in science fiction books and movies, the autonomous vehicle—a vehicle capable of driving itself without any human intervention—is quickly becoming a reality. True driverless vehicles can "think" like a human driver, making decisions in real time: obeying traffic laws, avoiding obstacles and other vehicles, and planning and navigating traffic routes.
Team Oshkosh, a collaboration between Oshkosh Corporation, Teledyne Scientific and Imaging Company, the University of Parma’s VisLab, Auburn University, and Ibeo Automobile Sensor GmbH, was one of eleven teams selected as finalists to compete for the $2 million grand prize in the DARPA Urban Challenge. The Urban Challenge, an autonomous vehicle research and development program aimed at developing technology to keep warfighters off the battlefield and out of harm’s way, was held on November 3, 2007, and required vehicles to negotiate a 96-kilometre course in less than 6 hours.
TerraMax system diagram from DARPA technical paper
Team Oshkosh's TerraMax™ vehicle is based on Oshkosh Corp.’s Medium Tactical Vehicle Replacement (MTVR) defense truck platform. The "brains" behind the vehicle consist of a number of different systems, including sensors (LIDAR and vision system with cameras); vehicle status and navigation; vehicle management (waypoint following, stability control); perception (obstacle and road detection, lane markings); autonomous behaviour; and system control and UI.
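Team Oshkosh's actual waypoint-following controller is not detailed here, but the idea behind this kind of vehicle-management subsystem can be illustrated with a classic pure-pursuit steering law: given the vehicle's pose and a lookahead waypoint, compute the curvature of the arc that carries the vehicle onto that point. This is a hedged sketch for illustration only, not TerraMax's implementation; the function name and frame conventions are assumptions.

```python
import math

def pure_pursuit_curvature(x, y, heading, wx, wy):
    """Curvature (1 / turn radius) that steers a vehicle at (x, y) with
    the given heading (radians, 0 = +x axis) toward waypoint (wx, wy).

    Illustrative pure-pursuit law, not TerraMax's actual controller:
        kappa = 2 * lateral_offset / lookahead_distance**2
    where lateral_offset is the waypoint's sideways displacement in the
    vehicle frame (positive = left of the vehicle's heading).
    """
    dx, dy = wx - x, wy - y
    # Rotate the waypoint vector into the vehicle frame to get its
    # lateral (cross-track) component.
    lateral = -math.sin(heading) * dx + math.cos(heading) * dy
    lookahead_sq = dx * dx + dy * dy
    return 2.0 * lateral / lookahead_sq
```

A waypoint dead ahead yields zero curvature (drive straight); a waypoint off to one side yields the curvature of the circle tangent to the current heading that passes through it, which a lower-level loop would convert into a steering command.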
Access video clips showing TerraMax in action
The TerraMax vision system, developed by VisLab of Parma University (Italy), uses nine Point Grey Flea2 1024x768 IEEE-1394b (FireWire) cameras on the vehicle for much of the vision-based guidance system.
TerraMax example of obstacle detection in an urban environment
Images are acquired from the Flea2 cameras via external trigger or in free-running mode, depending on the situation, using Format 7 Mode 0 (region-of-interest mode). The raw Bayer data is colour-processed on board the camera, then streamed at S800 speeds over the FireWire interface to the vision system. Stereo algorithms then reconstruct the 3D environment and provide information about an extended area of the immediate surroundings. Team Oshkosh describes this system extensively in their technical paper.
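The core of any such stereo pipeline is recovering per-pixel disparity between a rectified camera pair and triangulating depth from it. The sketch below is a minimal illustration of that principle using simple sum-of-absolute-differences block matching on one scanline; it is not VisLab's algorithm, and the function names, window size, and disparity range are assumptions chosen for clarity.

```python
import numpy as np

def disparity_row(left_row, right_row, window=5, max_disp=16):
    """Per-pixel disparity for one rectified scanline pair via SAD
    block matching (illustrative only, not VisLab's algorithm).

    For each pixel in the left row, slide a window over candidate
    shifts in the right row and keep the shift with the lowest sum
    of absolute differences.
    """
    half = window // 2
    n = len(left_row)
    disp = np.zeros(n, dtype=int)
    for x in range(half + max_disp, n - half):
        patch = left_row[x - half:x + half + 1]
        costs = [np.abs(patch - right_row[x - d - half:x - d + half + 1]).sum()
                 for d in range(max_disp)]
        disp[x] = int(np.argmin(costs))
    return disp

def depth_from_disparity(d, focal_px, baseline_m):
    """Triangulate: depth = focal_length * baseline / disparity
    (valid where d > 0)."""
    return focal_px * baseline_m / d
```

In practice a full 2D matcher with sub-pixel refinement and consistency checks would be used over every scanline, but the relationship is the same: larger disparities mean closer obstacles, which is what lets a stereo rig flag objects in the vehicle's path.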
"We chose the Flea2 for a number of reasons, most notably its compactness and the fact that it provided the XVGA resolution and trigger functionality we required," explains John Beck, Chief Engineer, Unmanned Systems, at Oshkosh Corporation. Paolo Grisleri, who was responsible for the vision systems hardware, adds, "The IEEE-1394b bandwidth, support and documentation for the camera under Linux, and outstanding image quality were significant factors in our decision to use the Flea2."