By Barry Hochfelder, Editor
The topography of Mars has everything from lava-flattened plains to cratered highlands. It has volcanoes and mountains but no oceans. Its temperature ranges from −140 °C (−220 °F) during the polar winters to highs of up to 20 °C (68 °F) in summers. In short, it’s a perfect place for robotic exploration.
Some of the tasks, though, still require human control. Teleoperated robots are a major component of space exploration programs. These robots are designed to take the place of human explorers, reducing mission costs and risks. However, these robots require a human to direct them to “interesting” features. To promote the advancement of robots for space exploration, the Mars Society founded the University Rover Challenge (URC), a competition that tasks teams with designing, building, and operating a robotic system in an analog Mars mission.
This year’s event was held May 28-30 at Mars Desert Research Station (MDRS) northwest of Hanksville, Utah. The stark area has geologic, biological and environmental features that approximate what might be encountered on Mars. The competition was won by York University (Toronto). Other competitors were runner-up Brigham Young University (Provo, Utah), the University of Nevada (Reno, Nevada), 2008 winner Oregon State University (Corvallis, Oregon), Georgia Tech (Atlanta, Georgia), Warsaw University of Technology (Warsaw, Poland) and UCLA (Los Angeles, Calif.).
FIGURE 1. Team UCLA's rover is equipped with two Dragonfly2 cameras
“The competition evolved from the first year, where it was a basic build: a remotely operated system that can deploy a payload,” explains Andrew Boggeri, president of the Robotics Club at UCLA. “This year, you have to do a lot of tasks with mapping. The requirements for camera resolution have steadily gone up. The team is like an astronaut in a module performing remote science investigations using the robot.”
The competing teams must design, build and operate a Mars Rover-like system in a series of simulated Mars mission tasks.
“It’s very much a red planet in the Utah high desert,” Boggeri says. “There’s iron oxide in the dirt, so it’s red. It’s in a rocky flood plain and it’s got very fine silt. Other than the heat, it’s the most damaging thing for the camera. Really fine dust gets all over your electronics; it gets into everything. There’s small scrub vegetation and large boulders. You drive out, come up and see it and go, ‘Wow!’ It’s like what you’d expect to find on a mission to Mars.”
The UCLA system features a Mini-ITX motherboard (a compact form factor) from Logic Supply (Burlington, Vt.), running a customized version of Ubuntu Linux that handles the high-level communication and control tasks. The Rover operates on a client-server architecture, with the Rover acting as the server. Interfacing with the server program through control subroutines are the serial motor controller, Point Grey (Richmond, BC, Canada) Dragonfly2 cameras, and a number of microcontrollers used for sensor integration.
FIGURE 2. Team UCLA's rover is equipped with two Dragonfly2 cameras
Other supporters and sponsors include Northrop Grumman Space Technology (Los Angeles), Schlumberger (principal offices in Houston, Texas; Paris, France; and The Hague), a consulting firm that specializes in oil and gas exploration, Atmel (San Jose, Calif.), which makes flash microprocessors, and Solidworks (Concord, Mass., and Santa Monica, Calif.), a 3D CAD company. In addition, the engineering alumni association at UCLA has supported almost 40 projects.
When the Rover is powered on, the system boots, logs in, initiates the control subroutines and RoboServer, and waits for a client to connect. Upon connection, the client computer is presented with a graphical user interface showing the attached sensors and readings, a camera view, the arm control sliders, and the motor control wheel. The client is then able to remotely operate ROVER2. Depending on battery loadout and driving conditions, ROVER2 is able to operate from one to two hours continuously.
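The startup sequence described above can be sketched as a minimal client-server handshake. This is an illustrative sketch only: the status fields, message format, and names like `robo_server` are assumptions, not the team's actual RoboServer protocol.

```python
import json
import socket
import threading

def robo_server(sock):
    """Accept one client and send it a status snapshot as a JSON line.

    Hypothetical stand-in for RoboServer: the real server would also wire
    up the motor controller, cameras, and sensor microcontrollers.
    """
    conn, _addr = sock.accept()
    with conn:
        status = {
            "sensors": {"battery_v": 24.1, "temp_c": 31.0},  # placeholder readings
            "cameras": ["color", "mono"],
            "arm": "ready",
        }
        conn.sendall((json.dumps(status) + "\n").encode())

def demo():
    """Run the server on an ephemeral local port and connect a client to it."""
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))   # port 0 = let the OS pick a free port
    srv.listen(1)
    threading.Thread(target=robo_server, args=(srv,), daemon=True).start()
    with socket.create_connection(srv.getsockname()) as cli:
        line = cli.makefile().readline()
    srv.close()
    return json.loads(line)     # what the client GUI would render

if __name__ == "__main__":
    print(demo())
```

On the real rover the client GUI would populate its sensor panel, camera view, and control sliders from a snapshot like this one before accepting operator input.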
The server and client communicate over an 802.11b primary network with a 900MHz Ethernet data radio network serving as backup. The AX3500 motor and arm controller communicates with RoboServer over the serial port while ATmega microcontrollers gather sensor data and send it over USB.
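The primary/backup link arrangement implies a simple failover policy: prefer 802.11b, fall back to the 900 MHz radio. The frame rates below come from the article; the function itself is an illustrative assumption, not the team's code.

```python
def select_link(wifi_up: bool, radio_up: bool):
    """Return (link name, expected streaming frame rate in fps).

    Hypothetical failover policy: 802.11b is primary, the 900 MHz
    Ethernet data radio is backup, and with no link the rover holds
    position until one returns.
    """
    if wifi_up:
        return ("802.11b", 12)
    if radio_up:
        return ("900MHz", 7)
    return (None, 0)
```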
Navigation is primarily visual, with two Dragonfly2 cameras (one color and one black and white) providing streaming video at an average rate of 7 fps over 900 MHz and 12 fps over 802.11b. The software also allows for static pictures. To save on bandwidth, only one camera is active at a time, with the ability to swap between any number of attached cameras (up to the maximum supported by the onboard IEEE 1394 hardware).
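The one-active-camera policy can be sketched as a small multiplexer. The class and method names here are hypothetical; the team's real code talks to the hardware through libdc1394 rather than a Python object.

```python
class CameraMux:
    """Track any number of attached cameras, streaming from only one at a
    time to conserve radio bandwidth (a sketch of the policy, not the
    team's implementation)."""

    def __init__(self, camera_ids):
        self.cameras = list(camera_ids)
        self.active = None          # no stream until the operator picks one

    def activate(self, cam_id):
        """Stop the current stream (if any) and start cam_id."""
        if cam_id not in self.cameras:
            raise ValueError(f"unknown camera: {cam_id}")
        # Real code would stop the current 1394 capture and start the new one.
        self.active = cam_id
        return self.active

mux = CameraMux(["color", "mono"])
mux.activate("color")
mux.activate("mono")   # swapping implicitly deactivates "color"
```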
“We use the libdc1394 camera library to interface with the Dragonfly2 and take advantage of the bitmap image format to allow for a custom visual spectroscopy program,” says Boggeri, who will enter his senior year in aerospace engineering. Using an IR-pass lens on the black & white camera and the red, green and blue color channels available on the color camera, the team created a program to perform visual spectroscopic measurements of a target to identify its composition and the error factor associated with the measurements.
“That gives us a fourth channel so we get RGB and IR responses,” he adds. “The program calibrates its readings using the published data for the CCD sensor spectral response against the spectral response of a Spectralon target. (Spectralon is a thermoplastic resin that can be machined into a wide variety of shapes for the fabrication of optical components.) The IR response is pretty accurate. We compare our rough spectral vs. known values for rocks and minerals. We can say it’s gypsum or that there’s a high concentration of copper in a rock.”
One problem the UCLA team ran into this year was underestimating the resolution needed for the visual range-finding mapping task. The target was white PVC pipes of about 10 cm (3.94 inches) in diameter and one to two meters (3.28-6.56 feet) high, with a white 15 x 30 cm (5.9 inches x 11.8 inches) flag attached. Teams had to survey the markers from about 0.8 km (875 yards), then determine the absolute map coordinates of the PVC pipes. “We weren’t able to find a zoom lens,” Boggeri explains. “We based it on the 1032 x 776 resolution of the Dragonfly2, but we miscalculated the size of the markers.
“We knew the absolute dimensions of the marker. With a standard lens, the marker’s apparent size was only one pixel, not enough to carry out an accurate calculation of its distance from the Rover. There are two solutions possible for next year’s competition: one is to go to a higher resolution camera, the other to put a zoom lens on the camera. It narrows the field of view, but you get greater resolution at a distance. I’ve discovered that it’s hard to find zooms for a C-mount camera. And they’re very expensive.”
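The geometry behind this miscalculation is a short back-of-the-envelope check: the object's pixel span is its width divided by the scene width the lens covers at that distance, times the sensor's horizontal resolution. The field-of-view values below are assumed for illustration; only the 1032-pixel width and the marker dimensions come from the article.

```python
import math

def apparent_px(obj_width_m, dist_m, hfov_deg, h_res):
    """Pixels an object spans across the sensor's horizontal axis.

    scene_width is how wide a strip of ground the lens sees at dist_m,
    from the pinhole-camera model: 2 * d * tan(HFOV / 2).
    """
    scene_width = 2 * dist_m * math.tan(math.radians(hfov_deg) / 2)
    return obj_width_m / scene_width * h_res

# With an assumed ~40-degree standard lens, the 10 cm pipe at 800 m spans
# well under one pixel, which matches the problem the team hit...
wide = apparent_px(0.10, 800, 40, 1032)

# ...while a narrow zoom (here an assumed ~2-degree HFOV) trades field of
# view for several pixels on target, enough to attempt a range estimate.
zoomed = apparent_px(0.10, 800, 2, 1032)
```

Doubling the sensor resolution doubles the pixel count linearly, which is why the two fixes Boggeri names (higher-resolution camera or zoom lens) are interchangeable for this task.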
There’s a year to figure it out. Meanwhile, the URC judges are thinking of next year’s contest. “Although many of the tasks for 2009 were similar to those in 2008, the requirements and task courses were much more difficult this year,” said Kevin Sloan, Director of the University Rover Challenge. “Despite how hard we made things for these teams, they found ways to accomplish amazing feats. All of the judges and I were extremely impressed at what these students have done. Now the hard job for the judges is to devise even harder tasks for the 2010 URC.”