By Michael Gibbons
Imagine being able to tour Leptis Magna, one of the biggest and best preserved Roman cities in North Africa, from the comfort of your own home, or tour a new construction project before it has even been built. These ideas have formed the basis for a growing number of commercially available augmented reality (AR) and virtual reality (VR) applications. A necessary requirement for augmented and virtual reality applications is the real-time tracking of human users and artifacts in the environment using optical measurement technologies. The goal of these technologies is to calculate the exact pose (position and orientation) of a tool, object or person within a pre-defined coordinate system.
While the cost of computing, projection and display components has decreased dramatically over the last decade, the cost of motion tracking components has not. Frustrated by this fact, and motivated by their vision of making immersive VR applications more affordable and thus accessible to broader audiences, a small team of researchers around Dr. Hannes Kaufmann from the Vienna University of Technology’s Interactive Media Systems Group began to design their own low-cost, easy-to-use motion tracking system. In 2007 they launched iotracker (www.iotracker.com), a line of products designed to provide users with an affordable infrared optical tracking solution that meets the stringent requirements of real-time 6-DOF motion tracking of immersive visualization systems.
Move through Virtual Reality
Optical motion trackers typically use multiple two-dimensional imaging sensors (cameras) to detect "active" infrared-emitting or "passive" retro-reflective markers affixed to some interaction device. The iotracker system consists of up to eight small calibrated infrared cameras with integrated infrared (IR) strobe lights, one synchronization unit, a PC workstation running the iotracker software, and several rigid-body targets attached to various interaction devices. Based on the information received from multiple cameras, the system calculates the location of every marker through geometric triangulation. When three or more markers are grouped together to form a rigid-body target, it becomes possible to determine the target's orientation as well, yielding a total of six degrees of freedom (6-DOF). In a simple example, this allows a user to interact with a virtual environment, "moving" themselves or an object left/right, forward/backward, and up/down.
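The triangulation step can be sketched as a least-squares intersection of camera rays. This is only an illustration of the geometry, not iotracker's actual implementation: it assumes each camera's optical center and a unit ray toward the marker are already known from calibration, and the function name is hypothetical.

```python
import numpy as np

def triangulate(centers, directions):
    """Least-squares intersection of camera rays.

    centers:    (N, 3) camera optical centers
    directions: (N, 3) unit rays from each camera toward the marker
    Returns the 3D point minimizing the squared distance to all rays.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for c, d in zip(centers, directions):
        P = np.eye(3) - np.outer(d, d)  # projector onto plane normal to the ray
        A += P
        b += P @ c
    return np.linalg.solve(A, b)

# Two cameras at x = -1 and x = +1 both looking at the point (0, 0, 2):
centers = np.array([[-1.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
dirs = np.array([[1.0, 0.0, 2.0], [-1.0, 0.0, 2.0]])
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
print(triangulate(centers, dirs))  # ≈ [0. 0. 2.]
```

With noisy real measurements the rays no longer intersect exactly, which is why a least-squares formulation (rather than a literal intersection) is used.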
FIGURE 1. Optical motion tracking plays a key part in immersive visualization experiences.
Calibration in Three Steps
The cameras and targets used in the iotracker system must first go through three separate calibration steps to achieve accurate triangulation. The first, known as "intrinsic calibration", compensates for image distortion caused by the camera optics and is performed by iotracker technicians prior to shipment. "Extrinsic calibration" is performed in the field by the user, and is the process of finding the spatial transformation (where the cameras are positioned and oriented relative to one another) between all pairs of cameras in a given setup. Extrinsic calibration must be repeated every time a camera is moved or reoriented, but takes only a few minutes to complete. The final step, called rigid-body target calibration, trains the system to recognize the rigid-body targets. Once fully calibrated, the iotracker system can deliver sub-millimeter precision and accuracy below 5 mm (RMS) for point measurements throughout the entire tracking volume.
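The article does not describe how iotracker turns calibrated marker layouts into a 6-DOF pose at runtime. The textbook approach is an orthogonal Procrustes (Kabsch) fit: find the rotation and translation that best map the stored marker layout onto the triangulated marker positions. The sketch below shows that standard method under those assumptions; names are illustrative.

```python
import numpy as np

def rigid_pose(model, observed):
    """Fit R, t such that R @ model + t ≈ observed (Kabsch algorithm).

    model:    (N, 3) calibrated marker layout in the target's local frame
    observed: (N, 3) triangulated marker positions, in the same order
    """
    mc = model.mean(axis=0)
    oc = observed.mean(axis=0)
    H = (model - mc).T @ (observed - oc)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution (det = -1):
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = oc - R @ mc
    return R, t

# Sanity check: rotate a 4-marker target 90 degrees about z and shift it.
model = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
observed = model @ Rz.T + np.array([0.5, 0.2, 1.0])
R, t = rigid_pose(model, observed)
print(np.allclose(R, Rz), np.allclose(t, [0.5, 0.2, 1.0]))  # True True
```

The least-squares nature of the fit also explains why targets with more, well-spread markers track more robustly: extra markers average out triangulation noise.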
Affordable and Accurate Cameras
For image capture, iotracker uses Firefly MV IEEE 1394a digital camera modules from Point Grey Research. The compact board-level Firefly MV cameras are integrated with a control board in a custom-designed camera enclosure that measures just 71 x 66 x 40 mm. The housing unites the camera and an IR LED array, creating a powerful image generator. The camera's 3.6 mm focal length M12 microlens and wide-angle IR emitters can achieve a 90-degree diagonal field of view, which allows the system to cover a maximum tracking volume of up to 40 m3. The Firefly MV is equipped with a wide-VGA 1/3" global shutter monochrome CMOS sensor from Micron (www.micron.com) that has near-IR capability in the 850 nm range. The near-IR performance of the sensor allows shorter shutter times, which minimizes motion blur, a common problem in fast motion capture systems.
“Point Grey is well known in the academic community for its affordable machine vision and stereoscopic imaging products, and we worked extensively with their products at the Vienna University of Technology,” says Thomas Pintaric, Core Developer at iotracker. “We ultimately selected the Firefly MV for a number of reasons. It’s the most affordable IIDC v1.31-compliant camera model on the market, and unlike similarly priced IIDC v1.04-compliant cameras, the Firefly MV supports external triggering, which we use to accurately synchronize shutters from multiple cameras.”
High Speed 3D Location Detection
The iotracker system uses passive rigid-body targets composed of retro-reflective spherical markers. The target designs are computationally optimized for maximal tracking performance and minimal self-occlusion, are constructed using lightweight carbon-fiber and polyamide materials, and have an industrial-grade (EN 471 Class 2) retro-reflective marker coating. The markers' special coating reflects most of the infrared light emitted by an iotracker camera back to the imaging sensor. iotracker cameras can be wall- or tripod-mounted, and are shutter-synchronized to a trigger pulse signal sent out by the iotracker synchronization unit over a BNC coaxial cable. Every camera streams digital video at 60 FPS to the tracking workstation via IEEE 1394a (400 Mbit/s FireWire). “The possibility to operate two Firefly MV cameras on the same IEEE-1394a bus at their maximum frame rate by using a custom image size of 608 x 480 pixels made life much easier for us,” adds Pintaric, “because it enabled us to connect a larger number of cameras to the same workstation.”
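Pintaric's point about fitting two cameras on one bus can be checked with back-of-the-envelope arithmetic. The sketch below assumes 8-bit monochrome pixels (the article does not state the pixel depth) and the sensor's native 752 x 480 wide-VGA resolution for comparison.

```python
# Rough bus-budget estimate for two cameras on one IEEE 1394a bus.
# Assumption: 8-bit monochrome pixels (not stated in the article).
def mbit_per_s(width, height, fps, bits_per_px=8):
    return width * height * fps * bits_per_px / 1e6

full_roi = mbit_per_s(752, 480, 60)    # native wide-VGA sensor resolution
custom_roi = mbit_per_s(608, 480, 60)  # iotracker's custom image size

print(f"752x480: {full_roi:.0f} Mbit/s per camera, {2 * full_roi:.0f} for two")
print(f"608x480: {custom_roi:.0f} Mbit/s per camera, {2 * custom_roi:.0f} for two")
# IEEE 1394a is nominally 400 Mbit/s, but only part of that is available
# for isochronous video payload, so trimming the image width matters.
```

Under these assumptions, two full-width streams need roughly 347 Mbit/s while two 608-pixel-wide streams need about 280 Mbit/s, which is why the narrower custom image size leaves comfortable headroom on a shared bus.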
FIGURE 4. Compact iotracker camera housing integrates board-level FireWire camera and IR strobe lights.
The cameras send a continuous stream of images to the tracking workstation, where the iotracker software runs advanced image processing algorithms in real time to calculate the projected center of every marker in every camera image. The 3D location of every marker is then recovered via geometric triangulation. The software identifies pre-calibrated rigid-body targets, computes their position and orientation, and transmits the resulting 6-DOF measurements to subscribed client machines over a TCP/IP Ethernet network. The high frame rate of the cameras and the speed of the iotracker software result in a very low latency of between 18 and 40 ms, depending on the number of tracked rigid-body targets. The software also supports a wide range of third-party software packages, including a built-in Virtual Reality Peripheral Network (VRPN) device server that allows VRPN-aware client applications to stream tracking data directly from the server machine over TCP/IP.
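The marker-center step can be illustrated with a minimal blob-centroid sketch. iotracker's actual image-processing pipeline is not published; this is only the textbook approach of thresholding the bright retro-reflections, grouping connected pixels, and taking each blob's intensity-weighted centroid for a sub-pixel estimate.

```python
import numpy as np
from collections import deque

def marker_centers(image, threshold=200):
    """Return the (row, col) centroid of each bright blob in the image."""
    mask = image >= threshold
    seen = np.zeros_like(mask)
    centers = []
    for seed in zip(*np.nonzero(mask)):
        if seen[seed]:
            continue
        blob, queue = [], deque([seed])     # flood-fill one blob (4-connectivity)
        seen[seed] = True
        while queue:
            r, c = queue.popleft()
            blob.append((r, c))
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if (0 <= nr < mask.shape[0] and 0 <= nc < mask.shape[1]
                        and mask[nr, nc] and not seen[nr, nc]):
                    seen[nr, nc] = True
                    queue.append((nr, nc))
        pts = np.array(blob, dtype=float)
        w = image[pts[:, 0].astype(int), pts[:, 1].astype(int)].astype(float)
        centers.append(tuple(float(v) for v in (pts * w[:, None]).sum(0) / w.sum()))
    return centers

# Synthetic 608 x 480 frame with two bright 3 x 3 "markers":
frame = np.zeros((480, 608), dtype=np.uint8)
frame[100:103, 200:203] = 255
frame[300:303, 400:403] = 255
print(marker_centers(frame))  # [(101.0, 201.0), (301.0, 401.0)]
```

The intensity weighting is what gives sub-pixel precision on real images, where a marker's reflection falls off gradually toward the blob's edge rather than forming a uniform square as in this toy frame.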
“The modular design of iotracker allows users to customize and configure a tracking system for their specific needs,” says Dr. Zsolt Szalavari, Product Manager for iotracker’s distributor Imagination Computer Services. “The high precision, large tracking volume, and easy, flexible setup of the system make it well suited to many application areas, such as VR/AR research, architectural walkthroughs, engineering decision-making and real-time motion capture. It is also the first truly affordable optical motion tracking system that gives you the full benefit of high-precision motion tracking while keeping an eye on the budget.”
Reprinted with permission from the Inspect magazine article, "Real-Time Immersive Experiences: Optical Motion Tracker for AR and VR Applications", featured in the April 2009 issue. www.inspect-online.com