Augmented Reality System Uses Firefly MV: Firefly MV Application Story

Augmented reality blends virtual space with physical locations

Every Star Trek fan has wondered what it would be like to use the "holodeck", a room where real people would interact with simulated characters and environments. Augmented reality (AR), a field of computer vision research that involves combining real-world people and environments with computer-generated content, brings this idea to life. Unlike virtual reality (VR), where the environment that an individual interacts with is completely computer-generated, AR technology superimposes perspectively-correct 3-dimensional graphics onto real-time video feeds.
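
The "perspectively correct" part of that overlay boils down to projecting each virtual 3D point through the camera's pose and intrinsics onto the live video frame. The short C++ sketch below illustrates that projection step; the Pose struct, the projectPoint function and all numeric values are illustrative assumptions, not code from AR Second Life.

#include <cstdio>

// Camera pose: rotation and translation taking world coordinates into camera coordinates.
struct Pose {
    double R[3][3];
    double t[3];
};

// Project a world-space point into pixel coordinates using pinhole intrinsics
// (focal lengths fx, fy and principal point cx, cy). Returns false if the
// point lies behind the camera and cannot appear in the video frame.
bool projectPoint(const Pose& cam, double fx, double fy, double cx, double cy,
                  const double pw[3], double& u, double& v)
{
    double pc[3];
    for (int i = 0; i < 3; ++i)
        pc[i] = cam.R[i][0] * pw[0] + cam.R[i][1] * pw[1] + cam.R[i][2] * pw[2] + cam.t[i];
    if (pc[2] <= 0.0) return false;
    u = fx * pc[0] / pc[2] + cx;   // perspective divide plus intrinsics
    v = fy * pc[1] / pc[2] + cy;
    return true;
}

int main()
{
    Pose cam = {{{1, 0, 0}, {0, 1, 0}, {0, 0, 1}}, {0, 0, 0}};  // identity pose
    double pw[3] = {0.1, 0.0, 2.0};                             // a virtual point 2 m ahead
    double u, v;
    if (projectPoint(cam, 600, 600, 320, 240, pw, u, v))
        std::printf("virtual point lands at pixel (%.1f, %.1f)\n", u, v);
    return 0;
}

Rendering an entire virtual object is the same operation applied to every vertex, with the pose updated each frame from the head tracker so that the graphics stay registered with the video.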

YouTube video demonstrating AR Second Life

Researchers at the Georgia Institute of Technology's (Atlanta, GA, USA; www.gatech.edu) GVU Center have developed AR Second Life, the first augmented reality interface to a massively multiplayer online (MMO) world. Based on the 3D virtual world Second Life, it blends together locations in physical space with corresponding places in the Second Life virtual space.

For their head-mounted display (HMD) client, AR Second Life uses a Point Grey wide-VGA Firefly® MV IEEE-1394 digital camera integrated into a Z800 HMD from eMagin (Bellevue, WA, USA; www.emagin.com). The camera is mounted on the front of the HMD, pointing down and reflected forward by a right-angled prism so that it sits optically closer to the wearer's eyes than would otherwise be possible. An IS-1200 hybrid tracking device from Intersense (Bedford, MA, USA; www.intersense.com) precisely tracks the position and orientation of the HMD over a large area.
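
Because the renderer needs the camera's pose rather than the HMD's, the tracker reading is combined each frame with the fixed camera-to-HMD offset introduced by the mount and prism. The sketch below shows that composition as a product of homogeneous transforms; the Mat4 type, the multiply helper and every numeric value are illustrative placeholders, not the actual Georgia Tech calibration or the Intersense API.

#include <cstdio>

// 4x4 homogeneous rigid transform, row-major.
struct Mat4 { double m[4][4]; };

Mat4 multiply(const Mat4& a, const Mat4& b)
{
    Mat4 out{};
    for (int r = 0; r < 4; ++r)
        for (int c = 0; c < 4; ++c)
            for (int k = 0; k < 4; ++k)
                out.m[r][c] += a.m[r][k] * b.m[k][c];
    return out;
}

int main()
{
    // World-from-HMD pose, as a head tracker would report it each frame (placeholder values).
    Mat4 worldFromHmd = {{{1, 0, 0, 0.5}, {0, 1, 0, 1.6}, {0, 0, 1, 0.0}, {0, 0, 0, 1}}};

    // HMD-from-camera offset: fixed by the mechanical mount and the folding prism,
    // determined once by calibration (values here are made up).
    Mat4 hmdFromCamera = {{{1, 0, 0, 0.0}, {0, 1, 0, 0.08}, {0, 0, 1, 0.02}, {0, 0, 0, 1}}};

    // Camera pose in the world: what the renderer uses so graphics register with the video.
    Mat4 worldFromCamera = multiply(worldFromHmd, hmdFromCamera);
    std::printf("camera position: (%.2f, %.2f, %.2f)\n",
                worldFromCamera.m[0][3], worldFromCamera.m[1][3], worldFromCamera.m[2][3]);
    return 0;
}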

Firefly MV integrated into the AR Second Life head-mounted display (HMD)

The Firefly MV is connected to a MacBook Pro from Apple (Cupertino, CA, USA; www.apple.com) running Windows XP. Power and data are sent over a single IEEE-1394 cable via the MacBook's powered 6-pin FireWire port. Image acquisition and camera control are performed by VideoWrapper, an open-source C/C++ library the researchers created that provides a single abstract API for interfacing with a variety of video camera libraries on Windows and Mac OS X, including Point Grey's FlyCapture® SDK. Raw Bayer (color) images are captured at 60 FPS and processed in real time by the AR Second Life client software, which is written in C++.
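
As a rough illustration of that capture path, the sketch below grabs frames and demosaics the raw Bayer data to BGR using Point Grey's FlyCapture2 C++ API directly, rather than through the VideoWrapper abstraction the researchers built. The call names follow the FlyCapture2 documentation, but exact signatures should be checked against the SDK headers, and the loop length and error handling are placeholders.

#include "FlyCapture2.h"
#include <cstdio>

int main()
{
    using namespace FlyCapture2;

    // Find the first camera on the FireWire bus and connect to it.
    BusManager busMgr;
    PGRGuid guid;
    if (busMgr.GetCameraFromIndex(0, &guid) != PGRERROR_OK) {
        std::fprintf(stderr, "no camera found\n");
        return 1;
    }

    Camera cam;
    cam.Connect(&guid);
    cam.StartCapture();

    Image raw, bgr;
    for (int i = 0; i < 60; ++i) {                 // roughly one second of frames at 60 FPS
        if (cam.RetrieveBuffer(&raw) != PGRERROR_OK)
            continue;
        // The sensor delivers raw Bayer data; convert to BGR before handing the
        // frame to the rendering/compositing code.
        raw.Convert(PIXEL_FORMAT_BGR, &bgr);
        std::printf("frame %d: %u x %u\n", i, bgr.GetCols(), bgr.GetRows());
    }

    cam.StopCapture();
    cam.Disconnect();
    return 0;
}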

"We have used Firefly, Flea and Dragonfly cameras in our work, and had been using Fleas and the extended-head form factor of the Dragonfly for our previous head-worn displays," explains Blair MacIntyre, Associate Professor in Georgia Tech's School of Interactive Computing and GVU researcher. "We are now working with the current-generation Firefly MV, which provides a nice balance between size, image quality and frame rate, at a much lower price point. It also supports automatic inter-camera image synchronization, which is important for creating stereo displays where the left and right eyes need to be synchronized."


"Many companies are exploring how to leverage multi-user online virtual worlds like Second Life for a wide range of applications," explains MacIntyre. "While our current work is focused on machinima and digital performance projects, our work on integrating AR techniques with these virtual worlds will allow us and others to explore solutions to many business challenges, such as collaboration of distributed teams, entertainment and education, and even manufacturing."

More Information

Request Firefly MV pricing

