FLIR White Papers

Accurate 360° Spherical Imaging With Multiple Pre-Calibrated Sensors

Quality and flexibility in the Ladybug cameras

Today, the quality and flexibility of spherical video data make the medium ideal for applications requiring synchronization of video streams. The most prominent are GIS applications such as online mapping, street views, and colorizing LIDAR-generated 3D point clouds. The entertainment industry is another early adopter, using the technology to provide immersive experiences.

The Point Grey Ladybug spherical imaging systems have become the de facto industry standard. The Ladybug systems perform all the image acquisition, processing, stitching, and correction necessary to integrate multiple camera images into full-resolution digital spherical and panoramic streaming video in real time. This ability to stream in real time is unique in the marketplace.

Multiple Cameras Synchronized

Point Grey Ladybug 360 Spherical Cameras

Where some systems use mirrors or fisheye lenses to create the effect of a panoramic view, the Ladybug systems use six cameras with high-quality Sony® CCD image sensors to truly deliver images gathered from six vantage points, covering 90% of the full sphere. Five CCDs are positioned in a horizontal ring and one is positioned vertically, pointing upwards. The six cameras are pre-calibrated; this is the pivotal technology that makes the system's many other innovations possible. Since lens settings such as focus and iris are fixed to ensure the camera stays calibrated, there is no need for in-field calibration.

The Ladybug cameras are controlled by the Ladybug API, part of the SDK, which provides complete control of the camera, graphics rendering, and the coordinate system. Graphics-rendering support includes real-time rectification, stitching, and blending, while coordinate-system support lets users manage each of the six sensors independently. Lastly, the SDK allows users to integrate the system with their custom applications.

Geometric-Based Calibration and Accuracy

Rather than relying purely on mechanical calibration, the Ladybug systems use software to calibrate each camera both on its own and in relation to each of the other five cameras. The system knows the vector associated with every pixel, in each camera, to one-hundredth of a degree of accuracy. This in turn enables applications to know where the camera is in relation to the rest of the world. To provide this relational data, Point Grey not only solved the problem of calibrating the lenses but also solved the greater challenge of calibrating, to high accuracy, the rotations and translations between all six lenses, a problem made harder by the small overlap between the cameras' fields of view.
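In practice, such a calibration amounts to knowing, for each camera, a ray direction per pixel plus a rotation (and translation) into a common head frame. The following is a minimal sketch of how an application might use data of this kind; the camera layout and extrinsic values here are illustrative assumptions, not actual Ladybug calibration output:

```python
import numpy as np

def rot_z(deg):
    """Rotation about the z (up) axis."""
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(deg):
    """Rotation about the y axis."""
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

# Hypothetical extrinsics: cameras 0-4 on a horizontal ring, 72 degrees
# apart, each looking outward along its local +x axis; camera 5 looks up.
# Real values come from the per-unit factory calibration, not this table.
extrinsics = {i: rot_z(72.0 * i) for i in range(5)}
extrinsics[5] = rot_y(-90.0)  # tilts the local +x axis to point up (+z)

def pixel_ray_to_head(cam_index, ray_cam):
    """Transform a per-pixel ray from camera coordinates to the head frame.

    For direction vectors only the rotation matters; the small translation
    between sensors is ignored here (a fair approximation for distant points).
    """
    ray = np.asarray(ray_cam, dtype=float)
    ray = ray / np.linalg.norm(ray)  # ensure unit length
    return extrinsics[cam_index] @ ray
```

Because every pixel's ray ends up in one consistent frame, downstream applications can relate image content directly to world geometry, for example when draping imagery over a LIDAR point cloud.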

Ladybug 360 degree spherical video with Google Maps
Ladybug spherical image linked to GPS data

The geometric accuracy of the Ladybug calibration means that image data is spatially consistent across the whole sphere, not just across the stitching seams. This allows Point Grey software to render any partial view of the video sphere without noticeable lens distortion, even if that rendering spans multiple camera images.
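A renderer that spans multiple camera images must decide, for each view ray, which source image(s) to sample. A simple sketch of that decision for the five-around-one-up layout described earlier (the forward vectors are hypothetical; a real renderer would use the calibrated extrinsics and blend across seams):

```python
import math

# Hypothetical layout matching the Ladybug arrangement: five cameras on a
# horizontal ring (72 degrees apart, looking outward) and one looking up.
# Real forward vectors come from the per-unit factory calibration.
CAMERA_FORWARD = [
    (math.cos(math.radians(72 * i)), math.sin(math.radians(72 * i)), 0.0)
    for i in range(5)
] + [(0.0, 0.0, 1.0)]  # camera 5 points straight up

def best_camera(ray):
    """Pick the camera whose optical axis is best aligned with a view ray.

    A full renderer would blend contributions from overlapping cameras
    near the seams; choosing by largest dot product is the simplest form.
    """
    n = math.sqrt(sum(c * c for c in ray))
    ray = tuple(c / n for c in ray)
    scores = [sum(a * b for a, b in zip(ray, fwd)) for fwd in CAMERA_FORWARD]
    return max(range(len(scores)), key=scores.__getitem__)
```

Because the calibration makes all six images consistent in one spherical frame, the handoff between cameras along a rendered view introduces no visible geometric discontinuity.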

The decision to use software correction rather than precise mechanical alignment means that Ladybug cameras can be assembled to reasonable mechanical tolerances. Point Grey has also automated the factory calibration process, which produces extremely consistent and reliable results. Together, the mechanical design and automated calibration make production of Ladybug cameras very scalable and able to adapt to changing demand. The factory calibration and robust case design also remove any requirement for in-field calibration: the Ladybugs are calibrated once, in the factory, and then housed in a unique ruggedized casing rigid enough to resist changes in temperature, vibration, and shock, so the calibration stays intact.

Lidar with Ladybug Imagery
Ladybug imaging mapped to LIDAR point cloud


Benefits of Calibration

The ability to let applications know where the camera is in relation to the world takes the Ladybug beyond being a camera that simply produces panoramic images and into the realm of computer vision, where it opens up a wide range of possible applications.

Post Processing Workflow for Maximum Dynamic Range

The Ladybug5 moves image processing from the camera to the host PC, where users control the outcome. The Ladybug5 captures, compresses, and transmits full-bit-depth (12-bit) images to the host PC. LadybugCapPro's post-processing toolbar is then used to apply white balance, gamma, smear correction, fall-off correction, and other image processing functions. Users can make decisions and experiment with settings as they view the images and watch the effects in real time.
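Because the raw 12-bit data reaches the host unmodified, corrections such as gamma and lens fall-off can be applied (and later re-applied with different settings) in post. The sketch below illustrates two such operations on a 12-bit frame; the simple radial gain model is an illustrative assumption, not FLIR's actual fall-off algorithm:

```python
import numpy as np

MAX_12BIT = 4095.0  # largest value a 12-bit sensor sample can take

def apply_gamma(raw, gamma=2.2):
    """Map linear 12-bit sensor values through a display gamma curve."""
    normalized = np.clip(np.asarray(raw, dtype=float) / MAX_12BIT, 0.0, 1.0)
    return normalized ** (1.0 / gamma)

def correct_falloff(img, strength=0.3):
    """Brighten image corners to compensate for radial lens fall-off.

    Uses a simple quadratic radial gain; a production correction would
    use per-lens calibration data instead of this illustrative model.
    """
    h, w = img.shape[:2]
    y, x = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r = np.hypot(y - cy, x - cx) / np.hypot(cy, cx)  # 0 center, 1 corner
    gain = 1.0 + strength * r ** 2
    return np.clip(img * gain, 0.0, 1.0)

# Simulated flat 12-bit frame run through the two-step pipeline
frame = correct_falloff(apply_gamma(np.full((8, 8), 2048.0)))
```

Since each step is a deterministic function of the stored raw data, the same frame can be reprocessed any number of times with different parameters, which is the key benefit of the host-side workflow.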

Benefits of Post Processing

This capture-then-post workflow allows users to maximize dynamic range and maintain flexibility, since they can always return to the original content and re-apply post-processing steps as desired.

Corrected image after post processing: color is corrected, detail in the shadows is brought out, and sun smear is removed.


To learn more, visit our Spherical Cameras landing page.