Tuesday, 17 February 2015

Shooting 360 degree video with the Ladybug3

I recently shot some panoramic content using Point Grey's Ladybug3 camera. The experience wasn't entirely smooth, so I wanted to share some notes to help others who run into similar problems.

Panoramic view of London on a grey day. View as a sphere here

The Ladybug3 uses 6 cameras with wide angle lenses to capture panoramic content. It comes complete with stitching software that can output 360 degree images or video. The lack of a camera facing downwards means that the base of the sphere is not captured - this is seen as a black hole below the camera, for example if you look down when viewing the content in an Oculus Rift.

Camera setup

Ladybug3 connected to 17 inch MacBook Pro using FireWire 800 port

To capture content on the move I connected the Ladybug3 to a 17 inch MacBook Pro with a FireWire 800 port. The Ladybug software is Windows only, and VMWare Fusion cannot virtualise FireWire ports, so the laptop was booted into Windows. The MacBook itself was capable of supplying enough power to the camera even when not plugged in to the mains. It is also possible to use an ExpressCard to FireWire 800 adapter, powering the card via an external power supply. This method was shown to work by Paul Bourke in his setup and is the method recommended by Point Grey.

Software setup

I originally used the LadybugRecorder application as it's very simple. However, it provides very little control over the camera during filming. Instead, use the LadybugCapPro application to record data. This lets you control settings like exposure, shutter and gain.

Our camera was set to 15fps using JPEG 12-bit compression and full vertical resolution. As discussed here, using a rate of 15fps rather than the full-resolution maximum of 16fps makes it easier to upsample the video to a more standard 30fps later, since each frame can simply be doubled. Although the documentation is a little unclear on this for the Ladybug3, it is possible to stream at up to 32fps if half vertical resolution is used.


Following advice from my colleague Richard Taylor, I used the Point Grey software to stitch the footage into a sequence of PNG images rather than directly into a video file. These images were then turned into a video using separate software (discussed below). The quality of the videos output directly by Point Grey's software seemed poorer than this approach; in fact, at high resolutions the direct video output was extremely lossy for me.

At high resolution and using a good colour mode (such as High Quality Linear) the stitching process can be quite lengthy. It can be sped up by using a machine with a good GPU and plenty of memory, and by setting in/out markers to avoid processing unnecessary frames. The in/out markers feature is quite hard to find in LadybugCapPro: move the seeker to the desired frame, then click the icon of a blue arrow in a circle (found in the "Stream toolbar"). The "Parallel processing" option speeds up export but had a tendency to corrupt some frames, so I didn't use it.

Setting in/out markers in LadybugCapPro to reduce number of frames processed

We used the "Panoramic" type, which creates an equirectangular panorama. This format has the advantage of being widely supported by playback software. However, as my colleague pointed out, it doesn't make the best use of the available pixels and distorts content at the poles of the sphere.

After stitching to PNGs I noticed there were gaps in the image numbering produced by LadybugCapPro. I wrote a small Ruby script to identify these gaps, then manually duplicated neighbouring frames to fill the holes. I didn't have many gaps, so this manual method was fine for me; the script could easily be extended to duplicate neighbouring frames automatically.
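My original script isn't reproduced here, but the idea can be sketched in a few lines of Ruby. Note this is an illustrative reconstruction, not the actual script, and the filename pattern shown is an assumption based on the stitcher's numbered output:

```ruby
# Sketch of a gap finder: given the stitched PNG filenames, report
# which frame numbers are missing from the sequence.
# (The "ladybug_panoramic_NNNNNN.png" pattern is an assumption.)
def missing_frames(filenames)
  numbers = filenames.map { |f| f[/(\d+)\.png\z/, 1].to_i }.sort
  (numbers.first..numbers.last).to_a - numbers
end

# Example: frames 2 and 5 were never written by the stitcher
files = ["ladybug_panoramic_000000.png", "ladybug_panoramic_000001.png",
         "ladybug_panoramic_000003.png", "ladybug_panoramic_000004.png",
         "ladybug_panoramic_000006.png"]
puts missing_frames(files).inspect  # => [2, 5]
```

In practice you'd build the file list with something like Dir.glob("*.png") in the output directory.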

Dropped frames and audio syncing

When syncing the audio with the output from the Ladybug, it became apparent that there was a problem: the frame rate was inconsistent, varying across the duration of the video. As our software assumes a constant FPS, this resulted in erratic playback speed. Sometimes this wasn't a problem, but when syncing audio even a single dropped frame was noticeable.

As my Windows partition was small and the Ladybug generates large amounts of data, I attempted to save the data to an external hard drive via the MacBook's USB2 port during filming. However, USB2's transfer rate was too low, resulting in many dropped frames. It's important to choose settings that don't produce more data than your system can handle: in particular you can adjust the JPEG compression ratio, the frame rate, and whether you use full or half vertical resolution. Regardless of the settings, however, frames can still occasionally be dropped.
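A quick back-of-envelope check can tell you in advance whether a given bus or drive will keep up. All the numbers below are placeholder assumptions for illustration (compressed frame size varies with scene content and JPEG ratio), not Ladybug3 specifications:

```ruby
# Back-of-envelope check: will the recording data rate fit the bus
# you're writing to? All figures here are rough assumptions.
def fits_bus?(bytes_per_frame, fps, bus_bytes_per_sec)
  bytes_per_frame * fps <= bus_bytes_per_sec
end

frame = 4.0 * 1024 * 1024        # e.g. a ~4 MB compressed frame set
usb2  = 35.0 * 1024 * 1024       # ~35 MB/s practical USB2 throughput
fw800 = 80.0 * 1024 * 1024       # ~80 MB/s practical FireWire 800
puts fits_bus?(frame, 15, usb2)  # => false (60 MB/s exceeds USB2)
puts fits_bus?(frame, 15, fw800) # => true
```

Measuring the size of a few frames from a short test recording gives you a realistic bytes_per_frame figure to plug in.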

You can test for dropped frames in a recording using LadybugCapPro: from the GPS menu, choose "Generate GPS/frame information". This produces a report of how many frames were dropped and where, which can be used to correct for the missing frames. I made a very basic script (available here) that parses the missing frame details out of this report and uses them to copy neighbouring frames into the gaps. (NB: this code is mostly for reference. It moves and copies files, so only use it if you understand exactly what it does, and preferably on a copy of the data to avoid having to restitch if something goes wrong.)
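The gap-filling step itself is simple. Here's a hedged sketch (not my actual script) of how the copying can work once you have the list of missing frame numbers; the filename pattern is an assumption, and as above, run anything like this on a copy of the data:

```ruby
require "fileutils"

# Sketch: fill each missing frame by copying the nearest earlier frame
# into its place, so the numbering is contiguous for later processing.
# Processing in ascending order means runs of consecutive missing
# frames are filled too, each copying the one just filled before it.
# (Filename pattern is an assumption; adapt to your stitcher output.)
def fill_gaps(dir, missing, pattern = "ladybug_panoramic_%06d.png")
  missing.sort.each do |n|
    src = File.join(dir, format(pattern, n - 1))
    dst = File.join(dir, format(pattern, n))
    FileUtils.cp(src, dst)
  end
end
```

Feeding this the frame numbers parsed from the "Generate GPS/frame information" report automates what I originally did by hand.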

Processing the PNGs

The PNG images can be turned into a video using separate software. My colleague loaded the files into Adobe After Effects, exported the result as an Adobe Premiere project, and used Premiere to render the video file. I didn't have this software available, so instead I used ffmpeg, which is free and very easy to install on a Mac using Homebrew. The command I used to create the video file was:

 > ffmpeg -r 15 -i ladybug_panoramic_%06d.png -c:v libx264 -r 30 -pix_fmt yuv420p out.mp4

The first -r parameter is the input (capture) frame rate; the second -r is the desired output frame rate, so here ffmpeg duplicates each 15fps frame to produce 30fps video. The -pix_fmt yuv420p option keeps the output compatible with most players. Refer to ffmpeg -help for more options.

Viewing in the Rift

To view the panoramic video in the Oculus Rift I used Kolor Eyes. This is free software that works on Windows and Mac.


I'd like to thank Richard Taylor for all his help - a lot of this post is based on his advice. Paul Bourke's Ladybug3 guide was also very helpful.

NB: Any code here was quickly thrown together, with few error checks and not much care. Standard caveats - use at your own peril.