8 – Field Procedures + Tips and Tricks
Although we’ve tried to be complete and thorough in our documentation, there are always tips and tricks that make Argus more effective, especially when used as an entire suite to support field recording of 3D movement. Here are some of our tips.
Epipolar lines and camera positioning
Argus-Wand and Clicker use triangulation based on the intersection of what are called epipolar lines, which are the projection of a ray from one camera into the field of view of another camera. Epipolar lines are quite useful for identifying objects of interest in several views at once, and they are most useful when, with 3 or more cameras, the epipolar lines projected into a given view from the other cameras are not parallel (or coincident) with one another. That degenerate arrangement occurs when all 3 cameras lie along a single line; it is avoided when the cameras form a triangle in a plane or, for 4+ cameras, a more complicated non-collinear geometry. Theriault et al. (2014) describe this in more detail, but we stress it here because getting this part right can make later analysis much simpler.
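Argus handles this projection internally, but if you want to visualize the geometry yourself, OpenCV exposes the same construction through a fundamental matrix. The sketch below is only an illustration of that idea (it is not Argus code); the fundamental matrix F relating camera 1 to camera 2 and the clicked pixel coordinate are assumed to come from elsewhere.

```python
# Illustration only (not Argus internals): given a fundamental matrix F
# relating camera 1 to camera 2, project a point clicked in camera 1 as an
# epipolar line in camera 2.
import numpy as np
import cv2

def epipolar_line(F, xy):
    """Return (a, b, c) for the line a*u + b*v + c = 0 in camera 2
    corresponding to pixel coordinate xy clicked in camera 1."""
    pts = np.array([[xy]], dtype=np.float32)          # shape (1, 1, 2)
    lines = cv2.computeCorrespondEpilines(pts, 1, F)  # 1 = points are from image 1
    return lines.reshape(3)

# Example (F is hypothetical here):
# a, b, c = epipolar_line(F, (640.0, 360.0))
# If the lines generated from several camera-1 points come out nearly parallel
# or stacked on top of each other in camera 2, the cameras are close to
# collinear and triangulation will be poorly conditioned.
```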
Finding offsets with high-frequency noises
In order to find the offset between videos you’ve taken out in the field, Argus-Sync banks on the fact that you’ve recorded multiple short-duration, high-frequency sounds, or synchronization tones, at some point in the video. Sync can operate on videos without these tones, but comparing soundtracks without them often leads to poor results. Sounds from the environment are often noisy, may not be heard by all the cameras, and typically originate at different distances from each camera, leading to phase shifts. Because of this, it is difficult to discern the offset between videos from natural soundtracks alone. By adding synchronization tones, you ensure that there is a part of every camera’s soundtrack that is similar enough for lagged cross-correlations between them to reveal the offsets. In our work, we use walkie-talkies hung at the bases of our tripods to produce synchronization tones. All the walkie-talkies are set to the same channel, so simply keying one of them several times at some point in the video does the trick. Using walkie-talkies also enables us to give verbal notes about the particular recording to all the cameras involved. However, anything from claps to whistles can be used as synchronization tones for your videos. Just remember to make sure that the sound is loud and roughly equidistant from each camera.
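For intuition, the core of this approach is an ordinary lagged cross-correlation of two audio tracks, with the peak lag giving the estimated offset. The following is a minimal sketch of that idea using SciPy, not Argus-Sync’s actual implementation; it assumes both tracks are mono arrays sampled at the same rate.

```python
# Minimal sketch of audio-based synchronization via lagged cross-correlation.
import numpy as np
from scipy.signal import correlate, correlation_lags

def estimate_offset_seconds(audio_a, audio_b, sample_rate):
    """Estimate the offset between two same-rate mono tracks in seconds.
    With this convention a positive value suggests camera B started
    recording later than camera A; verify the sign against a known event."""
    a = (np.asarray(audio_a, dtype=np.float64) - np.mean(audio_a))
    b = (np.asarray(audio_b, dtype=np.float64) - np.mean(audio_b))
    xcorr = correlate(a, b, mode='full')
    lags = correlation_lags(len(a), len(b), mode='full')
    best_lag = lags[np.argmax(xcorr)]
    return best_lag / float(sample_rate)

# offset_s = estimate_offset_seconds(cam_a_audio, cam_b_audio, 48000)
# offset_frames = offset_s * frame_rate   # convert to video frames
```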
Avoiding strange shooting modes
As we discuss in the Developer’s corner section of the site, we ran into problems trying to calibrate certain GoPro shooting modes. These were the Superview modes, which, according to GoPro’s website, stretch the pixels at the edges of the frame. This may be great for some artistic endeavors, but it ruins any attempt at 3D camera calibration with Argus. Similar modes almost certainly exist on other digital cameras. We recommend that you avoid shooting modes that alter the image in strange ways after it reaches the sensor; such alterations can render the models that Argus uses to characterize lens distortion useless.
High distortion, wide FOV cameras
Argus 1.0 (Python 3 and OpenCV 3.1) supports two omnidirectional camera models, Scaramuzza’s and C. Mei’s [1, 2]. These models are highly useful when working with wide-angle, high-distortion lenses. The latest version of Argus in the repository supports calibration with C. Mei’s model via OpenCV 3. In addition, Scaramuzza’s model can be calibrated in MATLAB (https://sites.google.com/site/scarabotix/ocamcalib-toolbox), and the resulting coefficients are usable within Argus Clicker and DWarp (see the documentation). When using these models to obtain a calibration with Argus-Wand, you will still need to supply a pinhole camera profile. This can be done by first undistorting the input pixel coordinates (wand points and background points) using your calibrated omnidirectional coefficients with argus_gui, then setting the distortion coefficients in the input profile to zero. Undistorting the points this way almost certainly changes the effective focal length of the camera, so we recommend allowing the routines in Wand to optimize focal length and possibly the principal point. This usually results in a far better calibration than you would otherwise obtain.
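As a conceptual illustration of what that zero-distortion profile describes: once the points have been undistorted (argus_gui handles that step), each observation should be explained by a plain pinhole projection with no distortion terms, and the focal length that best fits the undistorted points is generally not the nominal one, which is why letting Wand re-optimize it helps. A minimal sketch, with placeholder values for the focal length and principal point:

```python
# Conceptual sketch of the zero-distortion pinhole model implied by the
# resulting camera profile.  The focal length f and principal point (cx, cy)
# are placeholders; Argus-Wand should be allowed to re-optimize them.
import numpy as np

def pinhole_project(xyz_cam, f, cx, cy):
    """Project points in camera coordinates, shape (N, 3), to pixels
    with square pixels, zero skew, and zero distortion."""
    xyz = np.asarray(xyz_cam, dtype=float)
    u = f * xyz[:, 0] / xyz[:, 2] + cx
    v = f * xyz[:, 1] / xyz[:, 2] + cy
    return np.column_stack([u, v])

# pixels = pinhole_project(points_in_camera_frame, f=1400.0, cx=960.0, cy=540.0)
```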
Proper wand procedures
In order to give scale to your 3D extrinsic calibration in Argus-Wand, you must film an object of known length with a uniquely identifiable beginning and end, i.e. a wand of some sort. This lets you set the scale of the calibration and also provides a check for Euclidean-ness, i.e. the reconstructed wand length should not depend on the wand’s position or orientation. In order to ensure proper scaling for all the cameras, try to put the wand in the field of view of as many cameras as possible. Also, to help the calibration, make the region that the wand passes through as large as possible, ideally containing the objects of interest such as the animals actually being studied. Having the wand visible in only a small region of space can lead to low-quality calibrations that need additional information from fixed points.
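A quick way to check Euclidean-ness after the fact is to reconstruct both wand ends in every frame and confirm that the recovered length barely varies. A small sketch, assuming you already have the triangulated wand-end coordinates as (N, 3) arrays (the array names below are hypothetical):

```python
# Post-hoc wand check, assuming end1 and end2 are (N, 3) arrays holding the
# triangulated 3D positions of the two wand ends across N frames.
import numpy as np

def wand_length_report(end1, end2, true_length):
    lengths = np.linalg.norm(np.asarray(end1) - np.asarray(end2), axis=1)
    scale = true_length / np.mean(lengths)          # scale factor to apply
    spread = np.std(lengths) / np.mean(lengths)     # relative variation
    return scale, spread

# scale, spread = wand_length_report(end1, end2, true_length=0.5)  # 0.5 m wand
# A large spread (more than a few percent) suggests a poor calibration:
# the wand length should not depend on where or how the wand was held.
```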
There are other ways to give scale to your calibration. Sometimes, especially when filming things that are high up or far away, it is impossible to put a wand in the region of interest. In this case, you can measure the distance between the cameras and then scale the 3D points based on the translation vectors that Argus Wand outputs (in the SBA profile; see the documentation). In addition, wand points are not needed for an initial calibration. As such, in cases where the wand ends are indistinguishable or the wand is difficult to see, it can be helpful to simply mark ‘unpaired’ or ‘background’ points, get an initial calibration from Argus Wand, then use the direct linear transformation (DLT) coefficients from that calibration to help mark wand points, give scale, and refine your calibration.
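A minimal sketch of the camera-distance approach to scale, assuming each camera’s pose is available as a rotation matrix R and translation vector t in the common x_cam = R·X + t convention (check the SBA-profile documentation for the exact convention Argus uses) and that xyz holds the unscaled reconstructed points:

```python
# Sketch of rescaling a calibration by a measured camera separation, under
# the assumed x_cam = R @ X + t pose convention.
import numpy as np

def camera_center(R, t):
    """World-frame camera center for the x_cam = R @ X + t convention."""
    return -np.asarray(R).T @ np.asarray(t)

def rescale_by_camera_distance(xyz, R1, t1, R2, t2, measured_distance):
    reconstructed = np.linalg.norm(camera_center(R1, t1) - camera_center(R2, t2))
    scale = measured_distance / reconstructed
    return np.asarray(xyz) * scale

# xyz_scaled = rescale_by_camera_distance(xyz, R1, t1, R2, t2, measured_distance=12.3)
```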
Cameras overheating
One of the reasons we developed Argus was to make better use of robust, consumer-grade cameras such as those targeted at extreme sports. However, the small size and high power draw of these cameras during recording can lead to thermal shutdowns if you’re using them in a hot environment and not pushing some airflow past them by crashing through the jungle on your bike. Plan to shade your cameras, record only when necessary, and bring along an external battery capable of powering both the camera and an external fan at once. You could even build a USB-powered Peltier cooler!
Citations