Automated Camera Calibration

Repost from smathermather.com

In a previous post on camera calibration, we discussed the possibility of setting up calibration flights with the following characteristics: non-parallel (converging) flight lines separated by 20°, flown with the camera pitched 5° forward (where 0° is nadir). With OpenDroneMap's `--cameras` parameter, we can then import the camera model derived from a calibration flight and apply it to a more efficiently flown "traditional" flight plan.
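As a sketch of what that workflow looks like with the Docker invocation of ODM (paths and project names here are placeholders, not from the original post), ODM writes the refined camera model to a `cameras.json` file in the project directory, which a later run can consume:

```shell
# Sketch only: dataset paths and project names are placeholder assumptions.
# Step 1: process the calibration flight; ODM writes the optimized camera
#         model to cameras.json inside the project directory.
docker run -ti --rm -v /home/user/datasets:/datasets opendronemap/odm \
    --project-path /datasets calibration_flight

# Step 2: process the production flight, importing the calibrated model
#         instead of re-optimizing it from scratch.
docker run -ti --rm -v /home/user/datasets:/datasets opendronemap/odm \
    --project-path /datasets production_flight \
    --cameras /datasets/calibration_flight/cameras.json
```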

The underlying challenge becomes: what is the cadence of such calibration flights? If I fly a calibration flight in the morning at one temperature, and the afternoon is substantially warmer or cooler, when do I need to repeat the calibration flight? Equally, what is the longer-term cadence for tracking drift in camera calibration parameters caused by aging, vibration, and other factors?

So, I have been thinking through a thought experiment for the last few days: with existing flight planners, how can we plan a flight that is both efficient and contains the converging flight lines needed for automated calibration? Here's what I propose:

Fly with much lower overlap than normal, but using two crossgrid flights (sometimes called crosshatch) separated by 20° with a 5° forward facing camera.

  • Crossgrid overlap percentages can be lower than for parallel flights. For good 3D results, 68% overlap and sidelap on each grid is equivalent to 83% overlap and sidelap on a single parallel flight.
  • For good 2D and 2.5D results (orthophotos and digital elevation models), 42% overlap and sidelap is equivalent to 70% overlap and sidelap.
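To get an intuition for why the lower-overlap crossgrid can still be efficient, here is some illustrative back-of-the-envelope arithmetic using the percentages above. The survey area and ground footprint dimensions are assumed example values, not figures from the post:

```python
# Illustrative arithmetic only: compares approximate image counts for a
# single parallel grid versus a double crossgrid over the same area,
# using the overlap/sidelap figures quoted in the post.

def image_count(area_side_m, footprint_w, footprint_h, overlap, sidelap):
    """Approximate number of images for one grid over a square area."""
    trigger_dist = footprint_h * (1 - overlap)   # along-track spacing
    line_spacing = footprint_w * (1 - sidelap)   # across-track spacing
    images_per_line = area_side_m / trigger_dist + 1
    n_lines = area_side_m / line_spacing + 1
    return images_per_line * n_lines

side = 500.0          # survey area: 500 m x 500 m (assumed)
fw, fh = 80.0, 60.0   # image footprint on the ground, metres (assumed)

# 3D case: one grid at 83% versus two crossing grids at 68%
parallel_3d = image_count(side, fw, fh, 0.83, 0.83)
crossgrid_3d = 2 * image_count(side, fw, fh, 0.68, 0.68)

# 2D/2.5D case: one grid at 70% versus two crossing grids at 42%
parallel_2d = image_count(side, fw, fh, 0.70, 0.70)
crossgrid_2d = 2 * image_count(side, fw, fh, 0.42, 0.42)

print(f"3D:  parallel {parallel_3d:.0f} vs crossgrid {crossgrid_3d:.0f}")
print(f"2D:  parallel {parallel_2d:.0f} vs crossgrid {crossgrid_2d:.0f}")
```

With these assumed footprint numbers, the doubled crossgrid still comes out well under the single high-overlap grid in total image count, which is the efficiency argument for the approach.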

[Animation: crossgrid flight pattern showing the experimental optimum]

The one concern I have with this approach is that it is difficult with existing planners to ensure an even distribution of images across the pairs of crossing flights. This is probably fine at higher overlap values, but could be problematic when targeting 2D and 2.5D datasets.

I am hoping we can do some testing of this approach this week, but I am also interested in putting the idea out in the wild for others to try.

I would definitely be interested in seeing other folks' experiments in this space.