Large Orthophoto Processing - ODM Experience and Limits

After several weeks' work I have finally succeeded in creating a reasonably large ortho; the issues encountered along the way are documented here: Error in ODM meshing cell - PDAL pipeline. Many thanks for the assistance provided. Given that the changes I needed to make may or may not be merged in the short term (there is other, more important work the boffins are doing that is likely to supersede my amateur hacking!), I thought I'd provide both a link to the modified code and some context (hopefully to help answer some of those "impossible" questions that get raised, like "how many photos can ODM process?", "what resolution?", "how much memory?" etc.). My mods are in the mikethefarmer/ODM branch Test-pdal-1.8-maint (they are what they are - from an ex-engineer, not a professional programmer!)

Context - I'm a small farmer/grazier needing to document my ground state to support maintaining my farming "license" (environment regs gone feral here). The main documentation used by the govt is aerial/satellite photos, but while they clearly show old plough disturbance for 20yrs+, modern minimum-till (minimum disturbance) farming does not show easily after a year. So I decided I needed to document my farming myself, several times a year. I am using a Phantom 4 Pro 2 (20Mpx) at 300ft, giving 2.5cm/px, which is a good balance between the number of photos needed and resolution. Over my 160ha (400ac) property, with 80% overlap, this equates to around 3900 8MB photos (30GB). These are done in five three-battery runs (around 32-38ha/run) - practically 5 separate days (so slightly different composition for each run).
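As a rough sanity check on that photo count, the footprint of a 20Mpx frame at 2.5cm/px plus the 80% overlap gives an estimate in the same ballpark (the 5472x3648 sensor dimensions below are the Phantom 4 Pro's, and the calculation ignores turn-arounds and flight-line margins, so it comes out a little under the ~3900 actually flown):

```python
# Rough estimate of photos needed to cover an area at a given GSD and overlap.
# Assumes a 5472 x 3648 px sensor and the same overlap in both directions;
# real flights need extra frames for turn-arounds and boundary margins.
import math

def photos_needed(area_m2, gsd_m, sensor_px=(5472, 3648), overlap=0.80):
    fw = sensor_px[0] * gsd_m          # footprint width on the ground (m)
    fh = sensor_px[1] * gsd_m          # footprint height on the ground (m)
    # Each new frame only contributes the non-overlapping fraction of its area.
    new_area = fw * (1 - overlap) * fh * (1 - overlap)
    return math.ceil(area_m2 / new_area)

n = photos_needed(160 * 10_000, 0.025)   # 160 ha at 2.5 cm/px
print(n)
```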

System - I am using a clean Ubuntu 16.04 VM (KVM) over a 16.04 host. The machine is a DL380 G7, 2x X5660 Xeon (6 cores/12 threads each) with 144GB RAM (around 2011 vintage). The current VM has 128GB RAM, 20 threads, 144GB swap (SAS, not SSD) and around 1TB of HDD.

GCPs - I have "surveyed" all my farm strainer/intermediate (fence) posts using a Trimble Catalyst with RTX correction (giving me, for the most part, 30cm error, with some at 50cm). A selection of these GCPs (158) spanning the entire area was used. I used the WebODM GCP interface to produce GCP sets (per drone run) from a master GCP list, and merged the GCP files for the final use.
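For anyone repeating the merge step: ODM's gcp_list.txt format is one projection line followed by one line per measurement (geo_x geo_y geo_z pixel_x pixel_y image_name). A minimal merge sketch (the function name and sample coordinates are mine, not from the post):

```python
# Minimal sketch: merge several per-run ODM gcp_list.txt files into one.
# ODM's GCP file format: first line is the projection (e.g. "WGS84 UTM 55S"
# or a proj string), then one line per measurement:
#   geo_x geo_y geo_z pixel_x pixel_y image_name
def merge_gcp_files(file_texts):
    """file_texts: contents of each per-run gcp_list.txt as a string.
    All runs must share the same projection header."""
    headers = [t.strip().splitlines()[0] for t in file_texts]
    if len(set(headers)) != 1:
        raise ValueError("all GCP files must use the same projection")
    merged = [headers[0]]
    for t in file_texts:
        merged.extend(t.strip().splitlines()[1:])  # skip each file's header
    return "\n".join(merged) + "\n"

run1 = "WGS84 UTM 55S\n600100.0 6000200.0 350.0 1024 768 DJI_0001.JPG\n"
run2 = "WGS84 UTM 55S\n600150.0 6000250.0 351.0 2048 1536 DJI_0101.JPG\n"
print(merge_gcp_files([run1, run2]))
```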

ODM - I used --fast-orthophoto together with no resize, orthophoto resolution at 2.5cm/px and min features at 10000. Everything else stock (I think). The fast ortho is important here, as I am not sure that even with my fixes a full 3D run would complete (I am trying split-merge etc. separately to see). Finished off with partial WebODM (for MBTiles).
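For reference, those settings would look roughly like this on the ODM command line (flag spellings are from the ODM docs; the paths and project name are placeholders, since the post doesn't give the exact invocation):

```shell
# Sketch of the settings described above (paths/project name are placeholders).
# --resize-to -1 disables image resizing; --orthophoto-resolution is in cm/px.
./run.sh --fast-orthophoto \
         --resize-to -1 \
         --orthophoto-resolution 2.5 \
         --min-num-features 10000 \
         --gcp /path/to/gcp_list.txt \
         my_project
```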

Memory performance - The greatest usage was for OpenSfM, where I used all the RAM and around 50GB of swap (i.e. around 170GB peak). OpenSfM handled it well. I estimate I could probably have avoided the swap by using only 12-16 threads or so (rather than the default 20), so the swapping added a little to the processing time. All the other processes used less than the 128GB RAM (PDAL around 72GB, gap filling around 100GB, etc.) - these also predominantly use only a single thread (I sat looking at 'top' far too long!). For my data config you might get away with 96GB RAM (with swap) if you were using fewer threads (say 10). But I doubt anything less would realistically hack it.
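If you do want to trade threads for RAM as suggested above, ODM exposes this as a single flag; a sketch (the value 12 is just from the 12-16 range mentioned, not a tested setting):

```shell
# Cap the number of worker threads so peak memory stays under physical RAM.
./run.sh --fast-orthophoto --max-concurrency 12 my_project
```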

Time performance - The OpenSfM portion took the most time, at around 60 hrs of processing. While it is hard to say exactly how long the rest took (as I redid it a number of times while solving issues), it was probably little more than 6 hrs - so pretty much 3 full days in total. A newer machine (I got mine cheap from ex-enterprise disposal), more threads, SSDs, bigger RAM etc. would reduce that (and your bank balance). So you need to be patient (and with AU power prices at around 35c/kWh it is not cheap to run either - but my study is warm!)

Quality performance - Overall the result is very good. While obviously not as sharp as the original photos, the 2.5cm/px seems to have been achieved. I can clearly see the pasture drill lines from last year, small weeds/rocks (<30cm), etc. There is the odd small hole on some trees/sheds (mainly around the periphery) and small distortions in building lines (to be expected with all images nadir), but overall it is good and what I needed (playing with some parameters may solve those). Accuracy is generally within 1m of the GCP (mostly less, at around 0-30cm), with a few at the periphery a bit larger (worst 5m - but I might need to check those were correctly entered). So generally better than you would expect from non-corrected GPS (~5m here). The final ortho was 85111x58339px (around 4.5GB), making it painful to load in QGIS. The MBTiles version is much, much better :wink:
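Those pixel dimensions are consistent with the stated resolution: at 2.5cm/px the ortho's bounding box works out to roughly 2.1km x 1.5km (around 310ha, which comfortably encloses an irregular 160ha property) and nearly 5 gigapixels of raster, which goes some way to explaining why QGIS struggles with the single GeoTIFF:

```python
# Sanity check: ortho pixel dimensions vs the stated 2.5 cm/px resolution.
w_px, h_px, gsd_m = 85111, 58339, 0.025

width_m = w_px * gsd_m          # ground width of the ortho's bounding box
height_m = h_px * gsd_m         # ground height of the bounding box
bbox_ha = width_m * height_m / 10_000

print(f"{width_m:.0f} m x {height_m:.0f} m = {bbox_ha:.0f} ha")
print(f"{w_px * h_px / 1e9:.2f} gigapixels")
```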

Hope this provides some context (and possibly assistance) to those trying to process a fairly large photo set. And at the risk of getting flak (knowing how some communities are), I would also plug Piero's ebook - the latest draft is already well worth the price of admission!


Thank you for the info, mikegf. I am just starting the process of surveying a 100ha site twice a year, and your article has validated some of the work I have done so far.

I am using Google Compute Engine to do the work, as the memory requirements quickly exceeded my humble server. 30ha currently works out at £3 (GBP) per run, which I think is very reasonable.