I am working on processing a 92,000-image dataset using split-merge, 72 cores, and 1.5 TB of RAM (across 6 hosts, although half the compute capacity is on one host alone). Needless to say, I am learning a lot.
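For context, a split-merge run of this kind might be launched along these lines. This is only a sketch: it assumes the OpenDroneMap toolchain run via Docker with a ClusterODM node, and the dataset path, submodel size, overlap, and endpoint URL below are all illustrative placeholders, not values from this project.

```shell
# Illustrative split-merge invocation (assumed: ODM via Docker).
# --split: target number of images per submodel (value is a guess)
# --split-overlap: overlap radius between submodels, in meters (a guess)
# --sm-cluster: ClusterODM endpoint for distributing submodels (placeholder)
docker run -ti --rm -v /datasets:/datasets opendronemap/odm \
  --project-path /datasets project \
  --split 1000 --split-overlap 150 \
  --sm-cluster http://localhost:3000 \
  --dsm
```

The split/overlap values trade memory per submodel against seam quality at the merge step, which is exactly the tuning a dataset this size forces you to learn.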
As I am doing this, there are some peculiarities to the process: it's predominantly not RTK data, it comes from 4 different cameras (which is super helpful for lens correction), and we are working toward a single seamless product, both elevation model and orthophoto.
I have been searching for equivalent projects. DroneDeploy proudly announced a while back the capacity to process 10k-image datasets. Searching the Pix4D forums, I find some references to 30k-image datasets (a threshold I crossed on a previous project), but I am not seeing anything larger than 30k.
So, my question to all of you: have you seen anything larger than 92k images (roughly 1,800 gigapixels of imagery), regardless of software (Pix4D, Agisoft, etc.), for producing a seamless product?
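For anyone checking my math on the gigapixel figure: it works out if you assume an average of about 20 MP per image (my assumption, not a number from the project):

```python
# Sanity check on the "roughly 1800 gigapixels" figure.
# The 20 MP average per image is an assumed value for illustration.
images = 92_000
megapixels_per_image = 20  # assumed average sensor resolution
gigapixels = images * megapixels_per_image / 1000
print(gigapixels)  # 1840.0
```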
And if not: what’s the largest you have encountered?
BTW: yes, I will be sharing what I learn from this project in the docs.