I saw the giant notice, but this is more of a general set of questions:
I have 589 photos taken with a Phantom 4 Pro V2. I resized them to 3072 px because I expected memory problems, but the run still failed with the "node went offline or ran out of memory" error. I'm doing another run now with resize set to 2048.
I have 16 cores and 76GB of RAM, and I would really like not to resize these images at all… I'm running WebODM in a Hyper-V instance as an Ubuntu Server box.
I know that splitting the dataset is an option, but as far as I know that only works with a cluster.
I feel like fewer than 600 photos should process reliably with no need to resize, but maybe I'm mistaken. Any insight?
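In case it helps to see concretely what I mean by "without resizing": my (possibly wrong) understanding is that ODM's feature-quality and pc-quality options control the working resolution used during matching and densification, so lowering them might keep RAM down while leaving the source JPEGs at full size. Here's a rough sketch of how I'd submit that through the pyodm client straight to the NodeODM instance behind WebODM; the host, port, image path, and option values are placeholders I haven't verified on my setup:

```python
import glob

from pyodm import Node

# NodeODM instance behind my WebODM install -- host/port are placeholders.
node = Node("localhost", 3000)

# Full-size Phantom 4 Pro V2 JPEGs, no pre-resizing.
images = glob.glob("/data/site-survey/*.JPG")

# Lower the working resolution for feature extraction and the point cloud
# instead of resizing the images themselves. Values are guesses to trade
# quality for memory; I'd expect to have to experiment with them.
options = {
    "feature-quality": "medium",   # ultra / high / medium / low / lowest
    "pc-quality": "medium",
    "max-concurrency": 8,          # fewer parallel workers -> lower peak RAM (my assumption)
}

task = node.create_task(images, options)
task.wait_for_completion()
task.download_assets("./odm_results")
```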
By comparison, I can crudely Photomerge all of these in Photoshop into one (inaccurate) map of the site with no memory issues: full-size images get processed on my main computer with 32GB of RAM in about 5 hours.
Is there any trick to using WebODM that maybe makes the process take longer, but can handle 1k+ images without resizing?
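Related to the cluster question above: if split-merge can in fact run against a single node (I'm not sure it can without ClusterODM, hence the question), the options I'd expect to matter are something like the ones below, merged into the options dict in my earlier sketch. The numbers are guesses on my part, not tested values:

```python
# Options I'd add on top of the earlier sketch if split-merge works on one node.
# All values are guesses: "split" is images per submodel, "split-overlap" is meters.
split_options = {
    "split": 150,                  # target number of images per submodel
    "split-overlap": 120,          # overlap between submodels, in meters
    "optimize-disk-space": True,   # delete intermediate files as processing goes
}
```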