Several questions regarding WebODM and RAM

Hey there,

I am new here and I have been using WebODM for the past three months, pushing it to its limits
I am a geoinformatics student from Germany

There is one big problem I have, and it’s a common one: RAM, or rather WebODM crashing because there is not enough of it

My PC has a Threadripper 1950X and 64 GB RAM
Docker uses 28 threads and 58 GB RAM + 4 GB swap (woohoo)

But my task still fails

The pictures I use come from a Phantom 4 Pro; image resolution is 4864x3648 and the CCD size is 13.2 mm

I have been playing with settings over the past month and I am still not satisfied
PMVS tends to run better, but it still crashes
What really helps is resizing to 1024, but do I lose quality with this setting?

What are your settings for what kind of data?

Right now I am trying to process a set of 1400 pictures
It’s a 100 ha area and I took the pictures at 100 m with an overlap of 75% side / 80% front

Hey @JBollow :hand: your analysis is quite correct, resizing images is the biggest contributor to reducing memory usage, and with 1400 pictures I would expect tasks to occasionally fail with 58 GB of RAM. Depending on the dataset you might be losing quality by resizing.
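One way to quantify that loss is the ground sample distance (GSD). A back-of-the-envelope sketch, assuming the Phantom 4 Pro’s ~8.8 mm focal length and the 100 m flight altitude (the focal length is my assumption, not something stated above):

```python
# Rough ground sample distance (GSD) estimate, before and after resizing.
# Assumed values: 8.8 mm focal length (Phantom 4 Pro), 100 m flight altitude.
def gsd_cm(sensor_mm, alt_m, focal_mm, width_px):
    """GSD in cm/px = sensor width * altitude / (focal length * image width)."""
    return sensor_mm * alt_m * 100 / (focal_mm * width_px)

full = gsd_cm(13.2, 100, 8.8, 4864)      # original 4864 px wide images
resized = gsd_cm(13.2, 100, 8.8, 1024)   # after resize-to: 1024
print(f"full-res GSD ~ {full:.1f} cm/px, resized GSD ~ {resized:.1f} cm/px")
```

So at 100 m, resizing to 1024 px turns a ~3 cm/px image into a ~15 cm/px one; whether that matters depends on the ground resolution your application needs.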

We have long-term plans to add the ability to split large datasets into WebODM, leveraging existing work in this area, so that’s going to help a lot. But we’re probably 1 or 2 years away from that goal (given current resources).

Thank you very much, that is great feedback!

I am able to split my workloads manually and recombine 2 or more orthophotos later in QGIS or with GDAL
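Stitching the tiles back together can also be scripted; a minimal sketch using GDAL’s gdal_merge.py (the file names are hypothetical placeholders, and GDAL must be installed separately):

```python
# Sketch: merge several orthophoto tiles with GDAL's gdal_merge.py.
# File names here are hypothetical placeholders.
import subprocess  # only needed for the commented-out run at the bottom

def build_merge_cmd(tiles, output="merged_ortho.tif"):
    """Build a gdal_merge.py command; -n 0 treats black borders as nodata."""
    return ["gdal_merge.py", "-o", output, "-n", "0"] + list(tiles)

cmd = build_merge_cmd(["north.tif", "south.tif"])
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment once GDAL is installed
```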

Would increasing swap be an option? And is this even possible?

I am considering buying up to 128 GB RAM for larger datasets, but I was able to complete my 1400-picture set with these settings: min-num-features: 10000, resize-to: 1024, fast-orthophoto: true, force-ccd: 13.2, use-pmvs: true
In 05:44:37

I am going to do further testing on this set and see how far I am able to increase the quality
But right now I am processing a “smaller” set of ~500 pictures with 85% overlap at 15 m, taken from an md4-1000 with a Sony Alpha
The images aren’t geotagged, but I created a GCP file for this. Hopefully this will work; otherwise I’ll have to add geotags with exiftool.
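Should the GCP route not pan out, the exiftool fallback can be scripted. A sketch that only builds the exiftool command line (coordinates and file names are made up for illustration; exiftool itself must be installed separately):

```python
# Sketch: write GPS EXIF tags with exiftool (installed separately).
import subprocess  # only needed for the commented-out run at the bottom

def geotag_cmd(image, lat, lon, alt):
    """Build an exiftool command that writes GPS position and altitude."""
    return [
        "exiftool",
        f"-GPSLatitude={abs(lat)}", f"-GPSLatitudeRef={'N' if lat >= 0 else 'S'}",
        f"-GPSLongitude={abs(lon)}", f"-GPSLongitudeRef={'E' if lon >= 0 else 'W'}",
        f"-GPSAltitude={alt}",
        image,
    ]

cmd = geotag_cmd("IMG_0001.JPG", 52.52, 13.405, 100.0)  # made-up values
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment once exiftool is installed
```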

Hey @JBollow,

I have just read this and am also wondering what the maximum number of images I could process is while maintaining the highest quality.

At the moment I am processing 903 images on a 32 GB, 64-bit Windows 10 Pro machine with an i7-3770 3.4 GHz processor and an Nvidia GeForce 210.

~24 GB allocated to Docker, 6 CPUs, and 1024 MB of swap.
Default settings applied. No modification to resizing parameters.

I will let you know how I get on; interested to know how far you can push your ODM.

Hi, I am trying to use the --fast-orthophoto argument but it doesn’t seem to work. Does it work in WebODM?

It worked for me

I successfully completed two high-quality runs with 430 images (~4.3 GB)
The images were taken with a Sony Alpha at 24 MP

Sorry I am not allowed to share the dataset :frowning:

It’s a small area of just 1 ha, but very high ground resolution was needed

I did a 200 px/m and a 400 px/m run and achieved very good quality
The original pictures are somewhere in the 300 px/m range

I recorded both processes with Portainer
The settings I used are visible in the pictures

200px took 3 hours
400px took 3 hours 15 min

Both used the same amount of RAM

Docker used 24 threads and 56 GB RAM + 4 GB swap
I don’t know why Portainer shows stats like this, but 500% equals one thread at full load, so 12000% is 24 threads at full load
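Under that observation, converting Portainer’s percentage into busy threads is simple arithmetic (the 500%-per-thread scale is just what I measured, not documented behavior):

```python
# Convert Portainer's CPU % to busy threads, using the observation above
# that ~500% corresponds to one fully loaded thread on this machine.
def busy_threads(portainer_pct, pct_per_thread=500):
    return portainer_pct / pct_per_thread

print(busy_threads(12000))  # 24.0 threads at full load
```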

Using docker stats returns wrong values!
So far Portainer works great for monitoring

I am using Docker for Windows on Windows 10

By the way, my high-quality ortho at 400 px/m is great and much better than Pix4D’s!
Here is a comparison (not full resolution)

Other than that, my last project at 100 m over a 100 ha area did great; all I had to do was set resize-to: 1024

From all my testing so far, any dataset up to ~5 GB works with ~56 GB of RAM
Anything larger needs resizing
All other runtime parameters do alter the quality but don’t really affect RAM usage
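Extrapolating a very rough rule of thumb from that single observation (treat it as a guess from one data point, not a benchmark):

```python
# Very rough rule of thumb from the observation above:
# ~56 GB RAM handled ~5 GB of full-resolution imagery,
# i.e. roughly 11 GB of RAM per GB of input. A guess, not a law.
def ram_needed_gb(dataset_gb, ram_per_gb=56 / 5):
    return dataset_gb * ram_per_gb

print(round(ram_needed_gb(5)))   # 56
print(round(ram_needed_gb(10)))  # 112
```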

For the moment I think I am good to go with only 64GB RAM


Thanks for sharing your results @JBollow!