Question Regarding RAM

Hello! I'm a new WebODM user. I have a quick question.

So, I was thinking about upgrading my RAM to 64 GB, as I've heard that more RAM is better for ortho maps. While processing my first map last night, I kept an eye on Task Manager to see how much RAM was being used, and it appeared that no more than 14-15 GB was ever in use across the whole system. I'd assume WebODM accounted for 12-13 GB of that.

Any idea why it isn't using more? I'd imagine the maps would process faster if WebODM used more RAM. If under 15 GB is used during processing, a RAM upgrade looks like a waste of money.

Any thoughts on this? BTW, I’m using the new Windows installer that doesn’t need Docker.

Thanks a lot,
Mark

Welcome!

RAM usage depends mostly on processing parameters and dataset size.

ODM will allocate what it needs dynamically as it works through the processing pipeline.
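Those quality-related parameters drive most of the allocation. As a rough illustration (not tied to your exact install), this is how the biggest memory-driving options would be set through the official PyODM client against a local NodeODM node; the host, port, and image paths below are placeholder assumptions:

```python
# A minimal sketch, assuming a NodeODM instance is reachable on
# localhost:3000 (a common NodeODM default) and that pyodm is
# installed (pip install pyodm). Paths and option values are placeholders.
from pyodm import Node

node = Node("localhost", 3000)

task = node.create_task(
    ["images/IMG_0001.JPG", "images/IMG_0002.JPG"],  # your dataset here
    {
        "feature-quality": "high",   # ultra/high/medium/low/lowest
        "pc-quality": "medium",      # point cloud density; a big RAM driver
        "orthophoto-resolution": 3,  # cm/pixel
    },
)

task.wait_for_completion()
task.download_assets("./results")
```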

I've noticed the Windows version uses much less RAM than the version run in Docker, whose usage shows up as Vmmem in Task Manager.

Yep. There's less overhead since you're not running an entire extra kernel and container, and WSL has a bit of a bug with releasing some pages of memory back to the host.

I’m quite pleased with Piero’s work on Windows-native ODM!

Hello,
I have a related issue.
I upgraded my RAM from 16 to 44 GB, and the processing time in WebODM improved by only about 10%. No more than that.
I tried both Linux Mint and Windows 10, with datasets ranging from 23 to 230 photos. Clearly, processing performance did not keep up with the large increase in memory… (the CPU is a Ryzen 7).

It's sad because I spent a lot of money on DDR4. :frowning:

Any clue about this?
Thanks

RAM wouldn’t improve processing time very much unless you were severely RAM limited and swapping out to a slow disk a lot.

RAM mostly determines whether you can process larger datasets at all, and it lets you drastically increase the quality of your products by running a number of parameters at higher quality levels.

Processing speed is mostly impacted by CPU base clock rate and CPU core/thread count.
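If you want to check whether a run is actually swapping (the one case where more RAM speeds things up), a quick sketch using the third-party psutil package can log it while ODM works; the sampling window below is arbitrary:

```python
# Poll overall RAM and swap usage once a second while a task runs.
# If swap stays near zero, adding RAM is unlikely to shorten the run.
import time

import psutil  # pip install psutil

for _ in range(60):  # sample for one minute; adjust as needed
    mem = psutil.virtual_memory()
    swap = psutil.swap_memory()
    print(f"RAM used: {mem.percent}%  swap used: {swap.percent}%")
    time.sleep(1)
```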

Thank you, Saijin, for your quick feedback.
Bad news on one side, but good on the other. It's good to be reminded of the importance of RAM for increasing the quality levels of the products.

Thanks a lot!

(I'm reading your comments about clock & core/thread count and looking at my 'almost unused' GPU…)
8-))

We’re getting there! Patience, my friend, haha.

I too would love to let my GPU rip on some data soon.

Hi all, this is also related.

I have been running the new Windows-native WebODM on two computers and have noticed that on both it uses about 30% of available RAM. Processes that succeed on one computer (20 GB RAM) fail on the other (8 GB RAM) using the same parameters. Compared to the older version (running in a VM), the newer native one is much slower and fails with fewer images.

Any idea if there is some kind of setting for RAM allocation in the Windows-native WebODM?

Welcome!

What do those RAM amounts signify? Free system memory?

It should be using less RAM than before.

No settings per se, just tuning of parameters.
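For the 8 GB machine in particular, these are the options folks usually turn down; shown as a PyODM-style options dict, with values that are illustrative guesses rather than tested recommendations:

```python
# Illustrative low-memory option set; tune the values for your data.
low_memory_options = {
    "fast-orthophoto": True,      # build the ortho from the sparse reconstruction
    "feature-quality": "medium",  # extract fewer features per image
    "resize-to": 2048,            # downsample images before matching
}
```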

Those are the total amounts of RAM that each system has. ODM uses about a third of that during processing, regardless of the number of images.

Hmm, that’s odd. Not seeing that behavior locally. Could you please provide full system specs?

The one system is a notebook running an AMD Ryzen 7 3700U with 20 GB of DDR4 RAM. The other is a desktop running an Intel Core i7-8700 with 8 GB of DDR4 RAM. The desktop gets up to 53% RAM utilisation; the notebook stays below 40%.
On the desktop, when running 400 images, RAM utilisation is 53%, and when I open QGIS the process stops with an error.
Could you give me a ballpark figure for the number of images you believe each system should be capable of processing at any one time? (The pics are from a Sony ILCE-5000 and are 24 MP.)

QGIS uses a good amount of RAM/commit, so it isn’t surprising that it is making OpenDroneMap run out of memory.

I'm not sure, especially since I don't know exactly what processing settings you're using, but 400 images at 24 MP is fairly hefty for 20 GB. It's certainly more than I'd expect to be reliable at anything other than low quality with 8 GB of RAM, especially if you're going to be multitasking with other productivity software at the same time.

Hi Saijin, thanks for the speedy replies and willingness to help.
At the moment I am trying to find the optimum balance between processing settings and number of images. I have a stretch of river and its banks that was surveyed, totalling about 7,000 images, which I want to process in batches (roughly sketched at the end of this post) and stitch together in QGIS later on.
Ideally I want 3 cm/px ortho resolution, and fast ortho is fine since I do not need DEMs at this stage.
I am going to try running 300 images using fast ortho at 3 cm resolution and will let you know if that works.
The pictures do not have geotags, and I am using the georeferencing tool in QGIS after processing to get them to fit.
The idea is to use the notebook to process and the desktop (with less RAM and less HDD space) to handle the outputs in QGIS afterwards.
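As promised, here is the rough batching sketch I have in mind; the folder names, extension filter, and overlap count are all placeholders to adjust:

```python
# Hypothetical helper: split a folder of survey photos into batches of ~300,
# with a small overlap between consecutive batches to ease stitching in QGIS.
# Paths, extension case, and overlap size are placeholders, not tested values.
from pathlib import Path
import shutil

SOURCE = Path("river_survey")  # folder with the ~7,000 photos
BATCH_SIZE = 300
OVERLAP = 20                   # photos shared between adjacent batches

photos = sorted(SOURCE.glob("*.JPG"))
step = BATCH_SIZE - OVERLAP

for i, start in enumerate(range(0, len(photos), step)):
    batch_dir = Path(f"batch_{i:03d}")
    batch_dir.mkdir(exist_ok=True)
    for photo in photos[start:start + BATCH_SIZE]:
        shutil.copy2(photo, batch_dir / photo.name)
```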

Fast ortho helps a lot with resource consumption… I would try to avoid running any other processes on the computer that is stitching, and maybe see if you could use a fast flash drive for Windows ReadyBoost if you need more page/swap space.
