Machine resources: Pix4D vs WebODM

Hello community, how are you doing today?

I’ve been testing out WebODM after my Pix4D trial expired, on my MacBook Pro 2015 (16 GB RAM, 500 GB SSD, no GPU acceleration, since it has a Radeon video card and runs Big Sur).

The first thing I noticed is that Pix4D manages to process anything (for example, a set of 250 pictures), while WebODM consumes every available resource and fails to process the same batch.

What is the difference in approaches?

It’s a bummer that I could process anything in Pix4D but that I’d need a new computer for WebODM.

Any ideas?

Thanks


Hi @peperoca,
I saw that you are also in Uruguay: hello :cowboy_hat_face:

With 16 GB of RAM I was able to run 500-image datasets in around 2 hours at standard settings.
Make sure to have enough pagefile (swap) memory available (2× RAM size).
That produced a very usable aerial map to take measurements from.
Pix4D is amazing software, and so is WebODM. My experience is that you can get the same or better results with WebODM, but you will need to put in some extra energy and be ready to experiment.
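The 2× swap rule above works out like this (a minimal sketch, assuming the 16 GB machine from this thread):

```shell
ram_gb=16                    # physical RAM on the machine
swap_gb=$((ram_gb * 2))      # recommended pagefile/swap size per the 2x rule
echo "RAM: ${ram_gb} GB -> recommended swap: ${swap_gb} GB"
```

So for your MacBook that would mean about 32 GB of swap available to the processing environment.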

I do not know your situation, but depending on your type of application one software may serve you better than another. A new computer for 250 images will probably not be needed.

But you have the option of letting the community know what your exact problem is and we will probably find a way to make it work.
If you want, you could share some more info about the problem, maybe a screenshot?
If you upload the images, I would be willing to run them through and see how well they compute.

I am using WebODM to plan permaculture earthworks: lakes, swales, food forests, etc. Plans are made from the contour lines and aerial maps produced with WebODM. Then land surveyors, excavators and trucks come in to do the work. So I can confirm that the measurements in WebODM are good enough to move earth and to plan lakes, swales and drainages from them. It also helps me a lot to estimate project costs.

Btw I am located in Treinta y Tres, some kilometers outside of town.
I would be curious to hear from you again :grin:

Have a good one!

Shiva


Hello @Shiva! How are you?

Thanks for the reply.

I actually lived in Treinta y Tres from 2008 to 2010 while building the Galofer thermoelectric plant. I was the mechanical engineer on site for the Brazilian engineering company the client hired.

Regarding your experience with WebODM: are you running it natively on Windows, or through Docker on a Mac, as I am?

Backpedalling a bit: I gave Docker 8 GB of my 16 GB of RAM, but since everything runs in a virtual machine I didn’t want to leave the host OS with too little memory. I could try pushing it some more, perhaps to 12 GB.

When you refer to “readiness to try things out”, what exact parameters do you have in mind?

Thanks again!

I just finished writing and I don’t know why I put it all in English!

A hug


4 GB left for the host OS is usually okay, provided you don’t run other things while processing. Don’t neglect to give the Docker runtime the full 4 GB swap volume size as well.
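As a quick sanity check of that split (a sketch, assuming the 16 GB machine and the 12 GB Docker allocation discussed above):

```shell
total_gb=16                        # physical RAM in the MacBook Pro
docker_gb=12                       # allocation being considered for Docker
host_gb=$((total_gb - docker_gb))  # what remains for the host OS
echo "Docker gets ${docker_gb} GB; host OS keeps ${host_gb} GB"
```

That leaves the 4 GB for macOS mentioned above, which should be fine as long as nothing heavy runs alongside the processing.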

Never tried Pix4D, so I can’t compare, but with WebODM/ODM you need to find the right settings yourself, and that will take some testing.

I’ve had problems, but I’ve found solutions after a while.

Good morning!

Yes, I live in Uruguay, but I’m still learning Spanish. I live in a small eco-village near the Quebrada de los Cuervos. There are many of us Europeans (Netherlands, Germany, Italy, Switzerland) and some Uruguayans. But we speak English most of the time.

Amazing! Sometimes we go to that plant to collect ash or dry rice husks for our garden and for making adobe bricks.

But for these technical subjects, I still prefer English :sweat_smile:

I am using it in a Docker container on an Ubuntu 20.04 installation. The Docker setup is very easy to use.

The more RAM you can give it, the better. And as Saijin_Naib says, give the Docker instance as much swap / virtual memory as possible, as this will certainly help the process. A 250-image set can use 40 GB of memory.

I do not know the dataset you are using, but often a good first step is to increase min_num_features to something like 32,000 or even more. A second step is to raise feature_quality to high or even ultra. But watch that setting: it increases processing time and hardware usage.
Very rarely I even had to go back and capture more images because there was too little overlap or just not enough features. Uruguayan meadows and fields can have very indistinct, repetitive features. Very steep slopes can also be challenging.
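For reference, those two options map to ODM’s command-line flags, so you can also experiment with them outside the WebODM interface. A sketch of a containerized run (the host path and project name here are hypothetical; it assumes your images live in ./datasets/project/images):

```shell
# Hypothetical ODM run via Docker with the settings discussed above.
docker run -ti --rm \
  -v "$(pwd)/datasets:/datasets" \
  opendronemap/odm \
  --project-path /datasets project \
  --min-num-features 32000 \
  --feature-quality high
```

In WebODM itself, the same parameters are available under the task’s Options panel, which is usually the easier place to experiment.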

If the project still does not work for you, you could also share the dataset. I would be willing to run it on my machine and see how it computes.
That way I/the community could give you more concrete hints.

Hope you will get your images computed well!

A hug
