Cannot process dataset, fails after 235 hours

If you start a thread with your processing parameters and dataset, I’ll try my best to get you sorted.

1 Like

Thanks - I have a thread on the go already - see Processing stopped because of strange values in the reconstruction - WebODM - OpenDroneMap Community.

It looks like Gordon, jvhooser, and djinvite are having very similar problems.

3 Likes

:grimacing:

Sorry. I’m a bit scattered since I’m working mobile and traveling.

This week I’m back to the ol’ desk. I’ll follow up with everyone then and hopefully try processing.

3 Likes

No worries…

1 Like

Can we get any clues before the series comes out? I haven’t yet been able to process even a 6450-image dataset with default quality settings on a Ryzen 7 5800X with 128GB of memory.

1 Like

I use this as my starting point for RAM recommendations.

This assumes swap equal to 1 to 2x the available RAM. At 6000+ images and 128GB of RAM, you’ll almost certainly run out of memory. You could do 2500 images easily if you also have 128 to 256GB of swap.
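For rough planning, the data points in this thread can be turned into a back-of-the-envelope estimate. The ~0.1 GB per image figure below is an assumption extrapolated from the 2500-images-on-128GB-RAM-plus-swap example above; it is not an official ODM sizing formula:

```shell
#!/bin/sh
# Back-of-the-envelope estimate: ~0.1 GB of combined RAM + swap per image
# at default quality settings (an assumption extrapolated from this thread,
# not an official ODM sizing formula).
images=2500
needed_gb=$((images / 10))
echo "~${needed_gb} GB of combined RAM + swap for ${images} images"
# prints: ~250 GB of combined RAM + swap for 2500 images
```

You can compare the estimate against what your machine actually has with `free -h` on Linux.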

2 Likes

Well, shoot. That certainly puts a crick in my plans to ultimately do tens of thousands of images at once! Thanks for the info though.

2 Likes

Yeah. It’s the right tool, but it does require some metal to get things done. Split merge can help though.

2 Likes

Ah, ok. So, changing the split default from 999999 to something closer to 2500 may increase the probability of success?
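For reference, `split` and `split-overlap` are the relevant task options (in WebODM you set them in the task options dialog). A sketch of what the equivalent command-line invocation might look like, where `/datasets` and `project` are placeholder paths:

```shell
# Run ODM with split-merge: partition the dataset into submodels of
# roughly 2500 images, with 150 m of overlap between submodels
# (150 is the default overlap). /datasets and project are placeholders.
docker run -ti --rm -v /datasets:/datasets opendronemap/odm \
  --project-path /datasets project \
  --split 2500 \
  --split-overlap 150
```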

2 Likes

Yes!

1 Like

I’ve had a task with about 7800 images make it to completion using 0.5X resizing on an i7-10700K 3.8GHz, 96GB RAM, 1TB SSD. Every task I’ve tried with split-merge, 0.5X resizing, and up to ~9400 images has failed.

1 Like

Yes, and if this is anyone’s experience, we can help troubleshoot a lot easier if you share the data publicly or privately. Otherwise it’s very difficult for us to help. :smiley:

I’ve processed up to 90,000 images with split merge. In the process, we discovered a hard limit of ~80,000 due to a zip library limitation. That limitation is now gone. We don’t know the new image processing number ceiling, but it’s pretty high.

2 Likes

Success with 9400 images.

3 Likes

Interesting. What’s your hardware build and OS?

1 Like

Windows 10

1 Like

itenviro, what’s your hardware build and OS?

1 Like

Win 10, 128GB RAM, Ryzen 9 5950X, 2TB SSD

1 Like

Man. My build is pretty similar to yours: Win 10, 128GB, 5800X, 1TB SSD main drive, with jobs processed to an 8TB HDD. I’d think I could process jobs at least somewhere in your realm, but I’m just having issue after issue.

How many images have you processed successfully? What are your settings and image specs?

1 Like

Sorry for the post; I have a similar problem on Mac.

It seems a recent update broke the installation of numpy.

The answer to this problem was revealed in another thread.

The solution is to force a reinstall of the correct version of numpy. From the host, run the command against the NodeODM container:

```
docker exec webodm_node-odm_1 python3 -m pip install --upgrade --force-reinstall numpy
```

(Your container name may differ; `webodm_node-odm_1` is typical.)
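To confirm the reinstall took, you can ask numpy to print its version from inside the container (same container-name caveat as above):

```shell
# Verify numpy imports cleanly inside the NodeODM container and print
# its version; replace webodm_node-odm_1 if your container is named
# differently.
docker exec webodm_node-odm_1 python3 -c "import numpy; print(numpy.__version__)"
```

If the import still fails, the error message here will say so directly, which is easier to read than a full processing log.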

fixed my problem

2 Likes