First project - can't get it to succeed

Hello, all,

I just installed Docker and WebODM last night and have tried several times to get it to work. My processing always fails with error 1. For the moment I’m just trying it with cell phone images, before I work up to actual drone images. I started with a cluster of 28 images of a bottle of glue, taken from multiple angles. That didn’t work, so I’ve worked my way down to just six images at this point. Here is the link to the images and the output; I don’t know what I’m doing wrong:

!AmcHRyKotEWU6tNq6WcB8PwY5SqpxQ?e=9w4Kbg

You’ll have more luck with flight-planned drone images. It’s actually moderately difficult to do photogrammetry by hand the way you’re trying.

Thanks for the reply. So you don’t think I did something wrong with my install then?

I’ll have to wait a bit on flying a plan. I just got my drone, and had to send it back as defective, so I’m waiting on the replacement now.

Try one of the bolded datasets here:

Well, I did the Boruszyn set, and it failed. Here’s the last part of the output:

Building BVH from 198749 faces...
done. (Took: 325 ms)
Calculating face qualities 100%... done. (Took 42.357s)
Postprocessing face infos 100%... done. (Took 1.087s)
Maximum quality of a face within an image: 5761.48
Clamping qualities to 40.9106 within normalization.
Writing data cost file...
Time[s]	Energy
0	189565
2	188100
2	186451
3	185676
3	185319
4	185125
4	185002
5	184921
5	184856
6	184822
6	184791
6	184769
7	184739
56451 faces have not been seen
Took: 51.646s
Generating texture patches:
done. (Took 25.108s)
5714 texture patches.
Running global seam leveling:
Create matrices for optimization...
Lhs dimensionality: 104249 x 104249
Calculating adjustments:
Color channel 2: CG took 89 iterations. Residual is 9.89471e-05
Color channel 0: CG took 89 iterations. Residual is 9.39968e-05
Color channel 1: CG took 89 iterations. Residual is 9.9316e-05
Took 0.511 seconds
Adjusting texture patches 100%... done. (Took 337.871s)
Running local seam leveling:
Traceback (most recent call last):
File "/code/", line 56, in <module>
File "/code/stages/", line 94, in execute
File "/code/opendm/", line 375, in run
File "/code/opendm/", line 375, in run
File "/code/opendm/", line 375, in run
File "/code/opendm/", line 375, in run
File "/code/opendm/", line 375, in run
File "/code/opendm/", line 375, in run
File "/code/opendm/", line 375, in run
File "/code/opendm/", line 356, in run
self.process(self.args, outputs)
File "/code/stages/", line 96, in process
'-n {nadirWeight}'.format(**kwargs))
File "/code/opendm/", line 76, in run
raise Exception("Child returned {}".format(retcode))
Exception: Child returned 137

Your installation works, but you ran out of memory. How many cores do you have and how much memory?
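For context (an aside, not from the thread): exit code 137 is 128 + 9, meaning the process was killed with SIGKILL, which on Linux typically indicates the kernel's out-of-memory killer stepped in. A minimal sketch of decoding such a code:

```shell
#!/bin/sh
# Exit codes above 128 mean "killed by signal (code - 128)".
code=137
if [ "$code" -gt 128 ]; then
  # 137 -> signal 9 (SIGKILL), often the OOM killer on memory-starved hosts
  echo "killed by signal $((code - 128))"
else
  echo "exited normally with status $code"
fi
```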

The PC is a Surface Book 2:
8th Gen Intel® Core™ i7-8650U quad-core processor, 4.2GHz Max Turbo
There are 4 cores and 8 “logical processors”
NVidia GTX 1050

I had Docker Desktop set up to use 3 CPUs, 8192 MB RAM, a swap file of 2560 MB, and 128 GB Hard Drive size. That said, I just noticed that when I restart Docker, it goes back to default VM disk location, and default settings, so I’m trying again. I can’t be sure of what the settings were the last time I tried, given this tendency to revert to default that I hadn’t noticed before.

Ahh, for most modest datasets you’ll need more RAM than that, for sure. But you can try bumping the RAM on the Docker instance to something closer to 16 GB, and set the --max-concurrency flag to 1 to reduce the number of cores used, which will help with RAM usage.
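As a concrete sketch of the second suggestion (paths are placeholders; verify the flags against the ODM docs for your version), running ODM directly with reduced parallelism looks like this. Note that on Docker Desktop the total memory available to containers is set in the Docker Desktop settings GUI, not on the command line:

```shell
# Hypothetical dataset path; --max-concurrency 1 trades speed for lower peak RAM.
docker run -ti --rm \
  -v /path/to/dataset:/datasets/code \
  opendronemap/odm \
  --project-path /datasets \
  --max-concurrency 1
```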

In general, unless it’s a beefy gaming laptop with lots of RAM, a laptop isn’t a great environment for doing photogrammetry, whether ODM or other photogrammetric tools.


Ok. Thanks for the help. I guess my only option, if this is what I want to do (for now, anyway), is one of the expensive online services.

Check out It should be free for most small datasets.


Thank you. At this point I’m just wanting to try it out to see if I can drum up demand in the area.

@smathermather is there any formula to predict the RAM requirement for a given dataset size? I have a set of around 900 images and it ran just fine with the default DSM+DTM options in a Docker environment with 26 GB RAM, 4 GB swap, and 4 cores / 4 threads (32 GB RAM on the machine).
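I'm not aware of an official formula; peak RAM depends on image resolution, overlap, and options (e.g. orthophoto resolution), not just image count. As a purely illustrative sketch, you could interpolate over your own observed (image count → peak RAM) data points. The numbers below are placeholders, not measurements, except that the 900-image point loosely echoes the 26 GB + 4 GB swap figure reported above:

```python
# Purely illustrative: linear interpolation over hypothetical observations.
# Replace OBSERVED with your own (image_count, peak_ram_gb) measurements.
OBSERVED = [(100, 8), (300, 16), (900, 30), (1500, 64)]

def estimate_ram_gb(n_images: int) -> float:
    """Linearly interpolate an expected peak RAM (GB) for a given image count."""
    pts = sorted(OBSERVED)
    if n_images <= pts[0][0]:
        return float(pts[0][1])
    if n_images >= pts[-1][0]:
        return float(pts[-1][1])
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= n_images <= x1:
            return y0 + (y1 - y0) * (n_images - x0) / (x1 - x0)
```

For example, with these placeholder points, 600 images would interpolate to 23 GB. The estimate is only as good as the measurements you feed it, and a change in options (like a finer --orthophoto-resolution) shifts the whole curve.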

However, I needed a clearer orthophoto, so I reduced the ortho-resolution parameter from 5 to 3. Now I always get the error mentioned above at the local seam leveling step. I tried splitting the dataset into 200-image submodels with the default split overlap setting, but it still gets stuck at that step.

I would consider upgrading my motherboard to something with four memory slots and buying two more 16 GB or 32 GB RAM sticks, if my higher-resolution dataset can be processed within a 32-64 GB RAM allocation, since a higher-resolution ortho map would be beneficial to us.
