Insufficient memory on Google Compute



I tried creating a VM on Google Compute to process a dataset of 542 images, running Ubuntu 18.04 with WebODM pulled from git two days ago. The VM has 16 vCPUs with 60 GB of memory. The settings used were fast-orthophoto, ignore-gsd, and orthophoto-resolution. For the last setting, I tried several values: the default, 2, 1.5, and 1. All ran fine except the last, where WebODM produced an insufficient-memory error. The run with the default value was a full run, while the others were restarted from the odm_orthophoto stage. I tried resolution 1 again after upgrading the VM to 24 vCPUs and 90 GB of memory, but WebODM still complained about insufficient memory.

The last couple of lines from the console look like this:

[DEBUG] running /code/build/bin/odm_orthophoto -inputFile /var/www/data/960a7bd2-219f-4ba1-a56e-0d81d184e4e1/odm_texturing_25d/odm_textured_model_geo.obj -logFile /var/www/data/960a7bd2-219f-4ba1-a56e-0d81d184e4e1/odm_orthophoto/odm_orthophoto_log.txt -outputFile /var/www/data/960a7bd2-219f-4ba1-a56e-0d81d184e4e1/odm_orthophoto/odm_orthophoto.png -resolution 100.0 -outputCornerFile /var/www/data/960a7bd2-219f-4ba1-a56e-0d81d184e4e1/odm_orthophoto/odm_orthophoto_corners.txt
OpenCV Error: Insufficient memory (Failed to allocate 43503955636 bytes) in OutOfMemoryError, file /code/SuperBuild/src/opencv/modules/core/src/alloc.cpp, line 52
Couldn't allocate enough memory to render the orthophoto (113076x96183 cells = 43503955632 bytes). Try to reduce the -resolution parameter or add more RAM.
Traceback (most recent call last):
  File "/code/", line 47, in <module>
    plasm.execute(niter=1)
  File "/code/scripts/", line 94, in process
    '-outputCornerFile {corners}'.format(**kwargs))
  File "/code/opendm/", line 36, in run
    raise Exception("Child returned {}".format(retcode))
Exception: Child returned 1
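For what it's worth, the allocation size in the log is consistent with a simple cells-times-bytes estimate: 113076 x 96183 cells at 43503955632 bytes works out to exactly 4 bytes per cell, and the cell count grows quadratically as the resolution value (cm/px) shrinks. A rough sanity check (the 4-bytes-per-cell figure is inferred from the numbers in this log, not taken from the ODM source):

```python
# Reproduce the allocation size from the error message above.
width, height = 113076, 96183   # cell grid reported at orthophoto-resolution=1
bytes_per_cell = 4              # inferred: 43503955632 / (113076 * 96183) == 4

required = width * height * bytes_per_cell
print(f"{required} bytes (~{required / 1024**3:.1f} GiB)")

# Halving the orthophoto-resolution value (cm/px) doubles both grid
# dimensions, so the memory needed grows roughly 4x.
for res in (2.0, 1.5, 1.0):
    scale = 1.0 / res  # relative to the 1 cm/px run above
    est = int(width * scale) * int(height * scale) * bytes_per_cell
    print(f"resolution {res}: ~{est / 1024**3:.1f} GiB")
```

This shows why the jump from 2 to 1 is so much harsher than the jump from 2 to 1.5: the grid area, and with it the render buffer, quadruples.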

I thought 90 GB of memory would be enough in this case. If I want to keep orthophoto-resolution=1, what should I do?


Welcome to the forum! There are a few things that you could try:
The first is to go up another size in VM (probably expensive).
The second is to create a swap file and use it to extend your RAM artificially (at the expense of both performance and disk space).
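For the swap-file route, a common recipe on Ubuntu looks like the following; the 64G size is just an example, size it to cover the shortfall you saw:

```shell
# Create and enable a 64 GB swap file (example size; adjust as needed).
sudo fallocate -l 64G /swapfile
sudo chmod 600 /swapfile        # swap files must not be world-readable
sudo mkswap /swapfile           # format the file as swap space
sudo swapon /swapfile           # enable it immediately
swapon --show                   # verify the new swap is active

# Optional: make it persistent across reboots via /etc/fstab.
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab
```

Keep in mind that once odm_orthophoto starts paging into swap it will run much slower than in RAM, so this is best as a stopgap rather than a substitute for memory.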
A third option, and probably cheapest unless you’re doing many surveys, would be to outsource the processing to a provider:

The first of the two is run by the main developer of WebODM.


Thank you for your suggestions. I upped the VM to 24 vCPUs and 156 GB of memory and the task completed. Peak memory usage was 94.1 GB, so I clearly underestimated how much memory was required.

The result looks very good and comparable to the orthophoto produced by Photoscan on the same dataset. This is a very impressive piece of software! Thank you to all the devs and contributors of the project!
