MemoryError: Unable to allocate 78.1 GiB

I’ve read this topic: MemoryError: Unable to allocate 91.9 GiB for an array with shape, but since I’m running WebODM and can’t change memory settings, I need to understand whether some particular kind of picture needs more memory, and how to fix it!


Here is the error message I got:

Your task [Task of 2021-09-05T************] could not complete. Here’s a copy of the error message:

0…10…20…30…40…50…60…70…80…90…100 - done.
[INFO] running gdalbuildvrt -resolution highest -r bilinear "/var/www/data/5f9f20df-6fea-4f78-8d1b-8c613b63a816/odm_meshing/tmp/merged.vrt" "/var/www/data/5f9f20df-6fea-4f78-8d1b-8c613b63a816/odm_meshing/tmp/tiles.small_filled.tif" "/var/www/data/5f9f20df-6fea-4f78-8d1b-8c613b63a816/odm_meshing/tmp/tiles.tmp.tif"
0…10…20…30…40…50…60…70…80…90…100 - done.
[INFO] running gdal_translate -co NUM_THREADS=8 -co TILED=YES -co BIGTIFF=IF_SAFER -co COMPRESS=DEFLATE --config GDAL_CACHEMAX 46.95% "/var/www/data/5f9f20df-6fea-4f78-8d1b-8c613b63a816/odm_meshing/tmp/merged.vrt" "/var/www/data/5f9f20df-6fea-4f78-8d1b-8c613b63a816/odm_meshing/tmp/tiles.tif"
Input file size is 106135, 197507
0…10…20…30…40…50…60…70…80…90…100 - done.
[INFO] Starting smoothing…
[INFO] Smoothing iteration 1
100 - done.
Traceback (most recent call last):
File "/code/", line 54, in
retcode = app.execute()
File "/code/stages/", line 130, in execute
raise e
File "/code/stages/", line 94, in execute
File "/code/opendm/", line 340, in run
File "/code/opendm/", line 340, in run
File "/code/opendm/", line 340, in run
[Previous line repeated 3 more times]
File "/code/opendm/", line 321, in run
self.process(self.args, outputs)
File "/code/stages/", line 66, in process
mesh.create_25dmesh(tree.filtered_point_cloud, tree.odm_25dmesh,
File "/code/opendm/", line 26, in create_25dmesh
File "/code/opendm/dem/", line 263, in create_dem
median_smoothing(geotiff_path, output_path)
File "/code/opendm/dem/", line 323, in median_smoothing
arr = ndimage.median_filter(arr, size=5, output=dtype)
File "/usr/local/lib/python3.8/dist-packages/scipy/ndimage/", line 1321, in median_filter
return _rank_filter(input, 0, size, footprint, output, mode, cval,
File "/usr/local/lib/python3.8/dist-packages/scipy/ndimage/", line 1227, in _rank_filter
output = _ni_support._get_output(output, input)
File "/usr/local/lib/python3.8/dist-packages/scipy/ndimage/", line 78, in _get_output
output = numpy.zeros(shape, dtype=output)
MemoryError: Unable to allocate 78.1 GiB for an array with shape (197507, 106135) and data type float32

Are you getting a lot of sky in your images from this dataset? If so, that can cause the reconstruction to get really big.

Another potential issue is one or more of the images having bad geolocation data, which can make the reconstruction extent so massive that it can’t be solved.
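For reference, the 78.1 GiB figure isn’t arbitrary: the smoothing step allocates a single float32 output array covering the entire DEM raster, so the memory requirement follows directly from the raster dimensions in your log. A quick sanity check (the dimensions below are taken from the "Input file size" line and the traceback; this is just arithmetic, not how ODM itself computes anything):

```python
# Estimate the memory scipy's median_filter must allocate:
# one float32 value per raster cell.
rows, cols = 197507, 106135   # raster shape from the MemoryError
bytes_per_cell = 4            # float32 is 4 bytes
gib = rows * cols * bytes_per_cell / 2**30
print(f"{gib:.1f} GiB")       # -> 78.1 GiB, matching the error message
```

So anything that inflates the reconstruction extent (sky, bad geolocation on a few images) inflates this raster, and the allocation grows with it. Shrinking the output raster shrinks the requirement proportionally.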

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.