Weird Out of Memory Error

[INFO]    running gdal_translate -co NUM_THREADS=48 -co TILED=YES -co BIGTIFF=IF_SAFER -co COMPRESS=DEFLATE --config GDAL_CACHEMAX 49.3% "/var/www/data/7afa8772-faae-4a17-aa8c-77b296aa278e/odm_dem/merged.vrt" "/var/www/data/7afa8772-faae-4a17-aa8c-77b296aa278e/odm_dem/tiles.tif"
Input file size is 269246, 168057
0...10...20...30...40...50...60...70...80...90...100 - done.
[INFO]    Starting smoothing...
100 - done.
Traceback (most recent call last):
  File "/code/run.py", line 59, in <module>
    retcode = app.execute()
  File "/code/stages/odm_app.py", line 130, in execute
    raise e
  File "/code/stages/odm_app.py", line 94, in execute
    self.first_stage.run()
  File "/code/opendm/types.py", line 346, in run
    self.next_stage.run(outputs)
  File "/code/opendm/types.py", line 346, in run
    self.next_stage.run(outputs)
  File "/code/opendm/types.py", line 346, in run
    self.next_stage.run(outputs)
  [Previous line repeated 6 more times]
  File "/code/opendm/types.py", line 327, in run
    self.process(self.args, outputs)
  File "/code/stages/odm_dem.py", line 97, in process
    commands.create_dem(
  File "/code/opendm/dem/commands.py", line 263, in create_dem
    median_smoothing(geotiff_path, output_path)
  File "/code/opendm/dem/commands.py", line 318, in median_smoothing
    nodata_locs = numpy.where(arr == nodata)
  File "<__array_function__ internals>", line 5, in where
numpy.core._exceptions.MemoryError: Unable to allocate 641. GiB for an array with shape (43040439273, 2) and data type int64

The number of photos is 2100, and I have 640 GB of RAM, yet it is trying to allocate 641 GiB. Any suggestions on what steps need to be taken?
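As a sanity check on those numbers: the reported allocation is consistent with int64 index arrays covering roughly 43 billion matching pixels of the 269246 x 168057 raster. The snippet below is only an illustrative sketch, not ODM's actual fix; the toy array and nodata value are invented for demonstration. It verifies the arithmetic and shows how a boolean mask locates nodata cells far more cheaply than materializing index arrays with `numpy.where`:

```python
import numpy as np

# The failing call builds explicit index positions for every matching
# pixel. The error reports shape (43040439273, 2) with dtype int64,
# i.e. 8 bytes per element, which works out to the 641 GiB it asked for:
n_matches = 43040439273
gib = n_matches * 2 * 8 / 2**30
print(round(gib))  # 641

# A boolean mask costs one byte per pixel regardless of how many match,
# and supports in-place assignment without index arrays.
arr = np.array([[1.0, -9999.0], [3.0, -9999.0]])  # toy raster, -9999 = nodata
nodata = -9999.0
mask = arr == nodata   # 1 byte per pixel
arr[mask] = np.nan     # e.g. replace nodata cells in place
```

On a raster this size the mask alone is still ~42 GiB, but that is an order of magnitude less than the index arrays; processing the raster in tiled windows would reduce it further.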


Perhaps similar to Running out of memory using WEBODM Lightning - #4 by gencomstewart?


Shouldn’t we then force auto-boundary to True always? :thinking:


It’s a relatively new feature, and we often don’t default younger features to true, but it’s worth consideration/discussion.


There have been a non-zero number of Support cases where auto-boundary failed to calculate an appropriate boundary and caused a failure to process. So, it will need some bolstering to allow for graceful fail-over if it is to be the default.

That being said, I agree, it should be a default.


Ah, yeah. Gotta fix that before foisting it on everyone… :smiley:


Yeah, it’s not a mature feature; more testing would be appropriate. The biggest reason for not setting it as the default is that in some cases it can also aggressively crop too much of the reconstruction area, depending on the flight. But it’s certainly worth discussing for inclusion as a default.
