[INFO] running gdal_translate -co NUM_THREADS=48 -co TILED=YES -co BIGTIFF=IF_SAFER -co COMPRESS=DEFLATE --config GDAL_CACHEMAX 49.3% "/var/www/data/7afa8772-faae-4a17-aa8c-77b296aa278e/odm_dem/merged.vrt" "/var/www/data/7afa8772-faae-4a17-aa8c-77b296aa278e/odm_dem/tiles.tif"
Input file size is 269246, 168057
0...10...20...30...40...50...60...70...80...90...100 - done.
[INFO] Starting smoothing...
100 - done.
Traceback (most recent call last):
  File "/code/run.py", line 59, in <module>
    retcode = app.execute()
  File "/code/stages/odm_app.py", line 130, in execute
    raise e
  File "/code/stages/odm_app.py", line 94, in execute
    self.first_stage.run()
  File "/code/opendm/types.py", line 346, in run
    self.next_stage.run(outputs)
  File "/code/opendm/types.py", line 346, in run
    self.next_stage.run(outputs)
  File "/code/opendm/types.py", line 346, in run
    self.next_stage.run(outputs)
  [Previous line repeated 6 more times]
  File "/code/opendm/types.py", line 327, in run
    self.process(self.args, outputs)
  File "/code/stages/odm_dem.py", line 97, in process
    commands.create_dem(
  File "/code/opendm/dem/commands.py", line 263, in create_dem
    median_smoothing(geotiff_path, output_path)
  File "/code/opendm/dem/commands.py", line 318, in median_smoothing
    nodata_locs = numpy.where(arr == nodata)
  File "<__array_function__ internals>", line 5, in where
numpy.core._exceptions.MemoryError: Unable to allocate 641. GiB for an array with shape (43040439273, 2) and data type int64
The dataset has 2100 photos, and the machine has 640 GB of RAM, yet the process is trying to allocate 641 GiB. Any suggestions on what steps need to be taken?
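For context, the 641 GiB figure follows directly from the shape in the error: `numpy.where(arr == nodata)` materializes a pair of int64 index arrays, one row/column entry per matching pixel, so roughly 43 billion nodata pixels cost 16 bytes each. A minimal sketch of that arithmetic, plus a boolean-mask approach that avoids building the giant index array (this is an illustration, not the actual ODM fix):

```python
import numpy as np

# Reconstruct the allocation reported in the traceback:
# numpy.where builds an index array of shape (n_matches, 2) in int64,
# i.e. 2 indices x 8 bytes per matching pixel.
n_nodata = 43040439273                 # matching pixels from the error message
bytes_needed = n_nodata * 2 * 8
print(bytes_needed / 2**30)            # ~641 GiB, matching the MemoryError

# A boolean mask costs only 1 byte per pixel instead of 16, and can be
# used directly for in-place replacement without index arrays.
# (Tiny toy array for illustration; -9999 is a typical DEM nodata value.)
arr = np.array([[1.0, -9999.0],
                [3.0, -9999.0]])
nodata = -9999.0
mask = arr == nodata                   # boolean mask, no index array materialized
arr[mask] = 0.0                        # replace nodata cells in place
```

Whether ODM can be patched this way is for its maintainers to judge; as a workaround on the user side, reducing the DEM output resolution (and hence the raster size, here 269246 x 168057 pixels) shrinks all of these intermediate arrays proportionally.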