Running out of memory using WEBODM Lightning

Hello All,
I was wondering if someone can help me out with this one, please. I stumbled across WebODM Lightning and thought it would be a great way into photogrammetry, as I don't have a good enough PC to do anything like this locally.
I've looked through the forum for similar issues, but the ones I've come across mostly concern processing on a local PC, not the cloud service.
My problem is that I'm trying to make a 3D model of a radio tower (about 519 photos, 20 megapixels each). All appears to go well for about four and a half hours, then it crashes with a memory-related error. The email I get back from the service (beyond telling me it failed) includes this:

Your task [Hallam Trig 3] could not complete. Here’s a copy of the error message:


Input file size is 81443, 67423

0…10…20…30…40…50…60…70…80…90…100 - done.

[INFO] running gdalbuildvrt -resolution highest -r bilinear "/var/www/data/d7dc7ba6-c3e9-4206-b898-cffec3382ed8/odm_meshing/tmp/merged.vrt" "/var/www/data/d7dc7ba6-c3e9-4206-b898-cffec3382ed8/odm_meshing/tmp/tiles.small_filled.tif" "/var/www/data/d7dc7ba6-c3e9-4206-b898-cffec3382ed8/odm_meshing/tmp/tiles.tmp.tif"

0…10…20…30…40…50…60…70…80…90…100 - done.

[INFO] running gdal_translate -co NUM_THREADS=16 -co TILED=YES -co BIGTIFF=IF_SAFER -co COMPRESS=DEFLATE --config GDAL_CACHEMAX 44.75% "/var/www/data/d7dc7ba6-c3e9-4206-b898-cffec3382ed8/odm_meshing/tmp/merged.vrt" "/var/www/data/d7dc7ba6-c3e9-4206-b898-cffec3382ed8/odm_meshing/tmp/tiles.tif"

Input file size is 81443, 67423

0…10…20…30…40…50…60…70…80…90…100 - done.

[INFO] Starting smoothing…

100 - done.

Traceback (most recent call last):
  File "/code/run.py", line 59, in <module>
    retcode = app.execute()
  File "/code/stages/odm_app.py", line 130, in execute
    raise e
  File "/code/stages/odm_app.py", line 94, in execute
    self.first_stage.run()
  File "/code/opendm/types.py", line 346, in run
    self.next_stage.run(outputs)
  File "/code/opendm/types.py", line 346, in run
    self.next_stage.run(outputs)
  File "/code/opendm/types.py", line 346, in run
    self.next_stage.run(outputs)
  [Previous line repeated 3 more times]
  File "/code/opendm/types.py", line 327, in run
    self.process(self.args, outputs)
  File "/code/stages/odm_meshing.py", line 65, in process
    mesh.create_25dmesh(tree.filtered_point_cloud, tree.odm_25dmesh,
  File "/code/opendm/mesh.py", line 28, in create_25dmesh
    commands.create_dem(
  File "/code/opendm/dem/commands.py", line 263, in create_dem
    median_smoothing(geotiff_path, output_path)
  File "/code/opendm/dem/commands.py", line 318, in median_smoothing
    nodata_locs = numpy.where(arr == nodata)
  File "<__array_function__ internals>", line 5, in where
numpy.core._exceptions.MemoryError: Unable to allocate 65.4 GiB for an array with shape (4388850682, 2) and data type int64
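If it helps, the 65.4 GiB figure does seem to line up with the raster size shown earlier in the log. A quick sanity check (my own arithmetic, nothing official):

```python
# Back-of-the-envelope check of the MemoryError (my own arithmetic,
# not from the ODM source). The failed allocation holds a (row, col)
# pair of int64 values -- 16 bytes -- per nodata pixel found.
nodata_pixels = 4_388_850_682             # from the error's array shape
bytes_needed = nodata_pixels * 2 * 8      # two int64 indices per pixel
print(f"{bytes_needed / 2**30:.1f} GiB")  # -> 65.4 GiB, matching the error

# For scale, the DEM is 81443 x 67423 pixels ("Input file size" above),
# so roughly 80% of the raster is nodata -- a hint the reconstruction
# covers far more ground than the tower itself.
total_pixels = 81_443 * 67_423
print(f"{nodata_pixels / total_pixels:.0%} nodata")
```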

The photos can be downloaded from here :
https://drive.google.com/drive/folders/1jnL5INItoT8rUK4oeZVdP0zD-VPkCI8M?usp=sharing

I think I left all the options at their defaults, with the exception of pc-quality = High and min-num-features = 20000.

Has anyone else come across this?

I would have thought that with a cloud service, memory wouldn't be an issue? (Please bear in mind I've hardly played with this at all, being a noob.) I've given it about four goes, all with the same result.

Thoughts?

Any help is greatly appreciated.

Thank you !


Try enabling the auto-boundary task option; it’s likely that part of the background around the tower, far off in the distance, is being included in the reconstruction and causing the program to attempt to generate a massive elevation model (incorrectly).
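As a rough illustration of the scaling involved (made-up numbers, just to show the effect):

```python
# Made-up illustrative numbers: if stray background points stretch the
# reconstruction extent from ~100 m to ~1 km per side, the DEM raster
# grows with the *square* of that factor at a fixed resolution.
tight_extent_m = 100     # area around the tower only
sprawl_extent_m = 1000   # with far-off background included
growth = (sprawl_extent_m / tight_extent_m) ** 2
print(f"{growth:.0f}x more pixels")  # -> 100x more pixels
```

Auto-boundary clips the reconstruction to the area around where the photos were actually taken, which keeps that extent small.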


Thanks very much Pierotofy. I’ll give that a go!
Cheers!


Again, thanks very much Pierotofy. I've given that a go and have been able to complete the project.
Now I've just got to explore how to view the 3D model (MeshLab?)

Will see how I go.
Thanks very much again.
Very much appreciated.
Cheers.
