I’m new here, but hopefully you will see more of me as I am determined to get to grips with WebODM. Apologies if my post is in the wrong forum section, or not formatted correctly. Feedback welcome.
I have been trying to get to grips with WebODM via the very reasonably priced Lightning service. A task has just failed, and the console indicates that the node ran out of RAM. Is this possible? Am I configuring something wrong? The message provided was as follows:
It looks like your processing node ran out of memory. If you are using docker, make sure that your docker environment has enough RAM allocated. Alternatively, make sure you have enough physical RAM, reduce the number of images, make your images smaller, or reduce the max-concurrency parameter from the task’s options. You can also try to use a cloud processing node.
Last few lines of the console output:
[INFO] DEM input file /var/www/data/10464cc7-6e55-4987-b4a7-fd4dd96a6b3c/odm_georeferencing/odm_georeferenced_model.laz found: True
[WARNING] DEM will not be generated
[INFO] Finished odm_dem stage
[INFO] Running odm_orthophoto stage
[WARNING] Maximum resolution set to GSD - 10.0% (12.8 cm / pixel, requested resolution was 2.0 cm / pixel)
[INFO] running /code/build/bin/odm_orthophoto -inputFiles /var/www/data/10464cc7-6e55-4987-b4a7-fd4dd96a6b3c/odm_texturing_25d/odm_textured_model_geo.obj -logFile /var/www/data/10464cc7-6e55-4987-b4a7-fd4dd96a6b3c/odm_orthophoto/odm_orthophoto_log.txt -outputFile /var/www/data/10464cc7-6e55-4987-b4a7-fd4dd96a6b3c/odm_orthophoto/odm_orthophoto_render.tif -resolution 7.80958091753981 -outputCornerFile /var/www/data/10464cc7-6e55-4987-b4a7-fd4dd96a6b3c/odm_orthophoto/odm_orthophoto_corners.txt
Error in OdmOrthoPhoto:
OpenCV(4.5.0) /code/SuperBuild/src/opencv/modules/core/src/alloc.cpp:73: error: (-4:Insufficient memory) Failed to allocate 36207239524 bytes in function 'OutOfMemoryError'
Traceback (most recent call last):
File "/code/run.py", line 68, in <module>
app.execute()
File "/code/stages/odm_app.py", line 81, in execute
self.first_stage.run()
File "/code/opendm/types.py", line 338, in run
self.next_stage.run(outputs)
File "/code/opendm/types.py", line 338, in run
self.next_stage.run(outputs)
File "/code/opendm/types.py", line 338, in run
self.next_stage.run(outputs)
[Previous line repeated 6 more times]
File "/code/opendm/types.py", line 319, in run
self.process(self.args, outputs)
File "/code/stages/odm_orthophoto.py", line 73, in process
system.run('{bin}/odm_orthophoto -inputFiles {models} '
File "/code/opendm/system.py", line 79, in run
raise Exception("Child returned {}".format(retcode))
Exception: Child returned 1
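For scale, the failed allocation in that log is a single buffer of about 36 GB. Assuming a 4-band raster with 32-bit samples (my assumption for illustration; the log doesn’t say exactly what odm_orthophoto was allocating), the arithmetic looks like this, and it shows why lowering the requested resolution helps so much:

```python
# Back-of-envelope look at the failed 36,207,239,524-byte allocation.
# The 4-band / 32-bit layout is an assumption, not stated in the log.
failed_bytes = 36_207_239_524
bands, bytes_per_sample = 4, 4

pixels = failed_bytes // (bands * bytes_per_sample)
side = round(pixels ** 0.5)  # edge length if the raster were square
print(f"~{pixels:,} pixels, roughly {side:,} x {side:,} if square")

# Memory grows with the square of resolution: halving cm/pixel
# (i.e. doubling resolution) quadruples the buffer size.
def raster_bytes(width_m, height_m, cm_per_px, bands=4, bps=4):
    """Bytes needed for a raster covering width_m x height_m metres."""
    return (width_m * 100 / cm_per_px) * (height_m * 100 / cm_per_px) * bands * bps
```

So going from the requested 2 cm/pixel to the GSD-capped 12.8 cm/pixel shown earlier in the log shrinks the orthophoto buffer by a factor of roughly (12.8 / 2)² ≈ 41.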
What’s the nature of the dataset? Target resolution, number of bands (or just RGB), number of images; and please share your settings. It’s possible the Lightning network allocated a machine that wasn’t big enough for your processing settings. I’ve never had that happen, but it is possible.
No problems ref DroneDB. I thought I would jump on that bandwagon early. I am not a software engineer (I am an aircraft engineer by trade and a hardware hacker/maker by hobby), so all I can offer regarding DroneDB is to use it and feed back. I have downloaded the Windows desktop version (and bought a license) and started to play with it. I shall start another thread ref DroneDB with some thoughts.
I will try to remove the “+” from the file names and re-upload. I also attempted to use a GCP file (made using GCP Editor Pro; license also bought), but when I tried to use it, WebODM failed with an error that an image referenced in the GCP file was not found. I just omitted the GCP file and tried again (that dataset is still running). I’m now wondering if the “+” caused this issue too?
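One quick way to see whether that GCP error is a naming mismatch is to diff the image names referenced in the GCP file against the files actually being uploaded. This sketch assumes the usual ODM gcp_list.txt layout (a projection header line, then `geo_x geo_y geo_z im_x im_y image_name` per row); adjust the column index if your file differs:

```python
from pathlib import Path

def missing_gcp_images(gcp_file, image_dir):
    """Return GCP-referenced image names with no matching file on disk."""
    lines = Path(gcp_file).read_text().splitlines()
    # Skip the projection header; column 5 of each row is the image name
    referenced = {line.split()[5] for line in lines[1:] if line.strip()}
    present = {p.name for p in Path(image_dir).iterdir()}
    return sorted(referenced - present)
```

If this returns anything, the names in the GCP file (perhaps still containing the “+”) don’t match the renamed uploads.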
Sweet! Powerful combo between Lightning, DroneDB, and GCP Editor Pro! I hope to get a chance to really try it out later this year after I rebuild my Solo.
I don’t know the code well enough to know for sure, but from years of working with GIS software: stick with bare-minimum ASCII characters (a-z, A-Z, 0-9, _, -) with no spaces and you’ll prevent 99.9% of all headaches.
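To act on that advice in bulk, a small (hypothetical) helper along these lines would squash every file name down to that safe set, turning the troublesome “+” into an underscore. Run it on a copy of the images first:

```python
import re
from pathlib import Path

def sanitize_names(folder):
    """Rename files so stems contain only A-Za-z0-9_- (no spaces)."""
    for path in Path(folder).iterdir():
        safe = re.sub(r"[^A-Za-z0-9_-]", "_", path.stem)
        if safe != path.stem:
            path.rename(path.with_name(safe + path.suffix))
```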
Just got this from running your data locally:
[INFO] running /code/SuperBuild/install/bin/texrecon /code/opensfm/undistorted/reconstruction.nvm /code/odm_meshing/odm_mesh.ply /code/odm_texturing/odm_textured_model_geo -d gmi -o gauss_clamping -t none --no_intermediate_results
/code/SuperBuild/install/bin/texrecon (built on Mar 3 2021, 16:58:17)
Load and prepare mesh:
PLY Loader: comment VTK generated PLY File
Reading PLY: 152913 verts... 305756 faces... done.
Generating texture views:
NVM: Loading file...
NVM: Number of views: 97
NVM: Number of features: 30804
Loading 100%... done. (Took 4.765s)
Building adjacency graph:
Adding edges 100%... done. (Took 0.401s)
458581 total edges.
View selection:
Building BVH from 305756 faces... done. (Took: 403 ms)
Calculating face qualities 100%... done. (Took 50.54s)
terminate called after throwing an instance of 'std::out_of_range'
what(): vector::_M_range_check: __n (which is 2190114) >= this->size() (which is 305756)
Aborted
Traceback (most recent call last):
File "/code/run.py", line 68, in <module>
app.execute()
File "/code/stages/odm_app.py", line 81, in execute
self.first_stage.run()
File "/code/opendm/types.py", line 338, in run
self.next_stage.run(outputs)
File "/code/opendm/types.py", line 338, in run
self.next_stage.run(outputs)
File "/code/opendm/types.py", line 338, in run
self.next_stage.run(outputs)
[Previous line repeated 4 more times]
File "/code/opendm/types.py", line 319, in run
self.process(self.args, outputs)
File "/code/stages/mvstex.py", line 104, in process
system.run('{bin} {nvm_file} {model} {out_dir} '
File "/code/opendm/system.py", line 79, in run
raise Exception("Child returned {}".format(retcode))
Exception: Child returned 134
I’ve re-run it, forcing fisheye, to see if it also fails to reconstruct. I just wanted a baseline where it ran to completion.
EDIT:
It seems unable to finish the depth-map stage. It gets about 2-3 images in and then just hangs on one thread without progressing for minutes on end. Curious.
I have removed the “+0000” from the file names (no idea why the drone appended it) and created a new DroneDB upload. 60m mission.
I have 2 further datasets of the same site (85m taken on the same day & 50m taken on another occasion). I sense some of you guys are intrigued by this headache.
I’m starting to think that the best solution is to just buy a DJI Phantom 4
I will have a go at that now. I made a crude GCP using known locations and features with high contrast (I think). Stand by for a GCP based on this dataset. Question: should I upload the GCP to DroneDB as well?
Yeah sure. I will do it shortly.
(I’m technically “in work” atm and trying to do this on another screen.)
Data set above (Bebop 2 with Pix4D capture) with GCP added 60m → Here (97 images)
Same site, same drone, same day, different mission flown at 85m (with GCP) → Here (56 images)
Same site, same drone, different day (different conditions) flown at 50m → Here (362 images!)
I have cleaned up my other early attempts at using the DroneDB test Hub as well; to save data, datasets other than those above have been deleted. I have also figured out that you can rename datasets within the web portal! Very handy.
Apologies, the post above was typed before I created GCPs, and I posted before adding hyperlinks. I will edit shortly when I have got the GCPs done.
I am having an issue though. The images are opening rotated 90° in both DroneDB and GCP Editor Pro. If I open them natively (Windows 10) the horizon is correct, but in both software packages above the horizon is 90° out, with the sky all on the right. Any ideas?
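For what it’s worth, that symptom usually means one viewer is honouring the EXIF Orientation tag (0x0112) and the other isn’t. The tag doesn’t rotate the pixels; it tells the viewer which transform to apply, so software that ignores it shows the raw sensor orientation. The standard values map like this (my summary, for reference):

```python
# EXIF Orientation (tag 0x0112): the transform a viewer must apply
# to display the image upright. A viewer that ignores the tag shows
# the raw sensor orientation -- e.g. a value-6 image appears rotated.
EXIF_ORIENTATION = {
    1: "no transform needed",
    2: "mirror horizontally",
    3: "rotate 180 degrees",
    4: "mirror vertically",
    5: "mirror horizontally, then rotate 90 degrees counter-clockwise",
    6: "rotate 90 degrees clockwise",
    7: "mirror horizontally, then rotate 90 degrees clockwise",
    8: "rotate 90 degrees counter-clockwise",
}
```

Checking what the Bebop actually writes into that tag would tell you which of the two behaviours is “wrong”.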
It was brought to my attention that the lens type the Bebop employs isn’t really supported upstream of us in OpenSfM, as it isn’t anything common (rectilinear, brown, spherical, fisheye) but somewhere between spherical and fisheye (hemispherical?), and as such the existing lens models can’t properly reconstruct data from it.
I’ve filed an inquiry with them to see what they have to say.
For the time being, I know Pix4D can reconstruct that data (since the Bebop is Parrot Group’s own platform), if you must have something.