Hi everyone,
I’ve been using WebODM for a long time and I absolutely love it. Lately I have had to process very large datasets (3000, 5000, 9000 pictures) and have run into problems.
My NodeODM:
- 2 TB RAM
- 128 CPU cores
- 512 GB disk space
Usually I process with pc-quality: ultra, but I couldn’t make it work with the datasets above. So I tuned everything down to the following settings:
auto-boundary: true, dem-gapfill-steps: 6, dem-resolution: 4, dsm: true, dtm: true, mesh-octree-depth: 11, optimize-disk-space: true, orthophoto-resolution: 1, pc-quality: high, use-hybrid-bundle-adjustment: true
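For reference, this is roughly how I submit the task to my NodeODM instance via pyodm (a minimal sketch; the host, port, and image folder are placeholders, not my real setup):

from glob import glob
from pyodm import Node

# Connect to the NodeODM instance (placeholder host/port)
node = Node("localhost", 3000)

# Collect the source photos (placeholder path)
images = glob("dataset_9000/*.JPG")

# Create the task with the options listed above
task = node.create_task(images, {
    "auto-boundary": True,
    "dem-gapfill-steps": 6,
    "dem-resolution": 4,
    "dsm": True,
    "dtm": True,
    "mesh-octree-depth": 11,
    "optimize-disk-space": True,
    "orthophoto-resolution": 1,
    "pc-quality": "high",
    "use-hybrid-bundle-adjustment": True,
})
task.wait_for_completion()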
With these settings I got the 3000-image dataset to process, but the others have been stuck for a few days at:
2022-07-01 07:40:15,800 INFO: -------------------------------------------------------
2022-07-01 07:40:15,821 INFO: P2103584.JPG resection inliers: 727 / 759
2022-07-01 07:40:15,842 INFO: Adding P2103584.JPG to the reconstruction
2022-07-01 07:40:16,045 DEBUG: Ceres Solver Report: Iterations: 9, Initial cost: 5.503326e+02, Final cost: 5.188912e+02, Termination: CONVERGENCE
2022-07-01 07:40:46,142 INFO: Removed outliers: 130
2022-07-01 07:40:51,293 INFO: -------------------------------------------------------
2022-07-01 07:40:51,316 INFO: P2103585.JPG resection inliers: 724 / 747
2022-07-01 07:40:51,338 INFO: Adding P2103585.JPG to the reconstruction
Can someone help me with the settings or give other suggestions? I don’t mind if a dataset runs for several days or even weeks, and I wouldn’t want to reduce the quality settings too much, since that would reduce the quality of the final products.
Thank you in advance for any help!
PS: Unfortunately, I can’t provide you with the datasets.