So some of us have hit a roadblock when trying to process large datasets (see previous threads mentioning “Processing stopped because of strange values in the reconstruction”).
Digging into the logs, it's apparent that processing fails during the OpenSFM stage, specifically in its "reconstruct" task.
The relevant error message in the WebODM logs mentions “num_nonzeros_ >= 0”.
E.g.:
2023-01-14 13:18:07,128 INFO: Shots and/or GCPs are well-conditioned. Using naive 3D-3D alignment.
block_sparse_matrix.cc:80 Check failed: num_nonzeros_ >= 0
/code/SuperBuild/install/bin/opensfm/bin/opensfm: line 12: 35232 Aborted (core dumped) "$PYTHON" "$DIR"/opensfm_main.py "$@"
From my Googling, it seems the root cause lies in the Ceres Solver library: it stores num_nonzeros_ as a 32-bit signed integer, which puts a hard size limit on the sparse matrices it can handle, and the Ceres maintainers deemed rewriting that code to use 64-bit indices too big a job.
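To get a feel for why large datasets trip this check, here's a rough back-of-envelope sketch (my own illustrative numbers and parameter counts, not values from the logs): in bundle adjustment, each image observation contributes a 2-row Jacobian block against its camera parameters and its 3D point, and the total nonzero count can exceed the 2^31 - 1 ceiling of a 32-bit signed counter.

```python
INT32_MAX = 2**31 - 1  # largest value a 32-bit signed int (num_nonzeros_) can hold

def jacobian_nonzeros(n_observations, camera_params=9, point_params=3, residual_dim=2):
    """Very rough estimate of nonzeros in a bundle-adjustment Jacobian.

    Assumes each observation yields a residual of dimension 2 (pixel x/y)
    with nonzero derivatives w.r.t. ~9 camera parameters and 3 point
    coordinates. These counts are illustrative assumptions, not Ceres internals.
    """
    return n_observations * residual_dim * (camera_params + point_params)

# A very large reconstruction, e.g. ~100 million point observations:
nnz = jacobian_nonzeros(100_000_000)
print(nnz, nnz > INT32_MAX)  # 2400000000 True -- overflows the 32-bit counter
```

So even before memory runs out, a big enough reconstruction can blow past the integer limit that the `num_nonzeros_ >= 0` check is guarding (the count wraps negative on overflow).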
Question:
What WebODM configuration parameters (aside from Split/Merge) may help work around this Ceres Solver limitation?