Workaround for large datasets - Ceres Solver size limitation

So some of us have hit a roadblock when trying to process large datasets (see previous threads mentioning “Processing stopped because of strange values in the reconstruction”).
Digging into the logs, it’s apparent that processing fails during the OpenSfM stage, specifically during the “reconstruct” task.

The relevant error message in the WebODM logs mentions “num_nonzeros_ >= 0”.
2023-01-14 13:18:07,128 INFO: Shots and/or GCPs are well-conditioned. Using naive 3D-3D alignment. Check failed: num_nonzeros_ >= 0
/code/SuperBuild/install/bin/opensfm/bin/opensfm: line 12: 35232 Aborted (core dumped) "$PYTHON" "$DIR"/ "$@"

From my Googling, the root cause seems to be the Ceres Solver library: it stores num_nonzeros_ (the count of non-zero entries in the sparse Jacobian) in a 32-bit signed integer, so sufficiently large problems overflow it into a negative value and trip the “num_nonzeros_ >= 0” check. The Ceres maintainers apparently deemed it too big a job to rewrite that code to use 64-bit indices.
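To get a feel for when this overflow kicks in, here’s a rough back-of-the-envelope sketch. It assumes the standard bundle-adjustment layout from Ceres’ examples (each observation is a 2-row reprojection residual depending on a 9-parameter camera block and a 3-parameter point block) — those sizes are illustrative assumptions, not something taken from the WebODM logs:

```python
# Rough estimate of non-zero Jacobian entries in a bundle adjustment
# problem, to see when a 32-bit signed counter like num_nonzeros_
# would overflow. Block sizes below are illustrative assumptions.
INT32_MAX = 2**31 - 1

def jacobian_nonzeros(num_observations, residual_rows=2,
                      camera_params=9, point_params=3):
    """Each residual contributes residual_rows * (params it depends on)
    non-zero Jacobian entries."""
    return num_observations * residual_rows * (camera_params + point_params)

for obs in (10_000_000, 50_000_000, 100_000_000):
    nnz = jacobian_nonzeros(obs)
    status = "OVERFLOWS int32" if nnz > INT32_MAX else "fits in int32"
    print(f"{obs:>12,} observations -> {nnz:>14,} non-zeros ({status})")
```

Under those assumptions, somewhere around 90 million observations the non-zero count exceeds 2,147,483,647 — which matches the intuition that only very large datasets trip this check.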

What WebODM configuration parameters (aside from Split/Merge) might help work around this Ceres Solver limitation?


Since this keeps biting us, I wonder if this should not be our next Community Funding Drive.


Just adding some references:


Just an FYI - I tried using matcher-neighbors=24 instead of the default triangulation and still got the same “num_nonzeros_ >= 0” error.

