Invalid value encountered in divide

Hello. I’ve been trying to process a dataset I captured with a DJI Spark, using the latest code (as of this morning). Altitude was 50 meters and the 239 images had a high degree of overlap. WebODM ran without errors or warnings until it produced this warning, then exited with code 1:

/code/SuperBuild/src/opensfm/opensfm/ RuntimeWarning: invalid value encountered in divide
  A /= s
/usr/local/lib/python2.7/dist-packages/numpy/linalg/ RuntimeWarning: invalid value encountered in det
  r = _umath_linalg.det(a, signature=signature)
Traceback (most recent call last):
  File "/code/SuperBuild/src/opensfm/bin/opensfm", line 34, in <module>
  File "/code/SuperBuild/src/opensfm/opensfm/commands/", line 21, in run
    report = reconstruction.incremental_reconstruction(data)
  File "/code/SuperBuild/src/opensfm/opensfm/", line 1177, in incremental_reconstruction
    data, graph, reconstruction, remaining_images, gcp)
  File "/code/SuperBuild/src/opensfm/opensfm/", line 1113, in grow_reconstruction
    align_reconstruction(reconstruction, gcp, config)
  File "/code/SuperBuild/src/opensfm/opensfm/", line 20, in align_reconstruction
    apply_similarity(reconstruction, s, A, b)
  File "/code/SuperBuild/src/opensfm/opensfm/", line 42, in apply_similarity
    shot.pose.set_rotation_matrix(Rp)
  File "/code/SuperBuild/src/opensfm/opensfm/", line 85, in set_rotation_matrix
    raise ValueError("Determinant not 1")
ValueError: Determinant not 1
Traceback (most recent call last):
  File "/code/", line 47, in <module>
    plasm.execute(niter=1)
  File "/code/scripts/", line 133, in process
    (context.pyopencv_path, context.opensfm_path, tree.opensfm))
  File "/code/opendm/", line 34, in run
    raise Exception("Child returned {}".format(retcode))
Exception: Child returned 1
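For anyone reading along: the RuntimeWarning and the exception are likely two symptoms of the same failure. The traceback shows OpenSfM aligning the reconstruction to the GPS data with a similarity transform (scale `s`, rotation `A`, translation `b`). If the fitted scale comes out zero (or NaN), the in-place division `A /= s` fills the rotation with non-finite values, and the later sanity check that a rotation matrix must have determinant 1 then throws. A minimal sketch of that failure mode, assuming a degenerate scale (the function name and check are illustrative, not OpenSfM's actual code):

```python
import numpy as np

def check_rotation(R, tol=1e-6):
    """Reject matrices that are not proper rotations (determinant must be +1)."""
    # NaN/inf entries make det(R) non-finite, so test finiteness explicitly first.
    if not np.isfinite(R).all() or abs(np.linalg.det(R) - 1.0) > tol:
        raise ValueError("Determinant not 1")

A = np.eye(3)  # rotation part of a fitted similarity transform
s = 0.0        # degenerate scale, e.g. from clustered or collinear camera positions
with np.errstate(invalid="ignore", divide="ignore"):
    A /= s     # this is where "invalid value encountered in divide" comes from

try:
    check_rotation(A)
except ValueError as e:
    print(e)   # Determinant not 1
```

In other words, the "Determinant not 1" error is usually downstream of bad alignment input, not a problem with the images themselves.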

Can you tell me what it means?

Hi :hand:, difficult to say without looking at the input data. Could you share your images (or a subset that reproduces the error above) by uploading them to Dropbox, Google Drive, or another storage provider?

Sure - will do.

I’ve shared my images and the task output. Would you mind having a look? To me the images look crisp and there is quite a lot of overlap. I haven’t measured it, but by eye it looks like more than 60% overlap along each axis.

I’m hoping to use the Spark to teach secondary school students how to survey riparian land for an environmental assessment. I’m wondering whether there is something about the Spark camera that makes it unsuitable for map stitching. DroneDeploy had trouble with the same data set but eventually managed it. I’ve had good results on WebODM and DD using my Mavic Pro, so that’s why I wonder about the Spark.

Any thoughts about the problem would be welcome.


For you or anyone else still encountering this issue: I got this exact error for a long time and could not figure out why. It turned out the flight pattern for that specific collection was a little erratic. The drone had taken about 15 test images at the start of the flight, then flown to the AOI and taken 80 more. Once I realized this, I removed the first 15 images from the dataset and tried again, and it worked. So I suspect having two disjoint collection areas was causing the determinant of some matrix to not be 1. Try mapping the GPS coordinates of your images, look for any that don’t overlap with the rest, throw those out, and see if that fixes it.
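If it helps anyone, the check above can be scripted. This sketch assumes you have already read the GPS latitude/longitude out of each image's EXIF data (with whatever EXIF library you prefer); it simply flags positions that sit far from the median of the set, which would catch stray test shots taken away from the AOI. The function name and the `factor` threshold are my own choices, not anything from WebODM:

```python
import numpy as np

def flag_gps_outliers(positions, factor=3.0):
    """positions: list of (lat, lon), one per image.
    Returns indices of images whose position is far from the median."""
    pts = np.asarray(positions, dtype=float)
    center = np.median(pts, axis=0)                 # robust center of the flight
    dist = np.linalg.norm(pts - center, axis=1)     # distance of each image from it
    scale = max(np.median(dist), 1e-9)              # robust spread (guard against 0)
    return [i for i, d in enumerate(dist) if d > factor * scale]

# 80 images over the AOI plus one stray test shot taken before the flight
area = [(47.6000 + 0.0001 * i, -122.3000) for i in range(80)]
stray = [(47.7000, -122.3000)]
print(flag_gps_outliers(area + stray))  # -> [80]
```

Anything this flags is worth a manual look before deleting; a deliberate second flight line would also show up as "far from the median" if the pattern is lopsided.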


Hmm, perhaps we should add a heuristic for catching these cases. Thanks for this feedback!

I get exactly the same result in the log with my test project. I’ve tried lots of different image sets, all taken with my phone. I’m trying to do a simple test: getting a 3D model of a coffee mug. For this latest test I took the pictures very close together, which I assume gives the equivalent of overlap when shooting around an object.

The photos and console log are here:

Interestingly, I can get past this point using my old Canon camera, but the run then dies at the Georeferencing task due to another bug. (That one is supposed to be fixed, but maybe it isn’t.)