Strange values in reconstruction...bug?

Hi,
I'm currently trying to process some images, but it won't complete and fails with this message:

/usr/local/lib/python3.9/dist-packages/numpy/core/fromnumeric.py:3440: RuntimeWarning: Mean of empty slice.
  return _methods._mean(a, axis=axis, dtype=dtype,
/usr/local/lib/python3.9/dist-packages/numpy/core/_methods.py:189: RuntimeWarning: invalid value encountered in double_scalars
  ret = ret.dtype.type(ret / rcount)
Traceback (most recent call last):
  File "/code/SuperBuild/install/bin/opensfm/bin/opensfm_main.py", line 25, in <module>
    commands.command_runner(
  File "/code/SuperBuild/install/bin/opensfm/opensfm/commands/command_runner.py", line 37, in command_runner
    command.run(data, args)
  File "/code/SuperBuild/install/bin/opensfm/opensfm/commands/command.py", line 12, in run
    self.run_impl(data, args)
  File "/code/SuperBuild/install/bin/opensfm/opensfm/commands/reconstruct.py", line 11, in run_impl
    reconstruct.run_dataset(dataset)
  File "/code/SuperBuild/install/bin/opensfm/opensfm/actions/reconstruct.py", line 13, in run_dataset
    report, reconstructions = reconstruction.incremental_reconstruction(
  File "/code/SuperBuild/install/bin/opensfm/opensfm/reconstruction.py", line 1530, in incremental_reconstruction
    reconstruction, rec_report["grow"] = grow_reconstruction(
  File "/code/SuperBuild/install/bin/opensfm/opensfm/reconstruction.py", line 1387, in grow_reconstruction
    remove_outliers(reconstruction, config)
  File "/code/SuperBuild/install/bin/opensfm/opensfm/reconstruction.py", line 1114, in remove_outliers
    threshold_sqr = get_actual_threshold(config, reconstruction.points) ** 2
  File "/code/SuperBuild/install/bin/opensfm/opensfm/reconstruction.py", line 1097, in get_actual_threshold
    mean, std = get_error_distribution(points)
  File "/code/SuperBuild/install/bin/opensfm/opensfm/reconstruction.py", line 1085, in get_error_distribution
    np.linalg.norm(np.array(all_errors) - robust_mean, axis=1)
  File "<__array_function__ internals>", line 5, in norm
  File "/usr/local/lib/python3.9/dist-packages/numpy/linalg/linalg.py", line 2561, in norm
    return sqrt(add.reduce(s, axis=axis, keepdims=keepdims))
numpy.AxisError: axis 1 is out of bounds for array of dimension 1
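For what it's worth, the AxisError at the bottom of this trace is what you get when the list of reprojection errors ends up empty: `np.array([])` is a 1-D array, so `axis=1` doesn't exist. A minimal sketch of the failure mode (not the actual OpenSfM code path):

```python
import numpy as np

# If no reprojection errors were collected, np.array(...) of an empty
# list is a 1-D array of shape (0,), not an (N, 2) error matrix.
all_errors = []
arr = np.array(all_errors)

try:
    np.linalg.norm(arr - 0.0, axis=1)  # same call shape as in the trace
except IndexError as exc:  # numpy's AxisError subclasses IndexError
    print(exc)  # axis 1 is out of bounds for array of dimension 1
```

So the "strange values" message is downstream of the reconstruction producing no usable points at all.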

===== Dumping Info for Geeks (developers need this to fix bugs) =====
Child returned 1
Traceback (most recent call last):
  File "/code/stages/odm_app.py", line 94, in execute
    self.first_stage.run()
  File "/code/opendm/types.py", line 346, in run
    self.next_stage.run(outputs)
  File "/code/opendm/types.py", line 346, in run
    self.next_stage.run(outputs)
  File "/code/opendm/types.py", line 346, in run
    self.next_stage.run(outputs)
  File "/code/opendm/types.py", line 327, in run
    self.process(self.args, outputs)
  File "/code/stages/run_opensfm.py", line 37, in process
    octx.reconstruct(self.rerun())
  File "/code/opendm/osfm.py", line 53, in reconstruct
    self.run('reconstruct')
  File "/code/opendm/osfm.py", line 34, in run
    system.run('"%s" %s "%s"' %
  File "/code/opendm/system.py", line 106, in run
    raise SubprocessException("Child returned {}".format(retcode), retcode)
opendm.system.SubprocessException: Child returned 1

===== Done, human-readable information to follow... =====

[ERROR]   Uh oh! Processing stopped because of strange values in the reconstruction. This is often a sign that the input data has some issues or the software cannot deal with it. Have you followed best practices for data acquisition? See https://docs.opendronemap.org/flying/

The flags are: --auto-boundary --orthophoto-resolution 2 --pc-quality high --gps-accuracy 0.02 --force-gps
I'm using about 180 GCP references.

If needed, I can run it again and post the full output.


Why so many GCP references? Are you using them as image centroid geolocations? If so, the geo.txt is the proper tool for that task, instead.

Does this dataset process without the GCP file and your current parameters?
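For context, geo.txt is a small text file: the first line names the CRS, and each following line maps an image to its camera position. The sketch below is an assumption based on the documented `image_name geo_x geo_y geo_z` layout; note that for EPSG:4326, x is longitude and y is latitude:

```text
EPSG:4326
100_0001_0010.JPG 9.1654868 47.7816526 491.31
```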


I'll have to try it without GCPs.
I have that many because I used findGCP with ArUco markers.
I have only 5 GCPs, but 180 entries in my gcp file.
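For readers following along: each gcp_list.txt entry ties one GCP to one image, so 5 physical markers seen across many photos easily produce 180 lines. The layout is roughly a CRS header followed by `geo_x geo_y geo_z pixel_x pixel_y image_name [gcp_name]` lines (the example below is a sketch, not taken from the dataset):

```text
EPSG:4326
9.1654868 47.7816526 491.31 5102 2949 100_0001_0010.JPG 4
```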


I’m planning to try out findGCP with Aruco markers - any tips for new players?


Make them big enough :smiley: Mine are 90cm x 90cm. Other than that, it worked pretty flawlessly. Make sure to use the smallest dictionary you can, so 3x3 or 4x4.


Good to know, I was thinking of 50cm so I’d better increase that!
What drone do you use & what altitude do you fly at?


Currently I'm using the Phantom 4 RTK and fly at 70 meters; 90-100 meters is also possible, though.


I have successfully processed it without the GCPs. I also checked all the GCP locations manually, and they all seem to be spot on…
In case you want to try it yourself, here is all the data: https://cloud.netwakevision.com/index.php/s/Ftj5S8omKiqW45y/download


I’m wondering if that might be too many GCP points for some reason…

Would you be willing to “thin” the pool of tagged images down to 3-5 (best) images per GCP and try again?
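A throwaway way to do that thinning, assuming the GCP name is the last whitespace-separated column of each gcp_list.txt line (a hypothetical helper, not part of ODM):

```python
from collections import defaultdict

def thin_gcp_lines(lines, max_per_gcp=5):
    """Keep at most max_per_gcp image observations per GCP,
    where the GCP name is the last whitespace-separated column."""
    kept, seen = [], defaultdict(int)
    for line in lines:
        name = line.split()[-1]
        if seen[name] < max_per_gcp:
            seen[name] += 1
            kept.append(line)
    return kept
```

Run it over the body of the file (keeping the CRS header line as-is) and write the result back out.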


I will try that. But if it succeeds, this would be a bug, wouldn't it? More points should give a better result, I think?


Yes and no. After a certain point, you’re over-fitting and just introducing error by adding more images per GCP. Ideally, you’d want to limit the images per GCP to the fewest that are most significant and have the least variability/error… But that’s hard to compute ahead of time.

Where the bug might lie would be if we have a limit on either how many GCP lines we can parse, total length of characters passed somewhere… Something like that.


OK. As you said, limiting the number of GCP entries is not really simple when using ArUco markers… I could just randomly select about 5 points per GCP, but I think that's not a good idea? The processing of the dataset with reduced points seems to run fine; it will be finished soon.
I think ODM should be able to handle many GCP points, as ArUco markers cannot really be used otherwise…


OK, it finished processing but still failed. I reduced the 180 points to 25. Should I try to reduce it any further? That can't be intended behaviour (unless I made another mistake).
EDIT: I removed even more points and it still fails.


Maybe I found the issue: it looks like lat and lon are switched:
47.7816526 9.1654868 491.31 5102 2949 100_0001_0010.JPG 4
This would be a findGCP bug; I will investigate.
Edit: It is my fault, I mixed up lat and lon in my script. I will try it again now, sorry.
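For anyone hitting the same thing: the fix amounts to swapping the first two columns of each entry. A hypothetical one-liner, assuming the standard geo_x-then-geo_y ordering:

```python
def swap_lat_lon(line: str) -> str:
    """Swap the first two columns of a gcp_list.txt entry that was
    written as 'lat lon ...' instead of 'lon lat ...'."""
    parts = line.split()
    parts[0], parts[1] = parts[1], parts[0]
    return " ".join(parts)

print(swap_lat_lon("47.7816526 9.1654868 491.31 5102 2949 100_0001_0010.JPG 4"))
# -> 9.1654868 47.7816526 491.31 5102 2949 100_0001_0010.JPG 4
```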


Ah yeah, if the coordinates are on the other side of the globe/equator, things will get messy and fail, because the reconstruction area gets way too big to work with.

Hopefully that settles it, and you can pass all 180 of your tagged images no worries!

Please let us know.


It worked flawlessly this time :slight_smile: about 8cm accuracy with all 180 GCPs!


Wow, that is a lot of GCPs. Glad it worked!

