I can’t seem to download the files (it asks for a login)?
It is a nadir capture.
Give that a try.
I’ve tried to process it in a few different ways, without success. Nice dataset btw!
Nothing strange stood out from the images / GCPs.
It would be interesting to see if this dataset processes correctly in another program (e.g. Pix4D or Metashape). The one thing I can’t verify is the actual measurements of the GCPs. I can’t rule out a software fault in ODM, so processing with a different program would tell us whether it’s a data collection or software issue.
This might be related: GCPs not aligning, up to 0.5m error - #12 by Saijin_Naib
I processed the project with the current OpenSfM master using default OpenSfM options (probably different from ODM’s) and activated the bias compensation.
I end up with 0.101/0.062/0.046 error on X/Y/Z with compensated GPS bias and 0.101/0.062/0.952 without compensating it. So the compensation mostly corrects for the Z shift.
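To see why compensation matters, the per-axis values above can be combined into a single 3D RMS magnitude. A minimal sketch (the numbers are just the ones quoted in the post):

```python
import math

# Per-axis RMS errors in meters, copied from the figures above:
# X/Y/Z with and without GPS bias compensation.
compensated = (0.101, 0.062, 0.046)
uncompensated = (0.101, 0.062, 0.952)

def total_rms(xyz):
    """Combine per-axis RMS values into one 3D RMS magnitude."""
    return math.sqrt(sum(e * e for e in xyz))

print(round(total_rms(compensated), 3))    # all three axes contribute evenly
print(round(total_rms(uncompensated), 3))  # dominated by the ~0.95 m Z shift
```

Without compensation the Z shift alone accounts for nearly all of the total error; with it, the three axes are comparable.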
report_bias_compensated.pdf (4.4 MB)
report_not_compensated.pdf (4.5 MB)
By default the sigma of GCP measurements is 10 centimeters. Setting it to 1 cm (hard-coded here: OpenSfM/ba_helpers.cc at master · mapillary/OpenSfM · GitHub), I get 0.035/0.022/0.017 with compensation. Setting it further to 1 mm, I get 0.035/0.022/0.016.
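The effect of the sigma is easiest to see in a toy 1D least-squares example: a smaller sigma means a larger weight, so the solution is pulled harder toward the GCP measurement. This is only a sketch of the inverse-variance-weighting principle, not OpenSfM’s actual bundle adjustment, and the numbers are hypothetical:

```python
def fuse(gps_z, sigma_gps, gcp_z, sigma_gcp):
    """Minimize (z - gps_z)^2 / sigma_gps^2 + (z - gcp_z)^2 / sigma_gcp^2.
    The optimum is the inverse-variance-weighted mean of the two
    observations: smaller sigma -> larger weight -> stronger pull."""
    w_gps = 1.0 / sigma_gps ** 2
    w_gcp = 1.0 / sigma_gcp ** 2
    return (w_gps * gps_z + w_gcp * gcp_z) / (w_gps + w_gcp)

# Hypothetical: GPS prior says 10.95 m, the surveyed GCP says 10.00 m.
print(fuse(10.95, 1.0, 10.00, 0.10))  # sigma 10 cm: close to the GCP
print(fuse(10.95, 1.0, 10.00, 0.01))  # sigma 1 cm: essentially pins the GCP
```

This matches the behaviour reported above: tightening the sigma from 10 cm to 1 cm moves the solve toward the GCPs, while going from 1 cm to 1 mm changes almost nothing because the GCP terms already dominate.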
I’m considering exposing the GCP sigma in the gcp_list.txt so it can be set according to the user’s GCP sensor nominal accuracy, both horizontally and vertically (5 mm/10 mm H/V for the Emlid).
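One way the proposal could look as a file format: keep the existing gcp_list.txt layout (projection header line, then geo_x geo_y geo_z pixel_x pixel_y image_name per row) and append two optional per-row columns for horizontal/vertical sigma in meters. The sigma columns are hypothetical, they are not part of the current format; this is just a sketch of a backward-compatible parser:

```python
def parse_gcp_list(text, default_sigma_h=0.10, default_sigma_v=0.10):
    """Parse a gcp_list.txt with two *hypothetical* trailing columns
    for per-GCP horizontal/vertical sigma (meters). Rows without the
    extra columns fall back to the defaults, so existing files still
    parse. Returns (projection_string, list of entry dicts)."""
    lines = [l for l in text.splitlines() if l.strip()]
    proj, entries = lines[0], []
    for line in lines[1:]:
        f = line.split()
        entry = {
            "geo": tuple(map(float, f[0:3])),  # easting, northing, elevation
            "px": (float(f[3]), float(f[4])),  # pixel coordinates
            "image": f[5],
            "sigma_h": default_sigma_h,
            "sigma_v": default_sigma_v,
        }
        if len(f) >= 8:  # hypothetical extension columns present
            entry["sigma_h"], entry["sigma_v"] = float(f[6]), float(f[7])
        entries.append(entry)
    return proj, entries
```

A per-GCP column would also cover mixed surveys (e.g. some points shot RTK, some averaged), while a single config.yaml value, as suggested below, covers the common single-instrument case.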
I’ve been back and forth with Emlid support on this one just to make sure. I’ve changed settings several different times and this type of error is consistent among all of my point collection efforts.
This is fascinating!
In case anyone is interested, I shot 3 different surveys simultaneously in Reachview on this one. I used the NAD83(2011) CRS on one, the NAD83(2011) / UTM 17N on one, and NAD83(2011) / Ohio South which you have in the project folder already. I’ll throw the other two survey logs into the project folder for comparison.
Would be a cool addition for sure! Could also be added in the config.yaml if GCPs are captured with the same instrument and one doesn’t need per-GCP sigmas.
So good news: following @YanNoun’s suggestion, I’ve pushed an update to ODM that decreases the default GCP deviation, reprocessed this dataset, and got some good results:
So update to the latest version and try again?
I’m happy to do that. I’m using the webodm lightning app for Windows. Do I need to update that?
Nope, you’re good to go! Lightning updates automatically to the latest version.
Yann, Piero… This is HUGE.
Well, technically I guess it is tiny now, but the implications are huge!
This looks like really interesting stuff. I wonder, does OpenSfM log similar statistics (mean, sigma, RMS error) for each GCP?
This is how Pix4D shows it, and it’s very useful for finding GCPs that are skewing the overall result; it offers the opportunity to rerun with more or fewer GCPs.
This would allow the opportunity to add check points to the GCP file rather than keeping them in reserve for manual checks.
Yeah, being able to cull bad data would be a huge benefit.
You can find per-GCP error measures by opening the ground_control_points.gpkg file in QGIS (from the Attributes Table window):
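If you’d rather script it than click through QGIS, a GeoPackage is SQLite under the hood, so the standard library can read it. The table and error column names below are assumptions, check the Attributes Table in QGIS for the actual names in your export:

```python
import math
import sqlite3

def gcp_errors(gpkg_path, table="ground_control_points"):
    """List per-GCP 3D error magnitudes from a GeoPackage, largest
    first, to spot outliers. Table/column names are assumptions;
    adjust them to match what QGIS shows for your file."""
    con = sqlite3.connect(gpkg_path)
    try:
        rows = con.execute(
            f"SELECT id, error_x, error_y, error_z FROM {table}"
        ).fetchall()
    finally:
        con.close()
    report = [
        (gcp_id, math.sqrt(ex * ex + ey * ey + ez * ez))
        for gcp_id, ex, ey, ez in rows
    ]
    return sorted(report, key=lambda r: -r[1])
```

Sorting by magnitude makes the “which GCP is skewing the result” question from above a one-liner.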
We still need to implement check points.
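For anyone unfamiliar with the term: a check point is surveyed exactly like a GCP but withheld from the bundle adjustment, and afterwards its reconstructed position is compared against the survey, giving an error estimate the solve couldn’t have “cheated” on. A minimal sketch of that comparison (data structures are hypothetical):

```python
import math

def check_point_errors(surveyed, reconstructed):
    """Compare surveyed coordinates of withheld check points against
    where the reconstruction placed them. Both arguments map a point
    name to an (x, y, z) tuple; returns name -> 3D error magnitude."""
    errors = {}
    for name, (sx, sy, sz) in surveyed.items():
        rx, ry, rz = reconstructed[name]
        errors[name] = math.sqrt(
            (rx - sx) ** 2 + (ry - sy) ** 2 + (rz - sz) ** 2
        )
    return errors
```

Because check points never constrain the solve, their residuals are an independent accuracy measure, unlike GCP residuals, which the adjustment actively minimizes.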
Super neat! I need to do some datasets of my own with a full end-to-end GCP workflow. No RTK/PPK, but I should get reasonable results with long baseline point averaging…
And MTPs and Scale Constraints
I’m getting an unexpected status code 504 when I process. Did you process with Ohio South coordinates or LLH?