Improving ODM output/errors

Hello OpenDroneMap Community!

I’ve been experimenting with OpenDroneMap and WebODM for a long time now, and while I’m getting decent results, I’m constantly experiencing issues and annoyances and have no idea how to rectify them.

I’m using the newly released Windows Native version of ODM and WebODM, but found these same issues in the Docker version. My drone is a DJI Mavic 2 Pro, and I program automated flights via the Pix4DCapture app.

I realize that because of the Mavic 2’s rolling shutter, I should expect some rolling shutter distortion, especially if I set Pix4Dcapture’s flight speed to “fast”. But I’ve noticed that the difference between the “fastest” and “slowest” speeds in the app is barely 2-3 m/s.

Here are the problems I’ve been facing:

My first test: Double Grid flight over the Whittlesea Showgrounds:

Height: 120m
Est. GSD: 2.85 cm/px
Speed: Slowest possible in the Pix4Dcapture app, est. 5m/s

Dataset/All photos: https://jupitermediavic-my.sharepoint.com/:u:/g/personal/ethan_jupitermediavic_onmicrosoft_com/Ea3CqvIVKipBs8lCSoV25dsB-uTbxT-7m-qvpKJcPOJgKg?e=6CX0CG

Buildings that look like jelly in the orthophoto

Buildings that look like jelly in the 3D model

Trees with a strange, blobby warping effect around the edges

Note also how some of the trees in the 3D Model look like wisps of green paper with no trunk underneath. I understand that this is to be expected since the drone can’t see the tree from underneath. In later tests I have improved my results by adding photos from a circular flight at a lower altitude.

Second Test: Whittlesea Park:

Height: 90m
Est. GSD: No idea, forgot to make note of the app’s estimate.
Speed: Fastest possible in the Pix4Dcapture app, est. 7.3m/s

Dataset one - Single grid: https://jupitermediavic-my.sharepoint.com/:u:/g/personal/ethan_jupitermediavic_onmicrosoft_com/EaLYC_UBRFFGphsWsTBVvmQBIofZsQrDT_cQ5u2yU6Y4MA?e=ehJCte

Dataset two - Second grid, to be added to create the 3D model

fast-ortho completely ruins the output quality (made with only the 1st grid)

Blobby trees persist (made with both grids, fast-ortho turned off for comparison)

Mysterious holes in the water (I understand if there’s nothing that can be done here):
https://jupitermediavic-my.sharepoint.com/:i:/g/personal/ethan_jupitermediavic_onmicrosoft_com/EWjlY6oEp5ZAiFcWRnfvALABisZ_FG8OybiicI87FSUGJg?e=vHUHH2

Any tips on flight settings or ODM arguments?

I’d try not using the --fast-orthophoto option, as it skips the entire 3D reconstruction pipeline, which is likely causing a lot of your visual artifacting.

Try processing just defaults first and comparing.
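
For reference, a minimal default run with the Docker version looks something like the sketch below; the dataset path and project name are placeholders, and in WebODM the equivalent is simply creating a task with no extra options.

# Minimal default ODM run via Docker (sketch; /my/datasets and "project" are placeholders,
# with the photos sitting in /my/datasets/project/images)
docker run -ti --rm -v /my/datasets:/datasets opendronemap/odm --project-path /datasets project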

From how you described your flight plan, I don’t think there is much at all amiss with it.


Good to know the flights are okay :grin:

All of these outputs except one were made without --fast-orthophoto. I can’t recall exactly which settings I used for these, but they’re basically tweaked versions of the default presets, particularly the “3D Model” preset.

Here and there I have used --ignore-gsd, --pc-geometric:true and --camera-lens:brown, all of which I’m told improve results across the board, but none of them seem able to remove the “blobbiness” I’m experiencing. I’ve also tried --use-fixed-camera-params:true, which I’m told can help negate rolling shutter distortion, but it doesn’t seem to do anything at all.

If the wind blows the trees around between exposures, that is normal.

Jelly-like buildings can be fixed with more images/overlap.


I would almost never recommend using --ignore-gsd. It really only applies in a few very select edge cases and normally causes failures in ways that are really hard to track down.

I would keep --pc-geometric, skip --camera-lens:brown (our lens detection defaults to brown for rectilinear lenses anyway), and skip --use-fixed-camera-params unless you have camera parameters from a meticulously collected/processed dataset that has no rolling shutter distortion and was optimized to eliminate overly aggressive self-calibration.

I would try adding these:
--use-3dmesh
--pc-quality ultra (or high)
--feature-quality ultra (or high)
--orthophoto-resolution 1
--dem-resolution 1
--min-num-features 16000
--mesh-octree-depth 14
--mesh-size 2000000

And see where that gets you for now.

Some of these may run heavy on your hardware, or may be capped if using Lightning, so scale back from ultra to high if needed.
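
Put together with the Docker invocation sketched earlier (again, /my/datasets and “project” are just placeholders, and the backslashes are Linux-style line continuations), that would look something like:

docker run -ti --rm -v /my/datasets:/datasets opendronemap/odm \
    --project-path /datasets \
    --use-3dmesh \
    --pc-quality ultra \
    --feature-quality ultra \
    --orthophoto-resolution 1 \
    --dem-resolution 1 \
    --min-num-features 16000 \
    --mesh-octree-depth 14 \
    --mesh-size 2000000 \
    project

In WebODM, the same option names (without the leading dashes) go into the task’s Options panel.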


Another way to reduce this, if it is a problem, is to stop the drone for each photo, which eliminates the distortion due to motion.

Something else to watch out for is drone speed vs exposure duration.
It’s best to keep your shutter speed short enough that image motion during the exposure does not exceed the GSD.
For example, with my Mavic 2 Pro operating at 90m AGL to give GSD = 2cm and travelling at 10m/s, theoretically I’d want an exposure of 1/500 sec or shorter (1/320 is OK in practice, though).

10m/sec = 1000cm per second. 1/500 sec = 2cm motion of the drone during the exposure.
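
To generalise that rule of thumb into something you can plug your own numbers into, here is a small sketch in plain Python (the function name and values are just illustrative):

# Rule of thumb: ground motion during the exposure should not exceed one GSD.
def max_exposure_seconds(ground_speed_m_s, gsd_cm):
    """Longest exposure (seconds) that keeps image motion within one GSD."""
    motion_budget_m = gsd_cm / 100.0            # allowed motion, in metres
    return motion_budget_m / ground_speed_m_s   # seconds

# Numbers from the example above: 10 m/s ground speed, 2 cm GSD
t = max_exposure_seconds(10.0, 2.0)
print(f"Max exposure: {t:.4f} s (about 1/{round(1 / t)} sec)")  # -> 0.0020 s, about 1/500 sec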


Yes, exactly this. Most of us don’t have the option to switch to a global shutter or mechanical shutter on demand, so the best we can do with slower readout CMOS sensors is stop for each photo to eliminate rolling shutter artifacting.

