Ortho and 3D full of holes

I’m not sure what is going on here. I’ve made several attempts at this with various settings, using high and ultra quality, but all the orthos are full of holes, and the 3D model doesn’t capture the 50% shade cloth tunnels, which was the whole point of the exercise.

177 photos with M2P, nadir and non-nadir at various heights AGL

One of the photos

Ortho - full of holes

Missing tunnels

1 Like

What does the Point Cloud look like?

What settings did you use for this run?

1 Like

02:19:31
Completed
Task Output:

    Created on: 28/12/2021, 18:44:16
    Processing Node: node-odm-1 (auto)
    Options: boundary: {"type":"FeatureCollection","features":[{"type":"Feature","properties":{},"geometry":{"type":"Polygon","coordinates":[[[151.0513673722744,-31.331642402780624],[151.05130165815353,-31.33174492798513],[151.05159737169743,-31.331869790932522],[151.0516744852066,-31.33171285307283],[151.05138413608074,-31.33160460016308],[151.0513673722744,-31.331642402780624]]]}}]}, crop: 1, dem-resolution: 1, dsm: true, gps-accuracy: 5, mesh-octree-depth: 13, mesh-size: 300000, min-num-features: 12000, orthophoto-resolution: 1, pc-geometric: true, pc-quality: high, pc-rectify: true, texturing-keep-unseen-faces: true, use-3dmesh: true
    Average GSD: 0.45 cm
    Area: 875.49 m²
    Reconstructed Points: 36,025,370
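
For anyone wanting to reproduce a run like this outside the WebODM UI, the same options map directly onto a NodeODM task. A minimal sketch using the pyodm client, assuming a node at localhost:3000 and images in ./images (both placeholders); the boundary GeoJSON is omitted for brevity:

    # Minimal sketch: submit the options above to a NodeODM node with pyodm.
    # Assumptions: `pip install pyodm`, a node at localhost:3000, images in ./images.
    import glob
    from pyodm import Node

    options = {
        "crop": 1, "dem-resolution": 1, "dsm": True, "gps-accuracy": 5,
        "mesh-octree-depth": 13, "mesh-size": 300000, "min-num-features": 12000,
        "orthophoto-resolution": 1, "pc-geometric": True, "pc-quality": "high",
        "pc-rectify": True, "texturing-keep-unseen-faces": True, "use-3dmesh": True,
        # "boundary": <GeoJSON FeatureCollection as shown above>,
    }

    node = Node("localhost", 3000)
    task = node.create_task(glob.glob("./images/*.JPG"), options)
    task.wait_for_completion()
    task.download_assets("./results")  # orthophoto, DSM, point cloud, textured model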

1 Like

That texturing-keep-unseen-faces option will create those grey areas; I think it looks better without it, and I’ve seen it make the orthophoto look worse.

Try without and see if you get a more compelling result.

For the ortho you can try loading the cloud into CloudCompare and making an ortho there.
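
If you would rather script that step than do it interactively in CloudCompare, a rough Python equivalent is to grid the cloud’s colours onto a top-down raster. This is only a sketch, not what CloudCompare does internally; it assumes the georeferenced cloud exported as LAS/LAZ with 16-bit RGB, plus the laspy, numpy and imageio packages (file names and cell size are placeholders):

    # Sketch: a crude top-down "ortho" rasterised straight from the point cloud.
    # Assumptions: `pip install laspy numpy imageio`, cloud exported as LAS/LAZ with RGB.
    import laspy
    import numpy as np
    import imageio.v2 as imageio

    las = laspy.read("point_cloud.las")   # placeholder path
    cell = 0.01                           # raster cell size in metres (~1 cm)

    x, y, z = np.asarray(las.x), np.asarray(las.y), np.asarray(las.z)
    cols = ((x - x.min()) / cell).astype(int)
    rows = ((y.max() - y) / cell).astype(int)

    img = np.zeros((rows.max() + 1, cols.max() + 1, 3), dtype=np.uint16)
    rgb = np.column_stack([las.red, las.green, las.blue])
    order = np.argsort(z)                 # write highest points last so they win per cell
    img[rows[order], cols[order]] = rgb[order]

    # ODM point clouds normally store 16-bit colour; drop the shift if yours is 8-bit.
    imageio.imwrite("ortho_from_cloud.png", (img >> 8).astype(np.uint8))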

2 Likes

I’m curious whether using --pc-filter 0 might help a bit.

How long does a re-run take for this dataset?

I only tried that on the third attempt and it made no real difference; the grey areas were there on the first two attempts, which had these settings:

99 images 01:18:29

Options: auto-boundary: true, boundary: {"type":"FeatureCollection","features":[{"type":"Feature","properties":{},"geometry":{"type":"Polygon","coordinates":[[[151.0513673722744,-31…], etc}, dsm: true, dtm: true, feature-quality: ultra, gps-accuracy: 4, mesh-octree-depth: 12, mesh-size: 300000, min-num-features: 10000, orthophoto-resolution: 1, pc-geometric: true, pc-quality: high, pc-rectify: true, use-3dmesh: true
Average GSD: 0.69 cm
Area: 1,150.35 m²
Reconstructed Points: 9,611,489

and then I took more images, but that didn’t help

177 images 00:23:32

Created on: 27/12/2021, 18:08:19
Processing Node: node-odm-1 (auto)

Options: auto-boundary: true, boundary: {"type":"FeatureCollection","features":[{"type":"Feature","properties":{},"geometry":{"type":"Polygon","coordinates":[[[151.0513673722744,-31. etc}, dem-resolution: 1, dsm: true, gps-accuracy: 5, mesh-octree-depth: 12, mesh-size: 300000, min-num-features: 10000, orthophoto-resolution: 1, pc-geometric: true, pc-quality: high, pc-rectify: true, use-3dmesh: true
Average GSD: 0.91 cm
Area: 871.31 m²
Reconstructed Points: 4,920,304


I’ll try that, it doesn’t take too long.

I tried Reality Capture; the 3D model had a similar problem with the shade cloth, although the result was slightly better, and the ortho didn’t have any holes in it, on the shade cloth, ground or house roof.

1 Like

The latest attempt was better, although a few holes are still evident.

With these settings, which took quite a bit longer-

177 images 05:39:17
Created on: 30/12/2021, 17:10:03
Processing Node: node-odm-1 (auto)

Options: boundary: {"type":"FeatureCollection","features":[{"type":"Feature","properties":{},"geometry":{"type":"Polygon","coordinates":[[[151.0513673722744,-31.3etc}, dsm: true, gps-accuracy: 4, matcher-neighbors: 0, mesh-octree-depth: 11, mesh-size: 300000, min-num-features: 10000, orthophoto-resolution: 1, pc-filter: 0, pc-geometric: true, pc-quality: high, pc-rectify: true, use-3dmesh: true

1 Like

Can you do a pass with feature-quality ultra and pc-quality ultra?

Also, what about constraining pc-sample to, say, half the evaluated GSD to keep processing a bit more in check?

1 Like

Resized 0.5X to speed it up
177 images 00:40:45

Created on: 01/01/2022, 11:12:19
Processing Node: node-odm-1 (auto)

Options: boundary: {"type":"FeatureCollection","features":[{"type":"Feature","properties":{},"geometry":{"type":"Polygon","coordinates":[[[151.051etc]}, dsm: true, feature-quality: ultra, mesh-octree-depth: 12, mesh-size: 300000, min-num-features: 10000, pc-geometric: true, pc-quality: ultra, pc-rectify: true, pc-sample: 2, use-3dmesh: true
Average GSD: 0.92 cm
Area: 912.58 m²
Reconstructed Points: 391

That certainly made for a very sparse point cloud, even with a 10 million point budget!

A very wacky 3D model -

Ortho, fewer holes, but very distorted -

EDIT: suspecting something was amiss, I checked, and pc-sample units are metres, not cm as for the other settings. Trying again with 0.02 instead of 2!
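
For anyone following the units here: most of the resolution-style options above are in centimetres, but pc-sample is a radius in metres, which is what made 2 so destructive. A quick back-of-the-envelope using the figures quoted in this thread (just arithmetic, no ODM calls):

    # Sketch: why pc-sample: 2 collapsed the cloud, and what "half the GSD" would be.
    gsd_cm = 0.92                     # evaluated GSD of the pc-sample: 2 run, in cm
    half_gsd_m = (gsd_cm / 2) / 100   # half the GSD, expressed in metres
    print(half_gsd_m)                 # 0.0046 m, i.e. ~0.46 cm
    print(0.02 * 100)                 # the retried value of 0.02 m is 2 cm
    print(2 * 100)                    # whereas 2 was read as a 2 m (200 cm) radius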

2 Likes

That is one point per 2m radius :stuck_out_tongue: Sparsest of sparse, haha

2 Likes

But at least it got rid of the holes! :rofl:
I’ve just run it again at 0.02; the holes have returned, and the tunnels still don’t render correctly.

1 Like

Do you have enough RAM to run without resizing, with all the same (corrected) parameters as before?

1 Like

I should have enough, with only 177 images.
I’ll try it without resizing next, after I finish the set of over 800 images I took today.

1 Like

I’m not sure what’s happening with the holes, and I’m glad you two are working through that. As for the tunnels, they won’t render correctly. They’re translucent, and classical SfM algorithms don’t account for translucency. Some deep-learning-based ones do, but those are pretty experimental and highly constrained in their usable contexts, and for those reasons they haven’t made their way into tools like ODM.

2 Likes

Yes, after the various attempts, including with Reality Capture, it’s clear the translucency is a problem. I’d probably need to throw another layer of shade cloth over to get a decent rendering.

I’ll run ultra quality with no resizing now to see how it goes.

2 Likes

Even more holes!

177 images 03:02:51

Created on: 02/01/2022, 13:03:27
Processing Node: node-odm-1 (auto)
Options: auto-boundary: true, boundary: {"type":"FeatureCollection","features":[{"type":"Feature","properties":{},"geometry":{"type":"Polygon","coordinates":[[[151 etc }]}, dem-resolution: 2, dsm: true, feature-quality: ultra, gps-accuracy: 4, mesh-octree-depth: 12, mesh-size: 300000, orthophoto-resolution: 1.5, pc-geometric: true, pc-quality: ultra, pc-rectify: true, pc-sample: 0.02, use-3dmesh: true

Average GSD: 0.46 cm
Area: 903.36 m²
Reconstructed Points: 4,643,395

1 Like

I’m getting the best results with texturing-data-term: area.
My normal pipeline:
dem-resolution: 1, dsm: true, dtm: true, mesh-octree-depth: 13, mesh-size: 800000, min-num-features: 15000, orthophoto-resolution: 1, pc-classify: true, pc-geometric: true, skip-report: true, texturing-data-term: area, use-3dmesh: true
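
In case it helps, that pipeline translates directly into an options dictionary for the same pyodm pattern as in the earlier sketch (node address and image paths are again placeholders):

    # Sketch: the pipeline above as pyodm options (same assumptions as the earlier sketch).
    import glob
    from pyodm import Node

    options = {
        "dem-resolution": 1, "dsm": True, "dtm": True,
        "mesh-octree-depth": 13, "mesh-size": 800000, "min-num-features": 15000,
        "orthophoto-resolution": 1, "pc-classify": True, "pc-geometric": True,
        "skip-report": True, "texturing-data-term": "area", "use-3dmesh": True,
    }

    task = Node("localhost", 3000).create_task(glob.glob("./images/*.JPG"), options)
    task.wait_for_completion()
    task.download_assets("./results")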

3 Likes

I can try that tomorrow, but I have never had this problem with other sets of images, using essentially the same settings.

1 Like

You can try without 3D mesh too. For orthophotos, there’s rarely an advantage to using that.

2 Likes

Things have improved with the settings below. I switched to ORB to save some time; now I just have to figure out whether the holes were due to use-3dmesh or the texturing-data-term… but I’m still wondering why they were there in the first place, as I’ve never had this issue previously.

177 images 00:30:07

Created on: 03/01/2022, 12:59:15
Processing Node: node-odm-1 (auto)
Options: auto-boundary: true, dem-resolution: 2.0, dsm: true, feature-quality: ultra, feature-type: orb, matcher-neighbors: 12, mesh-size: 300000, orthophoto-resolution: 1, pc-quality: ultra, pc-rectify: true, texturing-data-term: area

Average GSD: 0.91 cm
Area: 890.49 m²
Reconstructed Points: 25,190,162
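
Since the open question is whether dropping use-3dmesh or switching texturing-data-term made the difference, one way to keep track is to diff the option sets of two runs directly. A small sketch with abridged dictionaries taken from the runs quoted above:

    # Sketch: diff two runs' options to isolate which change mattered (dicts abridged).
    previous_run = {"feature-quality": "ultra", "pc-quality": "ultra", "pc-sample": 0.02,
                    "mesh-octree-depth": 12, "use-3dmesh": True}
    latest_run = {"feature-quality": "ultra", "pc-quality": "ultra",
                  "feature-type": "orb", "matcher-neighbors": 12,
                  "texturing-data-term": "area"}

    for key in sorted(set(previous_run) | set(latest_run)):
        old, new = previous_run.get(key), latest_run.get(key)
        if old != new:
            print(f"{key}: {old!r} -> {new!r}")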

1 Like