Processing options for a high-quality 3D model

I have already created a couple of really nice 3D models with WebODM.
Currently I am having problems with a dataset of images from my Mavic 2 Pro.
With my Mavic Air I experienced fewer problems than with my Mavic 2 Pro regarding the quality of my 3D models (like holes in roofs…).
The result with this dataset is really bad (I have tried a lot of different settings, changing mesh size, feature quality, mesh octree depth…).
The dataset (112 images) was processed using the following parameters:

auto-boundary: true, dem-resolution: 2.0, dsm: true, matcher-neighbors: 12, mesh-octree-depth: 12, mesh-size: 300000, min-num-features: 10000, orthophoto-cutline: true, orthophoto-resolution: 2.0, pc-geometric: true, pc-quality: high, pc-rectify: true, rerun-from: dataset, texturing-data-term: area, use-3dmesh: true


I have tried the same dataset with DroneDeploy and got the following result:
DroneDeploy Image

Any idea on how to improve the quality with ODM? I really like the workflow and being able to use a local PC for processing.
Thanks a lot :slight_smile:


Please try this one:

mesh-octree-depth = 13
mesh-size = 300000
pc-quality = ultra
feature-quality = ultra
min-num-features = 12000
Don't resize your images
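In command-line form, those settings would look roughly like this (a sketch only: the dataset path and project name are placeholders, it assumes the standard ODM Docker image, and `--resize-to -1` stands in for "don't resize your images"):

```shell
# Placeholder path and project name; adjust to your setup.
docker run -ti --rm -v /path/to/datasets:/datasets opendronemap/odm \
  --project-path /datasets myproject \
  --mesh-octree-depth 13 --mesh-size 300000 \
  --pc-quality ultra --feature-quality ultra \
  --min-num-features 12000 --resize-to -1
```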

Yes, it is more time- and memory-consuming.


Welcome, Hannes!

Sorry for the trouble.

Could you provide a few of the images via a service like ours or another cloud service provider? I’d like to get the Make/Model/Manufacturer out of the images’ EXIF.

It looks like ichsan has you headed in a great direction!

Please keep us apprised of what you get after integrating their changes!


Okay, thanks for the help.
I will try with the given options and will give feedback regarding the results.
In the meantime, here are the original images:

Original Images


With the given settings, without resizing, I get the following result:

[INFO]    Finished odm_filterpoints stage
[INFO]    Running odm_meshing stage
[INFO]    Writing ODM Mesh file in: /var/www/data/75f541bc-2c93-46f8-9818-17dd0e9ef404/odm_meshing/odm_mesh.ply
[INFO]    running "/code/SuperBuild/install/bin/PoissonRecon" --in "/var/www/data/75f541bc-2c93-46f8-9818-17dd0e9ef404/odm_filterpoints/point_cloud.ply" --out "/var/www/data/75f541bc-2c93-46f8-9818-17dd0e9ef404/odm_meshing/odm_mesh.dirty.ply" --depth 13 --pointWeight 4.0 --samplesPerNode 1.0 --threads 7 --bType 2 --linearFit
[WARNING] Child returned 137
[WARNING] PoissonRecon failed with 3 threads, let's retry with 1...
[INFO]    running "/code/SuperBuild/install/bin/PoissonRecon" --in "/var/www/data/75f541bc-2c93-46f8-9818-17dd0e9ef404/odm_filterpoints/point_cloud.ply" --out "/var/www/data/75f541bc-2c93-46f8-9818-17dd0e9ef404/odm_meshing/odm_mesh.dirty.ply" --depth 13 --pointWeight 4.0 --samplesPerNode 1.0 --threads 3 --bType 2 --linearFit
[WARNING] Child returned 137
[WARNING] PoissonRecon failed with 1 threads, let's retry with 0...
[INFO]    running "/code/SuperBuild/install/bin/PoissonRecon" --in "/var/www/data/75f541bc-2c93-46f8-9818-17dd0e9ef404/odm_filterpoints/point_cloud.ply" --out "/var/www/data/75f541bc-2c93-46f8-9818-17dd0e9ef404/odm_meshing/odm_mesh.dirty.ply" --depth 13 --pointWeight 4.0 --samplesPerNode 1.0 --threads 1 --bType 2 --linearFit
[WARNING] Child returned 137
[INFO]    running "/code/SuperBuild/install/bin/OpenMVS/ReconstructMesh" -i "/var/www/data/75f541bc-2c93-46f8-9818-17dd0e9ef404/odm_meshing/odm_mesh.dirty.ply" -o "/var/www/data/75f541bc-2c93-46f8-9818-17dd0e9ef404/odm_meshing/odm_mesh.ply" --remove-spikes 0 --remove-spurious 20 --smooth 0 --target-face-num 600000 -v 0
14:25:42 [App     ] Build date: Jan 27 2022, 12:48:53
14:25:42 [App     ] CPU: Intel(R) Xeon(R) E-2174G CPU @ 3.80GHz (8 cores)
14:25:42 [App     ] RAM: 62.65GB Physical Memory 2.00GB Virtual Memory
14:25:42 [App     ] OS: Linux 5.13.0-28-generic (x86_64)
14:25:42 [App     ] SSE & AVX compatible CPU & OS detected
14:25:42 [App     ] Command line: -i /var/www/data/75f541bc-2c93-46f8-9818-17dd0e9ef404/odm_meshing/odm_mesh.dirty.ply -o /var/www/data/75f541bc-2c93-46f8-9818-17dd0e9ef404/odm_meshing/odm_mesh.ply --remove-spikes 0 --remove-spurious 20 --smooth 0 --target-face-num 600000 -v 0

===== Dumping Info for Geeks (developers need this to fix bugs) =====
Child returned 1
Traceback (most recent call last):
File "/code/stages/", line 94, in execute
File "/code/opendm/", line 346, in run
File "/code/opendm/", line 346, in run
File "/code/opendm/", line 346, in run
[Previous line repeated 3 more times]
File "/code/opendm/", line 327, in run
self.process(self.args, outputs)
File "/code/stages/", line 24, in process
File "/code/opendm/", line 205, in screened_poisson_reconstruction
    '"{reconstructmesh}" -i "{infile}" '
File "/code/opendm/", line 106, in run
raise SubprocessException("Child returned {}".format(retcode), retcode)
opendm.system.SubprocessException: Child returned 1

===== Done, human-readable information to follow... =====

[ERROR]   Uh oh! Processing stopped because of strange values in the reconstruction. This is often a sign that the input data has some issues or the software cannot deal with it. Have you followed best practices for data acquisition? See
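A side note on my reading of the log: return code 137 on Linux conventionally means 128 + signal 9 (SIGKILL), which is usually the kernel’s out-of-memory killer terminating PoissonRecon at depth 13. A small stdlib sketch of that decoding:

```python
import signal

retcode = 137  # what PoissonRecon returned, per the log above
if retcode > 128:
    # Exit codes above 128 conventionally mean "killed by signal (code - 128)".
    sig = signal.Signals(retcode - 128)
    print(sig.name)  # SIGKILL -> on Linux, often the OOM killer
```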

I’m sorry my suggestion couldn’t fix your problem. Someday, I (or other people in this community) will hopefully try your dataset. I want to find out whether it is an ODM bug or not.

An alternative that I sometimes suggest to others, because it gives me fewer problems:

Please try the native ODM command prompt on Windows. You can download the installer at the download site.
For example, put all of your images in D:\Qwerty\code\images

Then run this:

run --project-path "D:\Qwerty" --orthophoto-resolution 2 --pc-geometric --pc-quality ultra --feature-quality ultra --mesh-size 300000 --mesh-octree-depth 13 --min-num-features 12000

Good luck; I hope we will find a solution.


Hi, no worries. Restarted and got the following result:

Any suggestions?


It looks like parts of the imagery of the roof are channel clipping, or near enough, so we’re not getting good reconstruction there. That’s leading to some of the holes and elevated portions of the roof in the reconstruction.

As for the sides, you likely would have needed another “layer” or tier of the survey as an orbit, lower down, with a more shallow gimbal angle.

Created on: 2/10/2022, 5:59:49 PM
Processing Node: UAV4Geo - Lightning (manual)
Options: auto-boundary: true, crop: 0, debug: true, dem-gapfill-steps: 4, dem-resolution: 1, dsm: true, dtm: true, matcher-neighbors: 0, mesh-size: 300000, min-num-features: 64000, orthophoto-resolution: 1, pc-classify: true, pc-filter: 0, pc-geometric: true, pc-quality: high, pc-sample: 0.01, resize-to: -1, use-3dmesh: true, verbose: true
Average GSD: 1.31 cm
Area: 8,027.5 m²
Reconstructed Points: 21,402,715


If I use MeshLab, doing point-cloud-to-mesh reconstruction with Poisson, it is able to fill large holes / produce a mesh in areas with a sparse point cloud. But MeshLab can’t texture from the input images the way ODM does.

Otherwise, IMO, ODM is good at texturing, especially if we use --feature-quality high/ultra. Sadly, the mesh from ODM’s reconstruction still has holes.

If I enable --texturing-keep-unseen-faces, it fills the holes, but the texture is pure white.

Any idea how to fix it?
I will post the photos later tomorrow for a clearer illustration.


I think that’s a side effect of trying to texture faces that don’t have associated points from which to draw coloring. I need to read into this more.


Hi, thanks for your effort and time for investigating my problems.

I tried a different project with more parameters:

Options: auto-boundary: true, dem-gapfill-steps: 10, feature-quality: ultra, mesh-octree-depth: 13, mesh-size: 300000, min-num-features: 12000, pc-geometric: true, pc-quality: ultra, rerun-from: dataset, texturing-data-term: area, use-3dmesh: true
Average GSD: 0.84 cm
Area: 4,758.45 m²
Reconstructed Points: 98,304,333

The facade is not relevant for my projects, only the roof; that’s why I only took one 360-degree shot at the same height as the other images.
But still getting a lot of holes in the roof:

DroneDeploy again outputs a really smooth roof with no holes at all. The points on the roof are found, but they are not mapped onto the same surface as the rest of the roof; instead they end up somewhere in the middle of the building (vertically wrong). Is there anything that can be improved for surface detection or area mapping, like gap filling or anything like that?
Thanks a lot,


You are not alone. I am also experimenting with other presets for filling holes in the mesh.

Recently I tested with --texturing-keep-unseen-faces set to true. It helps fill the holes, but the texture is pure white.

Increasing --mesh-size (from 300k to 500k) and --min-num-features (from the default 8000 to 16000) helps a bit. Still not enough for my dataset, but you can try that too.

Good luck



Could you try dropping --pc-geometric? Apparently, it serves mostly as a spatially-aware filter and might drop some points you may want. It seems like completion of the surface is more important than a “neat” representation for the time being, so I’d leave this for a finishing touch after you try the below.

Similarly, you could try with --pc-filter 0 and maybe --pc-sample 0.007 to see if you can really get as much density out of the point cloud as possible.
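Putting these together with the earlier suggestions, the whole set could be tried in one run. A sketch only (placeholder dataset path and project name, assuming the standard ODM Docker image):

```shell
# Placeholder path and project name; combines the point-cloud density suggestions.
docker run -ti --rm -v /path/to/datasets:/datasets opendronemap/odm \
  --project-path /datasets myproject \
  --pc-filter 0 --pc-sample 0.007 \
  --feature-quality ultra --pc-quality ultra \
  --mesh-size 300000 --mesh-octree-depth 13 \
  --min-num-features 12000
```

Note that --pc-geometric is deliberately left out here, per the suggestion above to drop it until the surface is complete.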

I’m still experimenting with this.

My hypothesis:
The holes are not caused by a failed mesh reconstruction. They happen because mesh texturing removes faces that fail to texture (not enough interpolation / the faces can’t be associated with a texture).

My idea (unproven):

By default, the output mesh doesn’t have any color, so we add the --color flag so it can interpolate the mesh color from the point cloud.

Next, we must activate --texturing-keep-unseen-faces so it doesn’t remove the faces that were already colored from the interpolated point cloud.


We can see the holes in the dense point cloud, however. That’s what I was visualizing above. If we have holes in the point cloud, that will propagate to some degree into the mesh as a result.


Yes, there are holes in the dense point cloud, but screened Poisson reconstruction is sometimes able to fill the mesh. Unfortunately, some faces were removed by the texturing stage since they have no association with, and can’t be interpolated from, the texture of the input images.

--texturing-keep-unseen-faces fills that hole, but gives those faces a pure white color.

So my idea was:
the screened Poisson mesh output from odm_filterpoints/ could be colored by interpolating the colored point cloud.

The rest is to sharpen the texture with the existing/default texturing stage, and to keep faces from being removed by activating the --texturing-keep-unseen-faces flag.


I have tried all the given options; however, I am still getting a lot of holes in the roof.
Is image pre-processing with filters, … an option that could help ODM?


It can, but it’s tricky. If the images lack sufficient dynamic range (and JPEGs pretty much exemplify this limitation), you’ll have a nearly impossible time recovering any texture/detail from the parts of the roof that clipped to white.

You can certainly try, however!

Using image pre-processing with the following settings, and furthermore a polarizing filter on the camera, helped a lot in my case:


Fixing it in situ is always best. A polarizer is a great way to reduce specular reflections! Great that you were able to use one!
