In the vein of @smathermather’s recent exploration of mapping all the things with ODM, I’ve been tinkering with towers and trees. These are traditionally hard subjects for reconstruction (in my experience, at least; maybe I can learn something!), so here is a first-cut attempt. The flight pattern was lots of circles at different heights, plus some passes with an upward-looking camera (increasing the exposure offset so that detail is captured where there’s no direct light).
My ODM settings were:
--camera-lens brown --dem-resolution 2 --orthophoto-resolution 2 --mesh-octree-depth 12 --depthmap-resolution 1200
…I can see that it’s not quite there: the tower (a single pipe) is a bit wonky, and the solar panels are not flat. I’m running again now with --depthmap-resolution 2048 --opensfm-depthmap-method BRUTE_FORCE to obtain more points, and will post the results.
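In case it helps anyone reproduce this, here is roughly how those flags combine into a single run. This is only a sketch: the Docker invocation and --project-path layout follow the standard ODM docs, and the dataset path and project name (tower_scan) are placeholders for my local setup.

docker run -ti --rm -v /path/to/datasets:/datasets opendronemap/odm \
  --project-path /datasets tower_scan \
  --camera-lens brown \
  --dem-resolution 2 --orthophoto-resolution 2 \
  --mesh-octree-depth 12 \
  --depthmap-resolution 2048 \
  --opensfm-depthmap-method BRUTE_FORCE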
Has anyone had success with modelling similarly complex objects? What settings did you use?
Thanks @pierotofy - I think the problem goes a step deeper, to the point cloud underlying the mesh. Here’s the set of filtered points after increasing the depthmap resolution and using BRUTE_FORCE for depthmap matching:
Meshing will be hard from this, however it rolls. So my next steps:
Agreed about the underlying point cloud. I’d be interested to see what happens if you fly closer + slower + with higher overlap (basically, as you said… more photos).
I’ve added a bunch more images to today’s run. I should also mention this was made with ~140 images shot from a range of 15 to 20 metres, so not bad really! By tuning ODM a little I got this:
I think the key improvements are the higher-resolution depthmaps and stricter confidence / point outlier filtering.
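To be concrete about the outlier filtering side (and this is just one way to do it, not necessarily what suits every dataset): a statistical outlier removal pass with PDAL on the exported cloud looks something like the command below. The file names are placeholders, and mean_k / multiplier are starting values to tune.

pdal translate odm_georeferenced_model.laz filtered.laz outlier range \
  --filters.outlier.method=statistical \
  --filters.outlier.mean_k=12 \
  --filters.outlier.multiplier=2.0 \
  --filters.range.limits="Classification![7:7]"

The outlier stage marks noise points as class 7, and the range stage then drops them before meshing.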
60 more images are in today’s run, mainly really close to the pole (about 5 metres away). All were hand-shot, so exposure compensation could be adjusted for upward- or downward-looking images… I’ve run with the same parameters and will edit this post with the result.
Per Piero: look at tsr as well. It employs some good filtering techniques specific to aggregate probability functions for multiple views, so it theoretically does more intelligent filtering than you’re likely to get on the point cloud alone.
Yay! I tried a couple of weeks ago, but was too distracted. It involves compiling a compiler, so it’ll be a fun one. But I didn’t hit any real barriers other than time and attention.
@adamsteer Could you possibly make the image set available for testing? I have had similar difficulties, especially with construction sites (which is what I map most often), and would like to try some of the task options that have been more or less successful for me.
Mesh and points below. In particular, the solar panels are vastly better. I think slightly stricter outlier filtering would work better around the pole. Next time I try something like this I will at least double the image collection, likely triple it (to 400+ images), with a similar flight plan.
I have been struggling with similar images as well. When I try an octree depth of 14 my ’puter crashes at the end… and I have lots of RAM assigned. If I drop it to 12, it finishes.
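That matches what I’d expect: each extra level of octree depth roughly doubles the grid resolution per axis, so memory use climbs steeply between 12 and 14. A compromise I’ve been meaning to try (untested on my side, and the vertex count is a guess to tune) is staying at depth 12 but raising the mesh vertex budget:

--mesh-octree-depth 12 --mesh-size 400000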