Hello,
I am attempting to use ODM for a workflow I commonly run with commercial software: generating DSMs and orthomosaics from sets of historic aerial photographs. In the example I’m testing with, I have 11 scenes from the USGS collected with about 60% image-to-image overlap. I have processed this exact dataset successfully with Agisoft Photoscan, so I know there is sufficient overlap and sidelap, with no gaps, to allow processing. However, when I process with ODM I get an error that there is no reconstruction.json file, which I’m guessing means that point matching between images did not generate enough tie points. The command I am using to run ODM is:
docker run -it --rm \
-v $(pwd)/images:/code/images \
-v $(pwd)/odm_meshing:/code/odm_meshing \
-v $(pwd)/odm_orthophoto:/code/odm_orthophoto \
-v $(pwd)/odm_georeferencing:/code/odm_georeferencing \
-v $(pwd)/odm_texturing:/code/odm_texturing \
-v $(pwd)/opensfm:/code/opensfm \
-v $(pwd)/pmvs:/code/pmvs \
-v $(pwd)/odm_dem:/code/odm_dem \
opendronemap/opendronemap --opensfm-depthmap-min-consistent-views 2 --min-num-features 15000 --force-focal 152.929 \
--resize-to -1 --skip-3dmodel --matcher-neighbors 0 --mesh-octree-depth 12 --dsm --rerun-all --verbose --max-concurrency 2 --matcher-distance 0
Based on other posts in this forum, I set --opensfm-depthmap-min-consistent-views to 2, since each point is likely to appear in only two images. Because I am only processing 11 images I do not resize them, and based on the Photoscan results, 15000 features should be sufficient. I have also tried a higher feature count (40,000) and it failed with the same error message.
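If it helps with diagnosis, here is a quick check I can run after a run to confirm these flags are actually reaching OpenSfM. This is only a sketch: it assumes ODM writes its OpenSfM settings to opensfm/config.yaml, and the key names below are my guess at how the flags map.

# Sketch: inspect the OpenSfM config generated in the mounted project folder.
# feature_min_frames, depthmap_min_consistent_views, matching_gps_distance and
# matching_gps_neighbors are my guess at the keys these ODM flags translate into.
grep -E "feature_min_frames|depthmap_min_consistent_views|matching_gps" opensfm/config.yaml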
The images were all preprocessed to remove the black borders and set to common dimensions, and I added EXIF data with the lat/lon/altitude of the collection platform. I set --matcher-distance to 0 to make sure I wasn’t excluding matches because of errors in the GPS locations (which can be > 200 m).
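For reference, this is roughly how I wrote the GPS tags into each image. It is an exiftool sketch; the coordinates and altitude shown are placeholders, not the real values for this frame.

# Sketch only: placeholder position for one frame; each image got its own
# approximate platform lat/lon/altitude (decimal degrees, metres above sea level).
exiftool -GPSLatitude=38.5042 -GPSLatitudeRef=N \
         -GPSLongitude=90.2083 -GPSLongitudeRef=W \
         -GPSAltitude=4500 -GPSAltitudeRef=0 \
         1VFNS00010269.jpg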
The end of my error trace is:
[DEBUG] running PYTHONPATH=/code/SuperBuild/install/lib/python2.7/dist-packages /code/SuperBuild/src/opensfm/bin/opensfm match_features /code/opensfm
2019-03-03 03:38:13,278 INFO: Matching 0 image pairs
2019-03-03 03:38:13,544 INFO: Matching 1VFNS00020012.jpg - 1 / 11
2019-03-03 03:38:13,735 INFO: Matching 1VFNS00010274.jpg - 2 / 11
2019-03-03 03:38:13,735 INFO: Matching 1VFNS00010269.jpg - 3 / 11
2019-03-03 03:38:13,741 INFO: Matching 1VFNS00020014.jpg - 4 / 11
2019-03-03 03:38:13,744 INFO: Matching 1VFNS00010270.jpg - 5 / 11
2019-03-03 03:38:13,746 INFO: Matching 1VFNS00010272.jpg - 6 / 11
2019-03-03 03:38:13,749 INFO: Matching 1VFNS00020013.jpg - 7 / 11
2019-03-03 03:38:13,752 INFO: Matching 1VFNS00010271.jpg - 8 / 11
2019-03-03 03:38:13,755 INFO: Matching 1VFNS00020011.jpg - 9 / 11
2019-03-03 03:38:13,756 INFO: Matching 1VFNS00010273.jpg - 10 / 11
2019-03-03 03:38:13,758 INFO: Matching 1VFNS00020015.jpg - 11 / 11
[DEBUG] running PYTHONPATH=/code/SuperBuild/install/lib/python2.7/dist-packages /code/SuperBuild/src/opensfm/bin/opensfm create_tracks /code/opensfm
2019-03-03 03:38:14,152 INFO: reading features
2019-03-03 03:38:16,169 DEBUG: Merging features onto tracks
2019-03-03 03:38:16,169 DEBUG: Good tracks: 0
[DEBUG] running PYTHONPATH=/code/SuperBuild/install/lib/python2.7/dist-packages /code/SuperBuild/src/opensfm/bin/opensfm reconstruct /code/opensfm
2019-03-03 03:38:16,500 INFO: Starting incremental reconstruction
2019-03-03 03:38:16,508 INFO: 0 partial reconstructions in total.
Traceback (most recent call last):
  File "/code/run.py", line 47, in <module>
    plasm.execute(niter=1)
  File "/code/scripts/run_opensfm.py", line 141, in process
    image_scale = gsd.image_scale_factor(args.orthophoto_resolution, tree.opensfm_reconstruction)
  File "/code/opendm/gsd.py", line 31, in image_scale_factor
    gsd = opensfm_reconstruction_average_gsd(reconstruction_json)
  File "/usr/local/lib/python2.7/dist-packages/repoze/lru/__init__.py", line 348, in cached_wrapper
    val = func(*args, **kwargs)
  File "/code/opendm/gsd.py", line 75, in opensfm_reconstruction_average_gsd
    raise IOError(reconstruction_json + " does not exist.")
Thank you for any help you can offer.