I have tried my best searching the issues page on GitHub and these community forums, but I haven’t found this exact issue. My question has two parts:
Firstly: I have not tinkered with the default output path variable or environment. The code ran partially but crashed at different points for datasets of varying sizes. Initially it returned an `Exception: Child returned 137` error, but I quickly found out that this was due to limited available memory. I finally tried running

`docker run -it --rm -v "($path)/datasets/project/images:/code/images" opendronemap/odm`

against just 16 images, and the code still didn’t run to completion, aborting with `Exception: Child returned 134`.
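From what I understand (my own interpretation, not something stated in the ODM docs), exit codes above 128 follow the POSIX convention of 128 + signal number, which any shell can decode:

```shell
# Exit status above 128 = 128 + number of the signal that killed the child.
echo $((137 - 128))   # 9
kill -l 9             # KILL  (SIGKILL, typically the kernel OOM killer)
echo $((134 - 128))   # 6
kill -l 6             # ABRT  (SIGABRT, the process aborted itself)
```

So 137 matches the out-of-memory explanation, and 134 means some stage called abort() rather than being killed from outside.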
Here are my system specifications:
Intel Core M 5Y10c with a base clock speed of 1.00 GHz and a turbo boost up to 2.00 GHz; dual core, 4 logical processors, with virtualisation enabled.
L1 Cache = 128 KB
L2 Cache = 512 KB
L3 Cache = 4.0 MB
And most importantly, just 4GB of RAM
Secondly: can I open any of the temporary files that were generated? My main motive is to obtain the dense point clouds that were generated and analyse them. Every time I run the code, I find a significant increase in the size of the AppData folder for the current user, specifically C:\Users\user\AppData\Local\Docker\wsl\data\ext4.vhdx. I had come across a post carrying a capitalised red warning from Microsoft stating that I should not touch any file in there from File Explorer, so I’m lost as to how I can access the temporary files and take a visual look at them (provided they are accessible).
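One workaround I am considering (a sketch based on the invocation in the README, with a hypothetical host path; I have not verified it on my machine) is to mount the whole project folder rather than only images/, so that ODM writes its per-stage output folders to the Windows filesystem instead of inside ext4.vhdx:

```shell
# Sketch, not verified: mount the entire project directory (host path is
# hypothetical) so the intermediate stage folders — including the dense
# point cloud — land on the Windows side, where they can be opened normally.
docker run -it --rm \
  -v "/c/Users/user/datasets/project:/datasets/code" \
  opendronemap/odm --project-path /datasets
```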
Here is a summary of the log before the crash:
- Merges features onto tracks and finds the number of ‘good’ tracks (here 9497)
- Attempts incremental reconstruction (using each image pair)
- “Two-view reconstruction inliers: 935 / 935”
- “Triangulated: 935”
- Runs the Ceres Solver
- Removes outliers from the ‘inliers’ calculated for each image pair
- Reconstructed image is available
- Undistorts images
- Skips exporting geocoords_transformation.txt (shown as a warning, due to a lack of meta-information)
- Finished OpenSfM
- Estimated Depth-Maps (in 18m)
- Filtered all Depth-Maps
- Fused all Depth-Maps
- Finished OpenMVS
- Finished odm_filterpoints stage
- Created DSM for 2.5D mesh
- DSM resolution = 0.0036512934767399326 (0.09cm/pixel)
- Point cloud bounds are calculated (trivial)
- “[minx: -4.466757774, maxx: 4.508954525] [miny: -4.692795753, maxy: 4.963309288]”
- “DEM resolution is (2645, 2459), max tile size is 4096, will split DEM generation into 1 tiles”
- Completed smoothing to create mesh_dsm.tif
- Created mesh from DSM to complete odm_meshing stage
- Loaded and prepared the mesh
- Built an adjacency graph (591220 edges)
- The texturing stage then logs (quoted verbatim):

```
Building BVH from 394195 faces… done. (Took: 993 ms)
Calculating face qualities 100%… done. (Took 19.852s)
Postprocessing face infos 100%… done. (Took 1.368s)
Maximum quality of a face within an image: 2712.83
Clamping qualities to 62.9438 within normalization.
62838 faces have not been seen
Took: 47.901s
Generating texture patches:
Running… done. (Took 16.189s)
4814 texture patches.
Running global seam leveling:
Create matrices for optimization… done.
Lhs dimensionality: 201772 x 201772
Calculating adjustments:
Color channel 0: CG took 93 iterations. Residual is 9.96233e-05
Color channel 1: CG took 94 iterations. Residual is 9.69049e-05
Color channel 2: CG took 93 iterations. Residual is 9.97278e-05
Took 1.403 seconds
Adjusting texture patches 100%… done. (Took 7.792s)
Running local seam leveling:
```
After which it aborted with `Exception: Child returned 134`.
No parameters were fine-tuned. I can’t even access most Docker Desktop settings, like the memory allocation sliders shown in many YouTube videos.
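For reference, the only memory knob I have found for the WSL2 backend (Docker Desktop hides the Resources sliders when using WSL2) is a `.wslconfig` file in `%UserProfile%`; the values below are only an illustration for a 4 GB machine, not something I have tested:

```
[wsl2]
# Cap WSL2 RAM so Windows itself stays responsive (illustrative value).
memory=3GB
# Generous swap to ride out peaks that would otherwise end in exit code 137
# (illustrative value).
swap=16GB
```

As I understand it, a `wsl --shutdown` followed by restarting Docker Desktop is needed for the file to take effect.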
I am currently using ODM (not WebODM). I use Microsoft Edge running on Windows 10 Home. The software was installed through Docker Desktop (I tried both the WSL2 installation and a native Ubuntu installation on WSL2 before attempting it through Docker). I am not familiar with the whole procedure and mostly copy-pasted the commands from the README.md file. I ran it against test cases with significant overlap (~80%), and the last subset of only 16 images that I mentioned was a strictly linear dataset.
Any help would be useful. Cheers!