Mvstex stage fails

I executed the following command to generate a DSM from a TIFF file (size > 1 GB):

docker exec -it python3 /code/run.py --project-path /datasets project --dsm --orthophoto-resolution 2 --geo /code/geo.txt

The previous issues with file size and undistortion are resolved. Now the script fails while executing "code/stages/mvstex.py" at line 108, with the following error:

[INFO] running "/code/SuperBuild/install/bin/texrecon" "/datasets/project/opensfm/undistorted/reconstruction.nvm" "/datasets/project/odm_meshing/odm_mesh.ply" "/datasets/project/odm_texturing/odm_textured_model_geo" -d gmi -o gauss_clamping -t none --no_intermediate_results
/code/SuperBuild/install/bin/texrecon (built on Sep 25 2021, 08:33:41)
Load and prepare mesh:
Reading PLY: 199919 verts… 399834 faces… done.
Generating texture views:
NVM: Loading file…
NVM: Number of views: 24
NVM: Number of features: 432144

    Loading 100%... done. (Took 14.641s)

Building adjacency graph:

    Adding edges 100%... done. (Took 0.633s)
    599751 total edges.

View selection:
Building BVH from 399834 faces… done. (Took: 231 ms)

    Calculating face qualities 100%... done. (Took 29.904s)


    Postprocessing face infos 100%... done. (Took 0.028s)
    Maximum quality of a face within an image: 36157.8
    Clamping qualities to 4881.8 within normalization.
    Optimizing:
            Time[s] Energy
            0       366535
            0       364936
            1       362097
            1       360443
            1       359700
            2       359272
            2       359002
            2       358894
            2       358732
            3       358616
            3       358526
            3       358444
            3       358367
    22039 faces have not been seen
    Took: 34.633s

Generating texture patches:
Running… done. (Took 19.088s)
558 texture patches.
Running global seam leveling:
Create matrices for optimization… done.
Lhs dimensionality: 199639 x 199639
Calculating adjustments:
Color channel 0: CG took 65 iterations. Residual is 9.72896e-05
Color channel 2: CG took 66 iterations. Residual is 9.43003e-05
Color channel 1: CG took 65 iterations. Residual is 9.93726e-05
Took 0.241 seconds

    Adjusting texture patches 100%... done. (Took 13.859s)

Running local seam leveling:

    Blending texture patches 100%... done. (Took 48.723s)

Generating texture atlases:
Sorting texture patches… done.
terminate called after throwing an instance of 'std::bad_alloc'

Please advise: which parameter needs to be adjusted here?


If I’m not mistaken, I think you may have exceeded the max texture size that MVSTexturing/texrecon can handle.

Seems like this:
ODM 80MP image frames as inputs - ODM - OpenDroneMap Community

Thanks Saijin_Naib. Since I now have images with higher pixel counts, I will update "define MAX_TEXTURE_SIZE (8 * 1024)" in the script and share the results.
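For reference, the change is a single define. The exact file varies by mvs-texturing version, so search the source tree for MAX_TEXTURE_SIZE; the values below are the ones quoted in this thread:

```cpp
/* Original (8192 px per atlas side): */
#define MAX_TEXTURE_SIZE (8 * 1024)

/* Changed to, e.g. (24576 px per side): */
#define MAX_TEXTURE_SIZE (24 * 1024)
```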


I tried updating the above value, but no luck. Any leads or alternatives?


Hmm, not as far as I know. I think at this point we’re well past what is well-tested in this pipeline as we’ve modified a few core libraries to handle texture dimensions far beyond what they are specced for. I don’t know where it could be failing now.

Any error logs or other output you can share?

It is the same error that I mentioned above.

What value did you change MAX_TEXTURE_SIZE to?

It looks like ODM v2.64 should support 24K textures in MVS_Texturing.

I ran ODM with a few values, and these were the outcomes:

  1. With 5 files and the default max_texture_size, it failed with an out-of-memory error. 99% of the texturing step completed; it failed at the last stage.

  2. With 24 files and the default value, it failed with a std::bad_alloc error.

  3. With max_texture_size set to 26460*17004 (the TIFF dimensions), it failed with a "bad input" error.

  4. With max_texture_size set to 32768 * 32768 or 65536 * 65536, it goes into an infinite loop.

  5. With values above 16*1024 and below 32768 * 32768, it failed with the error "Oh no! It looks like your CPU is not supported".

  6. In some cases it failed with a "std::bad_length" error.

To summarize, for a set of TIFF files with the following properties:

  1. Size: 1.3 GB
  2. Dimensions: 26460*17002
  3. Pixels: 550 MP

texturing fails at the atlas-sorting stage with the various errors above.


Maybe try this:
MAX_TEXTURE_SIZE (439330 * 1024)

That should fit an entire image in one texture page… No idea how badly this will break, though.

It failed with a std::bad_alloc error.

Alright, so it seems we're exceeding something in MVS_Texturing that simply setting MAX_TEXTURE_SIZE bigger doesn't address.

https://en.cppreference.com/w/cpp/memory/new/bad_alloc

Okay, it looks like we're using UINT (unsigned integer) arithmetic, which is just slightly too small to hold such gigantic texture dimensions. I think this means we'd need to rewrite mvs-texturing to use a wider type (uint64?) instead of uint32.

I’m pretty sure you cannot process images this large with ODM (what kind of images are these? Satellite?)


Can you help with updating the script?


Trust me, you do not want me near code.

Above max_texture_size 24*1024, the script starts failing with a std::bad_alloc error…


Yep, so as I said above, I think we'd need to rewrite the code to use an integer type wider than 32 bits, since we're getting texture dimensions larger than it can hold. Just adjusting the script isn't going to address that limitation.

For max_texture_size in the range (8 * 1024) to (24 * 1024), the script exits with an out-of-memory error.

Got it, but who would rewrite the code? Any idea which set of scripts has to be updated?

Piero, possibly, and it looks like the entire program might have to be rewritten to use uint64 instead of uint32: so, all of the .cpp files in the mvs-texturing source tree. And I have no idea if that's possible or what it might break.
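A hypothetical sketch of the kind of change involved (the function names are illustrative, not from the mvs-texturing sources): every pixel-index or byte-size product has to be promoted to 64 bits before the multiply, not after.

```cpp
#include <cstdint>

// BROKEN pattern: both operands are 32-bit, so the multiply itself
// wraps modulo 2^32 before the wider return type ever sees the result.
std::uint64_t pixel_index_32(std::uint32_t row, std::uint32_t col,
                             std::uint32_t width) {
    return row * width + col;  // 32-bit multiply, silently wraps
}

// FIXED pattern: promote one operand first so the multiply is 64-bit.
std::uint64_t pixel_index_64(std::uint32_t row, std::uint32_t col,
                             std::uint32_t width) {
    return std::uint64_t(row) * width + col;
}
```

For example, indexing row 70000 of a 70000-px-wide image gives 605032704 with the broken pattern but the correct 4900000000 with the fixed one, which is why just changing a return type is not enough: the arithmetic sites themselves have to be audited.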