Exception: Child returned 100

Hi, this is my first time coming across this error message.

I was using an AWS ECS Fargate instance ("cpu": 4096, "memory": 30720) with ODM installed via Docker.

The images are here
The detailed error from the log is below (I also copied a few lines from above it).
To me it seems like the orthophoto was produced successfully, but the run failed at the overview-building stage.
Could this potentially be a bug?

Thanks

[INFO]    running gdalwarp -cutline /odm/a656f769-38cf-4d34-bd8f-f8c87f2a0df6/code/odm_georeferencing/odm_georeferenced_model.bounds.gpkg -crop_to_cutline -co TILED=YES -co COMPRESS=DEFLATE -co PREDICTOR=2 -co BIGTIFF=IF_SAFER -co BLOCKXSIZE=512 -co BLOCKYSIZE=512 -co NUM_THREADS=4 -dstalpha /odm/a656f769-38cf-4d34-bd8f-f8c87f2a0df6/code/odm_orthophoto/odm_orthophoto.original.tif /odm/a656f769-38cf-4d34-bd8f-f8c87f2a0df6/code/odm_orthophoto/odm_orthophoto.tif --config GDAL_CACHEMAX 46.45%
Using band 4 of source image as alpha.
Creating output file that is 23179P x 12807L.
Processing /odm/a656f769-38cf-4d34-bd8f-f8c87f2a0df6/code/odm_orthophoto/odm_orthophoto.original.tif [1/1] : 0...10...20...30...40...50...60...70...80...90...100 - done.
[INFO]    Building Overviews
[INFO]    running gdaladdo -ro -r average --config BIGTIFF_OVERVIEW IF_SAFER --config COMPRESS_OVERVIEW JPEG /odm/a656f769-38cf-4d34-bd8f-f8c87f2a0df6/code/odm_orthophoto/odm_orthophoto.tif 2 4 8 16
0...10...20...30...40...50...60...70...80...90...100 - done.
Overview building failed.
Traceback (most recent call last):
  File "/code/run.py", line 68, in <module>
    app.execute()
  File "/code/stages/odm_app.py", line 82, in execute
    self.first_stage.run()
  File "/code/opendm/types.py", line 338, in run
    self.next_stage.run(outputs)
  File "/code/opendm/types.py", line 338, in run
    self.next_stage.run(outputs)
  File "/code/opendm/types.py", line 338, in run
    self.next_stage.run(outputs)
  [Previous line repeated 6 more times]
  File "/code/opendm/types.py", line 319, in run
    self.process(self.args, outputs)
  File "/code/stages/odm_orthophoto.py", line 137, in process
    orthophoto.post_orthophoto_steps(args, bounds_file_path, tree.odm_orthophoto_tif, tree.orthophoto_tiles)
  File "/code/opendm/orthophoto.py", line 63, in post_orthophoto_steps
    build_overviews(orthophoto_file)
  File "/code/opendm/orthophoto.py", line 34, in build_overviews
    system.run('gdaladdo -ro -r average '
  File "/code/opendm/system.py", line 79, in run
    raise Exception("Child returned 
{}
".format(retcode))
Exception: Child returned 100
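
For what it's worth, "Child returned 100" looks like it just relays gdaladdo's exit code, so I may try re-running the same command by hand to see if GDAL prints a more specific error. A minimal sketch (the path is copied from the log above; the arguments mirror what ODM's system.run invoked):

```python
# Hypothetical standalone repro of the failing step: run the exact
# gdaladdo command from the log and capture its output directly.
import subprocess

# Path copied from the log above; adjust for your own project id.
ortho = "/odm/a656f769-38cf-4d34-bd8f-f8c87f2a0df6/code/odm_orthophoto/odm_orthophoto.tif"

result = subprocess.run(
    ["gdaladdo", "-ro", "-r", "average",
     "--config", "BIGTIFF_OVERVIEW", "IF_SAFER",
     "--config", "COMPRESS_OVERVIEW", "JPEG",
     ortho, "2", "4", "8", "16"],
    capture_output=True, text=True,
)
print("exit code:", result.returncode)
print(result.stderr)  # GDAL usually prints the underlying error here
```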

Pure hunch, but did you run out of storage?

I wouldn’t think so. I’ve run 600 images with the same settings, and this one is only around 200 images.
Plus, the error says:

running gdaladdo -ro -r average --config BIGTIFF_OVERVIEW IF_SAFER --config COMPRESS_OVERVIEW JPEG /odm/a656f769-38cf-4d34-bd8f-f8c87f2a0df6/code/odm_orthophoto/odm_orthophoto.tif 2 4 8 16
0...10...20...30...40...50...60...70...80...90...100 - done.
Overview building failed.

But here’s an update: when I removed the --build-overviews flag, it worked!

Now I’m really confused. I’ve used this flag before with other datasets and didn’t have any problems building overviews.
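
In case it's useful to anyone else hitting this: the overviews can still be added to the finished GeoTIFF after the fact, without re-running the pipeline. A minimal sketch using the GDAL Python bindings (assuming osgeo is installed; the resampling and levels mirror ODM's command above, though I've set COMPRESS_OVERVIEW to DEFLATE here rather than ODM's JPEG, purely as an illustration of the knob):

```python
# Sketch: build external overviews (.ovr) for the finished orthophoto,
# equivalent in spirit to gdaladdo -ro -r average ... 2 4 8 16.
from osgeo import gdal

gdal.UseExceptions()
gdal.SetConfigOption("BIGTIFF_OVERVIEW", "IF_SAFER")
gdal.SetConfigOption("COMPRESS_OVERVIEW", "DEFLATE")  # ODM passes JPEG here

ds = gdal.Open("odm_orthophoto.tif")        # read-only open -> external .ovr, like -ro
ds.BuildOverviews("AVERAGE", [2, 4, 8, 16])
ds = None                                   # close the dataset to flush the .ovr
```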

Welcome! Sorry for the trouble with that.

Overviews can be really heavy to calculate, especially on datasets that cover a very large reconstruction area.

Did you happen to pass --auto-boundary or use a --boundary GeoJSON to constrain the reconstruction area more tightly? That may help in this instance.

Not yet, but good to know. I’ll give it a go next time I come across one.

But just to clarify: is --auto-boundary trying to get rid of the potential blank space around the generated TIFF? Also, is there a general limit past which a dataset is too heavy to build overviews for? Thanks

--auto-boundary is more or less just trying to make a minimum bounding polygon that encompasses all the “real” points that were reconstructed, while throwing out gross outliers.
https://docs.opendronemap.org/arguments/auto-boundary/

You may well still have blanks around your data in the TIFF, but it should be more properly constrained.
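
If you ever want to constrain things by hand instead, --boundary takes a GeoJSON polygon. A minimal sketch of writing one (the coordinates below are placeholders, not from any real dataset; double-check the docs for the exact structure ODM expects):

```python
# Sketch: write a single-polygon GeoJSON file to pass via --boundary.
import json

boundary = {
    "type": "FeatureCollection",
    "features": [{
        "type": "Feature",
        "properties": {},
        "geometry": {
            "type": "Polygon",
            # lon/lat pairs in WGS84; the ring must close on itself
            "coordinates": [[
                [152.100, -27.500],
                [152.110, -27.500],
                [152.110, -27.490],
                [152.100, -27.490],
                [152.100, -27.500],
            ]],
        },
    }],
}

with open("boundary.geojson", "w") as f:
    json.dump(boundary, f)

# Then: ... --boundary boundary.geojson ...
```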

Ooof… My least favorite answer… Depends, really. RAM seems to be the major limiting factor. I’ve only seen this a handful of times, and it was on datasets with very large reconstruction areas on my machine with limited RAM (32GB). That is to say, my understanding of when this strikes is still not very well formed, unfortunately.
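
For a rough sense of scale from the log above: the ortho is 23179P x 12807L, and the overview levels 2/4/8/16 each shrink both dimensions, so together they add roughly a third more pixels on top of the base raster:

```python
# Back-of-envelope overview workload for this dataset's dimensions.
w, h = 23179, 12807
base = w * h                                      # ~297 megapixels
extra = sum((w // f) * (h // f) for f in (2, 4, 8, 16))
print(f"base: {base/1e6:.0f} MP, overviews: +{extra/1e6:.0f} MP "
      f"({100*extra/base:.0f}% more)")
# At 4 bands x 1 byte/px that's roughly 1.2 GB of base raster to read
# and ~0.4 GB of overview data to resample and write, before compression.
```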

I tried a 1900-image dataset before and 64GB of RAM was not enough :joy: I read somewhere that people are trying to make ODM utilize the GPU. I know that’s hard with such a robust repo already. I sincerely wish my coding skills were better, since I have no clue how to make that happen.

ODM will use an appropriate CUDA-compatible NVIDIA GPU if your platform and drivers are ready to go.
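
For the Docker route, my understanding from the docs (worth double-checking) is that the GPU build is a separate image tag and the container needs the GPU passed in explicitly, along these lines:

```python
# Hedged sketch of launching the GPU-enabled ODM image from Python;
# the host path and project name are placeholders.
import subprocess

subprocess.run([
    "docker", "run", "-ti", "--rm",
    "-v", "/my/datasets:/datasets",   # placeholder host path
    "--gpus", "all",                  # requires the NVIDIA container toolkit
    "opendronemap/odm:gpu",           # GPU-enabled image tag
    "--project-path", "/datasets", "project",
])
```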
