Keep hitting SIGKILL(-9) after the initial matching stage

Hi all,
I am new to ODM and the community. Some quick tests gave me good first results, but as I feed more datasets through I quite often trigger the same error, so I am looking for suggestions as to what I am doing wrong.
I have a series of images (stills, or frames from video) from a hand-held camera (GoPro Hero 4), walking through a forest scene. I installed Docker to use ODM, which went pleasingly well.
I find I can run some datasets to completion if I subset to, say, a quarter of the images. Does this imply a memory issue? The error message seems to hint at that. If I increase (say, double) the number of images, I trigger the same error at the same processing stage with everything else unchanged.
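For context, the subsetting is nothing clever; it is roughly along these lines, run as a .bat file (the paths and the keep-every-4th-image rule here are just an example of the approach, not my exact script):

@echo off
rem Sketch: copy every 4th image into a separate subset project folder (paths are examples)
setlocal EnableDelayedExpansion
set SRC=C:\1_work\Camera_0.5s\T1-2-ODM03\images
set DST=C:\1_work\Camera_0.5s\T1-2-ODM03-subset\images
if not exist "%DST%" mkdir "%DST%"
set /a i=0
for /f "delims=" %%F in ('dir /b /a-d "%SRC%\*.JPG"') do (
    rem keep every 4th file, skip the rest
    set /a r=i %% 4
    if !r! EQU 0 copy "%SRC%\%%F" "%DST%" >nul
    set /a i+=1
)
echo Subset written to %DST%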

Below is my command line, and below that an excerpt from the DOS window at failure.

Hopefully this is just a newbie error.
regards,
Dave

docker run -ti --rm ^
-v C:\1_work\Camera_0.5s\T1-2-ODM03:/datasets/code ^
opendronemap/odm --end-with odm_filterpoints --min-num-features 16000 --project-path /datasets --camera-lens fisheye

2019-10-31 22:22:25,739 DEBUG: No segmentation for G0261975.JPG, no features masked.
2019-10-31 22:22:25,807 DEBUG: Matching G0261819.JPG and G0261919.JPG. Matcher: WORDS T-desc: 0.209 T-robust: 0.163 T-total: 0.373 Matches: 164 Robust: 23 Success: True
2019-10-31 22:22:25,850 DEBUG: No segmentation for G0261852.JPG, no features masked.
2019-10-31 22:22:26,033 DEBUG: Matching G0261980.JPG and G0261975.JPG. Matcher: WORDS T-desc: 0.171 T-robust: 0.126 T-total: 0.297 Matches: 224 Robust: 67 Success: True
2019-10-31 22:22:26,074 DEBUG: No segmentation for G0261925.JPG, no features masked.
2019-10-31 22:22:26,738 DEBUG: Matching G0261819.JPG and G0261852.JPG. Matcher: WORDS T-desc: 0.171 T-robust: 0.713 T-total: 0.885 Matches: 148 Robust: 8 Success: False
2019-10-31 22:22:26,787 DEBUG: Matching G0261980.JPG and G0261925.JPG. Matcher: WORDS T-desc: 0.261 T-robust: 0.452 T-total: 0.714 Matches: 155 Robust: 0 Success: False
2019-10-31 22:22:26,892 DEBUG: No segmentation for G0261859.JPG, no features masked.
2019-10-31 22:22:26,897 DEBUG: No segmentation for G0262003.JPG, no features masked.
Traceback (most recent call last):
  File "/code/SuperBuild/src/opensfm/bin/opensfm", line 34, in <module>
    command.run(args)
  File "/code/SuperBuild/src/opensfm/opensfm/commands/match_features.py", line 29, in run
    pairs_matches, preport = matching.match_images(data, images, images, True)
  File "/code/SuperBuild/src/opensfm/opensfm/matching.py", line 65, in match_images
    matches = context.parallel_map(match_unwrap_args, args, processes, jobs_per_process)
  File "/code/SuperBuild/src/opensfm/opensfm/context.py", line 41, in parallel_map
    return Parallel(batch_size=batch_size)(delayed(func)(arg) for arg in args)
  File "/usr/local/lib/python2.7/dist-packages/joblib/parallel.py", line 934, in __call__
    self.retrieve()
  File "/usr/local/lib/python2.7/dist-packages/joblib/parallel.py", line 833, in retrieve
    self._output.extend(job.get(timeout=self.timeout))
  File "/usr/local/lib/python2.7/dist-packages/joblib/_parallel_backends.py", line 521, in wrap_future_result
    return future.result(timeout=timeout)
  File "/usr/local/lib/python2.7/dist-packages/joblib/externals/loky/_base.py", line 433, in result
    return self.__get_result()
  File "/usr/local/lib/python2.7/dist-packages/joblib/externals/loky/_base.py", line 381, in __get_result
    raise self._exception
joblib.externals.loky.process_executor.TerminatedWorkerError: A worker process managed by the executor was unexpectedly terminated. This could be caused by a segmentation fault while calling the function or by an excessive memory usage causing the Operating System to kill the worker. The exit codes of the workers are {SIGKILL(-9)}
Traceback (most recent call last):
  File "/code/run.py", line 57, in <module>
    app.execute()
  File "/code/stages/odm_app.py", line 92, in execute
    self.first_stage.run()
  File "/code/opendm/types.py", line 370, in run
    self.next_stage.run(outputs)
  File "/code/opendm/types.py", line 370, in run
    self.next_stage.run(outputs)
  File "/code/opendm/types.py", line 370, in run
    self.next_stage.run(outputs)
  File "/code/opendm/types.py", line 351, in run
    self.process(self.args, outputs)
  File "/code/stages/run_opensfm.py", line 27, in process
    octx.feature_matching(self.rerun())
  File "/code/opendm/osfm.py", line 176, in feature_matching
    self.run('match_features')
  File "/code/opendm/osfm.py", line 21, in run
    (context.opensfm_path, command, self.opensfm_project_path))
  File "/code/opendm/system.py", line 76, in run
    raise Exception("Child returned {}".format(retcode))
Exception: Child returned 1

Most likely a shortage of RAM:

"…A worker process managed by the executor was unexpectedly terminated. This could be caused by a segmentation fault while calling the function or by an excessive memory usage causing the Operating System to kill the worker. The exit codes of the workers are {SIGKILL(-9)}…"
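If you want to confirm it, two standard Docker commands are worth a look (nothing ODM-specific here): docker info reports how much memory the Docker Desktop VM actually has, which is typically capped well below the host total unless it has been raised in Docker's settings, and docker stats shows the container's live memory use while the matching stage runs.

rem How much memory does the Docker VM actually have? (run in a normal command prompt)
docker info | findstr /i /c:"Total Memory"
rem While ODM is running, watch the container's memory use against its limit
docker stats

If that Total Memory figure is far below your 32 GB, raising it in Docker Desktop's settings is the first thing to try.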

Hi rumenchoo,
thanks for the confirmation.
What options do I have to get past this on my PC?

The machine is Windows 10 Enterprise, CPU i7-8700K 3.70GHz, RAM 32GB, Graphics NVIDIA GeForce GTX 1080 with memory 8GB.

Are there processing options to reduce the memory footprint on a PC for the dense cloud generation phase? Is WebODM a way to scale up?
Any other suggestions? Ultimately I would like to process many high-resolution images, so I realise I am pushing limits, but I am interested to hear of strategies for achieving this when local resources are insufficient with the default approach.

The results I get with ODM from smaller pieces of the scene are so impressively good that they just make me want more!

regards
Dave

You can resize the photos (see the example below).
You can check ClusterODM.
Or https://webodm.net/.
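For example, a resized run could look something like this (just a sketch: --resize-to and --min-num-features are existing ODM options, but the 1440 px and 8000 values are placeholders that trade detail for memory, not tested recommendations):

docker run -ti --rm ^
-v C:\1_work\Camera_0.5s\T1-2-ODM03:/datasets/code ^
opendronemap/odm --end-with odm_filterpoints --resize-to 1440 --min-num-features 8000 --project-path /datasets --camera-lens fisheye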