I have a dataset of 4,000 images, and my single system has 64 GB of RAM. Is there any way I can split this dataset between multiple systems to process it?
Sure, you might be able to scale horizontally using ClusterODM (that's what it's for), but you might run into issues with the final compositing if the main machine doesn't have enough RAM.
What are the machine sizes you want to scale to?
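For reference, here's a minimal sketch of a ClusterODM setup, assuming Docker is available on every machine and using the project's default ports. The hostnames (`worker1.local`, `worker2.local`) are placeholders; substitute your actual machine addresses:

```shell
# On each worker machine: start a NodeODM processing node
# (NodeODM listens on 3000 inside the container; mapped to 3001 on the host here)
docker run -d -p 3001:3000 opendronemap/nodeodm

# On the main machine: start ClusterODM
# (proxy API on 3000, admin telnet interface on 8080)
docker run -d -p 3000:3000 -p 8080:8080 opendronemap/clusterodm

# Register each worker node through the admin interface
telnet localhost 8080
# > NODE ADD worker1.local 3001
# > NODE ADD worker2.local 3001
# > NODE LIST   (verify all nodes show as online)
```

With the nodes registered, submitting a task with ODM's split-merge options (e.g. `--split` and `--split-overlap`) lets the dataset be divided into submodels that are distributed across the workers; the final merge still happens on one machine, which is where the RAM concern above applies.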
They are all 64 GB RAM systems. So will I run out of memory in the final compositing?
I don't know; that's highly dependent on the Task options, the input image resolution in MP, and what your virtual memory/pagefile/swapfile situation is like on each machine (2-4x the amount of RAM is usually a good start).
My last issue with ODM was due to insufficient RAM, and that was with the swapfile set to 128 GB (2x my RAM).
If you can let it run for a while, bump the swapfile up more. It's just another way to help bolster your chances of success.
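On Linux, adding more swap is straightforward. A sketch, assuming root access; the path `/swapfile2` and the 128 GB size are example values:

```shell
# Create an additional 128 GB swapfile
# (fallocate is fast on most filesystems; fall back to dd if yours doesn't support it)
sudo fallocate -l 128G /swapfile2
sudo chmod 600 /swapfile2    # swap files must not be readable by other users
sudo mkswap /swapfile2       # format the file as swap space
sudo swapon /swapfile2       # enable it immediately
swapon --show                # verify the new file appears in the active swap list
```

To make it persist across reboots, add a line like `/swapfile2 none swap sw 0 0` to `/etc/fstab`. Note that heavy swapping is far slower than RAM, so this improves the odds of completion rather than performance.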
This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.