First, I’d like to thank those who have contributed to this project! I’m having a lot of fun learning how to use WebODM and get the most out of it.
Background: my company is starting a drone department whose main goal is mapping cell towers. I’m really only interested in generating a point cloud (no orthophotos, meshes, or DTMs/DSMs) from the drone images. We’ve demoed Pix4Dmapper and Pix4Dinspect, and while I really like the results and fast processing times of Pix4Dinspect, I’m not sure it will be cost-effective. So I’ve been playing around with running WebODM on a Google Cloud instance, following the instructions at the end of the README file.
The issue: I was able to successfully process a few sets of images (200-300 images at 20 MP each); however, when I opened the Diagnostic menu while the jobs were processing, I noticed that only a small portion of the RAM I provisioned was being used. I think it maxed out at about 8 GB out of 128 GB. I don’t have much experience with Linux, VMs, or Docker, so I’m at a loss as to how to troubleshoot this.
1) I set up a Google Cloud VM using the “e2-highmem-16” machine type (16 vCPUs, 128 GB RAM) and attached a 100 GB SSD volume (see the sketch after this list).
2) I followed the instructions in the README for installing WebODM on a Google Cloud VM instance.
3) I uploaded my images and started processing.
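For reference, this is roughly what my setup looked like. The instance name, zone, and Ubuntu image are just placeholders for whatever I picked (and I created the VM through the Console, so the `gcloud` command is only an approximation); the WebODM commands are the ones from the README:

```
# Approximate equivalent of the VM I created (name/zone/image are placeholders)
gcloud compute instances create webodm-test \
  --machine-type=e2-highmem-16 \
  --zone=us-central1-a \
  --boot-disk-size=100GB \
  --boot-disk-type=pd-ssd \
  --image-family=ubuntu-2004-lts \
  --image-project=ubuntu-os-cloud

# On the VM, after installing Docker: clone and start WebODM per the README
git clone https://github.com/OpenDroneMap/WebODM --config core.autocrlf=input --depth 1
cd WebODM
./webodm.sh start
```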
Things I checked/tried:
- set “max-concurrency” to 16 in the WebODM web interface before starting processing
- ran “docker stats” to view memory usage and limits: memory usage hovered between 1 and 4 GB, peaking at around 8 GB, while the reported memory limit was 125.8 GB (commands below)
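In case it helps, this is more or less how I was checking. The container name in the second command is just an example from my machine (whatever `docker ps` lists for the processing node on your setup may differ):

```
# One-shot snapshot of CPU and memory usage per container
docker stats --no-stream

# Check whether a hard memory limit is set on the processing-node container
# (0 means no explicit limit; container name below is an example from my setup)
docker inspect -f '{{.HostConfig.Memory}}' webodm_node-odm_1
```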
Any ideas what could be limiting the amount of RAM being used?
Thanks in advance!