Memory experience


#1

Hello,

I’m testing WebODM on different operating systems and on different hardware.
My drone is a Phantom 4 Pro.

1st configuration:
My local test machine runs Windows 10 Pro with an Intel i7 and 8 GB RAM. WebODM installed fine via the WebODM installer and runs well, but with only 6 GB of memory allocated to Docker I can only process small projects.
If, for example, I want to process a flight with 80 images, the memory is insufficient.

1st question: How much RAM do I need in a Windows environment to process up to 150 images?

2nd configuration:
Linux server with Ubuntu 16.04 as a VPS with 16 GB RAM and 6 cores. WebODM installed by hand; everything works fine.
Here, too, the RAM seems to be too tight: WebODM aborts with 88 images.

2nd question: Is there a fixed memory limit for Docker in a Linux installation?

docker info returns the following result:
Containers: 5
Running: 5
Paused: 0
Stopped: 0
Images: 4
Server Version: 18.06.1-ce
Storage Driver: overlay2
Backing Filesystem: extfs
Supports d_type: true
Native Overlay Diff: true
Logging Driver: json-file
Cgroup Driver: cgroupfs
Plugins:
Volume: local
Network: bridge host macvlan null overlay
Log: awslogs fluentd gcplogs gelf journald json-file logentries splunk syslog
Swarm: inactive
Runtimes: runc
Default Runtime: runc
Init Binary: docker-init
containerd version: 468a545b9edcd5932818eb9de8e72413e616e86e
runc version: 69663f0bd4b60df09991c08812a60108003fa340
init version: fec3683
Security Options:
apparmor
seccomp
Profile: default
Kernel Version: 4.4.0-135-generic
Operating System: Ubuntu 16.04.5 LTS
OSType: linux
Architecture: x86_64
CPUs: 4
Total Memory: 11.73GiB
Name: vmd28129.contaboserver.net
ID: QGQ4:5427:JW7P:BQCW:ATQF:K4VF:2GJ7:MQX4:XS57:7POX:PMXF:C5NK
Docker Root Dir: /var/lib/docker
Debug Mode (client): false
Debug Mode (server): false
Registry: https://index.docker.io/v1/
Labels:
Experimental: false
Insecure Registries:
127.0.0.0/8
Live Restore Enabled: false

WARNING: No swap limit support
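Side note on that last warning: on Ubuntu the kernel boots without swap accounting by default, so Docker cannot enforce swap limits. A sketch of the standard fix (requires root and a reboot; back up /etc/default/grub first):

```shell
# Enable memory cgroup swap accounting so Docker can enforce swap limits.
# Add these parameters to GRUB_CMDLINE_LINUX in /etc/default/grub:
#   GRUB_CMDLINE_LINUX="cgroup_enable=memory swapaccount=1"
# Then regenerate the GRUB config and reboot:
sudo update-grub
sudo reboot
```

After the reboot, `docker info` should no longer print the "No swap limit support" warning.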

Thanks Sven


#2

Hi

I don't know the answers to all your questions, but hopefully I can help you with some of them.

I have been testing with a dataset of around 100 images (about 500 MB in total), and 20 GB of RAM was not always enough, depending on the settings.
To avoid memory errors with my dataset, I have had better results running ODM standalone as a Docker image than running it through WebODM.

If you want to see the memory usage of your Docker containers, you can run "docker container stats" (or the shorter "docker stats" — note it's "container", not "containers").
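For example (a sketch; the 16g cap is a placeholder value, and the alpine container is just used here to read back the limit — the cgroup path shown is the cgroup v1 layout used on Ubuntu 16.04):

```shell
# One-shot snapshot of per-container CPU and memory usage (no live refresh):
docker stats --no-stream

# On Linux there is no fixed Docker memory limit by default: a container may use
# all host RAM unless you cap it with -m. This run caps the container at 16 GB
# and prints the limit the kernel actually applied, in bytes:
docker run --rm -m 16g alpine cat /sys/fs/cgroup/memory/memory.limit_in_bytes
```

So for a standalone ODM run you would add `-m` to the `docker run` command; without it, an out-of-memory process is killed only when the whole host runs out.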


#3

Hello,
Thank you for your answer. In the meantime I have made progress and am now testing in the following environment, which looks promising:
Operating System: Ubuntu 16.04.5 LTS
Architecture: x86_64
CPUs: 6
Total memory for Docker: 23.55 GB

204 files totaling 1.6 GB were processed in 3:25 hours (processing node: Auto, default options, no resize).

I will now carry out further systematic tests and report back.
Greetings, Sven


#4

Hi

These are some of my memory measurements with ODM and a small dataset on a 16-way server with 50 GB RAM:

Dataset  Images  Pix  Size  Time      CPU count  MaxMem (GB)  Settings
house    75      12   415M  18:00.67  30         8            default
house    75      12   415M  11:26.37  30         7            --fast-orthophoto
house    75      12   415M  1:01:00   30         19.5         --min-num-features 15000 --matcher-neighbors 12 --texturing-data-term area --mesh-octree-depth 12 --skip-3dmodel
house    75      12   415M  1:01:00   30         18           --min-num-features 20000 --matcher-neighbors 12 --texturing-data-term area --mesh-octree-depth 12 --skip-3dmodel
house    30      12   194M  4:19.00   30         5            --fast-orthophoto
house    30      12   194M  3:58      16         3.5          --fast-orthophoto
house    30      12   194M  7:06.00   16         3.7          default
house    75      12   415M  9:21      16         6.2          --fast-orthophoto
house    75      12   415M  16:50.00  16         6.2          default
house    75      12   415M  48:28.00  16         19           --min-num-features 15000 --matcher-neighbors 12 --texturing-data-term area --mesh-octree-depth 12 --skip-3dmodel
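For rough planning, the peak-memory figures above can be extrapolated linearly to other image counts. This is only a rule of thumb derived from these measurements (75 images peaked at ~6.2 GB with defaults and ~19 GB with the heavy feature settings); ODM does not guarantee linear scaling:

```shell
# Rough peak-RAM estimate for a given image count, extrapolated linearly from
# the table: 75 images -> 6.2 GB (defaults) and 19 GB (heavy settings).
# Linear scaling is an assumption, not a guarantee.
images=150
awk -v n="$images" 'BEGIN { printf "default: ~%.1f GB, heavy: ~%.1f GB\n", n*6.2/75, n*19/75 }'
```

For the 150-image question in post #1, this suggests allocating roughly 12-13 GB to Docker for default settings, and considerably more for heavy feature/matching options.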

#5

This is incredibly useful. Thanks for sharing this.