Upload issue with large dataset

I’ve been struggling for six hours now to start a job to process 950 images. Some of these images are pretty big, nearly 30MB each - all JPGs. I was originally triggering the job via the WebODM API, but when that didn’t work I used the WebODM interface directly to see what was wrong. The upload seems to get to about 80% each time and then dies. I’m on a local network, so speeds are as fast as the disks will go (not exactly blazing with Ceph, but 60MB/s+). I’ve also changed the order the files are uploaded in by reshuffling the list.

Sometimes during upload, the upload resets itself to 0 and starts again.

It’s not clear if this is a timeout issue or something else. I’ve timed it and it dies at about 7 minutes 20 seconds, which would be a really weird timeout value. There’s nothing in the nginx error log or in syslog. This is a native server install.

Anybody else seen this issue before?

Error from WebODM:
950 files cannot be uploaded. As a reminder, only images (.jpg, .png) and GCP files (.txt) can be uploaded. Try again.

They’re all JPGs, though some are .jpg and some are .JPG - not entirely sure why; they were supplied to me that way. I’ve md5sum’d them all and even manually looked through them all - there doesn’t seem to be any corruption. Though if there were, I’d expect it to die at exactly the same point every time.
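If you want to rule the .jpg/.JPG mix out as a cause, here’s a quick sketch that lowercases extensions (the sample filename and the `images` directory are made up; the in-place rename is left commented out):

```python
from pathlib import Path

def lowercase_extension(name: str) -> str:
    """Return the filename with its extension lowercased (.JPG -> .jpg)."""
    p = Path(name)
    return str(p.with_suffix(p.suffix.lower()))

# To actually rename on disk, something like:
# for p in Path("images").glob("*.JPG"):
#     p.rename(p.with_suffix(".jpg"))

print(lowercase_extension("DJI_0001.JPG"))  # DJI_0001.jpg
```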

CPU load is low and RAM is fine on both the server and the client. This server doesn’t do any of the processing itself; it’s just the broker, so it’s hardly used:

[email protected]:~# free -m
              total        used        free      shared  buff/cache   available
Mem:          88509         804       80516          28        7188       87161
Swap:           975           0         975
[email protected]:~#

First time I’ve seen it, but it could be a timeout problem, or some other limit in browsers. Currently WebODM doesn’t have a “chunked upload” option as NodeODM does, so the request is sent all at once. 950 × 30MB ≈ 28GB! Pretty sure that’s a bit much for even the best browsers.
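For reference, NodeODM’s chunked flow splits task creation across /task/new/init, /task/new/upload/&lt;uuid&gt; and /task/new/commit/&lt;uuid&gt;, so a client can send images in several smaller requests. A client-side sketch of grouping files so each request stays under a size cap (the 100MB cap and the file sizes below are arbitrary):

```python
def batch_by_size(files, limit_bytes):
    """Greedily group (name, size_bytes) pairs into batches under limit_bytes.

    Each batch would become one upload request; a single file larger than
    the limit still gets a batch of its own.
    """
    batches, current, current_size = [], [], 0
    for name, size in files:
        if current and current_size + size > limit_bytes:
            batches.append(current)
            current, current_size = [], 0
        current.append(name)
        current_size += size
    if current:
        batches.append(current)
    return batches

MB = 1024 ** 2
files = [("a.jpg", 60 * MB), ("b.jpg", 60 * MB), ("c.jpg", 30 * MB)]
print(batch_by_size(files, 100 * MB))  # [['a.jpg'], ['b.jpg', 'c.jpg']]
```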

I’ve opened a feature request for that: [Feature] Add Chunked Uploads in WebODM (similar to NodeODM) · Issue #658 · OpenDroneMap/WebODM · GitHub

In the meantime, either compress the JPGs or resize them.

Only some are 30MB; the total is about 12GB. The API call is done with curl, too.

Can I push jobs straight into NodeODM and have them show up in WebODM? The API docs are a little light on how the pieces glue together.

Yes, you can upload the images directly to NodeODM (make sure to check the “generate 2D and potree point cloud tiles” option if you use the NodeODM UI), then import the results into WebODM using the import functionality (all.zip).

This can all be scripted too, although I don’t have references. Watch what the browser does with DevTools.
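As a starting point for such a script, a sketch of assembling the non-file form fields for NodeODM’s POST /task/new (the “options” field is a JSON array of name/value pairs per the NodeODM API; the task name and the example option here are placeholders - list the real option names via GET /options on your node):

```python
import json

def build_task_fields(task_name, option_pairs):
    """Build the non-file form fields for NodeODM's POST /task/new.

    The images themselves go into the multipart body as "images" file
    parts; "options" is a JSON array of {"name": ..., "value": ...}
    objects.
    """
    return {
        "name": task_name,
        "options": json.dumps(
            [{"name": n, "value": v} for n, v in option_pairs]
        ),
    }

# Example (option name is a placeholder):
fields = build_task_fields("survey-demo", [("fast-orthophoto", True)])
print(fields["options"])
```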

Might be worth trying the upload with other browsers too. They tend to have different limits.

Plus one to that. I had timeout issues with largish sets (900 × 20MB) in Safari and Edge, but it was OK in Chrome.

I’ve tried Chrome, Firefox, and curl, but so far no dice. I ran curl manually and can at least see that the error is server-side; however, I don’t see anything useful in the logs:

<html>
<head><title>500 Internal Server Error</title></head>
<body bgcolor="white">
<center><h1>500 Internal Server Error</h1></center>
<hr><center>nginx/1.10.3 (Ubuntu)</center>
</body>
</html>

Detailed errors might be available in /tmp/nginx.error.log.
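For what it’s worth, a few nginx directives that commonly bite on huge uploads. The values below are illustrative, not recommendations, and whether the proxy directives apply depends on how the native install forwards requests to the app server:

```nginx
# Maximum allowed request body; 0 disables the check entirely.
# When exceeded, nginx returns 413, not 500:
client_max_body_size 0;

# How long nginx waits on the upstream app before giving up (504 on expiry):
proxy_read_timeout 600s;

# Stream the request body to the upstream instead of buffering it to disk:
proxy_request_buffering off;
```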