Significant processing time difference

I’m curious why two very similar data sets show such a significant processing time difference on the same NodeODM host. They come from the same location, from mapping runs just one week apart, and differ by only 24 images.


{"uuid":"020a280d-d9e2-464e-9830-eda758563adc","name":"08152020","dateCreated":1597695157382,"processingTime":78645263,"status":{"code":20},"options":[{"name":"split","value":350},{"name":"fast-orthophoto","value":true}],"imagesCount":1687,"progress":36}

{"uuid":"d8fb4fb8-9485-4528-8135-e1d69b335345","name":"08022020","dateCreated":1597634048660,"processingTime":28758021,"status":{"code":40},"options":[{"name":"fast-orthophoto","value":true}],"imagesCount":1663,"progress":100}

Is it because I split the job using split=350?


Yes. When you split a task, some parts of the reconstruction (the edges) are reconstructed multiple times. There’s overlap between the splits so that we can align the submodels, but that means more redundancy, which means more processing time.
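For reference, the task JSON posted above makes the gap concrete. A quick sketch (assuming `processingTime` is reported in milliseconds, which matches NodeODM's task info output; the `hours` helper is just for illustration):

```python
import json

# Relevant fields copied from the two task JSON blobs above
# (options omitted for brevity)
split_task = json.loads(
    '{"name":"08152020","processingTime":78645263,'
    '"imagesCount":1687,"progress":36}')
plain_task = json.loads(
    '{"name":"08022020","processingTime":28758021,'
    '"imagesCount":1663,"progress":100}')

def hours(ms):
    # Convert a millisecond duration to hours
    return ms / 3_600_000

print(f"{plain_task['name']}: {hours(plain_task['processingTime']):.1f} h "
      f"for {plain_task['imagesCount']} images (complete)")
print(f"{split_task['name']}: {hours(split_task['processingTime']):.1f} h "
      f"for {split_task['imagesCount']} images "
      f"(only {split_task['progress']}% done)")
```

So the non-split run finished 1663 images in about 8 hours, while the split run had already spent roughly 22 hours at 36% progress on a nearly identical image set, consistent with the redundancy explanation above.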
