Improve Processing Time

I have observed that when the dataset is large (say, more than 1.5k or 2k photos), processing time increases significantly.

The Matching Pairs stage in particular takes a significant amount of time.

I came across the discussion OpenSfM Processing Time and learned about use-hybrid-bundle-adjustment.

I have some queries related to this particular parameter:

  1. Can this parameter always be set to True when the number of photos is more than 100? (A sketch of how one might pass it follows this list.)
  2. The OpenDroneMap 2.8.8 documentation for use-hybrid-bundle-adjustment says it "Speeds up reconstruction for very large datasets." Is it possible to quantify "very large"?
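For reference, here is a minimal sketch of how one might pass the flag when running ODM through Docker. The host path and project name are placeholders, and (as the replies below note) recent ODM versions may already enable this behavior by default:

```python
# A sketch only: assumes the standard dockerized ODM invocation; the
# --use-hybrid-bundle-adjustment flag name comes from the docs linked above.
import subprocess

subprocess.run(
    [
        "docker", "run", "--rm",
        "-v", "/path/to/datasets:/datasets",  # placeholder host directory
        "opendronemap/odm",
        "--project-path", "/datasets", "my_project",  # placeholder project name
        "--use-hybrid-bundle-adjustment",  # the parameter in question
    ],
    check=True,
)
```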

What other ways are there to tackle long processing times for large datasets (1.5k or more photos)?


Hybrid is enabled by default. When I’ve looked at processing time for the opensfm stage, portions of it can get exponentially slower, but it is linear overall. Are you seeing exponential increases in time for matching or reconstruction, or something else?
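One way to answer that empirically is to time the stage at a few dataset sizes and fit the growth rate. A minimal sketch, where the (photo count, minutes) pairs are made-up placeholders to be replaced with timings parsed from real logs:

```python
import numpy as np

# Hypothetical measurements; replace with timings from your own runs.
data = np.array([
    [500, 12.0],
    [1000, 26.0],
    [2000, 70.0],
])
n, t = data[:, 0], data[:, 1]

# Fit t ~ c * n**k; the slope of log t against log n estimates the exponent k.
k = np.polyfit(np.log(n), np.log(t), 1)[0]
print(f"estimated scaling exponent: {k:.2f} (1.0 is linear, 2.0 quadratic)")
```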


I find matching time varies from under 0.2 seconds to 15 or more seconds per match across different tasks, but I’m not sure why there is such a range. When it’s near 15 seconds per match, a large dataset can take an inordinate amount of time to process!


As per the documentation, it is not. :thinking:

Yes, when the dataset is relatively small (say, fewer than 1k or 1.5k photos), it works fine. But as the number of photos shoots up, I have seen matching time increase drastically.

I’m fairly certain the documentation is wrong. It became the default about a year ago.

Cool. Do you have any profiling or documentation of this, or even logs from different-sized datasets that we can parse for timestamps?
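If anyone wants to extract those timestamps, here is a rough sketch. It assumes log lines begin with a "YYYY-MM-DD HH:MM:SS" timestamp and that stage boundaries can be spotted by a substring such as "stage"; both are assumptions to adjust against your actual ODM/OpenSfM output:

```python
import re
import sys
from datetime import datetime

# Assumed timestamp format; tweak the regex and strptime format as needed.
TS = re.compile(r"^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})")

def marker_intervals(path, marker="stage"):
    """Yield (log line, elapsed time since the previous marker line)."""
    prev = None
    with open(path, encoding="utf-8", errors="replace") as f:
        for line in f:
            m = TS.match(line)
            if m is None or marker not in line:
                continue
            ts = datetime.strptime(m.group(1), "%Y-%m-%d %H:%M:%S")
            if prev is not None:
                yield line.strip(), ts - prev
            prev = ts

if __name__ == "__main__":
    # Usage: python stage_times.py odm_log.txt
    for line, elapsed in marker_intervals(sys.argv[1]):
        print(elapsed, line)
```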
