Weird test case: images run with and without resizing

Good evening,

Testing on a dataset with high-resolution images, I stumbled upon a very weird output I did not expect.
I ran a dataset of 93 DJI Mavic 2 camera images (flown with DJI Pilot) several times. Weirdly enough, when I run the dataset with the images as-is (without resizing), the result is much worse than the run with resize 1000. The images' original dimensions are 5286x4228.
NodeODM version: 2.8.4
Sample image:

Test cases data:
Dataset run with no resize:

  • command: run project --resize-to -1 --dsm --dtm --orthophoto-resolution 1 --dem-resolution 1 --pc-classify --smrf-window 0 --smrf-threshold 0

  • output data:

  • report:
    resize_-1.pdf (7.2 MB)

Dataset run with resize 1000:

  • command: run project --resize-to 1000 --dsm --dtm --orthophoto-resolution 1 --dem-resolution 1 --pc-classify --smrf-window 0 --smrf-threshold 0

  • output data:

  • report:
    report.pdf (7.7 MB)

Dataset run with feature-quality lowest:

  • command: run project --feature-quality lowest --dsm --dtm --orthophoto-resolution 1 --dem-resolution 1 --pc-classify --smrf-window 0 --smrf-threshold 0

  • output data:

  • report:
    report.pdf (8.7 MB)

Despite the fact that the unresized dataset has 62,000 detected features, only 630 points are reconstructed, while the resize-to-1000 dataset has only 11,150 features yet 1,038 reconstructed points, leading to much better ortho and DSM TIFFs.

Any advice would be very helpful.
Thanks in advance.

Actually, reading the code of osfm.py at line 179, resize-to -1 is not respected: the resize_to parameter is only used when it is > 0 (if ('resize_to_is_set' in args) and args.resize_to > 0:), otherwise it falls back to the default feature quality, high. Still, my results are erroneous, since feature_process_size ends up as 2673 and yet gives worse results than 1000.
Is this a bug or intended behavior?
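For anyone following along, here is a minimal sketch of that selection logic. The function name and the quality-to-scale fractions are my assumptions for illustration (the thread reports 2673 rather than exactly half of 5286, so the real computation differs slightly); only the > 0 guard is from osfm.py:

```python
# Sketch of how the feature extraction size appears to be chosen.
# ASSUMPTIONS: function name and quality fractions are illustrative,
# not copied from ODM; only the "resize_to > 0" guard is from osfm.py.

def feature_process_size(resize_to, feature_quality, max_image_dim):
    """Return the image dimension used for feature extraction."""
    # --resize-to only takes effect for positive values, so -1 is
    # silently ignored and the feature-quality path is used instead.
    if resize_to is not None and resize_to > 0:
        return resize_to

    # Otherwise the size is a fraction of the largest photo dimension.
    quality_scale = {  # assumed fractions
        'ultra': 1.0,
        'high': 0.5,
        'medium': 0.25,
        'low': 0.125,
        'lowest': 0.0675,
    }
    return int(max_image_dim * quality_scale[feature_quality])

print(feature_process_size(1000, 'high', 5286))  # 1000 (resize wins)
print(feature_process_size(-1, 'high', 5286))    # 2643 (quality path)
```

With these assumed fractions, --resize-to -1 on a 5286 px wide image lands near the feature_process_size of 2673 observed above.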

Sometimes, counter-intuitively, downsampling the data can improve matching.

Also, the behavior of --resize-to not being checked unless set is correct, as the current preferred option is to use --feature-quality.

I’m not seeing as much noise in the data as I’d expect for downsampling to be beneficial, but perhaps it is increasing perceived edge-sharpness and that is helping the matching.

Thanks saijin. --resize-to is set to -1; it is ignored only for the value -1, which seems a little weird.

I'm currently diving into the OpenSfM and OpenCV algorithms. Generally, since num_of_features is constant at 10000, matches may indeed be worse at higher resolutions: 10000 points on a 5000x5000 image are a far smaller fraction of the pixels than 10000 points on a 1000x1000 image. I will keep searching and report back.
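To put rough numbers on that density argument (image dimensions from this thread; the ~800 px short side after resizing is an assumption based on the 5286x4228 aspect ratio):

```python
# Feature density for the same 10,000-feature budget at two resolutions.

def features_per_megapixel(num_features, width, height):
    return num_features / (width * height / 1e6)

full_res = features_per_megapixel(10_000, 5286, 4228)  # original images
resized = features_per_megapixel(10_000, 1000, 800)    # after --resize-to 1000

print(round(full_res, 1))  # 447.4 features per megapixel
print(round(resized, 1))   # 12500.0 features per megapixel
```

So the resized images carry roughly 28x more features per pixel, which is consistent with matching improving after downsampling.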
Best Regards.

That is correct: --resize-to is overridden by the setting for --feature-quality unless you explicitly set a positive numerical value for --resize-to.

If you are only getting 10000 features per image, then I suspect you are using ORB feature matching, which might be a factor here: it concentrates features towards the centre of the images.

My bad, I meant min_num_features; I'm running the default SIFT with FLANN.

OK, I did wonder if you meant that 🙂

Have you tried increasing the min_num_features to say 20000 or more, assuming you aren’t already getting well over 10000?
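For anyone reproducing this, the corresponding command-line flag is --min-num-features, matching the runs earlier in the thread:

```shell
# Same resize-to-1000 run as above, with a larger feature budget.
# --min-num-features is ODM's flag for the min_num_features setting.
run project --resize-to 1000 --min-num-features 20000 \
    --dsm --dtm --orthophoto-resolution 1 --dem-resolution 1 \
    --pc-classify --smrf-window 0 --smrf-threshold 0
```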

Yes, again getting better results with --resize-to 1000 (with min_num_features 20000).
