Incorrect results from the official MicaSense Altum sample data set

Hello,

I’ve downloaded the MicaSense Altum sample data found at Altum Sample Data - MicaSense (the Complete Flight, plus the Orthomosaic for comparison).

In the latest version of WebODM I added both folders from the sample data (000 & 001), set the preset to Multispectral, chose No to resizing images, and ran the project.

The project completed successfully, but I’m seeing the following issues.

  1. NDVI values are incorrect:
    i) From WebODM’s interface > Plant Health, NDVI with the default RGN filter has a MIN/MAX of roughly -0.5 to 0.7, with the bulk of values between ~0.045 and ~0.507 and a histogram peak at ~0.311. That looks incorrect.
    ii) The RGBN, BGRNRe, and RGBNRe filters give a better MIN/MAX range of ~-0.7 to 0.913, with values ranging from -0.3 to 0.88 and a peak at 0.8 (much better, but still not right).
    iii) Downloading the data and computing NDVI in QGIS, I get a MIN/MAX of -0.994 to 0.999 (which is what I should be seeing); most values are between -0.2 and 0.85, with the peak at 0.7.
    iv) Opening the Orthomosaic from MicaSense, the MIN/MAX is -0.999 to 0.999, most values range from ~-0.1 to almost 1, and the peak is at 0.9.

Theoretically, (iv) has the correct values, and given that it comes from MicaSense itself, that means we are getting incorrect values from WebODM.

Why is that? Which part of the radiometric calibration is going wrong? (The QGIS calculation from (iii) is sketched after this list for reference.)

  2. The orthomosaic stitching of the LWIR layer is not working correctly. Is there some specific workflow that needs to take place?

  3. The 3D model + point cloud doesn’t have any texture (the point cloud maybe does, but it’s too dark?). Is that expected?

  4. The raw images are 16-bit but the results are 32-bit. Is this the correct behaviour?
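
For reference, the NDVI I computed outside WebODM in (iii) is just the standard (NIR - red) / (NIR + red) on the orthomosaic bands. A minimal sketch of that calculation with rasterio/numpy, assuming red is band 3 and NIR is band 4 in the exported GeoTIFF (please check the band order of your own output; the filenames are only examples):

```python
import numpy as np
import rasterio

# Assumption: red is band 3 and NIR is band 4 in the exported orthomosaic
# (Altum layers: blue, green, red, NIR, red edge, LWIR). Verify for your output.
with rasterio.open("odm_orthophoto.tif") as src:
    red = src.read(3).astype("float64")
    nir = src.read(4).astype("float64")
    profile = src.profile

# NDVI = (NIR - red) / (NIR + red); mask pixels where the sum is zero (nodata).
with np.errstate(divide="ignore", invalid="ignore"):
    ndvi = np.where((nir + red) > 0, (nir - red) / (nir + red), np.nan)

# Write a single-band float32 NDVI raster next to the orthomosaic.
profile.update(count=1, dtype="float32")
with rasterio.open("ndvi.tif", "w", **profile) as dst:
    dst.write(ndvi.astype("float32"), 1)
```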

Any insight would be much appreciated,

Thank you for your time,
Best!

  1. What did you choose for the radiometric calibration?
    radiometric-calibration — OpenDroneMap 2.7.0 documentation
  2. Usually LWIR/Thermal imagery is incredibly small, often something like 512x384 (or smaller :grimacing:) which can be difficult to reconstruct. What size are yours?
  3. It should be a colorized point cloud. Can you share screenshots?
  4. Yes, I believe it is expected that we recast the bitdepths for TIFFs (you can confirm with the quick check sketched below).
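
If it helps, here is a quick way to confirm the band data types of an output raster (a sketch only; the path is just an example):

```python
import rasterio

# Inputs are uint16, but the rasters we write out are typically float32.
with rasterio.open("odm_orthophoto.tif") as src:
    print(src.count, "bands:", src.dtypes)  # e.g. 6 bands: ('float32', ..., 'float32')
```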

Hello Saijin,

  1. I used the default, which is camera only. Checking with exiftool, the images have no EXIF:HorizontalIrradiance tag (so no DLS data); nevertheless, I tried camera+sun, which gave similar results (if not exactly the same). The quick check I ran is sketched after this list.
  2. 160x120. Other software stitches that successfully (using data from the other bands, but I’m sure you already know that), so there is a way for it to happen. What do we need to do to make WebODM do something similar? All the new multispectral cameras now come with LWIR sensors that are no more than 320x160, so having this feature working will be vital very soon!



4) Good to know!
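
The exiftool check I mentioned in 1) was roughly this (wrapped in Python here; the filename is just an example of the Altum naming scheme, and I only queried the HorizontalIrradiance tag):

```python
import subprocess

# Requires exiftool on the PATH. Prints a fallback message if the DLS tag is absent.
result = subprocess.run(
    ["exiftool", "-s", "-HorizontalIrradiance", "IMG_0000_1.tif"],
    capture_output=True,
    text=True,
)
print(result.stdout.strip() or "No HorizontalIrradiance tag found")
```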

Thanks again for your time,
Best!

  1. I wonder if we are missing some tag or some logic to do the calibration as expected.

  2. Yeah, that is small for our pipeline. I’m not sure how we can address that near-term… We likely need another funding drive to make custom tooling for such imagery.

  3. The model’s texture pages might be too big for PoTree. It likely will render fine in Blender, for instance.

Let me know!

Hello again,

  1. Does ODM use the values from the calibration reflectance panel images? Like here: MicaSense Image Processing Tutorial 1 (a sketch of what I mean follows after this list).

  2. I bought the Windows installer (even though I’m using Linux) and the book (which is totally outdated), trying to help the tiny bit I can. I’m not sure how this process works; do you create an issue on GitHub?

  3. I don’t think it’s the size of the data. I tried a smaller data set (folder 001 from the MicaSense Altum examples); the point cloud gets black and white values as if it’s reading only the red band/channel, and the geometry stays the default grey. Possibly because the data doesn’t have an actual RGB channel?
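
What I mean by using the panel images, roughly following MicaSense Image Processing Tutorial 1 (the method names are as I remember them from that tutorial’s micasense library, so treat this purely as a sketch; the paths and reflectance numbers are placeholders):

```python
import glob
import micasense.capture as capture

# One capture of the reflectance panel, bands 1-5 only
# (the LWIR band is not panel-calibrated). Paths are placeholders.
panel_files = sorted(glob.glob("panel/IMG_0000_[1-5].tif"))
panel_cap = capture.Capture.from_filelist(panel_files)

# Per-band reflectance of the specific panel, from its calibration certificate.
panel_reflectance_by_band = [0.49, 0.49, 0.49, 0.49, 0.49]  # placeholder values
panel_irradiance = panel_cap.panel_irradiance(panel_reflectance_by_band)

# Convert a flight capture to reflectance using the panel-derived irradiance.
flight_files = sorted(glob.glob("000/IMG_0010_[1-5].tif"))
flight_cap = capture.Capture.from_filelist(flight_files)
reflectance_images = flight_cap.undistorted_reflectance(panel_irradiance)
```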

thanks again for your time,
Best!

  1. No, we don’t have automatic calibration from calibration panels. Some folks will pre-process their imagery with a vendor radiometric calibration tool and then process the rest of the way in OpenDroneMap (skipping any further calibration steps).
  2. If either of those are not to your satisfaction, please reach out to [email protected] and we can get your refunds taken care of. If they’re not serving you, you do not need to keep them :slight_smile:
    I think, given the size/complexity of the changes necessary, a larger community funding-drive to pay for a period of dedicated feature development might be necessary. We usually tackle big things like this with fund.webodm.org
  3. I’ve not seen that happen before, even with single-band or thermal imagery. Are you able to provide your all.zip output somewhere so I can test it here? What browser, X/Wayland, Desktop Environment, and display driver are you using?

Hello Saijin,

  1. That is most probably where the issue is!

  2. No, I didn’t mean it like that; getting those two products is the least I can do to help the project! I see, so only if a big funding drive starts can we get a feature like that :frowning:

  3. So you don’t get the same results from processing Altum Sample Data - MicaSense? I’ve tried Linux/Windows, Opera/Firefox, AMD 6900 / NVIDIA 1500ti. Also, in WebODM, to see the point cloud I need to go in and out of the 3D view a couple of times. If it’s indeed on my end, I’ll run a smaller process and send you the data.

Thanks again,
Best!

  1. Sorry we don’t support this calibration workflow! But this type of integrated calibration would be a great thing to add in a funding drive for this platform.
  2. Oh, well thank you for your generosity! It helps, and your support means a lot!
  3. I will process this dataset this week and compare with your screenshots. Thank you for providing that platform information! Helps me a lot.

Hello Saijin,

  1. Without that, all multispectral data is incorrect; people will be reading the wrong values :smiley:
  2. Interested to hear the results.

best!

  1. Yes. In these cases, we recommend that the customer pre-process the data with the Vendor’s calibration workflow and then perform the rest of the reconstruction afterwards.

  2. Soon!

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.