Comment/Description:
This dataset is quite large and requires a decent machine to process. It was taken using a DJI Matrice 100 with an X3 NIR camera gimbal. OpenDroneMap really hates this imagery presumably because of the lack of color variation outside of the red band. So far I have been unable to process this dataset even with a min-num-features of 500000. Give it a shot and see what happens.
This flight was done manually without the help of an autopilot. I cannot guarantee overlap values of any sort.
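For anyone who wants to give it a shot, an ODM run with the feature count raised looks roughly like this (a sketch assuming the Docker image; the dataset path and project name are placeholders, and only --min-num-features is changed from the defaults):

```bash
# Sketch of an ODM run with an increased feature count.
# /path/to/datasets/nir_flight/images/ is assumed to hold the photos;
# adjust the mount path and project name to your setup.
docker run -ti --rm \
  -v /path/to/datasets:/datasets \
  opendronemap/odm \
  --project-path /datasets nir_flight \
  --min-num-features 500000
```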
This is great @RyanF! We are actively looking for NIR datasets, as multispectral support is high on the priority list, so thank you for sharing it!
This is great. I am a student at Purdue working on finishing my degree in Unmanned Aerial Systems, so I have access to some of the fun cameras. I’ll discuss with my research team the possibility of sharing a bunch of imagery from the various platforms currently in our fleet.
Some of our systems:
- C-Astral Bramor PPX
  - PPK-enabled fixed-wing aerial platform with a Sony A8000 RGB
  - MicaSense Altum multispectral sensor can be interchanged with the A8000
- DJI Matrice 600 Pro
  - DJI X5 RGB gimbal
  - Will likely be carrying the Altum in future flights
- Yuneec Typhoon H
  - Standard RGB sensor
- My personal DJI Matrice 100
  - Came from the DJI “Smart Farmers bundle”
  - Carries either an X3 RGB or X3 NIR gimbal sensor
  - Will also be carrying a MicaSense RedEdge multispectral camera at the same time as the X3 RGB camera
We have a pretty decent fleet and a large project coming up that will require us to gather a lot of data with all of our sensors and platforms. Our initial tests are showing that the Altum is kind of useless, though, because it can’t trigger fast enough and has an extremely narrow field of view. It only really works if you stick it right at that 400 ft limit and don’t deviate from it.
I just wanted to make a small update on the sharing of datasets we gather during our research. I have received permission to share any and all datasets we capture in the field using any of our sensors. Over the coming weeks and months you can expect more datasets to appear as we conduct more flights. They will end up in a Google Drive or other cloud account where anyone can download them for their own experimentation so long as credit is given to the Purdue University Unmanned Aerial Systems Program.
Yes, I really should be flying auto, but Pix4D Capture wasn’t cooperating that day. Given the choice between missing an opportunity for data collection and getting some weird flight paths, I chose the latter. Some of the manual flights I’ve done like this have come out just fine in the past, so it should at least get us something. I think I have a dataset from that location that was flown in auto, but I’ll have to check.
@RyanF - I’ve managed to process a subset of your images, chosen by hand/eye, using the master branch of ODM as of April 23rd. I took out images that were obviously off-nadir or badly oriented and otherwise used the default settings.
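As an aside, if anyone wants to script that kind of triage instead of doing it by eye, one option (assuming the DJI XMP gimbal tags are present in these images, which I haven’t verified) is to read the gimbal pitch with exiftool and keep only the near-nadir shots:

```bash
# Dump gimbal pitch for every image (DJI writes GimbalPitchDegree in XMP;
# -90 means the camera is pointing straight down). images/ is a placeholder.
exiftool -csv -GimbalPitchDegree -r images/ > gimbal_pitch.csv

# Keep the header plus any image within 5 degrees of nadir.
awk -F, 'NR==1 || ($2+0 <= -85 && $2+0 >= -95)' gimbal_pitch.csv
```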
Here’s the hillshaded DSM, and the list of files used is below. It’s a good start, I think?
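If you want to reproduce the hillshade from your own run, one way (assuming a DSM was generated, e.g. by running with --dsm so that odm_dem/dsm.tif exists) is GDAL’s gdaldem tool; the project/ prefix below is a placeholder for your project directory:

```bash
# Render a hillshade from the DSM that ODM writes to odm_dem/dsm.tif.
gdaldem hillshade \
  project/odm_dem/dsm.tif \
  project/odm_dem/dsm_hillshade.tif \
  -z 1.0 -az 315 -alt 45 -compute_edges
```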
That is definitely looking better than my earlier processing attempts. I haven’t reattempted processing the dataset since I originally posted it, so at least we now know it can be used to create something.
This really wasn’t a good flight, in my opinion, since I was doing everything manually. In the future my datasets will be a lot cleaner and shouldn’t need this kind of image culling.