I tried to contribute to the datasets, but the upload limit via GitHub is pretty low: 25 MB per file. I managed to push a contribution from the command line, but only with a small point cloud (30 MB). With a "regular" one (300 MB), even the command line rejected it, asking me to use Git LFS.
Do we plan to limit ourselves to small datasets? Is there a guideline I missed?
Yes, we’re going to limit ourselves to smaller datasets, at least initially (in other words, less dense point clouds). PDAL’s sample filter (filters.sample on pdal.io) can be used to reduce the density of an existing point cloud; a radius in the range of 0.02–0.05 meters should do it.
We’re still refining the instructions; I’ll add a note about this.
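For reference, a minimal PDAL pipeline using filters.sample might look like this (the input and output file names are placeholders; the 0.05 m radius matches the upper end of the range suggested above):

```json
[
    "dense_cloud.laz",
    {
        "type": "filters.sample",
        "radius": 0.05
    },
    "thinned_cloud.laz"
]
```

Saved as e.g. `thin.json`, it can be run with `pdal pipeline thin.json`.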
No problem, thank you for the answer.
For professional use we usually subsample the ODM output to 0.1 meters. I’ll do an intermediate subsampling to 0.05 meters for classification.
(We have an intern beginning next week; if he manages to finish the first batch of tasks we have for him quickly, I’ll certainly ask him to classify some clouds and open pull requests against the dataset.)