Maximum Point Cloud Size reached

Welp, new limit hit:

WARNING: on-the-fly merged LAS 1.2 files contain too many points (5036013017) for single LAS 1.2 file.
ERROR: cannot merge 5036013017 points into single LAS 1.2 file. maximum is 4294967295

Any recommendations on how to patch? Does LAS 1.4 give us a higher ceiling?

Edit: looks like 1.4 gives us UINT64 point counts, albeit at the cost of breaking backwards compatibility.
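
For reference, here's a quick way to check where a file stands against the LAS 1.2 ceiling. This is just a sketch using laspy (not something ODM ships; it needs a LAZ backend like lazrs installed, and the path below is a placeholder):

```python
import laspy  # assumption: laspy 2.x with a LAZ backend, e.g. pip install "laspy[lazrs]"

LAS_1_2_MAX_POINTS = 2**32 - 1  # 4294967295, the limit from the error above

# Placeholder path; point it at any LAS/LAZ file of interest.
with laspy.open("/datasets/brighton2/odm_georeferencing/odm_georeferenced_model.laz") as f:
    version = f.header.version
    count = f.header.point_count
    print(f"LAS {version}: {count} points")
    if count > LAS_1_2_MAX_POINTS:
        print("Too many points for a single LAS 1.2 file; needs 1.4 (64-bit point counts).")
```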

Pull request in:

Note we don’t use PDAL to merge point clouds in split-merge, but lastools. https://groups.google.com/g/lastools/c/oTNRrK4VRy8?pli=1
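
A rough sketch of how such a lasmerge call might look from Python (not ODM's actual split-merge code; the glob pattern and output path are placeholders, and the lasmerge path matches the SuperBuild install used later in this thread):

```python
import glob
import subprocess

# Hypothetical project layout; collect the per-submodel georeferenced point clouds.
inputs = sorted(glob.glob(
    "/datasets/project/submodels/*/odm_georeferencing/odm_georeferenced_model.laz"
))

# Merge them into a single LAZ file with lasmerge.
subprocess.run(
    ["/code/SuperBuild/install/bin/lasmerge", "-i", *inputs,
     "-o", "/datasets/project/merged.laz"],
    check=True,
)
```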

New record!

Roger that. Updated the pull request. Feel free to revert the PDAL mod (or not).

You can rely on me for that… :smiley:

Welp, lasmerge doesn’t seem to like that parameter:

/code/SuperBuild/install/bin/lasmerge -i /datasets/brighton2/odm_georeferencing/odm_georeferenced_model.laz -o /datasets/brighton2/out.laz -set_version 1.4

ERROR: cannot understand argument ‘-set_version’

I wonder if we need a newer version of lastools (maybe the one we have pinned is not up to date, see the SuperBuild/CMakeLists.txt); either way, make sure to test the command. :+1:
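
One way to test it defensively, as a sketch: probe the pinned binary first and only rely on the flag if it's understood. The error string is the one lasmerge printed above; the file names and everything else here are assumptions:

```python
import subprocess

LASMERGE = "/code/SuperBuild/install/bin/lasmerge"

# Probe run against a small placeholder file; if the pinned lasmerge doesn't know
# the flag, it prints "cannot understand argument" (as seen above).
probe = subprocess.run(
    [LASMERGE, "-i", "tiny.laz", "-o", "tiny_out.laz", "-set_version", "1.4"],
    capture_output=True, text=True,
)
supports_set_version = "cannot understand argument" not in (probe.stdout + probe.stderr)
print("pinned lasmerge understands -set_version:", supports_set_version)
```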

What!? Untested Sunday night commits don’t work? :slight_smile: It’s on my list to troubleshoot and test today.

Forgive the intrusion, but could PDAL not replace lastools for this part of the workflow?

PDAL is much slower for the merge. We actually had PDAL in there at one point: we've added and removed lastools at least once before. :smiley:
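
For completeness, the PDAL equivalent (the slower route mentioned above) would be something along the lines of the `pdal merge` application command; the file names here are placeholders:

```python
import subprocess

# `pdal merge` concatenates the input point clouds into one output file.
subprocess.run(
    ["pdal", "merge", "submodel_0000.laz", "submodel_0001.laz", "merged.laz"],
    check=True,
)
```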

As far as I can tell, the version we have pinned has no such flag in any form. It seems to inspect the inputs and match the output format to them. So we have a couple of options: make sure the upstream files in each of the submodels are 1.4, update lastools and add the flag, or do both. The advantage of the upstream files being 1.4 is that if, for whatever reason, they too become too large for 1.2, we don't just kick the can down the road (or up the pipeline) to where that portion breaks when someone runs 20k-image submodels (probably me someday, let's be honest).

In the meantime, I've got weeks into processing this data, so I need to convert the upstream data and finish processing with the tool as-is before my node expires and deletes the data. Wheeee! Then I can patch ODM… Or maybe do both simultaneously, but the priority is on the data at the moment. More to come soon…
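
For the conversion step, one route is PDAL's translate command with the LAS writer pinned to minor version 4 (las2las's -set_version switch should also work, if I recall correctly). A sketch with placeholder submodel paths:

```python
import glob
import subprocess

# Rewrite each submodel point cloud as LAS 1.4; the layout below is an assumption.
for src in sorted(glob.glob(
    "/datasets/brighton2/submodels/*/odm_georeferencing/odm_georeferenced_model.laz"
)):
    dst = src.replace(".laz", "_14.laz")
    subprocess.run(
        ["pdal", "translate", src, dst, "--writers.las.minor_version=4"],
        check=True,
    )
```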

Ugh. Somehow NodeODM decided to delete the job, but not before I made a copy of all the point clouds.

Neither the current version of lastools nor the version we have pinned supports that flag. The promise that feeding a bunch of LAS 1.4 files into lasmerge would produce 1.4 output doesn't seem to bear fruit. So I'll thin the cloud using PDAL's sample filter to get it down to ~4 billion points, run it through pc2dem, and call it done.
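
Roughly what I have in mind, sketched with the pdal Python bindings. The radius and paths are guesses I'll have to tune, and at ~5 billion points this will need serious RAM or tiling:

```python
import json
import pdal  # assumption: python-pdal bindings installed alongside PDAL

# Poisson-style thinning with filters.sample; the radius is a placeholder that
# needs tuning until the output lands under the ~4.29 billion point LAS 1.2 limit.
pipeline = pdal.Pipeline(json.dumps([
    "/datasets/brighton2/merged_14.laz",
    {"type": "filters.sample", "radius": 0.05},
    {"type": "writers.las", "filename": "/datasets/brighton2/sampled.laz"},
]))
kept = pipeline.execute()
print(f"wrote {kept} points")
```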
