Problems with very large images (100MB+)

I found this thread regarding processing of large images, but it does not cover my case.

I have issues processing images that are about 150MB each. I am getting a PIL warning (not a crash):

```
/usr/local/lib/python2.7/dist-packages/PIL/ DecompressionBombWarning: Image size (102060000 pixels) exceeds limit of 89478485 pixels, could be decompression bomb DOS attack.
```
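For what it's worth, the 89478485 figure is not arbitrary: it appears to be Pillow's default `Image.MAX_IMAGE_PIXELS`, which (as far as I can tell from the Pillow source) is the pixel count of a 1 GiB buffer of 32-bit pixels, divided by 3 as a safety margin. A small sketch showing where the number comes from and why my images trip it (the pixel counts are the ones from the warning; the commented lines show the standard Pillow knob for relaxing the check, which should only be done for trusted images):

```python
# Pillow's default decompression-bomb threshold, reconstructed:
# 1 GiB of 4-byte pixels, divided by 3.
DEFAULT_MAX_IMAGE_PIXELS = int(1024 * 1024 * 1024 // 4 // 3)

my_image_pixels = 102060000  # pixel count reported in the warning above

print(DEFAULT_MAX_IMAGE_PIXELS)                     # 89478485, matching the warning
print(my_image_pixels > DEFAULT_MAX_IMAGE_PIXELS)   # True, so the warning fires

# If the images are trusted (e.g. your own UAV captures), Pillow lets you
# raise or disable the check before opening them:
#   from PIL import Image
#   Image.MAX_IMAGE_PIXELS = None        # disable the check entirely
#   Image.MAX_IMAGE_PIXELS = 200000000   # or raise it to a known ceiling
```

So the warning is purely a safety check on pixel count, not a hard limit on file size.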

Then I got several crashes with:

```
File "/code/SuperBuild/src/opensfm/opensfm/", line 98, in extract_exif_from_file
    exif_data = EXIF(fileobj)

File "/code/SuperBuild/src/opensfm/opensfm/", line 105, in unescape_string
    return decode(encode(s, 'latin-1', 'backslashreplace'), 'unicode-escape')
UnicodeDecodeError: 'ascii' codec can't decode byte 0x8e in position 83: ordinal not in range(128)
```
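If I read the traceback right, the failure is not about image size at all: a string field in that image's metadata contains a non-ASCII byte (0x8e), and on Python 2 calling `encode()` on a byte string first decodes it implicitly as ASCII, which is what blows up. A minimal Python 3 reproduction of the underlying failure (the field content here is made up for illustration):

```python
# A hypothetical EXIF string value containing the raw byte 0x8e,
# as in the reported error ("can't decode byte 0x8e").
raw = b"Camera description \x8e with a non-ASCII byte"

# Decoding it as ASCII fails just like the traceback above:
try:
    raw.decode("ascii")
except UnicodeDecodeError as exc:
    print(exc)

# Decoding as latin-1 can never fail, since every byte 0x00-0xFF maps
# to a code point; presumably that is why opensfm's unescape_string
# goes through 'latin-1' explicitly.
text = raw.decode("latin-1")
print(text)
```

So the crash should only depend on the bytes in one metadata field of one image, which would explain why dozens of other images processed fine.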

which points to problems extracting the EXIF. But extraction was fine for dozens of images in the same dataset, and then on what seemed like just another image, I got the crash. The EXIF of the image right before the crash and of the image at the crash appear identical except for slight differences in GPS coordinates and timestamps. No field is missing.
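Since only one image in the set trips the error, one way to find the culprit field is to scan each raw metadata value for non-ASCII bytes. A rough diagnostic sketch (`find_non_ascii` is my own helper, not part of opensfm or PIL; you would feed it the raw byte values dumped from the problem image, e.g. with exiftool):

```python
def find_non_ascii(raw):
    """Return (position, byte value) of the first non-ASCII byte in a
    bytes object, or None if the value is pure ASCII."""
    for i, b in enumerate(raw):
        if b > 0x7F:
            return i, b
    return None

# Example: a fake EXIF value with a stray 0x8e, like the one in the crash.
print(find_non_ascii(b"DJI\x8ePhantom"))  # -> (3, 142), i.e. byte 0x8e
print(find_non_ascii(b"plain ascii"))     # -> None
```

A value that looks the same in a viewer can still differ at the byte level, so visually identical EXIF between the good and bad image does not rule this out.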

Is there any fundamental limit to image file size?

The images here were not push-broom but rather the “usual” kind, captured with a UAV.

I saw a warning for one image:
[WARNING] P0000383.jpg has malformed XMP XML (but we fixed it)

and then, a few lines down:
```
2020-08-07 10:35:09,200 INFO: Extracting EXIF for P0000383.jpg
Traceback (most recent call last):
  File "/code/SuperBuild/src/opensfm/bin/opensfm", line 34, in <module>
  File "/code/SuperBuild/src/opensfm/opensfm/commands/", line 35, in run
    d = self._extract_exif(image, data)
```