Can I create an orthophoto without location data?

I’m trying to create an orthophoto in WebODM from some historical imagery downloaded from the ‘RetroLens’ website (http://retrolens.nz/) so that I can analyse changes in the character of a river. The photos I’m using at this stage are here: (https://drive.google.com/drive/folders/1iih1j9qzs8coVkcnszb8BuNlVkAd85a8?usp=sharing).

The images have no location data, just a CSV file accompanying each image with an altitude of ‘12000’ and blank cells for latitude and longitude.
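A quick way to confirm the coordinate cells really are empty is a short script over the CSVs. A minimal sketch (the column names and file contents below are assumptions for illustration, not RetroLens’s actual headers):

```python
import csv
import io

# Sample row mimicking the RetroLens-style CSV: altitude present,
# latitude/longitude blank (column names are hypothetical).
sample = """image,lat,long,altitude
SN123_crop1.jpg,,,12000
"""

missing = []
for row in csv.DictReader(io.StringIO(sample)):
    # A blank cell parses as an empty string.
    if not row["lat"].strip() or not row["long"].strip():
        missing.append(row["image"])

print(missing)  # images with no usable coordinates
```

With a real dataset you would open each CSV file instead of the inline string; any image listed in `missing` has nothing WebODM could georeference from.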

I don’t need the image to be georeferenced yet - I’m hoping I can stitch the photos together and then georeference/rubber-sheet the final product to some GCPs (road corners, etc.). I was hoping to avoid adding a GCP for every historical image, as it’s quite difficult to place them.

I’ve attempted to import the images into WebODM and run a task at the default settings. I’ve also tried setting matcher-neighbors and matcher-distance to 0 to disable any attempt to use EXIF data, but I still get the message “An orthophoto could not be generated. To generate one, make sure GPS information is embedded in the EXIF tags of your images, or use a Ground Control Points (GCP) file”.

I would paste the entire task output, but it’s truncated at 500 lines (there must be a way around this?). Hopefully my question can be answered without it, although I appreciate it could help.

I’m using WebODM through Docker/Google Chrome on a 2012 MacBook Pro. I’ve made numerous successful orthophotos on my machine using imagery from a DJI Inspire 2.

Surely there is a way to create a non-georeferenced orthophoto? Any help would be greatly appreciated!!

Hey @Tom, you don’t need to add GCPs for each image, usually 5 GCPs with 3 images per GCP (15 entries total) should suffice.
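For reference, an ODM GCP file is one header line naming the coordinate system, followed by one line per (GCP, image) observation: `geo_x geo_y geo_z im_x im_y image_name` - so 5 GCPs seen in 3 images each gives 15 entry lines. A minimal sketch that assembles such a file (every coordinate, CRS choice, and filename below is a made-up placeholder):

```python
# Build a minimal ODM-style gcp_list.txt: header = coordinate system,
# then one line per observation: geo_x geo_y geo_z im_x im_y image_name.
# All values here are hypothetical placeholders.
entries = [
    # (easting, northing, elevation, pixel_x, pixel_y, image filename)
    (1570000.0, 5430000.0, 10.0, 1024, 768, "scan_01.jpg"),
    (1570000.0, 5430000.0, 10.0, 2048, 512, "scan_02.jpg"),
    (1570250.0, 5430100.0, 12.0, 900, 1400, "scan_02.jpg"),
]

lines = ["EPSG:2193"]  # e.g. NZTM2000; use whatever CRS your GCPs are in
for e, n, z, px, py, name in entries:
    lines.append(f"{e} {n} {z} {px} {py} {name}")

print("\n".join(lines))
```

The same physical GCP simply appears on several lines, once per image it is visible in.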

While it’s possible to render a non-georeferenced orthophoto with some code changes, or by invoking the odm_orthophoto program directly, ODM (and thus WebODM) doesn’t currently support it.


Amazing - thanks @pierotofy. I’ll give it a go today!! I knew it must be relatively straightforward.


I’d also recommend passing --fast-orthophoto if you are processing historical images. A possibly related read: https://www.opendronemap.org/2019/05/stitching-historical-aerial-images-from-1940-using-opendronemap/


I had a go with this yesterday and managed to create a comprehensive set of GCPs covering the area I’m trying to stitch. I then ran the process with:
Options: fast-orthophoto: true, skip-3dmodel: true, rerun-from: dataset

Unfortunately the process has stalled at:
[INFO] running pdal pipeline -i /tmp/tmpVAmXg3.json > /dev/null 2>&1

The clock has continued to run (it’s now at 13 hours) but there has been no progress - I think it actually reached this stage about 30 minutes in. I’ve had this problem in the past and thought it was caused by overloading my computer, so I restricted Docker’s access to 7 of my 8 CPU threads and 14 GB of my 16 GB of RAM. That seemed to help with the smaller datasets I was processing, but maybe it wasn’t a real solution(?).

All of the images (and the GCP and task output files) are here: https://drive.google.com/drive/folders/1iih1j9qzs8coVkcnszb8BuNlVkAd85a8?usp=sharing

A specific link to the task output is here: https://drive.google.com/open?id=1JjtN6M3YC3DmkfQLQe_GUiy2_rqol1bz

I’m using a 2012 MacBook Pro with a 2.6 GHz Intel Core i7, 16 GB of RAM, Intel HD Graphics 4000 (1536 MB), and an SSD.

Does anyone have any idea what the reason for the stall would be and how I could get around it?

I had another go with half of the dataset but ran into this problem - ‘Exception: Child returned 1’:
“Process exited with code 1” means that part of the processing failed. Sometimes it’s a problem with the dataset, sometimes it can be solved by tweaking the Task Options, and sometimes it might be a bug!
Full task output here: https://drive.google.com/open?id=1W6Z9RdSLOhxBbTkYepRg11zO8Ayy5tn1

Looks like you might be running out of disk space:

ERROR 3: Free disk space available is 6008270848 bytes, whereas 14704676160 are at least necessary. You can disable this check by defining the CHECK_DISK_FREE_SPACE configuration option to FALSE.
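To put numbers on that error: the check is reporting roughly 5.6 GiB free where about 13.7 GiB is needed, i.e. around 8 GiB short. A quick conversion of the figures from the message:

```python
# Figures taken from the GDAL error message, converted from bytes to GiB.
free = 6_008_270_848
needed = 14_704_676_160

gib = 2 ** 30
print(f"free:   {free / gib:.1f} GiB")    # ~5.6 GiB
print(f"needed: {needed / gib:.1f} GiB")  # ~13.7 GiB
print(f"short:  {(needed - free) / gib:.1f} GiB")  # ~8.1 GiB
```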

So I should be allocating more space from my SSD to Docker?

Definitely running out of space…(!!!) I had no idea a process could use so many GBs.