I have a use case where I need the angles to rotate world-coordinate points (X, Y, Z) so that they align with the image coordinate system.
I saw some values under rotation: [… ] in reconstruction.json for each camera, but I'm not sure how to use them.
They are documented to be in axis/angle format.
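For anyone else stuck at this step, here is a minimal sketch of converting such an axis/angle vector into a rotation matrix. I'm assuming SciPy's `Rotation` here (OpenSfM itself uses `cv2.Rodrigues` for the same conversion); the example values are made up, not from a real reconstruction:

```python
import numpy as np
from scipy.spatial.transform import Rotation

# The three "rotation" values in reconstruction.json form an axis/angle
# (Rodrigues) vector: its direction is the rotation axis and its norm is
# the rotation angle in radians.
rotation = [0.1, -0.2, 1.5]  # hypothetical example values

R = Rotation.from_rotvec(rotation).as_matrix()  # 3x3 rotation matrix
print(R)
```

The resulting matrix maps world coordinates into the camera frame (for OpenSfM, the camera center is then `-R.T @ t`).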
I obtained the rotation matrix from shot.pose.get_rotation_matrix() in opensfm/opensfm/reconstruction.py.
From the rotation matrix I can recover the rotation angles, and thereby the angles needed to rotate the image plane to align with the point-cloud view.
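Extracting omega/phi/kappa from the matrix can be sketched as an Euler-angle decomposition. Note that the axis order and sign conventions for omega/phi/kappa vary between photogrammetry packages, so the `"xyz"` order below is an assumption that should be verified against a camera with known orientation:

```python
from scipy.spatial.transform import Rotation

# Start from some rotation matrix R (here built from a hypothetical
# axis/angle vector just to have a concrete value).
R = Rotation.from_rotvec([0.1, -0.2, 1.5]).as_matrix()

# Decompose into omega (X), phi (Y), kappa (Z) angles in degrees.
# The "xyz" order is one common photogrammetric convention, not the
# only one -- check it against your software before relying on it.
omega, phi, kappa = Rotation.from_matrix(R).as_euler("xyz", degrees=True)
print(omega, phi, kappa)
```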
I don't know exactly what the difference is between the rotation matrix from reconstruction.json and the one from bundle_r000.out, but if I use the latter (i.e. from Bundler), the cameras align more accurately.
However, it looks like some additional transformation is still needed, because some of the images are not getting aligned according to the camera rotation.
When applying the same rotation angles (omega, phi, kappa) to the images, I noticed a pattern: for images with
Gimbal Yaw Degree > 90 or < -90
I had to add/subtract 180° to kappa (the rotation angle around the Z-axis).
In the image above (a circular flight path), the camera poses for half of the circle come out correct; the rest are inverted.
Similarly, for a grid flight path, one line gets correct camera poses while the immediately following line gets inverted ones.
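The workaround described above can be sketched as a small helper. `correct_kappa` is a hypothetical function name, and the threshold logic simply restates the observed pattern rather than a derived rule:

```python
def correct_kappa(kappa_deg, gimbal_yaw_deg):
    """Flip kappa by 180 deg for shots whose gimbal yaw lies outside
    (-90, 90), as observed empirically above. Hypothetical helper, not
    part of OpenSfM."""
    if gimbal_yaw_deg > 90 or gimbal_yaw_deg < -90:
        kappa_deg += 180.0
    # Wrap the result back into (-180, 180].
    return ((kappa_deg + 180.0) % 360.0) - 180.0
```

A real fix would likely address the underlying convention mismatch instead of patching kappa per image, but this reproduces the behaviour I needed.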
So, in order to identify such images, I have to read the Gimbal Yaw Degree from the input images' metadata.
However, the exifread module is not reading that parameter.
Can someone help me figure out how to get the Gimbal Yaw Degree from the image metadata?
P.S.: I use exiftool to get this data, but is there any way to read it from code?
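One likely reason exifread misses it: on DJI drones the gimbal angles are stored in the XMP packet (the `drone-dji` namespace, tag `GimbalYawDegree`), not in the EXIF IFDs that exifread parses. A crude but dependency-free sketch is to scan the file bytes for that XMP attribute; this assumes a DJI-style attribute form of the tag and is not a general XMP parser:

```python
import re

def read_gimbal_yaw(image_path):
    """Read GimbalYawDegree from the XMP packet embedded in a JPEG.

    DJI stores gimbal angles in XMP (drone-dji namespace) rather than
    EXIF, which is why exifread does not see them. This naive scan
    assumes the attribute form of the tag; adjust for your files.
    """
    with open(image_path, "rb") as f:
        data = f.read()
    m = re.search(rb'GimbalYawDegree="?([-+]?\d+(?:\.\d+)?)', data)
    return float(m.group(1)) if m else None
```

Alternatively, since exiftool is already in use, it can be called from code with JSON output (`exiftool -j image.jpg` via `subprocess`) and the result parsed with the `json` module.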