A couple of years ago, Google decided that instead of exporting photos with their EXIF data exactly as you uploaded them, which was the original behavior and how platforms such as OneDrive still do it, Takeout would strip all EXIF from the image and instead write the original data into a companion .json file in a non-standard format. This script is a free and open version of a paid tool: it goes through each image, finds the corresponding .json, and puts the EXIF data back on (the core loop is sketched below).
If you don't, then when you reupload these photos to a new service, their dates will revert to the day you downloaded them, and location data will be missing entirely.
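For the curious, the core of such a script is small. Here's a minimal sketch in Python, assuming the common Takeout layout where IMG_1234.jpg sits next to IMG_1234.jpg.json (sidecar names vary across Takeout versions), that exiftool is on your PATH, and the photoTakenTime/geoData fields as they appear in a typical Takeout export:

```python
import json
import subprocess
from datetime import datetime, timezone
from pathlib import Path

TAKEOUT_DIR = Path("Takeout/Google Photos")  # adjust to your extracted archive

for image in TAKEOUT_DIR.rglob("*"):
    if image.suffix.lower() not in {".jpg", ".jpeg", ".png", ".heic"}:
        continue
    sidecar = image.with_name(image.name + ".json")  # e.g. IMG_1234.jpg.json
    if not sidecar.exists():
        continue  # sidecar missing (or named differently in your export)
    meta = json.loads(sidecar.read_text(encoding="utf-8"))

    args = ["exiftool", "-overwrite_original"]

    # Takeout stores the capture time as a Unix epoch string (UTC).
    # Real tools convert to local time; UTC is assumed here for brevity.
    ts = meta.get("photoTakenTime", {}).get("timestamp")
    if ts:
        taken = datetime.fromtimestamp(int(ts), tz=timezone.utc)
        args.append(f"-DateTimeOriginal={taken:%Y:%m:%d %H:%M:%S}")

    # geoData is all zeros when no location was recorded.
    geo = meta.get("geoData", {})
    lat, lon = geo.get("latitude", 0), geo.get("longitude", 0)
    if lat or lon:
        args += [
            f"-GPSLatitude={abs(lat)}",
            f"-GPSLatitudeRef={'N' if lat >= 0 else 'S'}",
            f"-GPSLongitude={abs(lon)}",
            f"-GPSLongitudeRef={'E' if lon >= 0 else 'W'}",
        ]

    if len(args) > 2:  # only invoke exiftool if there is something to write
        subprocess.run(args + [str(image)], check=True)
```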
Yes! I imported 23k media files into a new platform, and the Takeout process was such a pain. My destination was built to handle the zipped or unzipped media, but issues occasionally cropped up, like when files spanned archives but the JSON was in the previous one. That resulted in orphaned files with upload dates instead of dates taken (a quick way to spot these is sketched below).
Ultimately, I think I had the best experience extracting all 123 GB and uploading the albums/folders that way.
Would have been SO much easier with an API that allowed cloud-to-cloud transfers.
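If you hit the same split-archive problem, one workaround is to extract every archive part into the same tree (so sidecars land next to their files again) and then list the media that still has no JSON beside it. A minimal sketch in Python, assuming the usual name.ext.json sidecar convention:

```python
from pathlib import Path

MEDIA_EXTS = {".jpg", ".jpeg", ".png", ".heic", ".mp4", ".mov"}
root = Path("Takeout/Google Photos")  # all archive parts extracted into one tree

# A file is an "orphan" if no sidecar JSON sits next to it.
orphans = [
    f for f in root.rglob("*")
    if f.suffix.lower() in MEDIA_EXTS
    and not f.with_name(f.name + ".json").exists()
]

print(f"{len(orphans)} media files have no sidecar JSON:")
for f in sorted(orphans):
    print(f)
```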
Google reminds me more and more of the Microsoft of the '90s. That's exactly the kind of compatibility-breaking, asinine move MS would have pulled 30 years ago. Sigh…
Jumping in to second immich-go; I used it recently. I had no issues other than Google Photos not giving me the data for shared albums (most of mine), which seems to be a Google Takeout issue.
Used immich-go for this too. Had no problems, and the dev is really open to feedback. All the metadata was there (except for labeled people), and albums were also created correctly. It even imported archived photos into Immich's archive!
If you import it as an external library, Immich will not be able to modify or delete any of the files, but you will see them just like ones uploaded through Immich. It's up to you to decide which is better.