News organizations, camera makers, and tech companies have created a free web tool called Verify for checking the authenticity of images. It's being adopted by Nikon, Sony, and Canon.
Camera Companies Fight AI-Generated Images With 'Verify' Watermark Tech
It's not DRM. It's like EXIF metadata. You can strip it anytime you want, and it will often get stripped in practice even if you don't want it to (screenshots, naive conversions, metadata-stripping upload services, etc.). It's entirely optional and doesn't require any new software to view the images, only to view and manipulate the metadata.
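To make the "it's just metadata" point concrete, here's a toy sketch. It is not the real scheme (C2PA-style provenance uses public-key signatures and embedded manifests; the HMAC, key, and field names below are stand-ins I made up), but it shows the shape of the problem: the provenance lives *next to* the pixels, so anything that copies only the pixels silently discards it.

```python
import hashlib
import hmac

# Hypothetical stand-in for a device signing key. Real cameras would use an
# asymmetric key pair, not a shared secret -- this is illustration only.
CAMERA_KEY = b"secret-key-inside-the-camera"

def make_manifest(pixels: bytes, editor: str) -> dict:
    """Build a signed manifest that travels alongside the image, EXIF-style."""
    digest = hashlib.sha256(pixels).hexdigest()
    sig = hmac.new(CAMERA_KEY, (digest + editor).encode(), "sha256").hexdigest()
    return {"sha256": digest, "edited_by": editor, "signature": sig}

pixels = b"\x89PNG\x0d\x0a..."          # stand-in for raw image bytes
manifest = make_manifest(pixels, "Example Camera")

# A screenshot or naive re-encode copies only the pixel data -- the manifest
# (and with it the provenance) is gone, exactly like stripped EXIF tags.
screenshot = bytes(pixels)              # pixels survive; manifest does not
```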
On its own, it doesn't tell you much about the image except that specific people/organizations edited and approved it, with specific tools. If you don't trust those people/orgs, then you shouldn't trust the photo. I mean, even if it says it came straight from a Nikon camera...so what? People can lie with cameras, too.
I wrote a bit more about this in another thread at https://lemmy.sdf.org/comment/5812616 about a month ago, if you're interested. I haven't really played with it yet, but there's open-source software out there you can try.
You could implement it like that, but I'm not convinced that's the way this will go. The only way this will get mass adoption, I'm afraid, is if the tech giants can fleece us one way or another.
I guess this is better than nothing, but what happens if you take a photo of a generated image? There are setups where it's impossible to tell the result is a photo of a photo, and then the camera will happily sign the fake photo as real.
Consoles (Xbox, Nintendo, PlayStation) are all hacked eventually. All that will happen is someone will hack a camera to sign any image sent to it.
I think this tech (signed pictures) is just going to make the problem worse. Once a camera is hacked, its output is "signed" but fake... same spot we're in now, except with fake *verified* pictures.
And consoles are a walled garden; here you would have to build a resilient trust network spanning every camera manufacturer. If any private key leaks, the system is compromised.
It's not just a sig on the image, but on the metadata as well. Harder to fake time + place if they implement it thoroughly. (I.e., the camera would have to trust only GPS, verify it against an internal clock, and not allow setting time and location manually, I suppose.)
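A minimal sketch of signing image and metadata together, assuming an HMAC as a stand-in for the camera's real (public-key) signature and made-up field names. The point is that the signature covers a canonical serialization of time + place along with the pixel hash, so altering either one breaks verification:

```python
import hashlib
import hmac
import json

# Hypothetical device key; a real camera would hold an asymmetric private key.
CAMERA_KEY = b"device-secret"

def sign_capture(pixels: bytes, metadata: dict) -> str:
    # Canonicalize metadata (sorted keys) so the signature deterministically
    # covers time and location, not just the image bytes.
    payload = hashlib.sha256(pixels).digest() \
        + json.dumps(metadata, sort_keys=True).encode()
    return hmac.new(CAMERA_KEY, payload, "sha256").hexdigest()

def verify_capture(pixels: bytes, metadata: dict, sig: str) -> bool:
    return hmac.compare_digest(sign_capture(pixels, metadata), sig)

meta = {"time": "2023-12-09T12:00:00Z", "gps": [35.68, 139.69]}
sig = sign_capture(b"raw image bytes", meta)

verify_capture(b"raw image bytes", meta, sig)                    # True
verify_capture(b"raw image bytes", {**meta, "gps": [0, 0]}, sig) # False: moved
```

Editing the GPS coordinates (or the timestamp) after the fact invalidates the signature, which is what makes time + place harder to fake than a plain image sig.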
...including the date and time a photo was taken as well as its location and the photographer...
Not including GPS and time makes this worse, but including it makes it useless, because you can't ever verify a photo shared across social media: the EXIF tags will be stripped.