The White House wants to 'cryptographically verify' videos of Joe Biden so viewers don't mistake them for AI deepfakes: Biden's AI advisor Ben Buchanan said a method of clearly verifying White House releases is "in the works."
Digital signatures as a means of non-repudiation are exactly how this should be done. Any official docs or releases should be signed and easily verifiable by any member of the public.
This doesn’t solve anything. The White House will only authenticate videos which make the President look good. Curated and carefully edited PR. Maybe the occasional press conference. The vast majority of content will not be authenticated. If anything this makes the problem worse, as it will give the President remit to claim videos which make them look bad are not authenticated and should therefore be distrusted.
You mean to tell me that cryptography isn't the enemy, and that instead of fighting it in the name of "terrorism and child protection" we should be protecting children by having strong encryption instead??
Why not just use official channels of information, e.g. a White House Mastodon instance with politicians' accounts, government-hosted and auto-mirrored by third parties?
I think this is a great idea. Hopefully it becomes the standard soon, cryptographically signing clips or parts of clips so there's no doubt as to the original source.
When it comes to misinformation, I always remember when I was a kid in the early 90s, another kid told me confidently that the USSR had landed on Mars, gathered rocks, filmed it, and returned to Earth (it now occurs to me that this homeschooled kid was confusing it with the real Moon landing). I remember knowing it was bullshit but not having a way to check the facts. The Internet solved that problem. Now, by God, the Internet has recreated the same problem.
I've always thought that bank statements should require cryptographic signatures for ledger balances. Same with individual financial transactions, especially customer payments.
Without this we're pretty much at the mercy of trust with banks and payment card providers.
I imagine there are a lot of integrity requirements for financial transactions on the back end, but the consumer has no positive proof except easily forged statements.
We need something akin to the simplicity and ubiquity of Google that does this, government funded and with transparent oversight. We're past the point of your aunt needing a way to quickly check if something is obvious bullshit.
Call it something like Exx-Ray, the two Xs mean double check - "That sounds very unlikely that they said that Aunt Pat... You need to Exx-Ray shit like that before you talk about it at Thanksgiving"
Or same thing, but with the word Check, CHEXX - "No that sounds like bullshit, I'm gonna CHEXX it... Yup that's bullshit, Randy."
I've been saying for a long time now that camera manufacturers should just put signing circuits right inside the sensors. Of course that wouldn't protect against pointing the camera at a screen showing a deepfake, or against someone painstakingly dissolving the top layers and tracing out the private key manually, but it would be enough of a deterrent against forgery. And media production companies should actually put out all their stuff digitally signed. Like, come on, it's 2024 and we still don't have a way to find out if something was filmed or rendered, cut or edited, original or freebooted.
I'm more interested in how exactly you'd implement something like this.
It's not like videos viewed on TikTok display a hash for the file you're viewing; and users wouldn't look at that data anyway, especially those who would be swayed by a deepfake...
The White House is increasingly aware that the American public needs a way to tell that statements from President Joe Biden and related information are real in the new age of easy-to-use generative AI.
Big Tech players such as Meta, Google, Microsoft, and a range of startups have raced to release consumer-friendly AI tools, leading to a new wave of deepfakes — last month, an AI-generated robocall attempted to undermine voting efforts related to the 2024 presidential election using Biden's voice.
Yet there is no end in sight: ever more sophisticated generative-AI tools make it easy for people with little to no technical know-how to create fake images, videos, and calls that seem authentic.
Ben Buchanan, Biden's Special Advisor for Artificial Intelligence, told Business Insider that the White House is working on a way to verify all of its official communications due to the rise in fake generative-AI content.
While last year's executive order on AI created an AI Safety Institute at the Department of Commerce tasked with creating standards for watermarking content to show provenance, the effort to verify White House communications is separate.
Ultimately, the goal is to ensure that anyone who sees a video of Biden released by the White House can immediately tell it is authentic and unaltered by a third party.
Tinfoil hat time. It's probably because they need to start creating AI videos to show he's 'competent and coherent,' and they'll say their tests prove it's a real video, not a fake. And since the government said it's true, morons will believe it.