
Court Bans Use of 'AI-Enhanced' Video Evidence Because That's Not How AI Works

gizmodo.com

This AI hype cycle has dramatically distorted society's views of what's possible with image upscalers.


A judge in Washington state has blocked video evidence that’s been “AI-enhanced” from being submitted in a triple murder trial. And that’s a good thing, given that too many people seem to think applying an AI filter can reveal visual detail that was never captured in the first place.
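The article's underlying point is worth spelling out: downscaling (or blurry capture) is a many-to-one mapping, so no upscaler, AI or otherwise, can recover the original pixels — it can only guess at them. A toy sketch (hypothetical illustration, not from the article) makes this concrete:

```python
# Minimal sketch: two different high-res "scenes" collapse to the
# exact same low-res thumbnail, so any "enhancement" of the
# thumbnail must invent which scene it came from.

def downscale_2x(img):
    """Average each 2x2 block of a grayscale image (list of lists)."""
    h, w = len(img), len(img[0])
    return [
        [(img[r][c] + img[r][c + 1] + img[r + 1][c] + img[r + 1][c + 1]) / 4
         for c in range(0, w, 2)]
        for r in range(0, h, 2)
    ]

scene_a = [[0, 200], [200, 0]]   # diagonal pattern one way
scene_b = [[200, 0], [0, 200]]   # diagonal pattern the other way

print(downscale_2x(scene_a))  # [[100.0]]
print(downscale_2x(scene_b))  # [[100.0]] -- identical thumbnail
```

Since both scenes produce the identical thumbnail, an "enhancer" asked to reverse it has no way to know which one was real; whatever it outputs is a fabrication, which is exactly why such footage is unfit as evidence.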



230 comments
  • Imagine a prosecution or law enforcement bureau that has trained an AI from scratch on specific stimuli to enhance and clarify grainy images. Even if they were all totally on the up-and-up (they aren't, ACAB), training a generative AI or similar on pictures of guns, drugs, masks, etc. for years will lead to internal bias. And since AI makers pretend you can't decipher the logic (I've literally seen compositional/generative AI that shows its work), they'll never realize what it's actually doing.

    So then you get innocent CCTV footage this AI "clarifies" and pattern-matches every dark blob into a gun. Black iPhone? Maybe a pistol. Black umbrella folded up at a weird angle? Clearly a rifle. And so on. I'm sure everyone else can think of far more frightening ideas, like auto-completing a face based on previously searched ones, or just plain-old institutional racism bias.

    • just plain-old institutional racism bias

      Every crime attributed to this one black guy in our training data.
