Blade Runner director Ridley Scott calls AI a "technical hydrogen bomb" | "we are all completely f**ked"
I'm sure a film director is an expert on the technical underpinnings of large language models, which are primarily used to generate blocks of text that have the appearance of being coherent.
Several departments where I work had massive layoffs in favour of implementing customized versions of GPT-4 chatbots (both client-facing services and internal stuff). That's just the LLM end of AI.
That's not even considering the generative-image side of AI. I fear for my company's graphics, web design, and UX/UI teams, who will probably be gone this time next year.
I work freelance but occasionally needed to partner with artists and others. Now I use various "AI" projects and no longer need to pay people to do the work, as the computer can do it well enough.
I'm not some millionaire, I'm just a guy trying to save money to buy a house one day, so it's not a large economic impact on its own, but I can't be the only one.
UX is not about drawing pictures. That work is already automated by UI kits anyway. UX is about thinking through requirements and research.
We're a long way out from that fortunately.
Not saying that some jobs won't be cut/lost, but the companies doing that were likely looking for reasons to downsize.
AI models do not replace competent UI/UX. That's just not what they're designed to do. Very different functions.
I can tell you now that AI won't come for UX/UI teams, at least not in the near future. Clients are rarely able to really articulate what they need out of software, and until AI is smart enough to suss that out, we're good. That said, I'm sure some companies will try to go that route, but I doubt it will work, again, in the near term.
I use Copilot in my work, and watching the ongoing freakout about LLMs has been simultaneously amusing and exhausting.
They're not even really AI. They're a particularly beefed-up autocomplete. Very useful, sure. I use it to generate blocks of code in my applications more quickly than I could by hand. I estimate that when you add up the pros and cons (there are several), Copilot improves my speed by about 25%, which is great. But it has no capacity to replace me. No MBA is going to be able to do what I do using Copilot.
As for prose, I've yet to read anything written by something like ChatGPT that isn't dull and flavorless. It's not creative. It's not going to replace story writers any time soon. No one's buying ebooks with ChatGPT listed as the author.
Sigh. Can we please stop this shitty argument?
They are. In a very broad sense. They are just not AGI.
Saying this is like saying you're a particularly beefed-up bacterium. In both cases they operate on the same basic objective (survive and reproduce for you and the bacterium; guess the next word for the LLM and the autocomplete), but the former is vastly more complex in the way it achieves that goal.
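The "guess the next word" objective the comment describes can be sketched with a toy bigram model. This is a deliberately simplified illustration, not how an LLM is actually implemented: real models use neural networks over huge corpora, but the objective (pick a likely next word given the words so far) is the same. The corpus and counts below are made up purely for demonstration.

```python
from collections import Counter, defaultdict

# Tiny made-up corpus for illustration only.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count bigrams: how often each word follows each preceding word.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def next_word(prev):
    """Greedy 'autocomplete': return the most frequent follower of prev."""
    return bigrams[prev].most_common(1)[0][0]

print(next_word("the"))  # "cat" follows "the" most often in this corpus
```

An LLM replaces the frequency table with a learned function over the entire preceding context, which is where the "vastly more complex" part comes in, but both systems are scored on the same question: what word comes next?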
An 85 year old film director*
Yes, I thought he was talking about the film industry ("we're fucked") and how AI is or would be used in movies. In which case he would be competent to talk about it.
But he's just confusing science-fiction and reality. Maybe all those ideas he's got will make good movies, but they're poor predictions.
You don't need to be an expert to see a demo and understand what you can do with the tech.
You kinda do, as anyone in tech that has ever had to communicate with customers can attest to.