BlueMonday1984 @awful.systems · 42 posts · 521 comments · joined 1 yr. ago
If these nuclear plants manage to come to fruition, it'll be the sole minuscule silver lining of the bubble. Considering it's AI, though, I expect they'll probably suffer some kind of horrific Chernobyl-grade accident which kills nuclear power for good, because we can't have nice things when there's AI involved.
Not a sneer, but an unsurprising development: Bluesky's seeing a surge in users:
In other news, Elon actually did it
the lasting legacy of GenAI will be an elevated background level of crud and untruth, an erosion of trust in media in general, and less free quality stuff being available.
I personally anticipate this will be the lasting legacy of AI as a whole - everything that you mentioned was caused in the alleged pursuit of AGI/Superintelligence™, and gen-AI has been more-or-less the "face" of AI throughout this whole bubble.
I've also got an inkling (which I turned into a lengthy post) that the AI bubble will destroy artificial intelligence as a concept - a lasting legacy of "crud and untruth" as you put it could easily birth a widespread view of AI as inherently incapable of distinguishing truth from lies.
It was a pretty good comment, and pointed out one of the possible risks this AI bubble can unleash.
I've already touched on this topic, but it seems possible (if not likely) that copyright law will be tightened in response to the large-scale theft performed by OpenAI et al. to feed their LLMs, with both of us suspecting fair use will likely take a pounding. As you pointed out, the exploitation of fair use's research exception makes it especially vulnerable to repeal.
On a different note, I suspect FOSS and open-content licenses (GPL, Creative Commons, etcetera) will suffer a major decline in popularity thanks to the large-scale code theft this AI bubble brought - after two-ish years of the AI industry (if not tech in general) treating anything publicly available as theirs to steal (whether implicitly or explicitly), I'd expect people are gonna be a lot stingier about providing source code or contributing to FOSS.
The top comment's also pretty good, especially the final paragraph:
I guess these companies decided that strip-mining the commons was an acceptable deal because they’d soon be generating their own facts via AGI, but that hasn’t come to pass yet. Instead they’ve pissed off many of the people they were relying on to continue feeding facts and creativity into the maws of their GPUs, as well as possibly fatally crippling the concept of fair use if future court cases go against them.
In other news, a lengthy report about Richard Stallman liking kids just dropped.
Hacker News has a thread on it. It's a dumpster fire, as expected.
Quick sidenote, you cocked up the formatting on the hyperlink - you're supposed to put the [text in square brackets](and the link in round brackets) like this
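For instance, a working link would be written like this (using awful.systems purely as a stand-in URL):

```
[a link to awful.systems](https://awful.systems)
```

which renders as the clickable text "a link to awful.systems".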
Zitron's given commentary on PC Gamer's publicly pilloried pro-autoplag piece:
He's also just dropped a thorough teardown of the tech press for their role in enabling Silicon Valley's worst excesses. I don't have a fitting Kendrick Lamar reference for this, but I do know a good companion piece: Devs and the Culture of Tech, which goes into the systemic flaws in tech culture which enable this shit.
Does anyone read these things before or after they’re sent?
It sounds like spam - by my guess, they usually aren't read at all.
New pair of Tweets from Zitron just dropped:
I also put out a lengthy post about AI's future on MoreWrite - go and read it, it's pretty cool
Eigen "Turn the Homeless into Soylent" robot strikes again
I meant it for the stubsack, didn't realise
Google finally did it, they made Chrome completely unusable
Neil Turkewitz coming in with a wry comment about AI's legal issues:
And, because this is becoming so common, another sidenote from me:
With the large-scale art theft that gen-AI has become thoroughly known for, the way the AI slop it generates has frequently competed directly with the original work it was trained on (Exhibit A), the solid legal case for treating the AI industry's Biblical-scale theft as copyright infringement, and the bevy of lawsuits that can and will end in legal bloodbaths, I fully expect this bubble will end up strengthening copyright law a fair bit, as artists and megacorps alike endeavor to prevent something like this ever happening again.
Precisely how, I'm not sure, but to take a shot in the dark I suspect that fair use is probably gonna take a pounding.
PC Gamer put out a pro-AI piece recently - unsurprisingly, Twitter tore it apart pretty publicly:
I could only find one positive response in the replies, and that one is getting torn to shreds as well:
I did also find a quote-tweet calling the current AI bubble an "anti-art period of time", which has been doing pretty damn well:
Against my better judgment, I'm whipping out another sidenote:
With the general flood of AI slop on the Internet (a slop-nami as I've taken to calling it), and the quasi-realistic style most of it takes, I expect we're gonna see photorealistic art/visuals take a major decline in popularity/cultural cachet, with an attendant boom in abstract/surreal/stylised visuals
On the popularity front, any artist producing something photorealistic will struggle to avoid blending in with the slop-nami, whilst more overtly stylised pieces will stand out all the more starkly.
On the "cultural cachet" front, I can see photorealistic visuals becoming seen as a form of "techno-kitsch" - a form of "anti-art" which suggests a lack of artistic vision/direction on its creators' part, if not a total lack of artistic merit.
Here's a better idea - treat anything from ChatGPT as a lie, even if it offers sources
(I really should get to this toxic productivity write-up I’ve been meaning to do for a year now,)
Go for it, Mii - I'd be happy to read it.
I was focusing more on the fact Justine failed to recognise Minimax had failed at its only job (giving her...whatever that anim is...instead of something actually 8-bit), but yeah all that sucks too