A lot of this content was already effectively auto-generated, in the sense that a lot of sites operate on the business model of: generic structure, scrape some data, generate an article about the release date of a popular movie or game. As a replacement for those sites this might actually be a mild improvement... well, until it starts hallucinating release dates, and performs even worse than the human scrapers by confusing new movies with the ones they're remakes of, or ones that just have similar titles.
I wish Google would just start having a policy of immediately delisting those websites from search results, or at least deprioritizing or graying them out. If it falls under a certain category of website, it should meet certain quality standards. Like I assume they already do for medical information.
If they had a more general search version of Google Scholar where it was all stuff that Google reasonably thinks was actually made by other humans, that would improve things a lot.
Also add "features that exist, are now shittier, and btw, you now have to pay, and oh also btw yes, you will still have ads even with payment. Problem? Go fuck yourself."
I mean, for research I'd prefer to use either a university or a reference library. But even an underfunded library with banned books is a better resource than a largely unmonitored black box that confidently lies to you. At least when you look for a resource and it's not there, you're certain of the lack of information.
i think this is a genuine possibility in schools now, at least in my country. maybe not removing access to the internet entirely, but at least severely restricting it, and going back to analog teaching methods wherever possible.
we were very early on digitalizing education btw, giving every elementary school kid their own tablet, implementing digital tools in almost every subject, and in recent years, replacing physical books with digital copies.
and surprise surprise, it's been a fucking disaster. i can't even imagine what it'll be like as AI gets more widespread
How do you find your research without any search engines and/or do you think the engine you're using will never try to implement an llm-based search? Corporate has its fist deep in academia these days...
On the bright side I used to be extremely online but shit has been getting steadily worse for so long that I don't even have to set limits for myself anymore because there's like 30 minutes per day of worthwhile internet.
Yeah, the main sites are somehow managing to squander their infinite content factories. It’s like the procedurally generated video games where everything is theoretically unique, but you learn to recognize the patterns and everything feels the same.
why? genuinely who does this help and how does it make google money? it seems like they're paying for the energy for ai content in exchange for absolutely nothing
God, it’s sad but you’re probably right. We had to implement something AI-related at work because the board all had massive hard ons for the buzzwords. They literally could not have given less of a shit what we used it for. We had full autonomy as long as ChatGPT ended up in our dependency tree somewhere.
The theory is that people don't want to click through blue links trying to find a source (or sources) they can trust; they'd rather have an instant summarised answer to any question. Google already does instant summarised answers for things like "when is the next public holiday" - generative AI content would expand these instant answers to any question, at the cost of accuracy. Google thinks ChatGPT is taking their market share (which it kinda is, and kinda was a year ago when they started developing this). The big idea of this new feature from Google is to retain market share, which is a prerequisite to making money.
I thought Google was already incorporating some machine learning stuff into the core search algorithm anyways, which would be a much better use than directly making up sentences.
Bluesky threads are already full of people laughing at this "pivot to video" moment. I'm pretty sure they didn't even bother to read the article. It's a typical social media site: everybody is obsessed with likes, minutes - even seconds - count, post first, read later if at all.
I think this is awful. Aggressive plagiarism by Google could (will?) make it a big success.
Google calls its AI answers “overviews” but they often just paraphrase directly from websites.
[...]
Jake Boly, a strength coach based in Austin, has spent three years building up his website of workout shoe reviews. But last year, his traffic from Google dropped 96 percent. Google still seems to find value in his work, citing his page on AI-generated answers about shoes. The problem is, people read Google’s summary and don’t visit his site anymore, Boly said.
---
Edit
To be clear - my main point is that I think Google is going to plagiarize as much as they want to try to get the shit to work. They won't be stopped by Congress and they won't be stopped by the courts. Will plagiarizing work well enough to generate aiShittyText that Joe Schmo, who shops at Walmart and isn't tech savvy, will happily consume? It might be a ginormous flop. But my gut says Google's plan might work.
Rant: Holy mother of fuck. I haven't had a pointed online convo outside of Hexbear in a very long time. I totally forgot how annoying the net can be. Reddit is bad enough, but Bluesky can be like trying to yell an argument through a keyhole due to the 300-character limit.
I agree, I think this could work. Google already has featured snippets, this just feels like an extension to that.
I'm pretty sure those snippets often screwed over the sites they were taken from too, because people read them but don't click through. But the AI summary ensures they get even less credit/ad revenue.
For any high-value search terms, Google hides the summary. So you get either ads or AI slop for every search.
It seems like it's going to force people to make their websites less accessible or something, to prevent Google from getting the full answer. Like they'll have some leading information indexable by Google and the rest of the answer will be in a video. Or maybe websites already do this in some way?
It seems like trying to monetize publicly available content on the internet is a crapshoot anyways. Hence why websites have paywalls, or additional stuff that you pay for to go along with the content, like merchandise.
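For the "less accessible to Google" part, there's at least one blunt mechanism already: Google documents a separate `Google-Extended` crawler token that controls AI/Gemini use of your content independently of normal search indexing. A minimal robots.txt sketch (assuming you want to stay in regular search but opt out of AI training - and note this reportedly doesn't stop the AI summaries built on the regular search index):

```
# Opt out of Google's AI training while staying in regular search.
User-agent: Google-Extended
Disallow: /

# Normal search crawling stays allowed.
User-agent: Googlebot
Allow: /
```

Whether Google actually honors it in the way site owners hope is, of course, the whole question people are arguing about here.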
It seems like it's going to force people to make their websites less accessible or something, to prevent Google from getting the full answer.
AI projects are laughable in many ways, and Google has a long track record of failure in most of its projects. But this one feels different to me. Google is the monster in the room when it comes to search. I think Google will give small web publishers a horrible choice.
Block us? We will fuck you over by never showing your sites in search results. Of course - we'll lie and say everything was aboveboard. The algos made the choices all by themselves!
Allowing us to suck up your data is the better, intelligent choice. We'll give you some crumbs, peasants. Something is better than starving, right?
Sometimes when I Google the legality of certain things in my state, it brings up the laws of other states in the very top box lol. Can't wait for AI to make search results even better by completely making shit up instead of just giving me inaccurate answers
You can sunbathe naked on Sunday afternoon only nude beach in the park when you dance and sing naked on Sunday in the park in the afternoon. Statute 69:420 Corollary Bro.
In one of the forums for the niche hobby I'm into, Google snippets have already been causing chaos for years.
E.g.: "You're wrong because Google says I need to do this." Well, Google is wrong, and doing that is the entire reason you're having so many problems.
When you investigate how the snippet was made, it's either from a review or forum comment by a newbie that somehow got traction, or, often, from someone saying DON'T do it that way - the algorithm doesn't pick up the nuance and gets it twisted.
More recently it's been snippets taken from entirely AI-written pages: absolute gibberish that sounds impressive to people with only a passing familiarity.
I wonder if this will happen: "To make a chocolate milkshake: 1. Thank you kind stranger!..." And then news will break that Google is claiming a malformed algorithm caused their AI to suck up the entirety of Reddit. And after that Google will be forced to admit "Oh, oops!" the wonky algorithm caused Google to suck up ginormous amounts of data from 10,000s of sites on the net. Then they'll say they're "untraining" which is another big lie. All they'll do as fast as they can is smooth out plagiarism so they can have deniability.
I wouldn't be surprised if a few years from now Google's legal team is at the supreme court claiming something ridiculous. Some of the best legal minds in the US are pushing the bullshit idea that AI cannot plagiarize because it doesn't know what plagiarism actually is. And the GOP majority seems to love the idea.
Three quarters of that is idiots not only believing what they see on the internet, but believing the first thing they see at the top of search results, without context.
We should be using computers as a means of interfacing with things generated by other humans, not expecting some simple algorithm to read our minds.
Given the current state of Google search results, this really just sounds like cutting out the middleman. Complaints from SEO-powered garbage like The Spruce fall on deaf ears.