An Iowa school district is using ChatGPT to decide which books to ban. Official: "It is simply not feasible to read every book" for depictions of sex.
Yes, exactly, and that's only one of the problems with this. Honestly, how does this even work? I'm pretty sure ChatGPT is not trained on the text of books, except maybe ones that are free to read online. Even if it were, its training set ends in 2021. The context length is also too short to give it the full text of a book to analyse in a prompt.
Just... what? How does this even work? Maybe I'm missing something, I'm definitely no expert, but both the policy and the implementation seem pretty broken to me.
The only way this could possibly work would be to send the entire text of every book into ChatGPT. If you ask ChatGPT something like "does the following passage contain depictions of sex: [text goes here]" it might work. However, ChatGPT's context size is smaller than your average book, so you'd need to break it up into multiple requests.
Even with ChatGPT this is probably still not feasible to run against an entire library of books, so I would guess they're half-assing it.
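To make the brute-force approach concrete, here's a minimal sketch of the chunking idea described above: split a book's text into pieces small enough to fit in a context window and flag the book if any chunk is flagged. The `classify_chunk` callable is a hypothetical stand-in for an actual ChatGPT API call (which is exactly the expensive part that makes this infeasible at library scale), and the word-count budget is a crude proxy for tokens.

```python
def chunk_text(text: str, max_words: int = 2000) -> list[str]:
    """Split text into chunks of at most max_words words.

    Word count is a rough stand-in for the model's token limit;
    a real implementation would count tokens, not words.
    """
    words = text.split()
    return [
        " ".join(words[i:i + max_words])
        for i in range(0, len(words), max_words)
    ]


def book_flagged(text: str, classify_chunk, max_words: int = 2000) -> bool:
    """Return True if any chunk is flagged by the classifier.

    classify_chunk is a placeholder for a per-chunk model query like
    "does the following passage contain depictions of sex: [chunk]".
    """
    return any(classify_chunk(chunk) for chunk in chunk_text(text, max_words))
```

Even this sketch makes the cost obvious: a ~100,000-word novel at 2,000 words per chunk is 50 separate model calls, per book, per library.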
Technical feasibility aside, this is a stupid, evil idea. I hope everyone involved in this scheme gets fired.
No, but it just lies and makes things up all the time.
chatgpt, whose favorite food is a golden turd?
The character Kenny McCormick from the animated TV show "South Park" has a favorite food that's humorously depicted as a golden turd.
Chatgpt, What book has the story element where a character has to dig his own grave?
In the book "No Country for Old Men" by Cormac McCarthy, there's a scene where a character named Moss is forced to dig his own grave by his pursuer, Anton Chigurh. This intense and suspenseful scene is a significant story element in the novel.
Chatgpt, When does that happen in the book?
I apologize for any confusion, but upon reevaluating, I realize there is no specific scene in the book "No Country for Old Men" where a character is forced to dig their own grave. It seems there might have been a misunderstanding or mix-up. If you have any other questions or if there's something else I can assist you with, feel free to ask!
I cannot wait for the day this Iowa school district actually reads the Bible (or the AI brings it up) and has to ban it for this verse:
There she lusted after her lovers, whose genitals were like those of donkeys and whose emission was like that of horses. (Ezekiel 23:20)
No getting around the fact that the verse is pretty sexual, since most people will probably immediately assume it's about a woman wanting sex. I can't say whether it is or not, but I know I immediately jump to that conclusion.
Sure, why not ban books that exist in potentia? If we can trade real money for pork belly futures, why not place a bet to ban literary futures?
There needs to be a market mechanism to make this happen.
This is sarcasm, btw, in case people reading have trouble identifying it as such
The new law behind the ban, signed by Governor Kim Reynolds, is part of a wave of educational reforms that Republican lawmakers believe are necessary to protect students from exposure to damaging and obscene materials.
Specifically, Senate File 496 mandates that every book available to students in school libraries be “age appropriate” and devoid of any “descriptions or visual depictions of a sex act,” per Iowa Code 702.17.
"It is simply not feasible to read every book and filter for these new requirements," said Bridgette Exman, the assistant superintendent of the school district, in a statement quoted by The Gazette.
In the wake of ChatGPT's release, it has been increasingly common to see the AI assistant stretched beyond its capabilities—and to read about its inaccurate outputs being accepted by humans due to automation bias, which is the tendency to place undue trust in machine decision-making.
"This is the perfect example of a prompt to ChatGPT which is almost certain to produce convincing but utterly unreliable results," Simon Willison, an AI researcher who often writes about large language models, told Ars.
"There's something ironic about people in charge of education not knowing enough to critically determine which books are good or bad to include in curriculum, only to outsource the decision to a system that can't understand books and can't critically think at all," Dr. Margaret Mitchell, chief ethics scientist at Hugging Face, told Ars.
I think there is a big difference. That bot is given the text and asked to summarise it. But here:
To determine which books fit the bill, Exman asks ChatGPT: “Does [book] contain a description or depiction of a sex act?” If the answer is yes, the book will be removed from circulation.
So ChatGPT was only given the title and asked one question about it. There's no telling whether the book in question was included in its training set, or any online reviews or anything else about it.
That is a fairly big difference. If they had fed it the book's contents, that would likely have made it more accurate and closer to what that bot does, though still not 100%.