I tutor high school kids during the school year. They’re already ahead of this. They love asking ChatGPT for help on their homework. They’d rather get an AI to explain it to them than ask their teacher. They like that they can just keep asking as many questions about details as they want. They like that ChatGPT will not only give them general information but also tell them step by step how to do different kinds of projects.
Many of them have also lost a lot of respect for their teachers. They see the gaps in their teachers’ knowledge, and they get extremely frustrated when a teacher penalizes their grades for not doing things exactly the way the teacher wants. That latter gripe has annoyed students since time immemorial, but now they have more access than ever to outside opinions, from resources like ChatGPT to other teachers’ lessons on YouTube.
As for hallucinations? These kids aren’t fazed by them. They’re already used to their teachers telling them the wrong thing quite often. They probably trust the AI more than their teachers at this point!
I think the people who have the most to fear from ChatGPT are on Google’s search team. Young people are turning more and more to ChatGPT because Google search results have gotten so bad.
Teachers should be responding to all this by giving kids more time to read books in class. No-tech policies at schools are also a big help. With no phones, laptops, or iPads in school, kids will be forced to talk to and engage with their peers instead of scrolling social media.
The one thing I’m really afraid of is that kids will lose the ability to write without technological help. Perhaps teachers need to move to more in-class, pencil-and-paper writing assignments instead of essays at home.
Right? AI chatbots are so bad with hallucinations. As a programmer I can usually spot the inaccurate code a chatbot is spewing, but these kids won’t be able to spot the inaccuracies unless there’s very strict quality control... which would require human teachers anyway.
And what would the damage be if their code has errors? They could learn to find those with tests.
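To make that concrete, here’s a toy sketch (the function names and the specific bug are invented for illustration): a chatbot-style helper with a classic off-by-one slip, which even a two-line test catches right away.

```python
def buggy_max(values):
    """Hypothetical chatbot output: looks plausible, skips the last element."""
    best = values[0]
    for i in range(len(values) - 1):  # bug: never looks at values[-1]
        best = max(best, values[i])
    return best

def fixed_max(values):
    """The student's fix after the test below exposed the bug."""
    best = values[0]
    for v in values:
        best = max(best, v)
    return best

# A tiny test is enough to expose the difference:
sample = [3, 9, 4, 10]
assert fixed_max(sample) == 10
assert buggy_max(sample) == 9  # the bug shows up immediately on this input
```

The point isn’t that kids will write perfect tests, just that “run it and check” is a skill they can actually learn.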
GPT is frigging amazing at coding. Slap down a few bullet points on what to do and it shoots out 400 lines of code that do exactly that. Seriously, it is not a god, but it’s really far from as bad as people here make it seem.