For the love of God, if you're a junior programmer and you keep relying on ChatGPT thinking 'of course I'll spot the errors', you're overestimating your understanding. You will until you won't, and then you've dropped the company database or deleted everything in root.
All ChatGPT is doing is guessing the next word. And it's trained on a bunch of bullshit coding blogs that litter the internet, half of which are now ChatGPT-written (without any validation, of course).
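If "guessing the next word" sounds abstract, here's roughly the shape of that loop, stripped down to nothing. This is a toy sketch, not how ChatGPT actually works; the "model" is a hard-coded probability table I made up for illustration:

```typescript
// Toy autoregressive generation: pick the most probable next token,
// append it, repeat. The "model" is a made-up lookup table.
type Distribution = Record<string, number>;

const toyModel: Record<string, Distribution> = {
  "DROP": { "TABLE": 0.6, "DATABASE": 0.4 },
  "TABLE": { "users;": 0.9, "orders;": 0.1 },
  "DATABASE": { "prod;": 1.0 },
};

// Greedy decoding: always take the highest-probability continuation.
function nextToken(prev: string): string | undefined {
  const dist = toyModel[prev];
  if (!dist) return undefined;
  return Object.entries(dist).sort((a, b) => b[1] - a[1])[0][0];
}

function generate(start: string, maxTokens: number): string[] {
  const tokens = [start];
  for (let i = 0; i < maxTokens; i++) {
    const next = nextToken(tokens[tokens.length - 1]);
    if (!next) break;
    tokens.push(next);
  }
  return tokens;
}

console.log(generate("DROP", 3).join(" ")); // "DROP TABLE users;"
```

Nothing in that loop knows what a database is; it only knows which token tends to follow which. That's why the output can be perfectly fluent and catastrophically wrong at the same time.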
If you can't take 10-30 minutes to search for, read, and comprehend information on Stack Overflow or the docs, then programming (or problem solving) just isn't for you. The junior end of this field is really getting clogged with people who want to get rich quick without doing any of the legwork behind learning how to be good at this job, and ChatGPT is really exacerbating the problem.
ChatGPT is banned by my employer because they don't want trade secrets leaking, which IMO is fair enough. We work on ML stuff anyway.
Anyway, we have a junior engineer who has been caught using ChatGPT several times, whether it's IT flagging its use, seeing a tab open in their browser during a demo, or simply spotting code they obviously didn't write while I'm reviewing it.
I recently tried to help them out on a project that uses React, and it is clear as day that this engineer cannot write code without ChatGPT. The library use is all over the place: they'll just "invent" certain APIs, or they'll use things that are deprecated or that plainly don't work, which would be obvious if they'd even attempted to think about the problem. IMO, reliance on ChatGPT is much worse than how juniors used to be reliant on Stack Overflow for answers to copy-paste.
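For anyone who hasn't reviewed this kind of code, the "invented API" problem looks roughly like this. The example is mine, not from the actual review; useFetch is a hook that does not exist in React, and /api/users is a placeholder endpoint:

```typescript
// What the hallucinated version tends to look like. React exports no
// useFetch hook, so this doesn't compile, but it *looks* plausible:
//
//   import { useFetch } from "react";
//   const users = useFetch("/api/users");

// The same idea written with hooks that actually exist:
import { useEffect, useState } from "react";

function useUsers(): string[] {
  const [users, setUsers] = useState<string[]>([]);

  useEffect(() => {
    fetch("/api/users")
      .then((res) => res.json())
      .then(setUsers)
      .catch(console.error);
  }, []);

  return users;
}
```

The hallucinated version is close enough to real React to sail past a lazy review, which is exactly the problem.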
I strongly advise against doing that. As others pointed out, it really is just predicting the next word. It is worth learning how to problem solve and recognizing that the only way to become a better programmer is with practice. It's better to get programming advice from real people online and to read the documentation for the functions and languages you are trying to use.
I've always, always been a documentation-only guy. Meaning I almost never use anything other than the documentation for the languages and libraries I use. I genuinely don't feel that I'm missing out on anything, I already write code faster than my peers and I don't feel the need to try to be some sort of 10x developer.
I literally cannot comprehend coding with ChatGPT. How can I expect something to work if I don't understand it, and how can I understand it if I don't code and debug it myself? How can you expect to troubleshoot any issues afterwards if you don't understand the code? I wouldn't trust GPT for anything more complex than Hello World.
If you're doing anything that requires real skill, ChatGPT will make the dumbest suggestions ever...
ChatGPT is good for learning ideas and new things as an aggregate of what everyone thinks about them. But as a coding tool it cannot reason properly and has rubber-stamp solutions for everything.
Today we have chatbots. Yesterday we had search engines and stack overflow. Before that we had books. And before that? Well what do you know... software programming is a relatively novel field. It's almost as if nobody has perfected how it should be learned.
The most valuable knowledge comes from experience. I copied plenty of code around during my learning days as well, and I still do it today. The most important part however is trying to understand the code you're working with. If you can understand it, know when it fails, test it in the right way, etc., then sure, you could probably learn to code from chatbots. They provide the information, and you're at liberty to do what you want with it. If you just copy it and forget, you'll be a bad programmer. But it's not like you couldn't do that before either with the other sources that were available - there were plenty of bad programmers before we had these tools available too.
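To make "test it in the right way" concrete: whatever the source, probe a copied snippet's edge cases before trusting it. A quick sketch, where the chunk function and the test cases are mine, purely for illustration:

```typescript
// A snippet you might have copied from anywhere: split an array into
// fixed-size chunks.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// Don't just run the happy path; poke at the cases the source never mentioned.
console.assert(JSON.stringify(chunk([1, 2, 3, 4], 2)) === "[[1,2],[3,4]]");
console.assert(JSON.stringify(chunk([1, 2, 3], 2)) === "[[1,2],[3]]"); // uneven tail
console.assert(chunk([], 3).length === 0); // empty input
// chunk([1], 0) loops forever -- the copied code never handled size <= 0.
```

The snippet passes the happy path and still has a landmine in it; that's the kind of failure you only find by treating copied code as yours to verify.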
That said, there is a risk that these chatbots do not provide any useful context around the code that they produce. When you learned from a book or Stack Overflow, you were reading from a reasonably authoritative source that could explain the code. But the authority behind code from chatbots is probably much weaker than what we had from Stack Overflow, which in turn was probably weaker than what we had from books. Does it have an effect on learning? I have no clue. But I still think you can learn from chatbots if you use the output they provide in the right way. (Disclaimer: I have never used one of them and have no experience with them.)
As someone who is learning, I think it's imperative to understand that ChatGPT has limitations that cannot be overlooked. It's pretty good if I make some silly syntax or formatting errors, but at the core I have to understand what I'm working with if I want to be a better programmer. I love the conversational nature because I often have a hard time wording questions, so it helps me in that regard as well. Idk, if you want to be truly good at something you have to rely more on yourself than on external tools.
ChatGPT was never made for programming and is horrible at generating code. It is nice for a pair-programming kind of setup though, because it can quickly point you towards tools, libraries, APIs, etc. to use.