Well, lots of people blinded by hype here.
Obv it is not simply a statistical machine, but imo it is something worse: approximation machinery that happens to work, but gobbles up energy as the cost. Something only possible because we are not charging companies enough for electricity, smh.
We're in the "computers take up entire rooms in a university to do basic calculations" stage of modern AI development. It will improve, but only if we let it develop.
Then you haven't been paying attention. There have been huge strides in the field of small open language models, which can do inference with low enough power consumption to run locally on a phone.
GPT is not a paradigm, it's a specific model family developed by OpenAI. You're thinking of the transformer architecture. Check out a project like RWKV if you want to see a genuinely different approach.
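For the curious: the "different approach" in RWKV is replacing quadratic self-attention with a linear-time recurrence. Here's a toy scalar sketch of the idea behind its WKV operator (my own simplification for illustration; the real model does this per channel with learned decay `w` and current-token bonus `u`, and these variable names are mine):

```python
import math

def wkv_recurrent(ks, vs, w, u):
    """Toy scalar version of an RWKV-style WKV operator.

    Each output is an exponentially-weighted average of past values:
    older tokens decay by w per step, and the current token gets an
    extra bonus u. Because the sums are carried in two running
    scalars (a, b), this is O(T) time with O(1) state per channel,
    instead of attention's O(T^2).
    """
    a, b = 0.0, 0.0  # running weighted sums: numerator and denominator
    outs = []
    for k, v in zip(ks, vs):
        num = a + math.exp(u + k) * v   # past values + bonus-weighted current value
        den = b + math.exp(u + k)       # matching normalizer
        outs.append(num / den)
        # decay the old state by e^{-w} and fold in the current token
        a = math.exp(-w) * a + math.exp(k) * v
        b = math.exp(-w) * b + math.exp(k)
    return outs

# e.g. wkv_recurrent([0.1, -0.3, 0.5], [1.0, 2.0, -1.0], w=0.2, u=0.4)
# the first output is exactly vs[0], since there is no past yet
```

The point isn't this toy function itself but the shape of the computation: the whole history is compressed into a fixed-size state, which is why RWKV-style models can do cheap token-by-token inference on small devices.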
Honestly, if this massive energy need for AI helps accelerate small/modular nuclear reactors, I'm all for it. With some of these insane data centers companies want to build, each one will need its own power plant.
I've seen tons of articles on small modular reactor companies, but I've never seen any of them make it to the real world yet.