AI’s voracious need for computing power is threatening to overwhelm energy sources, requiring the industry to change its approach to the technology, according to Arm Holdings Plc Chief Executive Officer Rene Haas.
The only really useful AI thing is the denoiser in Adobe Lightroom. I can shoot pictures in pitch black darkness with the highest ISO settings. Obviously it is a grainy mess. The denoiser manages to clean that up while retaining all of the details. It's really fucking great!
Sounds useful, but not at all worth the amount of energy being used to produce AI. You could just use that energy to feed/house people who could do the labor of denoising.
Come on, that's not fair; it's very good* at drawing album covers and video game assets, which gives artists more time to go work for Starbucks or Amazon instead of doing something they actually enjoy.
Correction: AI in the LLM/diffusion sense is a decent tutor for cheap. It can cobble together rough temp art and, if used by an actually capable artist, make cool stuff.
Anything else and it's a garbage firehose; it's the undisputed king of mediocrity. Which, given the standards of spam and the modern web, is exactly what it's being used for.
I like talking to AI because it uses grammar properly and I don't have to structure my questions as though I'm dealing with an insecure jerk on the internet.
I don't disagree that they should back up their claim, but it does intuitively make sense. AI, GPT-style LLMs in particular, is typically designed to push the limits of what modern hardware can provide, essentially eating whatever power you can throw at it.
Pair this with a huge AI boom and corporate hype cycle, and it wouldn't surprise me if it was consuming an incredible amount of power. It's reminiscent of Bitcoin, from a resource perspective.
No, it makes no sense. India has over a billion people. There's no way that amount of computing power could just have magically poofed into existence over the past few years, nor could the power plants needed to run it all.
Because they're not actually pulling enough from the grid to cause damage to others or even to the grid itself.
Any musings about curtailing AI due to power consumption are just bullshit for clicks. We'll improve efficiency and increase productivity, but we won't reduce usage.
We comprehensively investigate this question across 34 models and five standard pretraining datasets (CC-3M, CC-12M, YFCC-15M, LAION-400M, LAION-Aesthetics), generating over 300GB of data artifacts. We consistently find that, far from exhibiting "zero-shot" generalization, multimodal models require exponentially more data to achieve linear improvements in downstream "zero-shot" performance, following a sample inefficient log-linear scaling trend.
It's taking exponentially more data to get better results, and therefore exponentially more energy. Even if something like analog training chips reduces energy usage tenfold, the exponential curve will just catch up again, and quickly, with only marginally improved results. Not only that, but you have to gather that much more data, and while the Internet is a vast datastore, the AI models have already absorbed much of it.
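To make the shape of that trend concrete: under a log-linear fit like the one the paper describes, accuracy grows with the logarithm of the sample count, so each fixed gain in accuracy means multiplying the data (and roughly the compute and energy) by a constant factor. A minimal sketch in Python, with made-up coefficients purely for illustration:

    # Hypothetical log-linear fit: accuracy (%) ~ a * log10(samples) + b
    # The coefficients below are invented for illustration; they are not from the paper.
    a, b = 8.0, -20.0

    def samples_needed(target_accuracy):
        # Invert the fit: N = 10 ** ((accuracy - b) / a)
        return 10 ** ((target_accuracy - b) / a)

    for acc in (40, 48, 56, 64):
        print(f"{acc}% accuracy -> ~{samples_needed(acc):.1e} samples")
    # Under this fit, every +8 points of accuracy costs 10x the data.

That is the sense in which "linear improvement" on the benchmark ends up costing exponentially more data and energy.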
The implication is that the models are about as good as they will be without more fundamental breakthroughs. The thing about breakthroughs like that is that they could happen tomorrow, they could happen in 10 years, they could happen in 1000 years, or they could happen never.
Fermat's Last Theorem remained an open problem for 358 years. Squaring the Circle remained open for over 2000 years. The Riemann Hypothesis has remained unsolved after more than 150 years. These things sometimes sit there for a long, long time, and not for lack of smart people trying to solve them.
This focus on individual applications shifts blame onto consumers, when we should be demanding that energy prices include the external cost of production. It's like guilt tripping over the "carbon footprint" (invented by big oil) of your car.
ML is not an ENIAC situation. Computers got more efficient not by doing fewer operations, but by performing the operations they were already doing far more cheaply.
The basic operations underlying ML (e.g. matrix multiplication) are already some of the most heavily optimized things around. ML is inefficient because it needs to do a lot of that. The problem is very different.
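A quick way to see what "already heavily optimized" means in practice; this is a rough sketch, numbers will vary by machine, and it assumes numpy is linked against a tuned BLAS such as OpenBLAS or MKL:

    import time
    import numpy as np

    n = 1024
    a = np.random.rand(n, n).astype(np.float32)
    b = np.random.rand(n, n).astype(np.float32)

    # Library path: numpy hands the multiply to a tuned BLAS kernel.
    t0 = time.perf_counter()
    _ = a @ b
    blas_time = time.perf_counter() - t0

    # Naive path: the same arithmetic as a pure-Python triple loop,
    # on a much smaller matrix so it finishes in reasonable time.
    m = 64
    t0 = time.perf_counter()
    _ = [[sum(a[i, k] * b[k, j] for k in range(m)) for j in range(m)]
         for i in range(m)]
    naive_time = time.perf_counter() - t0

    print(f"BLAS {n}x{n}: {blas_time:.4f}s, naive {m}x{m}: {naive_time:.4f}s")

The per-operation side is already squeezed hard; the cost comes from how many of those operations modern models need.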
There's an entire resurgence of research into alternative computing architectures right now, led by some of the biggest names in computing, because of the limits the von Neumann architecture has hit where ML is concerned. I don't see any reason to assume all of that research is guaranteed to fail.
Yeah, uh huh; efficiency isn't really a measure of absolute power use, it's a measure of how much you get done with the power. Nobody calls you efficient if you do nothing and use no power to do that nothing. Google, Amazon, Microsoft, and Meta together could not get anything done as companies if they all had to split an ENIAC (vastly less powerful than an older-model iPhone) between them. This is a completely meaningless comparison.
Absolute power consumption does matter, but global energy consumption is approximately 160,000 TWh per year, so even after the doubling, the largest cloud providers combined are using less than 0.05% of all the energy used across the world. And a chunk of that extra 36 TWh is going to their daily operations, not just their AI stuff.
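Back-of-the-envelope version of that, taking the figures above at face value (the 36 TWh and 160,000 TWh numbers are the ones quoted here, not independently checked):

    world_energy_twh = 160_000       # rough global annual energy consumption quoted above
    added_twh = 36                   # the extra consumption attributed to the doubling
    cloud_total_twh = 2 * added_twh  # doubling means roughly 36 TWh before + 36 TWh added

    share = cloud_total_twh / world_energy_twh
    print(f"{cloud_total_twh} TWh of {world_energy_twh} TWh = {share:.3%}")
    # ~0.045%, i.e. under the 0.05% figure above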
The more context I add to the picture, the less I'm worried about AI in particular. The overall growth model of our society is the problem, which is going to need political/economic solutions. Fixating on a new technology as the culprit is literally just Luddism all over again, and will have exactly as much impact in the long run.