Oh boy, can't wait to have CPUs that burn a hole right through their coolers
I'd really love it if we'd just have a generation or two where they focused on making CPUs more efficient and less hot rather than ramping power every generation. Same with GPUs.
This only got bad with the most recent generation of CPUs. The AMD 5xxx series is very efficient, as demonstrated by Gamers Nexus. The Intel CPUs from the 2500K to, idk, the 8xxx series? were efficient, until they started slapping on more cores and then cranking the power.
The 7 series are more efficient than the 5 series. They're just programmed to clock as fast as thermals allow, so reviewers who had really powerful coolers on the CPUs saw really high power draw. If you instead set a power cap, you get better performance per watt than the previous generations.
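The tradeoff being described, a small performance loss buying a big power reduction, can be sketched with some made-up numbers. Nothing below is a real measurement for any specific chip; it's just the arithmetic behind "better performance per watt with a cap":

```python
# Hypothetical benchmark scores and package power draws (NOT real data)
# for the same CPU uncapped vs. with a power cap applied.
def perf_per_watt(score, watts):
    """Performance-per-watt: higher is more efficient."""
    return score / watts

uncapped = perf_per_watt(1440, 230)  # boosts until it hits the thermal limit
capped = perf_per_watt(1320, 150)    # slightly slower, much less power

print(f"uncapped: {uncapped:.2f} pts/W")
print(f"capped:   {capped:.2f} pts/W")
```

With these illustrative numbers, giving up roughly 8% of the score cuts power by about a third, so the capped configuration comes out well ahead on points per watt.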
Having the clocks scale to a thermal limit is a nice feature to have, but I don't think it should have been the default mode.
Yes, the second thing about cranking power and cores is what I'm talking about.
Also, as far as GPUs go, the 2000 series was ridiculously power hungry at the time, and it looks downright reasonable now. It's like the Overton window of power consumption lol.
I dunno, I ran a 2080 on the same PSU that I used on a 2013 build, a 650W Seasonic. Got some graphs? Power consumption didn't seem to jump that badly until the latest gen.
My current 3090 is a power hog though, so that's when I'd say it started for Nvidia (the 3000 series). For AMD, the 7000 series CPUs; for Intel I'm not really sure. The 9900K was the last Intel CPU I ran, and it seemed fine. I was running a 9900K/2080 on the same PSU as the 2500K/570 build.
As far as the 2080 goes, like I said, it was big FOR THE TIME and power hungry FOR THE TIME. It's still reasonable, even by today's standards.
As far as the last two gens go, the 3000 and 4000 series are known to draw more than their rated power requirements. For their minimum recommended PSU wattage, the 3080 was 50 watts more than the 2080 (750W), and the 4080 was 100W more than that (850W).
To add to that, both of these generations of cards can overdraw power when doing graphics-intensive things like gaming, and they've been known to cause hard shutdowns in PCs with PSUs rated even slightly above their minimum recommendation. Before these last two gens you could get away with a slightly lower-wattage PSU and sacrifice a little performance, but that's definitely no longer the case.
And sure, performance per watt is better on the 3080, but it also runs 10+ degrees hotter, and the 4000 series even more so.
I just hope the 5000 series goes the way of power-consumption refinement rather than smashing more chips onto a board or VRAM fuckery like with the 4060. I'd be happy with similar performance from the 5000 series if it was less power hungry.
Intel became less efficient because of how long they were stuck on 14nm. To compensate and beat AMD for performance mindshare, they needed to push the clocks hard.
Over time, CPUs have been shipping closer to their max clock, which defeats the purpose of overclocking for many. Adding 1GHz used to be not out of the ordinary; now getting 0.5GHz is an achievement.
AMD uses the 290/390 to compete with Nvidia's 970, people buy Nvidia anyway, and the "shoulda bought a 390" meme is born after the 3.5GB VRAM controversy happens. AMD gets mocked for high power consumption.
AMD releases the 6000 series GPUs to compete with Nvidia's Ampere line at notably lower power draw, and people still buy Nvidia.
That's because Nvidia still has a leg up with RTX, but that doesn't mean Nvidia shouldn't be thinking about it. I'm not talking about what the market directs them to do; I'm talking about what I hate personally.
I mean, they technically did this generation. All of the RTX 4000 cards sans the 4090 are fairly efficient... but only because Nvidia shifted the GPU names up a tier for everything that's not the halo card.
Point is, you can't have everything, and people generally prioritize performance first, because efficiency has rarely made either GPU company more profit.
If you cared about efficiency, Nvidia's answer would be to buy their RTX 4000 SFF Ada (75W, ~3060 Ti perf) or RTX 6000 Ada... if you can afford it.
I felt the same when the current-gen CPUs were announced, but when I looked closer at AMD's chips, I learned that they come with controls for greatly reducing the power use with very little performance loss. Some people even report a performance gain from using these controls, because their custom power limits avoid thermal throttling.
It seems like the extreme heat and power draw shown in the marketing materials are more like competitive grandstanding than a requirement, and those same chips can instead be tuned for pretty good efficiency.
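The "tuned for pretty good efficiency" idea is basically a sweep: benchmark the chip at several power limits and pick the cap with the best score per watt. A toy sketch with entirely hypothetical numbers (no real measurements for any chip):

```python
# Hypothetical (power_limit_watts, benchmark_score) pairs from a sweep;
# illustrative only, not measured data.
measurements = [
    (230, 1000),  # stock behavior: boosts until the thermal limit
    (180, 980),
    (150, 940),
    (120, 860),
]

# Choose the power cap that maximizes performance per watt.
best_cap, best_score = max(measurements, key=lambda m: m[1] / m[0])
print(best_cap, best_score)  # with these numbers: 120 860
```

With these made-up points the lowest cap wins on efficiency, which matches the observation that the out-of-the-box settings chase the last few percent of performance at a steep power cost.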
Yeah, I'm talking about Nvidia and Intel here. Tbh Ryzen 4000 CPUs run pretty hot, but they also optimized Ryzen quite a bit before they changed to this new chipset, which makes sense to me. It seems like Nvidia and Intel sometimes worry about what looks good power-wise on paper rather than optimization.
I know someone who works at Nvidia, and he said the problem is that Moore's law is dead. Apparently the only way we can generate more performance right now is to input more energy and/or increase size.
Obviously that doesn't scale forever, and the 40 series are already fucking massive. So where does that leave us with the 50 series? We need some breakthrough.