You might not see a difference between 100 fps and 300 fps, for example, but the feel is certainly there, because those frames are still being rendered and carry more up-to-date information. It can be the difference between landing a headshot and missing for pro players, even if the refresh rate is lower, so it's not wasted at all.
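Quick back-of-the-napkin on why (the fps values are just illustrative):

```python
# Frame "freshness": even on a lower refresh rate display, rendering faster
# means the frame you actually see was generated more recently.
for fps in (60, 100, 300):
    frame_time_ms = 1000 / fps
    print(f"{fps:>3} fps -> a new frame every {frame_time_ms:.1f} ms")
# At 300 fps a frame is at most ~3.3 ms old when the monitor grabs it,
# versus up to 10 ms at 100 fps, which is where the "feel" comes from.
```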
Already set. I'm not that competitive a player and my reflexes are worse than a sloth's, so I didn't even bother buying a monitor with a refresh rate higher than 60 Hz.
My 7800 XT pulls about 230 watts at full bore, giving me my monitor's refresh rate in FPS, 144. Limiting the framerate to 72 (half the refresh rate) results in no tearing and drops the GPU to about 170 watts. Worth it.
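Rough numbers on what that cap saves over a year (the hours per day and electricity price are assumptions, plug in your own):

```python
# Back-of-the-envelope savings from capping 144 fps down to 72 fps.
# The 230 W / 170 W figures are from the post above; hours_per_day and
# price_per_kwh are assumed values, not anything measured.
uncapped_watts = 230
capped_watts = 170
hours_per_day = 3        # assumed average gaming time
price_per_kwh = 0.15     # assumed electricity price in $/kWh

saved_kwh = (uncapped_watts - capped_watts) / 1000 * hours_per_day * 365
print(f"~{saved_kwh:.0f} kWh/year, about ${saved_kwh * price_per_kwh:.0f}")
# -> ~66 kWh/year, about $10
```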
Retro gaming... Low res... CRT filter with the warped edges and blooooooom effects... this is the ideal way to play your SNES games. Try it with Super Metroid, that shit is straight up unnerving and beautiful.
This is why I do love that my PC is powered by renewable energy. It blows my mind how expensive power is everywhere else, plus I don't wanna game if it means I gotta roll coal like huge parts of the world.
I have a watt meter right on my PC plug next to my monitor so I can always see how much I consume. It's crazy how much the monitors alone take up, it's like 40 KW/h each. I'm considering removing one of them.
Neither, hopefully. The former at least is a unit of power, but 40 kW is enough to heat a whole apartment building.
In reality a large and older monitor might use a couple hundred watts. A small modern 24" will probably use closer to 50 W (guesstimating), which is still a decent chunk of the power draw of a budget build.
kWh is a measure of total energy, not instantaneous power. Your watt meter was saying that since the value was last reset, it has measured 40 kWh of energy use. That's not an insignificant amount - a Chevy Bolt can go around 180 miles on 40 kWh.
Watts, or kilowatts, are instantaneous power. That same Bolt can easily pull 100 kW while accelerating, and if it could somehow do that for an hour, it would have used 100 kWh. It could never make it the whole hour as it only has a 65 kWh battery, so it would run out after 39 minutes.
What you're describing is kWh, not kW/h. You need to multiply power by time to get back to energy. An appliance using 1 kW of power for 1 h "uses" 1 kWh of energy. The same appliance running for 2 h uses 2 kWh instead.
kW/h doesn't really make sense as a unit, although it could technically describe how fast power draw changes over time (a ramp rate), not how much energy something uses.
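To make the power vs. energy distinction concrete, here's a toy version of what the watt meter is doing internally (the readings and sample interval are made up):

```python
# Toy watt meter: integrate instantaneous power (W) over time to get energy (Wh/kWh).
# The readings and the 1-second sampling interval are made-up illustration values.
readings_watts = [85, 90, 88, 250, 240, 95]   # hypothetical samples, one per second
sample_interval_s = 1.0

energy_wh = sum(p * sample_interval_s / 3600 for p in readings_watts)
print(f"{energy_wh:.3f} Wh accumulated over {len(readings_watts)} seconds")
# Watts/kilowatts are the instantaneous values; Wh/kWh are those values
# accumulated over time. "kW/h" would describe how fast the power itself ramps.
```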
A typical wall outlet can only supply about 1800 W (1.8 kW), so there's no way it's drawing 40 kW (and kW/h is a nonsense unit in this context). If it's drawing 40 W that's actually quite low; a typical monitor is closer to 80-100 W while powered on.
Where I live electricity is about 10 cents/kWh (cheap, I know), so a 100 W monitor is costing me about a cent an hour. More than worth it imo, but you make your own decisions.
I game at lower resolution because a lot of modern games are too hyper-detailed for me and I get lost in the crisp information density. That, and I hate the sound of computer fans.
That's the crisp part. A lot of modern games seem to be obsessed with making every single pixel pop out at you. Rummaging around outside is not like that; it's softer. A real-world comparison would be something like malls, which are obsessed with making every inch of visible space distinctly pop. I also hate being in malls.
I've been replaying FFXIV recently, and I had the game running at the maximum refresh rate of my monitor, which was making the fans run harder. It took me way too long to realise that I should just go a setting down on the refresh rate instead of messing with fan speeds.
Undervolt it too. Depending on what GPU you have, you might drop power by an additional 20-40% without a performance hit. An older GTX 1070 I used to have dropped power consumption by 40%. The energy savings weren't that big, but it was nice and quiet.
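If you want an actual number for what the undervolt saves rather than going by fan noise, you can log the card's reported power draw before and after. A minimal sketch for NVIDIA cards, assuming the nvidia-ml-py package is installed (the sample count and interval are arbitrary):

```python
# Log average GPU power draw for ~30 seconds via NVML (NVIDIA cards only).
# Run once at stock and once with the undervolt applied, under the same load.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)    # first GPU in the system

samples = []
for _ in range(30):
    milliwatts = pynvml.nvmlDeviceGetPowerUsage(gpu)
    samples.append(milliwatts / 1000)
    time.sleep(1)

print(f"average draw: {sum(samples) / len(samples):.0f} W")
pynvml.nvmlShutdown()
```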