My 10+ year old GTX 780 would pull 300W at full tilt, and it has only a ridiculous fraction of the compute power. The Radeon 6990 would pull 400W+… high-end GPUs have been fairly power hungry for literally more than a decade.
GTX 780 released in 2013?
RTX 3090 was 350W?
RTX 4090 was 450W?
So if by decades you mean this generation… then sure.
Haha yeah I mistyped the years, it was supposed to be 10+ and not 20+… nevertheless these cards have been pulling at least 300-400W for the past 15 years.
The 8800 Ultra was 170 watts in ’06.
The GTX 280 was 230 watts in ’08.
The GTX 480/580 were 250 watts in 2010. But then we got the dual-GPU GTX 590, which more or less doubled that.
The 680 was a drop, but then they added the Tis/Titans and that brought us back up to high-TDP flagships.
These cards have always been high for the time, but that quickly became normalized. Remember when 95 watt CPUs were really high? Yeah, that’s a joke compared to modern CPUs. My laptop’s CPU draws 95 watts.
As it so happens, around a decade ago there was a period when they tried to make graphics cards more energy efficient rather than just more powerful, so for example the GTX 1050 Ti, which came out in 2016, had a TDP of 75W.
Of course, people had to make the “sacrifice” of not having 200 fps @ 4K in order to use a lower-TDP card.
(Curiously, your 300W GTX 780 had all of 21% more performance than my 75W GTX 1050 Ti.)
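To put that in performance-per-watt terms, here’s a rough back-of-the-envelope sketch in Python, taking the ~21% figure and the two power numbers above at face value (they’re nominal/anecdotal draws, not measured figures):

    # Rough perf-per-watt comparison using the numbers quoted in this thread
    cards = {
        "GTX 780":     {"relative_perf": 1.21, "power_w": 300},  # ~21% faster
        "GTX 1050 Ti": {"relative_perf": 1.00, "power_w": 75},
    }
    for name, c in cards.items():
        print(f"{name}: {c['relative_perf'] / c['power_w'] * 100:.2f} perf units per 100 W")
    # GTX 780:     0.40 perf units per 100 W
    # GTX 1050 Ti: 1.33 perf units per 100 W  (~3.3x the efficiency)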
Recently I upgraded my graphics card and again chose based on, among other things, TDP. My new one (whose model I don’t remember right now) has a TDP of 120W (I looked really hard and you can’t find anything decent with a 75W TDP). Of course, it will never give me the top-of-the-range gaming performance I could get from something 4x the price and power consumption (as it so happens it’s mostly Terraria at the moment, so “top of the range” graphics performance would be an incredible waste for it anyway).
When I was looking around for that upgrade there were lots of higher performance cards around the 250W TDP mark.
All this to say that people choosing 300W+ cards can only blame themselves for having deprioritized power consumption so much in their choice, often to the point of running a space heater with jet-engine-level noise from its cooling fans, in order to get an extra performance bump that they can’t actually notice in a blind test.
High-end GPUs are always pushed just past their peak efficiency. If you slightly underclock and undervolt them you can see some incredible performance per watt.
I have a 4090 that’s undervolted as low as it will go (0.875v on the core, more or less stock speeds) and it only draws about 250 watts while still providing 80%+ of the card’s stock performance. I had an undervolt at about 0.9 or 0.925v on the core with a slight overclock and got stock speeds at about 300 watts. Heavy RT will make power consumption spike closer to the 450 watt TDP, but that just puts me back at the same performance as running it stock, because the card was already downclocking to those speeds. About 70 of those 250 watts is my VRAM, so the core could scale a bit better if I found the right sweet spot.
My GTX 1080 before that was undervolted but left at maybe 5% less than stock clocks, and it went from 180W to 120W or less.
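The reason a modest voltage drop buys so much is that dynamic power scales roughly with voltage squared times frequency. A quick sketch of that rule of thumb (the stock voltage and clock below are assumptions for illustration, not measured figures for any particular card):

    # Rule-of-thumb dynamic power scaling: P ~ C * V^2 * f
    # Numbers below are assumed for illustration only.
    stock_v, stock_f_ghz = 1.05, 2.5    # assumed stock core voltage and clock
    uv_v,    uv_f_ghz    = 0.875, 2.5   # undervolted, roughly stock clocks
    ratio = (uv_v / stock_v) ** 2 * (uv_f_ghz / stock_f_ghz)
    print(f"Estimated core power vs stock: {ratio:.0%}")  # ~69%
    # Leakage and VRAM power (like the ~70 W mentioned above) don't follow this
    # curve, so whole-card savings end up smaller than the core-only estimate.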
That’s a good point and I need to start considering that option in any future upgrades.