It’s a bit strange to look around the NVIDIA market stalls and AMD hand-me-downs and see prices at MSRP or lower. Only six months ago I was writing about how these companies might very well keep the outrageous 2x/3x pricing on their GPUs (I really should just call the relevant hardware “graphics cards”) around long enough that it became the new norm, but, thankfully, you can now pick up an RTX 3070 for under 500 USD. Half a grand for some extra frames in Elden Ring is a dream come true, I know, you don’t need to tell me.
It’s a bit surreal to see normal pricing in what has been a market starved of reasonable price-to-performance ratios since the COVID pandemic. For a sense of scale, the computer I’m writing this article on was an emergency upgrade after my last PC began to show its age (I was running an FX 8350 on that puppy). For 1,300 USD, I picked up a Ryzen 5 3600 paired with a GTX 1660 Super and 16GB of RAM. At the time, this was the best I could do, as the year was not 2022. Through and through, this PC rang true, and why I decided to throw in some poorly thought-out rhymes, I have no clue.
Suffice it to say that the performance I got for the price I paid makes me cringe today. For that same 1,300 USD, I could now pick up a Ryzen 7 5800X3D and at least an RTX 3060. My video editing would be miles smoother and my gaming experiences, well, smoother. The PC would be smoother. Smoothie-like.
The thing is, I don’t regret the purchase at all: it was much needed. All the same, I do stare at the market today and swallow my chagrin as I imagine all of the folks picking up quality builds for far less than I paid just a couple of years ago. All of that is beside the ultimate point of this article: I believe that GPU improvements are becoming obsolete to the average PC user.
The Improvement in Performance Doesn’t Outweigh the Increased Cost of Power

Even if you have zero concern for the limited amount of power humanity has access to, or for the environmental effects of all that power usage, the improvements being made to the GPU today are both enthralling on a technical level and worrying for those who don’t want their water cooling setup to consist of a full outdoor pool in the dead of winter. That is to say, the amount of power the RTX 4090 pulls under standard use is so high (its rated TDP sits at 450 watts) that the dissipated heat calls for increasingly powerful cooling solutions, which raises the overall power consumption to boot. If the roughly 1,600 USD price for that GPU alone wasn’t enough to deter you, the electricity bill and the extra costs of an elaborate water cooling solution probably will be.
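If you want to put a rough number on that electricity bill, here’s a back-of-envelope sketch. The wattage, daily hours, and electricity rate below are all assumptions I picked for illustration, not measurements; swap in your own figures.

```python
# Back-of-envelope GPU electricity cost estimate.
# All figures below are illustrative assumptions, not measurements.

GPU_DRAW_WATTS = 450     # RTX 4090's rated TDP; real draw varies by load
HOURS_PER_DAY = 4        # assumed daily gaming/rendering time
RATE_USD_PER_KWH = 0.15  # assumed electricity rate; yours will differ

def yearly_cost(watts: float, hours_per_day: float, rate: float) -> float:
    """Estimate the yearly electricity cost of a single component."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * rate

if __name__ == "__main__":
    cost = yearly_cost(GPU_DRAW_WATTS, HOURS_PER_DAY, RATE_USD_PER_KWH)
    print(f"Estimated GPU-only electricity cost: {cost:.2f} USD/year")
    # ~98.55 USD/year under these assumptions, before any cooling overhead
```

That’s roughly a hundred bucks a year for the card alone under these assumptions, and it climbs fast if you game more hours, pay city electricity rates, or add pumps and radiators on top.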
This is to say nothing of the actual use cases of these new and improved graphics cards. I mean, sure, the 3000 series cards were objectively great to have upon release, and every PC in existence benefited from having one. That said, since the 4000 series released, I haven’t been able to think of any game, AAA or otherwise, where someone playing could stop and think, “Damn it! I only have a 3080 GPU!”
I know this is the old-timer’s trap of believing that PCs will never need more memory or performance beyond what they already have, but honestly, I don’t see the need for performance-per-watt gains as sharp as the ones we’re seeing today. Most people think that the improvements of today allow developers to accomplish more tomorrow, but the evidence I have supports the idea that the improvements of today allow developers to act a tad lazier, forever. Why optimize your game properly when your player base’s hardware will pick up the slack for you, right? Looking at you, Elden Bling.
I realize that none of this matters to the science community or to the crypto miners who don’t realize they’re throwing cash away on electricity bills, as both groups still benefit from solving increasingly complicated problems faster and faster. But for gamers? For average users? The GPU improvements we’ll see over the next decade are likely to be way, way more than the vast majority of the market requires. While today you see budget users running graphics cards from one or two generations back, I predict a future where that gap widens to at least a few generations.
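And if you want to sanity-check the miner math yourself, the same kind of napkin sketch works. Every number here (power draw, rate, daily payout) is a made-up placeholder purely for illustration; plug in current figures before trusting the sign of the output.

```python
# Toy miner break-even check. Every number here is an assumed placeholder,
# not real market data -- substitute current figures before trusting it.

POWER_WATTS = 300           # assumed GPU draw while mining
RATE_USD_PER_KWH = 0.15     # assumed electricity rate
REVENUE_USD_PER_DAY = 0.90  # assumed gross payout per GPU per day

def daily_profit(watts: float, rate: float, revenue: float) -> float:
    """Gross daily revenue minus the electricity it took to earn it."""
    power_cost = watts / 1000 * 24 * rate
    return revenue - power_cost

if __name__ == "__main__":
    profit = daily_profit(POWER_WATTS, RATE_USD_PER_KWH, REVENUE_USD_PER_DAY)
    print(f"Net per GPU per day: {profit:+.2f} USD")
    # 0.90 - (0.3 * 24 * 0.15) = -0.18 USD/day: a net loss, as claimed above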
This means that new releases will start to be stricter on unit counts, with a slower trickle of product into the public sphere to maintain a constant cash flow. In some sense, new GPU releases will be like server CPU releases in that they are meant for a niche audience that needs them for something other than personal use. This is all speculation, obviously, but speculation rooted in the likelihood that the energy costs and the headache of building custom cooling solutions are more than PC users will bother with once every couple of years.
Cheers and fears, may the earth burn so that my stream can hit 4K and, no, I won’t be editing this article as I’m too busy enjoying League’s preseason (Gwen is sleeper OP).
GLHF,
-E