Some Thoughts on the GPU Market

It’s a bit strange to look around the NVIDIA market stalls and AMD hand-me-downs and see prices at MSRP or lower. Only six months ago, I was writing about how these companies might very well keep the outrageous 2x/3x pricing on their GPUs (I really should just call the relevant hardware “graphics cards”) around so long that it became the new norm. Thankfully, you can now pick up an RTX 3070 for under 500 USD. Half a grand for some extra frames in Elden Ring is a dream come true, I know, you don’t need to tell me.

It’s a bit surreal to see normal pricing in what has been a market starved of reasonable price-to-performance ratios since the Covid pandemic. For some scale, the computer I’m writing this article on was an emergency upgrade after my last PC began to show its age (I was running an FX 8350 on that puppy). For 1,300 USD, I picked up a Ryzen 3600 paired with a GTX 1660 Super and 16GB of RAM. At the time, this was the best I could do, as the year was not 2022. Through and through, this PC rang true, and why I decided to throw in some poorly thought-out rhymes, I have no clue.

Suffice it to say that the performance I got for the price makes me cringe today. For that same 1,300 USD, I could now pick up a Ryzen 5800X3D and at least an RTX 3060. My video editing would be miles smoother and my gaming experiences, well, smoother. The PC would be smoother. Smoothie-like.

The thing is, I don’t regret the purchase at all: it was much needed. All the same, I do stare at the market today and swallow my chagrin as I imagine all of the folks picking up quality builds for far less than I paid just a couple of years ago. All of that is beside the ultimate point of this article: I believe that GPU improvements are becoming irrelevant to the average PC user.

The Improvement in Performance Doesn’t Outweigh the Increased Cost of Power

This thing has a 450W TDP. GGWP, Earth.

Even if you have zero concern for the limited amount of power humanity has access to, or for the environmental effect of that power usage, the improvements being made to GPUs today are enthralling on a technical level but worrying for those who don’t want their water cooling setup to consist of a full outdoor pool in the dead of winter. That is to say, the amount of power the RTX 4090 pulls under standard use is so high that the dissipated heat calls for increasingly powerful cooling solutions, which raises overall power consumption to boot. If the 1,500 USD price for that GPU alone wasn’t enough to deter you, the electricity bill and the extra cost of an expensive water cooling solution probably will be.
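To put that 450W figure in perspective, here’s a back-of-envelope sketch in Python. The 450W number is the card’s rated TDP from above; the four-hours-a-day gaming habit and the 0.15 USD/kWh electricity rate are illustrative assumptions on my part, not measurements, so plug in your own numbers.

```python
# Back-of-envelope: yearly electricity cost of running a GPU at a given draw.
# 450 W is the RTX 4090's rated TDP; hours/day and $/kWh are assumed values.

def annual_cost_usd(watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

cost = annual_cost_usd(watts=450, hours_per_day=4, usd_per_kwh=0.15)
print(f"${cost:.2f} per year")  # → $98.55 per year
```

And that’s just the card itself at full tilt, before the beefier cooling and the rest of the system join the party.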

This is to say nothing of the actual use cases of these new and improved graphics cards. Sure, the 3000 series cards were objectively great to have upon release, and every PC in existence benefited from having one. That said, since the 4000 series released, I haven’t been able to think of any game, AAA or otherwise, where a player could stop and think, “Damn it! I only have a 3080!”

I know this is the old-timer’s trap of believing that PCs will never need more memory or performance than they already have, but honestly, I don’t see the need for performance-to-power improvements as sharp as today’s. Most people think the improvements of today let developers accomplish more tomorrow, but the evidence I’ve seen supports the idea that the improvements of today let developers act a tad lazier, forever. Why optimize your game properly when your player base’s hardware picks up the slack for you, right? Looking at you, Elden Bling.

I realize that none of this matters to the science community, or to the crypto miners who don’t realize they’re throwing cash away on electricity bills, as they still benefit from being able to solve more complicated problems faster and faster. But for gamers? For average users? The GPU improvements we’ll see over the next decade are likely to be way, way more than the vast majority of the market requires. While today you see budget users running graphics hardware from one or two generations past, I predict a future where that gap widens to at least a few generations.

This means that new releases will start to be stricter in quantity, with a slower trickle of product into the public sphere to maintain a constant cash flow. In some sense, new GPU releases will come to resemble server-grade CPU releases in that they’re meant for a niche audience that needs them for something other than personal use. This is all speculation, obviously, but speculation rooted in the likelihood that the energy costs and the headache of building custom cooling solutions are too much for PC users to bother with every couple of years.

Cheers and fears, may the earth burn so that my stream can hit 4K and, no, I won’t be editing this article, as I’m too busy enjoying League’s preseason (Gwen is sleeper OP).


I Want Intel’s ARC GPUs to Succeed, but My Hopes Aren’t High

When I first got into PC gaming, AMD was basically the only company I really cared about. Before I even knew what a dedicated GPU was, I learned that, with AMD, all I needed was a single AM4 socket motherboard and one of their cheap APUs and I’d be set with an entry-level gaming rig that would satisfy a newbie PC gamer like myself.

And it did. Hell, I don’t even think I was running AM4 at that time. Back then, I was using an A10-series APU. And boy, did I use that fuckin’ thing to its maximum. CS:GO? League? Sure. But what if I told you I would take that puppy, fire up ARMA 2’s DayZ mod, and happily play at 25 FPS at 720p? Those were the days, I’ll tell you h-what.

These days, I can’t claim to be the most spoiled person in the world when it comes to performance. Despite having a stronger knowledge of hardware, I haven’t actually cared to spend the money on any of the latest tech. I run a 1660 Super with an R5 3600 and, as far as I’m concerned, that’s all I need. 4K gaming? No thanks, not needed.

That said, I can make note of the direction hardware is heading even without direct experience with the stuff, and I can say with even more confidence that Intel’s ARC lineup of GPUs is, at best, going to act as a 2.5-billion-dollar graceless buffer between the dedicated GPU market and Intel’s ironing out the seams of its new product.

Image via Intel

NVIDIA and AMD, who have both dug out respectable shares of the dedicated GPU market, have a new graphics card generation (supposedly) coming out in Q4 of this year. These newer cards are going to outdo the current circulation of cards by a considerable, if unverified, margin. NVIDIA’s 40 series GPUs are being produced on TSMC’s 4nm fabrication process and are expected to land at a 300-400 dollar price point for the 4060 model. In today’s age, that performance is going to be more than enough for AAA gaming at 1080p (and likely 4K), and it’s going to do it at mid-range prices the likes of which we are only just starting to see again after the Covid-19-related silicon shortage. Meanwhile, we have yet to get a confirmed release date for the ARC graphics cards outside of a Korean laptop launch, which played host to a plague of driver issues that have more or less dampened the hype surrounding the cards altogether.

When said cards do get a global release, they’re only just going to be able to compete with NVIDIA’s 30 series, as demonstrated by a somewhat barebones test released by Intel. This means that Intel will be a generation behind the competition, with a product that is only just strong enough to compete with one that has already been in circulation (and thus heavily discounted) for more than a year.

It’s a real shame, too, since having a third tried-and-true company to put the heat on AMD and NVIDIA in the graphics department would be a real win for consumers everywhere. Hell, it would probably be a huge win for the world as a whole when you consider the cementing of a stronger Intel fab process, cheaper market prices, and stronger consumer and server computing performance from all three companies to boot (assuming they can all keep pace).

The barrier between that distant, preferable reality and the one we may be in, which sees Intel eat its losses in full, is nothing short of a few tall cliffs that need climbing. In short, Intel’s ARC GPUs are something I desperately want to see stick in the marketplace and pave the way for a consistent showing against NVIDIA and AMD. To do that, they’re going to need to be cheaper than the already discounted RTX and RX cards, strong enough to justify passing on the new 40 and 7000 series cards, and free of any compatibility issues at global launch.

Basically, the odds are against Intel, but we’d all be better off if they could cause an upset.