Some Thoughts on the GPU Market

It’s a bit strange to look around the NVIDIA market stalls and AMD hand-me-downs and see prices at MSRP or lower. Only six months ago I was writing about how these companies might keep the outrageous 2x / 3x pricing on their GPUs (I really should just call the relevant hardware “graphics cards”) around long enough for it to become the new norm, but, thankfully, you can now pick up an RTX 3070 for under 500 USD. Half a grand for some extra frames in Elden Ring is a dream come true, I know, you don’t need to tell me.

It’s a bit surreal to see normal pricing in what has been a market starved of reasonable price-to-performance ratios since the Covid pandemic. For some scale, the computer I’m writing this article on was an emergency upgrade after my last PC began to show its age (I was running an FX-8350 on that puppy). For 1,300 USD, I picked up a Ryzen 3600 paired with a GTX 1660 Super and 16GB of RAM. At the time, this was the best I could do, as the year was not 2022. Through and through, this PC rang true, and why I decided to throw in some poorly thought-out rhymes, I have no clue.

Suffice it to say, the performance I got for that price makes me cringe today. For the same 1,300 USD, I could now pick up a Ryzen 5800X3D and at least an RTX 3060. My video editing would be miles smoother and my gaming experiences, well, smoother. The PC would be smoother. Smoothie-like.

The thing is, I don’t regret the purchase at all: it was much needed. All the same, I stare at the market today and swallow my chagrin as I imagine all of the folks picking up quality builds for far less than I paid just a couple of years ago. All of that is beside the ultimate point of this article: I believe that GPU improvements are becoming irrelevant to the average PC user.

The Improvement in Performance Doesn’t Outweigh the Increased Cost of Power

This thing has a 450W TDP. GGWP, Earth.

Even if you have zero concern for the limited amount of power humanity has access to, or for the environmental effects of that power usage, the improvements being made to GPUs today are enthralling on a technical level but worrying for those who don’t want their water cooling setup to consist of a full outdoor pool in the dead of winter. That is to say, the amount of power the RTX 4090 pulls under standard use is so high that the dissipated heat calls for increasingly powerful cooling solutions, which raises overall power consumption to boot. If the 1,500 USD price for the GPU alone wasn’t enough to deter you, the electricity bill and the extra cost of an expensive water cooling solution probably will be.
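To put a rough number on the electricity bill, here’s a back-of-the-envelope sketch of what feeding a 450W card costs over a year. The hours of play per day and the electricity rate are assumptions I’m making up for illustration, not measurements:

```python
# Back-of-the-envelope GPU electricity cost.
# Assumed numbers: 450 W draw under load, 4 hours of gaming per day,
# and a hypothetical rate of $0.15 per kWh.
def annual_power_cost(watts, hours_per_day, usd_per_kwh):
    # watts -> kW, times hours per day, times days per year
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

cost = annual_power_cost(450, 4, 0.15)
print(f"~${cost:.2f} per year")  # roughly $98.55 a year, before cooling
```

That’s just the card itself under those assumed numbers; the beefier cooling it demands only adds to the total.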

This is to say nothing of the actual use cases of these new and improved graphics cards. Sure, the 3000 series cards were objectively great to have on release, and every PC in existence benefited from having one. That said, since the 4000 series released, I haven’t been able to think of a single game, AAA or otherwise, where someone playing could stop and think, “Damn it! I only have a 3080!”

I know this is the old-timer’s trap of believing that PCs will never need more memory or performance than they already have, but honestly, I don’t see the need for the performance-to-power jumps to be as sharp as they are in today’s world. Most people think that the improvements of today let developers accomplish more tomorrow, but the evidence I have supports the idea that the improvements of today let developers act a tad lazier, forever. Why optimize your game properly when your player base’s hardware picks up the slack for you, right? Looking at you, Elden Bling.

I realize that none of this matters to the science community or to the crypto miners who don’t realize they’re throwing cash away on electricity bills, as they still benefit from solving ever more complicated problems faster and faster. But for gamers? For average users? The GPU improvements we’ll see over the next decade are likely to be way, way more than the vast majority of the market requires. While today you see budget users running graphics solutions from one or two generations back, I predict a future where that gap grows to at least a few generations.

This means that new releases will start to be tighter on product volume, with a slower release of product into the public sphere to maintain a constant cash flow. In some sense, new GPU releases will be like server-based CPU releases, in that they’ll be meant for a niche audience that needs them for something other than personal use. This is all speculation, obviously, but speculation rooted in the likelihood that the energy costs and the headache of building custom cooling solutions are too much for PC users to bother with every couple of years.

Cheers and fears, may the earth burn so that my stream can hit 4K and, no, I won’t be editing this article as I’m too busy enjoying League’s preseason (Gwen is sleeper OP).

GLHF,
-E

I Want Intel’s ARC GPUs to Succeed, but My Hopes Aren’t High

When I first got into PC gaming, AMD was basically the only company I really cared about. Before I even knew what a dedicated GPU was, I learned that, with AMD, all I needed was a single AM4 socket motherboard and one of their cheap APUs and I’d be set with an entry level gaming rig that would satisfy a newbie PC gamer like myself.

And it did. Hell, I don’t even think I was running AM4 at that time. Back then, I was using an A10-series APU. And boy, did I use that fuckin’ thing to its maximum. CS:GO? League? Sure. But what if I told you I would take that puppy, fire up ARMA 2’s DayZ mod, and happily play it at 25 FPS at 720p? Those were the days, I’ll tell you h-what.

These days, I can’t claim to be the most spoiled person in the world when it comes to performance. Despite having a stronger knowledge of hardware, I haven’t actually cared to spend the money on any of the latest tech. I run a 1660 Super with an R5 3600 and, as far as I’m concerned, that’s all I need. 4K gaming? No thanks, not needed.

That said, I can stand to take note of where hardware is heading even without direct experience with the stuff, and I can stand taller yet in understanding that Intel’s ARC lineup of GPUs is, at best, going to act as a 2.5-billion-dollar graceless buffer between the dedicated GPU market and Intel’s ironing out the seams of its new product.

Image via Intel

NVIDIA and AMD, who have both dug out respectable shares of the dedicated GPU market, have a new graphics card generation (supposedly) coming out in Q4 of this year. These newer cards are going to outdo the current circulation of cards by a considerable, unverified margin. NVIDIA’s 40 series GPUs are being produced on TSMC’s 4nm fabrication process and will retain a 300-400 dollar price point for the 4060 model. In today’s age, that performance is going to be more than enough for AAA gaming at 1080p (and likely 4K), and it’s going to do it at mid-range prices the likes of which we are only just starting to see again after the Covid-19-related silicon shortage. Meanwhile, we have yet to get a confirmed release date for the ARC graphics cards outside of a Korean laptop launch, which played host to a plague of driver issues that have more or less dampened the hype surrounding the cards altogether.

When said cards do get a global release, they’re only just going to be able to compete with NVIDIA’s 30 series, as demonstrated by a somewhat barebones test released by Intel. This means that Intel will be a generation behind the competition, with a product only just strong enough to compete with one that’s already been in circulation (and thus heavily discounted) for more than a year.

It’s a real shame, too, since having a third tried-and-true company to put the heat on AMD and NVIDIA in the graphics department would be a real win for consumers everywhere. Hell, it would probably be a huge win for the world as a whole when you consider the cementing of a stronger Intel fab process, cheaper market prices, and stronger consumer and server-based computing performance from all three companies to boot (assuming they can all keep pace).

The barrier between that distant, preferable reality and the one we may be in, which sees Intel eat its losses in full, is nothing short of a few tall cliffs that need climbing. In short, Intel’s ARC GPUs are something I desperately want to stick in the marketplace and pave the way for a consistent showing against NVIDIA and AMD. To do that, they’re going to need to be cheaper than the already discounted RTX and RX cards, strong enough to justify passing up the new 40 and 7000 series cards, and free of any compatibility issues at global launch.

Basically, the odds are against Intel, but we’d all be better off if they could cause an upset.

GLHF,
-E

What Role Does an APU Play During a Silicon Chip Shortage?

Covid gave the world a good smacking; the world responded by shutting itself indoors and purchasing a ton of computing power. Thus began the silicon shortage, or so the internet would have you believe.

I’m not here to argue over the cause of the world’s current chip shortage. Scalpers, Covid, the growing popularity of computer chips in automobiles, an increase in online education, et cetera, et cetera, and scalpers again, I’m sure, all had a hand in pushing the prices of graphics processing units to where they are today. But that’s all completely irrelevant to the point of this article.

Today, let’s talk about the APU, and how it can be the saving grace for people who want or need to build working computers for reasonable prices.

What’s an APU?

An APU, or accelerated processing unit, is the term for a CPU (central processing unit) with integrated graphics. In short, an APU is a CPU with the GPU built right into it. Simple, right? Good.

APUs retain the same quality as their CPU counterparts but never stray into the realm of ‘high performance’, because the GPU power of integrated graphics is somewhat limited. That is to say, an APU isn’t going to go all-out on CPU performance because it would be bottlenecked by the integrated graphics available to it.

Right now, the best integrated graphics on the market is Vega 11, which comes with AMD’s best Ryzen APUs. If that sounds complicated or you have no clue what these words mean, worry not. Vega 11 is just the name of an integrated graphics solution, and Ryzen is just the name of a series of CPUs and APUs. You don’t need to know what they mean; you just need to know that Vega 11 is the latest and greatest in integrated graphics, and that, compared to a standalone GPU’s performance, it isn’t anything to write home about.

This leads back full circle. If Vega 11, the best integrated graphics available, isn’t too amazing (not bad, but not great), then AMD can’t make high-end APUs, because an amazing CPU wouldn’t pair well with a mediocre GPU. It’s like taking a nice salmon fillet and covering it in ketchup. The fuck you doin’?

This means that when you look for an APU, you’re looking for, at best, a mid-tier piece of computer hardware. You’re not trying to stream at 4K, you’re not trying to run video editing software or 3D-model a new house for a client, and you’re not trying to build a high-end PC. You’re mediocre, and your hardware is mediocre. And that’s usually fine.

Why “Usually”?

I say usually because in a usual market, things are usual. But things are not usual, as I pointed out above. The market is wicked and wily, and standalone GPUs are priced at a premium, usually marked up to 300-400% of their normal value. Most PC parts are marked up in some fashion because of this, but the CPU / APU has taken the softest hit in that regard. So, then, where does that leave our fellow APUs? Not mediocre, but grandiose, hotman.

Let’s take a look at the going price for the Ryzen 5 3400G (a Ryzen APU of decent quality) and compare it to the equivalent standalone components you’d pay for if you were either APU-phobic or looking to buy a pre-built PC.

Image via AMD

Vega 11 graphics, which the 3400G comes with, is roughly equivalent in performance to the GT 1030, a standalone graphics card released in 2017 that runs for about $130. The closest standalone CPU equivalent to the 3400G is the Ryzen 5 2600, which runs for about $180 at the time of writing. Collectively, that’s $310 for two components that give equal performance (give or take some minute differences here and there) to a product that costs $220. That’s just under $100 saved on a product that’s easier to install, can be ordered on its own, and provides nearly identical performance to its counterparts.
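The arithmetic here is simple enough to sanity-check in a few lines, using the snapshot prices quoted above (they’ll have drifted by the time you read this):

```python
# Price check: Ryzen 5 3400G vs. the closest standalone pair.
# Prices are the article's snapshot figures, not current ones.
gt_1030 = 130       # standalone GPU roughly matching Vega 11
ryzen_2600 = 180    # closest standalone CPU equivalent
ryzen_3400g = 220   # the APU itself

combo = gt_1030 + ryzen_2600      # cost of the standalone pair: $310
savings = combo - ryzen_3400g     # what the APU saves you: $90
discount = savings / combo        # savings as a fraction of the pair

print(f"Standalone combo: ${combo}")
print(f"Savings with the APU: ${savings} ({discount:.0%})")
```

That ~29% figure is where the “roughly 30% discount” I mention later comes from.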

There’s no one explanation for why this price difference exists, but I’d guess, with no research whatsoever, that the discrepancy between the APU and the CPU / GPU combo comes down to simple demand. Gamers often say that APUs aren’t great for performance, and that trickles down to a smaller population willing to buy APUs for their PCs, even in the face of a $100 tax.

The Role of the APU in Today’s Market?

It’s a cheap product that can give entry-level to mid-tier performance. It’s priced at a discount simply because gamers have a stigma around the term “APU”. It’s an APU, and it’s really that simple.

The role the APU plays today is the same one it played years ago: a cheap and effective CPU / GPU for the all-purpose PC. The only difference today is that it comes at a roughly 30% discount to its peers.

GLHF
-E