It’s Time to Upgrade to an SSD

Solid State Drives, or SSDs, are storage devices that are simply faster than Hard Disk Drives, or HDDs. The age-old experience of watching your computer slowly “die” isn’t a difficult problem to diagnose or fix, because it’s very often the product of an HDD sitting in the system where an SSD should be. Equipped with an SSD, a PC boots quicker, loads programs quicker, and doesn’t slow down as the drive ages (though the drive will still, inevitably, die).
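If you’d rather measure that gap than take my word for it, here’s a rough Python sketch that times a sequential read on whichever drive you point it at. The file paths are hypothetical, and the result only means much if the test file is larger than your RAM (otherwise the OS cache does the reading for you):

```python
# Rough sketch: time a large sequential read to compare drives yourself.
# Assumption: the test file is bigger than system RAM, so the OS cache can't hide the result.
import time

def read_speed_mb_s(path, chunk_size=1024 * 1024):
    """Return the average sequential read speed of `path` in MB/s."""
    total_bytes = 0
    start = time.perf_counter()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            total_bytes += len(chunk)
    elapsed = time.perf_counter() - start
    return total_bytes / (1024 * 1024) / elapsed

# Hypothetical paths: one large file per drive you want to test.
# print(f"HDD: {read_speed_mb_s('D:/big_test_file.bin'):.0f} MB/s")
# print(f"SSD: {read_speed_mb_s('C:/big_test_file.bin'):.0f} MB/s")
```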

Most people make the mistake of assuming that storage doesn’t affect gaming, but these days that couldn’t be further from the truth. Moment-to-moment fluidity in modern AAA titles is still largely defined by CPU and GPU performance, but the job of streaming in all the data a game needs in the background is being shoved, with increasing demand, onto the storage drives in your PC. And these days, HDDs just aren’t cutting it.

And this is no arbitrary fortune-telling. CDPR recently updated the minimum spec list for Cyberpunk 2077 to include an SSD. It isn’t an industry-wide standard quite yet, but games are trying, with increasing fervor, to oust loading screens from the experiences they provide, which means we’re headed toward a world where HDDs are relegated to redundant arrays of disks for mass storage and SSDs are used for everything else.

Another large AAA title, Starfield, also shipped with a minimum spec list imploring users to play the game from an SSD. It should come as no surprise to us, then, when every other developer chasing the stardust these two companies enjoy follows suit.

The HDD is soon to become a relic of the past for all but the most storage-hungry of professionals. Better get ahead of the game and upgrade to an SSD, any SSD, soon, or feel the burn of being left in the dust.

Sony Sees Reduction in PS5 Game Sales | A Boring Console

I’ve always been a bit pessimistic about the viability of buying a PS5 for the average gamer. With a $499 price tag for the standard, non-digital version and $70 AAA titles, the expense always seemed to hang just outside ‘justifiable’ range. Compared to a gaming PC the price might seem low, but once you factor in the cost of controllers and the fact that you have to pay for the right to play online, the gap between the two platforms closes quickly, and PC gaming, with its constant supply of discounts and free online play, actually becomes cheaper within a year or two.
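For the napkin-math crowd, here’s a rough sketch of how that plays out over two years. The console and game prices are the ones quoted above; everything else (the PC build, the controllers, the average discounted PC game) is an assumption of mine, so swap in your own figures:

```python
# Back-of-the-napkin cost comparison over two years.
# Console and game prices are from the article; the rest are illustrative assumptions.
YEARS = 2
GAMES_PER_YEAR = 5  # assumed purchase rate

ps5 = {
    "box": 499,              # standard disc edition
    "extra_controller": 70,  # assumed price for a spare DualSense
    "online_per_year": 60,   # PS Plus, base tier
    "avg_game_price": 70,    # full-price AAA releases
}
pc = {
    "box": 800,              # assumed budget gaming build
    "extra_controller": 50,  # assumed third-party pad
    "online_per_year": 0,    # online play is free
    "avg_game_price": 40,    # assumed average after typical PC sales
}

def total_cost(platform):
    return (platform["box"]
            + platform["extra_controller"]
            + platform["online_per_year"] * YEARS
            + platform["avg_game_price"] * GAMES_PER_YEAR * YEARS)

print(f"PS5 over {YEARS} years: ${total_cost(ps5)}")  # $1389
print(f"PC  over {YEARS} years: ${total_cost(pc)}")   # $1250
```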

Aside from the cost to the user, the number of games actually worth playing on the PS5 also struck me as a little thin to justify the purchase. If I were to pick one up, I’d grab Demon’s Souls, The Last of Us Part II, and something Spider-Man related, and then, most likely, never touch the console again. In short, these titles offer something exclusive to the PS5 that my PC can’t compete with by virtue of not having access to the games (and even that isn’t true anymore in the case of Spider-Man: Miles Morales). The overall trend in PS5 purchases seems to support my doubts about the console’s longevity.

Image via Sony

With data provided by SportsLens, one can see rather clearly that the fervor that met the PS5’s release is cooling quickly. In fiscal year 2020, Sony saw 338.9 million sales of game titles. Those numbers are split between physical copies and digital downloads, the latter making up roughly 70% of all sales, to nobody’s surprise. By FY 2022, however, that total had dropped to 264.2 million sales, a consistent downtrend in interest toward the console and its titles.
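To put that decline in perspective, the quick math on those two figures:

```python
# Quick check on the SportsLens figures quoted above.
fy2020_sales = 338.9e6  # game units sold, FY 2020
fy2022_sales = 264.2e6  # game units sold, FY 2022

drop = (fy2020_sales - fy2022_sales) / fy2020_sales
print(f"Decline from FY 2020 to FY 2022: {drop:.1%}")  # 22.0%
```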

I couldn’t prove to you, definitively, that the core issue behind these figures is the longevity of the console’s offerings, but consider the Nintendo Switch for a moment: a console that’s been supported for over six years, has access to a long line of exclusives such as Breath of the Wild, Smash Bros. Ultimate, and the newly released Tears of the Kingdom (just to name a few heavy hitters), can be bought as a handheld for only $200, and works on the go instead of being strictly tethered to a TV. These are attributes that the PS5, in the most generous of terms, can’t readily compete with.

For gamers who have been paying even half attention, the aforementioned Tears of the Kingdom alone would be reason enough to go for a Switch over a PS5. The console is cheaper, and while the game is stuck at the new normal of $70, the purchase serves as a gateway into a library of Nintendo games that are pretty much all worth your time by any reasonable standard of review. They are, by and large, games made to be played offline, with your friends and family, which means you aren’t heavily tempted to buy into the online subscription model at a whopping 20 bucks per year (vs. PlayStation’s $60 annually).

Looking at the difference in sales between exclusive titles and third-party ones, you can see exactly the kind of decision-making I outlined above: people buy the PS5 because it holds access to certain exclusives, then ditch it when offered the choice to play other titles on different consoles or on PC. Between FY 2021 and FY 2022, sales of Sony exclusives dropped only 2%, while third-party titles dropped a whopping 15%, suggesting that most people only purchase the console for the games they can’t play elsewhere.

It’s also inconvenient for Sony that, despite revamping their subscription model, they lost 600,000 PS Plus subscribers, which suggests players aren’t just holding off on buying games; they’re also holding off on using the console at all, or at the very least aren’t impressed with the online selection the PS5 offers.

As much as I love the PlayStation legacy and its one-of-a-kind experiences, exemplified by the longstanding relationship with Naughty Dog, I’m a bit embarrassed to admit that the delivery on the PS5 has seemed a tad underwhelming, especially with the likes of Nintendo breathing down Sony’s neck. Maybe the second half of this year can change that?

GLHF,
-E

Some Thoughts on the GPU Market

It’s a bit strange to look around the NVIDIA market stalls and the AMD hand-me-downs and see prices at MSRP or lower. Only six months ago I was writing about how these companies might very well keep the outrageous 2x/3x pricing on their GPUs (I really should just call the relevant hardware “graphics cards”) around long enough for it to become the new norm, but, thankfully, you can now pick up an RTX 3070 for under 500 USD. Half a grand for some extra frames in Elden Ring is a dream come true, I know, you don’t need to tell me.

It’s a bit surreal to see normal pricing on a market that has been starved of reasonable price-to-performance ratios since the Covid pandemic. For some scale, the computer I’m writing this article on was an emergency upgrade after my last PC began to show its age (I was running an FX 8350 on that puppy). For 1,300 USD, I picked up a Ryzen 3600 paired with a GTX 1660 Super and 16GB of RAM. At the time, mid-shortage, this was the best I could do; the year was not 2022. Through and through, this PC rang true, and why I decided to throw in some poorly thought-out rhymes, I have no clue.

Suffice it to say that the performance I got for the price makes me cringe today. For that same 1,300 USD, I could pick up a Ryzen 7 5800X3D and at least an RTX 3060. My video editing would be miles smoother and my gaming experiences, well, smoother. The PC would be smoother. Smoothie-like.

The thing is, I don’t regret the purchase at all: it was much needed. All the same, I stare at the market today and swallow my chagrin as I imagine all the folks picking up quality builds for far less than I paid just a couple of years ago. All of that is beside the ultimate point of this article: I believe GPU improvements are becoming irrelevant to the average PC user.

The Improvement of Performance Doesn’t Outweigh the Increased Cost of Power

This thing has a 450W TDP. GGWP, Earth.

Even if you have zero concern for the limited power humanity has access to, or the effect all that usage has on the environment, the improvements being made to GPUs today are enthralling on a technical level but worrying for anyone who doesn’t want their water cooling setup to consist of a full outdoor pool in the dead of winter. That is to say, the amount of power the RTX 4090 pulls under standard use is so high that the dissipated heat calls for increasingly powerful cooling solutions, which raises overall power consumption to boot. If the 1,500 USD price for the GPU alone wasn’t enough to deter you, the electricity bill and the extra cost of an expensive water cooling solution probably will be.
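For a ballpark on what that 450W habit adds to the bill, here’s a quick sketch. The daily hours of use and the electricity rate are assumptions of mine, so adjust to taste:

```python
# Rough annual electricity cost for a card pulling its full 450W TDP.
TDP_WATTS = 450
HOURS_PER_DAY = 4      # assumed daily gaming time
PRICE_PER_KWH = 0.15   # assumed rate in USD; check your own bill

kwh_per_year = TDP_WATTS / 1000 * HOURS_PER_DAY * 365
cost_per_year = kwh_per_year * PRICE_PER_KWH
print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.0f}/year")  # 657 kWh -> ~$99
# And that's before the extra draw from whatever cooling has to tame all that heat.
```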

This is to say nothing of the actual use cases for these new and improved graphics cards. Sure, the 3000 series cards were objectively great to have on release, and just about every PC in existence benefited from having one. But since the 4000 series released, I haven’t been able to think of a single game, AAA or otherwise, where someone playing could stop and think, “Damn it! I only have a 3080!”

I know this is the old-timer’s trap of believing that PCs will never need more memory or performance than they already have, but honestly, I don’t see the need for performance-to-power improvements as sharp as the ones we’re seeing today. Most people think the improvements of today let developers accomplish more tomorrow, but the evidence I have supports the idea that the improvements of today let developers act a tad lazier, forever. Why optimize your game properly when your player base’s hardware picks up the slack for you, right? Looking at you, Elden Bling.

I realize that none of this matters to the science community, or to the crypto miners who don’t realize they’re throwing cash away on electricity bills, since they still benefit from having ever more complicated problems solved faster and faster. But for gamers? For average users? The GPU improvements we’ll see over the next decade are likely to be way, way more than the vast majority of the market requires. Where today budget users run graphics cards from one or two generations back, I predict a future where that gap widens to at least a few generations.

This means new releases will ship in stricter quantities, trickled out into the public sphere to maintain a constant cash flow. In some sense, new GPU launches will become like server CPU launches: meant for a niche audience that needs them for something other than personal use. This is all speculation, obviously, but speculation rooted in the likelihood that the energy costs, and the headache of building custom cooling solutions, are too much for PC users to bother with every couple of years.

Cheers and fears; may the earth burn so that my stream can hit 4K. And no, I won’t be editing this article, as I’m too busy enjoying League’s preseason (Gwen is sleeper OP).

GLHF,
-E

I Want Intel’s ARC GPUs to Succeed, But My Hopes Aren’t High

When I first got into PC gaming, AMD was basically the only company I cared about. Before I even knew what a dedicated GPU was, I learned that, with AMD, all I needed was a single AM4 socket motherboard and one of their cheap APUs, and I’d be set with an entry-level gaming rig that would satisfy a newbie PC gamer like myself.

And it did. Hell, I don’t even think I was running AM4 at the time. Back then, I was using an A-10 series APU. And boy, did I use that fuckin’ thing to its maximum. CS:GO? League? Sure. But what if I told you I would take that puppy and fire up ARMA 2’s DayZ mod and happily play it at 25 FPS at 720p? Those were the days, I’ll tell you h-what.

These days, I can’t claim to be the most spoiled person in the world when it comes to performance. Despite having a stronger knowledge of hardware, I haven’t actually cared to spend the money on any of the latest tech. I run a 1660 Super with an R5 3600 and, as far as I’m concerned, that’s all I need. 4K gaming? No thanks, not needed.

That said, I can still take note of where hardware is heading even without direct experience with the stuff, and I can say with more confidence yet that Intel’s ARC lineup of GPUs is, at best, going to act as a 2.5-billion-dollar, graceless buffer between the dedicated GPU market and Intel ironing out the seams of its new product.

Image via Intel

NVIDIA and AMD, who have both dug out respectable shares of the dedicated GPU market, have new graphics card generations (supposedly) coming in Q4 of this year. These newer cards are going to outdo the current crop by a considerable, if unverified, margin. NVIDIA’s 40 series GPUs are being produced on TSMC’s 4nm fabrication process, and the 4060 model is expected to retain a 300-400 dollar price point. In today’s age, that performance is going to be more than enough for AAA gaming at 1080p (and likely 4K), and it’s going to do it at mid-range prices the likes of which we’re only just starting to see again after the Covid-19 silicon shortage. Meanwhile, we have yet to get a confirmed release date for the ARC graphics cards outside of a Korean laptop launch, which played host to a plague of driver issues that have more or less dampened the hype surrounding the cards altogether.

When said cards do get a global release, they’re only just going to be able to compete with NVIDIA’s 30 series, as demonstrated by a somewhat barebones test released by Intel. That means Intel will be a generation behind the competition, with a product only just strong enough to match cards that have already been in circulation (and thus heavily discounted) for more than a year.

It’s a real shame, too, since having a third tried-and-true company putting heat on AMD and NVIDIA in the graphics department would be a real win for consumers everywhere. Hell, it would probably be a huge win for the world as a whole when you consider the cementing of a stronger Intel fab process, cheaper market prices, and stronger consumer and server computing performance from all three companies to boot (assuming they can all keep pace).

The barrier between that distant, preferable reality and the one we may actually be in, where Intel eats its losses in full, is nothing short of a few tall cliffs that need climbing. In short, Intel’s ARC GPUs are something I desperately want to stick in the marketplace and pave the way for a consistent showing against NVIDIA and AMD. To do that, they’re going to need to be cheaper than the already discounted RTX and RX cards, strong enough to justify passing up the new 40 and 7000 series cards, and free of any compatibility issues at global launch.

Basically, the odds are against Intel, but we’d all be better off if they could cause an upset.

GLHF,
-E

What Role Does an APU Play During a Silicon Chip Shortage?

Covid gave the world a good smacking, and the world responded by shutting itself indoors and purchasing a ton of computing power. Thus began the silicon shortage, or so the internet would have you believe.

I’m not here to argue over the cause of the world’s current chip shortage. Scalpers, Covid, cars shipping with more computer chips, an increase in online education, et cetera, et cetera (and scalpers again, I’m sure) all had a hand in pushing the current prices of graphics processing units to where they are today. But that’s all completely irrelevant to the point of this article.

Today, let’s talk about the APU, and how it can be the saving grace for people who want or need to build working computers for reasonable prices.

What’s an APU?

An APU, or accelerated processing unit, is the term for a CPU (central processing unit) with integrated graphics. In short, an APU is a CPU with the GPU built right into it. Simple, right? Good.

APUs retain the same quality as their CPU counterparts, but never stray into the realm of ‘high performance,’ because the GPU power of integrated graphics is somewhat limited. That is to say, an APU isn’t going to go all-out on CPU performance, because it would be bottlenecked by the integrated graphics available to it.

Right now, the best integrated graphics on the market is Vega-11, which comes with AMD’s best Ryzen APUs. If that sounds complicated or you have no clue what these words mean, worry not. Vega-11 is just the name of an integrated graphics solution; Ryzen is just the name of a series of CPUs and APUs. You don’t need to know what they mean, you just need to know that Vega-11 is the latest and greatest in integrated graphics, and that compared to a standalone GPU’s performance, Vega-11 isn’t anything to write home about.

This brings us back full circle. If Vega-11, the best integrated graphics available, isn’t too amazing (not bad, but not great), then AMD can’t make high-end APUs, because an amazing CPU wouldn’t pair well with a mediocre GPU. It’s like taking a nice salmon fillet and covering it in ketchup: the fuck you doin’?

This means that when you shop for an APU, you’re looking for, at best, a mid-tier piece of computer hardware. You’re not trying to stream at 4K, you’re not trying to run video editing software or 3D model a new house for a client, and you’re not trying to build a high-end PC. You’re mediocre, and your hardware is mediocre. And that’s usually fine.

Why “Usually”?

I say usually because in a usual market, things are usual. But things are not usual, as I pointed out above. The market is wicked and wily, and standalone GPUs are priced at a premium, often marked up to 300%-400% of their normal value. Most PC parts are marked up in some fashion because of this, but the CPU / APU has taken the softest hit in that regard. So, then, where does that leave our fellow APUs? Not mediocre, but grandiose, hotman.

Let’s take a look at the going price for the Ryzen 5 3400G (a Ryzen APU of decent quality) and compare it to the equivalent standalone components you’d pay for if you were either APU-phobic or looking to buy a pre-built PC.

Image via AMD

Vega-11 graphics, which the 3400G comes with, is roughly equivalent in performance to the GT 1030, a standalone graphics card released in 2017 that runs for about $130. The closest standalone CPU you can get to the 3400G is the Ryzen 5 2600, which runs for about $180 at the time of writing. Collectively, that’s $310 for two components that give equal performance (give or take some minute differences here and there) to a product that costs $220. That’s just under $100 off for a product that’s easier to install, can be ordered on its own, and provides nearly identical performance to its standalone counterparts.
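Here’s the quick math behind that comparison, using the prices quoted above:

```python
# Price gap between the APU and the equivalent standalone combo.
apu_price = 220   # Ryzen 5 3400G
gpu_price = 130   # GT 1030, the closest match to Vega-11
cpu_price = 180   # Ryzen 5 2600, the closest standalone CPU

combo = gpu_price + cpu_price
savings = combo - apu_price
print(f"Standalone combo: ${combo}")                    # $310
print(f"Savings with the APU: ${savings}")              # $90
print(f"Discount vs. the combo: {savings / combo:.0%}") # 29%
```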

There’s no single explanation for why this price difference exists, but I’d guess, with no research whatsoever, that the discrepancy between the APU and the CPU / GPU combo comes down to simple demand. Gamers often say that APUs aren’t great for performance, and that trickles down to a smaller population willing to put an APU in their PC, even when avoiding one costs an extra $100 or so.

The Role of the APU in Today’s Market?

It’s a cheap product that can deliver entry-level to mid-tier performance. It’s priced at a discount simply because gamers attach a stigma to the term “APU”. It’s an APU, and it’s really that simple.

The role the APU plays today is the same one it played years ago: a cheap and effective CPU / GPU for the all-purpose PC. The only difference today is that it comes at roughly a 30% discount compared to its standalone peers.

GLHF
-E