I spend most of my time, most of my days, in front of my computer. It's where I write, game, watch television, check the market, and on and on goes the list of things you can do with an internet connection and some hardware. Something I've noticed lately, though, is that I don't actually use my time very effectively when I should be, and I've begun to figure out that this has a lot to do with how I perceive the value of the odd hour spent in front of my PC.
To explain a bit: as opposed to the value of an hour as I see it when I'm, say, exercising, an hour that passes when I'm writing feels the same as an hour spent playing League. And the hour I spend watching YouTube guides on how to properly cook 50 cents' worth of ramen blurs in with the hour I spend editing a video. Clearly, these things shouldn't be weighted equally. An hour of good, meaningful work will feel, should feel, more rewarding than five hours of pinging for a gank.
And yet those hours, as long as I do a little bit of all those things, tend to mix and dilute their importance with each other. If I write articles for thirty minutes, send an email, and then play Dead by Daylight for the next 10 hours, I know that I didn't spend my time wisely. But I still feel like I got a lot done, because I don't remember playing games for 10 hours and doing actual work for 1. I remember doing all of them over the course of 11 hours, and that's, more or less, good enough to lull my brain into being okay with how terrible a habit that is.
And here's where I'd like to introduce my topic for today: consoles. Specifically, console gaming. These days, most people understand that PC gaming is pretty much the best way to play video games, as long as you're prepared to do a little research on the hardware you want beforehand. The access to games is unmatched by any of the major consoles, playing online comes free with your ISP, and the performance ceiling at which those games can run is far higher than any console can reasonably compete with. But all of that, I think, pales in comparison, for the average person, to the benefits of having a console at the ready.
Ease of access (save for this PS5 debacle going on), exclusives, and the ability to play local multiplayer with friends and family without needing two 600-dollar devices are some of the more commonly cited reasons to own a console today. Setting those aside, I think there's one big reason why anyone who likes video games should own at least one major console in their living space: detachment.
Having an area, a time, and (crucially) a single, separate device to sit down and play a game on is imperative for someone who is often at home, because it keeps them aware of exactly what they're spending their time on and how. The difference between messing around on a device that's meant for gaming and messing around on the device you do your work on is perception. On a work machine, the two activities begin to blur, and that ultimately hurts the user in the long run, even if theoretically he or she could have taken steps to prevent it. It's unavoidable, like having a TV on in front of your bed every night and soon being unable to sleep without watching some TV first.
Hell, even if you don't do your work over a computer, simply having something designated for the sole purpose of entertainment helps keep the mind concentrated on what it's supposed to be doing whenever you're not using that device. It's a point worth considering if you're on the fence about buying one of the new-generation consoles, if you are, indeed, able to buy one within the next decade.