Friday, December 21, 2012

Is PC gaming truly dead?

An interesting discussion was brought up the other day on the NeoGAF forums. The initial question asked was "Will next-gen console hardware be more powerful than current PCs?"

My initial reaction was simple and straightforward.

Then, as I read through the thread, someone brought up a good point, one that I hadn't considered: how long has it been since we've actually seen a PC game? I mean, a real PC game? Think about it: just about everything released these days is a console port, or at least built with consoles in mind. The result has skewed the mindset of PC gamers into something different from what it used to be.

The crux of the argument is that PC gamers have gotten used to running everything at absolute maximum settings and getting stellar performance out of it. I can load up the latest big game and play it at a higher resolution than the consoles, looking a lot better, and at a much higher framerate. The problem is that, because this generation has been filled with console games ported to the PC's much more powerful hardware, this has become the norm. All games run at 1080p60 (1920x1080 Full-HD resolution, at 60 frames per second, the maximum that the average PC monitor can display).

The user in question reminded us that this wasn't always the case... maximum settings at good framerates used to be reserved for only the most expensive übercomputer, one that the average PC gamer (who spends a good amount of money on a system to begin with) could never hope to afford. That's the whole reason PC games have all those settings in the first place: scalability, the option to turn down whatever is hurting performance in order to get the game to run better. In a "true" PC game, running at maximum settings at a playable framerate was impossible anyway. The last "true" PC game we've seen is the original Crysis. To this day, the game won't run at 1080p60 even on my system (which is a bit of a beast). And that game is five years old.

It's not just the console ports, though. I do have some games that are PC-exclusive... Hard Reset comes to mind. Yes, it has all those options to turn down to improve performance. But, to be honest, I never had to... I could run the game out of the box at 1080p60 at maximum settings. For whatever reason, while it was designed for PC gamers from a gameplay and control standpoint, it was built more or less to console specs as far as graphics go. The game is very pretty, but it's not doing anything groundbreaking that couldn't be done on a console.

There are still many PC exclusives, of course. RTS games like StarCraft II, multiplayer shooters like PlanetSide 2, various complicated RPGs and MMOs that can only be played with a "controller" that has 104 buttons. But none of them are designed to push PCs to the limit anymore. None of them are pushing photorealistic graphics that will bring even the mightiest PC to its knees. Crytek claims that the upcoming Crysis 3 will do this, but it's based on the same engine as the previous game, which runs fine and is also scalable to consoles. While I do believe they plan on making the PC version noticeably better than its console brethren this time, I don't expect to have too much trouble running it (my PC meets the specs of their "high performance" recommendation). As far as "Yay, a real PC game!!" goes, I may wind up happy if I have to turn some of the settings down a little.

So think about it, fellow gamers with beastly machines... When was the last time you had to turn something down? And how much did it bug you to have to do it?

That said, I don't believe the PC is dead as a gaming platform, and that's not the point of all this. But we're just another gaming platform now: PlayStation, Xbox, PC. I'll continue playing that way, to be sure. But it certainly isn't as special as it used to be.