What's resolution got to do with it?
What’s your favourite game? Does it take you back to a certain time or place? Remind you of a person or a shared moment? Make you wish you could experience it all again for the first time?
Do you know what resolution it ran at? How about the frame rate? V-Sync? What type of anti-aliasing did it use? What engine did it run on?
These questions, and myriad more, are currently the bane of any AAA PR manager’s existence.
Gamers have traditionally been inquisitive and competitive animals, and in a medium that moves as fast as ours it’s a never-ending race to the top. PC gamers stand at the summit, able to tinker and upgrade to their hearts’ content, moving with the goalposts. Console gamers are by comparison inherently restricted: the hardware is a snapshot in time, one which in the past used to rival and occasionally exceed what the average Joe’s wallet could cobble together in PC parts. These days consoles are already lagging behind on day one, their ageing graphics cards and processors simply unable to keep up.
New consoles present developers with the challenge of working with this hardware before it’s even been released and, more importantly, before the tools and techniques have been established and refined. All the while they stand in the shadow of last generation’s victorious hardware, the fruits of which remain fresh in consumers’ minds.
But are these expectations unreasonable? By the end of last generation we had some truly spectacular work, with games like The Last of Us showcasing just what the hardware was capable of. Although comparing different architectures and calculating processing power in teraflops is rarely of any real-world use, it’s safe to say the new generation of consoles is significantly more powerful than its predecessors.
So shouldn't surpassing the benchmarks we have become accustomed to be a reasonable expectation? Surely, moving forward, it stands to reason that the much-vaunted standard of games displaying at 1080p while running at 60 frames per second must be within reach by now?
Before we look at whether it’s possible, it might be worth stopping to consider why it matters so much. Aside from wanting the highest-fidelity experience at all times, are these specifications intrinsically linked to our enjoyment of a game? Certainly a badly performing product will deliver a subpar experience, but when you need to resort to pixel counting and frame-rate measuring to discover these differences, are we perhaps taking the pursuit of perfection too far?
The beginning of a console cycle is typically consumed by this obsession. Consoles aren't cheap, and no one wants to make the wrong decision; researching which machine offers the best performance is simply the prudent thing to do. Numbers and statistics offer a tangible, relatable way to assess one product's merits over another. The problem is that video games aren't a solely quantifiable medium. You can only gather so much information from spec sheets and bullet points; there is no substitute for having a controller in your hand and playing the game yourself, and that interaction is direct and different for each player.
One thing we should all be able to agree on, though, is that good games are more than the sum of their parts, and the best games transcend their hardware entirely. At no point during BioShock’s opening sequence did I think the game could benefit from a resolution higher than 720p or more anti-aliasing; I was more worried about splicers caving my head in as I descended into Rapture, and I wouldn't have it any other way. I truly pity the player who is perpetually in search of the perfect video game, constantly haunted by every minor flaw or inadequacy. It doesn't sound like a fun way to experience a game at all.
Admittedly I deserted the PC graphics battle years ago and have come to accept my place as a console gamer who simply can’t demand the same experience from underpowered hardware. However, that rational approach to the reality of console gaming is obscured by the warring corporations’ desire to present their hardware as the best and only choice for enjoying gaming this generation. That campaign has had the unfortunate side effect of inflating expectations to such a degree that when developers fail to hit marquee features like 1080p or 60 frames per second, many are left disappointed.
Neither of those bullet points is unattainable: Ridge Racer 7 hit both of them back on the PS3, as do Knack and Forza Motorsport 5 on the PS4 and Xbox One respectively. So what exactly is Watch Dogs’ problem? 900p on the PS4 and 792p on the Xbox One? Why so low?
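To put those numbers in perspective, a quick back-of-the-envelope calculation shows how the pixel counts actually compare. This is a minimal sketch, assuming standard 16:9 frames (so 900p means 1600×900 and 792p means 1408×792, the figures widely reported for Watch Dogs):

```python
def pixels(height):
    """Total pixels in a 16:9 frame with the given vertical resolution."""
    width = height * 16 // 9
    return width * height

for label, h in [("1080p", 1080), ("900p", 900), ("792p", 792)]:
    total = pixels(h)
    share = total / pixels(1080) * 100
    print(f"{label}: {total:,} pixels ({share:.0f}% of 1080p)")
# 1080p: 2,073,600 pixels (100% of 1080p)
# 900p:  1,440,000 pixels (69% of 1080p)
# 792p:  1,115,136 pixels (54% of 1080p)
```

In other words, the Xbox One version is pushing roughly half the pixels of a full 1080p image every frame, which goes some way to explaining why those labels carry so much marketing weight.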
“Resolution is a number, just like framerate is a number. All those numbers are valid aspects of making games,” says Watch Dogs creative director Jonathan Morin.
“People tend to look at corridor shooters, for example, where there's a corridor and all the effects are on and it's unbelievable, and they forget that if you apply those same global effects to an open city with people around and potential car crashes and guys in multiplayer showing up without warning, the same effect is applied to a lot of dynamic elements that are happening in every frame.”
In short, it’s complicated.
Although a high resolution and frame rate would be nice, and in some cases they are mandatory to the experience (think racing, fighting and twitch-based shooters), most of the time they aren't worth sacrificing aspects of gameplay that might make the game more enjoyable. If there is one criticism I would level at this generation so far, it would be that the games are yet to feel different; I'm not experiencing the new or different gameplay which the power afforded by the new boxes promised to deliver.
Sure, I like it when a game looks great. Infamous Second Son is slick and breathtaking, and on the horizon games like The Division never fail to make my eyes water. But as we know, graphics, or at least our perception of them, are fickle. Long after we find ourselves looking down our noses at some of these shiny future titles, they will hopefully remain in our hearts thanks to the gameplay, characters and experiences they delivered.
Characters that remember the player's actions and react accordingly, as promised in Shadow of Mordor; improved offline AI opponents created from friends' play data, à la Forza 5; deep, intricate worlds full of unique AI and interconnected gameplay systems, like Watch Dogs. These are the things that really excite me as a gamer, and this is just the beginning. Now that the graphical leap at the advent of each new generation has peaked, we have started to see excellent advances in the other areas that make games great. The indie revival seems to be a direct manifestation of gameplay over graphics, yet still the only lines in the sand that matter are a set of arbitrary numbers which, in reality, tell you next to nothing about the actual game underneath.
The cynic in me resigns himself to the fact that as Moore's law follows its never-ending trajectory, we are destined to be forever distracted by the quest for bigger and brighter, never settling for simply better. On the other hand, the gaming industry is relatively young and most likely has many renaissances ahead, ones which will affect not only the way we perceive the medium but also the standards by which we quantify our enjoyment.
In the meantime I'll just shift my chair back the six inches needed to offset the difference in fidelity from 900p down to 792p. That'll do.