There’s been a lot of debate lately about graphics and resolution, particularly as games like Assassin’s Creed: Unity and The Evil Within ship locked at 30 FPS, accompanied by statements such as “it’s more cinematic,” or with the resolution capped at 900p due to console limitations.
For more or less the first time in memory, game creators are asking people to stop worrying so much about graphics. For years, publishers and developers have sold games on ever-increasing resolution, graphical fidelity, and other visual bells and whistles… and now that they seem to have hit a wall, they are coming up with various reasons to say stop worrying about it.
Graphics and FPS in Modern Gaming - The Root of the Issue
The issue here, especially with them now dismissing it as a nonissue, is that for years these same companies pushed graphics so far and so hard. There was no end of talk about next-generation graphics with each new console generation. Pushing the boundaries was what they sold to the public: you needed the highest and best graphics to get the utmost gaming experience.
While there are numerous examples of games that sold heavily on graphics – and perhaps better examples, since it also had good gameplay – Crysis is one game that may always be linked to graphics. Crytek has prided itself on pushing graphical boundaries as far as anyone over the years, and Crysis, in fact, had trouble selling because of how hard it pushed graphics cards.
“We tried to make everything more cinematic and more realistic.” – Cevat Yerli, Crytek president and CEO, in 2007
That statement reflects the way many games were marketed during this era. Crysis was the bleeding edge in many ways; its development went hand in hand with the new DirectX 10 and Windows Vista, which meant that when it was released, you practically had to buy a whole new computer to get the best performance out of it. Each console generation was sold similarly, and demos were often about wowing the audience with graphics.
Today, between the cost of consoles and PCs being more powerful than ever, it seems a snag has been hit. On the question of whether 1080p and 60 FPS should be a target, Ubisoft – which now develops Far Cry – has some different views going forward. The creative director for Far Cry 4, Alex Hutchinson, said: “It’s certainly not something I care about in a game.” Far Cry 4 is set to run at 1080p and 30 FPS.
Hutchinson went on to say, "With the 4K TVs and things - somebody was telling me that with a 4K TV, to even see it, your living room has to be big enough to sit like 12 feet from the screen. I don't know the exact numbers, but it starts to get a little crazy. I'm just in it for the experience. I'll play a SNES game if it's cool."
That isn’t the only issue from Ubisoft lately: Assassin’s Creed: Unity has been in the news partly for locking at 900p and 30 FPS. Nicolas Guérin, their World Level Design Director, said the following on it: "At Ubisoft for a long time we wanted to push 60 fps. I don't think it was a good idea because you don't gain that much from 60 fps, and it doesn't look like the real thing. It's a bit like The Hobbit movie, it looked really weird.”
The Creative Director for Assassin’s Creed: Unity, Alex Amancio, added, "30 was our goal. It feels more cinematic. 60 is really good for a shooter, but action adventure, not so much. It actually feels better for people when it's at that 30fps. It also lets us push the limits of everything to the maximum.”
Graphics and FPS in Modern Gaming - Pushing Boundaries
For years they have pushed the boundaries and told everyone to upgrade their hardware to run the latest games – and now that the consoles can’t consistently deliver, they are coming up with excuses. In fact, the different Ubisoft studios even disagree somewhat: Alex Amancio’s quote, with 60 FPS being “really good for a shooter,” would seem to say Far Cry should run at 60 FPS… yet Alex Hutchinson is releasing it at 1080p and 30 FPS.
Nor is Ubisoft the only one hiding behind the ‘it’s more cinematic’ reasoning. Jason X – a senior producer at Bethesda – said they were targeting 30 FPS for The Evil Within because it made more sense for a survival horror game. On PC, Bethesda has since patched the game up to 60 FPS and officially supports it – but it took a lot of outcry for that to happen.
In a lot of ways, there are two different issues here that tie into the same point about graphics: the first is resolution, and the second is frames per second. The resolution backlash is, in my opinion, directly traceable to the years of pushing graphics and going on about how shiny they were. To say it doesn't matter now, after selling that way for so long, is inevitably going to bring some pushback. Publishers created a situation where the consumer constantly expects better and better graphics, and now they no longer seem able to deliver. In the long term, it may work out better as people focus on gameplay and art style, but it should surprise no one that people are upset about this.
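To put the resolution side of the complaint in rough numbers: 1080p (1920×1080) has about 44% more pixels to render each frame than 900p (1600×900), which is why dropping to 900p is such a tempting performance lever for developers. A quick back-of-the-envelope sketch (the function name is mine, purely illustrative):

```python
# Back-of-the-envelope pixel counts for common render resolutions.
def pixel_count(width, height):
    """Total pixels the GPU must shade per frame at this resolution."""
    return width * height

p900 = pixel_count(1600, 900)    # 1,440,000 pixels per frame
p1080 = pixel_count(1920, 1080)  # 2,073,600 pixels per frame

# 1080p means shading roughly 44% more pixels every frame than 900p.
extra_work = p1080 / p900 - 1.0
print(f"1080p renders {extra_work:.0%} more pixels than 900p")
```

That extra per-frame shading cost is the real trade-off hiding behind the ‘cinematic’ framing: the same GPU budget can go to more pixels or to more effects per pixel, but not both.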
As for 30 vs. 60 FPS, this is much harder to defend with the ‘cinematic’ reasoning. There’s a big difference here because movies are not interactive, so their frame rate doesn’t matter nearly as much; in a game, even if the human eye cannot catch every single frame, a higher frame rate makes for a more responsive experience. If you have to drop to 30 FPS for performance reasons on consoles, say that, and I’m pretty sure there’ll be more acceptance than with the strange cinema explanations. It will still get panned if you insist on 30 FPS on PC as well, because that says you aren’t putting effort into your port, but at least you’re being honest.
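The responsiveness gap is simple arithmetic: at 30 FPS a new frame arrives roughly every 33 ms, while at 60 FPS it is roughly every 17 ms, so the minimum delay between your input and seeing its result on screen is about twice as long at 30 FPS. A minimal sketch of that math (function name is mine, not from any engine):

```python
# Frame interval: how long each rendered frame stays on screen.
def frame_time_ms(fps):
    """Milliseconds between consecutive frames at a given frame rate."""
    return 1000.0 / fps

t30 = frame_time_ms(30)  # ~33.3 ms per frame
t60 = frame_time_ms(60)  # ~16.7 ms per frame

# The floor on input-to-display delay roughly doubles at 30 FPS.
print(f"30 FPS: {t30:.1f} ms/frame, 60 FPS: {t60:.1f} ms/frame")
```

This is only the lower bound; real input latency also includes engine, driver, and display delays, but the frame interval alone shows why interactivity and film are not comparable.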
So, in short, game developers made this bed by constantly selling on ever-improving graphics, and now that hardware gains are flattening out, some of them have to lie in it. My suggestion: be forthright about it, optimize your console versions, and when you port to PC, don’t try to sell people a bill of goods.