I know game console launch hype gets to people, but I’m still kinda shocked that the biggest forum meltdowns recently have been over, basically, a few hundred lines of pixels in a couple of games. I know a while back I said 1080p was kind of a big deal for consoles, but wow.
If you don’t know, there are rumors the Xbox One versions of Titanfall and Call of Duty: Ghosts will run at 720p, as opposed to native 1080p on the PS4 (which isn’t even 100 percent confirmed). I see people acting like this is gonna do serious damage to Xbox or something. Of course the meltdowns I’m seeing aren’t as big as those over Microsoft’s now defunct DRM policy for Xbox One, but I think they’re pretty big given how little is at stake.
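For scale, here’s the back-of-the-envelope math on the two resolutions in question (this is just arithmetic on the standard pixel dimensions, not anything from the rumors themselves):

```python
# Standard pixel dimensions for the two resolutions under discussion.
pixels_720p = 1280 * 720    # 921,600 pixels per frame
pixels_1080p = 1920 * 1080  # 2,073,600 pixels per frame

# 1080p draws 2.25x the pixels of 720p...
print(pixels_1080p / pixels_720p)  # 2.25

# ...but the vertical difference is only 360 lines.
print(1080 - 720)  # 360
```

So depending on how you count, the gap is either "a few hundred lines" or "more than double the pixels," which probably explains why both sides of the forum argument feel vindicated.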
In January I did a post that was basically about how much it sucked looking at 720p console games on modern TVs. That was, however, my personal preference, and probably that of a lot of hardcore gamers; it’s also probably why I’m gonna try to stick with PC for a while. I don’t actually think the differing resolutions between versions of a game are gonna make a big difference in sales. The more powerful console has never sold the most units.
Look, probably like 90 percent of the people who buy Call of Duty won’t even be able to tell the difference between the Xbox One and PS4 versions unless they’re sitting right next to each other. A lot of them probably already believe they’ve been playing all their current-gen games in 1080p. The only way you can indisputably determine a console game’s rendering resolution is with specific tools used by guys like DigitalFoundry. Everybody else just estimates based on aliasing.
Still, if any of this is true, along with the confirmation that Ryse will run at something like 900p, I do lament how console developers have essentially prioritized raw graphics over image quality. I would honestly say that IQ (that is, the overall clarity of the image) has nearly the same impact on visual fidelity as the number of polygons and lighting effects underneath it, if not more. On PC I’ve essentially had to pick and choose in some cases: turn the graphics settings up and deal with a slower framerate and a jagged image, or turn some settings down to get a clean image at a smooth framerate? I’ve made different choices with different games, and both are equally enjoyable in my opinion. I just wish console developers would realize this.
Maybe some of them do, which is why they’re boasting 1080p and 60fps on PS4. Guerrilla Games made the decision to have Killzone Shadow Fall’s multiplayer run at 60fps instead of the campaign’s 30, though that might just be because the multiplayer lacks enemy AI to burden the PS4’s CPU. Back at E3 I also noted how 343i made a point of announcing that the next Halo would run at 60fps on consoles.
Then again, it could just be a convenient selling point while developers are still basically porting current-gen games to next-gen hardware. What if they eventually just go back to pushing as many polygons and lights as possible at 720p and 30fps? If that doesn’t happen, though, the average consumer might become accustomed to native 1080p as the standard on consoles.
Whatever happens, when DigitalFoundry finally starts doing face-off articles between Xbox One and PS4 games, I will be reading the forum reactions with popcorn at the ready.
- Tips on horror game design from the Amnesia guys. http://t.co/neT9Lo7uXV
- That same site also just released a prototype build of Gone Home (which is half-off right now) running on the Amnesia engine.