Is PS4 for more pixels while Xbox One is meant for less with some enhancements? Both?
Edit: By "Both?" in the title, I mean I wondered whether both consoles were meant for the same graphics style or not. I wanted to write "meant for more pixels" in the title instead of just "for more pixels", but doing so went over the character limit for thread titles.
____________________________________________________________________________________________________
We all know the benefits of a higher resolution, but that's not the entire point of this post, so I will focus on cinema-like effects such as those used in Ryse.
This is a theory I developed after reading four articles (linked below) about graphics. I posted more or less the same thing a few days ago, but it was removed for being off-topic in that particular thread, so I decided to create a post on the matter.
I said then that, given that the Xbox One can't match PS4 resolution in every game, developers could take a different approach on each console, because each could shine in its own way.
I'm not gonna bother with complex terms. Anyway, my theory is that developers could give PS4 games a resolution boost, while on the Xbox One they could work at a lower resolution and add extra effects:
High-quality depth of field, dynamic range, motion blur, new AA techniques and high levels of AF.
As forumers here have pointed out before, the question is: couldn't both consoles follow the same approach? The PS4 could drop its resolution in the same way as the Xbox One.
The possibilities would be both consoles going with higher resolutions but fewer effects, or both dropping the resolution a little and enhancing different areas of the image.
The thing about the second approach is: on the PS4, wouldn't the 32 ROPs be underutilised if you chose a lower resolution, and wouldn't the CUs be as well? Excuse me if I'm wrong.
Maybe on the PS4 the best approach would be to go for the best of both worlds: up the resolution a little to make use of the ROPs and CUs, and at the same time apply as many effects as possible by leaning on the GDDR5 bandwidth.
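To put some very rough numbers on that underutilisation worry, here is a back-of-the-envelope sketch. The 32 ROPs and 800 MHz GPU clock are just the commonly cited PS4 figures, and the "headroom" is purely theoretical:

```python
# Pixels per frame at each resolution vs. the theoretical peak fill rate
# of 32 ROPs at 800 MHz (commonly cited PS4 figures, treated as assumptions).
# Overdraw, blending and multiple render targets eat into this in practice.

resolutions = {
    "720p  (1280x720)":  1280 * 720,
    "900p  (1600x900)":  1600 * 900,
    "1080p (1920x1080)": 1920 * 1080,
}

ROPS = 32
CLOCK_HZ = 800e6                 # assumed GPU clock
PEAK_FILL = ROPS * CLOCK_HZ      # ~25.6 Gpixels/s theoretical peak

for name, pixels in resolutions.items():
    output_rate = pixels * 60    # pixels written per second at 60 fps
    print(f"{name}: {pixels / 1e6:.2f} Mpixels/frame, "
          f"~{PEAK_FILL / output_rate:.0f}x theoretical fill-rate headroom")
```

Even at 1080p60 the final output is only a tiny fraction of the theoretical fill rate, so in practice the real limits are overdraw, blending and shader work rather than raw pixel count; that spare capacity is exactly what could go into extra effects instead.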
When it comes to the Xbox One, I think the best approach would be to try to fit the framebuffer within the eSRAM in its entirety, 100% of the time!
As long as you choose something along the lines of 720p, 800p or 900p (or a very carefully tuned 1080p) and apply a lot of effects, this approach could provide incredible graphics.
Even if you dropped the native resolution of a PS4 game to Xbox One levels, the Xbox One would still have an advantage in its whopping 270+ GB/s of bandwidth (theoretical maximum, and only if the framebuffer fits in the eSRAM).
Which would allow for some crazy effects. :smile2:
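For reference, here is a rough sketch of how much of the 32 MB of eSRAM a bare colour + depth framebuffer would occupy at each of those resolutions. I'm assuming a single 32-bit colour target plus a 32-bit depth/stencil buffer and no MSAA; real engines with G-buffers and HDR targets need considerably more, so treat these as lower bounds. The 800p width is also just an assumed ~16:9 figure:

```python
# Rough framebuffer-size check against the Xbox One's 32 MB of eSRAM.
# Assumes one 32-bit colour target + one 32-bit depth/stencil target, no MSAA.
# Deferred renderers with several render targets will need much more.

ESRAM_BYTES = 32 * 1024 * 1024          # 32 MB

resolutions = {
    "720p":  (1280, 720),
    "800p":  (1422, 800),               # ~16:9 at 800 lines (assumed)
    "900p":  (1600, 900),
    "1080p": (1920, 1080),
}

BYTES_PER_PIXEL = 4 + 4                 # 32-bit colour + 32-bit depth/stencil

for name, (w, h) in resolutions.items():
    size = w * h * BYTES_PER_PIXEL
    print(f"{name}: {size / (1024 * 1024):.1f} MB "
          f"({size / ESRAM_BYTES * 100:.0f}% of eSRAM)")
```

On paper even a flat 1080p colour + depth pair fits in about half of the eSRAM, which is why a "very carefully tuned" 1080p seems conceivable, but only if every other render target is kept very lean.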
My point is that on the PS4 developers could regularly choose a higher resolution, while on the Xbox One they could pick a lower resolution that stays within the eSRAM limits, so it can be taken advantage of at all times, and spend the savings on some extra effects.
These are the aforementioned articles:
http://www.eurogamer.net/articles/2013-11-23-digital-foundry-vs-forza-motorsport-5
But as we've seen in the 900p presentation of a game like Ryse, there's more to producing an appealing end image than heightening the pixel count - though this inarguably goes a long way.
http://www.eurogamer.net/articles/digitalfoundry-next-gen-now-ryse-son-of-rome
Tech guru Timothy Lottes - then of Nvidia, now at Epic - presented an interesting theory about the difference in presentation between a Hollywood Blu-ray movie and a typical video game. His blog post - unfortunately - is now gone, but you can get the gist of the discussion in this Digital Foundry article, where Lottes concludes:
"The industry status quo is to push ultra-high display resolution, ultra-high texture resolution, and ultra sharpness. In my opinion, a more interesting next-generation metric is, can an engine on an ultra high-end PC rendering at 720p look as real as a DVD quality movie?"
The debate was interesting enough that even a Hollywood CG professional contributed:
"We do what is essentially MSAA. Then we do a lens distortion that makes the image incredibly soft (amongst other blooms/blurs/etc). Softness/noise/grain is part of film and something we often embrace. Jaggies we avoid like the plague and thus we anti-alias the crap out of our images," said Pixar's Chris Horne. "In the end it's still the same conclusion: games oversample vs. film. I've always thought that film res was more than enough res. I don't know how you will get gamers to embrace a film aesthetic, but it shouldn't be impossible."
Well, of all the games we've seen since then, Ryse is arguably the closest we get to a practical example of this theory - and it looks quite spectacular for much of its duration. Resolution doesn't drop all the way to 720p - Crytek chose 1600x900 - but the overall look is very cinematic, from film grain to motion blur to the immense levels of post-processing and pitch-perfect effects work. Ryse works at a sub-native resolution where others flounder partly because the anti-aliasing is quite sublime.
Tech Focus: Game Graphics vs. Movies
http://www.gamesindustry.biz/articles/digitalfoundry-tech-focus-does-pixel-count-matter
An interesting discussion kicked off on the blog of NVIDIA's Timothy Lottes recently, where the creator of FXAA (an anti-aliasing technique that intends to give games a more filmic look) compared in-game rendering at 1080p with the style of visuals we see from Blu-ray movies.
"The industry status quo is to push ultra-high display resolution, ultra-high texture resolution, and ultra sharpness," Lottes concluded.
Do 1080p games super-sample compared to Blu-ray movies? Is the current focus on high contrast, high detail artwork the right approach for a more filmic next-gen?
"In my opinion, a more interesting next-generation metric is can an engine on an ultra high-end PC rendering at 720p look as real as a DVD quality movie? Note, high-end PC at 720p can have upwards of a few 1000s of texture fetches and upwards of 100,000 flops per pixel per frame at 720p at 30Hz."
Comparing screengrabs of a game (Skyrim running with a super-sampled anti-aliasing hack) with the Robert Downey Jr Iron Man movie, the NVIDIA man reckons that even at native 1080p with no MSAA, game rendering is still effectively super-sampling compared to the quality we see in theatrical presentations, and maybe game developers could pursue a more filmic look using fewer pixels in concert with other processing techniques.
Lottes noted that there is little or no single pixel-width detail in 1080p Blu-ray movies, as we can see in spades in ultra-precision PC presentation, suggesting that the same level of detail can be resolved in gaming without recourse to a 1080p framebuffer - or else utilising 1080p with a lot of filtering that gives the illusion of a lower resolution.
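Just to put Lottes' 720p figure into perspective, here is the arithmetic it implies (my own quick calculation, not something from the article):

```python
# What "upwards of 100,000 flops per pixel per frame at 720p at 30 Hz"
# works out to as a sustained throughput figure.

pixels_720p = 1280 * 720          # 921,600 pixels
flops_per_pixel = 100_000         # Lottes' figure
fps = 30

total = pixels_720p * flops_per_pixel * fps
print(f"{total / 1e12:.2f} TFLOPS sustained")   # ~2.76 TFLOPS
```

That is roughly 2.76 TFLOPS of sustained shading work, above the widely quoted GPU peaks of both consoles (about 1.84 and 1.31 TFLOPS), so his "ultra high-end PC" framing is not an exaggeration; the consoles would have to get part of the way there with cheaper post-process tricks.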
http://www.eurogamer.net/articles/digitalfoundry-the-complete-xbox-one-interview
We've chosen to let title developers make the trade-off of resolution vs. per-pixel quality in whatever way is most appropriate to their game content. A lower resolution generally means that there can be more quality per pixel. With a high-quality scaler and antialiasing and render resolutions such as 720p or '900p', some games look better with more GPU processing going to each pixel than to the number of pixels; others look better at 1080p with less GPU processing per pixel.
Game developers are naturally incented to make the highest-quality visuals possible and so will choose the most appropriate trade-off between quality of each pixel vs. number of pixels for their games.