Image Quality and Framebuffer Speculations for WIP/alpha/beta/E3 games *Read the first post*

I don't want to sound like a broken record, since we've talked a lot about resolutions here, but Cevat Yerli said that Ryse is going to be a Full HD experience.



This is what I don't understand. A friend of mine also mentioned this, and I'm scratching my head trying to figure out how you can pull that one off.

A native 1080p framebuffer means that the internal framebuffer on the console is going to be 1920x1080 no matter what! :smile2:

On the PS3 and Xbox 360, the native rendering resolution of a game and its output resolution were different things in some cases. You could never call Halo 3 a native 720p framebuffer, because it actually rendered below 720p and was scaled up. But when it comes to the Xbox One, things have drastically changed.

Presumably the explanation for this (my personal theory) is that the console does its scaling with a dedicated hardware scaler, together with the Display Planes, and the image is upscaled internally so you still get excellent image quality.

Additionally, I wonder... if your HDTV has an amazing picture, the games will look spectacular, pixel counting won't work the way it used to, and the only way to actually know what resolution a game runs at will be if developers come out and tell people what resolution they chose to run it at.

Am I missing anything?
 
I don't want to sound like a broken record, since we've talked a lot about resolutions here, but Cevat Yerli said that Ryse is going to be a Full HD experience.

Am I missing anything?
It's a tweet. It's limited to a useless number of characters, which encourages sloppy abbreviations and ambiguous remarks just to fit the character limit. Tweets should not be used as a source for detailed info or clarity beyond questions that can be answered with a 'yes' or 'no'. The 'upscaler for AA' remark is curious; it might mean some post-process AA applied while upscaling, but it's a tweet, so it could just be a muddle of words. Ironically, a tweet is short so that, in theory, it can be created and published quickly, but the character limit means one should spend a decent amount of time crafting an accurate tweet to avoid accidental miscommunication. Few people will invest that much care in a tweet.

The game, meaning the major opaque geometry, renders to a 900p framebuffer. This is upscaled to 1080p (into a 1080p framebuffer) and composited with the 1080p UI.
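
To make the distinction concrete, here's a minimal sketch of that kind of two-plane composition. This isn't Crytek's or Microsoft's actual code, just an illustration with invented buffer names and a deliberately crude nearest-neighbour scaler:

```python
import numpy as np

def upscale_nearest(src, out_h, out_w):
    """Upscale an image buffer by remapping its samples onto a larger grid.
    A real hardware scaler would use a better filter (bilinear/bicubic,
    maybe some edge enhancement), but the principle is the same."""
    src_h, src_w = src.shape[:2]
    ys = np.arange(out_h) * src_h // out_h   # which source row feeds each output row
    xs = np.arange(out_w) * src_w // out_w   # which source column feeds each output column
    return src[ys[:, None], xs[None, :]]

# Invented buffers: a 1600x900 game render and a 1920x1080 UI layer with alpha.
game_900p = np.zeros((900, 1600, 3), dtype=np.float32)   # RGB
ui_1080p  = np.zeros((1080, 1920, 4), dtype=np.float32)  # RGBA

# Scale the game plane up to the output resolution...
game_plane = upscale_nearest(game_900p, 1080, 1920)

# ...then blend the native 1080p UI plane over it, much as the display
# hardware would composite two display planes at scan-out.
alpha = ui_1080p[..., 3:4]
final_frame = ui_1080p[..., :3] * alpha + game_plane * (1.0 - alpha)

# The buffer that goes out over HDMI really is 1920x1080, even though the
# 3D scene itself was only ever rendered at 1600x900.
print(final_frame.shape)   # (1080, 1920, 3)
```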
 
Presumably the explanation for this (my personal theory) is that the console does its scaling with a dedicated hardware scaler, together with the Display Planes, and the image is upscaled internally so you still get excellent image quality.

Additionally, I wonder... if your HDTV has an amazing picture, the games will look spectacular, pixel counting won't work the way it used to, and the only way to actually know what resolution a game runs at will be if developers come out and tell people what resolution they chose to run it at.

Am I missing anything?

I can't see how a scaler is supposed to make <1080p equal 1080p. Scalers of any type introduce artifacts and mess; even high-end AV amps with Faroudja scalers cannot make a sub-HD image look as good as the native HD equivalent.

It reminds me of the early days of HDTV, when people would tell me how much better their SD TV looked now. Or when I'd run into friends boasting about how good Sky HD looked when they'd only plugged the receiver in over SCART.

If scaling were as good as native content, TV stations wouldn't be scrambling to spend millions on high-end storage to support an HD workflow; they'd just stick with an SD workflow and a really expensive HD scaler (which most already have; boy, how I LOL at Eden showing "HD" versions of old BBC nature docs).

In short, I think what the Crytek guy is trying to say is that 900p > 720p, so it's 'more HD'; the 'native framebuffer' line is nonsense. Hell, if I play a PS2 on my TV, I could claim my TV is playing PS2 games with a 'native 1080p framebuffer'.
 
Hell, if I play a PS2 on my TV, I could claim my TV is playing PS2 games with a 'native 1080p framebuffer'.
No part of your output pipeline is 1080p, so you couldn't legitimately claim a 1080p framebuffer. For Ryse, the final frame image is a 1080p buffer in memory, sent out over HDMI in 1080p format, so it is a native 1080p framebuffer, technically. The UI is also 1080p.

Which sees me echoing my concerns again about the relevance of this thread when future games are going to be made up of lots of different-sized framebuffers. What resolution is a deferred-rendered game that renders luminance at 1080p and chrominance at 720p? What resolution is the same game with the buffer resolutions reversed (not that that would make any sense)? How do we count pixels in a dynamically scaled game? A mean average resolution over an hour's play?
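
Just to put numbers on how slippery a single 'resolution' figure gets, here's a back-of-the-envelope comparison using the hypothetical luma/chroma split above and a made-up dynamic-resolution history (none of these figures come from a real game):

```python
# Hypothetical mixed-resolution frame: luminance at 1080p, chrominance at 720p.
luma_samples   = 1920 * 1080               # 2,073,600 samples
chroma_samples = 1280 * 720                # 921,600 samples per chroma channel

full_1080p = 1920 * 1080 * 3               # all three channels rendered at 1080p
mixed      = luma_samples + 2 * chroma_samples

print(mixed / full_1080p)                  # ~0.63 of a "real" 1080p frame's samples

# Dynamic scaling is fuzzier still: a per-session "resolution" is just a
# weighted average of whatever the engine picked each frame (made-up numbers).
frame_heights = [1080] * 500 + [900] * 300 + [792] * 200
mean_height = sum(frame_heights) / len(frame_heights)
print(mean_height)                         # 968.4 -- "968p" isn't a number anyone markets
```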
 
No part of your output pipeline is 1080p, so you couldn't legitimately claim a 1080p framebuffer. For Ryse, the final frame image is a 1080p buffer in memory, sent out over HDMI in 1080p format, so it is a native 1080p framebuffer, technically. The UI is also 1080p.

Which sees me echoing my concerns again about the relevance of this thread when future games are going to be made up of lots of different-sized framebuffers. What resolution is a deferred-rendered game that renders luminance at 1080p and chrominance at 720p? What resolution is the same game with the buffer resolutions reversed (not that that would make any sense)? How do we count pixels in a dynamically scaled game? A mean average resolution over an hour's play?

... but 1080p.

You're right, but it was the same issue last gen. The usefulness of counting pixels is going to diminish, as games are going to look sharper than last gen anyway. I don't think many people consider God of War 3 a muddy mess, even though it's *only* 720p with some nice AA. It's an interesting question for people who want to know what compromises devs might be making for framerate, or to fit in other rendering effects, but as far as image quality goes, I don't think it will be particularly useful. People want to use it as an objective measure of image quality, but as you've pointed out, the resolution of the final framebuffer is just one of many, many aspects of image quality. Really, whether a game has good image quality is going to be an entirely subjective judgement, and pixel counting will only be a tool for confirmation bias. I can still understand why it's an interesting data point in a technical discussion.
 
Which sees me echoing my concerns again about the relevance of this thread when future games are going to be made up of lots of different-sized framebuffers. What resolution is a deferred-rendered game that renders luminance at 1080p and chrominance at 720p? What resolution is the same game with the buffer resolutions reversed (not that that would make any sense)? How do we count pixels in a dynamically scaled game? A mean average resolution over an hour's play?

Just analyze what the game is doing with those framebuffers to achieve the final output, even if it's a range of resolutions. It's really just an analysis thread to understand how games are being designed to overcome hardware limitations.
 
It reminds me of the early days of HDTV, when people would tell me how much better their SD TV looked now. Or when I'd run into friends boasting about how good Sky HD looked when they'd only plugged the receiver in over SCART.

Well, there's one difference. HD signals usually get quite a bit more bandwidth and a better codec than PAL (e.g. on DVB-S in Europe). So in theory the image, even when viewed in SD, should look better than the old PAL signal. But that's not quite the point here... people tend to believe what they're told when there's a major information asymmetry (e.g. the sales clerk tells them it's better).
 
No part of your output pipeline is 1080p, so you couldn't legitimately claim a 1080p framebuffer. For Ryse, the final frame image is a 1080p buffer in memory, sent out over HDMI in 1080p format, so it is a native 1080p framebuffer, technically. The UI is also 1080p.

Which sees me echoing my concerns again about the relevance of this thread when future games are going to be made up of lots of different-sized framebuffers. What resolution is a deferred-rendered game that renders luminance at 1080p and chrominance at 720p? What resolution is the same game with the buffer resolutions reversed (not that that would make any sense)? How do we count pixels in a dynamically scaled game? A mean average resolution over an hour's play?

Should have gone with TongueInCheek.gif; I completely understand your point. The 'framebuffer in memory' thing still feels a bit artificial and marketing-speak. After all, I could render my geometry at 480p, then composite with my lighting at 600p and my UI at 1080p; sure, I'd have a 1080p framebuffer, but if you're not clued into how 3D rendering works, that can sound far more impressive than it is. IIRC there were some PS3 titles that rendered to a 1080p framebuffer while rendering the game itself at 720p if 1080p output was selected from the XMB (I don't mean GT5 or WipEout, I believe those were dynamically scaled).

Dynamic buffers will likely make this thread moot over time, but at least initially I suspect it will be interesting to see just how far below the 1080p bar some of these next-gen titles go. Ultimately, of course, it will all come down to whether these sub-1080p titles look good or not, rather than a pixel-counting exercise to determine 'teh winner'. Hell, I never even noticed the dynamic scaling in GT5 or WipEout, so these techniques do work well, and if they keep good IQ and high framerates I'm on board.
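
For what it's worth, dynamic scaling of the GT5/WipEout HD sort is typically driven by GPU frame time, usually adjusting only the horizontal resolution while the vertical resolution and the UI stay native. Here's a rough sketch of such a controller; the thresholds, step size and limits are invented, not taken from any shipped engine:

```python
def adjust_render_width(frame_time_ms, width, target_ms=16.6,
                        min_width=1280, max_width=1920, step=64):
    """Drop the horizontal render resolution when a frame blows its budget
    and claw it back when there's headroom. Vertical resolution and the UI
    stay at native 1080p, which is a big part of why players rarely notice."""
    if frame_time_ms > target_ms * 1.02:      # over budget: render fewer pixels next frame
        width = max(min_width, width - step)
    elif frame_time_ms < target_ms * 0.90:    # comfortable headroom: add some back
        width = min(max_width, width + step)
    return width

# A heavy stretch pulls the width down; lighter frames restore it.
width = 1920
for frame_time in (18.4, 17.9, 15.2, 14.8, 14.9):
    width = adjust_render_width(frame_time, width)
    print(frame_time, width)
```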
 
Came across this:


RealtimeCevat said:
https://twitter.com/RealtimeCevat/status/384355681194614784

Now wondering:

Would anyone here be able to measure the native rendering resolution (and possibly AA) used in that trailer from the 15 Mbps 1080p video?

http://news.xbox.com/~/media/Images...VIDOC_1_FINAL_1080p2997_ST_H264_15000kbps.mp4
 
There are direct feed shots in the Ryse thread. They would be a better fit for that.
 
Would anyone here be able to measure the native rendering resolution (and possibly AA) used in that trailer from the 15 Mbps 1080p video?

http://news.xbox.com/~/media/Images...VIDOC_1_FINAL_1080p2997_ST_H264_15000kbps.mp4

Someone on GAF did, and said it was 1080p.

Here is the list they have come up with.

We made a list of confirmed resolutions

Forza: 1080p - http://uk.ign.com/articles/2013/09/1...-ryse-does-not
CoD:DoG 1080p - http://www.eurogamer.net/articles/20...ox-one-and-ps4
Fifa 1080p - http://www.eurogamer.net/articles/20...-next-gen-fifa
LocoCycle 1080p - Pixel count by Liabe Brave - http://www.neogaf.com/forum/showpost...0&postcount=35
PowerstarGolf - 720p with 1080p UI - Pixel count by Liabe Brave - http://www.neogaf.com/forum/showpost...0&postcount=35
Ryse: 900p - http://uk.ign.com/articles/2013/09/1...-ryse-does-not
KI: 720p - http://gamingbolt.com/killer-instinc...xbox-one-cloud
DR3: Dynamic resolution - actually unable to find a solid link on this; many are available, but no direct dev comments could be found easily. Albert could clarify if he wanted to; even if he didn't know, it would be one phone call away.

Driveclub: 1080p - http://www.eurogamer.net/articles/di...-playstation-4
KZ:SF: 1080p - http://www.eurogamer.net/articles/di...-playstation-4
Knack: 1080p - http://www.eurogamer.net/articles/di...-playstation-4
Infamous: 1080p - http://www.eurogamer.net/articles/di...-playstation-4
the Order: 1920x800 (artistic choice - 2.40:1 cinema aspect ratio, retains 1:1 pixel mapping) - Seriously, the resolution of this one is everywhere, find it yourself :p
ACIV: 1080p - constantly referred to as 1080p, eurogamer comment here - http://www.eurogamer.net/articles/di...assins-creed-4
CoD:DoG 1080p - http://www.eurogamer.net/articles/20...ox-one-and-ps4
Fifa 1080p - http://www.eurogamer.net/articles/20...-next-gen-fifa
Resogun: 1080p - http://www.eurogamer.net/articles/di...dry-vs-resogun
Flower: 1080p - https://twitter.com/TDMoss/status/385907407475314688 - talks about 60fps in tweet but in comments confirms 1080p
DC:UO 1080p - http://massively.joystiq.com/2013/06...y-free-on-ps4/
FF14 1080p - http://www.dualshockers.com/2013/09/...t-pc-settings/
Blacklight Retribution 1080p - http://www.thesixthaxis.com/2013/09/...playstation-4/
Hohokum 1080p - http://blog.us.playstation.com/2013/...-vita-in-2014/
Strider 1080p - http://blog.us.playstation.com/2013/...inja-training/

They are just about the only confirmed resolutions out there.

EDIT: If anyone provides a valid link with a developer quote or a pixel analysis I'll add the game to the list.

EDIT2:
There is a thread here: www.neogaf.com/forum/showthread.php?p=86349940 which not only does some pixel counting but also explains how it is done, if anyone is interested. It also confirms the figures above for Killzone, Knack, Infamous, Resogun and Driveclub.

Also of note: Ryse was 1080p native at E3.
http://www.neogaf.com/forum/showpost.php?p=86116324&postcount=8069
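
For anyone who doesn't want to dig through that NeoGAF thread, the usual pixel-counting method is to find a hard, nearly vertical aliased edge in an uncompressed screenshot and measure how far apart its stair-steps are: in a native image they land one row apart, in an upscaled one roughly (output ÷ native) rows apart. A toy version of the arithmetic, with made-up step positions, might look like this:

```python
def estimate_native_height(step_rows, output_height=1080):
    """Estimate the native vertical resolution from the row positions of the
    stair-steps along a near-vertical aliased edge in the output image."""
    gaps = [b - a for a, b in zip(step_rows, step_rows[1:])]
    mean_gap = sum(gaps) / len(gaps)          # average step spacing in output rows
    return round(output_height / mean_gap)

# Made-up measurements: steps every 1.2 output rows on average, which is what
# you'd expect from a 900-line image stretched to 1080 lines.
steps = [10, 11, 12, 13, 14, 16, 17, 18, 19, 20, 22]
print(estimate_native_height(steps))          # 900
```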
 
I think the 'fuss' is more that, at one time, they implied the game was 1080p by releasing promotional materials that were natively at that res. If their promotional stuff had been 900p upscaled, there would have been no fuss.

But we don't know if the combat video was 1080p or 900p. It was the first attempt by the guy who counted it, Crytek's post-process AA solution is quite advanced, and the video was heavily compressed.
If Quaz had written that, I would be certain, but this was just one guy who was trying to learn how to count resolution.
 
1080p->900p (introducing AA)->1080p?

So this theory is that the game renders at 1920x1080 (maybe with a PPAA solution), then downscales to 1600x900, introducing some more AA through downsampling, and then upscales back to 1920x1080 through the XB1 scaler? Will the whole downsampling round trip be worth it, though?
 
That'll just introduce blur. Downsampling and then upscaling degrades image quality and can never improve it. If you want to blur the aliasing away, just use a blur filter, but that's a pretty rubbish solution.
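
A quick one-dimensional illustration of why that round trip can only lose detail; the "scanline" here is just an alternating black/white pattern, the worst case for resampling:

```python
import numpy as np

# A 1080-sample "scanline" of alternating black and white pixels.
native = np.tile([0.0, 1.0], 540)                     # 1080 samples

# Linearly resample down to 900 samples, then back up to 1080.
down = np.interp(np.linspace(0, 1079, 900), np.arange(1080), native)
up   = np.interp(np.linspace(0, 899, 1080), np.arange(900), down)

print(native[:8])            # [0. 1. 0. 1. 0. 1. 0. 1.]
print(np.round(up[:8], 2))   # values smeared toward grey -- the detail is gone
print(round(float(np.abs(native - up).mean()), 2))    # average error well above zero
```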
 