Image Quality and Framebuffer Speculations for WIP/alpha/beta/E3 games *Read the first post*

Well, I understand everyone's points of view and they seem valid. The unnecessary trouble that downsampling brings is the main concern for everyone. If they were able to hit 1080p then they should just leave it at that, because that's the more dominant result.

Just to let everyone know we're on the same page, I'm not delusional or anything. I'm not going to defend my conclusion any further, because if it turns out to be untrue then it's a waste of everyone's time. The DF analysis is what I'm more interested in seeing at this point. If Yerli's assertions come from a bad translation, someone should notify him ASAP.

---------------------------------------------------------------------------

Btw, Yerli made another assertion about the usage of eSRAM. http://www.dualshockers.com/2013/10...ng-xbox-ones-esram-for-considerable-speed-up/
 
If it's to solve shimmering, which a poky little 20% super sample won't achieve, you can just blur the result to get the same results as your suggestion but for less effort.
Pretty much. Supersampling helps reduce shimmering by more accurately capturing high-frequency detail, i.e. by sampling at a higher spatial frequency. That advantage literally doesn't exist if you sample at your output resolution to begin with; scaling down and scaling back up becomes nothing more than a blur filter.

And blur filters don't deal with shimmering particularly well. If you low-pass filter a signal and then sample from it, you can prevent aliases from showing up in the first place (although this isn't a technique that can be applied particularly easily to video game graphics, obviously). But sampling from a signal and then low-pass filtering the result? All you're doing is blurring the aliases. Congrats, you now have shimmery blurry stuff.
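Quick toy illustration of that, in case it helps (NumPy, nothing to do with any actual renderer): a 500-cycle "detail" signal sampled at 512 output samples aliases into a bogus low-frequency pattern; 4x supersampling and then averaging mostly removes it before it can alias, while blurring the already-aliased samples leaves the alias almost untouched.

```python
import numpy as np

def signal(x):
    # High-frequency "detail": 500 cycles across the screen, above Nyquist for 512 samples.
    return np.sin(2 * np.pi * 500 * x)

out_res = 512

# Sample straight at output resolution: the 500-cycle detail aliases down to a fake 12-cycle pattern.
x_out = (np.arange(out_res) + 0.5) / out_res
native = signal(x_out)

# 4x supersample, then box-filter down: the detail is (mostly) averaged away before it can alias.
x_ss = (np.arange(out_res * 4) + 0.5) / (out_res * 4)
supersampled = signal(x_ss).reshape(out_res, 4).mean(axis=1)

# Blur the already-aliased samples: the alias is still there, just slightly softer.
blurred = np.convolve(native, np.ones(3) / 3, mode='same')

for name, s in [("native", native), ("supersampled", supersampled), ("post-blur", blurred)]:
    print(f"{name:12s} low-frequency alias magnitude: {np.abs(np.fft.rfft(s)[1:50]).max():.1f}")
```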

He's literally just saying that things were "sped up" by placing frequently-accessed render targets into high-bandwidth embedded memory. In other words, they're using the eSRAM pool in the way it was designed to be used, rather than letting the GPU bottleneck like crazy on bandwidth-heavy DDR3 accesses.
 
Ryse's AA explained.

Cevat Yerli: MSAA is quickly getting bandwidth-bound and thus expensive. With a deferred shading based renderer the bandwidth consumption is getting prohibitively high. For Ryse we developed our custom SMAA 1TX, in essence it is a combo of morphological AA with smarter temporal anti-aliasing. It's a new robust and fairly efficient technique which we shared some details about at Siggraph this year. It's a solution which deals with any signal input changes in order to smooth out potential shimmering during movement, while masking out any potential ghosting, and together with shading aliasing solutions it provides a more filmic image quality overall.

http://www.eurogamer.net/articles/digitalfoundry-crytek-the-next-generation

So that's been pretty much cleared up.
 
Someone on GAF did, and said it was 1080.

Here is the list they have come up with.


http://www.neogaf.com/forum/showpost.php?p=86116324&postcount=8069
They should add Crimson Dragon, which runs at 1080p and 60 fps on the Xbox One, to that NeoGAF list.

Ruffian Games is also working on a 1080p 60 fps game for the Xbox One.

 
Interesting part is that they've developed AA for shader aliasing.
Or am I reading this wrong, and by "shading aliasing solutions" he meant the MLAA component in SMAA?
I believe so. That is, shader aliasing reduction. I don't know if the blur in the screenshots is their AA + upscale or a deliberate blur step, but it's worth pointing out that one of the issues with MLAA is a reduction of texture/surface fidelity, and that's something Ryse is embracing (to good effect, I'll add).
 
That Jeremy Conrad guy's attitude is too vitriolic towards Microsoft... they must have some kind of history together...
 
Ryse's AA explained.

http://www.eurogamer.net/articles/digitalfoundry-crytek-the-next-generation

So that's been pretty much cleared up.

Whatever they are doing to solve jaggies is clearly working. I played the game yesterday at BGS, and it is ultra smooth. I was playing less than 8 inches from the screen, and even though I could tell there was some upscaling going on, there were almost no jaggies in sight...

Taking a few steps back to a more normal gaming distance, the IQ was just fantastic. Better than quite a few native 1080p games, even.
 
Whatever they are doing to solve jaggies is clearly working.
SMAA tends to do a decent job with blatant jaggies. It (more or less) looks for stairstep patterns and replaces them with something approximating what a properly antialiased version of the same stairstep should look like.

Meanwhile the temporal AA should give supersample-like results to help deal with shimmering thin objects and the like. It uses the pixels from a previous frame as subpixel samples for the current frame, giving supersampling "for free." Scare quotes with "for free" because
1-you have to reproject the pixels from said previous frame accurately onto the current frame, or else suffer various problems, and
2-it has some intrinsic limitations, like not being able to properly antialias all of an object when it is coming out from behind occlusion.

The combination of high-quality TAA and SMAA should be capable of reasonably stable results, and depending on the SMAA implementation, somewhat sharp as well. Mixing those two techniques was mapped out during the development of SMAA, and it's a pretty neat idea.
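For anyone who wants the temporal part spelled out, here's a very rough sketch of a generic TAA resolve step in NumPy. To be clear, this is my own toy, not Crytek's SMAA 1TX: reproject the history using the motion vectors, clamp it against the current pixel's neighbourhood to mask ghosting, then blend it in.

```python
import numpy as np

def taa_resolve(current, history, motion, alpha=0.1):
    """Toy temporal AA resolve on a single-channel (H, W) image.

    current : this frame's colour
    history : accumulated colour from previous frames
    motion  : per-pixel (dy, dx) screen-space motion in pixels, shape (H, W, 2)
    alpha   : how much of the new frame to blend in each step
    """
    h, w = current.shape
    ys, xs = np.mgrid[0:h, 0:w]

    # 1) Reproject: fetch where this pixel was last frame (nearest-neighbour for simplicity;
    #    a real resolve would filter here, and bad reprojection is where the artefacts come from).
    prev_y = np.clip(np.round(ys - motion[..., 0]).astype(int), 0, h - 1)
    prev_x = np.clip(np.round(xs - motion[..., 1]).astype(int), 0, w - 1)
    reprojected = history[prev_y, prev_x]

    # 2) Neighbourhood clamp: reject history that no longer resembles the local 3x3 area of
    #    the current frame -- this is what masks ghosting when objects come out of occlusion.
    pad = np.pad(current, 1, mode='edge')
    neigh = np.stack([pad[dy:dy + h, dx:dx + w] for dy in range(3) for dx in range(3)])
    clamped = np.clip(reprojected, neigh.min(axis=0), neigh.max(axis=0))

    # 3) Exponential blend: old frames effectively act as extra sub-pixel samples of the scene.
    return (1 - alpha) * clamped + alpha * current
```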
 
And the blur! Everyone's ignoring the blur, such that I'm starting to think I'm alone in seeing it. Maybe there are fingerprints on my specs? :p
 
I believe so. That is, shader aliasing reduction. I don't know if the blur in the screenshots is their AA + upscale or a deliberate blur step, but it's worth pointing out that one of the issues with MLAA is a reduction of texture/surface fidelity, and that's something Ryse is embracing (to good effect, I'll add).

SMAA generally doesn't touch texture quality, but maybe they have a new pass for shader aliasing that blurs more. Or maybe combining SMAA 1TX, their shader aliasing pass, and upscaling introduces that much blur? Or maybe they just don't do sharpening with the upscale, so it's blurry simply from the image resize.
 
SMAA generally doesn't touch texture quality, but maybe they have a new pass for shader aliasing that blurs more. Or maybe combining SMAA 1TX, their shader aliasing pass, and upscaling introduces that much blur? Or maybe they just don't do sharpening with the upscale, so it's blurry simply from the image resize.

Ugh, I hope we don't start seeing sharpening passes all over the shop; nothing introduces high-frequency noise like sharpening. The only time I've seen it done well is with expensive Photoshop filters on static images, and I can't imagine those filters can be made realtime without a significant cost to quality.
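To illustrate what I mean, here's a toy NumPy example (a plain unsharp mask, not any shipping post-process): the difference-of-blur term a sharpener adds back is essentially pure high-frequency content, which on a real image is mostly noise and aliasing.

```python
import numpy as np

rng = np.random.default_rng(0)

# A smooth ramp plus a little sensor/compression-style noise.
ramp = np.linspace(0.0, 1.0, 256)
image = ramp + rng.normal(0.0, 0.01, 256)

# Classic unsharp mask: sharpened = image + amount * (image - blurred).
blurred = np.convolve(image, np.ones(5) / 5, mode='same')
sharpened = image + 1.5 * (image - blurred)

# The (image - blurred) term is almost nothing but the noise, so sharpening
# multiplies exactly the stuff you least want amplified.
interior = slice(5, -5)  # ignore the convolution's edge artefacts
print("noise std before:", np.std((image - ramp)[interior]))
print("noise std after :", np.std((sharpened - ramp)[interior]))
```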
 
Ugh, I hope we don't start seeing sharpening passes all over the shop; nothing introduces high-frequency noise like sharpening. The only time I've seen it done well is with expensive Photoshop filters on static images, and I can't imagine those filters can be made realtime without a significant cost to quality.

I think that when you have total control over rendering and tons of framebuffers, it could be done properly.
 
Hello

This is a demo of the new COD on PS4:
http://uk.ign.com/videos/2013/10/25/call-of-duty-ghosts-squads-squad-assault-gameplay

I grabbed a shot directly from the 1080p video. While the video compression is poor, is it possible to count pixels here?

http://i.picpar.com/Gvy.png

IGN's videos aren't good captures to use, because they've either used 1080i or some chroma subsampling that's messing up the vertical resolution (there are maybe one or two frames I've seen that indicate 1920 horizontal res), hence the bizarre (and obvious) edges you can see there.

As I mentioned earlier, just wait for the uncompressed captures later.
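For anyone wondering what "counting pixels" actually involves: find a long, straight, aliased edge and count how many stair-steps it shows across a known span of output pixels; the ratio of steps to output pixels approximates the scale factor. A toy version with made-up numbers, purely to show the arithmetic (a real count needs clean, uncompressed captures and several edges averaged):

```python
# Hypothetical measurements: x position of one aliased edge, read off 12 consecutive output rows.
edge_x = [100, 101, 101, 102, 103, 104, 104, 105, 106, 107, 107, 108]

# 9 distinct steps over 12 output rows -> the edge was rasterised at ~75% of the output height.
steps = 1 + sum(1 for a, b in zip(edge_x, edge_x[1:]) if a != b)
scale = steps / len(edge_x)
print(f"{steps} steps over {len(edge_x)} rows -> ~{scale:.2f} scale -> ~{round(1080 * scale)} vertical")
```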
 