Digital Foundry Article Technical Discussion Archive [2013]

Status
Not open for further replies.
Under the right conditions, MLAA produces some great results as well. But yeah, Ryse seems to do really well in the visual department. So at least it's a hopeful next-gen start for CryEngine.

Yeah, in the Greek and Roman eras they didn't have thin wires, so I presume subpixel aliasing is not a consideration. I reckon MLAA will still work well next-gen in these cases. :devilish:
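For anyone unfamiliar with how MLAA differs from MSAA: it's a post-process that finds colour discontinuities and blends across them. A toy sketch of the idea, assuming a tiny image as nested lists (real MLAA also classifies edge shapes into L/Z/U patterns and computes area-based blend weights; the 50/50 blend and all names here are illustrative, not from any actual implementation):

```python
def luma(px):
    """Perceptual luminance of an (r, g, b) tuple with channels in [0, 1]."""
    r, g, b = px
    return 0.299 * r + 0.587 * g + 0.114 * b

def mlaa_like(img, threshold=0.1):
    """img: 2D list of (r, g, b) tuples. Returns a blended copy (toy MLAA-style pass)."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            # Compare against the right and bottom neighbours.
            for dy, dx in ((0, 1), (1, 0)):
                ny, nx = y + dy, x + dx
                if ny < h and nx < w:
                    a, b = img[y][x], img[ny][nx]
                    if abs(luma(a) - luma(b)) > threshold:
                        # Blend both sides of the detected edge 50/50
                        # (real MLAA uses per-shape coverage weights).
                        mid = tuple((ca + cb) / 2 for ca, cb in zip(a, b))
                        out[y][x] = mid
                        out[ny][nx] = mid
    return out
```

Since it only looks at the final colour buffer, it can't recover subpixel detail (hence the thin-wire caveat): anything thinner than a pixel never makes it into the image for the filter to find.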
 
I am extremely intrigued by this game now after hearing what they have to say about the anti-aliasing, and hence the virtual lack of aliasing in the game. That's something that hasn't been accomplished by any game on any console before this.

I'll have to see for myself if it actually manages to do this. I'll be quite pleasantly surprised if it does. Although a bit of a shame that the gameplay can't match the fidelity of the graphics.

Regards,
SB

Some last-gen games also accomplished this, and going by the DF screenshots I wouldn't call it perfect, just very good. KZ: Shadow Fall also looks amazingly clean in a lot of areas; only some shader aliasing remains, which is still a problem for most games.

Considering not a single last-gen game provided decent AA (it's been a huge pet peeve of mine for over a decade), and KZ: SF certainly isn't up to my standards (though better than last gen)... um, yeah. All of this is, of course, IMO. I'm much more stringent about AA than about most other graphical artifacts.

That said, at typical living-room distances with typical living-room TVs, it's probably good enough for most. If you can't tell the difference between 720p and 1080p, it's unlikely you'll notice some forms of aliasing artifacts as long as the AA is at least somewhat decent. Distance and pixel density won't be able to address all aliasing artifacts, however.

Regards,
SB
 
Wasn't Forza Horizon one of the cleanest games ever on last gen, with MSAA and FXAA? Personally I don't consider Ryse a new benchmark, because I think some last-gen games achieved super clean IQ even at 720p.
 
So, that temporal edge-detect they've come up with (SMAA 1TX?) really is good. Anything that boosts image quality on the cheap is a great thing.
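The "temporal" part of techniques like that generally means blending the current frame with the (reprojected) previous frame, with the history sample clamped to the current pixel's neighbourhood to limit ghosting. A minimal per-pixel sketch of that idea, using scalar luminance and illustrative names (not Crytek's actual implementation):

```python
def temporal_resolve(curr, hist, y, x, blend=0.5):
    """curr, hist: 2D lists of scalar luminance values. Returns the
    temporally resolved value for pixel (y, x)."""
    h, w = len(curr), len(curr[0])
    # Min/max of the current frame's 3x3 neighbourhood around (y, x).
    nbr = [curr[cy][cx]
           for cy in range(max(0, y - 1), min(h, y + 2))
           for cx in range(max(0, x - 1), min(w, x + 2))]
    lo, hi = min(nbr), max(nbr)
    # Clamp history into that range: a simple guard against ghosting
    # when the scene has changed since last frame.
    clamped = min(max(hist[y][x], lo), hi)
    return (1 - blend) * curr[y][x] + blend * clamped
```

The extra samples over time are what let it smooth edges that a single-frame post-filter would miss, which is why it comes so cheap compared to MSAA.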
 
The PC version of Crysis 3 has all three SMAA settings to play with. I'm sure there are comparison shots out there, and performance charts too.
 
What constitutes decent AA? 4xMSAA?

4xMSAA with equivalent transparency AA is acceptable. 8x is better, but I can live with 4x without it bothering me. Specular aliasing continues to be a problem, however. And not all transparency AA solutions are good either. The better ones, based on supersampling, tend to be quite demanding, especially if there are a lot of transparencies in the scene.

4x or higher SSAA, preferably not ordered-grid, is better. But as I said, 4xMSAA with equivalent-quality transparency AA is enough that most aliasing won't bother me when playing a game.
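The "preferably not ordered grid" point can be made concrete with a little arithmetic. With four samples, an ordered grid has only two distinct horizontal sample positions, so a near-vertical edge sweeping across a pixel can only produce three coverage levels; a rotated grid has four distinct positions and so five levels, giving smoother gradients on exactly the edges that alias worst. The offsets below are illustrative, in pixel units:

```python
# Four-sample patterns: ordered grid vs a rotated-grid-style layout.
ORDERED_4X = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
ROTATED_4X = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

def coverage_levels(pattern):
    """Number of distinct coverage fractions a vertical edge can produce
    in one pixel: one level per distinct x threshold, plus zero coverage."""
    xs = sorted({x for x, _ in pattern})
    return len(xs) + 1

print(coverage_levels(ORDERED_4X))  # 3
print(coverage_levels(ROTATED_4X))  # 5
```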

It's all about gradually reducing or removing things that annoy me on screen when I play games. I don't expect all rendering artifacts to be eliminated immediately or within one generation of graphics hardware. But I do like to see progress.

On PC, many games have regressed, with worse AA now than what we had 4 years ago, due to the proliferation of compute-based AA, most of which is absolutely horrible, IMO. I recognize that what we'll be getting on PS4/XO will be better than what we had on PS3/X360, but it's hard to disengage the PC side of my brain, so I still see a lot of regression versus overall progression. It's more annoying when I consider that for many PC ports, the AA will be guided by what was possible on console, and hence we're still stuck with worse AA than what we had years ago.

And yes, I know that MSAA isn't easy to do with deferred rendering/lighting, even with some of the things put into DX10/11 to facilitate its use in those situations. But that doesn't make aliasing any less annoying.
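The deferred-rendering difficulty comes down to lighting being run on G-buffer samples: at geometry edges, each of a pixel's MSAA sub-samples may need its own lighting evaluation. A common mitigation (which the DX10/11 per-sample features help enable) is to classify pixels and only pay the per-sample cost on edges. A toy scalar sketch of that classification, with an illustrative stand-in for the lighting function:

```python
def shade(sample):
    """Stand-in for a lighting evaluation on one G-buffer sample."""
    return sample * 0.8

def resolve_pixel(samples):
    """samples: list of G-buffer values for one pixel's MSAA sub-samples."""
    if all(s == samples[0] for s in samples):
        # Interior pixel: all samples agree, one lighting eval suffices.
        return shade(samples[0])
    # Edge pixel: light every sub-sample, then average (the MSAA resolve).
    return sum(shade(s) for s in samples) / len(samples)
```

Since edge pixels are usually a small fraction of the frame, this keeps most of the MSAA quality without lighting every sample everywhere, though it still costs far more bandwidth than a post-process filter.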

Regards,
SB
 

God of War 3 apparently had the best AA in history.
I didn't always like it, though: it often looked as if I was playing a CGI sequence, which was super distracting, believe it or not.

Also: I believe you are the one who did a blind test concluding that nobody can see the difference between 720p and 1080p. Did you also test whether they could see a difference between AA methods at 1080p?

As for the DF article, they should have measured the power output. :( My guess is that there is a big difference between pre- and post-patch.
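The 720p-vs-1080p claim can at least be sanity-checked with basic visual-acuity arithmetic: assuming roughly one arcminute of resolving power (typical 20/20 vision) and an illustrative 50" 16:9 screen, there is a distance beyond which individual pixels can no longer be resolved. A back-of-envelope sketch:

```python
import math

def max_useful_distance_m(diagonal_in, rows, aspect=16 / 9):
    """Distance (metres) beyond which one pixel subtends less than
    1 arcminute, i.e. extra rows stop being resolvable."""
    height_in = diagonal_in / math.sqrt(1 + aspect ** 2)  # screen height
    pitch_in = height_in / rows                           # pixel pitch
    one_arcmin = math.radians(1 / 60)
    return pitch_in * 0.0254 / math.tan(one_arcmin)

for rows in (720, 1080):
    d = max_useful_distance_m(50, rows)
    print(f'50" {rows}p: detail resolvable out to ~{d:.1f} m')
```

On these assumptions a 50" 1080p set stops paying off past roughly 2 m, and 720p past roughly 3 m, which is why typical living-room setups blur the distinction; individual acuity and content obviously vary.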
 
I'm not quite sure where you heard that, but it's very wrong. Have you ever tried 8x supersampling, for example?

I think it was compared to every other console game, or maybe even against what 2005-2006-era PCs could produce at that resolution and framerate. I'll try to look for sources.
 
4xMSAA with equivalent transparency AA is acceptable.
Well then, that was achieved this gen. Warhawk at least had 4xMSAA. No transparency AA, but it didn't have transparent surfaces. ;)

http://www.preshweb.co.uk//wp-content/uploads/2008/12/warhawk51.jpg

Our pixel counters have found quite a few games with 4xMSAA at 720p on both consoles too. It was a real shame the XB360 never really delivered the 'free AA' IQ we were hoping for, but I guess that was because graphics switched away from forward rendering. All that said, last gen wasn't really strong enough to manage high AA along with the pixel pushing desired. I don't think any platform ever will be. The only way to get more AA is to use hardware beyond the spec targeted by the developers, where their fancy pixel shaders aren't exhausting the hardware and there's room to multisample everything. And Ryse is an optimistic look at how advanced AA techniques will hopefully produce better IQ with less overhead than MSAA, although Ryse does have a general softness that might be a side effect of such methods (until they're improved?).
 
Here's the DF article on AA.

Here is the analysis of GoW3, specifically page 3 where the AA talk begins.


thanks, great reading it again.

It was the "best AA ever" because it made people think it was a bullshot, pre-rendered. There is no greater compliment to real-time graphics or AA methods.
 
I just get the feeling that IW and DICE have basically dumped a straight port of the PC version onto the PS4, dialled back features where necessary, and done very little optimisation beyond a few modifications for compatibility where required.

Definitely seems like it, the hardware should be capable of much more with proper optimization.

After all, an HD 6870 could run BF3 on Ultra settings at 1920x1200 (11% more pixels than 1080p) and still average 44 fps. So there's little reason to think that closed-box hardware (with 30% faster memory) wouldn't be able to achieve 60 fps.
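The pixel-count figure checks out, and it implies some headroom at 1080p. A quick verification, assuming roughly pixel-bound (linear) scaling, which is a simplification since real performance rarely scales perfectly with resolution:

```python
# 1920x1200 vs 1920x1080: how much extra work, and what would the same
# GPU's 44 fps average translate to at 1080p under linear scaling?
px_1200 = 1920 * 1200
px_1080 = 1920 * 1080
ratio = px_1200 / px_1080
print(f"pixel ratio: {ratio:.3f}")                          # 1.111
print(f"44 fps at 1920x1200 ~ {44 * ratio:.0f} fps at 1080p")  # ~49
```

So even under this rough model the card lands near 49 fps at 1080p, i.e. still short of a locked 60, which is where console-specific optimisation and the faster memory would have to make up the difference.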
 