Digital Foundry Article Technical Discussion Archive [2014]

Status
Not open for further replies.
I also wonder why they insist on MSAA vs FXAA as that is another performance hit.

Please, 2xMSAA >>> FXAA.

FXAA blurs the sub-details, hence it's not good when you have to aim precisely. With 2xMSAA you almost gain twice the sub-details (vs strong FXAA), which can be really important at 60fps in a twitch shooter (also good for a driving game).

FXAA is the past. When used strongly, as in COD or BF4, it's a very bad blur algorithm which doesn't prevent temporal aliasing, wrecks sub-details/sub-contrasts and even creates nasty artifacts.

In 5 years this algorithm (well, at least the strong blurring variant) will be dead, except if you want to use it in a cartoonish or artistically fuzzy game where you don't care about geometry sub-details, realism or texture resolution.
 
^I don't think he's arguing that FXAA looks better, but that it will give a performance advantage over MSAA. There's lots of tearing and dips to even the 30s when two or more titans are fighting on screen.

Digital Foundry has tested both versions of Titanfall, and the situation is not perfect on Xbone:

http://www.eurogamer.net/articles/digitalfoundry-2014-titanfall-beta-tech-analysis


The only advantages of the PC version are the higher rendering resolution, MSAA and an FOV slider. Everything else is copied directly from the Xbone code - textures are identical, as are all effects and their LOD states. The engine is simplistic and low-poly, with no dynamic shadows. Just the basic Source engine without any next-gen touch.
The framerate is similar to Tomb Raider on PS4. That's not good for a twitch shooter like Titanfall.
 
Please, 2xMSAA >>> FXAA.

FXAA blurs the sub-details, hence it's not good when you have to aim precisely. With 2xMSAA you almost gain twice the sub-details (vs strong FXAA), which can be really important at 60fps in a twitch shooter (also good for a driving game).

FXAA is the past. When used strongly, as in COD or BF4, it's a very bad blur algorithm which doesn't prevent temporal aliasing, wrecks sub-details/sub-contrasts and even creates nasty artifacts.

In 5 years this algorithm (well, at least the strong blurring variant) will be dead, except if you want to use it in a cartoonish or artistically fuzzy game where you don't care about geometry sub-details, realism or texture resolution.

No. Just no. MSAA is the past, and a proper implementation of FXAA/MLAA/SMAA does not have to blur everything; plus it has the benefit of alpha coverage. EDIT: Post-process AA solutions are here to stay, and with the current shift towards deferred rendering they are becoming the AA of choice for developers. And when you combine that with the performance benefit of using a PPAA solution, the distinction becomes clearer. You even admit you are concerned with "the strong blurring variant", but that is just a tweak. If you prefer SMAA, then pretend I wrote that.

You are dead wrong on this. And CoD has relied on MSAA for the most part on consoles and PC.
FXAA doesn't work well with distant detail, which I'm guessing is important in this game as a shooter? Or is it all run-and-gun with no need for spotting enemies afar?
I'd agree to that, though I often see MSAA break down in that case. From what I've seen, most everything is up close and personal in Titanfall currently.
 
I'm not really sure why the insistence on 1408x792 versus 720p as the resolution. 1.1 Mpixels vs 921k isn't a massive bump, and with performance sitting where it is, you'd think it would be an easy choice to claw back some performance. Maybe it's a marketing thing?
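For reference, the pixel counts quoted above work out like this (a quick sanity check, nothing more):

```python
# Pixel counts for the two resolutions discussed in the thread.
native = 1280 * 720        # 921,600 pixels ("720p")
titanfall = 1408 * 792     # 1,115,136 pixels

ratio = titanfall / native
print(f"{titanfall:,} vs {native:,} pixels -> {ratio:.2f}x the fill cost")
# 1,115,136 vs 921,600 -> 1.21x
```

So the bump is about 21% more pixels per frame: real, but modest.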
I think it's entirely a marketing thing, just so they can say that they're running higher than 720p because of all the flak that 720p has been taking on next-gen.
 
I think it's entirely a marketing thing, just so they can say that they're running higher than 720p because of all the flak that 720p has been taking on next-gen.


So this. That being said, there is time for at least the framerate to stabilise before release, but I really doubt we'll see them do that and up the res to 900p at launch. DF did pinpoint alpha effects as the worst culprit for framerate dips, so what are the odds we'll see a drop in alpha resolution to try to even out the framerate? I hope not: crappy alpha effects made the PS3 look substantially rougher than the X360 last gen, and seeing that nonsense carry over to the current gen would be very disappointing.
 
No. Just no. MSAA is the past and proper implementation of FXAA/MLAA/SMAA does not have to blur everything, plus it has the benefit of alpha coverage.

You are dead wrong on this. And CoD has relied on MSAA for the most part on consoles and PC.

I'd agree to that, though I often see MSAA break down in that case. From what I've seen, most everything is up close and personal in Titanfall currently.

Yes, past CODs mainly used MSAA. But not COD and BF4 on next gen, which both use a very strong and probably very cheap FXAA.

The problem is that the dirt-cheap variant of FXAA does blur everything, and it is used in many next-gen launch games (mostly PS4 games, unfortunately), just as Quincunx was abused in PS3 launch games.

Tomb Raider on PC and on next gen (PS4 & XB1) uses a sharp, quite efficient (and good) version of FXAA. But Tomb Raider on X360 uses a strong blurring variant that I suspect is not even in the same family of algorithms.

In my opinion, the future is a combination of AAs like MSAA + SMAA + temporal AA. But it's as if many developers don't even care about the image quality of their games. Even if the console can handle SMAA (which is quite cheap now) or MSAA (by removing a few polygons or details/shaders which would have been blurred by the FXAA anyway), they will continue to wreck their games with vaseline filters out of laziness or ignorance.
 
...
The problem is that the dirt-cheap variant of FXAA does blur everything, and it is used in many next-gen launch games (mostly PS4 games, unfortunately), just as Quincunx was abused in PS3 launch games.

Was there ever a good use of Quincunx that didn't look like a blurry mess? I had honestly mentally assigned it as 'cheap broken-ass AA' and hadn't considered the possibility that Sony or Nvidia had just implemented it badly.
 
I'd agree to that, though I often see MSAA break down in that case. From what I've seen, most everything is up close and personal in Titanfall currently.
Properly-implemented MSAA doesn't "break down" in its coverage of geometry, ever. One of the great advantages of non-temporal sample-based AA techniques is that they have pristine consistency and stability.

Thin objects in the distance will still shimmer, because they still might not be sampled all that sufficiently. But it'll be astronomically better than FXAA, which might as well be no-AA for badly undersampled objects.
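The consistency argument can be seen with a toy coverage test. This is just an illustrative sketch: the 4-sample rotated-grid pattern below is made up for the example and is not any specific GPU's pattern.

```python
# Estimate how much of a pixel a vertical edge covers, using either a single
# centre sample (no AA) or four fixed sub-pixel samples (4xMSAA-style).
def coverage(edge_x, samples):
    """Fraction of samples falling on the left side of an edge at edge_x."""
    return sum(1 for sx, sy in samples if sx < edge_x) / len(samples)

CENTER = [(0.5, 0.5)]
# Illustrative rotated-grid pattern (hypothetical, not real hardware's).
MSAA4 = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

for edge_x in (0.3, 0.5, 0.7):  # edge slides across the pixel
    # Centre sampling snaps between 0 and 1; MSAA steps through 0.25/0.5/0.75.
    print(edge_x, coverage(edge_x, CENTER), coverage(edge_x, MSAA4))
```

Because the sample positions are fixed, the estimate is fully deterministic: the same geometry always produces the same coverage, which is the stability being described. More samples just make the steps finer.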

Was there ever a good use of Quincunx that didn't look like a blurry mess? I had honestly mentally assigned it as 'cheap broken-ass AA' and hadn't considered the possibility that Sony or Nvidia had just implemented it badly.
Quincunx is more or less intrinsically blurry, since it uses an extremely wide sample pattern.

IMO it has shockingly good antialiasing performance for a pattern with such sparse samples, it's amazing how well it can sometimes deal with moire artifacts and such. But obviously there's a cost.
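That wide pattern is usually described as a centre sample weighted 1/2 plus the four corner samples (shared with neighbouring pixels) at 1/8 each. A toy sketch of that resolve, with samples as greyscale floats:

```python
# Quincunx reconstruction as a weighted sum over the sample grid: the centre
# sample gets weight 1/2 and the four shared corner samples 1/8 each. The
# corner taps reach into neighbouring pixels, which is where the intrinsic
# softness comes from compared with a plain 2x box filter.
def quincunx(center, corners):
    assert len(corners) == 4
    return 0.5 * center + sum(0.125 * c for c in corners)

# Example: a white centre sample surrounded by black corner samples resolves
# to mid-grey, showing how much the corners pull the pixel around.
print(quincunx(1.0, [0.0, 0.0, 0.0, 0.0]))  # 0.5
```

Note only two samples are actually stored per pixel (centre + one corner); the other three corners are borrowed from neighbours, which is why it antialiases well for its sample count but cannot avoid blurring.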
 
WRT Quincunx
I had honestly mentally assigned that as 'cheap broken ass AA'
I've said it before here: Quincunx is more expensive than 2xMSAA. E.g. you get something like:

1280x720 2xMSAA = 38fps
1280x720 Quincunx = 36fps

The reason developers chose it wasn't that it was cheap or easier; it's that, in their eyes, it looked better.

Looking at the Titanfall screenshots, often running at sub-60fps, I've gotta say they chose the wrong engine. It looks last gen; surely they had enough cash to use DICE/Unreal/Crytek etc.
 
...they will continue to wreck their games with vaseline filters because of laziness or ignorance.
Or ridiculous deadlines, or management decisions earlier in the development process that backed them into a corner, or a limited engine (similarly affected by choices early in the development process), etc. Calling devs lazy or ignorant doesn't go down that well here, where the wide variety of influences is more keenly appreciated. Unless you have a track record as a swoop-in development guru who's implemented multiple AA fixes across games, engines and platforms, and can categorically prove it's all down to laziness or ignorance with some convincing references, I suggest you issue an apologetic revision to that assertion.
 
Or ridiculous deadlines, or management decisions earlier in the development process that backed them into a corner, or a limited engine (similarly affected by choices early in the development process), etc. Calling devs lazy or ignorant doesn't go down that well here, where the wide variety of influences is more keenly appreciated. Unless you have a track record as a swoop-in development guru who's implemented multiple AA fixes across games, engines and platforms, and can categorically prove it's all down to laziness or ignorance with some convincing references, I suggest you issue an apologetic revision to that assertion.

Ok I will try to do that with one specific game, Knack on PS4:

- Runs at 35-40fps and, I think, rarely if ever drops below 30fps.
- Is blurred by a strong FXAA.
- Constantly stutters because of the unlocked ~35fps framerate.

When they could have easily:

- Locked the game at 30fps to have a constant smooth game
- Used a quite cheap SMAA instead

The game would have been super sharp (more than Ryse!), with less aliasing than it has currently (because, you know, a cheap FXAA just blurs the aliasing; it doesn't remove it like a real morphological AA such as SMAA), and it would also run smoothly versus the current juddering game.

In my humble opinion, this is a good example of ignorance, in the sense of "laziness in finding better image-quality trade-offs with AA solutions", in game development. Also, even a cheap blurring FXAA uses some GPU time. When you decide to use SMAA or a small 2xMSAA, you should take into consideration the bit of GPU time you gain just by removing the vaseline filter.

Anyway, I do apologise for having used this very strong word ("laziness"), and I admit I used it only to leave a strong (maybe too strong) impression in my post. I don't really believe that any game developer can be lazy. I know game programming is time-consuming and really hard.
 
In my opinion, the future is a combination of AAs like MSAA + SMAA + temporal aliasing AA.

You mean like TXAA ;-) The problem, though, is that it's very expensive (comparable to MSAA), although the image quality is unparalleled. Then again, it does produce a softer image, which for some games may not be preferable, and so in those cases a combination of pure MSAA and transparency AA would be best - assuming you have sufficient performance to spare.
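The temporal half of a TXAA-style resolve is commonly described as blending the current frame into an accumulated history buffer. The sketch below is an illustrative toy of that exponential blend, not NVIDIA's actual algorithm; the `alpha` value is an assumed parameter.

```python
# Minimal sketch of a temporal accumulation resolve: blend the current frame
# into a history buffer. A low `alpha` kills frame-to-frame flicker but also
# smears detail over time, which is why the result looks softer.
def temporal_blend(history, current, alpha=0.1):
    return [(1 - alpha) * h + alpha * c for h, c in zip(history, current)]

# A pixel flickering between black and white every frame (classic shimmer)
# settles near its temporal average instead of strobing.
history = [0.0]
for frame in range(200):
    history = temporal_blend(history, [float(frame % 2)])
print(round(history[0], 2))  # close to 0.5 rather than flashing 0/1
```

The softness trade-off mentioned above falls straight out of this: the same averaging that suppresses shimmer also averages away legitimate high-frequency detail unless the resolve reprojects and rejects stale history.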
 
Ok I will try to do that with one specific game, Knack on PS4:

- Runs at 35-40fps and, I think, rarely if ever drops below 30fps.
- Is blurred by a strong FXAA.
- Constantly stutters because of the unlocked ~35fps framerate.

When they could have easily:

- Locked the game at 30fps to have a constant smooth game
- Used a quite cheap SMAA instead

The game would have been super sharp (more than Ryse!), with less aliasing than it has currently (because, you know, a cheap FXAA just blurs the aliasing; it doesn't remove it like a real morphological AA such as SMAA), and it would also run smoothly versus the current juddering game.

In my humble opinion, this is a good example of ignorance, in the sense of "laziness in finding better image-quality trade-offs with AA solutions", in game development. Also, even a cheap blurring FXAA uses some GPU time. When you decide to use SMAA or a small 2xMSAA, you should take into consideration the bit of GPU time you gain just by removing the vaseline filter.

How can you make such a claim?
Do you know for sure that ignorance is the "limiting factor" in Knack's case?
 
You mean like TXAA ;-) The problem, though, is that it's very expensive (comparable to MSAA), although the image quality is unparalleled. Then again, it does produce a softer image, which for some games may not be preferable, and so in those cases a combination of pure MSAA and transparency AA would be best - assuming you have sufficient performance to spare.
A combination of SMAA 4x and 2x TXAA would be quite incredible: better edge definition from SMAA and flicker reduction from TXAA.
 
How can you make such a claim?
Do you know for sure that ignorance is the "limiting factor" in Knack's case?

No, but it's certainly not outside the realm of possibility. Outside of Cerny, who had plenty of other things to do, I don't think we have any reason to assume the actual game was created by a team who are up to speed with the latest development techniques. They've even been questioned on being up to speed with the concept of shaders ...
 
(because, you know, a cheap FXAA just blurs the aliasing; it doesn't remove it like a real morphological AA such as SMAA)

FXAA and SMAA/MLAA are fundamentally very similar: they attempt to reconstruct the original signals (triangles) that were rasterized by analyzing the actual pixel colors that were written to the render target, and then apply filtering based on the angle of the reconstructed edge. There are some implementation details that cause differences in performance as well as in the result they produce, but they ultimately have the same basic advantages and disadvantages: they're cheap and very effective on still images, but have a really hard time eliminating temporal artifacts since they're working with limited information. So if you're going to somehow simplify FXAA down to "it just blurs the aliasing", then you're going to have to give SMAA/MLAA the same treatment.

On a side note, it makes me sad how much the word "blur" is misused here and on other Internet forums. :cry:
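The shared idea (work on the final colours, find edges via luma contrast, then filter along them) can be sketched in a few lines. This is a toy, not any shipping FXAA/SMAA implementation; the luma weights are the classic Rec. 601 coefficients, and the threshold is an arbitrary illustrative value.

```python
# Toy version of the first step shared by FXAA/MLAA/SMAA: estimate luma from
# the rendered colours, then flag positions where neighbouring lumas differ
# by more than a threshold. Real implementations go on to classify the edge
# shape and blend along it; this just shows the "reconstruct from the final
# pixel colours" idea described in the post above.
def luma(rgb):
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b  # classic Rec. 601 weights

def detect_edges(row, threshold=0.1):
    """Mark gaps between adjacent pixels whose luma contrast exceeds threshold."""
    lumas = [luma(p) for p in row]
    return [abs(a - b) > threshold for a, b in zip(lumas, lumas[1:])]

row = [(0, 0, 0), (0, 0, 0), (1, 1, 1), (1, 1, 1)]  # hard black/white edge
print(detect_edges(row))  # only the middle gap is flagged as an edge
```

The temporal weakness follows directly: with only the current frame's colours available, a sub-pixel feature that rasterizes differently next frame produces different edges, so the filter output flickers along with it.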
 
No, but it's certainly not outside the realm of possibility. Outside of Cerny, who had plenty of other things to do, I don't think we have any reason to assume the actual game was created by a team who are up to speed with the latest development techniques. They've even been questioned on being up to speed with the concept of shaders ...

It is also possible that they felt that was the right solution.
Or maybe that was the best result that they could get with the time they were given.
 
A combination of SMAA 4x and 2x TXAA would be quite incredible: better edge definition from SMAA and flicker reduction from TXAA.

I'm not sure you'd really need SMAA in that instance. TXAA 2x should already give comparable or better edge definition. I don't have a direct comparison of TXAA 2x to SMAA at present, but certainly TXAA 4x gives much better edge definition on its own than SMAA.

SMAA:
http://smg.photobucket.com/user/pjb...ng.html?&_suid=139264064088509900940294316707

TXAA 4x
http://smg.photobucket.com/user/pjb...ng.html?&_suid=139264069711204298699286567428
 
FXAA and SMAA/MLAA are fundamentally very similar: they attempt to reconstruct the original signals (triangles) that were rasterized by analyzing the actual pixel colors that were written to the render target, and then apply filtering based on the angle of the reconstructed edge. There are some implementation details that cause differences in performance as well as in the result they produce, but they ultimately have the same basic advantages and disadvantages: they're cheap and very effective on still images, but have a really hard time eliminating temporal artifacts since they're working with limited information. So if you're going to somehow simplify FXAA down to "it just blurs the aliasing", then you're going to have to give SMAA/MLAA the same treatment.

On a side note, it makes me sad how much the word "blur" is misused here and on other Internet forums. :cry:

If you're talking only about the good (and probably not so cheap) variant of FXAA, OK (but even then, FXAA in TRDE for instance will only deal with sub-pixel aliasing, whereas SMAA will reconstruct the whole long edges).

But if we are dealing with the cheap FXAA algorithm, then to me it's a blur algorithm. Here is Trine 2 on PS4 (1080p + cheap blurring FXAA) vs PC (MSAA):

Trine2_PS4_blurred_by_FXAA_Versus_sharp_PC.png

Here is an AC4 PS4/PC comparison. Notice the textures are blurred in the PC version compared to PS4. A cheap FXAA is on by default in the PC version, and the PC version is also running 4xMSAA, by the way.

a19kp23.png

Now here two cropped images from Forza 5. NoAA vs SMAA applied by myself:

Forza5_bmp_cropped_No_AA.jpg

Forza5_bmp_cropped_SMAA.jpg

SMAA barely blurs the image at all: almost zero blur, and it tries to reconstruct the edges to match an ideal 8xSSAA, whereas (cheap) FXAA extensively blurs everything: the edges, the textures and all the assets.

The latest version of SMAA really is a beast in cost/performance ratio, far above FXAA or MLAA. And it can already be combined with MSAA and temporal AA quite effectively; the code (and an impressive comparative demo) is available to everyone.
 
*side-discussion

Again, FM5 just has broken MSAA. If you zoom in far enough, you can just make out the extremely faint shading on thin geometry or at the top left of the windshield. The left side-view mirror area actually shows the 2xAA.

The lighting generally borks the AA, as I've tried to explain before.

/side-discussion
 