Digital Foundry Article Technical Discussion Archive [2015]

Status
Not open for further replies.
Why would you say that? The difference isn't that huge to my understanding. Both games run at 1080p while maintaining almost exactly the same visual quality. Both games reach 60fps, with some dips in the XB1 version.
That's why people would say that. The PS4 is capped for the sake of visual and control consistency; although the XB1 version occasionally keeps up, it (if we go by the DF numbers) occasionally spits out frames up to 50% slower than that target. And although the average numeric difference might be smaller, in the real world it's the difference between "stable" and "juddery." And for the PS4 to put out that performance, it's likely producing frames much faster than 17ms most of the time.

In instances where the 360 version used 2xMSAA, QAA on PS3 was an artistic choice more than a perf disadvantage. If 360 had supported a quincunx sample+resolve pattern, it's likely a lot of those same devs would have used it on 360 as well.
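Converting the framerates discussed above into per-frame times shows why occasional dips read as judder. A toy sketch (the 60fps cap and 40fps dip are the figures from the discussion; the arithmetic is just illustrative):

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds available to produce one frame at a given rate."""
    return 1000.0 / fps

target = frame_time_ms(60)  # ~16.7 ms budget under a 60fps cap
dip = frame_time_ms(40)     # ~25.0 ms when the game dips to 40fps

# A 40fps frame takes roughly 50% longer than the 16.7 ms target,
# which is where the "up to 50% slower" figure comes from.
overshoot = (dip - target) / target
print(f"target {target:.1f} ms, dip {dip:.1f} ms, {overshoot:.0%} over budget")
```

This is also why a capped PS4 version "hiding" headroom is plausible: to never miss a 16.7 ms deadline, most frames must complete well under it.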
 
That's why people would say that. The PS4 is capped for the sake of visual and control consistency; although the XB1 version occasionally keeps up, it (if we go by the DF numbers) occasionally spits out frames up to 50% slower than that target. And although the average numeric difference might be smaller, in the real world it's the difference between "stable" and "juddery." And for the PS4 to put out that performance, it's likely producing frames much faster than 17ms most of the time.
I didn't say there is perfect parity. The 50% framerate scenarios are worst-case scenarios. It is not a consistent difference.
In instances where the 360 version used 2xMSAA, QAA on PS3 was an artistic choice more than a perf disadvantage. If 360 had supported a quincunx sample+resolve pattern, it's likely a lot of those same devs would have used it on 360 as well.
2xMSAA was producing clearly better results on the 360 in multiplatform games. There was no artistic gain in almost any of the scenarios. But still, in most cases the 360 did outperform the PS3 in a combination of the areas I mentioned earlier.
 
I didn't say there is perfect parity. The 50% framerate scenarios are worst-case scenarios. It is not a consistent difference.
Right, which is why I explained that a lengthy framerate time-average doesn't tell the whole story. Namely, the PS4 needs to be producing frames far faster than the XB1 version in general to maintain that sort of consistency, and that consistency is more impactful than the framerate percentage gap would imply, since it's the difference between smooth and juddery. In other words, the PS4's strengths are being leveraged toward a significantly smoother-feeling game in a way that requires significantly more processing power.

2xMSAA was producing clearly better results on the 360 in multiplatform games. There was no artistic gain in almost any of the scenarios.
There's no reason that QAA would be cheaper than 2xMSAA, and the PS3 supports both. So regardless of what any of us think about the two methods, it's hard to imagine why quincunx would be chosen except because someone thought it gave better results.
 
There's no reason that QAA would be cheaper than 2xMSAA, and the PS3 supports both. So regardless of what any of us think about the two methods, it's hard to imagine why quincunx would be chosen except because someone thought it gave better results.
Or because the PS3 was having a harder time using 2xMSAA
 
Why would you say that? The difference isn't that huge to my understanding. Both games run at 1080p while maintaining almost exactly the same visual quality. Both games reach 60fps, with some dips in the XB1 version. To me that's closer than what we used to get a year ago. It's probably even closer than the performance differences we used to get the previous gen, where the PS3 versions usually had some of the following issues, or a combination of them: less stable performance, QAA or missing AA, missing foliage, lower-res transparencies or missing transparent effects, lower resolution, blurrier textures, etc.
There is no observed 40% difference in this game.

No. That's not what the article really says (though it's certainly what it might imply). It's really more like, "with some 60fps moments when nothing much is happening on the screen".
 
Or because the PS3 was having a harder time using 2xMSAA
2xMSAA works by using two geometry samples at each pixel, and to resolve the final image it blends those two samples.
PS3 QAA works by using two geometry samples at each pixel, and to resolve the final image it blends those two samples, in addition to three extra samples from neighboring pixels.

On a basic level, the latter is actually more expensive. Depending on the hardware design and how things are cached it's possible that QAA is identical in cost, but it would imply rather bizarre hardware design for QAA to be cheaper than 2xMSAA.

Furthermore, we know that developers have occasionally gone for wide soft reconstruction filters even with custom solutions where it's trivially obvious that a narrow filter would have been cheaper or the same cost, such as Halo Reach's quincunx TAA or HRAA's use of flipquad.
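The tap-count difference described above can be sketched in a few lines. This is a toy illustration under the sample layout just described (two geometry samples per pixel; QAA additionally reads three samples from neighboring pixels); equal weights are used for simplicity, whereas real hardware resolves may weight the taps differently:

```python
import numpy as np

def resolve_2xmsaa(own_samples):
    # 2xMSAA resolve: average the pixel's own two samples (2 taps).
    return np.mean(own_samples, axis=0)

def resolve_quincunx(own_samples, neighbor_samples):
    # QAA resolve: the pixel's own two samples plus three samples
    # borrowed from neighboring pixels (5 taps total), hence the
    # wider, softer filter and the extra resolve bandwidth.
    taps = np.concatenate([own_samples, neighbor_samples], axis=0)
    return np.mean(taps, axis=0)

# Example: identical storage per pixel (2 samples), but QAA touches
# more data at resolve time.
own = np.array([1.0, 3.0])
neighbors = np.array([2.0, 2.0, 2.0])
print(resolve_2xmsaa(own), resolve_quincunx(own, neighbors))
```

The point of the sketch: both schemes store the same number of samples per pixel, so QAA can't be cheaper in memory, and its resolve reads strictly more data.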
 
I didn't say there is perfect parity. The 50% framerate scenarios are worst-case scenarios. It is not a consistent difference.

2xMSAA was producing clearly better results on the 360 in multiplatform games. There was no artistic gain in almost any of the scenarios. But still, in most cases the 360 did outperform the PS3 in a combination of the areas I mentioned earlier.

Maybe if the PS4 version's framerate were uncapped, the true framerate would be 80+ fps while the Xbox One runs at 60+ fps, and we'd have a 40-50% fps difference the whole time...
 
No. That's not what the article really says (though it's certainly what it might imply). It's really more like, "with some 60fps moments when nothing much is happening on the screen".

Pretty much...

In relaxed scenes in less detailed locations, Xbox One hits the desired 60fps target, but as soon as the action starts to ramp up we see the game frequently operating between 50-60fps during action scenes, where alpha transparency effects put the engine under stress. While there is a mild reduction in how crisp the controls feel, gameplay isn't adversely affected here, and it's still possible to easily string together combos and counters without any problems. This doesn't hold true for the entire game, and some situations see the action hit the mid-40s, causing increased judder and a momentary jump in controller latency, though only for a brief moment before frame-rates go back up to their usual 50-60fps rate.
 
I didn't say there is perfect parity. The 50% framerate scenarios are worst-case scenarios. It is not a consistent difference.

2xMSAA was producing clearly better results on the 360 in multiplatform games. There was no artistic gain in almost any of the scenarios. But still, in most cases the 360 did outperform the PS3 in a combination of the areas I mentioned earlier.

I think you're confusing technical and visual differences... the technical differences are bigger this gen.

You may think the visual differences are smaller now, but that's more an opinion than an indisputable fact.
 
So I finally read through this and looked at the comparison images.

Are we sure that it's an AA difference between PS4 and XB1, and not a texture filtering difference? In those shots it looks like the XB1's unique aliasing is across flat surfaces, on thin details that presumably wouldn't get fixed on PS4 either, if it's using a non-temporal PPAA solution. As far as geometry aliasing and stuff goes, it looks basically identical on both platforms, and doesn't seem to be getting dealt with much at all on either.
If I had to guess based on those images, I'd think that:
1-The XB1 features a significant degree of negative LOD bias (possibly as an AF alternative?), and
2-PS4 and XB1 use the same AA solution, which is either no AA or an extraordinarily conservative PPAA.
 
The making of Uncharted: the Nathan Drake Collection
Bluepoint Games discusses PS4's finest remaster with Digital Foundry.

An excellent read! :yes:

BP: The three games don't run on the same engine, you can't just switch U1 over to use Havok and expect the gameplay to remain the same. Each game has its own version of the engine ported to PS4 and only the parts that are truly the same (or parts we changed to operate the same with thorough testing) are actually shared across the three games.

..

BP: The TLOU code base itself was actually an evolved version of U2 (not U3), meaning it didn't have some of the functionality required to make U3 work. Still, having access to the TLOU engine for reference helped a lot. Once the PS4 versions were initially running (using all the knowledge the TLOU engine could provide) we were hovering at ~30 fps.

 
So this is why Uncharted 2 was the first to get ported, interesting! Will read the whole thing.
 
Regarding QAA which is supported on PS3 and not Xbox 360.

QAA uses as much memory as 2xMSAA.
It gives AA similar to 4xMSAA.
The negative part is that it causes blurring of the screen.

And believe it or not, in reality it goes like this: (more demanding) 4xMSAA > QAA > 2xMSAA (less demanding), as we can see in the graph from nVidia: http://www.nvidia.com/object/feature_hraa.html

So HTupolev probably is right when he says that it is an artistic choice.

/ Ken
 
About the lighting changes in the collection from the DF article
Digital Foundry: On a game-by-game basis, what was the approach taken in terms of revamping the lighting? Even in the later games, some of the changes can seem quite radical, while often the overall look is rather more subtle than the originals.

Marco Thrush: A lot of the lighting differences are due to the fact that the earlier games used a 'fake' ambient specular term. We essentially replaced that with an IBL [image-based lighting] specular approach that is more common these days, and revamped the specular lighting model on the materials to be more physically correct. Then we went in and created new textures to control the specular lighting for every texture. The result of that is a more realistic looking specular which reflects the environment. This means that in some cases you see more specular lighting than before, and in other cases it means you get less specular because it shouldn't have been there in the first place. On a regular basis we would capture PS3/PS4 side-by-side images to compare the differences in lighting and rein in when things were starting to look too different.

Fitting all three games on one disc
Digital Foundry: One of the biggest questions we had pre-launch concerned the pre-rendered cut-scenes and how you would fit all three games onto one Blu-ray. Obviously you could lose the 3D encodes from Uncharted 3, but beyond that, just how did everything fit onto one disc?

Marco Thrush: Better compression for both audio and video. Removing video content: S3D movies for U3, bonus content for all games, credits movies (we rendered the credits at runtime to save disk space). Removing multiplayer assets helped as well. Lastly, a lot of streaming games improve load time by reducing seek-time overhead and by duplicating assets to place data physically close on the Blu-ray. With all data being installed onto the hard drive (with much faster seek times), we're able to get away with just storing a single copy of each texture and still have everything load in time (or even faster than the PS3). One question we've seen come up is: Why don't you just render the cinematics in real time? The reality is that all the geometry and texture data required to render the cut-scenes takes up way more room than the movies.
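The duplication trade-off Bluepoint describes can be sketched with a back-of-the-envelope calculation. All numbers below are invented for illustration, not from the article; the point is only that a disc layout optimized for seek locality multiplies hot assets, while a hard-drive install can keep a single copy:

```python
# asset -> (size in MB, copies placed on the Blu-ray for seek locality)
# Hypothetical asset names and sizes.
assets = {
    "env_textures": (9000, 3),
    "char_textures": (2000, 2),
    "audio": (4000, 1),
}

disc_layout = sum(size * copies for size, copies in assets.values())
deduplicated = sum(size for size, _ in assets.values())

print(f"duplicated disc layout: {disc_layout} MB")
print(f"single-copy install:    {deduplicated} MB")
```

With faster HDD seek times the single-copy layout still streams in time, which is the space win Thrush describes.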

And about the AA
Digital Foundry: Could you share some details on the anti-aliasing used? The quality seems higher than the typical post-process solution.

Marco Thrush: We use a fairly simple FXAA solution. The best way to avoid aliasing is to make sure the content doesn't create it in the first place!

Bluepoint did a fantastic job but they were also given the proper amount of time (and most probably budget) to complete the task (about 15 months according to the article).
 
Do any games use a half-exposure blur rather than a full-exposure blur? That'd be more cinematic but less smooth.
This has a bigger impact on perceived quality (a full point) than going to 4K or higher spatial resolution, as tested by the EBU.
 
Regarding QAA which is supported on PS3 and not Xbox 360.

QAA uses as much memory as 2xMSAA.
It gives AA similar to 4xMSAA.
The negative part is that it causes blurring of the screen.

And believe it or not, in reality it goes like this: (more demanding) 4xMSAA > QAA > 2xMSAA (less demanding), as we can see in the graph from nVidia: http://www.nvidia.com/object/feature_hraa.html

So HTupolev probably is right when he says that it is an artistic choice.

/ Ken

Yes. QAA = 2xMSAA + Vaseline smeared all over your screen. :p

The quality isn't even remotely similar to 4xMSAA, unfortunately, but somewhere in between. And as many mention, it was generally regarded with derision and abandoned by NVidia as soon as they were able to produce MSAA competitive with ATI's in both speed and IQ.

Regards,
SB
 