And yet they perform extremely closely in this game's case.
> Why would you say that? The difference isn't that huge to my understanding. Both games run at 1080p while maintaining almost exactly the same visual quality. Both games reach 60fps, with some dips in the XB1 version.

That's why people would say that. The PS4 is capped for the sake of visual and control consistency; although the XB1 version occasionally keeps up, it (if we go by the DF numbers) occasionally spits out frames up to 50% slower than that target. And although the average numeric difference might be smaller, in the real world it's the difference between "stable" and "juddery." And for the PS4 to put out that performance, it's likely producing frames much faster than 17ms most of the time.
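To make the frame-time arithmetic concrete, here's a toy Python sketch; the traces and framing are illustrative assumptions, not DF's measurements:

```python
# Toy illustration of why a capped 60fps can feel very different from an
# uncapped one, even when the average framerates look close.
TARGET_MS = 1000.0 / 60.0  # ~16.7ms frame budget at 60fps

def frames_over_budget(frame_times_ms):
    """Each frame slower than the 60fps budget shows up as judder."""
    return sum(1 for t in frame_times_ms if t > TARGET_MS)

# Hypothetical traces: the capped machine renders well under budget and waits
# out the remainder; the other sometimes takes ~25ms, ~50% over the target.
capped = [12.0, 13.5, 11.8, 14.0] * 25
uncapped = [16.0, 25.0, 17.5, 16.2] * 25

for name, trace in (("capped", capped), ("uncapped", uncapped)):
    avg_fps = 1000.0 / (sum(trace) / len(trace))
    print(f"{name}: raw avg {avg_fps:.0f}fps, "
          f"{frames_over_budget(trace)} frames over budget")
```

The raw averages differ by far less than 50%, yet one trace never misses the 16.7ms budget and the other misses it constantly; that's the "stable" versus "juddery" distinction above.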
In instances where the 360 version used 2xMSAA, QAA on PS3 was an artistic choice more than a perf disadvantage. If 360 had supported a quincunx sample+resolve pattern, it's likely a lot of those same devs would have used it on 360 as well.
> That's why people would say that. The PS4 is capped for the sake of visual and control consistency; although the XB1 version occasionally keeps up, it (if we go by the DF numbers) occasionally spits out frames up to 50% slower than that target. And although the average numeric difference might be smaller, in the real world it's the difference between "stable" and "juddery." And for the PS4 to put out that performance, it's likely producing frames much faster than 17ms most of the time.

I didn't say there is perfect parity. The 50% framerate scenarios are the worst-case scenarios. It is not a consistent difference.
> In instances where the 360 version used 2xMSAA, QAA on PS3 was an artistic choice more than a perf disadvantage. If 360 had supported a quincunx sample+resolve pattern, it's likely a lot of those same devs would have used it on 360 as well.

2xMSAA was producing clearly better results on the 360 in multiplatform games. There was no artistic gain in almost any of the scenarios. But still, the 360 in most cases did outperform the PS3 in a combination of the areas I mentioned earlier.
> I didn't say there is perfect parity. The 50% framerate scenarios are the worst-case scenarios. It is not a consistent difference.

Right, which is why I explained why a lengthy framerate time-average doesn't tell the whole story. Namely, the PS4 needs to be producing frames far faster than the XB1 version in general to maintain that sort of consistency, and that consistency is more impactful than the framerate percentage gap would imply, since it's the difference between smooth and juddery. In other words, the PS4's strengths are being leveraged toward a significantly smoother-feeling game in a way that requires significantly more processing power.
> 2xMSAA was producing clearly better results on the 360 in multiplatform games. There was no artistic gain in almost any of the scenarios.

There's no reason that QAA would be cheaper than 2xMSAA, and the PS3 supports both. So regardless of what any of us think about the two methods, it's hard to imagine why quincunx would be chosen except because someone thought it gave better results.
> There's no reason that QAA would be cheaper than 2xMSAA, and the PS3 supports both. So regardless of what any of us think about the two methods, it's hard to imagine why quincunx would be chosen except because someone thought it gave better results.

Or because the PS3 was having a harder time using 2xMSAA.
Why would you say that? The difference isn't that huge to my understanding. Both games run at 1080p while maintaining almost exactly the same visual quality. Both games reach 60fps, with some dips in the XB1 version. To me that's closer than what we used to get a year ago. It's probably even closer than the performance differences we used to get the previous gen, where the PS3 versions usually had some of the following issues, or a combination of them: often less stable performance, QAA or missing AA, missing foliage, lower-res transparencies or missing transparent effects, lower resolution, blurrier textures, etc.
There is no observed 40% difference in this game's case.
> Or because the PS3 was having a harder time using 2xMSAA.

2xMSAA works by using two geometry samples at each pixel, and to resolve the final image it blends those two samples. Quincunx stores the same two samples per pixel; only the resolve step differs, so there's nothing about it that's lighter on the hardware.
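A rough numpy sketch of the two resolve steps, a hedged illustration rather than actual RSX behavior (weights per NVIDIA's HRAA description: 1/2 for the center sample, 1/8 for each of the four surrounding offset samples):

```python
import numpy as np

# Two stored sample planes per pixel, as produced by 2x multisampling:
# s0 near the pixel center, s1 diagonally offset toward a pixel corner.
H, W = 4, 4
rng = np.random.default_rng(0)
s0, s1 = rng.random((H, W)), rng.random((H, W))

# 2xMSAA resolve: each pixel averages only its own two samples.
msaa2x = 0.5 * (s0 + s1)

# Quincunx resolve: same stored samples, but each pixel blends its center
# sample (weight 1/2) with four nearby offset samples (weight 1/8 each),
# shared with neighboring pixels. The cross-pixel sharing is what adds AA
# coverage and also what blurs the image. Edges are clamped for simplicity.
quincunx = np.empty((H, W))
for y in range(H):
    for x in range(W):
        corners = (s1[y, x] + s1[max(y - 1, 0), x] +
                   s1[y, max(x - 1, 0)] + s1[max(y - 1, 0), max(x - 1, 0)])
        quincunx[y, x] = 0.5 * s0[y, x] + 0.125 * corners
```

Both resolves read the same two stored samples per pixel, which matches the memory-cost point Ken makes later in the thread.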
Maybe if the PS4 version's framerate were uncapped, the true framerate would be 80+ fps while the Xbox One runs at 60+ fps, and we'd see that 40-50% fps difference all the time...
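For reference, the percentage arithmetic behind that guess (all numbers hypothetical):

```python
# Gap between two framerates, relative to the slower machine.
xb1_fps = 60.0
for ps4_fps in (80.0, 84.0, 90.0):
    gap = (ps4_fps - xb1_fps) / xb1_fps * 100.0
    print(f"{ps4_fps:.0f}fps vs {xb1_fps:.0f}fps -> {gap:.0f}% faster")
# 80 vs 60 is a 33% gap; a 40-50% gap would need roughly 84-90fps vs 60.
```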
No. That's not what the article really says (but it's certainly what it might imply). It's really more like, "with some 60fps moments when nothing much is happening on the screen".
> In relaxed scenes in less detailed locations, Xbox One hits the desired 60fps target, but as soon as the action starts to ramp up we see the game frequently operating between 50-60fps during action scenes, where alpha transparency effects put the engine under stress. While there is a mild reduction in how crisp the controls feel, gameplay isn't adversely affected here, and it's still possible to easily string together combos and counters without any problems. This doesn't hold true for the entire game, and some situations see the action hit the mid-40s, causing increased judder and a momentary jump in controller latency, though only for a brief moment before frame-rates go back up to their usual 50-60fps rate.
So I finally read through this and looked at the comparison images.
Digital Foundry: On a game-by-game basis, what was the approach taken in terms of revamping the lighting? Even in the later games, some of the changes can seem quite radical, while often the overall look is rather more subtle than the originals.
Marco Thrush: A lot of the lighting differences are due to the fact that the earlier games used a 'fake' ambient specular term. We essentially replaced that with an IBL [image-based lighting] specular approach that is more common these days, and revamped the specular lighting model on the materials to be more physically correct. Then we went in and created new textures to control the specular lighting for every texture. The result of that is a more realistic looking specular which reflects the environment. This means that in some cases you see more specular lighting than before, and in other cases it means you get less specular because it shouldn't have been there in the first place. On a regular basis we would capture PS3/PS4 side-by-side images to compare the differences in lighting and rein in when things were starting to look too different.
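As a rough illustration of the swap he describes, a constant "fake" ambient specular term versus an image-based lookup along the reflection vector, here's a minimal Python sketch; the function names and the two-tone environment stand-in are invented for illustration, not Bluepoint's code:

```python
import numpy as np

def reflect(view, normal):
    # Reflect the view direction about the surface normal: r = v - 2(v.n)n.
    return view - 2.0 * np.dot(view, normal) * normal

def fake_ambient_specular(spec_color, strength=0.2):
    # Old-style: a constant specular term added everywhere,
    # regardless of what actually surrounds the surface.
    return strength * spec_color

def ibl_specular(env_lookup, view, normal, spec_color):
    # IBL-style: sample the environment along the reflection vector, so a
    # surface picks up more specular where something bright is reflected
    # and less where nothing is (the "shouldn't have been there" case).
    return env_lookup(reflect(view, normal)) * spec_color

# Toy environment: bright "sky" above, dark "ground" below.
env = lambda d: (np.array([0.8, 0.9, 1.0]) if d[1] > 0
                 else np.array([0.1, 0.1, 0.1]))

n = np.array([0.0, 1.0, 0.0])                  # up-facing surface
v = np.array([0.0, -1.0, 1.0]) / np.sqrt(2.0)  # view ray hitting at 45 degrees
spec = np.ones(3)

print("fake:", fake_ambient_specular(spec))
print("IBL :", ibl_specular(env, v, n, spec))
```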
Digital Foundry: One of the biggest questions we had pre-launch concerned the pre-rendered cut-scenes and how you would fit all three games onto one Blu-ray. Obviously you could lose the 3D encodes from Uncharted 3, but beyond that, just how did everything fit onto one disc?
Marco Thrush: Better compression for both audio and video. Removing video content: S3D movies for U3, bonus content for all games, credits movies (we rendered the credits at runtime to save disk space). Removing multiplayer assets helped as well. Lastly, a lot of streaming games improve load time by reducing seek-time overhead and by duplicating assets to place data physically close on the Blu-ray. With all data being installed onto the hard drive (with much faster seek times), we're able to get away with just storing a single copy of each texture and still have everything load in time (or even faster than the PS3). One question we've seen come up is: Why don't you just render the cinematics in real time? The reality is that all the geometry and texture data required to render the cut-scenes takes up way more room than the movies.
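The duplication trade-off he mentions is easy to see with a toy size model (all numbers invented purely for illustration):

```python
# Optical-media layout: copies of hot assets are placed near each level that
# streams them, trading disc space for short seeks. On an HDD install, seeks
# are cheap, so one copy per asset is enough.
unique_assets_gb = 18.0   # hypothetical unique data for one game
copies_per_asset = 2.5    # hypothetical average duplication for seek locality

disc_layout_gb = unique_assets_gb * copies_per_asset
hdd_layout_gb = unique_assets_gb

print(f"duplicated layout: {disc_layout_gb:.0f} GB, "
      f"deduplicated: {hdd_layout_gb:.0f} GB")
# Across three games on a single Blu-ray, dropping duplication frees real room.
```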
Digital Foundry: Could you share some details on the anti-aliasing used? The quality seems higher than the typical post-process solution.
Marco Thrush: We use a fairly simple FXAA solution. The best way to avoid aliasing is to make sure the content doesn't create it in the first place!
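A full FXAA implementation is a few hundred lines of shader code, but the core idea, detecting local luma contrast and blending along detected edges, fits in a heavily simplified sketch; this is a stand-in for the concept, not Naughty Dog's actual pass:

```python
import numpy as np

def luma(rgb_image):
    # Perceptual luminance approximation, used only for edge detection.
    return rgb_image @ np.array([0.299, 0.587, 0.114])

def fxaa_like(img, contrast_threshold=0.1):
    """Very simplified FXAA-style pass: where luma contrast against the four
    neighbors is high, blend the pixel toward those neighbors."""
    h, w, _ = img.shape
    out = img.copy()
    L = luma(img)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighbors = [L[y - 1, x], L[y + 1, x], L[y, x - 1], L[y, x + 1]]
            if max(neighbors) - min(neighbors) > contrast_threshold:
                out[y, x] = 0.5 * img[y, x] + 0.125 * (
                    img[y - 1, x] + img[y + 1, x] +
                    img[y, x - 1] + img[y, x + 1])
    return out

# A hard vertical edge: left half dark, right half bright.
img = np.zeros((8, 8, 3))
img[:, 4:] = 1.0
smoothed = fxaa_like(img)  # the edge columns get softened
```

His second point stands regardless: a post-process pass can only soften aliasing that the content already produced.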
This has a bigger impact on perceived quality than going to 4K or higher spatial resolution, a full quality point as tested by the EBU. Do any games use a half-exposure blur rather than a full-exposure blur? That'd be more cinematic but less smooth.
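To illustrate the half- versus full-exposure distinction (roughly a 180-degree versus 360-degree shutter in film terms), here's a toy Python sketch that averages sub-frame renders over a fraction of the frame interval; the renderer and numbers are invented for illustration:

```python
import numpy as np

def motion_blurred_frame(render_at, t_frame, dt_frame, shutter=0.5, taps=8):
    """Average `taps` sub-frame renders across `shutter` of the frame interval.
    shutter=0.5 ~ half-exposure (film-style 180-degree shutter);
    shutter=1.0 blurs across the whole interval: smoother, less 'cinematic'."""
    times = t_frame + dt_frame * shutter * np.linspace(0.0, 1.0, taps)
    return np.mean([render_at(t) for t in times], axis=0)

# Toy "renderer": a 1D image with a bright dot moving right.
W, SPEED = 32, 4.0  # pixels, pixels per frame

def render_at(t):
    row = np.zeros(W)
    row[int(t * SPEED) % W] = 1.0
    return row

half = motion_blurred_frame(render_at, t_frame=4.0, dt_frame=1.0, shutter=0.5)
full = motion_blurred_frame(render_at, t_frame=4.0, dt_frame=1.0, shutter=1.0)
print("half-exposure smear:", int(np.count_nonzero(half)), "pixels")
print("full-exposure smear:", int(np.count_nonzero(full)), "pixels")
```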
Regarding QAA, which is supported on the PS3 but not the Xbox 360:
QAA uses as much memory as 2xMSAA.
It gives AA similar to 4xMSAA.
The downside is that it causes blurring of the screen.
And believe it or not, in reality the cost goes (more demanding) 4xMSAA > QAA > 2xMSAA (less demanding), as we can see in the graph from nVidia: http://www.nvidia.com/object/feature_hraa.html
So HTupolev is probably right when he says that it is an artistic choice.
/ Ken