Digital Foundry Article Technical Discussion [2017]

5fps higher is much smoother now? Remember that the XBX is not locked and also tears, so its hardware is also being pushed to its limits in some scenes.

We have recent and old XB1/PS4 comparisons with bigger fps differences during stress tests. Tomb Raider DE ran consistently ~20fps higher on PS4 during framerate drops, and I think the two recent CODs often run ~10fps higher on PS4 than on XB1 (with higher res and/or settings on PS4).


Well, Wolf runs worse on XBX in scenes featuring the flamethrower, as much as 10fps lower. So there's that.
4Pro and X1X aren’t running identical settings, so that would have the greater impact here. RPM shouldn’t be dismissed, but we have little knowledge of what it can improve for each title, as RPM performance will really be title-specific. Given that RPM is not consistent, I wouldn’t call it a crutch feature for bringing parity between X1X and 4Pro. I would use F1 as a way of seeing what happens when X1X has to run a 4Pro build of a game, and the results are good for the 4Pro. But the X1X still edges it out.

Doing the reverse, the 4Pro has a much harder time running a game optimized and tuned for the X1X.

This is effectively what we are comparing here. But we knew that to be the case, so I’m not really adding anything to the conversation. At the end of the day, these comparisons show us how the software utilized the hardware. That’s all.

It’s never been a statement of what the hardware is capable of. We have known those values since they were released before launch, and they never change either.

It’s only about how the software works with the hardware to realize their vision of the game. I really grow tired of this cherry-picking of scenes at different settings from different games to try to measure the systems. That’s the worst way to measure, and more importantly it doesn’t help anyone.
 
Wow, look at that massive tearing on the 4Pro. They should have lowered the settings on the 4Pro since it just can't handle it well.

perform3ins2y.gif
 

If you're tearing like that - constantly, and across the full height of the screen - then something is very wrong with the choices you've made.

Limiting tearing to, say, the top and bottom quarters of the screen makes it much less intrusive, but it causes sizeable drops in average frame rate.
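For what it's worth, here's a minimal sketch of how that kind of band-limited tearing can work. Everything here is hypothetical: GetCurrentScanline() and PresentImmediate() are invented stand-ins for platform-specific raster/flip calls, and a real engine would do this against the swap chain rather than in a busy-wait.

```cpp
// Hypothetical sketch of band-limited tearing; both functions below are
// invented stand-ins for platform raster/flip calls.
int  GetCurrentScanline();   // where scanout currently is (0 = top of screen)
void PresentImmediate();     // flip without waiting for vblank (may tear)

constexpr int kScreenHeight    = 2160;
constexpr int kTopBandEnd      = kScreenHeight / 4;     // tearing tolerated above here
constexpr int kBottomBandStart = kScreenHeight * 3 / 4; // ...and below here

void PresentLateFrame()
{
    // The frame missed vsync. Rather than tearing anywhere, wait until
    // scanout is inside the top or bottom quarter before flipping, so the
    // tear line never lands mid-screen. This waiting is exactly where the
    // average frame-rate cost comes from.
    for (;;)
    {
        const int line = GetCurrentScanline();
        if (line < kTopBandEnd || line >= kBottomBandStart)
            break;
        // spin (or yield) until the beam reaches an acceptable band
    }
    PresentImmediate();
}
```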
 
Well, we have that game with F1. In framerate stress tests the game drops to ~55fps on Pro, versus 59 or 60fps with tearing on XBX (meaning the XBX is pushed to its limit). That's a 10% advantage for the XBX version in those scenes (perhaps not coincidentally, the scenes that show the lowest gap have rainy conditions...)

In other scenes, I think the lowest I have seen is 51fps on Pro vs 59fps on XBX -> a 16% advantage for XBX.

In this game, XBX has between a 10% and 16% framerate advantage at the same res and roughly the same settings. Pretty telling, don't you think, compared to the alleged 100% advantage in other games?
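(Working from those numbers: 60 / 55 ≈ 1.09, so roughly 9-10% in the rain scenes, and 59 / 51 ≈ 1.16, so roughly 16% at the worst drop. That's where the 10-16% range comes from.)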
There is one small mistake here: both platforms are limited by the CPU in most areas (which fits the fps differences perfectly). But the xb1x is also limited by the 60fps cap, so we can't say how "limited" the GPU really is. It would be nice to "hear" how loud the xb1x gets, because that's the only way I know of to currently tell whether the GPU is really pushed to its limits.

The mirror and LOD problem seems to be a bug (at least I would say so), simply because there is no reason why the xb1x shouldn't render those things if it performs better than the 4Pro. It shouldn't change much.

But I really don't get why the Forza team is the only one that gets a stable 60fps in a racing game. Stable framerates are just so essential to this genre.
 

From the games I have played on my One X, it is as silent a console as my launch Xbox One. The shelf area behind the One X feels warmer than behind the One, while the top of the One X is much cooler to the touch than the One. The sides register almost no heat at all. The One X does a better job of exhausting and moving heat away from the system and out the back. The cooling system works amazingly well. I really can't hear the One X when it's gaming.

I can and have heard my 4Pro (not the latest hardware revision of it); it's louder than the One X.

Yes, very unscientific data there, but I don't think listening to the One X will tell you anything, because of how well the new cooling system works.
 
Optimization, right?
XBO exclusives will leverage more of the DX12 feature set, and are more likely to develop a pipeline for it than a multi-platform title would. If they haven't yet, they're clearly moving in that direction.
And once again, I have to ask the obvious question (that no one can answer) of whether Xbox exclusives are leveraging ExecuteIndirect in their games, as the added customization to their command processor opens up possibilities and techniques for heavy GPU-side dispatches. I'm desperate to learn whether it's successful or not in its current iteration and how rendering will change as it moves in that direction, and if not, what MS needs to do with DX12 to make it so.
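For anyone curious what ExecuteIndirect actually looks like at the API level, here's a minimal D3D12 sketch (the g_* objects are assumed to exist already, and root signatures, barriers and error handling are all omitted): a compute pass writes draw arguments and a count into GPU buffers, and the command processor consumes them without a CPU round trip.

```cpp
#include <d3d12.h>

// Minimal sketch: assumes these already exist; real code needs root
// signatures, resource barriers and error checks.
extern ID3D12Device*              g_device;
extern ID3D12GraphicsCommandList* g_cmdList;
extern ID3D12Resource*            g_argBuffer;   // draw args written by a compute pass (GPU culling etc.)
extern ID3D12Resource*            g_countBuffer; // actual draw count, also GPU-written

ID3D12CommandSignature* CreateDrawSignature()
{
    // One indirect argument per command: a standard indexed draw.
    D3D12_INDIRECT_ARGUMENT_DESC arg = {};
    arg.Type = D3D12_INDIRECT_ARGUMENT_TYPE_DRAW_INDEXED;

    D3D12_COMMAND_SIGNATURE_DESC desc = {};
    desc.ByteStride       = sizeof(D3D12_DRAW_INDEXED_ARGUMENTS);
    desc.NumArgumentDescs = 1;
    desc.pArgumentDescs   = &arg;

    // No root signature needed when the signature only contains draw args.
    ID3D12CommandSignature* sig = nullptr;
    g_device->CreateCommandSignature(&desc, nullptr, IID_PPV_ARGS(&sig));
    return sig;
}

void RecordIndirectDraws(ID3D12CommandSignature* sig, UINT maxDraws)
{
    // The GPU reads up to maxDraws argument records from g_argBuffer and the
    // real count from g_countBuffer; the CPU never sees the final draw list.
    g_cmdList->ExecuteIndirect(sig, maxDraws, g_argBuffer, 0, g_countBuffer, 0);
}
```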
 
Optimizing is not that hard if you just lower details for whatever the bottleneck is (e.g. lower physics calculations if the CPU is the bottleneck, etc.).
But they just want to push their visuals, and we don't get a fluid picture. Yes, Forza has been optimized a lot for the hardware, but as stated in the videos, they never really use the full potential of the GPU (or the CPU on the X) precisely because they want a fluid 60fps, and that leaves a bit of headroom on the GPU side. Developers should not always push the res or add new details if that means the frame cap can't even be reached. At least that is my opinion: stable framerate >> res/details.
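As a toy illustration of "lower whatever the bottleneck is" (all names here are invented, and a real engine would add hysteresis so settings don't oscillate every frame):

```cpp
#include <algorithm>

// Toy sketch: shed work on whichever side of the machine is over budget.
struct FrameTimings { float cpuMs; float gpuMs; };

struct QualitySettings
{
    float resolutionScale = 1.0f;  // GPU-side cost
    int   physicsSubsteps = 4;     // CPU-side cost
};

void AdjustForBottleneck(const FrameTimings& t, QualitySettings& q, float budgetMs)
{
    if (t.cpuMs <= budgetMs && t.gpuMs <= budgetMs)
        return;                                        // within budget, leave it alone

    if (t.gpuMs > t.cpuMs)
        // GPU-bound: shed pixels, not simulation work.
        q.resolutionScale = std::max(0.7f, q.resolutionScale - 0.05f);
    else
        // CPU-bound: cheaper simulation; dropping resolution wouldn't help.
        q.physicsSubsteps = std::max(1, q.physicsSubsteps - 1);
}
```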
 
Forza 7 looks great on X1X; I'm not sure where you are going with that. F7 manages to accomplish graphics, physics, and cars on track with weather and dynamic puddles, no?
 

I think Allandor is talking about leaving enough headroom so that even unusually demanding edge cases don't result in lost frames.

Optimisation can only get you so far in minimising the effect of worst-case scenarios. If you don't want to drop frames (or tear) even in the worst of circumstances, you'll spend 99.9999% of your time using less than 100% of your resources. And probably 99% of your time using a lot less than 90%.

That's why dynamic resolution scaling is so attractive. Though in the case of most games, that means having a lot of CPU left in reserve most of the time too.

With Forza, even the stock X1 does a remarkable job of holding 60Hz. That means the 30% faster (probably somewhat more in reality, with the enhancements) CPU is probably going somewhat underutilised all the time. But that's a price worth paying, IMO.
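A rough sketch of the headroom idea in controller form (the numbers and names are made up for illustration): aim the GPU at a frame time comfortably below the 16.6ms/60Hz budget so even rare spikes still fit, and accept that the GPU sits underutilised the rest of the time.

```cpp
#include <algorithm>
#include <cmath>

// Invented-for-illustration dynamic resolution controller that deliberately
// targets less than the full 16.6ms/60Hz budget, trading average GPU
// utilisation for resilience against worst-case spikes.
struct DynResController
{
    float targetGpuMs = 14.0f;  // ~15% headroom under the 16.6ms vsync budget
    float scale       = 1.0f;   // fraction of native resolution per axis

    void Update(float lastGpuMs)
    {
        // GPU cost scales roughly with pixel count (scale squared), so the
        // per-axis correction is the square root of the time ratio. Damp it
        // so a single noisy frame doesn't swing the resolution around.
        const float ratio = std::sqrt(targetGpuMs / lastGpuMs);
        scale = std::clamp(scale * (0.75f + 0.25f * ratio), 0.7f, 1.0f);
    }
};
```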
 
Witcher 3 X1X vs 4Pro
imo, these results between the two versions are what I would expect when both platforms are optimized well.

What's interesting is that we're seeing more dynamic-scaling-vs-CBR decisions by developers. Sort of interesting that it went this way; we have often discussed as a group that dynamic/native vs CBR was not a good trade-off (for X1X, but excellent for 4Pro), yet we see more games selecting this route.

This is probably not the thread to discuss it, so perhaps a spin-off if people are interested in debating it. I think as a developer this is the best way to keep the two versions close together while maximizing both platforms. X1X owners may feel differently: if reconstruction is nearing native visual quality, then the power could be used elsewhere, like higher quality settings. An interesting debate to be had, but the results are good nonetheless. Alternatively, this speaks volumes about the ID Buffer's ability to assist with reconstruction techniques, but we'd need some numbers to make that a thing. We've seen HZD skip the ID Buffer in favour of its own techniques, for instance. If you're skipping the ID Buffer, then the same type of algorithms can work on X1X with similar success.

http://www.eurogamer.net/articles/digitalfoundry-2017-witcher-3-xbox-one-x-analysis
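On the reconstruction point, here's a heavily simplified, single-channel CPU mock-up of what a checkerboard resolve has to do: fill the half of the pixels that weren't shaded this frame from the previous frame, clamped against freshly shaded neighbours. Real implementations work on colour with motion vectors, and the Pro's ID buffer is one way to reject bad history; everything below is invented for illustration only.

```cpp
#include <vector>
#include <algorithm>

// Single-channel CPU illustration of a checkerboard resolve. Real CBR works
// on colour + motion vectors; this only shows the fill pattern and the
// neighbourhood clamp used to keep stale history from ghosting.
struct Image
{
    int w, h;
    std::vector<float> px;
    float at(int x, int y) const { return px[y * w + x]; }
};

// frameParity: which checker cells (0 or 1) were shaded this frame.
Image ResolveCheckerboard(const Image& current, const Image& previous, int frameParity)
{
    Image out = current;
    for (int y = 1; y + 1 < current.h; ++y)
    for (int x = 1; x + 1 < current.w; ++x)
    {
        if ((x + y) % 2 == frameParity) continue;      // shaded this frame, keep it

        // Missing pixel: take last frame's value, clamped to the range of
        // the four freshly shaded neighbours so disocclusions don't ghost.
        const float n0 = current.at(x - 1, y), n1 = current.at(x + 1, y);
        const float n2 = current.at(x, y - 1), n3 = current.at(x, y + 1);
        const float lo = std::min({n0, n1, n2, n3});
        const float hi = std::max({n0, n1, n2, n3});
        out.px[y * current.w + x] = std::clamp(previous.at(x, y), lo, hi);
    }
    return out;
}
```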

 
Not bad considering the game mostly runs at 1600x900 on the standard Xbox One. The One X seems to range from 1800p to 2160p while adding anti-aliasing to the pipeline for a slightly cleaner look, and also offers higher quality shadows. The 4Pro uses 4K checkerboarding.

~~~
Another curious quirk of PS4 Pro's approach is an apparent lack of post-process anti-aliasing added to the pipeline, meaning coverage on tree outlines is often left raw. Compared to Xbox One X, which adds an extra pass to these elements for a cleaner (if softer) frame, it puts Microsoft's console in pole position in terms of image quality, but the truth is that owners of both mid-gen refresh consoles get great results for their UHD displays, with super-sampling in place for 1080p users.

On its blog, developer CD Projekt Red has outlined additional visual upgrades, reserved for the 4K mode. This includes enhanced ambient occlusion, shadows and textures, all bringing it up to par with PS4 Pro's quality presets. That is, with one exception: Xbox One X uses an even higher quality setting for shadows, giving visibly clearer outlines to distant shade. Checkerboarding aside, the visual feature set is much the same between both machines, but this one area alone sees Xbox One X pull ahead.

In terms of gameplay, it's fair to say that Xbox One X does push on to a higher level with an extra perk in performance. With the 4K mode selected, key stress points like Crookback Bog and Heatherton village run at a flawless 30fps on Xbox One X. It's difficult to interrupt the game with even an occasional stutter in detail-rich environments like Novigrad too, showing there's still clearly lots of headroom for the console. Bearing in mind the sustained 25fps refresh on PS4 Pro during the Crookback area especially, it's a surprise to see its rival winning out in performance as well. For our money, the 4K mode on Xbox One X gives the smoothest, best-looking take on The Witcher 3 outside of the PC space.
 
Their CBR 4K on Pro holds up quite well compared to the dynamic 4K. I think the CBR is causing foliage to look a bit soft by comparison, but other than that, it looks great.

The extra AA pass on XB1X helps clean up edges, but it does soften them a bit by comparison, which is more noticeable on distant buildings.

Cutscenes and a select few areas of the game probably could have used some dynamic scaling on Pro to help performance. Hopefully the upcoming HDR patch will bring some additional performance improvements on Pro as well.

Other than that, great job by CDPR. Game looks great on both.

I've said this before, but I REALLY hope CBR 4K is the standard next gen, instead of brute forcing native 4K. With improved CPUs, I hope CBR 4K / 60 is more common.
 
Pro hits 20 fps at points where X1X holds a solid 30 with higher IQ. CB rendering on Pro has entirely missing geometry on distant/fine details, while X1X has all pixels filled plus higher quality AA. This is bizarrely presented as some kind of stylistic preference, just like shitting yourself after alcohol poisoning on a night out is some kind of "party trick".

In the CPU-limited city, Pro drops under 30 fps while X1X just about condescends to touch its lowest low point of 35-ish when it's at its most desperate - which is pretty desperate for a $499 system running in "performance mode".

Speaking of which, in its exclusive 60Hz mode, X1X shits the bed and runs away from 60Hz whenever things get busy. Or whenever there's weather.

In short: Pro does really well for a $399 2016 machine; X1X owns the kingdom, but it swaggers in overweight 12 months after the contender has left to eat budget lentils and fit into smaller pants.
 
It only drops to 20fps during a cutscene. Cutscenes in general could maybe use a drop in res on Pro to stabilize performance.

The only time the framerate is not stable on Pro during gameplay is in the Bog area, where it sits between 25-28fps. Otherwise it's a pretty stable 30fps.

Foliage doesn't look very good on either console tbh, noticeably worse than on PC. It does look a bit less detailed compared to XB1X, though; foliage seems to be what is affected most by CBR vs native. It still looks far better than 1080p.

Again, hopefully some of these performance issues can be sorted out in the upcoming PS4 patch that adds HDR.
 
Well, Pro hasn't got any dynamic res like XBX, which can push fewer pixels than Pro in those scenes.

Pro hits 20 fps at points where X1X holds a solid 30 with higher IQ. ...

Lower IQ, actually. The image quality of 4K CBR is much better than 1480p on XBX in those cases where the GPU is pushed (most probably in the bog area). Source: VG Tech:

Xbox One X in 4K Mode uses a dynamic resolution with the lowest pixel count found being approximately 2620x1480 and the highest count found being 3840x2160.

Thank god we have other sources than DF to give us a better view of the whole picture.
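Back-of-the-envelope on the shaded pixel counts behind that comparison (assuming the checkerboard pass shades half the target pixels each frame and reconstructs the rest): 4K CBR shades 3840 x 2160 / 2 ≈ 4.15M pixels per frame, while 2620x1480 ≈ 3.88M pixels rendered natively. So at its lowest dynamic bound, the XBX is shading slightly fewer pixels than Pro's checkerboard pass, with the CBR output still reconstructing up to a full 4K grid on top.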
 
That vid is just a frame rate test. Where does it say that the Xbox sustains the lower bound of its dynamic res just to maintain 30 fps in the bog area?
 
In DF's written article, DF claims that it drops to 1800p during the Bog area. VG Tech claims the lowest resolution he saw was 1480p, though. VG Tech often finds more difficult stress points than DF, as he usually finds the same or lower res/frame drops.

DF says in the video and written article that it's 4K when you look at the sky, wander around interiors, or are in less GPU-intensive areas. But the resolution drops in more intensive areas.

As far as IQ goes, I guess it's a matter of preference. XB1X is cleaner overall because of the extra AA, but it does drop below native 4K in intensive areas. Both the AA and the resolution drops can make the XB1X version look a bit softer sometimes. The lack of AA on Pro does make some edges look rougher by comparison, and the CBR seems to soften foliage a bit.

Overall I'd say XB1X looks a bit better. But honestly, they both look very close to my eyes. I think CDPR did a great job on both versions.

Pro's biggest issues are the framerate in the Bog area and in cutscenes. But that's only a small portion of the game.
 
If you know nothing about the context of the numbers, I'd hardly call that the whole picture.

I'm not saying it's not a relevant data point. But over-attributing without knowing how many frames dip that low, for how long, or whether that number can be reproduced reliably is not really knowing the whole story.

Sitting just above 1440p for all of 30 seconds to 2 minutes, in a game that plays for 200hrs, is not a relevant data point. I don't even know if it amounts to 0.000025% of game time. That probably puts it near six deviations out from the average resolution.
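(For scale, running that arithmetic: 200 hours is 720,000 seconds, so 30 seconds of it is 30 / 720,000 ≈ 0.004% of play time, and 2 minutes is ≈ 0.017%. Rare either way.)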
 