Digital Foundry Article Technical Discussion [2018]

So after all, the PS4 Pro version never was 4K, except when you stand still to take a screenshot. The highest resolution in motion is 1800p, with regular resolution drops, versus full 4K on the Xbox One X, also with regular resolution drops.
I never understood why DF didn't make a video on the "real" PS4 Pro 4K patch when it was released (there was an earlier 4K patch that only updated the UI to 4K). After all, it was still a remarkable upgrade over the base hardware.

What I really don't understand is why Ubisoft took so long to develop that 4K patch. I thought X patches were much easier to develop, and they announced the patch for November/December 2017.
The Xbox is right now the best console overall, especially the Xbox One X. Not only are you starting to get "infinite" BC, but if you want to play the best multiplatform versions, that's what you have to buy - unless you have a PC with a 1050 Ti or above, in which case the Xbox One X isn't the best platform.

I think the Xbox One X is about as powerful as an Intel i7-7700HQ paired with a 1050 Ti - more or less a powerful gaming laptop.
 
That's a bummer.
One would've thought Microsoft had issued guidelines requiring VRR support on at least all Xbox One X enhanced titles.
Yeah, they need to get on that for sure.
They do indicate which titles have VRR and which don't, but I agree that at this point in time they need to find ways to encourage developers to support VRR.
 
The Xbox is right now the best console overall, especially the Xbox One X. Not only are you starting to get "infinite" BC, but if you want to play the best multiplatform versions, that's what you have to buy - unless you have a PC with a 1050 Ti or above, in which case the Xbox One X isn't the best platform.

I think the Xbox One X is about as powerful as an Intel i7-7700HQ paired with a 1050 Ti - more or less a powerful gaming laptop.

The X is more comparable to a 1060 6GB. Far Cry 5 actually ran better on the X at 4K, too.
 
And here's the DF written article on Xbox FreeSync: https://www.eurogamer.net/articles/digitalfoundry-2018-freesync-support-tested-on-xbox-one-x

It's potentially a revolution in the console space - just as it has been for PC gamers. Variable refresh technology is a big win for improving the game experience, lessening judder and removing screen-tearing. It's a pretty simple concept really, levelling out performance by putting the GPU in charge of when the display should present a new frame. It's a game-changer. No longer are unlocked frame-rates a problem - in fact, 40-50fps gameplay can look almost as smooth as 60fps. It's a remarkable trick, but crucially, it works. Nvidia's G-Sync led the way, but it's AMD's alternative - FreeSync - that has been built into Xbox One, and we've finally had the chance to test the technology. Clearly it's still early days, but at its best, the results are quite remarkable.

Let's begin by laying out the basics. Microsoft's variable refresh implementation only works with FreeSync screens - G-Sync monitors are incompatible - but the Xbox version of the tech is bespoke, with some pretty big differences compared to AMD's rendition. There's strong compatibility though: you'll need a display that supports FreeSync over HDMI (as opposed to the more common DisplayPort) but on the Microsoft end at least, there's support for 720p, 1080p, 1440p and 4K outputs. We've had confirmation that both Xbox One S and Xbox One X are invited to the party, but the firm also told us that older launch model hardware also gets the upgrade.

...

But looking back at Wolfenstein 2 and The Vanishing of Ethan Carter, minor hiccups aside, it's easy to see just how cool variable refresh technology is and what a difference it can make to the experience. Another thing worth considering is that we went in looking for the best results from Xbox One X, but it may well be the case that the standard model gets a better turnout overall - there's more variability in performance, plus 1080p FreeSync monitors tend to have wider variable refresh windows.

In the here and now, the issues we came across make it difficult for us to recommend that you go out and buy a FreeSync monitor, as opposed to a larger flat panel TV, but Microsoft deserves kudos for embracing the future of display technology and laying the groundwork for support on what is likely to evolve into a very important feature. Whether it's via the mooted HDMI 2.1 variable refresh feature or even with direct FreeSync support, it's only a matter of time before TV manufacturers add this technology or something very much like it to their screens. And with that in mind, it's great that Microsoft is getting ahead of the curve - and sharing this early work with its users.
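
To make the idea of the GPU dictating when the display presents a frame a bit more concrete, here's a minimal Python sketch of my own - not from the article, and with purely illustrative numbers and function names - comparing presentation intervals for a steady 40fps source on a fixed 60Hz v-synced display versus a variable refresh display:

Code:
def fixed_refresh_present_times(frame_times_ms, refresh_hz=60.0):
    """With v-sync on a fixed-refresh display, a finished frame waits for the
    next refresh tick, so presentation intervals become uneven (judder)."""
    tick = 1000.0 / refresh_hz
    present, t_ready = [], 0.0
    for ft in frame_times_ms:
        t_ready += ft
        ticks = -(-t_ready // tick)  # ceiling division: next refresh at or after frame readiness
        present.append(ticks * tick)
    return present

def variable_refresh_present_times(frame_times_ms):
    """With VRR, the display refreshes when the GPU finishes each frame,
    so presentation intervals match render intervals exactly."""
    present, t = [], 0.0
    for ft in frame_times_ms:
        t += ft
        present.append(t)
    return present

def intervals(present_times):
    return [round(b - a, 1) for a, b in zip([0.0] + present_times[:-1], present_times)]

frames = [25.0] * 8  # a steady 40fps source: every frame takes 25ms to render

print("60Hz v-sync intervals (ms):", intervals(fixed_refresh_present_times(frames)))
print("VRR intervals (ms):        ", intervals(variable_refresh_present_times(frames)))
# The v-synced output alternates roughly 33.3 / 16.7 ms, while VRR holds a
# steady 25.0ms cadence - which is why 40-50fps can feel close to 60fps.

The alternating long/short presentation intervals are the judder the article describes, and variable refresh simply removes them.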
 
"Some" is putting it lightly. Seems like in most games they tried it either worked some of the time or not at all. Maybe Nintendo can implement G-sync on the Tegra for comparisons lol.

And the number of TVs coming out with G-Sync? Samsung will be coming out with Adaptive Sync (FreeSync) TVs, and LG is likely to follow close behind.

It'd be much better if NV pulled their head out of their behinds and finally thought of their customers and started to support Adaptive Sync in addition to Gsync. If Gsync is so much better, people will continue to buy those monitors.

But either way, out of all three consoles, I'd imagine the Switch is the least likely to end up hooked up to a gaming monitor. As such, it's far more important for it to support things that televisions might actually support.

Regards,
SB
 
wew lad, it was a joke. G-sync on Tegra is as likely as Nintendo using hardware that isn't a generation behind in its next console.
 
https://www.eurogamer.net/articles/digitalfoundry-2018-hellblade-xbox-one-x-xbox-one-s-analysis

Does Hellblade on Xbox One X deliver the definitive console experience?
And how well does the standard console hold up?


More interesting is how the Xbox One X version offers additional functionality over PlayStation 4 Pro. The Sony platform features two modes, which essentially prioritise 60fps or target a 1440p resolution. Xbox One X ups the ante here with improvements on both presets. The performance mode does a good job of sustaining 60fps, and does so with a resolution delta that generally sits between 1080p and 1296p (though in very rare cases, it drops to 720p). That's up against Pro with a 720p to 1080p variance, with more of a bias to the lower end, while for its part, X generally settles closer to full HD resolution.

Next, there's the resolution and enriched modes on Xbox One X, each of which add a straight 30fps cap. Ninja Theory promises resolutions typically lower than 4K for the enriched mode, albeit with the best visual settings for shadows and foliage. The max result on each does come in at 3840x2160 at the absolute peak - though it's obviously less common on the enriched variant. The opening coastline area, for example, runs at 3072x1728 on the enriched mode, giving us 80 per cent on each axis of full ultra HD resolution. It's a faintly visible cut in image quality from the high resolution mode, which hits 3328x1872 in that very same scene - or 86 per cent for reference.

This may well go lower in later chapters, but as a comparison point it's a telling gap, and shows the difference in priority. Either way, both modes here best PS4 Pro's only 30fps playback option, which runs between 900p-1440p. For Xbox One X, the enriched mode ultimately gives us the closest comparison point to the settings used on Sony's enhanced machine - in terms of shadows, foliage and level of detail. Our advice would honestly be to skip the high resolution mode altogether; it doesn't quite hit a native 4K and it loses out on a lot of neat extras.
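
For reference, the percentages quoted above come from straightforward division, and the pixel-count gap between the two modes is larger than the per-axis numbers suggest. A quick Python sketch of that arithmetic (my own, with the resolutions taken from the article):

Code:
NATIVE_4K = (3840, 2160)

def coverage(width, height, native=NATIVE_4K):
    axis_x = width / native[0]                            # fraction of 4K on the horizontal axis
    axis_y = height / native[1]                           # fraction of 4K on the vertical axis
    pixels = (width * height) / (native[0] * native[1])   # fraction of the total 4K pixel count
    return axis_x, axis_y, pixels

for label, (w, h) in [("enriched mode, coastline", (3072, 1728)),
                      ("high resolution mode, coastline", (3328, 1872))]:
    ax, ay, px = coverage(w, h)
    print(f"{label}: {ax:.0%} x {ay:.0%} per axis, {px:.0%} of the 4K pixel count")

# enriched mode:        80% x 80% per axis, 64% of the 4K pixel count
# high resolution mode: 87% x 87% per axis (the article quotes 86), 75% of the pixel count

So the enriched mode renders roughly 64 per cent of a native 4K pixel count versus about 75 per cent for the high resolution mode - a bigger gap than the per-axis figures imply.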
 
So this is more about PC technology, but it may still have some overlap with other benchmarking approaches.

https://www.eurogamer.net/articles/digitalfoundry-2018-the-trouble-with-pc-benchmark-modes

Just how useful are PC benchmark modes really?
Optimising performance needs to be easier - and that means we need better tools.

Have you ever loaded up a new PC title, run the in-game benchmark, tweaked settings for optimal performance then discovered that actual gameplay throws up much lower frame-rates, intrusive stutter or worse? It's a particular frustration for us here at Digital Foundry, and it leads to a couple of very obvious questions: firstly, if benchmark modes are not indicative of real-life performance, what use are they? And secondly, if their use is limited, how representative of real-life gaming are the graphics card reviews that use them, including ours?

Before we go on, it's fair to point out that not every benchmark mode out there is useless beyond redemption. In fact, there are a range of great examples that do set you up reasonably well for tweaking for optimal performance. And then there are others which actually drain system resources more than the actual game, which we'd argue is of more use than those that inflate their performance figures.

However, there are some particularly striking examples we have to highlight simply because the delta between benchmark mode and real world performance is absolutely massive. Perhaps the most notorious example we can muster is Tomb Raider 2013. It tests just one scene - the initial shipwreck scene from the beginning of the game - and it shows the camera panning around the Lara Croft character model. It's a scene that's easy to replicate in-game where we find that the same hardware running the same scene at the same settings produces anything up to a 21 per cent performance deficit.
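
As a rough illustration of how a figure like that 21 per cent deficit can be derived, here's a small Python sketch (my own, not DF's tooling - the frame-time numbers are invented) comparing a built-in benchmark run against a gameplay capture of the same scene:

Code:
def average_fps(frame_times_ms):
    """Average frame-rate over a capture of per-frame render times in milliseconds."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def deficit_percent(benchmark_fps, gameplay_fps):
    """How far real gameplay falls short of the benchmark's reported figure."""
    return (benchmark_fps - gameplay_fps) / benchmark_fps * 100.0

benchmark_run = [10.0, 10.2, 9.8, 10.1, 9.9]   # ~100fps reported by the built-in benchmark
gameplay_run = [12.6, 12.8, 12.5, 12.9, 12.7]  # ~79fps while actually playing the same scene

b, g = average_fps(benchmark_run), average_fps(gameplay_run)
print(f"benchmark {b:.0f}fps vs gameplay {g:.0f}fps -> {deficit_percent(b, g):.0f}% deficit")
# benchmark 100fps vs gameplay 79fps -> 21% deficit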

 
When I'm benchmarking, I usually select games with real-time battle scenes or action scenarios within their respective benchmarking suites. My upcoming benchmarks will center around memory speeds and PCI-E lane (i.e., x4, x8, x16) configurations.
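
If it's useful, here's a rough sketch (my own, with hypothetical numbers and labels) of how I'd summarise frame-time captures from runs at different memory speeds or PCI-E lane configurations - average FPS plus a 1% low figure, since averages alone hide stutter:

Code:
def summarise(frame_times_ms):
    """Average FPS plus a '1% low' figure (the 99th-percentile frame time expressed as FPS)."""
    avg = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    ordered = sorted(frame_times_ms)
    idx = (99 * (len(ordered) - 1) + 99) // 100  # ceiling of the 99th-percentile index
    return avg, 1000.0 / ordered[idx]

# Hypothetical frame-time captures (ms) keyed by test configuration.
captures = {
    "DDR4-2400, x16": [16.7] * 99 + [33.3],
    "DDR4-3200, x16": [15.2] * 99 + [25.0],
    "DDR4-3200, x8":  [15.4] * 99 + [27.0],
}

for config, frame_times in captures.items():
    avg, low = summarise(frame_times)
    print(f"{config}: {avg:.1f}fps average, {low:.1f}fps 1% low")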
 
https://www.eurogamer.net/amp/digitalfoundry-2018-switch-hacked-exploit-analysis

Nintendo Switch has been hacked, with two similar exploits released in the last 24 hours following a complete dump of the console's boot ROM. The hacks are hardware-based in nature and cannot be patched by Nintendo. The only way forward for the platform holder in fully securing the console will be to revise the Nvidia Tegra X1 processor itself, patching out the boot ROM bug. In the short term, homebrew code execution is possible and a full, touch-enabled version of Linux with 3D acceleration support is now available.

"Choosing whether to release an exploit or not is a difficult choice," fail0verflow wrote in a blog post accompanying the release of its exploit. "Given our experiences with past consoles, we've been wary of releasing vulnerability details or exploits for fear of them being used primarily for piracy rather than homebrew.
More in the link.
 
So you can now take your Switch and turn it into an Nvidia Shield?

It's quite good news really:
1) Linux, depending on how well it utilises the hardware (namely, docking), will make the Switch the best emulation device on the market.
2) Nintendo will have to release a hardware revision, probably manufactured on a more advanced node, granting better battery life.
 
https://www.eurogamer.net/articles/digitalfoundry-2018-the-witcher-patch-161-ps4-pro-issues

What's up with The Witcher 3 patch 1.61 on PS4 Pro?
HDR support added and performance upgraded, but 4K visuals take a hit.

Let's kick off with the positives. The high dynamic range support is a real positive for PlayStation 4 consoles, and combines beautifully with the 4K checkerboarding on PS4 Pro. It gives it parity with Xbox One X in display support, and there's no question the new Toussaint area from the Blood and Wine expansion shines in particular. But the price to pay on PS4 Pro for this upgrade is significant: curiously, draw distances for foliage and shadows are visibly dialed back on 1.61 - notably on the console's 4K output mode. This leads to more pop-in of grass - almost as if it's sprouting from the ground a few metres ahead, while more shadows visibly fade in ahead of Geralt during traversal.

In the video embedded on this page, you'll see that we grabbed some fresh Witcher 3 capture from patch 1.61 and stacked it up against our library of 1.50 captures - the degraded LODs are fairly easy to pick up on, and it's certainly been noticed by the game's dedicated community. And there's more - shadow draw distance in 4K mode has also been cut back compared to the original 1.50 patch that debuted Pro support.
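
As a side note on why a shorter draw distance reads as grass "sprouting from the ground a few metres ahead", here's a toy Python sketch (my own, not CDPR's system - all distances and cutoffs invented) of distance-based culling with two different cutoffs:

Code:
def visible_grass(instance_distances_m, draw_distance_m):
    """Return the grass instances (by distance from the camera) that would actually be drawn."""
    return [d for d in instance_distances_m if d <= draw_distance_m]

grass = [5, 12, 20, 28, 35, 45, 60]  # hypothetical instance distances from Geralt, in metres

print("longer cutoff (50m): ", visible_grass(grass, 50))  # distant grass is already on screen
print("shorter cutoff (25m):", visible_grass(grass, 25))  # everything beyond 25m only appears as you approach

With the shorter cutoff, every instance beyond it only appears once Geralt closes the gap, which is the pop-in the footage shows.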

 
Sucks to have to choose; I'd be pretty upset about this situation.
I think for a lot of people who don't have 4K sets it's hard to understand, and watching 4K movies on a 1080p screen won't get you any closer to seeing what it is either.
4K _needs_ HDR to really be something special. I've seen games that have been upgraded to 4K without HDR and the improvement is nominal, and the reasoning is obvious: you have all this extra clarity but not enough brightness on the screen to show it, because the pixels are really small at 4K. But once you've got 4K with HDR, all of a sudden these minute, tiny details in textures and particle effects can pop, and it's visible and wonderful. Couple that with something like OLED and you're there: you've got these properly lit tiny details and properly darkened ones, and you just feel like you're looking at something you've not seen before. Things like dust, or looking at star fields - details that are usually muddied by either low resolution or a lack of brightness - now display the way you'd think they should. Things can still get better, sure, but that's happening at the engine level. I think with 4K HDR we're really good to go here for a long time.

So it definitely sucks; I'd hate to have to choose, and hopefully they release another patch. But if I were forced to choose between 4K with better draw distances and 4K HDR, I think I would still choose the latter. The latter looks next-gen; the former just looks like maxed-out settings.
 
I think the Xbox One X is about as powerful as an Intel i7-7700HQ paired with a 1050 Ti - more or less a powerful gaming laptop.

I think an i7-7700HQ would be much faster than the Jaguar CPU in the Xbox One X. As for the 1050 Ti, I think the Xbox One X is faster; a 1060 could be comparable, though I think the X is still a bit faster. I think an RX 580 is the closer match.
 
Volumetric clouds, volumetric lighting, self-shadowing POM, lit particles and SSAO are indeed impressive for 2007. Unfortunately, the shading was not up to the quality of the 2006 demo, and Crysis Warhead was technically more advanced in its lighting as far as I remember. The motion blur looked very good, especially on the shell casings flying out of the handguns, and the nanosuit allegedly consisted of about 70,000 polygons, which would put it multiple times ahead of every other game at the time.
I don't understand why Yerli said the 2011 console version looks better.

Hopefully we will soon get more destruction and interaction possibilities in games again.
 