Digital Foundry Article Technical Discussion [2020]

Status
Not open for further replies.
Ampere will be very interesting, and I'm personally eyeing it up for my next GPU. Given the way DLSS is coming on it's going to be hard to justify RDNA2 unless it's a lot cheaper.

Indeed, seeing how Turing performs in rasterisation, ray tracing and DLSS/tensor workloads, Ampere can only improve on the 2018 Turing products. AMD has come a long way, especially on CPUs, but I think NV is still at least a generation ahead; it seems RDNA2 catches up with Turing.
NV basically had all the RDNA2 features and tech with Turing, and with Ampere they're going a step further.
Consoles have impressive hardware for 500 dollar boxes, but the PC is more interesting than ever: Ampere and full RDNA2 dGPUs around the corner with around 18TF or more, Zen 3 Ryzen CPUs that improve even more over what Zen 2 did, PCIe 4/5 and DDR5, and of course 7GB/s SSDs slated to launch alongside the adoption of DX12U/velocity architecture. We are in for a treat, especially if we see Sony ports later in the generation. I think today's GDDR6 VRAM on Turing is fast, but that will surely get faster too, perhaps approaching 1TB/s, or even HBM for the higher-end stuff.
Going to be interesting to see Alex's DF video analysis of games running on such hardware. 120fps 4K with DLSS 3 together with ray tracing could become possible.
 
The Touryst is Stunning on Xbox One And PC - A Superb Port of a Brilliant Switch Game

One of our favourite games from last year was Shin'en Multimedia's The Touryst - a game that looked quite unlike anything else and played beautifully. It's a simply wonderful experience and now it's available on Xbox One, Xbox One X and PC - and a technical masterpiece is just as good on more powerful hardware... with one or two caveats.

 
Lovely on the One X. Not too bad on One S, but definitely a jaggy fest. Performance still seems high on both.

Tommy McClain
 
Was playing this yesterday. It's really beautiful on the One X. The only thing I would say is that the distance blur is just a little too aggressive for my taste.
 
I'm finding it hard to get direct comparisons of RT performance, as all we've got at the moment is the MS Minecraft demo, versus lots of stuff online with DLSS involved! I get the feeling that pure RT performance from RDNA2 at the XSX level is probably above a 2060 (maybe?), but I guess we'll find out whether Nvidia's RT cores have less impact on the rest of the GPU for hybrid rendering in due course.

Ampere will be very interesting, and I'm personally eyeing it up for my next GPU. Given the way DLSS is coming on it's going to be hard to justify RDNA2 unless it's a lot cheaper.



It's really hard to know, isn't it. DF have an interesting DLSS graph that shows even though the 2080Ti has about double the INT8 performance of the 2060S, it's only about 50% faster at 1080p DLSS output from 540p (I think). As the base resolution and output resolution increase, more of the 2080Ti's Tensor performance seems to come into play. Maybe there's some fixed-function element getting in the way at lower resolutions.

A 2060S' Tensor cores should be about 7 times faster at INT8 than a hypothetical Lockhart at 4TF FP32, but that's pitting the entire MS GPU against just the Tensor cores in the Nvidia chip. But then again, at a base resolution (before MLSS) of 540p you'd probably have low utilisation of the 3D pipeline and be able to make good use of async compute to regain some of the utilisation lost to 540p rendering (huge pixels compared to polygons so inefficient for rasterisation and all that).

While we're on the speculation train, let's stay on for one more stop!

2060S in Death Stranding is giving DF figures of a 0.736 ms cost for DLSS. Taking this at face value (assuming it's not a separate stage with the full cost hidden), if half your Lockhart GPU time was taken up with ML upscaling, and the figures between the two are directly comparable (probably not), that'd be about 14 x 0.736 ms = ~10.3 ms, or less than one third of a 30 fps frame. Would this be better than native 1080p, or 900p with sharpening, for a 30 fps game?

Errr ... maybe? (And it might let you get away with shockingly low res textures and less time lost to RT too...)
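If anyone wants to sanity-check that back-of-envelope, here it is in Python. Every input is an assumption from the posts above (the 14x ratio especially), not a measured Lockhart figure:

```python
# Back-of-envelope for the ML upscaling speculation above.
# All inputs are forum assumptions, not measured console figures.

DLSS_COST_2060S_MS = 0.736         # DF's Death Stranding figure on a 2060S
PERF_RATIO = 14                    # assumed 2060S-Tensor vs Lockhart INT8 gap
FRAME_BUDGET_30FPS_MS = 1000 / 30  # ~33.3 ms per frame at 30 fps

upscale_cost_ms = DLSS_COST_2060S_MS * PERF_RATIO
share_of_frame = upscale_cost_ms / FRAME_BUDGET_30FPS_MS

print(f"ML upscale cost: {upscale_cost_ms:.1f} ms "
      f"({share_of_frame:.0%} of a 30 fps frame)")
```

So roughly 10.3 ms, a bit under a third of the 33.3 ms budget, which is where the "less than one third of a 30 fps frame" claim comes from.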

I just hope AMD releases some generic, free-to-use version of DLSS that's performant across varying hardware.
 
Last edited:
Was playing this yesterday. It's really beautiful on the One X. The only thing I would say is that the distance blur is just a little too aggressive for my taste.

Agreed. Open that baby up!

BTW, the guys at DF mentioned something about implementing raytracing on this for the Series X version. How would that look?

Tommy McClain
 
hm
https://www.eurogamer.net/articles/digitalfoundry-2020-horizon-zero-dawn-pc-tech-review

Average frame-rates from the benchmark don't tell the full story. Apparently, I can run Horizon Zero Dawn at native 4K at an average of 78 frames per second on an RTX 2080 Ti. What it doesn't tell you is that the actual game experience will be a fair bit lower than that, with frequent stutters. Stutters in excess of 40ms, 70ms or over 100ms can happen as a cutscene starts or ends, when a camera changes position in a cutscene, when a UI element updates for a quest, or when you are just walking around in the world not doing anything special in particular. This happens reproducibly across multiple graphics cards, CPUs and chosen resolutions, impacting the fluidity of the game, producing an experience less consistent overall than the PlayStation 4 version, which has no such stutter.

I thought that dropping to console-level 30fps might solve the issue, but the 30fps cap within the game actually runs at 29fps, producing even more stutter. Also, if you are experiencing profound performance problems, make sure your mainboard is properly configured for 16x PCIe bandwidth for the GPU. This one's on me: my slot was set to 8x bandwidth and it hobbled performance, while switching up to 16x solved that particular problem. Going back to Death Stranding, PCIe bandwidth made no difference at all.

Cutscenes run at arbitrary frame-rates but facial animation is locked to 30 frames per second

...
Another problem in cutscenes is how they were not authored around the idea of interpolated frame-rates above 30fps, so in some cutscenes you can see characters warp around during scene cuts. Mismatches in animation refresh are evident elsewhere: Horizon's tall 'stealth grass' runs at the correct frame-rate at all times, but the new dynamic plants and foliage added to the PC version are locked at 30 frames per second refresh instead. An unlocked frame-rate needs to mean just that - picking and choosing what can meet the limits of PC hardware and what remains locked to 30Hz shouldn't be an option. What's so baffling about this is that Guerrilla Games are perfectionists - I can't help but feel that intrusive stutter and mismatched animation would never make their way into one of their PlayStation products so it's disappointing to see that happen here.

Our feedback was submitted to the developer and we understand that addressing the stutter and fixing the broken texture filtering is a priority, while essential features like full frame-rate animation are being looked into.

Some hope things are addressed.
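The thresholds DF quotes (40ms, 70ms, over 100ms) are easy to check against your own captures if you log frametimes with a tool like PresentMon. A minimal sketch in Python, with a made-up trace standing in for real data:

```python
# Count stutter spikes in a frametime trace, using the thresholds
# mentioned in the DF review. The sample trace below is made up.

def count_stutters(frametimes_ms, thresholds=(40, 70, 100)):
    """Return how many frames exceed each threshold (thresholds in ms)."""
    return {t: sum(1 for ft in frametimes_ms if ft > t) for t in thresholds}

# Hypothetical capture: mostly ~16.7 ms (60 fps) with a few big spikes.
trace = [16.7] * 50 + [45.0, 16.7, 72.0, 16.7, 110.0]
print(count_stutters(trace))  # -> {40: 3, 70: 2, 100: 1}
```

Averages hide exactly this: that trace still averages close to 60 fps, yet it has three spikes a player would feel.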
 
Last edited:
Did From Software take care of the port? Gosh, they should have delayed this. Hopefully performance will get better.

edit: 50€ for this game in this state is a joke. Sorry for the developers; maybe they should have been given more time.
 
Practice run for the PS5 Enhanced version, right? ;)

But yeah, I don't see why they didn't just delay it. Could have just relaunched the game simultaneously with PS5? *shrug*
 
hm
https://www.eurogamer.net/articles/digitalfoundry-2020-horizon-zero-dawn-pc-tech-review



... At least someone is making 343i's Halo Infinite PC beta look good ...


 

Check out the comparison between RX580 and GTX1060 at 19:12.

Someone pointed out an interesting quirk: the hair physics on AMD cards behave differently from Nvidia's, and it's the Nvidia behaviour that somehow looks correct.

On the AMD cards it moves like a single unit, whereas on Nvidia cards it's much smoother and more life-like, with more movement/flow - which might be causing serious FPS issues if the same thing applies to every bit of fur/hair etc.
 
Hi @Dictator

https://www.eurogamer.net/articles/digitalfoundry-2020-horizon-zero-dawn-pc-tech-review



Horizon Zero Dawn is a port by developer Virtuos. You should maybe add that to your article before people think GG actually did the port.
We have heard this as well, and I imagine it to be true given what I see here - yet all of our communication before release about the technical issues we were experiencing was with Guerrilla and not the port house. Very different from other games with a port house, where we communicate with them indirectly (Benox, Nixxes, etc.).
 
Looking at Horizon Zero Dawn for PC made me wonder what the end result would've been like if the developers had been able to use Mantle instead of D3D12 ...
 
The PCIe bandwidth scaling is interesting.

I wonder how next gen GPUs with PCIe 4 bus speeds will scale with this game?

Are there any benches using a 5700 XT comparing Gen3 vs Gen4?
 
Just watched the video analysis. Ouch, that's pretty devastating! How could they release a game in this state?

Great analysis though. This is definitely one to hang fire on until it's either fixed or gets much cheaper.
 
The PCIe bandwidth scaling is interesting.

I wonder how next gen GPUs with PCIe 4 bus speeds will scale with this game?

Are there any benches using a 5700 XT comparing Gen3 vs Gen4?

I wonder why this game is so dependent on PCIe bandwidth, even when the VRAM is not full... At the same time, the game was so heavily developed and tweaked for the PS4 that I'm not that surprised the PC port is a mess.
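For a rough sense of what's at stake, these are the approximate usable per-lane rates for PCIe 3.0 and 4.0 (after 128b/130b encoding overhead). Back-of-envelope only; real transfers won't hit these peaks:

```python
# Approximate usable bandwidth for the PCIe configs discussed above
# (GB/s per lane after 128b/130b encoding; rounded reference figures).

GB_PER_LANE = {"Gen3": 0.985, "Gen4": 1.969}

bandwidth = {
    f"{gen} x{lanes}": round(per_lane * lanes, 1)
    for gen, per_lane in GB_PER_LANE.items()
    for lanes in (8, 16)
}
print(bandwidth)
# Gen3 x8 (the slot config that hobbled DF's run) offers roughly
# a quarter of Gen4 x16's peak throughput.
```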
 

Good to see that an SSD does indeed improve texture pop-in, as I long suspected but was repeatedly told was not the case ('SSDs on consoles only improve loading times').

Interesting, the varying degrees of improvement on loading - what's that all about? From ~20% to ~3x improvements depending on the game!?
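One plausible explanation for that spread: load times are only partly raw I/O, and the rest is CPU work (decompression, setup) that a faster drive can't touch, so the overall gain is capped Amdahl-style. A toy illustration with made-up fractions:

```python
# If only a fraction of load time is raw I/O, speeding up just that
# fraction caps the overall gain (Amdahl's law). Fractions are made up.

def overall_speedup(io_fraction, io_speedup):
    """Overall load-time speedup when only the I/O share gets faster."""
    new_time = (1 - io_fraction) + io_fraction / io_speedup
    return 1 / new_time

# Same hypothetical 10x-faster SSD, two very different games:
print(f"{overall_speedup(0.25, 10):.2f}x")  # CPU-heavy load: modest gain
print(f"{overall_speedup(0.90, 10):.2f}x")  # I/O-heavy load: big gain
```

A game that spends a quarter of its load on I/O only gains ~1.3x, while a mostly I/O-bound one gains over 5x - which would cover the ~20% to ~3x range DF saw.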
 