Digital Foundry Article Technical Discussion [2020]

This is what I expect too, I guess, but the one thing throwing a wrench in it is the idea of them handling incoherent rays with a different performance profile. Or the INT32/FP32 split in Turing giving it different performance characteristics in some RT titles than RDNA 2 has.

Something along those lines.

I can't remember. Has DF ever done any testing of ray tracing in Metro, Control, or Battlefield with the frame rate capped at 30fps? Really curious to see if the de-noising struggles at pathetic frame rates.
 
NV's first hardware tessellation implementation was more performant than AMD/ATI's despite AMD/ATI having had multiple iterations of hardware tessellation over the years.
NV's history with tessellation goes back further, I think:

NV1, their first GPU, was built around rational Bezier patches, which are the building blocks of NURBS. No triangle support at all. (Probably targeting the professional CAD market rather than games, although everybody seems to see it as a failed attempt at a gaming GPU.)
GeForce 3 had Bezier patches too, which I used back then. The integer/fractional tessellation options it exposed were the same ones that later became standard with the actual tessellation stage. (I don't know if NV1 already had this too; likely yes.)

On ATI's side I only remember PN triangles before the tessellation stage came up.
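For anyone who hasn't touched these primitives: a bicubic Bezier patch is just a 4x4 grid of control points, and the tessellator's job is to evaluate the surface at a grid of (u, v) parameters whose density comes from the tessellation factor. Here is a minimal sketch in Python (my own illustration, not NV1/GeForce 3 code; the fractional spacing is simplified, not the exact D3D11 fractional_even/odd rules):

```python
import math

def lerp(a, b, t):
    """Linear interpolation between two 3D points."""
    return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

def bezier_curve(p0, p1, p2, p3, t):
    """Cubic Bezier evaluation via de Casteljau's algorithm."""
    a, b, c = lerp(p0, p1, t), lerp(p1, p2, t), lerp(p2, p3, t)
    d, e = lerp(a, b, t), lerp(b, c, t)
    return lerp(d, e, t)

def bezier_patch(ctrl, u, v):
    """ctrl is a 4x4 grid of control points; returns the surface point at (u, v)."""
    rows = [bezier_curve(*row, u) for row in ctrl]  # collapse each row along u
    return bezier_curve(*rows, v)                   # then collapse the result along v

def edge_samples(tess_factor, fractional=True):
    """Parametric sample positions along one patch edge for a given tessellation factor."""
    # Integer mode snaps the factor up to a whole segment count (vertices pop in);
    # fractional mode keeps the fraction, so new vertices slide in gradually.
    segments = tess_factor if fractional else math.ceil(tess_factor)
    return [min(i / segments, 1.0) for i in range(math.ceil(segments) + 1)]
```

For example, edge_samples(2.5) gives [0.0, 0.4, 0.8, 1.0], while the integer mode gives [0.0, 0.33, 0.67, 1.0]; that sliding behaviour is why fractional tessellation avoids popping when the factor animates.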
 
I can't remember. Has DF ever done any testing of ray tracing in Metro, Control, or Battlefield with the frame rate capped at 30fps? Really curious to see if the de-noising struggles at pathetic frame rates.
From personal tests using Quake 2 RTX on my vanilla 2060, I find the denoiser looks fine at 4K at about 7-10 FPS. 1080p looks pretty fine at 45 FPS too, and a little noise starts to show at 720p at 95 FPS. Here's the thing with Quake 2 RTX for me, though: at any resolution other than 4K, it's noisy and a bit brighter until I toggle vsync on or off. It's not as noisy as disabling the denoiser; it's more like the brightness is turned up a bit and an overly aggressive sharpening filter is being applied. But after that, it's pretty clean looking regardless of frame rate. Resolution has a bigger impact than framerate.

I can't test Control, Metro or BFV because I don't own those. The only other RTX enabled game I own is Shadow of the Tomb Raider, and I don't think I could pick out denoiser artifacts in that game if I tried.
 
I can't remember. Has DF ever done any testing of ray tracing in Metro, Control, or Battlefield with the frame rate capped at 30fps? Really curious to see if the de-noising struggles at pathetic frame rates.
I have looked at it before.
I think the denoiser on the diffuse GI and reflections in Control does not really do a good job at 30 fps, as you can see the smear and trails much more easily. A controller "helps" for lateral movement, since controller movement is inherently linear, slow, and imprecise, but any moving object at 30 fps has visible problems with their denoisers for those two facets, IMO.

Battlefield has a high feedback denoising scheme so it actually looks "OK" at lower framerates on the RT reflections.

Metro once again has, in the base game at launch, a more obvious and slower temporally based denoiser: at 30 fps you could see trailing behind fast-moving objects. This changed when The Two Colonels released, though, as adding emissives meant the denoiser needed to speed up and not leave trails if it was going to work; otherwise you would fire the flamethrower and see ghosting and trails everywhere. Interviewing them at Gamescom 2019, they mentioned how much work went into cutting down ghosting and trails in their denoiser after The Two Colonels.
Though that was at 60 fps, I have not tried it at 30.
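For a sense of why 30 fps makes this worse: most of these schemes boil down to an exponential blend between reprojected history and the new noisy sample, and a stale history value (the ghost) fades over a fixed number of frames rather than a fixed amount of time. A toy sketch of that, with a made-up blend weight rather than any of these games' actual numbers:

```python
# Toy model of temporal accumulation ("feedback") denoising:
# out = history + alpha * (noisy - history). A small alpha means strong
# feedback: a smoother image, but longer trails behind moving objects.

def frames_to_fade(alpha, residual=0.1):
    """Frames until a stale history value decays below `residual` of its start."""
    frames, ghost = 0, 1.0
    while ghost > residual:
        ghost *= (1.0 - alpha)   # each frame keeps (1 - alpha) of the old history
        frames += 1
    return frames

alpha = 0.05                   # hypothetical new-sample weight
n = frames_to_fade(alpha)      # ~45 frames for 90% of the ghost to fade
for fps in (60, 30):
    print(f"{fps} fps: ghost persists ~{n} frames = {1000 * n / fps:.0f} ms")
# 60 fps: ~750 ms; 30 fps: ~1500 ms. The same denoiser leaves a trail that
# lasts twice as long in wall-clock time (and the object has moved twice as
# far across the screen) at 30 fps, so the smearing is much more visible.
```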

IMO, the Quake 2 RTX denoiser in v1.3 is pretty amazing. Not a lot of trails, very stable. At 30 fps... it becomes more visible, though.

I think in the end, the denoising at 30 fps in console titles might be visibly "problematic" for people used to 60 fps games on PC. But if you look at a game like UC4 and how its 30 fps presentation is with its motion blur and TAA (which, IMO, are not good), yet that game is praised very much, I think most people would end up just collectively ignoring denoising ghosting and slowness, like they do all the time in 30 fps titles.
 
NVIDIA had an interesting presentation planned for GDC

Fast Denoising with Self-Stabilizing Recurrent Blurs

In this talk NVIDIA is going to discuss the latest advancements in non-DL-based denoising. Building on previous work from Metro Exodus, a new method has been introduced which is also based on a recurrent blur, but with a lot of improvements: better overall performance, cleaner results, specular denoising support, fast data reconstruction, better bilateral weighting, and others.

https://developer.nvidia.com/gdc20-show-guide-nvidia

Not sure if this is something that is already being used in UE4/Q2RTX, but seeing how denoising can add a whopping 40% to frametime in Q2RTX, more performant denoisers would definitely be welcome.
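Not having seen the slides, my reading of "self-stabilizing recurrent blur" from that abstract is: the blurred, accumulated output of each frame is fed back as the next frame's history, and the blur radius shrinks as per-pixel history accumulates, so the image settles instead of staying permanently mushy. A rough sketch of that control logic (a guess at the structure, not NVIDIA's actual code, and all constants are made up):

```python
# Sketch of a recurrent-blur denoiser's per-pixel control loop: the output of
# frame N is spatially blurred and fed back as history for frame N+1, with the
# blur radius shrinking as the accumulated history gets longer.

MAX_RADIUS_PX = 30.0   # hypothetical wide blur for freshly disoccluded pixels
MAX_HISTORY = 32       # hypothetical cap on accumulated frames

def blur_radius(history_len):
    """Wide blur when there is little history, narrow once accumulation is stable."""
    return MAX_RADIUS_PX / (1.0 + history_len)

def accumulate(history, history_len, noisy, disoccluded):
    """One frame of the recurrent loop for a single pixel value."""
    if disoccluded:
        history_len = 0                    # reprojection failed: restart accumulation
    alpha = 1.0 / (1.0 + history_len)      # young pixels take more of the new sample
    blended = history + alpha * (noisy - history)
    # A real implementation would now run a spatial blur with radius
    # blur_radius(history_len) over the blended result and feed that blurred
    # image back as next frame's history -- the "recurrent" part.
    return blended, min(history_len + 1, MAX_HISTORY)
```

So a freshly disoccluded pixel gets a ~30 px blur while one with 31 frames of history gets under 1 px, which is what keeps it both fast (no giant blur every frame) and stable.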
 
IMO, the Quake 2 RTX denoiser in v1.3 is pretty amazing. Not a lot of trails, very stable. At 30 fps... it becomes more visible, though.
Strangely enough, when I was doing my non-scientific denoiser comparisons with Quake 2 RTX I didn't walk around a lot. At lower resolutions you can see a bunch of noise in the image without moving. But yeah, after capping the framerate to 24 and walking around, I see trails there. It's different from the low-res noise, though; that's more like swimming pixels.
 
But if you look at a game like UC4 and how its 30 fps presentation is with its motion blur and TAA (which, IMO, are not good), yet that game is praised very much, I think most people would end up just collectively ignoring denoising ghosting and slowness, like they do all the time in 30 fps titles.

True, unless 60fps without the need for motion blur becomes a thing on consoles, but I doubt it if RT makes its entry and 4K (or close to it) is a target, along with next-generation graphics. It will be interesting to see if more Sony (PS5) exclusives make it to PC.
 
Pretty much in line with what I was expecting.
Compared to the 1X it looks to be a match, and it would probably hold framerate better due to the CPU.
My take is that, relative to XSX, it would be, as expected, somewhere between 1080p and 1440p.
One of the reasons I would like slightly more than 4TF is to get it closer to 1440p and give it a bit more legs to stay at/above 1080p in the future. But not if it meant it would impact cost.
 
Nice comparison as always, but I think Richard missed an opportunity to talk about Lockhart's role as a possible xCloud server blade to replace XB1S units.

Tommy McClain
 
Might be interesting to see the same thing done with the newer rumored 7TF Lockhart, to see how much improvement over the 6TF GCN One X it could bring.
 
Also worth noting that it was done by just lowering resolution, not settings, which could also be lowered at lower resolutions if they need to claw back a bit more performance.

I think a 7TF Lockhart may be too close to XSX in both visual output and price.
Personally, a nice 4.5-5TF would probably be enough IMHO.
 
Also worth noting that it was done by just lowering resolution, not settings, which could also be lowered at lower resolutions if they need to claw back a bit more performance.

I think a 7TF Lockhart may be too close to XSX in both visual output and price.
Personally, a nice 4.5-5TF would probably be enough IMHO.

Yeah, I think they have to have enough of a gap that the price differentiates enough. A 4-5TF chip would be small enough to shave off some cost. Not to mention dropping the disc drive, which I suspect they will as well.

You could "cheap out" on cooling as well.
 
It’ll be pretty tight between the GDDR6, the APU, and the SSD. Not really much room for the remaining components with a sub-$300 release (or even $249 if the imgur leak last year is believable).
 
Yeah, I think they have to have enough of a gap that the price differentiates enough. A 4-5TF chip would be small enough to shave off some cost. Not to mention dropping the disc drive, which I suspect they will as well.

You could "cheap out" on cooling as well.
  • Smaller APU
  • Cheaper cooling
  • Cheaper power delivery
  • Less and slower GDDR6 memory
  • No disc drive
    • No second hand market
    • Higher margins compared to physical sales
    • People moving to digital anyway but this is forced
  • Higher sales(after initial launch months)
I don't know how much all of that adds up to, but it could be enough to not lose too much while keeping the price low.

The price point would not be below $300 if XSX is $500.
Below that price point is the X1SAD, and possibly the X1S kept around for a bit longer. The 1X gets EOL'd.

Lockhart isn't a price replacement for the X1 or the X1SAD; it's the entry level for next gen.
 
  • Smaller APU
  • Cheaper cooling
  • Cheaper power delivery
  • Less and slower GDDR6 memory
  • No disc drive
    • No second hand market
    • Higher margins compared to physical sales
    • People moving to digital anyway but this is forced
  • Higher sales(after initial launch months)
I don't know how much all of that adds up to, but it could be enough to not lose too much while keeping the price low.

The price point would not be below $300 if XSX is $500.
Below that price point is the X1SAD, and possibly the X1S kept around for a bit longer. The 1X gets EOL'd.

Lockhart isn't a price replacement for the X1 or the X1SAD; it's the entry level for next gen.

I agree that it's unlikely it would be below $300. I think if Lockhart does get released, the Xbox One X is pretty much discontinued right away. The Xbox One S might stick around for a bit, but I think even then its days will be numbered. The whole point of Lockhart, presumably, is to capture those who would buy a cheaper console. If the MSRP of Lockhart is $299 or even $349, they could have discounts/sales that would put it into pretty affordable territory.
 
DF emulates a potential Lockhart machine:

It's impressive how a 4TF GPU with 22 CUs can beat and cruise past the performance of previous 6+ teraflop GPUs from AMD. That's the best part of the video for me. From a company coming out of an obscure era of crappy GPUs (PS4, Xbox One) and CPUs, to having the best CPUs on the market by a long shot, the most efficient too, and GPUs that are getting closer to nVidia's high standard of efficiency. Not quite there, but... close.
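As a quick sanity check on those numbers, using the standard CUs x 64 lanes x 2 ops (FMA) x clock formula (my own back-of-envelope arithmetic; the ~1.42 GHz figure is just what the formula implies for 22 CUs at 4 TF, not a known clock):

```python
def tflops(cus, clock_ghz):
    """FP32 throughput: CUs x 64 lanes x 2 ops per clock (FMA) x clock."""
    return cus * 64 * 2 * clock_ghz / 1000.0   # GFLOPS -> TFLOPS

print(f"Xbox One X (40 CUs @ 1.172 GHz): {tflops(40, 1.172):.1f} TF")  # ~6.0 TF
print(f"22 CUs @ ~1.42 GHz:              {tflops(22, 1.42):.1f} TF")   # ~4.0 TF
```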

Also, what's with the naming of AMD's modern graphics cards? 5500 XT, 5700 XT... Back in the mid-2000s there was already a 5900 XT! (edit: my bad, that was nVidia's)

edit2: but hey, look at this, nVidia gotta do something https://www.techpowerup.com/gpu-specs/radeon-rx-5900-xt.c3481

https://www.techpowerup.com/gpu-specs/radeon-rx-5950-xt.c3487
 
crappy GPUs (PS4, Xbox One)

The CPUs were crap. The GPUs weren't that crappy: mid-range offerings for the time, perhaps somewhat below. XSX will be mid-range if we get Big Navi and Ampere this year. Many are comparing against the RTX 2000 series from 2018, which by then will be over two years old; a 3070 will be matching a 2080/S at the least. Perhaps a 3060 will match the consoles.

the best CPUs on the market by a long shot

Zen 2, 8 cores, most likely comparable to a 3700X, which is a long shot from the best AMD has to offer right now. No idea what the clocks will be, but they probably won't boost to 4GHz or higher. Maybe a 3.2 to 3.5GHz base clock.
 