Digital Foundry Article Technical Discussion [2020]

Enthusiast forum: "why does XSS even exist, no one could possibly even want this"

Rich from DF: "I know someone who cancelled their Cyberpunk Pre-order for PS4 and bought an XSS just to play it"

If they are into Game Pass, it's great, but without Game Pass it essentially becomes a 100 dollar/euro discount on a PS5 where you lose the advanced controller, half the storage and more than half of the performance, which makes it not such a good deal after all. Especially if the person knows Richard from Digital Foundry, who should warn against 'deals' like this, imo. If a friend was choosing between a 300 dollar GPU and a 400 dollar one, and the 400 dollar one had more than double the performance as well as other great features (and games), you would at least try to inform the friend.
If the situation were the other way around, with Xbox having the better games, controller and performance, and more storage, for 100 dollar/euro more, I would advise getting the 399 Xbox, so it's not about the brand or anything.

As for the video, the PS5 appears to have less motion blur, but it could be the encoding.

I believe the game will be a clear win for Series X once the patches are out. The marketing deal, the fact that the One X already had advancements over the PS4 version, and CDPR's history of, let's call it, 'having a better understanding of the Xbox hardware/SDK' make me predict that Series X will have slightly better image quality, a more stable frame rate and slightly fewer bugs. I would say it will be the definitive version. It already is if you have a modern LG OLED (with VRR) or a similar display.
 
Being an Xbox owner, the launch-era 720p days still leave a taste of pessimism. We have seen a pretty consistent trend so far with next-gen builds, and I wonder if that will continue when the patches come out. Even more so with @chris1515 slipping in mentions of a new reconstruction library/technique in Sony's dev tools.

One thing I feel slightly better about is that CDPR are not afraid of technology, and I would expect deeper feature use in both next-gen ports. DX12U being on both PC and Xbox makes me hopeful we will see something closer to a true next-gen game and not an up-port of the Xbox One version.
 

Microsoft have something in the oven too: they showed a demo of an ML reconstruction technology. And AMD are preparing a reconstruction technology for PC and consoles. All is OK.
 

Are you talking about this? If so, that was just DLSS running through DirectML.

I can't find the link right now, but AMD have recently said that their DLSS alternative won't be ML-based.

So it seems that for now Nvidia are still the only game in town for machine-learning-based resolution upscaling. It may be that whatever AMD comes up with is just as good, though; we'll have to wait and see.
 

Did I say Sony's reconstruction doesn't have an ML part? I won't pretend I know how it works, but I've heard the acronym, and it reminds me a lot of DLSS, just one letter of difference. It helps with anti-aliasing too. Those are the only details I know, and I'm guessing at what the acronym stands for.

I am sure MS are preparing some AI-based reconstruction.
 
XSS is proving itself to be a surprisingly punchy little machine, at least for cross-gen games.

XSX vs PS5 is interesting because, as they're running the last-gen versions, this entire game should fit into the GPU-optimal RAM on the XSX. And yet the PS5 retains its usual small advantage in performance at 60 Hz, just like with next-gen/GDK builds of XSX games. To me, this would indicate again that the fast/slow RAM areas aren't particularly a factor here. Although this could be confirmation bias on my part, because I never expected they would be.
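To put rough numbers on the fast/slow split: the public XSX figures are 10 GB at 560 GB/s and 6 GB at 336 GB/s. Here's a quick back-of-envelope sketch of the blended bandwidth as GPU traffic spills into the slow pool; the traffic fractions are purely illustrative, not measured from any game:

```python
# Public XSX memory specs: 10 GB "GPU optimal" @ 560 GB/s,
# 6 GB "standard" @ 336 GB/s.
FAST_BW = 560.0  # GB/s
SLOW_BW = 336.0  # GB/s

def blended_bandwidth(fast_fraction: float) -> float:
    """Average bandwidth when `fast_fraction` of GPU traffic hits the
    fast pool; we average the per-byte costs (1/BW), not the bandwidths."""
    cost = fast_fraction / FAST_BW + (1.0 - fast_fraction) / SLOW_BW
    return 1.0 / cost

# A cross-gen game fitting entirely in the 10 GB pool pays no penalty:
print(f"100% fast: {blended_bandwidth(1.0):.0f} GB/s")   # 560 GB/s
# Illustrative case with a quarter of the traffic in the slow pool:
print(f" 75% fast: {blended_bandwidth(0.75):.0f} GB/s")  # 480 GB/s
```

Which lines up with the point above: if the whole game sits in the fast pool, the split shouldn't cost anything.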

The comparison between XSX and XSS is odd. In quality mode the XSX is running at less than twice the resolution of the XSS, with basically the same settings. Yet the XSX has more than twice the fill rate, peak geometry rate, memory bandwidth and cache bandwidth, and more than three times the raw compute.

In performance mode the XSX is again pushing out less than twice the pixels of XSS with some noticeable dips to boot (so it's definitely struggling), and that's despite XSX having GI turned off!

Yeah, I know, it's a last-gen game that's only generation-aware, but in this case the comparison between XSX and XSS (let's leave PS5 out of this) raises some questions IMO, because it's the same tools, same code, same architecture.
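For reference, the ratios above check out against the spec sheets (12.15 vs 4.01 TF, 560 vs 224 GB/s, and the commonly cited 64 vs 32 ROPs). The two resolutions at the end are placeholders to illustrate 'less than twice the pixels', not DF's measured figures:

```python
# XSX vs XSS hardware ratios from public spec-sheet numbers.
# fill_gpix = ROPs * clock (GHz); ROP counts are the commonly cited ones.
xsx = {"tflops": 12.15, "bandwidth_gbs": 560, "fill_gpix": 64 * 1.825}
xss = {"tflops": 4.01,  "bandwidth_gbs": 224, "fill_gpix": 32 * 1.565}

for key in xsx:
    print(f"{key}: XSX/XSS = {xsx[key] / xss[key]:.2f}x")
# tflops:        3.03x
# bandwidth_gbs: 2.50x
# fill_gpix:     2.33x

# Placeholder resolutions (hypothetical, for illustration only):
print(f"pixel ratio: {2560 * 1440 / (1920 * 1080):.2f}x")  # 1.78x
```

So a sub-2x pixel ratio against a 2.3-3x hardware gap is exactly the oddity being flagged.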
 

I would agree with you about many of the SX comparisons with PS5 that have already been shown, and it is definitely interesting. But with the SX in performance mode having higher effects, higher resolution, and greater crowd and car density than PS5, as per NX Gamer's comparison, it would seem like this would at least be a factor in why the PS5 is able to retain an FPS advantage.
 
MS are researching image reconstruction through ML on the XSX. DF mentioned it in the same video where they mentioned that AMD's alternative won't be ML-based.
In fact, MS have mentioned ML super resolution in multiple materials they have released. The XSX lead architect Andrew Goossen mentioned it in the XSS DF interview last month, and I am quoting here: "It's an area of very active research for us, but I don't really have anything more to say at this point." In the Game Awards "Next-Gen Gaming" talk with Phil Spencer, Lisa Su, and a representative from EA, Phil specifically called out using ML to get better IQ using less bandwidth/teraflops. He said this in the context of doing more with less as something they are working on. All this, plus the fact that pretty much all the materials they have published about the inclusion of INT4/8 for ML mention super resolution, makes me think we will see an ML-based solution from MS.

It remains to be seen how effective their solution will be compared to Nvidia's, but there is nothing saying it won't be more effective. ML-based image reconstruction is still an emerging area of graphics, and there is nothing to say that DLSS 2.0 is the best it is ever going to be, or that you must have a certain amount of INT4/8 capability to get an effective solution. Of course, such a solution, so long as it is portable and can scale, should be faster on Nvidia's hardware with its dedicated tensor cores.
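For a sense of why that INT4/8 throughput keeps coming up: Microsoft's published XSX figure is roughly 49 TOPS INT8 (about 97 for INT4). Nobody outside Nvidia knows what DLSS actually costs per pixel, so the network costs below are made-up placeholders; the point is just how the ops budget bounds how heavy a reconstruction network can be:

```python
# XSX INT8 throughput per Microsoft's public disclosures (~49 TOPS).
# The ops-per-pixel costs below are hypothetical, not DLSS's real cost.
INT8_TOPS = 49e12
OUTPUT_PIXELS = 3840 * 2160  # reconstructing to a 4K target

for ops_per_pixel in (10e3, 50e3, 100e3):
    cost_ms = OUTPUT_PIXELS * ops_per_pixel / INT8_TOPS * 1e3
    print(f"{ops_per_pixel / 1e3:.0f}K ops/px -> {cost_ms:.1f} ms per frame")
# 10K ops/px  -> ~1.7 ms
# 50K ops/px  -> ~8.5 ms
# 100K ops/px -> ~16.9 ms (a whole 60 fps frame is only 16.7 ms)
```

A heavy network either drops to INT4, shrinks, or eats the frame budget, hence the repeated emphasis on those rates.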
 

My understanding was that in performance mode, the only real difference between PS5 and XSX was car and crowd density, and in the DF video at least the XSX seems to hold up very well there, so that's cool. The drops I was referring to were in some of the shooting sections, where random crowds didn't seem to be a factor (perhaps you can confirm, if you've played those bits?).

Some dips on XSX directly coincided with gun or explosion flashes taking over a large part of the screen. As the game should all be in GPU-optimal RAM, I think that rules out memory BW, so the bottleneck must be somewhere else. The XSS, with much less BW and less than half the fill rate, seems relatively more resilient than the XSX. At least in this game, at this time.
 
I think it is going to be interesting when we finally learn whether PS5 really does have a unified cache, which could explain the seemingly across-the-board advantage PS5 has in most high-FPS modes. From the PC benchmarks, the 2700X barely holds 60 fps on ultra settings, so that would match both XSX and PS5 having issues locking that framerate.
 

https://videocardz.com/newz/cyberpunk-2077-gets-fps-boost-with-a-patch-for-amd-ryzen-cpus

Maybe CD Projekt need to improve the SMT code on the Zen 2 CPUs of the PS5 and Xbox Series.
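For context, that Ryzen patch was about the game not scheduling work onto SMT (logical) threads on AMD CPUs. A minimal sketch of the sizing decision involved; psutil is a third-party library here, since os.cpu_count() only reports logical processors:

```python
import os

import psutil  # third-party; used for the physical-core count

logical = os.cpu_count()                    # e.g. 16 on a 3700X (8C/16T)
physical = psutil.cpu_count(logical=False)  # e.g. 8

# Ignoring SMT halves the worker pool on an 8C/16T part -- roughly what
# the pre-patch game did on Ryzen. The console CPUs are 8C/16T Zen 2 too.
print(f"workers without SMT: {physical}, with SMT: {logical}")
```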
 

Yeah, maybe it will improve. But keep in mind also that those CPUs on PC are running at much higher frequencies.

Edit: sorry, I got my AMD naming confused... again. The 2700X is Zen+, not Zen 2. A 3700X gets above 60 fps average, but also at higher frequencies. So let's see what the future holds when more patches are released.
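The clock gap, using public figures (PS5 up to 3.5 GHz, XSX 3.66 GHz with SMT, 3700X boosting to 4.4 GHz); PC boost clocks are best-case single-core numbers, so this flatters the PC side:

```python
# Public CPU clocks; the PC boost figure is a best-case number.
clocks_ghz = {
    "PS5 (SMT on)": 3.5,
    "XSX (SMT on)": 3.66,
    "3700X (boost)": 4.4,
}
baseline = clocks_ghz["PS5 (SMT on)"]
for name, ghz in clocks_ghz.items():
    print(f"{name}: {ghz} GHz ({(ghz / baseline - 1) * 100:+.0f}% vs PS5)")
# The 3700X boost comes out ~26% above the PS5's CPU clock -- close to
# the "25%" figure cited later in the thread.
```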
 

Not only that, you have a lower-level API on consoles, and the console CPUs do less work too: no I/O handling on PS5, not much on XSX/XSS, and some of the audio is done elsewhere on consoles as well.
 
should be faster on Nvidia's hardware with its dedicated tensor cores.

It's a hardware thing in the end too. But as is evident in other sections of the forum, DLSS-like tech is generally disliked by console gamers.


The PS5 CPU won't be faster than a 3700X; the frequency differential is too large. GPUs are going to lift some I/O work later on PC too. Anyway, I think the streaming portion of CP2077 is mighty impressive, perhaps the most impressive so far.
 

I never said it will be faster; using boost clocks, a 3700X is 25% faster than the PS5 CPU in theory. I said the gap is not that big, and, first of all, if SMT is handled well on Ryzen CPUs and the console CPUs it will help. There are other things besides decompression done on the PS5 I/O complex, like memory mapping and check-in, and 3D audio is done on the Tempest Engine side. And console APIs are a bit thinner.

My main concern with Cyberpunk 2077 is not the CPU but the GPU and the triangle-based raytracing.

The best streaming number I have heard is Demon's Souls, currently 3 to 4 GB/s of compressed data when needed, but that is a remake of a linear 2009 game and it does not push streaming or the CPU very far.
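Putting that 3-4 GB/s figure against PS5's published I/O numbers (5.5 GB/s raw, 8-9 GB/s 'typical' with Kraken compression); the RAM-available-to-games figure below is an approximation, not an official number:

```python
# PS5 I/O: 5.5 GB/s raw, ~8-9 GB/s typical compressed (Sony's figures).
demons_souls_gbs = 4.0    # upper end of the figure quoted above
typical_compressed = 8.5  # midpoint of the 8-9 GB/s typical rate

print(f"share of the I/O ceiling used: {demons_souls_gbs / typical_compressed:.0%}")  # 47%

game_ram_gb = 12.0  # rough assumption for RAM available to a game
print(f"time to refill all of it at 4 GB/s: {game_ram_gb / demons_souls_gbs:.0f} s")  # 3 s
```

So even the most impressive streamer so far sits at about half the ceiling, which fits the point that a linear 2009-design remake doesn't push it very far.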
 