Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

While playing Scorn, everything about it makes me think it's perfect for UE5 and Nanite. It looks gorgeous, but I can only imagine all that detail shining with Nanite.
It runs perfectly on my aging 4790K and 2070 Super at 1440p, 80-90 fps. If I enable FidelityFX (I assume it's FSR 2.0) I can do 4K/60. Sadly there is some minor stuttering with every new area.
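For anyone curious how FidelityFX gets from 1440p-class performance to 4K/60: FSR 2.0 renders internally at a fraction of the output resolution and reconstructs temporally. A minimal sketch of the internal resolutions, assuming the standard FSR 2 quality-mode scale factors AMD documents (which mode Scorn's option actually maps to is my assumption):

```python
# Internal render resolutions for FSR 2.0's standard quality modes at a 4K
# output. Per-axis scale factors are the ones AMD documents for FSR 2; which
# mode Scorn's FidelityFX option actually uses is an assumption.
TARGET_W, TARGET_H = 3840, 2160

MODES = {
    "Quality": 1.5,            # ~67% per axis -> 1440p internal at 4K
    "Balanced": 1.7,           # ~59% per axis
    "Performance": 2.0,        # 50% per axis -> 1080p internal at 4K
    "Ultra Performance": 3.0,  # ~33% per axis
}

for name, factor in MODES.items():
    w, h = round(TARGET_W / factor), round(TARGET_H / factor)
    print(f"{name:<17} -> {w}x{h} internal")
```

So a 4K output in Performance mode is really a 1080p render plus reconstruction, which lines up with a 2070 Super managing 4K/60.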
 
Yeah, it says FSR2 on XSX,
and Unreal's temporal upscaler on XSS.

Wouldn't this be not only the first time we've seen FSR2, but also the first time the upscaling differs between consoles?
Is Unreal's upscaler cheaper?
XSS looks to have a locked 60 from what I saw; XSX didn't.
XSS has DRS
We've seen strong evidence that FSR 2.0 is reasonably compute-demanding, so it'd make sense to only use it on XSX.
 
We've seen strong evidence that FSR 2.0 is reasonably compute-demanding, so it'd make sense to only use it on XSX.
I believe they got it down to 2 ms; I don't know if that's on both XSS and XSX, though.

I'm all for using the best option not just per platform but per console within the same platform, like the Series X and S.
I just believe this is the first time we've seen something like this.
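To put that 2 ms figure in context, here's a back-of-envelope look at how much of the frame budget a fixed-cost upscaling pass eats at different frame-rate targets; the 2 ms number is the one quoted above, the rest is arithmetic:

```python
# Share of the frame budget consumed by a fixed-cost pass, e.g. the ~2 ms
# FSR 2.0 figure quoted above. A pass that is modest at 30 fps becomes a
# quarter of the budget at 120 fps, which is why a weaker GPU might skip it.
PASS_MS = 2.0

for fps in (30, 60, 120):
    budget_ms = 1000.0 / fps
    share_pct = PASS_MS / budget_ms * 100
    print(f"{fps:>3} fps: {budget_ms:5.2f} ms budget, upscaler = {share_pct:4.1f}%")
```

At 60 fps that's roughly 12% of the frame gone before anything is rendered, which would explain falling back to Unreal's cheaper upscaler on XSS.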
 
A Plague Tale: Requiem is a very heavy game: 1440p at 30/40 fps on XSX and PS5, 1080p at 30/40 fps on XSS, and the performance on PC shows it too. The game runs better on the Xbox consoles. It's GPU-bound, and it doesn't use ray tracing; a ray tracing patch will arrive on PC. But nothing a 4090 can't solve. I think that will be the only GPU to hit 4K 60+ fps with the ray tracing patch.



It's been a long time since I've been this impatient for a Digital Foundry performance analysis video, but I was curious about the performance. Now I'll wait for DF's coverage.

EDIT: And the funny part is that with the 4090 it's hard to tell whether the game is GPU-bound or CPU-bound, the GPU is so powerful.
 
Why is IGN running this on a Ryzen 2700 CPU with a 2070 as the GPU? I'll wait for the DF video regarding performance.
Does IGN ever read the YT comments so they notice these things and can improve their findings?
 
He found his by-now world-renowned 2070 to output an average of 35 frames at 4K/high. Naturally, the PS5 is 35% faster at "same settings". Then there's the vanilla 6800, which he claims is twice the power of a 2070 when in reality it's around 50% faster at 4K; with that card he gets near 60 frames on average, at ultra this time, at 4K. How does a card that's 50% faster than a 2070 output 70% more performance at higher quality levels?

I so wish Alex would handle DF's video for this game :D
 
He found his by-now world-renowned 2070 to output an average of 35 frames at 4K/high. Naturally, the PS5 is 35% faster at "same settings". Then there's the vanilla 6800, which he claims is twice the power of a 2070 when in reality it's around 50% faster at 4K; with that card he gets near 60 frames on average, at ultra this time, at 4K. How does a card that's 50% faster than a 2070 output 70% more performance at higher quality levels?

I so wish Alex would handle DF's video for this game :D
The 6800 is about 70% faster than the 2070 at 4K.
 
The 6800 is about 70% faster than the 2070 at 4K.
I looked at the launch review from ComputerBase since they have nice graphs.


They don't show the vanilla 2070, but the 6800 is:

+51% vs the 1080 Ti (which I think was in the ballpark of a 2070 at the time of that review)
+45% vs the 2070 Super, which is probably what we should use, since his OC'd 2070 is practically a Super already, no? :)

Did the performance gap between a 2070 and a 6800 grow in the meantime? Even so, how is it 70% faster than his 2070, at higher detail settings? Is it another case of his 2070 setup severely underperforming compared to other similar setups? I'm sure folks on another forum will start comparing his scenes with their setups tomorrow when the game launches.
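One thing worth noting: relative-performance percentages chain multiplicatively, so +45% over a 2070 Super can still land close to +70% over a vanilla 2070. A quick check, where the ~15% Super-over-vanilla gap is my assumption and the +45% is the ComputerBase figure above:

```python
# Relative performance chains multiplicatively, not additively.
r_6800_vs_2070s = 1.45   # +45% vs 2070 Super (ComputerBase figure above)
r_2070s_vs_2070 = 1.15   # ~+15% 2070 Super vs vanilla 2070 (my assumption)

r_6800_vs_2070 = r_6800_vs_2070s * r_2070s_vs_2070
print(f"6800 vs vanilla 2070: +{(r_6800_vs_2070 - 1) * 100:.0f}%")  # ~+67%
```

So the "about 70% faster" figure and the ComputerBase numbers aren't actually in conflict, though an overclocked 2070 would shrink the gap somewhat.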
 
He found his by-now world-renowned 2070 to output an average of 35 frames at 4K/high. Naturally, the PS5 is 35% faster at "same settings". Then there's the vanilla 6800, which he claims is twice the power of a 2070 when in reality it's around 50% faster at 4K; with that card he gets near 60 frames on average, at ultra this time, at 4K. How does a card that's 50% faster than a 2070 output 70% more performance at higher quality levels?

I so wish Alex would handle DF's video for this game :D

From the TechPowerUp 6800 review: [chart: relative-performance_2560-1440.png]
 
I looked at the launch review from ComputerBase since they have nice graphs.


They don't show the vanilla 2070, but the 6800 is:

+51% vs the 1080 Ti (which I think was in the ballpark of a 2070 at the time of that review)
The 1080 Ti is around the performance of a 2070S now. It used to be around the performance of a 2080 back in 2018.
+45% vs the 2070 Super, which is probably what we should use, since his OC'd 2070 is practically a Super already, no? :)

Did the performance gap between a 2070 and a 6800 grow in the meantime? Even so, how is it 70% faster than his 2070, at higher detail settings? Is it another case of his 2070 setup severely underperforming compared to other similar setups? I'm sure folks on another forum will start comparing his scenes with their setups tomorrow when the game launches.
[chart: relative-performance_3840-2160.png]
 
Same drama with the exact same theme... Jesus.
His math is definitely off: the 6800 is 70% faster than the 2070, not 100% as he claims. It still ends up 25% faster than the PS5, so something is off there. We still need a few patches, though; PS releases often get much better after a while. No HDR is a shame.
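The 25% figure falls out of dividing the two ratios; a quick sanity check using the numbers in this thread (the +35% PS5-over-2070 figure is from the IGN comparison discussed above):

```python
# If the PS5 lands ~35% above his 2070 and a 6800 should land ~70% above it,
# the expected 6800-vs-PS5 gap is the ratio of the two.
r_ps5_vs_2070 = 1.35
r_6800_vs_2070 = 1.70

r_6800_vs_ps5 = r_6800_vs_2070 / r_ps5_vs_2070
print(f"6800 vs PS5: +{(r_6800_vs_ps5 - 1) * 100:.0f}%")  # ~+26%
```

So the 6800 result is roughly where you'd expect; it's the doubling claim, and possibly his 2070 baseline, that don't add up.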
 
His math is definitely off: the 6800 is 70% faster than the 2070, not 100% as he claims. It still ends up 25% faster than the PS5, so something is off there. We still need a few patches, though; PS releases often get much better after a while. No HDR is a shame.
Could be another case of VRAM limitation. In the video itself, he shows how the game breaches past 8 GB with ultra settings/textures. It could be a case of another breach even with high textures. Clearly, an 8 GB budget is hugely problematic at native 4K, especially in these recent titles.
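For a sense of where the memory actually goes at native 4K, here's a back-of-envelope sketch; the render-target list and formats are illustrative assumptions, not this game's actual setup:

```python
# Rough render-target VRAM at native 4K. The target list and per-pixel costs
# are illustrative assumptions, not this game's actual G-buffer layout.
W, H = 3840, 2160
TARGETS = {
    "albedo (RGBA8)": 4,           # bytes per pixel
    "normals (RGBA16F)": 8,
    "material (RGBA8)": 4,
    "HDR colour (RGBA16F)": 8,
    "depth/stencil (D32S8)": 5,
    "motion vectors (RG16F)": 4,
}

total = sum(bpp * W * H for bpp in TARGETS.values())
print(f"~{total / 2**20:.0f} MiB for render targets alone")
```

Even at 4K the targets themselves come to only a few hundred MiB; the real pressure is the texture streaming pool on top, which is why dropping from ultra to high textures is what relieves an 8 GB card.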
 
I've had a flick through his video; it just seems like a shit port to me.

LOD issues, draw distance on Ultra lower than on PS5, texture bugs, excessive VRAM use.

And is it fair to benchmark an RTX 2070, a GPU meant for 1080p/1440p gaming, at native 4K? A resolution it was never intended for.

It's 4 years old, and the PC space has moved on massively in performance at its price range.
 
It is not impossible that this game just doesn't run well, comparatively, on Nvidia GPUs.

A 6800 should be nowhere near 2x the performance of an overclocked 2070, as has been pointed out.
 
It is not impossible that this game just doesn't run well, comparatively, on Nvidia GPUs.

A 6800 should be nowhere near 2x the performance of an overclocked 2070, as has been pointed out.
He wasn't talking about this game specifically. He was saying the 6800 is in general twice the performance of the 2070, which is false. You need a 6800 XT/3080 for that.
 
Could be another case of VRAM limitation. In the video itself, he shows how the game breaches past 8 GB with ultra settings/textures. It could be a case of another breach even with high textures. Clearly, an 8 GB budget is hugely problematic at native 4K, especially in these recent titles.
via Digital Trends

"Our 4K benchmarks show the video memory limitations of Uncharted Legacy of Thieves. Above the Low preset, the 8GB available in the RTX 3060 Ti became maxed out, so the results for the Medium, High, and Ultra presets are much tighter than they are at 1440p and 1080p"

 