Current Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

AMD filed a patent on reducing bandwidth contention in APUs back in 2016 or 2017, published in 2019. The memory controller was supposed to give priority to CPU memory requests, because the CPU is more latency sensitive than the GPU. It is somewhere on Era but too difficult to find.
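Purely as an illustration of the idea (my own toy sketch, not the mechanism actually described in the patent), a shared memory controller can favour latency-sensitive CPU traffic by always draining the CPU request queue before the GPU queue:

```python
import collections

# Toy model of a shared GDDR6 memory controller that services latency-sensitive
# CPU requests ahead of bandwidth-hungry GPU requests. Illustrative only; the
# real arbitration policy in the patent / silicon is not public at this level.
class PriorityMemoryController:
    def __init__(self):
        self.queues = {"cpu": collections.deque(), "gpu": collections.deque()}

    def submit(self, client, request):
        self.queues[client].append(request)

    def next_request(self):
        # The CPU queue is always drained first: a stalled CPU core tends to
        # cost more overall performance than a slightly delayed GPU fetch.
        for client in ("cpu", "gpu"):
            if self.queues[client]:
                return client, self.queues[client].popleft()
        return None

mc = PriorityMemoryController()
mc.submit("gpu", "texture fetch")
mc.submit("cpu", "cache-line fill 0x1f00")
print(mc.next_request())  # ('cpu', 'cache-line fill 0x1f00') despite arriving later
```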

Don't forget now there is an SSD accessing that pool of memory as well. Much higher bandwidth requirements from the storage side compared to the PS4 era. My guess is all the cache optimizations Cerny talked about are oriented to leave as much room as possible for the GPU. It will be interesting if Cerny gives a detailed "post-mortem" analysis in a couple of years, once there isn't really a need to keep most of this secret.
 
Next-generation consoles are probably on ultra too, I suppose.

From my now three years of testing PC settings vs. console: consoles are ultra across the board in maybe one out of 20 cases? For games with very few settings in general, maybe. Consoles are usually medium + high, and some things lower than low. You can see this quite easily if you go across all of my videos.
I think this will still be the case this generation too, of course, as console developers seem resolution greedy - pushing resolution higher than is honestly reasonable at times at the sacrifice of GPU visual settings. We already have examples of very sub-ultra settings (mainly medium + some lower than low) in Watch Dogs Legion. That game has what I would call "real" ultra settings though... not like other games where the performance cost of ultra is basically another game's "medium".
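As a rough illustration of why that tradeoff hurts (my own back-of-the-envelope numbers, assuming raster cost scales roughly with pixel count, which is only an approximation):

```python
# Rough pixel-count comparison; raster cost scales very roughly with pixel count.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440),
               "1800p": (3200, 1800), "2160p": (3840, 2160)}
base = 2560 * 1440
for name, (w, h) in resolutions.items():
    print(f"{name}: {w*h/1e6:.1f} MP, {w*h/base:.2f}x the pixels of 1440p")
# 2160p pushes ~2.25x the pixels of 1440p - GPU time that could otherwise
# be spent lifting settings above medium.
```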
 
Did you happen to note what the console settings in Watch Dogs were, outside of RT?
 

I was only speaking about the normal mode of DMC5, not the RT one*, but maybe some settings do sit below PC ultra, like ambient occlusion. I think AMD trails behind NVIDIA in RT but not in rasterization, if we believe what some reviewers are hinting.

*When DMC5 SE releases on PC, I expect Nvidia cards to be far above the consoles, and above AMD PC GPUs, in RT.

EDIT: For RT on consoles, I think it will be used sparingly in future titles: shadows or reflections, maybe shadows + reflections in some titles, but I doubt it. GI will be done with other methods, like Lumen in UE5 or the approach in Demon's Souls.
 
AMD tech demo x DX12 Ultimate this Thursday:
Coming November 19, the "Hangar 21" Technology Demo Video will let you see the breakthrough AMD RDNA™ 2 gaming architecture in action, the foundation of the AMD Radeon™ RX 6000 Series graphics cards that power the next generation of gaming with mind-blowing visuals featuring realistic lighting, shadows, and reflections enabled by AMD FidelityFX and Microsoft® DirectX® 12 Ultimate.
https://www.amd.com/en/technologies/radeon-software-fidelityfx
 
But the SSD would only really take 5-10 GB/s out of that ~450 GB/s total, no?
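A quick sanity check on that fraction, assuming the commonly quoted PS5 figures of 448 GB/s GDDR6 bandwidth, 5.5 GB/s raw SSD reads and roughly 9 GB/s typical after decompression:

```python
# Back-of-the-envelope: share of total GDDR6 bandwidth the SSD stream could occupy.
gddr6_bw = 448.0          # GB/s, PS5 unified memory bandwidth
ssd_raw = 5.5             # GB/s, raw SSD read rate
ssd_compressed = 9.0      # GB/s, typical rate after Kraken decompression
for label, bw in (("raw", ssd_raw), ("compressed", ssd_compressed)):
    print(f"{label}: {bw / gddr6_bw:.1%} of total memory bandwidth")
# ~1-2% - small on its own, though the I/O writes still contend with
# CPU/GPU traffic at the memory controller.
```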
 
And the PS5 has many bandwidth saving features that the 2080 mostly doesn't have access to, so that should, I think, probably easily offset having to share with the CPU?

What features are you referring to? I'm not aware of anything custom for bandwidth saving in relation to a dGPU. If you're thinking of the cache scrubbers and coherency engines, my understanding is that they're more to reduce the load on CPU and GPU in maintaining cache coherency and avoiding regular flushes.
 

That's due to marketing pressure, unfortunately, and I highly doubt it's the developers' decision in most cases when it comes to resolution targets. Slapping "4K" on a box or advert is far easier for the general public to understand than explaining increased visual fidelity. I am always for giving the consumer more options, like unlocking more PC options in console options menus so the user can pick and choose what matters to them... so you don't get a mess like DMC5 regurgitated.
 

A flush means reloading data from memory, which consumes bandwidth and adds CPU or GPU load.
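To illustrate why that matters, here is a toy comparison of a full cache flush versus a targeted scrub of just the lines covering a range the I/O engine overwrote - my own sketch of the general idea, not how the PS5 cache scrubbers are actually documented to work:

```python
# Toy GPU cache model: compare refetch traffic after a full flush vs. a targeted
# "scrub" that only invalidates lines overlapping a range the SSD just wrote.
LINE = 64  # bytes per cache line

class ToyCache:
    def __init__(self, capacity_bytes):
        self.lines = set(range(0, capacity_bytes, LINE))  # resident line addresses

    def full_flush(self):
        evicted = len(self.lines)
        self.lines.clear()
        return evicted * LINE  # bytes that must be refetched if still needed

    def scrub(self, start, size):
        stale = {a for a in self.lines if start <= a < start + size}
        self.lines -= stale
        return len(stale) * LINE

cache = ToyCache(4 * 1024 * 1024)          # 4 MiB of hot GPU cache
print("full flush refetch:", cache.full_flush(), "bytes")

cache = ToyCache(4 * 1024 * 1024)
print("targeted scrub refetch:", cache.scrub(0x100000, 256 * 1024), "bytes")
# The scrub only invalidates the 256 KiB that actually went stale, so far less
# data has to be re-read over the bus afterwards.
```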
 
I would love 1080p 30fps with as many graphical features as possible in console games.

I'm more of a photo mode man than a gamer now.

Well, use the photo modes then. Don't ask for 30fps. If they keep seeing people asking for 30fps they might go back to that insanity.

Knowing developers' past behavior, I'm surprised to be witnessing 60fps dynamic modes at launch. Who am I kidding, we are only getting 60 now because most titles are cross-gen.

The PS360 era was soooo bad for framerates that new gamers "learned" to ignore the stuttering and tearing galore. I am seeing those same gamers on GAF asking how to disable the "soap opera effect" after enabling the 120fps modes.

The PS4/Xbone generation was a huge opportunity to break the cycle, but alas the same unstable 30fps experiences prevailed. The gamers from the PS360 era have probably graduated to become the developers of today's titles, so we might never get rid of 30fps. The taste for 60fps and above is a PC gamer / vintage console gamer thing.
 