Current Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

https://www.prometheanai.com/

Game industry veteran and former Technical Art Director for Sony Interactive, Andrew pushed cutting-edge pipelines that powered some of the most complex productions in the world. Featured in Develop and Forbes magazines' 30 Under 30 lists, he is an artist, programmer, consultant, entrepreneur, and speaker at computer graphics events across the globe. For years he has fought to democratize the creative process, support artists, and empower the creativity within every single person.

Now, as the Promethean AI team, we are embracing that challenge for good.
 
It's kind of interesting to think about the differences between the PS5 and the Xbox. If the CPU happens to be the bottleneck and there are 10% fewer frames, then perhaps the slightly slower GPU can actually render those frames at Xbox quality, i.e. same resolution, fewer frames. On the other hand, if the GPU is the bottleneck, then perhaps both consoles get the same number of frames but the PlayStation renders at a lower resolution. It will be interesting to see how things pan out in the end; the tradeoff will probably be quite game specific.

That said, many games today show very little difference between high and ultra settings, so who knows if small differences will even be visible. The PlayStation could run at the same resolution but with slightly worse shadows/reflections, and as we have seen, people don't notice or don't care (how much shit did the better ray-traced shadows/reflections get from people? The difference between the consoles will be much smaller than ray tracing on/off in PC games).
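
A rough way to picture that tradeoff, with made-up frame times (nothing measured, just illustrating that the slower of CPU/GPU sets the frame rate):

```python
# Toy model: with CPU and GPU work overlapped, the frame rate is set by
# whichever side takes longer. All numbers below are hypothetical.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when the slower of CPU/GPU dictates frame time."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# CPU-bound: a ~10% slower GPU still finishes inside the CPU's frame time,
# so both machines end up at the same resolution and the same frame rate.
print(fps(cpu_ms=20.0, gpu_ms=15.0))   # ~50 fps
print(fps(cpu_ms=20.0, gpu_ms=16.5))   # still ~50 fps

# GPU-bound: the ~10% slower GPU either gives up roughly 10% of the frames...
print(fps(cpu_ms=10.0, gpu_ms=16.0))   # ~62 fps
print(fps(cpu_ms=10.0, gpu_ms=17.6))   # ~57 fps
# ...or keeps the frame rate by rendering roughly 10% fewer pixels per frame.
```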
 
So what is Alex supposed to be wrong about? I just read through that thread by Andrew Maximov and I don't think I've seen the DF guys argue against any of that. I'm assuming this is just a thread posting battle now without context?
He's basically saying Alex is wrong that the XSX is superior.
Another developer disagrees and his friends say that it is.

meh
 
Console fans right now on every single forum

 
It's kind of interesting to think about the differences between the PS5 and the Xbox. If the CPU happens to be the bottleneck and there are 10% fewer frames, then perhaps the slightly slower GPU can actually render those frames at Xbox quality, i.e. same resolution, fewer frames. On the other hand, if the GPU is the bottleneck, then perhaps both consoles get the same number of frames but the PlayStation renders at a lower resolution. It will be interesting to see how things pan out in the end; the tradeoff will probably be quite game specific.

That said, many games today show very little difference between high and ultra settings, so who knows if small differences will even be visible. The PlayStation could run at the same resolution but with slightly worse shadows/reflections, and as we have seen, people don't notice or don't care (how much shit did the better ray-traced shadows/reflections get from people? The difference between the consoles will be much smaller than ray tracing on/off in PC games).

Console games tend to give themselves a margin (at least the good ones), so frame drops below the target are rare. All they're going to do is leave a little more margin on the PS5 CPU in case the clock drops, and that will be the base target for next gen. Basically, operate under the assumption that the clock will drop a little even if it doesn't.
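
As a trivial sketch of that margin idea (the 10% figure below is just an illustrative assumption):

```python
# Target a frame budget a bit tighter than the display interval, so an
# occasional spike (or a small clock drop) doesn't turn into a missed vsync.
# The 10% margin here is an illustrative assumption, not a known figure.

TARGET_FPS = 30
DISPLAY_INTERVAL_MS = 1000.0 / TARGET_FPS   # 33.3 ms per refresh
SAFETY_MARGIN = 0.10                        # budget as if 10% of the time isn't there

design_budget_ms = DISPLAY_INTERVAL_MS * (1.0 - SAFETY_MARGIN)
print(f"Build the frame to fit {design_budget_ms:.1f} ms "
      f"even though the display allows {DISPLAY_INTERVAL_MS:.1f} ms")
```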
 
So I looked at the post from Alex and it's an unrelated topic. He's posting about GPU performance for RT and shading etc. The Naughty Dog guy is posting about asset streaming. They have nothing to do with each other. There's no conflict ...
I don't think the two threads are related. He does talk about asset streaming, but then he's cc'd on a fanboy rant about how it doesn't matter, and he then says he disagrees with Alex's assessment.
 
The jump from 2070/2080-level performance to 2080 Super/2080 Ti level is large enough for some to choose the latter.

I don't care about personal preferences. I'm curious about the technical differences and how visible they would be in real life and also in frame to frame comparisons.
 
Console games tend to give themselves a margin (at least the good ones), so frame drops below the target are rare. All they're going to do is leave a little more margin on the PS5 CPU in case the clock drops, and that will be the base target for next gen. Basically, operate under the assumption that the clock will drop a little even if it doesn't.

This is the first generation where we can start to rely on variable refresh rate. Not having to lock to 30/60/... is a game changer. My life changed when I got a monitor with FreeSync.
 
The jump from 2070/2080-level performance to 2080 Super/2080 Ti level is large enough for some to choose the latter.
People tend to focus too much on the TF number and forget other parameters, chief among them the memory bandwidth available to the GPU.

The Xbox Series X's teraflops do put it above the 2080 Super on paper, but the memory bandwidth available to it puts it below that mark, and significantly so. The Series X has at most 560GB/s shared between the CPU and GPU, which means the GPU will see less than that, maybe much less once you factor in CPU/GPU memory contention; the 2080, on the other hand, has its full 448GB/s to itself. That's why you shouldn't be surprised if the Series X falls between the RTX 2070 Super and the RTX 2080 in practice.

It's worse for the PS5: it has 448GB/s shared between the CPU and GPU, while the 2070 and 2070 Super have that same 448GB/s all to themselves. That's why I expect its performance to fall between those two.

Then there is the fact that NVIDIA reports TF differently: NVIDIA calculates TF from the rated boost clock, which is conservative, but its GPUs usually boost higher in practice. A 2080 is 10TF at ~1700MHz, but it routinely boosts to around 1900MHz, making it effectively an 11TF GPU. The same goes for other GPUs, whether the 2080 Ti, 2070, 5700 XT, etc.
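
For what it's worth, the back-of-the-envelope math behind those numbers looks roughly like this (public specs; the ~1.9GHz "typical boost" is an assumption and varies per card and workload):

```python
# FP32 TFLOPS = shader cores * 2 ops per clock (FMA) * clock in GHz / 1000.
def teraflops(shader_cores: int, clock_ghz: float) -> float:
    return shader_cores * 2 * clock_ghz / 1000.0

# RTX 2080: 2944 CUDA cores.
print(teraflops(2944, 1.71))   # ~10.1 TF at the rated boost clock
print(teraflops(2944, 1.90))   # ~11.2 TF at a typical real-world boost (assumption)

# Peak memory bandwidth in GB/s. On the consoles it is shared with the CPU;
# on the discrete cards it belongs to the GPU alone.
bandwidth_gbps = {
    "Xbox Series X (10GB GPU-optimal pool, shared)": 560,
    "PS5 (16GB pool, shared)": 448,
    "RTX 2080 (dedicated)": 448,
    "RTX 2070 Super (dedicated)": 448,
}
for name, gbps in bandwidth_gbps.items():
    print(f"{name}: {gbps} GB/s")
```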
 
This is the first generation where we can start to rely on variable refresh rate. Not having to lock to 30/60/... is a game changer. My life changed when I got a monitor with FreeSync.

Yes, this is usually forgotten by the "lol 12 > 10 lol" people.

It only works on new HDMI 2.1 TVs, but once those are common, if both versions run at 4K and are otherwise identical, with one at around 55-65fps and the other at 45-55fps, VRR will make them feel almost equal, I think.

And anyway, the biggest differences in visuals come from how good the devs are and how much time they spend. Many people don't even see the difference between 720p and 1080p, and plenty of people play games on the Switch that run at 360p, so resolution clearly isn't the most important thing on earth.

I would even assume that 3rd-party games will be 95% identical to the naked eye, unless the devs mess up one of the versions.

And consoles are sold by exclusives; those who buy them for 3rd-party games probably don't care. They just want their FIFA/CoD and that's it.
 
This is the first generation where we can start to rely on variable refresh rate. Not having to lock to 30/60/... is a game changer. My life changed when I got a monitor with FreeSync.

This is true. I have a VRR monitor and I love it. But what % of users do you think will have VRR TVs? Probably very, very few. If anything, games will have uncapped modes that let you run above 30fps or above 60fps, but they'll still be designed with headroom for a stable 30 or 60. That 2% frequency boost on the GPU, if that's really the variable range, won't even gain you 1fps.
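
Quick arithmetic on that ~2%, assuming frame rate scaled perfectly linearly with clock (in practice it scales less than that):

```python
# Best case: frame rate scales linearly with a ~2% GPU clock bump.
for base_fps in (30, 60):
    boosted = base_fps * 1.02
    print(f"{base_fps} fps -> {boosted:.1f} fps (+{boosted - base_fps:.1f})")
# 30 fps -> 30.6 fps (+0.6)
# 60 fps -> 61.2 fps (+1.2)
```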
 
This is true. I have a VRR monitor and I love it. But what % of users do you think will have VRR TVs? Probably very, very few. If anything, games will have uncapped modes that let you run above 30fps or above 60fps, but they'll still be designed with headroom for a stable 30 or 60. That 2% frequency boost on the GPU, if that's really the variable range, won't even gain you 1fps.

If you don’t have a vrr tv by next gen, I mean are you even a real gamer?
 