Current Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

What motivates people to lie and make up stuff like that?
Not capturing the exact same spots and times can have a big impact.

I also saw some slightly different shadows in the DF video. The branch shadows of some trees were different (~in the scene where Brag kills the other Viking with his shield), but I guess these are more or less randomly generated, or it was a small difference in sun position.
 
On paper Series X has 3 times the TFLOPs, so 900p would be too much, everything else being the same.
MS does not want 720p-ish launch titles on their new S console, to avoid ridicule, and that is why they took away 60 fps.

1440p has ~2.56 times the pixels of 900p.
If PS5 can make do with 10 TFLOPs, the S should be able to hit it with 4 TFLOPs with little problem, maybe with some more aggressive scaling.
We probably shouldn't demand a butter-smooth 60 fps, but since both big brothers can pretty much lock it in most scenarios with few issues, it suggests they usually land around 70-80 fps.

Not doing that suggests that there's currently a deficiency with either the architecture or the software stack.
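As a rough sanity check on that scaling argument, here is a back-of-the-envelope sketch only, assuming GPU load scales linearly with pixel count and that settings are otherwise identical:

```python
# Back-of-the-envelope resolution/TFLOP scaling check.
# Assumes GPU load scales linearly with pixel count and identical settings.
px_1440p = 2560 * 1440   # 3,686,400 pixels
px_900p  = 1600 * 900    # 1,440,000 pixels

print(px_1440p / px_900p)                # ~2.56x the pixels of 900p

ps5_tflops = 10.3                        # roughly the PS5 figure quoted above
print(ps5_tflops * px_900p / px_1440p)   # ~4.0 TFLOPs to push 900p at the same settings
```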
 
Ah, so 32 double-pumped ROPs would have the raw throughput of 64, but you would lose efficiency on small tris?

So Sony - who may not have VRS - could be using an RDNA 1-like arrangement of 64 'single pumped' ROPs, and they would probably need that for 4 Pro compatibility. This might lend itself better to a higher effective fillrate, particularly with small tris.

Do you think the double pumped RBs can still output up to 8 pixels per clock when not using a VRS pattern, or does it likely drop to only 4 (4 × 8 for 32 pixels/cycle)?
It’s not exactly clear lol. I don’t know how double pumping works with respect to triangles. Typically FF units have issues with small triangles. IIRC they are set up so that 16 pixels per triangle is the smallest optimal triangle size they can write out before losing performance (RDNA1). I will need to check RDNA2, but I don’t think it’s a small-triangle problem, otherwise we would see cutscenes always grind to a halt.
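For the raw-throughput half of the question, here is a minimal sketch of how peak pixel fillrate falls out of pixels-per-clock × clock. The backend configurations and clocks below are illustrative assumptions, not confirmed specs, and this ignores the small-triangle efficiency question entirely:

```python
# Peak pixel fillrate = pixels written per clock * clock rate.
# The configurations below are illustrative assumptions, not confirmed specs.
def peak_fillrate_gpix_s(pixels_per_clock, clock_ghz):
    return pixels_per_clock * clock_ghz

# 32 "double pumped" backends writing 2 pixels per clock each, at 1.825 GHz
print(peak_fillrate_gpix_s(32 * 2, 1.825))   # ~116.8 Gpix/s

# 64 "single pumped" ROPs at a higher 2.23 GHz clock
print(peak_fillrate_gpix_s(64, 2.23))        # ~142.7 Gpix/s
```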
 
Interesting. Seems Matt and Fafalada are in agreement about when GDK access became available for devs. It wasn't June of this year as most believed.

Per Matt (Dev)

Per Fafalada (Dev)

Matt:
Like I stated before, both of these systems will be quite close in performance. Something all gamers should be happy about.
I don’t think in the history of (recent shared-architecture) console launches we’ve seen frame dips of 50% in random areas, however. XBO was the worst case for tools, and we didn’t see that type of behaviour; we saw a bunch of 720p instead. I think even if there were occurrences, it didn’t manifest like that for the entire generation.

So I’m not really sure what is happening, but I doubt it’s architecture related (it seems more random than consistent at the moment).

While he is right that the two will trade blows (at least we can see that now), he provides no good explanation for the 30-50% frame drops, which is really the area of debate around whether the GDK is responsible or not. So I would be hesitant to call this an architectural problem.

Imagine looking at frame graphs as a dev: the pipeline does the exact same thing for 8.88 ms, and suddenly one part of it runs for another 8.88 ms on its own. Seems... off, or extremely suboptimal.
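To put numbers on that frame-budget point, a quick sketch. The 8.88 ms figure is just the one quoted above; a 60 fps target and double-buffered v-sync behaviour are assumptions:

```python
import math

def vsync_fps(frame_ms, refresh_hz):
    # Delivered rate when every frame has to land on a refresh boundary
    # (double-buffered v-sync): a late frame waits for the next whole interval.
    interval_ms = 1000.0 / refresh_hz
    return refresh_hz / math.ceil(frame_ms / interval_ms)

# Pipeline normally finishes in ~8.88 ms, comfortably inside a 16.67 ms budget.
print(vsync_fps(8.88, 60))           # 60.0 fps

# One stage suddenly takes another ~8.88 ms on its own: the frame misses
# the boundary and the delivered rate collapses.
print(vsync_fps(8.88 + 8.88, 60))    # 30.0 fps on a 60 Hz output
print(vsync_fps(8.88 + 8.88, 120))   # 40.0 fps on a 120 Hz output
```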
 
There was an odd case of a Madden game that was 30 fps on PS3 and 60 fps on X360, but the next game a year later was 60 fps on both.
PS3 games in the first year or so used to run with 20% fewer pixels on screen, grass and smoke cut out, no MSAA, frame-rate dips... generally very, very bad. It was shocking at the time, so much so that Sony completely overhauled their (at that time absolutely horrid) tools and sent the ICE team from studio to studio to help out.

Devs knew where most issues came from on the PS3 side: along with the tools, it had less memory and it was severely limited by its fixed vertex/pixel pipeline versus the unified shaders on the 360. Not until devs started culling triangles on the SPEs before sending them to the RSX did we see the PS3 come back to life... well, sort of.
 
I know it would be even more work, but I somehow miss the PS4 Pro and Xbox One X versions in this video, especially to compare them with the Series S. But I guess DF is still making such a video and currently just has too much to do.
 
It's these sudden framerate drops on XSX that are perplexing, especially in the 120 fps modes in DMC 5 and COD, I think? But also the ~30% drop in AC Valhalla (it's not 15%; that's a calculation error, as the PS5 has a ~14 fps advantage out of 60 fps).
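The 15% vs 30% confusion is mostly about which number you divide by. A quick sketch with hypothetical figures (46 vs 60 fps, i.e. a 14 fps gap):

```python
# A 14 fps gap reads very differently depending on the baseline.
# The 60 and 46 fps figures are hypothetical, only to illustrate the point.
ps5_fps, xsx_fps = 60, 46

print((ps5_fps - xsx_fps) / ps5_fps)   # ~0.23 -> XSX is ~23% below PS5
print((ps5_fps - xsx_fps) / xsx_fps)   # ~0.30 -> PS5 is ~30% above XSX
```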

Between it struggling in 120 fps modes and the drops in Valhalla seeming to come once there are multiple NPCs on screen... it has to be CPU related, no?
 
Between it struggling in 120 fps modes and the drops in Valhalla seeming to come once there are multiple NPCs on screen... it has to be CPU related, no?
I don't think that's such an easy guess this gen. Last gen the CPUs were very weak, so any situation where a CPU load was necessary would be CPU limited, but these consoles have solid CPUs. I think it's pretty unlikely that any game running successfully on last gen at 30 fps is going to be CPU limited at 60 fps on these new consoles (short of a bad bug).
 
I don't get the impression that Matt guy is an engineer (certainly not a graphics dev), so I'm not sure what significance him being a real insider has. It's unlikely he's going to get to the bottom of horrible API-level bugs choking framerate, or the exact headaches caused by early tools, via water-cooler talk with his coworkers.
 
Right, but in some of these examples it's the 120 fps modes where the XSX is dipping. And in the case of Assassin's Creed, I don't know if that was ever a solid 30.

It wouldn't make sense given that XSX has the faster CPU, but I wonder if there is something going on at an optimization level.

Either way, I'm just speculating here given the nature of the way it is dipping on XSX...
 
The XSX SoC started verification in H1 2019 and finished in H2 2019. That means the specs have already been locked for more than a year. How could it be possible that there are no new tools?

Besides, how do we compare the "maturity" of PS5 tools and XSX tools?
 
Right, but in some of these examples it's the 120 fps modes where the XSX is dipping. And in the case of Assassin's Creed, I don't know if that was ever a solid 30.

It wouldn't make sense given that XSX has the faster CPU, but I wonder if there is something going on at an optimization level.

Either way, I'm just speculating here given the nature of the way it is dipping on XSX...


Yeah, it definitely looks like a CPU issue. Someone mentioned the SMT toggle that XSX has for developers; could it be that these developers just decided not to turn it on, or thought the higher clocks without SMT were better? Has there been a Dirt 5 comparison? I know that developer specifically mentioned using SMT on the CPU. It would be interesting to see the 120 Hz mode from that game compared.
 
The XSX SoC started verification in H1 2019 and finished in H2 2019. That means the specs have already been locked for more than a year. How could it be possible that there are no new tools?

Besides, how do we compare the "maturity" of PS5 tools and XSX tools?
You're talking about hardware dev kits.

My guess is that the PS5 tools are a continuation of the PS4 ones.
That's not the case with the new Xbox ones.

Think of it like the dashboards, but in reverse:
the PS5 dashboard is new,
the XSX|S dashboard is a continuation.
There are fewer things to go wrong and to do when it's a continuation. If it's new, you have to rewrite and implement whole sections of the code base.

The tools and environment are a lot more mature on PS5.

I think people hear that and assume others are downplaying the PS5 or something,
when it's really about why Xbox is underperforming, especially with the huge dips in fps etc.
 
Yeah, it definitely looks like a CPU issue. Someone mentioned the SMT toggle that XSX has for developers; could it be that these developers just decided not to turn it on, or thought the higher clocks without SMT were better? Has there been a Dirt 5 comparison? I know that developer specifically mentioned using SMT on the CPU. It would be interesting to see the 120 Hz mode from that game compared.

This is what I speculated in the other thread. Curious to see a Dirt 5 comparison.

Technically, even with SMT disabled, XSX would have a 300 MHz advantage over the PS5 CPU, but I'd think a console game optimized for more CPU threads would have the advantage.

Of course, we have no developer proof that SMT was not used for the other games.
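A rough way to frame the SMT-versus-clocks tradeoff (the 3.8/3.6/3.5 GHz clocks are the published figures; the ~25% SMT uplift is a rule-of-thumb assumption, not a measurement, and real scaling is very workload dependent):

```python
# Crude throughput estimate for a well-threaded workload.
# The ~25% SMT uplift is an assumed rule of thumb, not a measured figure.
def relative_throughput(clock_ghz, smt_enabled, smt_uplift=1.25):
    return clock_ghz * (smt_uplift if smt_enabled else 1.0)

print(relative_throughput(3.8, smt_enabled=False))  # XSX, SMT off -> 3.80
print(relative_throughput(3.6, smt_enabled=True))   # XSX, SMT on  -> 4.50
print(relative_throughput(3.5, smt_enabled=True))   # PS5, SMT on  -> 4.375
```

Under that (very crude) assumption, a title that scales across many threads would prefer SMT on despite the lower clock, while a few-big-threads workload might not.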
 
https://www.game-debate.com/news/29...c-performance-report-graphics-card-benchmarks

The 2080 at 1440p, high settings: the "1% low fps" of the RTX 2080 is 41.4 fps, and the average is 61.5 fps.

ACV runs at 1728p~1440p on PS5 and XSX, assuming the consoles use medium~high settings.


It doesn't seem like the XSX "underperforms" compared with the 2080.

It looks like the PS5 is exceptional and "outperforms" the RTX 2080, which is very surprising.
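One crude way to line those numbers up across different resolutions is pixel throughput, i.e. pixels rendered per second. This is a sketch only: it assumes comparable settings and ignores dynamic-resolution averages and CPU limits, and the console figures are just the rough ones quoted above:

```python
# Rough pixel-throughput comparison: width * height * fps.
# Assumes comparable settings; ignores dynamic-res averages and CPU limits.
def mpix_per_sec(width, height, fps):
    return width * height * fps / 1e6

print(mpix_per_sec(2560, 1440, 61.5))   # ~226.7 Mpix/s  RTX 2080 average from the benchmark above
print(mpix_per_sec(3072, 1728, 60))     # ~318.5 Mpix/s  a console near its 1728p upper bound
print(mpix_per_sec(2560, 1440, 60))     # ~221.2 Mpix/s  a console at its 1440p lower bound
```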
It also looks like the RTX 2080 is underperforming here, but AC was never that great on PC; Odyssey, for example, also didn't run that well on PC.
The Xbox One X needed optimizations because its CPU was just too "small". The new DirectX 12 console has a much stronger CPU, so it doesn't need that much optimization to get the game running when ported from the PC version.
The PS5, with a whole different API, needed some more work and therefore more optimization; no simple PC port was possible.

And as always, DirectX 12 seems to be really hard for developers to get running properly. There are still so many games that don't use DX12 at all. There are also a few games that have a DirectX 12 option but perform better with DirectX 11 (if the CPU is fast enough). It all indicates that, after all these years, many developers still haven't warmed to DX12.
Btw, wasn't Watch Dogs Legion one of those games where DX11 mode runs better than DX12 mode?
 