Surely if a game is tailored to the PS5's strengths it wouldn't still perform better on the XSX?
If a game dev decides to go all out and optimize the heck out of the PS5, it would probably outperform the XSX by a decent margin, playing to the higher clocks, the cache architecture, maybe even the SSD speeds.
And to answer your question more directly, the PS5 is most likely the easier target to extract performance from for now. The narrow-and-fast GPU allows for that, as does not having to think about the XSX's 10GB fast-VRAM limit, which some games, especially sloppy ones, could exceed (like on PC).
I can imagine that 10TF is easier to sustain than the XSX's 12TF.
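For reference, here's roughly where those TF figures come from. A quick Python sketch using the publicly stated CU counts and peak clocks (64 shaders per CU, 2 FLOPs per FMA); these are theoretical peaks, not sustained throughput:

```python
# Rough back-of-envelope: FP32 TFLOPS = CUs * 64 shaders/CU * 2 FLOPs (FMA) * clock (GHz) / 1000
def tflops(cus, clock_ghz, shaders_per_cu=64, flops_per_clock=2):
    """Theoretical peak FP32 throughput in TFLOPS."""
    return cus * shaders_per_cu * flops_per_clock * clock_ghz / 1000

# Publicly stated CU counts and peak clocks
print(f"PS5 : {tflops(36, 2.230):.2f} TF")  # ~10.28 TF (36 CUs @ up to 2.23 GHz, variable)
print(f"XSX : {tflops(52, 1.825):.2f} TF")  # ~12.15 TF (52 CUs @ 1.825 GHz, fixed)
```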
It may just be that, in typical AMD fashion, CU scaling is poor.
Historically, ATI/AMD GPUs have always scaled poorly with CU count (HD 5850 vs 5870, HD 6950 vs 6970, etc.).
If you run Vega 56 and Vega 64 at the same clocks, they're within 1% of each other.
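To put numbers on that: Vega 64 has about 14% more CUs than Vega 56, so on paper you'd expect up to ~14% more throughput at matched clocks, yet the clock-for-clock gap is the ~1% quoted above. A small sketch of that paper math (the 1% figure is simply the one from the post, not a measurement of mine):

```python
# Theoretical uplift from CU count alone, at matched clocks
vega56_cus, vega64_cus = 56, 64

paper_uplift = (vega64_cus / vega56_cus - 1) * 100  # ~14.3% more shader hardware
observed_gap = 1.0  # ~1% clock-for-clock gap, as quoted above

print(f"Paper uplift from extra CUs: {paper_uplift:.1f}%")
print(f"Observed clock-for-clock gap: ~{observed_gap:.0f}%")
# The shortfall between the two is the 'poor CU scaling': extra CUs sit idle
# unless the front end and bandwidth can keep them fed.
```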
Yeah, maybe. RDNA2 seems to love high clocks, and the XSX is a bit of an outlier in the clock department for a 10+TF performance-class GPU. All RDNA2 dGPUs clock at least as high as the PS5, mostly even higher. And seeing that the PS5 is close to the 6600 XT (10.2TF) in gaming, it seems the PS5 is doing what it's supposed to be doing. It's what we suspected before: no one is punching above their weight; it's the XSX that should be performing better.
That's a very real possibility, as it's clearly an architecture built for speed.
It's what I can find: as a PC gamer I monitor and follow benchmarks, and the RDNA2 GPUs being tested are always clocked high. I tested a 6600 XT myself; it's quite close to the PS5, which tells me the PS5 is extracting its performance as it should.
Beforehand, like iroboto wrote, I thought the XSX would pull ahead in line with its specs, but right now it doesn't. It still might, though.
One could underclock an RDNA2 dGPU, say an RX 6800, and see what happens to performance. That's still 8 more CUs, and it has its own bandwidth/16GB of GDDR6, so it's hard to draw fair conclusions even then, but it would give an idea of how RDNA2 behaves at those clocks.
Now, is the Xbox Series X really that much more powerful? Who knows how it will shake out. I think we won't really know until either refresh systems come out around 2024 or we transition to the gen after this.
It technically is. Something is bottlenecking it, whether that's GPU saturation vs clock speeds, memory contention (not likely), or dev focus. It could be a combination of factors as well. Sony has historically had the better, closer-to-the-metal tools since the PS4 era, right?
Maybe the XSX GPU has its advantages in RT (CU saturation). The XSX Minecraft RT demo was quite impressive performance-wise. The problem is that the demo never made it out the door, so there's no benchmark to go by other than MS's own.
I've said before that I think this will be a faster generation than the last. Ray tracing is just really pitiful on both these consoles, and a jump to RDNA 3, which should be ready in 2022, or RDNA 4 in 2023/24, may be much smarter than just clocking these systems faster and adding more CUs. You also get the benefit of a much newer Ryzen processor and PCIe 5 to draw from. But that's a whole other conversation.
It's maybe 'pitiful', but not useless either. When optimized, what you're getting is quite nice. Look at Rift Apart: the RT reflections are quite convincing, especially considering the rest of the game is quite 'next gen' in fidelity and performance. But I'd agree there's much room to improve in ray tracing, while native resolution increases have taken a back seat (and I'm happy about that).
4K as a standard will be around for a long, long time to come.
Edit: and to say it again, the PS5 does come out as the 'winner' so far; it has lower specs but competes very well with the higher-specced box. However you look at it, that's a feat, and something I and others didn't expect back in the speculation threads here.
As for the true winner, there isn't one. Both compete, and that's maybe the best thing for all of us, right?