I think I misspoke a bit there on my part! In my brain I must have been saying one thing while thinking another.
Split memory and the lower-clocked GPU must be messing things up more than expected. Weird, Microsoft.
You can continue to use the XDK (the old devkit, iterated on since 2012) to produce Xbox One family titles, and it gives you the ability to detect whether you're running on Xbox Series X or S hardware. However, you cannot take advantage of new APIs or hardware features. I believe Sea of Thieves is still operating on the XDK. This allows for higher resolution or framerate options, but the title is still operating in "Backwards Compatibility" mode. You can see this in the Game Info section for each title. I think this is called something like "XDKAware", but I could easily be off on the name.
You can switch over to the GDK to produce Xbox One, Xbox Series, PC, and xCloud titles and gain access to the new APIs and hardware features.
There are obviously diminishing returns between "clearly not performing properly" and "perfect". Unless there's some huge unaccounted-for factor, the Xbox team can expect a lot more upside from tool optimization, because they're well below an adequate quality level right now.
Yeah, across developers, engines, genres we're seeing the same thing. It's the dips in performance that really stand out to me. Everyone seems to be having this problem.
So if a game is developed using XDK on Xbox Series consoles it's still technically in BC mode?
It's fascinating that around 11:30 in the video, DF note that Sony have deployed some counter-screen-tearing technology.
Here's the thing. Reliable sources over on resetera, i.e. multiplatform developers, have been saying all year that there's not much difference between the consoles - definitely not the 20% visuals delta that the theoretical maximum teraflops numbers would suggest. And this has been echoed by reliable sources like Jason Schreier, who talks to a lot of devs. Some folks have said PS5 has advantages. PS5 is running that GPU, the whole GPU and cache, much faster. In cases where games aren't utilising all the CUs and/or aren't close to needing maximum performance, why wouldn't PS5 turn in better performance? It would be kind of weird if it didn't. Expect both consoles to improve; whether Series X has more headroom to improve is something that may not be known for 2-3 years. ¯\_(ツ)_/¯
At least for high framerate modes, it might be a fixed function issue as the lower resolutions depend more on that than the CUs. There's still a limit at the main memory bandwidth, but perhaps for the ROPs the bandwidth compression is seeing better throughput.
If we assume they both have 64 ROPs for colour writes, then XSX only needs 467GB/s (1.825GHz * 64 writes * 32bpp per write), so it actually has a lot more bandwidth than it needs for that. PS5 would need 570GB/s at 2.23GHz. Bandwidth compression may provide up to an extra ~30% on average (448GB/s + 30% = 582GB/s), which puts PS5 at a big advantage in raw fill rate there.
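The back-of-the-envelope fill-rate numbers above can be reproduced with a quick script. To be clear, the 64-ROP counts, clocks, and the ~30% compression gain are this post's assumptions, not measured values:

```python
# Peak colour-write bandwidth demand: clock * ROPs * bytes per pixel.
def rop_bandwidth_gbs(clock_ghz, rops=64, bytes_per_pixel=4):
    """GB/s needed to keep the ROPs fed at full rate (32bpp writes)."""
    return clock_ghz * rops * bytes_per_pixel

xsx = rop_bandwidth_gbs(1.825)   # Series X GPU clock
ps5 = rop_bandwidth_gbs(2.23)    # PS5 GPU clock (peak)
print(f"XSX needs ~{xsx:.0f} GB/s")   # ~467 GB/s
print(f"PS5 needs ~{ps5:.0f} GB/s")   # ~571 GB/s

# PS5's 448 GB/s bus plus an assumed ~30% from colour compression:
ps5_effective = 448 * 1.30
print(f"PS5 effective ~{ps5_effective:.0f} GB/s")   # ~582 GB/s
```

The point the numbers make: XSX has bandwidth to spare for pure colour fill, while PS5's demand slightly exceeds its raw bus but fits once compression is factored in.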
Things may look a little different for blending rates, but there's evidence that transparency-heavy scenes are also at a disadvantage on XSX, so something funny is going on there, and there shouldn't be an API issue blocking such simple operations. Normally you'd see blend rates saturate the available bandwidth and XSX should have automagically won, but that's seemingly not what's being observed.
Things may look different again where 4xFP16 render targets (not pixel shader precision) are used for certain render passes, but the availability of different 32bpp precision formats may skew things back again as developers trade performance against pixel quality.
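One way to quantify the 4xFP16 point: doubling the bytes per pixel doubles the peak ROP bandwidth demand, pushing both consoles well past their memory buses. A rough sketch using the same assumed 64 ROPs and clocks quoted earlier in the thread:

```python
def rop_bandwidth_gbs(clock_ghz, rops=64, bytes_per_pixel=4):
    """GB/s needed to sustain peak colour writes at a given target size."""
    return clock_ghz * rops * bytes_per_pixel

# A 4xFP16 (RGBA16F) target is 8 bytes/pixel, double a 32bpp format
# such as RGBA8 or R11G11B10F (4 bytes/pixel).
xsx_fp16 = rop_bandwidth_gbs(1.825, bytes_per_pixel=8)
ps5_fp16 = rop_bandwidth_gbs(2.23, bytes_per_pixel=8)
print(f"64bpp target: XSX ~{xsx_fp16:.0f} GB/s, PS5 ~{ps5_fp16:.0f} GB/s")
# Both figures dwarf either console's memory bus (560 / 448 GB/s),
# so with fat render targets both GPUs sit bandwidth-bound at peak rate.
```

Which is why dropping to a packed 32bpp format where quality allows can swing the comparison back: it halves the demand and moves the bottleneck away from memory.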
I'll leave it there. :V
Since people keep using the phrase "reliable sources", who were the ones behind the 13TF PS5 Specifications? Weren't some trying to put that at the feet of Jason Schreier?
No, those were rabid fanboys. Jason Schreier is a self-confessed 'non-technical person'; I have only ever seen him repeat sentiments he's heard from developers. He's never delved into technical details that he himself does not understand. This forms part of his credibility.
As a journalist, I've never seen him claim to know more than he understands or ride coattails on information shared in confidence. And this is probably why developers feel comfortable talking to him and sharing information with him. This speaks volumes.
edit: and just to provide clarity, I think Microsoft absolutely know what they're doing with Series X. A bit like PS4 with its extra compute capacity for the future, I think Microsoft have planned for this, but also endowed Series X with a ton of overhead for high-bandwidth graphics work. We may not see the fruits of this for a few years. I can see wide vs fast trading blows (technical advantages) over the generation until devs develop techniques that work well on both architectures. The only way this doesn't happen is if Sony really do have some weird secret sauce, but I feel like we would have heard about it by now. Different approaches to the same problems, but Microsoft have more GPU headroom to stretch their legs.