Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

Status
Not open for further replies.
Then I can only assume the discussion delved into or became social media persona discussions that don't fit anywhere on this site, so the other mods opted for a "nuke from orbit" approach to cleansing.
Yes. The discussion was NXGamer, not game tech analysis. As London Geezer's post was innocuous enough and I heavily play favourites, I shall restore it and it alone, standing out of context of a best-forgotten vacuum of deleted arguing with no reference to the game he was asking about...

Happy now? :p
 
The idea of a 4060 being as powerful as the PS5 (sans memory) is... very interesting. I wish it were even more powerful, though. The PS5 is already on par with 2070-class GPUs. I had hoped Nvidia would be scaling PS5 performance down to a 4050 by now. The PS4 was already getting trounced by ultra-budget PC GPUs three years in, but it seems things are slowing down.


This is probably also sobering for the people who keep obsessing over the Switch 2 being secretly close to the power of the other consoles because of DLSS magic dust and ARM super CPUs that can defy the laws of physics. Even with the most optimistic process node shrink, Nintendo have a budget and limitations they have to abide by, to the point that actually chasing power becomes meaningless. Most important is battery life.
 
Last edited:
Nintendo have a budget and limitations they have to abide by, to the point that actually chasing power becomes meaningless. Most important is battery life.
In theory a very wide but slow chip could produce decent results for mobile. But the cost is extravagant, and it would be hard to hit the right clock speed: slow enough to preserve battery life, but not so slow that it struggles with small workloads.

Not going to happen. But it’s fun to make Diablo builds
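The "wide but slow" intuition can be sketched with the usual dynamic-power rule of thumb, P ≈ C · f · V², where lowering the clock usually also permits lowering the voltage. The 0.8x voltage figure below is an illustrative assumption of mine, not a measured number:

```python
# Toy model of the "wide but slow" trade-off.
# Dynamic power scales roughly with units * frequency * voltage^2,
# and a lower clock typically allows a lower supply voltage.

def relative_power(units: float, freq: float, voltage: float) -> float:
    """Relative dynamic power for a design, all quantities normalized to 1.0."""
    return units * freq * voltage ** 2

baseline = relative_power(units=1.0, freq=1.0, voltage=1.0)

# Twice the execution units at half the clock gives the same peak throughput,
# but the halved clock might run at, say, 0.8x voltage (illustrative number):
wide_slow = relative_power(units=2.0, freq=0.5, voltage=0.8)

print(wide_slow / baseline)  # 0.64: same throughput at ~64% of the power
```

The catch, as the post says, is that all those extra units cost silicon area (and money), and small workloads may not fill them.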
 
The launch-model Switch can only run BotW for around 1.5 hrs, which is pretty low for a portable console, and I guess this is also the bare minimum.
The main issue is memory bandwidth, which grows a lot more slowly than the raw compute power of the chip.
A quick computation: Apple's latest mobile top dog, the A16, has 64-bit LPDDR5 memory, which gives a bandwidth of 51.2 GB/s. That's roughly twice the Switch's LPDDR4 bandwidth when docked (25.6 GB/s). Also remember that most mobile devices cannot run their memory at maximum frequency for long before clocking down due to overheating (which is also why deferred lighting is rarely seen in mobile games, and the ones that do use it, such as Genshin Impact, have high hardware requirements).
As a comparison, the PS4 has a bandwidth of 176 GB/s, and the Steam Deck has 88 GB/s given its gigantic size and high TDP (15-25 W) for a handheld device. In general, I don't think the future for a hybrid console is very bright, performance-wise at least, especially for those expecting power similar to a PS4 (it might get there in terms of FP compute, but not in every aspect).
I can only think of two solutions: 1. Stacking up L3 cache on the SoC, kind of like the Infinity Cache in RDNA 2/3 GPUs. But cache is expensive (apparently the two RDNA 2-based home consoles don't have Infinity Cache as far as I know), so I highly doubt Nintendo are going to take this route.
Or 2. Heavy use of DLSS 2. This way the internal resolution could be reduced to a minimum of 1/4 while still maintaining a plausible result (I'll exclude the 1/16 mode here), which could largely reduce the bandwidth requirement (in theory). By doing so, the console might be able to run games in handheld mode with visuals on par with PS4 titles (and maybe even better if targeting 720-900p, which, hey, isn't too bad on a small handheld screen!).
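The quick computation above can be written out in a few lines. Peak bandwidth is just bus width in bytes times transfer rate; the DLSS figure is a pixel-count ratio. All the numbers are the ones quoted in the post (the 1080p/540p pair is my own example of a 1/4-resolution pairing):

```python
# Peak memory bandwidth: (bus width in bytes) * (transfers per second).

def bandwidth_gbs(bus_width_bits: int, transfer_rate_mtps: int) -> float:
    """Peak bandwidth in GB/s for a given bus width and MT/s rate."""
    return bus_width_bits / 8 * transfer_rate_mtps / 1000

print(bandwidth_gbs(64, 6400))   # Apple A16, 64-bit LPDDR5-6400  -> 51.2 GB/s
print(bandwidth_gbs(64, 3200))   # Docked Switch, 64-bit LPDDR4   -> 25.6 GB/s
print(bandwidth_gbs(128, 5500))  # Steam Deck, 128-bit LPDDR5     -> 88.0 GB/s

# DLSS "performance" mode renders at 1/4 of the output pixel count:
native = 1920 * 1080
internal = 960 * 540
print(internal / native)  # 0.25
```

Note this is peak, not sustained; as the post says, mobile parts often can't hold these rates for long before thermal throttling.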
 
The launch Switch runs between 2.5 h and 5 h. With BotW the run time was around 3 h. Nintendo nearly doubled it with the Switch revision in 2019.

I don't see the doom and gloom. The Switch 2 will be as portable as the first one, and a playtime between 3 h and 6 h is enough for a mobile console.
 
I don't know if this is the right thread, but I'm sure there was a comparison of the PS5 and Xbox Series X here. How big is the difference? Will those 20% make a big difference? How big will the difference in ray tracing be? Does the PS5 have some advantages compared to the Xbox Series X?
 
I don't know if this is the right thread, but I'm sure there was a comparison of the PS5 and Xbox Series X here. How big is the difference? Will those 20% make a big difference? How big will the difference in ray tracing be? Does the PS5 have some advantages compared to the Xbox Series X?
It depends on what you consider a big difference. In reality the GPU disparity is the smallest in console history between Xbox and PlayStation, but depending on the game it's either noticeable or not. Generally speaking they can be considered pretty much on the same level.
 
But how about texture units, 208 vs 144, render output units, 80 vs 64, and RT cores, 52 vs 36? Can anyone comment on this? What possible difference can there be in these areas?
 
But how about texture units, 208 vs 144, render output units, 80 vs 64, and RT cores, 52 vs 36? Can anyone comment on this? What possible difference can there be in these areas?
The real issue is that resolution is no longer fixed for most games. DRS makes it nearly impossible to detect any real difference between the two.

At this point in time, people are just declaring a winner for the sake of it. DRS really makes it a wash, and the only things worth monitoring are frame rate and input latency. And intangibles like audio, etc.

I have major doubts most people will be able to see how fast resolution changes until you drop to 1080p from 4K. But once again, upscaling helps soften that blow.

I’d say, the only thing worth discussing is how gracefully both consoles deal with falling out of their ideal resolution bands.

But when next Gen is announced, all bets are off :p. Time to go hard into winners and losers: fight to the death! Haha
 
But how about texture units, 208 vs 144, render output units, 80 vs 64, and RT cores, 52 vs 36? Can anyone comment on this? What possible difference can there be in these areas?

Isn't PS5 64 ROPs (64 active + 16 disabled)?

Anyway, there are differences, but the two systems are far more similar than they are different. Given similar points on the AMD technology roadmap, similar silicon area and node, and similar power considerations, there isn't room for a ton of differences.

The XSX is potentially 20-25% faster when compute / RT / BW bound; the PS5 is potentially 20-25% faster when pixel fill bound (or potentially more on a depth pass). The PS5's clocks make up for a lot of deficits in terms of number of units, and potentially convey advantages in other areas too.

The thing is that games are often bound by different parts of the hardware at different points in the game and also just different points in the frame. And how and when this happens is in large part down to the software. So it's tricky to really say without looking at profiling what is down to what. A relative advantage in one part of the frame might be countered at some other point by a relative disadvantage.

I think that as games go heavier on RT and compute driven pipelines, SX will probably be in a slightly better position, but probably as much will come down to the software (and market share) as the hardware. And being faster at RT isn't really going to result in large differences if a game goes so light on RT (as many console games do) that the real performance differentiators remain elsewhere.

One potential area of difference that doesn't get discussed is the CPU side. The big difference here is that the Series consoles have twice the SSE / AVX throughput. This could potentially allow you to run more complex simulation models or update your RT BVH tree faster or with more detail / objects (but the GPU would still have to be able to test against it fast enough for it to be useful).

All in all, interesting differences but not ones that are likely to matter when you sit down to play a (properly made) game on your system, whatever that system is. From a hardware perspective there are no losers this gen. All the systems are good - and yes that includes the Series S at its price point!
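As a rough illustration of where those percentages come from, here's a back-of-envelope sketch using the public spec figures (52 CUs @ 1.825 GHz for XSX, 36 CUs @ up to 2.23 GHz for PS5, 64 ROPs each, as discussed above); the assumption is the standard RDNA 2 figure of 128 FP32 FLOPs per CU per clock:

```python
# Back-of-envelope compute and fill-rate comparison from public spec sheets.

def tflops(cus: int, clock_ghz: float) -> float:
    # RDNA 2: 64 stream processors per CU x 2 FLOPs (FMA) = 128 FLOPs/CU/clock
    return cus * 128 * clock_ghz / 1000

xsx_tf = tflops(52, 1.825)  # ~12.15 TFLOPS
ps5_tf = tflops(36, 2.23)   # ~10.28 TFLOPS (PS5 clock is variable, up to 2.23 GHz)
print(round(xsx_tf / ps5_tf, 3))  # ~1.18: the oft-quoted compute advantage

def fill_gpix(rops: int, clock_ghz: float) -> float:
    # Peak fill rate: one pixel per ROP per clock, in Gpixels/s
    return rops * clock_ghz

print(round(fill_gpix(64, 2.23) / fill_gpix(64, 1.825), 3))  # ~1.22 fill edge for PS5
```

With equal ROP counts, the fill-rate ratio is just the clock ratio, which is why the PS5's higher clocks can flip the advantage in fill-bound passes.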
 
The real issue is that resolution is no longer fixed for most games. DRS makes it nearly impossible to detect any real difference between the two.

At this point in time, people are just declaring a winner for the sake of it. DRS really makes it a wash, and the only things worth monitoring are frame rate and input latency. And intangibles like audio, etc.

I have major doubts most people will be able to see how fast resolution changes until you drop to 1080p from 4K. But once again, upscaling helps soften that blow.

I’d say, the only thing worth discussing is how gracefully both consoles deal with falling out of their ideal resolution bands.

Yeah, for most of these games I'd have a hard time spotting differences between the two even running side by side. Poor texture filtering and large frame rate fluctuations and stutter are pretty easy to spot (and get annoyed by) even on their own, but a 12% difference in resolution? From a normal seating distance? Not a chance. I need Digital Foundry and 3x zoom to tell me which version is clearly better and what I should be angry about on message boards. :yep2:

But when next Gen is announced, all bets are off :p. Time to go hard into winners and losers: fight to the death! Haha

Can't wait! :ROFLMAO:
 
Series X also doesn't have 80 ROPs.
Ok, found info that it's 64. So the info on Wikipedia is wrong.
One potential area of difference that doesn't get discussed is the CPU side. The big difference here is that the Series consoles have twice the SSE / AVX throughput. This could potentially allow you to run more complex simulation models or update your RT BVH tree faster or with more detail / objects (but the GPU would still have to be able to test against it fast enough for it to be useful).
Very interesting. Thanks a lot for that info.

Also thanks everyone for answers. Ok if we compare same game on both consoles there is minor difference and I've seen a lot of comparisons where was minor difference or almost no difference. But how do you guys think will be with exclusive games? In my opinion almost all PS3 exclusives looked better thank xbox 360 exclusives if ye compare exclusives released same year. Ok there wasn't like completely like this a little bit different.
2006 Gears of War and 2007 Uncharted.
Both third-person shooters. Uncharted was a lot better in terms of graphics.
2007 Halo 3 and 2008 Resistance 2.
Both first-person shooters. Resistance 2 was a lot better.
2008 Gears of War 2 and 2009 Uncharted 2. Uncharted 2 was a lot better.
And so on.
It was the same if we compare PS4 and Xbox One exclusives, and I think it will be the same this generation too.
But what are your thoughts? How big will this difference be? The PS3 was on par with the Xbox 360 but had the Cell. The PS4 was 50% more powerful than the Xbox One. This gen the situation is different: as we know the XSX is 20% more powerful than the PS5, so can Sony keep up this gen?
 
The PS4 was 50% more powerful than the Xbox One.
Oh, really? I didn't notice. Seriously, the PS4's advantage over the One wasn't all that great in terms of geometry or texture asset superiority; however, the PS4 did occasionally enjoy higher framerates or, at the very least, stuck closer to 1080p.

This gen the situation is different: as we know the XSX is 20% more powerful than the PS5, so can Sony keep up this gen?
Function answered this perfectly. These advantages depend on the areas where each console's strengths lie, or where it's least bottlenecked. And this really boils down to developers' strengths and team sizes as well. Even the most powerful hardware and its games can still look and run like sh** compared to weaker hardware. Just look at the current state of PC gaming, and how so many moderate console titles are forcing and/or requiring beefier PC hardware just to brute-force matching their console counterparts' IQ settings. Seriously, without proper optimization and a willingness to treat each platform as its own unique beast (versus shoehorning code), having the most powerful hardware doesn't mean sh** at the moment.

Anyhow, PS5 Pro and XBSX Pro will be up next, so...
 
OK, if we compare the same game on both consoles there is a minor difference, and I've seen a lot of comparisons where there was little or almost no difference. But how do you think it will be with exclusive games?

We're at a point in time now where hardware no longer dictates how good a game looks; the limiting factors now are time, budget and talent, and this is where Sony appear to be way ahead of Microsoft.

That's not to say all of Microsoft's studios are terrible, they actually have some extremely talented teams (The Coalition and Turn 10, to name a couple), but Sony seem to have more studios of that quality.

As we know the XSX is 20% more powerful than the PS5, so can Sony keep up this gen?

20% (it's actually 18%) is nothing these days. It's also the difference on paper, and nothing ever scales 100% linearly. So that 18% performance advantage on paper might only result in a 0-18% increase in reality. There are so many things within the GPU itself and in the software that can drive that 18% advantage way down; even the difference in API efficiencies can completely consume it.

It's also only in compute workloads, so if the game is limited by another part of the GPU, that extra compute won't amount to anything useful.
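That last point can be sketched with an Amdahl-style toy model (my own illustration, not anything from the thread): if only a fraction of frame time is actually compute-bound, an 18% compute advantage shrinks accordingly.

```python
# Toy Amdahl-style model: only the compute-bound fraction of the frame
# benefits from an 18% compute advantage; the rest is bound elsewhere.

def effective_speedup(compute_fraction: float, compute_speedup: float = 1.18) -> float:
    """Overall frame-time speedup when only part of the frame is compute-bound."""
    return 1 / ((1 - compute_fraction) + compute_fraction / compute_speedup)

for f in (1.0, 0.5, 0.25):
    print(f, round(effective_speedup(f), 3))
# 1.0  -> 1.18  (fully compute-bound: the full paper advantage)
# 0.5  -> ~1.08
# 0.25 -> ~1.04 (mostly bound elsewhere: advantage nearly vanishes)
```

The fractions are made-up illustrations; real frames shift between bottlenecks many times per frame, as noted earlier in the thread.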
 
Last edited:
Every Gears of War looked better than the Uncharted that came out a year later. So did Rise compared to The Last of Us. How are you seeing Uncharted trumping Epic themselves, using their then brand-new engine with all their knowledge of it? Uncharted 1 on original hardware looks extremely bad today, while Gears 1 still looks perfectly fine to play.
 