Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

Status
Not open for further replies.
Windows is not intended primarily for gamers; most Windows machines are like... ATMs. I agree the various pieces of the PC platform (API designers, hardware manufacturers, and OS designers) should take architecture for gaming far more seriously (and will have to for the complaints on forums like these to ever be addressed), but there's no way that's going to come in the form of a min spec bump for a Windows release.
I'm not talking about purely gaming.

You ever tried Windows 11 on a dual core? It's horrid.
 
So now when people look at a PC version that is complete trash and then look over and see a $500 console version running mostly without issue and no hassle...

Yep, it’s pretty wild that a 4090 and 13900K in many cases doesn’t provide a materially better experience than a console.

These seem like pretty dramatic statements to me. I get that some people get very annoyed by pretty much any level of stuttering, and I agree, it's not good and it would be nice to see stuttering removed entirely. But when you're talking about visual quality that exceeds the combined benefits of the console RT and quality modes, simultaneous with performance that exceeds the performance modes, then the occasional shader comp stutter in a handful of games might be a perfectly reasonable compromise to many people. Don't get me wrong, the horrendous stuttering of, say, Sackboy or Callisto at launch is definitely unacceptable to a game-breaking degree. But those issues were quickly resolved, as most serious issues of this nature are.

Not only do games run poorly at console settings on much more powerful hardware but they’re not much more visually impressive when everything is cranked up.

I think this needs quantifying. What does running poorly mean? Low frame rate or stuttering? Shader comp stutter I can agree with here, depending on how bad it is. But in frame rate terms I'm not sure we've seen any evidence of this? Assuming, that is, you define "poorly" as no better than or worse than the consoles' performance.
 
Yep, it’s pretty wild that a 4090 and 13900K in many cases doesn’t provide a materially better experience than a console.

Not sure if you're serious?

That combo will get you 60fps at higher native resolutions and quality settings than the consoles, or, if you're on a 1440p monitor, 120fps in pretty much every game at still higher settings.

So a 2-4x frame rate increase (with likely higher settings too) could be considered a much better experience than what the consoles offer.
 
Not sure if you're serious?

That combo will get you 60fps at higher native resolutions and quality settings than the consoles, or, if you're on a 1440p monitor, 120fps in pretty much every game at still higher settings.

So a 2-4x frame rate increase (with likely higher settings too) could be considered a much better experience than what the consoles offer.

Quite serious. Remji is right, with current gen consoles delivering upscaled 4K@60fps they are already providing a very good experience. The point is that spending 2-3x more on vastly more powerful PC hardware doesn’t significantly improve that experience. Native 4K and higher frame rates are great but they’re not game changers in most cases. Games simply do not scale to take advantage of more powerful hardware.

Just take Alex’s recent Dead Space and Forspoken videos for example. Almost none of the graphics settings materially impact IQ in a way you would expect given the vast difference in CPU and GPU horsepower. This isn’t a new problem on PCs but it becomes more apparent the closer consoles get to “good enough”.
 
Quite serious. Remji is right, with current gen consoles delivering upscaled 4K@60fps

Are they doing that in every game? And of the games that do offer 60fps, how many are a locked 60fps?

higher frame rates are great but they’re not game changers in most cases.

Tell that to the console gamers who now complain if a game doesn't run at 60fps on their console of choice. Now that they've had a taste of 60fps they can't go back to 30fps, so frame rate absolutely is a game changer.
 
Framerate is probably the biggest game changer of all, lol. How is it not? High framerate impacts every moment you interact with a game. Everything you do. Every time you press a button and something happens on screen, you're feeling that low latency in your mouse and you're observing the smoothness of the images on screen. There's nothing more impactful, I'd say. Like I said previously, wait until you sample actual high framerate for any length of time. You'll want to facepalm with 50 hands at once whenever you see an online comment about "silky smooth 60".
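The arithmetic behind that feel is worth making concrete. A trivial illustration (not from anyone's engine, just the basic frame-time math):

```cpp
#include <cassert>

// Frame time in milliseconds for a given frame rate. This is the floor on
// how stale the image you are reacting to can be; display scanout and
// input-processing latency come on top of it.
double frame_time_ms(double fps) { return 1000.0 / fps; }
```

Going from 30fps to 60fps cuts the per-frame time from ~33.3ms to ~16.7ms, and 120fps halves it again to ~8.3ms, which is why higher framerates are felt in input response as well as seen in motion smoothness.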
 
Quite serious. Remji is right, with current gen consoles delivering upscaled 4K@60fps they are already providing a very good experience. The point is that spending 2-3x more on vastly more powerful PC hardware doesn’t significantly improve that experience. Native 4K and higher frame rates are great but they’re not game changers in most cases. Games simply do not scale to take advantage of more powerful hardware.

Just take Alex’s recent Dead Space and Forspoken videos for example. Almost none of the graphics settings materially impact IQ in a way you would expect given the vast difference in CPU and GPU horsepower. This isn’t a new problem on PCs but it becomes more apparent the closer consoles get to “good enough”.

By the same token there's little point to the PS5/XSX given the existence of the series s. Similarly there would be little point to console quality, RT, and 120fps modes.
 
By the same token there's little point to the PS5/XSX given the existence of the series s. Similarly there would be little point to console quality, RT, and 120fps modes.
To be fair the Series S isn't doing so hot at all. And in addition it's cheaper than its default contemporaries. Maybe without the Series S they would be doing much worse, but it's hard to ignore the idea that it's a pretty unnecessary addition to the roster, in the same way the Pro and the X were. All it does is make more work for developers to optimize.
 
To be fair the Series S isn't doing so hot at all. And in addition it's cheaper than its default contemporaries. Maybe without the Series S they would be doing much worse, but it's hard to ignore the idea that it's a pretty unnecessary addition to the roster, in the same way the Pro and the X were. All it does is make more work for developers to optimize.

That reinforces my point. The gap between a top end PC and Series X is bigger than the gap between Series X and Series S. If Series S isn't doing well (I've no idea) then it shows that people do care about the improvements that extra power brings even if it is just better image quality, high framerates and a few relatively minor graphical bells and whistles.
 
This is currently a non-technical subjective discussion on the issue of Diminishing Returns and what counts as a meaningful upgrade to a game. Unless there's some real technical discussion to be had on how games are scaling or something (and I'm not sure there is), this line of discussion should be drawn to a close, thanks.
 
To be fair the Series S isn't doing so hot at all. And in addition it's cheaper than its default contemporaries. Maybe without the Series S they would be doing much worse, but it's hard to ignore the idea that it's a pretty unnecessary addition to the roster, in the same way the Pro and the X were. All it does is make more work for developers to optimize.
The Series S suffers from a similar identity crisis as the Wii U, though thankfully not as severe. The issue is: who exactly is it for? The Wii U was offering the same games as the PS360 at around the same performance (or slightly lower) in late 2012. The controller was supposed to differentiate it for the casuals/users who didn't own a PS360 yet. But games were experienced almost exactly the same as on its competitors, unlike the Wii. So if it offered the same experience as the PS360, why would anyone who owned those consoles, or anyone interested in a different experience, be interested in the Wii U?

The Series S similarly stands in an odd place: it offers the same games as the Series X but at lower quality, so why buy it instead of a PS5 or Series X? It is priced affordably for the lower end/casual user, but then why buy it and not a Switch, which offers a different gaming experience?

The Xbox Series product line has a future because of the Series X, and without the Series X there is no future for the Series S. There is some interest, but not enough.
The point of the Series X is that it is supposed to be a next gen experience to replace the One, One X and One S: high fidelity visuals for your 4K TV. And here we have the Series S, a product that doesn't play your old Xbox discs, doesn't play your Series X discs, and doesn't really offer the upgrade of a next gen console, and is therefore priced lower. It partly defeats the purpose of a next gen upgrade and compensates with a lower price. To speak in Sony's terms, it is a low performance console offering the games of the high performance next gen console. It is not a clear cut next gen upgrade for the core gamer, and not a clear cut offer for the casual, who will prefer the Switch.
 
The Series S is for developing countries (I live in one). People cannot even afford 1440p screens, let alone the high quality, high end 4K TVs that most people tend to pair with a Series X or PS5. Naturally, they see the likes of the Series X and PS5 as a waste of money if they're on 1080p TVs/monitors and go for the Series S instead. And that 200 bucks difference can be huge for people like them.

Sure, the money/performance ratio on the Series X is better. The money/performance ratio on the 4090 is best too, I think? But more people will go for the 4070 Ti instead of the 4090, simply because it is cheaper.
 
It currently has more to do with a lack of priority than knowledge. PC often doesn't get priority...
While I agree PC is often a lower priority than consoles, in several cases we've seen developers being unaware of fairly basic things about PCs. Sometimes they don't know about shader compilation stutters or how to implement a system for PSO gathering/collection, or how Windows VRAM management works, or even how to implement basic mouse controls, among several other things that were perfectly outlined in Alex's excellent video about PC port guidelines, and those are just the tip of the iceberg.

So I would say, it's a combination of both low priority and not enough knowledge about the PC platform, which is a situation that especially affects the new generation of developers who grew up playing console games, as opposed to the older generation of developers (the experts and the old geezers) who grew up fiddling with all sorts of old finicky and rapidly evolving PC hardware.
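To make the PSO gathering point concrete: the idea is that pipeline state objects are compiled once, keyed by their full description, and reused, so the expensive driver compile never lands on the render thread mid-gameplay. A minimal sketch with a hypothetical in-memory cache class (real D3D12 engines key on the complete pipeline description and persist compiled blobs to disk between runs, e.g. via ID3D12PipelineLibrary):

```cpp
#include <cstddef>
#include <functional>
#include <string>
#include <unordered_map>

// Hypothetical stand-in for a compiled pipeline state object.
struct PipelineState { std::string blob; };

// Very simplified PSO cache. Real engines key on the full pipeline
// description: shaders, blend/raster/depth state, render target formats.
class PsoCache {
public:
    // Returns true on a cache hit. On a miss, "compiles" (here: fakes)
    // the PSO and stores it, so the next request for the same key hits.
    bool get_or_compile(const std::string& pipeline_desc, PipelineState& out) {
        std::size_t key = std::hash<std::string>{}(pipeline_desc);
        auto it = cache_.find(key);
        if (it != cache_.end()) { out = it->second; return true; }
        // Miss: this is where the expensive driver compile happens, and
        // where the hitch occurs if it runs during gameplay instead of
        // being gathered up front (loading screen / first boot).
        out = PipelineState{"compiled:" + pipeline_desc};
        cache_.emplace(key, out);
        return false;
    }
private:
    std::unordered_map<std::size_t, PipelineState> cache_;
};
```

PSO "gathering" is then just the step of recording which pipeline descriptions the game actually uses, so all the misses can be paid before gameplay starts.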
 
That reinforces my point. The gap between a top end PC and Series X is bigger than the gap between Series X and Series S. If Series S isn't doing well (I've no idea) then it shows that people do care about the improvements that extra power brings even if it is just better image quality, high framerates and a few relatively minor graphical bells and whistles.

That probably has a lot more to do with XSX being the main/normal version of the console. The iPhone mini also flopped.
 

David Wang of AMD talks about Primitive Shaders and Mesh Shaders.

Mr. Wang:
 Primitive Shader as hardware exists in everything from Radeon RX Vega to the latest RDNA 3-based GPU. When viewed from DirectX 12, Radeon GPU's Primitive Shader is designed to work as a Mesh Shader.

Mr. Wang:
 Since the PS5 GPU is an AMD RDNA-based GPU, it has a Primitive Shader, which can be used natively (*from the PS5 SDK). This allows some PS5-exclusive game titles to effectively utilize the Primitive Shader.
 My sense is that the number of first-party titles for PS5 that use Primitive Shaders to a certain extent exceeds the number of cases utilizing Mesh Shaders. In comparison, although Mesh Shaders have become an industry standard, I think that there are few cases of active use of Mesh Shaders in recent game productions.
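For context on what a Mesh Shader actually consumes: geometry is pre-split into "meshlets", small clusters of triangles that one mesh-shader workgroup processes. A bare-bones sketch of that splitting step (hypothetical helper, not AMD's or anyone's real tooling; production meshlet builders also generate local vertex index remaps and per-cluster bounds for culling):

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// A meshlet: a small cluster of triangles processed by one mesh-shader
// workgroup. This sketch only records which slice of the triangle list
// each cluster covers.
struct Meshlet {
    uint32_t first_triangle; // index of the first triangle in this cluster
    uint32_t triangle_count; // number of triangles in this cluster
};

// Split a triangle list (3 indices per triangle) into meshlets of at most
// max_triangles triangles each.
std::vector<Meshlet> build_meshlets(const std::vector<uint32_t>& indices,
                                    uint32_t max_triangles) {
    std::vector<Meshlet> meshlets;
    const uint32_t total = static_cast<uint32_t>(indices.size() / 3);
    for (uint32_t t = 0; t < total; t += max_triangles) {
        meshlets.push_back({t, std::min(max_triangles, total - t)});
    }
    return meshlets;
}
```

The point of the clustering is that each workgroup gets a bounded, cache-friendly chunk of geometry, and whole clusters can be culled before any vertex work happens, which is what both Primitive Shaders and Mesh Shaders are trying to exploit.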
 