Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

It's worth noting that because resolution can be scaled by the user in the PC space, graphical features that go beyond the consoles do not have to be limited to only the minority of GPUs that are significantly more powerful than the consoles. Someone with a 2070 Super, for example, might decide to run Alan Wake 2 with full PT at 1080p DLSS Ultra Performance. Sure, image quality would be poor, but that's their trade off to make.
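
For a rough sense of why image quality takes that hit: the DLSS Ultra Performance preset renders internally at roughly one third of the output resolution per axis. A minimal sketch of the arithmetic (the 1/3 scale factor is the commonly documented value for that preset; the 1080p target is just the example above):

```python
# Approximate internal render resolution for DLSS Ultra Performance,
# which upscales from roughly 1/3 of the output resolution per axis.
output_w, output_h = 1920, 1080      # 1080p output, as in the example above
scale = 1 / 3                        # Ultra Performance linear scale factor

internal_w = round(output_w * scale)
internal_h = round(output_h * scale)
print(f"Internal resolution: {internal_w}x{internal_h}")  # -> 640x360
```

So the GPU is shading something close to a 360p image before upscaling, which is why the trade off shows up as image quality rather than feature availability.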

Whether that option translates into higher sales to owners of older GPUs is a different matter. In my case it would (if I had an older GPU), but I probably don't represent the average PC gamer.
 
Consoles are a mix of high and medium :D

Indeed, that is the purpose; people seem not to get that, which I find pretty annoying at times. We at DF are about "the cutting edge in the future, past and present." Looking at the game at its "max" is trying to ascertain how it scales and where the cutting edge lies. It has nothing to do with petty console wars. We always look at max and then at "optimised".
If the point was scalability testing, then why not also use quality settings on consoles? Because few PC gamers will ever run those settings at 60fps.
 
I don't really understand this mindset. Avatar is clearly accounting for scale of rendering, with graphical features that consoles cannot do properly yet, and it even has an extra mode on top of that with even greater fidelity settings, yet everyone gets a good experience regardless. How is that not enough?

Consoles having a fixed target makes them easier to focus on as a baseline. This isn't a new phenomenon, especially now that games have become so complex that every developer has to account for parity in release quality, and even then it's hard to do, as we see.

You say "scale down", but that implies by default that the PC configuration devs have to account for is the highest of the high end, hardware so bleeding edge that it affects the design of the game, which just doesn't make any sense.

Literally no developer is going to make a game with a 4090, a high end CPU and 32 GB of RAM as the baseline and then try to figure out how to cut that down if they also want to sell on consoles. That inherently goes against the entire concept of scaling and is a waste of time for developers.

I understand the nostalgia for the old PC days, when PCs literally got different games from consoles because the two couldn't even work the same way, but this isn't 2005 or earlier anymore.

There is practically no game design at this point that cannot be done on a console configuration and work well enough to serve as a baseline, at least as far as accounting for mainstream PC hardware rather than the bleeding edge.

If there is a PC game whose devs specifically want to show off the power of the PC's absolute highest tier (which is what I assume you want), it definitely won't be launching on console, because cutting such a thing down would again be a total waste of time.

Which makes the whole notion of scaling down to consoles pointless.

Especially when by default they would have to account for lower end PC configurations anyway, which puts a hard limit on any real merit in focusing completely on the highest end without compromises. The game will have to be compromised to work on multiple lower end configurations regardless. So it's no wonder third party devs just decide to target consoles by default and scale up.
True. Considering how PC gaming aims for very high framerates at high resolutions, it's that console baseline that allows the two to coexist, and that's an area where the high end PC gamer has minimal room for compromise. Aiming for the highest configurations would certainly have produced more complex and detailed visuals, but at the cost of framerates above 60fps and high resolutions, and it would have tanked the game on every other configuration.

Crysis tried this model and it didn't favor the devs or the majority of gamers.
 
I'm not sure I follow. Are you thinking that publishers should invest hundreds of millions, or billions, in a cutting edge AAA game for PC to sell it at launch for $70 to 1% of the user base, hoping that in 4-5 years, when PC owners have upgraded their hardware, more will buy it for $10?

It feels like plenty of people already wait to get games in sales, so the only difference would be choosing to limit who can buy it at launch by requiring the highest performing hardware.

Why would that be the case when scaling exists? A game designed for cutting edge PC hardware can still sell to all console players as well as a large majority of PC gamers. I say large majority of PC gamers because some are still rocking DX9 level hardware (I know because some of my relatives' children are gaming on those machines. :p).

The cost of doing so, however, is certainly a factor. It's easier to recoup the development cost (which is still extremely difficult for AAA current gen games) of a title with tech tailored to consoles (extremely expensive) than of one with tech tailored to cutting edge PC hardware (insanely expensive) or, as Crysis did ... future cutting edge PC tech.

Regards,
SB
 
How feasible would it be for devs to give PC gamers the option to use some of the perfect quality assets? For example, offering the Drake model and the extremely high res flora from the UC4 teaser trailer as an optional, high res asset download. Since PC games are all digital, there is no need to worry about the storage limits of a disc. I guess it's just a matter of whether the dataset becomes too large for the engine to handle in real time?

I wouldn't be surprised if a large part of this is because they don't want those assets out in the wild, for IP protection reasons. If it's on the proverbial disc, especially for PC, it runs a high risk, especially for a mega popular IP, of being ripped and widely distributed.
 
Why would that be the case when scaling exists? A game designed for cutting edge PC hardware can still sell to all console players as well as a large majority of PC gamers.
If you track back through the conversation, the proposition was designing a game targeting high-end hardware. I took that to imply that scaling down was not an option, hence the economic debate.
 
The question: is the average PC for the typical PC gamer more or less powerful than a PS5? Currently, I believe the answer is that it is less powerful than a PS5. So currently games are actually being downgraded onto PC. PS5/XSX ARE the high end targets. It just so happens that devs are also making scalable games that can take advantage of 80% of a 4090's power, just not 100%. The consoles are the porridge that Goldilocks chose at the moment. That'll likely change in the next couple of years.
 
The question: is the average PC for the typical PC gamer more or less powerful than a PS5? Currently, I believe the answer is that it is less powerful than a PS5. So currently games are actually being downgraded onto PC. PS5/XSX ARE the high end targets. It just so happens that devs are also making scalable games that can take advantage of 80% of a 4090's power, just not 100%. The consoles are the porridge that Goldilocks chose at the moment. That'll likely change in the next couple of years.

It depends how you define the average PC gamer. In any case, Steam gives us the raw numbers, and wherever the average PC lies on the performance curve, there are a comparable number of PCs out there with performance similar to or greater than the PS5. So in that respect they are certainly a valid target for a higher end experience.
 
If you track back through the conversation, the proposition was designing a game targeting high-end hardware. I took that to imply that scaling down was not an option, hence the economic debate.

I don't get that logic. Crysis targeted beyond high-end hardware (hardware that did not yet exist) and still scaled down. In the PC world you always had two design philosophies: target the low end and scale up (which ensures the largest potential buyer base), or target the high end and scale down (you can still hit several-generations-old hardware, but at some point you just don't support really old systems).

Most of the time those two approaches still ran on the same machines, with some notable differences. The one that starts by targeting the low end will generally run better on the low end but look worse on high end machines. Conversely, the one targeting the high end will look significantly better on cutting edge hardware, but will generally run a little worse on low end hardware.

Additionally, and more importantly, targeting high and scaling down is more expensive than targeting low and scaling up. That's just the reality of the effort required to target high end hardware versus the effort required to target low end hardware. That cost and effort obviously contributes to why projects targeting high and scaling down generally look significantly better than projects targeting low and scaling up.

Targeting the highest level of hardware (assuming you can afford to do so as a developer) does not mean you ignore lower levels of hardware and don't scale down to them. When scaling down, lower end hardware will obviously use lower IQ settings, and it may not even get some of the graphical techniques available on higher level hardware.

Regards,
SB
 
I don't get that logic. Crysis targeted beyond high-end hardware (hardware that did not yet exist) and still scaled down. In the PC world you always had two design philosophies: target the low end and scale up (which ensures the largest potential buyer base), or target the high end and scale down (you can still hit several-generations-old hardware, but at some point you just don't support really old systems).

Most of the time those two approaches still ran on the same machines, with some notable differences. The one that starts by targeting the low end will generally run better on the low end but look worse on high end machines. Conversely, the one targeting the high end will look significantly better on cutting edge hardware, but will generally run a little worse on low end hardware.

Additionally, and more importantly, targeting high and scaling down is more expensive than targeting low and scaling up. That's just the reality of the effort required to target high end hardware versus the effort required to target low end hardware. That cost and effort obviously contributes to why projects targeting high and scaling down generally look significantly better than projects targeting low and scaling up.

Targeting the highest level of hardware (assuming you can afford to do so as a developer) does not mean you ignore lower levels of hardware and don't scale down to them. When scaling down, lower end hardware will obviously use lower IQ settings, and it may not even get some of the graphical techniques available on higher level hardware.

Regards,
SB

I'm a bit confused here, because in your own comment you already mention the limitations of scaling and why most multiplatform games won't target the highest end PCs (or the lowest end PCs, for that matter).

The target hardware will be whatever is most optimal from a visuals-to-performance perspective (and also a development resource perspective), and as you mention, scaling in either direction produces diminishing returns. And we know the maximum user base target is not going to be a high end PC or a low end PC, but generally the console spec (and, let's face it, the PlayStation spec).

Even for PC centric games, devs target based on the likely game audience, not the overall PC audience. Valve's "average" PC hardware target for CS2, for example, is likely completely different from, say, Star Citizen's, just to take two diametric examples.

As an aside here, I really hate the generic term "PC gamer" that gets thrown around (people on here might notice I keep posting about this). I feel hardware enthusiasts are somewhat out of touch on this and have been for the longest time.

Also, with regards to Crysis: while Crysis did have settings that scaled to very high end PC hardware (and future hardware), that isn't the same as the actual target. If you look at the settings, at the extreme ends they certainly do not deliver perceived visual quality in proportion to their actual performance demands.


And this applies to other games as well.

What would be an interesting thing to look at is how, say, newer games at their more "optimal" settings compare visually to relatively older games at their maximum settings within the same performance envelope. For example, say we used a GTX 680 and ran Crysis at max settings, then compared it to, I don't know, Battlefield 4, Metro Last Light, AC Unity or Ryse at comparable performance/optimal settings: how do they actually compare visually?
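
A minimal sketch of how that comparison could be laid out (purely illustrative: the GPU and games are the ones named above, while the resolution/framerate envelope and the structure are just placeholders):

```python
# Hypothetical test matrix for the comparison proposed above: fix the GPU,
# hold the performance envelope constant, and compare the visuals subjectively.
FIXED_GPU = "GTX 680"
ENVELOPE = {"resolution": "1080p", "target_fps": 30}  # placeholder envelope

test_matrix = [
    {"game": "Crysis (2007)",          "settings": "maximum"},
    {"game": "Battlefield 4",          "settings": "whatever hits the envelope"},
    {"game": "Metro: Last Light",      "settings": "whatever hits the envelope"},
    {"game": "Assassin's Creed Unity", "settings": "whatever hits the envelope"},
    {"game": "Ryse: Son of Rome",      "settings": "whatever hits the envelope"},
]

for entry in test_matrix:
    print(f"{entry['game']}: {entry['settings']} @ "
          f"{ENVELOPE['resolution']}/{ENVELOPE['target_fps']}fps on {FIXED_GPU}")
```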
 
Also, with regards to Crysis: while Crysis did have settings that scaled to very high end PC hardware (and future hardware), that isn't the same as the actual target. If you look at the settings, at the extreme ends they certainly do not deliver perceived visual quality in proportion to their actual performance demands.
Even though I love tweak guides, your appraisal is off. The "Very High" setting in Crysis added a lot of things that dramatically upped the visual quality. Different post-processing, SSAO, per object motion blur, parallax occlusion mapping with shadows, and even more. Whole new visual features that were just not available on the "high" setting. Adding whole new visual features with "max settings" is a very different type of "max settings" than just "we up the resolution of effects internally" like one sees in most modern games.
 
Even though I love tweak guides, your appraisal is off. The "Very High" setting in Crysis added a lot of things that dramatically upped the visual quality. Different post-processing, SSAO, per object motion blur, parallax occlusion mapping with shadows, and even more. Whole new visual features that were just not available on the "high" setting. Adding whole new visual features with "max settings" is a very different type of "max settings" than just "we up the resolution of effects internally" like one sees in most modern games.

100% this. It was practically a different game once you kicked in the very high preset, even the colour grading was different (or maybe the lighting just made it appear so). I remember that clearly all these years later. I was running it on an 8800GTS at the time and I remember running the game at something like 720p and 25fps just so I could get that "very high" goodness.

Dropping to high got performance up to more realistic levels but the majority of that "next, next gen" visual goodness was gone.
 
I can't get Crysis to run on Windows 10; Warhead runs OK.
edit: Got it working. Set compatibility to Vista SP2 and add -DX9 to the shortcut, but Very High is greyed out.

ps: this is the original version, not the remastered version
 
I can't get Crysis to run on Windows 10; Warhead runs OK.
edit: Got it working. Set compatibility to Vista SP2 and add -DX9 to the shortcut, but Very High is greyed out.

ps: this is the original version, not the remastered version
I do not think it should be necessary to run in DX9, DX10 should be fine. I imagine the issue is maybe related to you being on Ryzen? PC gaming wiki has answers for that.
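
Worth adding for context: if memory serves, the stock Very High preset in the original release was tied to the DX10 renderer, which is why forcing -DX9 greys it out; on the default DX10 path it should be selectable again. A sketch of the two launch setups described above (the install path is just a placeholder):

```
Shortcut target, forced DX9 path (Very High greyed out in the menu):
  "C:\Games\Crysis\Bin32\Crysis.exe" -DX9

Shortcut target, default DX10 path (Very High selectable):
  "C:\Games\Crysis\Bin32\Crysis.exe"

Compatibility: right-click Crysis.exe > Properties > Compatibility > run as Windows Vista (Service Pack 2)
```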
 
The funniest part with Crysis is that its reputation as a demanding game has seeped so deeply into the common consciousness that people automatically assume you needed a beast of a rig to run it. You really didn't, and it ran on fairly modest hardware. It ran very well on my GTX 260, I believe at 900p/medium settings/60fps. It still looked really good, but obviously max settings were a different ballgame, both visually and in terms of system requirements. The game scaled beautifully back then.

The one big issue was the fact that it was heavily single-threaded and that even years later, a high-end 4-core CPU still struggled to run it properly.
 
I'm not sure I follow. Are you thinking that publishers should invest hundreds of millions, or billions, in a cutting edge AAA game for PC to sell it at launch for $70 to 1% of the user base, hoping that in 4-5 years, when PC owners have upgraded their hardware, more will buy it for $10?

It feels like plenty of people already wait to get games in sales, so the only difference would be choosing to limit who can buy it at launch by requiring the highest performing hardware.

An established dev with loyal backing from a big publisher (or a favored 1st party dev) could act in such a fashion. Today's worthwhile cutting-edge tech eventually ends up as tomorrow's conventional technology. The value in such an approach is that it rewards early investment with a technical advantage over competitors. The biggest issue is future hardware veering in a different direction and invalidating those forward-thinking efforts.

Most devs can't afford even a misstep and have become more conservative, which means operating on the bleeding or cutting edge is not a real option. It's probably advantageous for MS's most trusted devs to take such an approach, as their efforts can be used to benefit other first-party devs. Sony could replicate this if it ever decides to support PC with day-one releases.
 
Why? A publisher is happy to throw money away to enable this dev to create a flagship title that doesn't turn a profit?
They can easily expense it as general R&D, with the perspective that they are just incorporating that work into commercial products as soon as possible.
 