Digital Foundry Article Technical Discussion [2022]

You replied to a thread referencing this issue specifically, my dude -- to a post talking solely about this issue. I'm sure there are plenty of issues in PC and console games that I don't give a shit about, but they're not relevant to this thread.
I mean, yeah, I did; it's my fault you misunderstood me. I was just replying to clarify.
But it's not a 'different kind of computer' - it's not a computer. If it were, I wouldn't be gaming on the PC at all; no one would.

This is the argument PC fanboys routinely make in dismissing consoles - that consoles are now 'just PCs'. They're not. They have similar base technology, but the way you access games and the freedom you have provide a decidedly different experience in many cases.

Like, why do you have a 3080? Why not just sell it and put it towards PS5 games? I think that's a perfectly viable option, and I can't recall the last time I recommended PC gaming over console to someone who's not already knowledgeable about the platform. But if you really think consoles are just a 'better type of gaming computer', I find it curious that you even have a gaming PC in the first place.
This is kinda semantics? It's a machine with a CPU and a GPU that runs applications. I definitely agree with you that PCs have their advantages; I make plenty of money to afford a 3080, and I care enough about games, personally and professionally, to play them on frustrating platforms. The only real difference between me and you is I don't grumble about shader compilation -- I know it's already on the minds of developers who ship games that suffer from it, or at least, in the case of indie devs who don't really know what they're doing, on the minds of the people who make the engine they use. Which is really all I'm trying to communicate here. We all know! It's on the to-do list! Not everyone has a game that's shaped right to make fixing it easy, or the resources to fix it right this instant.

Which has been the case since its inception. This is not a new development requirement; hence we see games with included shader-compiling stages from decades ago. Which is why, by design, UE4 documents a process to gather these PSOs to alleviate this issue as best it can.
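The record-then-precompile idea behind that kind of PSO gathering can be sketched abstractly. This is a hypothetical Python sketch of the general pattern, not UE4's actual API; all names are made up:

```python
def record_playtest(draw_calls):
    """During QA/playtest runs, log every pipeline state the game used."""
    return {dc["pso_key"] for dc in draw_calls}

def merge_recordings(*sessions):
    """Union the logs from many sessions into one shipped cache list."""
    merged = set()
    for s in sessions:
        merged |= s
    return merged

def precompile(shipped_list, compile_fn):
    """At first boot or on a loading screen, compile everything in the list."""
    return {key: compile_fn(key) for key in shipped_list}

def get_pso(key, cache, compile_fn):
    """Runtime lookup: anything missing from the shipped list still hitches."""
    if key not in cache:
        cache[key] = compile_fn(key)  # this compile-at-draw-time is the stutter case
    return cache[key]
```

The weak point is coverage: any pipeline state QA never exercised during recording still gets compiled at draw time, which is exactly the hitch players see.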

This isn't about PC gamers complaining that they're not getting full mod support in every game. They're not demanding full Dualsense Haptic support for every game. They're asking for something that was actually more prevalent in years past to return.

As another poster said, part of the issue is simply that there are likely fewer experienced PC developers on the market these days; some developers are genuinely in the dark about this. The Ascent's developers said as much: they just didn't know. Yet they fixed it.
Yeah, sure, but programmable shaders are new (well, they're pretty damn old, but they're new relative to the inception of the pc,) and there are a huge set of conflicting costs and problems that overlapped to create the situation we're in now. Those games from decades ago had much smaller numbers of shader variants -- it's barely even the same problem at this point, unless you just happen to have a very predictable game or a very small number of shaders.
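To make the variant-count point concrete, here's a toy Python illustration (the keyword names and numbers are hypothetical, not from any real engine): each boolean shader keyword doubles the raw variant space, and each multi-option keyword multiplies it further.

```python
def count_variants(bool_keywords, enum_keywords):
    # Each boolean keyword doubles the count; each enum keyword multiplies
    # it by its option count. Real engines prune unused combinations, but
    # the raw space grows like this.
    n = 2 ** len(bool_keywords)
    for options in enum_keywords.values():
        n *= len(options)
    return n

# A shader from the early programmable era might have had a couple of variants:
old = count_variants(["FOG"], {})  # 2
# A modern ubershader with a handful of toggles explodes combinatorially:
new = count_variants(
    ["FOG", "SKINNING", "SHADOWS", "NORMAL_MAP", "PARALLAX", "INSTANCING"],
    {"QUALITY": ["low", "med", "high"]},
)  # 2**6 * 3 = 192
```

Add a few more toggles and the count runs into the thousands, which is why "just precompile everything" stopped being a practical answer.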
 
terrible platforms: pc and mobile.

Those also happen to be the two most popular platforms by a landslide. All gaming platforms have their share of problems; they're all terrible in their own ways.

When I say "terrible platform" I don't mean I have zero interest in it -- I have a 3080, I play PC games, etc. However, fundamentally, PC gaming is about slapping an amazing piece of graphics hardware into a machine designed around tasks like loading emails. Shipping a game is never going to have the kind of high-touch polish on PC that the same amount of resources can produce on consoles. Despite this, somehow, PC gamers consistently show higher expectations and complain about more marginal issues than console players do.

The bolded comment is a throwback to the early 2000s system-wars discussions over at GS lol. High-polish games on PC have been done before, in a time when the platform was even more 'terrible' than today....


for the 5% of people who care on those platforms.

These terrible platforms count for much more than 5% of the entire gaming market. Much more than all consoles combined.

When I said "more marginal issues" I was referring to the broad corpus of 'things gamers complain about', not this issue. You get those huge stutters because the platform, by design, does not guarantee a set hardware specification. The reasons for that are numerous and historical but ultimately come down to "PC as a platform is not intended for playing videogames, there's a different kind of computer which is meant for that"

Unbelievable. Yeah, I miss the 6th-generation era, but not these kinds of comments.

I think that's a perfectly viable option and I can't recall the last time I recommended PC gaming over console to someone who's not already knowledgeable about the platform

It's the reverse for me. 20 years ago I'd recommend a console to casual players; nowadays I can recommend either, because the PC isn't as difficult a platform as it was two decades ago. Even 7/8-year-olds have no trouble gaming on them.
 
This is kinda semantics? It's a machine with a cpu and a gpu that runs applications.

So is an iPad. I'm not gaming on it vs my PC for a reason.

I definitely agree with you that PCs have their advantages, I make plenty of money to afford a 3080, and I care enough about games personally and professionally to play them on frustrating platforms.

Ok, so you get that there are distinct differences, then. So what was your point that they're so similar? Clearly you don't think this is just semantics.

The only real difference between me and you is I don't grumble about shader compilation -

Again - it's a PC-focused tech site, in a forum dedicated to discussions about another tech site, discussing approaches to the very issue. I'm 'grumbling' in that I'm discussing a niche topic in a forum thread dedicated to a niche topic, I'm not grabbing tired working parents from the grocery checkout aisle and proselytizing about proper asynchronous shader compilation.

Like, you're here too, my man. Neither of us is apparently prioritizing our free time properly. :)

- I know it's already something on the minds of developers who ship games that suffer from it, or at least, in the case of indie devs who don't really know what they're doing, on the minds of the people who make the engine they use.

It's far more on their minds precisely because it's been given such media attention now. DF's coverage literally informed The Ascent's devs about it. There is no doubt that the stuttering problems in RE: Village would never have gotten Capcom's attention without DF and the resulting media storm (relative to PC gaming) that arose from that coverage.

They're commercial products, being sold at a profit. They get to be reviewed and critiqued once they're on the market asking $80 from consumers.

Which is really all I'm trying to communicate here. We all know! It's on the todo list! Not everyone has a game that's shaped right to make fixing it easy or the resources to fix that right this instant.

The myriad approaches to actually addressing this problem are exactly what this thread, and the shader thread in particular, have been talking about.

You were the one who brought up that only a few games have fixed this, so I don't get why you're suddenly arguing that this problem - one that you acknowledge is complex - has solutions right around the corner, and we just have to stop asking for games to ship without these issues in the meantime.

These 'solutions' are actually not right around the corner, precisely because of that complexity, but also because media attention is a factor in what actually determines the resources a company will devote to addressing a specific problem.

Yeah, sure, but programmable shaders are new (well, they're pretty damn old, but they're new relative to the inception of the pc,) and there are a huge set of conflicting costs and problems that overlapped to create the situation we're in now. Those games from decades ago had much smaller numbers of shader variants -- it's barely even the same problem at this point, unless you just happen to have a very predictable game or a very small number of shaders.

Yes, I've talked about this as well. Bear in mind CPUs were also much weaker then, but the point is that games like Call of Duty - even with a shader count likely far less than a modern UE4 game's - still shipped with a shader precompile option. I bring that up to show that this requirement of the architecture is not new.

The issue isn't that it's not being handled perfectly; it's that some games don't make an attempt to address it at all. If the problem is so insurmountable, then maybe the PC as a gaming platform deserves to die.

Before that day occurs, though, I think people are perfectly justified in pointing out that a product they paid for runs like shit, regardless of whether you feel they should just place their faith in the devs knowing about it and wipe technical critique from their reviews.
 
However, fundamentally, PC gaming is about slapping an amazing piece of graphics hardware into a machine designed around tasks like loading emails.

I realise these points have already been responded to, but I really had to respond to this. It's pretty crazy to suggest a modern high-end PC (even without a discrete GPU) is a "machine designed around loading emails". You don't need multiple GB of RAM, multi-core, multi-GHz CPUs and crazy-fast NVMe SSDs, all connected by state-of-the-art high-bandwidth interconnects, simply to "load emails".

I get that you're really trying to say that the PC is a machine designed first and foremost around generic workloads as opposed to being optimised for gaming, and that the email comment was just you being facetious, but even this doesn't hold water in light of modern consoles, which at a hardware level literally are PCs. They feature PC APUs with PC memory subsystems connected through PC interconnects. These are slightly customised PCs running custom OSes, APIs and firmware. And in the case of one of the major console vendors, even that software stack is a modification of what's found on the PC.

Shipping a game is never going to have the kind of high-touch polish on PC that the same amount of resources can produce on consoles. Despite this, somehow, PC gamers consistently show higher expectations and complain about more marginal issues than console players do.

Again, already addressed, but expecting games not to have shader compilation stutter is not "higher expectations". It's the expectation that a game running on your platform of choice doesn't have a potentially crippling issue when it is entirely fixable, as demonstrated by the majority of games out there.

But outside of shader stutter, yes - enthusiast-level PC gamers absolutely have higher expectations. Why the hell shouldn't they? They are paying for a premium experience on more expensive hardware that is capable of producing better results than the consoles. So they absolutely should have higher expectations, in exactly the same way a PS5 gamer has higher expectations than a PS4 gamer, or an XSX gamer has higher expectations than an XSS gamer.

When I said "more marginal issues" I was referring to the broad corpus of 'things gamers complain about', not this issue. You get those huge stutters because the platform, by design, does not guarantee a set hardware specification. The reasons for that are numerous and historical but ultimately come down to "PC as a platform is not intended for playing videogames, there's a different kind of computer which is meant for that"

I'm sorry, but this is rubbish. Gaming is pretty clearly one of the driving forces behind the progression of consumer-level PC hardware, which incidentally drives console hardware development. Modern gaming PCs, both at a hardware and a software level, fundamentally cater for gaming. The fact that they can also do lots of other things, and that they can be upgraded and expanded over time in a modular fashion as opposed to being a fixed hardware platform locked down at a software level to the gaming function only, does not mean that PCs are "not intended for playing video games". That's no more true than saying "PCs aren't intended for reading emails" because they are also very good at gaming.

I mean look dude, it's blatantly, obviously true that if AMD or Nvidia and a platform holder and the developers of their OS all coordinated they could ship a machine that doesn't have to compile shaders at runtime under any circumstances. That's happened several times. They're called game consoles.

Yes. For the last couple of generations that's pretty much what consoles have been. But just because you lock a PC's function and hardware down to a single configuration, why does that mean the unlocked version is not intended for gaming?

The need for shader compilation is not because PCs aren't intended for gaming. It's because they're upgradable. Being upgradable does not mean a platform is not intended, or not suitable, for gaming.
 
Well fortunately all those games on PC are free so everyone should just be happy they're on the platform and not ask for better ... oh wait a second... :rolleyes:
It's very easy not to buy a product that you don't think is good. I don't think the cost of one game is really an issue here for any of us on an enthusiast gaming forum.
The need for shader compilation is not because PCs aren't intended for gaming. It's because they're upgradable. Being upgradable does not mean a platform is not intended, or not suitable, for gaming.
I actually disagree on this point. PCs are modular because they serve a wide variety of business cases, need to be bought in bulk in various configurations, need to provide for extremely low-end users for certain important tasks (I think everyone has reacted to me mentioning email above, but this is kinda what I had in mind), and so on. PCs are not machines that have unified memory; guaranteed access to a full suite of modern GPU features; a design built around super-tight control of background tasks to avoid stealing resources from the current program (of course they're not! For many PCs in many environments the background tasks are just as, if not more, important than what is currently being run -- they could be IT software, bank databases syncing, whatever); a set, universal waterline of performance or feature access that developers can be sure there's a good financial ROI on targeting; and, yeah, fixed configurations you can pre-compile shaders for. All of these things make them *less fit* for running games -- no game is going to ship such that every user has the intended experience, or such that no situation results in a 100ms hitch, for any number of reasons.

That is not an excuse for not delivering a great experience to those players who do have the hardware, the knowledge, and the care to provide an environment where the game ought to be able to do just that -- a developer can use cutting-edge graphics APIs only available to some users, trust that nobody is running something silly in the background, write streaming and loading code that leverages the big surplus of VRAM and RAM this user spent thousands of dollars on instead of unified memory, and keep their shader variant count low, then carefully track the 99% of variants that actually matter and precompile those in a menu. However, while it isn't an excuse, it is something your boss will de-prioritize under other important tasks, because only 30% of your players are on PC, only 50% of them have that perfect gaming PC and the knowledge, and only 10% of those will complain.

So is an iPad. I'm not gaming on it vs my PC for a reason.
Sure, but lots of people are gaming on it! They have shader compilation issues in common with you, and a whole suite of other concerns (the OS is even less suited for games, they have terrible heat constraints, they have unusual graphics hardware with a whole new set of concerns, and so on!). I personally generalize them all as computers because they're all potential targets for multiplatform games that suck up development budget. Unity and UE4, of course, target them too.

You were the one who brought up that only a few games have fixed this, so I don't get why you're suddenly arguing that this problem - one that you acknowledge is complex - has solutions right around the corner, and we just have to stop asking for games to ship without these issues in the meantime.
I mean, I said "several years", 'around the corner' is semantics again. It's a set of problems with a clear cause and a clear path forward, but the path conflicts with a lot of other desired features and is expensive. My opinion is it's a matter of time. At a certain point chasing your tail trying to get a reasonable length list of variants to pre-compile in a menu turns into just keeping the variant count low as a matter of design.


*I edited a place where I said "unfit for running games" to say "less fit". Unfit was an exaggeration.
 
I mean, I said "several years", 'around the corner' is semantics again. It's a set of problems with a clear cause and a clear path forward, but the path conflicts with a lot of other desired features and is expensive. My opinion is it's a matter of time. At a certain point chasing your tail trying to get a reasonable length list of variants to pre-compile in a menu turns into just keeping the variant count low as a matter of design.

That's fair, but my point was that using UE4's PSO gathering process - while certainly cumbersome - is more of a band-aid for existing games with this problem today, one that should be expected as the bare minimum. It is certainly not the ideal endgame; in fact, this is precisely why I created my original thread about being concerned that this approach would run into a brick wall eventually. Now, good lord, I only wish 20-minute precompile times were the norm.

I mean, it is kind of funny how you're going off on how privileged PC gamers can be, seemingly prompted right after a post where I'm theorizing an approach that could be used as an imperfect stopgap, with the goal of providing some benefit to the end user while specifically considering how to involve the developer as little as possible. Is it workable? I have no idea, but the issue of all the moving parts that need to be involved in actually 'fixing' this has been discussed many times here, whether you consider it 'grumbling' or not.
 
That's fair, but my point was that using UE4's PSO gathering process - while certainly cumbersome - is more of a band-aid for existing games with this problem today, one that should be expected as the bare minimum. It is certainly not the ideal endgame; in fact, this is precisely why I created my original thread about being concerned that this approach would run into a brick wall eventually. Now, good lord, I only wish 20-minute precompile times were the norm.

I mean it is kind of funny how you're going off on how privileged PC gamers can be, which was seemingly prompted right after a post where I'm theorizing an approach that could be used as an imperfect stopgap with the goal of providing some benefit to the end user while specifically considering how to involve the developer by the least amount possible.
Sure -- I also admit that ue4 is a place where I don't have professional experience, so maybe I'm speaking a little too confidently about this process you've looked into, sorry. I will also say my original posts weren't targeted specifically at you, and while almost all of you have been very reasonable you don't have to look far in this conversation to find the kind of "privileged pc gamers" I'm complaining about.

My general impression of processes where you gather a list of shader variants to pre-compile is that they're a lot of manual work even if they provide 'automatic' tooling, that they can introduce bugs/QA pressure, and that they're only really feasible if you either have a smallish total number of variants or can produce a clear idea of what shaders will be present, both of which require some combination of luck and planning.
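One way to reason about how good a gathered list actually is: measure what fraction of the pipeline states a fresh playthrough touches were already in the list; the remainder is what still hitches. A hypothetical Python sketch (names and data are made up for illustration):

```python
def coverage(gathered, playthrough):
    """Fraction of distinct pipeline states used in a fresh playthrough
    that the gathered precompile list already covers. Anything not
    covered is compiled at draw time and hitches."""
    used = set(playthrough)
    if not used:
        return 1.0
    return len(used & set(gathered)) / len(used)

gathered = {"lit", "skinned", "particle", "ui"}
fresh_run = ["lit", "skinned", "decal", "lit", "water"]
# Covers lit and skinned out of {lit, skinned, decal, water} -> 0.5
```

A number like this is also roughly what a team would weigh against the QA cost of chasing the remaining variants.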

Also, like, one possible outcome is just "do a bad job guessing what shaders are there and accept you'll still have some stutter -- but there will be less -- if a user compiles". While you or I would probably tolerate a button in a menu somewhere that takes 25 minutes and reduces the frequency, but not the occurrence, of shader stutter, it's not such a clear-cut user-experience win that it's guaranteed to ship in a game where most of the users are not like any of us in this forum, and they've played other games like Call of Duty that do a very good job because of fundamental changes to their approach. Any decrease in certainty that users will actually like a change means a decrease in prioritization vs the 1000 other things a dev could be doing with their very limited time. For a small port-outsourcing studio, many of those thousand things probably do more to decrease the chances of running out of money in a matter of months.
 
So, to approach the problem from a completely different side... we should ask ourselves: why do we need shader compilation at all? I mean, the compilation happens when the driver changes, the GPU changes (even to one from the same GPU family), the game version changes, the OS version changes, or often even when the graphics settings change!

So many factors trigger the re-compilation stage that it's not even funny anymore. Why would I need to compile when I change from a 3080 to a 3090? Why would I need to compile when I change the settings from the High preset to the Low preset? Why would I need to compile when I change between two drivers that are a week apart?

Surfing the web, I discovered that this compilation is a form of JIT (just-in-time) process that optimizes the shader's code to better match your machine for the purpose of achieving higher performance. So in essence, the GPU driver/OS take every shader and morph it so that it better suits the capabilities of the machine to achieve higher fps!

If so, then the logical next step would be to offer the user a choice: pre-compile the shaders each time any of the triggering factors occurs, or don't compile the shaders at all and be content with lower performance (it can't be that much lower), but with a stable fps and no stutters.

In all cases, the developers should ship the game with as neutral (common) a version of the shaders as possible. For example: one neutral shader for recent AMD GPUs (RDNA/GCN), another for NVIDIA's recent GPUs (Ada/Ampere/Turing) -- all of these archs share a common structure anyway and aren't that drastically different. And when the user doesn't want to wait and chooses to run the game immediately, without the time cost of compilation, you let him run it without, even if it means lower fps (it can't be that significantly lower, can it?). If the user is knowledgeable enough, he can choose to compile the shaders to eke out every last drop of fps his system is capable of, but at the cost of waiting for dozens of minutes. It's all about freedom of choice, and by offering it, we tailor the experience to every user's comfort zone.
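On why all of those factors trigger recompilation: the compiled binary is only valid for the exact combination of inputs that influenced code generation, so caches are effectively keyed on all of them, and any change discards the cache. A hypothetical Python sketch of the idea (not any real driver's scheme; all names are illustrative):

```python
import hashlib

def cache_key(shader_source, gpu_model, driver_version, settings):
    """Hash together everything that affects the generated machine code.
    If any input changes, the key changes, and the old binary is unusable."""
    h = hashlib.sha256()
    for part in (shader_source, gpu_model, driver_version,
                 repr(sorted(settings.items()))):
        h.update(part.encode("utf-8"))
    return h.hexdigest()

k_base     = cache_key("ps_main", "RTX 3080", "531.29", {"preset": "high"})
k_new_gpu  = cache_key("ps_main", "RTX 3090", "531.29", {"preset": "high"})
k_settings = cache_key("ps_main", "RTX 3080", "531.29", {"preset": "low"})
# A new GPU or a changed preset yields a different key, so the
# previously compiled shaders are thrown away and compilation re-runs.
```

This is also why a "neutral" prebuilt shader is hard in practice: it would have to be valid machine code for every GPU/driver combination at once, which is exactly what the per-machine compile step exists to avoid.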
 
Sure -- I also admit that ue4 is a place where I don't have professional experience, so maybe I'm speaking a little too confidently about this process you've looked into, sorry. I will also say my original posts weren't targeted specifically at you, and while almost all of you have been very reasonable you don't have to look far in this conversation to find the kind of "privileged pc gamers" I'm complaining about.
Just say you're talking about me... I'm a big boy, I can handle it.

But yeah, demanding that a game work properly makes me a "privileged PC gamer", huh... That's asking too much of developers/publishers, right? Wow.

I've said it before and I'll say it again... when you buy a car, you expect it to work properly, right? It doesn't matter what the manufacturers had to do, or what it cost them, to make it run right... it's their job to make something that runs correctly... or they go out of business. You don't have to know how difficult it is to build a car, or understand the plight of the people who design and build them, to have the right to complain when it doesn't work properly... do you?

The simple fact of the matter is this: they're putting out a product on this device... and it had better damn well perform as decently as should be expected. That is the BARE MINIMUM of their obligation to consumers. If they can't do it... then bugger off.

Look at this... LOOK AT THIS DAMN VIDEO:

Is that acceptable to you? Should that EVER have been released in that state? And guess what... one week after that video was made... the game was completely fixed... This is NOT some insurmountable grand blight on PC gaming... it's developers/QA/management not doing their damn bare minimum.

Nobody is saying a game has to be 100% free of every single hitch... give me a break. But it simply CAN'T be like that video... and expecting that is NOT being privileged.
 
Why are y'all so mad and arguing all the time these days...

Yes, spoken as a console user: devs who want to release on PC have an obligation to make sure their game at least works satisfactorily, so people can play it and enjoy it.

Everyone gets the realities of game development and how hard it is, but releasing products in a state where they're stuttering every few seconds wouldn't be acceptable on any platform.
 
Why are y'all so mad and arguing all the time these days...

Yes, spoken as a console user: devs who want to release on PC have an obligation to make sure their game at least works satisfactorily, so people can play it and enjoy it.

Everyone gets the realities of game development and how hard it is, but releasing products in a state where they're stuttering every few seconds wouldn't be acceptable on any platform.
Exactly.. and that's all we're saying. I have no idea when that became expecting too much.

And developers get NO benefit of the doubt anymore... We've seen this issue resolved time and time again once enough of a stink was made about it and it started making them look bad.

Like honestly... Unreal Engine just looks BAD when nearly every game has this issue and releases like this. Alex even alluded to Epic only doing something about this issue because it's beginning to make the engine look bad.

If console fanboys get off on telling PC gamers that developers don't give a damn about them and their platform... well, they're not telling us anything we haven't already thought ourselves... but we're tired of it now. And now it's time to start throwing it back in their faces. This is CLEARLY a QA/management issue.
 
I only started hearing about shader compilation stutter recently for PC, and didn't even realize how bad it could be until I saw Alex's sackboy video. If that was the state Sony launched the game in on PlayStation they would have gotten hell for it.

It's honestly bewildering why I haven't heard more about it previously with UE4 games on PC considering how widespread the issue seems to be.

And it's twice as bewildering when I remember that high-end tech demo in 2012 for UE4, and how Tim Sweeney and Cliff Bleszinski were boasting that only the most powerful PCs could truly take advantage of it. But it seems like, in the middle of all that, they forgot to tailor their engine for the general PC user experience.

Hopefully UE5 is much better in that regard.
 
And developers get NO benefit of the doubt anymore...
Yeah, no kidding.

Shader stutter isn't good. It's reasonable for people like DF to criticize, and it's good to praise games that avoid it.

But this idea that, now that you all finally noticed it, any dev who ships a game with this issue isn't doing their job is preposterous. Some platforms have clear drawbacks. The possibility of mitigating those drawbacks by limiting the scope or content of your game, starting development over with a custom engine that takes advantage of recent cutting-edge innovations, or applying an "easy fix" that isn't that easy and only works well on certain games does not mean those drawbacks aren't real or that they're caused by lazy devs.

If you displayed this kind of attitude towards other platforms' limitations -- like, "this one game on the Switch runs at 4K/60fps; any game that can't deliver this shouldn't have been shipped", or "this mobile game doesn't degrade at all when the phone gets hot; anybody who works on a game that does should be fired" -- you'd be (I hope?) laughed off this forum. The tiny ounce of technical knowledge you picked up doesn't make this case any less silly.

"Poor developers!! Why wont somebody think of them?!" "And now it's time to start throwing it back in their faces. This is CLEARLY a QA/Management issue." listen to yourself!
 
Those things you mentioned are not at all like the huge stuttering in certain games, though. No one expects games to run at 4K 60 on any platform (hopefully), but especially not the Switch, and the situation with the phone game isn't applicable either.

If I'm getting what's going on here, PC gamers are essentially asking for PC games not to release in a messy state thanks to shader compilation stutter, which can be seriously obtrusive to the game experience. I think that's fully reasonable.
 
"Poor developers!! Why wont somebody think of them?!" "And now it's time to start throwing it back in their faces. This is CLEARLY a QA/Management issue." listen to yourself!
How about you watch that video I posted again...?

Pubs/devs releasing trash like that DESERVE to have it thrown back in their faces. Magically fixed one week later, too!

The problem here isn't my attitude... it's the attitude that the companies who release these products have towards their customers... that they expect to put out pure trash and that we just need to accept it.

You're part of the problem.
 
Yeah, no kidding.

Shader stutter isn't good. It's reasonable for people like DF to criticize, and it's good to praise games that avoid it.

But this idea that, now that you all finally noticed it, any dev who ships a game with this issue isn't doing their job is preposterous. Some platforms have clear drawbacks. The possibility of mitigating those drawbacks by limiting the scope or content of your game, starting development over with a custom engine that takes advantage of recent cutting-edge innovations, or applying an "easy fix" that isn't that easy and only works well on certain games does not mean those drawbacks aren't real or that they're caused by lazy devs.

If you displayed this kind of attitude towards other platforms' limitations -- like, "this one game on the Switch runs at 4K/60fps; any game that can't deliver this shouldn't have been shipped", or "this mobile game doesn't degrade at all when the phone gets hot; anybody who works on a game that does should be fired" -- you'd be (I hope?) laughed off this forum. The tiny ounce of technical knowledge you picked up doesn't make this case any less silly.

"Poor developers!! Why wont somebody think of them?!" "And now it's time to start throwing it back in their faces. This is CLEARLY a QA/Management issue." listen to yourself!
Wow, you are waaaaay off the mark.
 
I actually disagree on this point. PCs are modular because they serve a wide variety of business cases, need to be bought in bulk in various configurations, need to provide for extremely low-end users for certain important tasks (I think everyone has reacted to me mentioning email above, but this is kinda what I had in mind), and so on.

Yes, of course the modularity serves a whole host of purposes, but that doesn't mean that one of those purposes isn't gaming. Due to its modular nature you can indeed build a PC that's wholly unsuited to gaming. However, you can also build a PC that's extremely well suited to gaming -- not merely because it has a big powerful GPU in there, but because it has a gaming-orientated CPU, motherboard, SSD, memory etc. Just because the system can be customised for one task does not mean it's unsuited to another task when configured differently. That's the whole point of modular systems.

The premise of your argument seems to be that if it's not locked down to the single task of gaming, then it's not intended to be a gaming machine, which defeats the point of any modular system.

PCs are not machines that have unified memory,

Low end PCs using APUs absolutely do. And yet they are clearly less well suited to gaming than high end systems. Of course I understand that unified memory makes the developer's life simpler - which is great - and it's easier to screw up performance with discrete memory pools. But there are also clear performance advantages to discrete memory pools, in the form of higher overall bandwidth and capacity with no contention between the CPU and GPU.

In addition, PS3 and some earlier consoles did not have unified memory, so are we to say that those games consoles weren't intended for gaming?

guaranteed access to a full suite of modern gpu features

Of course they do if you have the right components. All you need is Windows 11 and a modern GPU and you have guaranteed access to the same suite of GPU features seen, for example, on the Xbox Series X - even to the extent of using largely the same API. I get that you're probably stating this from a developer point of view, because it's easier for you if the entire target market conforms to a single spec, but that's hardly the case even in console land, where you have to target Switch, PS4/XBO and current gen consoles. It's a fact of life that devs will always need to cater for multiple target platforms with different feature sets. PCs expand that somewhat (depending on how widely the net is cast), but that doesn't make the platform less fit or unintended for gaming - it's simply a side effect of the upgradable nature of the platform.

That may make developers' lives more difficult, but hey, almost all games get released on PC these days, so from a commercial point of view it's obviously worth it. And from an end user point of view, there's no reason why a gamer whose machine has a "full suite of modern gpu features" should be told their machine is less fit for gaming than a potentially less capable and less feature-rich console.

, are designed around super tight control of background tasks to avoid stealing resources from the current program (of course they're not! For many PCs in many environments the background tasks are just as important as, if not more important than, what is currently being run -- they could be IT software, bank databases syncing, whatever)

Why does this mean they aren't intended for gaming? If you want to run a bank database on your gaming PC while playing a game (lol), then go right ahead. Your performance will be lower, perhaps unplayable even, but that's a user choice, not something that has been forced upon you because the system itself is unsuitable for gaming. That's like saying a Lamborghini is unsuitable for racing because in theory I can choose to fit a tow bar and tow a caravan with it during a race, and it would therefore perform worse than a car without a tow bar.

PCs absolutely require a greater degree of user knowledge to perform optimally while gaming, but as with many, many pieces of technology, just because they require user knowledge to operate does not mean they are unsuited to the task, or not intended to perform it.

have a set, universal waterline of performance or access to features that developers can be sure there's a good financial ROI on targeting, and, yeah, don't have fixed configurations you can pre-compile shaders for. All of these things make them *less fit* for running games -- no game is going to ship such that every user has the intended experience, or such that no situation will result in a 100ms hitch, for any number of reasons.

That's the whole reason for scalability in games. You seem to be operating from the position that users *should* be locked down to a specific hardware configuration and specific developer-selected settings, and that only that makes for a genuine/fit gaming system. But that's totally in opposition to a very large section of the market that doesn't want to play games like that. And PCs cater to that section of the market. Just because that section of the market can be trickier to accommodate from a games developer's point of view does not make it any less valid a way to play games, and by extension does not make the platform that enables games to be played that way any less of a legitimate, or capable, gaming platform.

Case in point: PCs can and do offer significantly improved gaming experiences over consoles in the vast majority of games, given the correct hardware. So how can that be the case for a system less fit or unsuited for running games?

That is not an excuse for not delivering a great experience to those players who do have the hardware, the knowledge, and the care to provide an environment where the game ought to be able to do just that. A developer can use cutting-edge graphics APIs only available to some users, trust that nobody is running something silly in the background, write streaming and loading code that leverages the big surplus of VRAM and RAM this user spent thousands of dollars on instead of unified memory, keep their shader variant count low, carefully track the 99% of variants that actually matter, and precompile those in a menu. However, while it isn't an excuse, it is something your boss will de-prioritize beneath other important tasks, because only 30% of your players are on PC, only 50% of them have that perfect gaming PC and the knowledge, and only 10% of those will complain.
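The "track the variants that matter and precompile them in a menu" approach can be sketched roughly like this -- not any real engine's API, just a toy cache showing why warming it up front avoids mid-gameplay hitches, plus the audience arithmetic from above:

```python
# Toy model, not a real graphics API: contrast compiling a shader variant
# the first time it is drawn (a mid-gameplay hitch) with precompiling the
# variants you tracked as important, e.g. behind a loading screen or menu.

class ShaderCache:
    def __init__(self):
        self.compiled = set()
        self.hitches = 0  # frames where a compile blocked rendering

    def compile(self, variant):
        self.compiled.add(variant)

    def draw(self, variant):
        if variant not in self.compiled:
            self.hitches += 1  # compile stalls this frame
            self.compile(variant)

frame_draws = ["opaque", "skinned", "foliage", "opaque", "skinned"]

# On-demand: every new variant hitches exactly once, during gameplay.
on_demand = ShaderCache()
for v in frame_draws:
    on_demand.draw(v)

# Precompiled: warm the cache with the tracked variants before gameplay.
warmed = ShaderCache()
for v in ["opaque", "skinned", "foliage"]:
    warmed.compile(v)
for v in frame_draws:
    warmed.draw(v)

print(on_demand.hitches, warmed.hitches)  # 3 0

# And the prioritization math: 30% on PC * 50% ideal setup * 10% who
# complain means roughly 1.5% of your player base files the complaint.
complainers = 0.30 * 0.50 * 0.10
print(f"{complainers:.3f}")  # 0.015
```

The real versions of this (e.g. pipeline caches in modern APIs) are far more involved, but the trade-off is the same: the precompile pass costs up-front time and only covers the variants you knew to track.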

Essentially, you're saying here that a platform which has more development challenges on account of its open and scalable nature is less well suited for, and not intended for, gaming, despite its capability to provide a better, more flexible and more customizable gaming experience for the end user -- which is exactly what a large segment of gamers want. I'd have to disagree. You're absolutely correct that consoles are preferable to PCs as a target platform from a developer point of view. But it doesn't follow that from an end user point of view they are less fit, or outright unintended, to play games. And since we're talking about end user expectations here, that's the perspective that should matter. After all, supply is there to serve demand, not the other way around.
 
People - if you use RivaTuner Statistics Server to cap your frame rate, it can reduce or eliminate the stuttering.

It works so well that it constantly has me questioning why driver-level Vsync or the game's own Vsync doesn't work anywhere near as well.
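One plausible reason an external limiter feels smoother is that it paces every frame onto a fixed interval grid, so a fast frame waits for its slot instead of being presented early and making the next one feel late. A minimal sketch of that idea (this is an illustration of interval-based pacing in general, not how RTSS is actually implemented):

```python
# Sketch of fixed-interval frame pacing, roughly what a frame-rate cap does:
# frame n is presented no earlier than n * interval after the start, so
# occasional fast frames wait for their slot rather than bunching up.

def wait_time(finish_time: float, frame_index: int, interval: float) -> float:
    """Seconds to wait so the frame is presented on its scheduled slot.

    finish_time: seconds since pacing started when the frame finished rendering
    frame_index: which slot this frame targets (0, 1, 2, ...)
    interval:    target frame time, e.g. 1/60 for a 60 fps cap
    """
    target = frame_index * interval
    return max(0.0, target - finish_time)

interval = 1.0 / 60.0  # 60 fps cap -> ~16.67 ms slots

# A frame that finished at 20 ms waits until frame 2's slot at ~33.3 ms...
print(round(wait_time(0.020, 2, interval) * 1000, 1))  # 13.3 (ms)
# ...while a frame that finished late (40 ms) is presented immediately.
print(wait_time(0.040, 2, interval))  # 0.0
```

With uncapped Vsync the GPU can render ahead and queue frames, so delivery cadence drifts even when throughput is fine; clamping every frame to the grid trades a little latency for consistency.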
 
When I say "terrible platform" I don't imply I have zero interest in it -- I have a 3080, I play pc games, etc. However, fundamentally, PC gaming is about slapping an amazing piece of graphics hardware into a machine designed around tasks like loading emails.
This is a view stuck in the past. By definition, a computer can do anything; it's the software that enables it, not the hardware, and PCs are not designed around loading emails. In Ye Olden Times, hardware was limited, so you had bespoke machines better suited to playing games than running office apps, and vice versa. The PC used to be that: a work machine* that people played games on. That legacy made playing games on PC a bit awkward and inefficient.

Things have moved on a lot though. If you were to design a system for the purposes of playing games, a PC does not miss much due to its legacy. It could be more efficient, but then perhaps so could the consoles which have made hardware sacrifices in favour of supporting easier software development. The end result is the consoles are using hardware that's basically a match for PCs, showing it's only the software that's making the difference.

The shader compilation issue on PC is not because it's a poor platform for gaming, but because it is an open platform for hardware. If there was only one configuration of Windows PC released every 7 years, it'd have the same shader compilation situation as consoles.

* Although of course there were other, non-IBM-clone personal computers that were far more about the games, from the Spectrum and C64 to the Atari ST and Commodore Amiga, some even including hardware for the benefit of running games better. Generally, hardware that improves gaming also improves media productivity, and it isn't simply a case of a piece of hardware being 'gaming' or 'work' oriented.
 