Digital Foundry Article Technical Discussion [2023]

At this point in time there isn't much reason to buy a PS4/Pro/Xbox One/One X with the Series S out there. You might have the occasional game that performs slightly better on the Pro or One X, but going forward there won't be new games for them. You also lose out on the NVMe drive in the S.

If you have a PS4/Xbox One, the Series S is the way to go for a low-price upgrade, unless you really want the best possible performance or Sony exclusives.

If you have a PS4 Pro/One X you might want to buy the PS5 or Series X, but if you want to save money the Series S is a good pickup.


The closer to $200 the S stays, the better a deal it is. I think this will become even more evident when we get redesign shrinks on 5 nm; the Series S is going to start getting really small.
I think the extra $100 for a PS5 digital edition is well worth it.
 
I think the extra $100 for a PS5 digital edition is well worth it.
This will be a topic that ends badly.
All I'll say is that many people obviously disagree considering how well it's selling.
And I don't think it's only because of availability either.
 
This will be a topic that ends badly.
All I'll say is that many people obviously disagree considering how well it's selling.
And I don't think it's only because of availability either.
Do we have any info on how well Series S is selling? 33% more money for over double the performance seems like a no-brainer to me. If you plan on Game Pass, I suppose that would be the only decision-changer.
 
Do we have any info on how well Series S is selling? 33% more money for over double the performance seems like a no-brainer to me. If you plan on Game Pass, I suppose that would be the only decision-changer.

There are a lot of people out there who have brains, but are hanging on by their fingertips financially. 33% more on a toy is a lot for some folks. Especially when the games and the gameplay are the same.

But there's a bit more to it than that. The PS5 Digital is sold at a loss and is intentionally produced in limited numbers.

In the UK over Christmas, Series S was available for £200, while PS5 Digital was going for well over £300. That's one hell of a difference for machines playing the same games *if* you're not overly concerned with graphical fidelity.
 
I'm still not sure I follow. What you're describing above seems to be the more general ability of consoles to extract more performance from a given set of hardware due to their fixed nature, which is fair enough. Translated to a real-world example, we could imagine a shadow setting which on medium looks worse than the console's, while on high it looks better. But medium has the same cost on PC as the better-looking setting does on console, while high has, say, a 50% higher cost. I think that's what you're describing above, which is fair enough, but then surely there is still a place for a setting between the two that is as good as the console setting but only costs, say, 20% more performance on the PC? That way people can get at least console-level quality for the minimum possible performance outlay.

I'm struggling to think of a scenario like the one above where you have settings both above and below the consoles where something in the middle that matches the console doesn't offer a compromise between the two.

I can at least envision a scenario though where for example a specific sub component of an effect is more costly on PC and thus the effect at any quality level will be more costly, and therefore on PC the developer chooses to increase the quality level of a different sub component of that effect because it's effectively free. Therefore you have what appears to be a very costly effect on PC that looks better than the console version but doesn't scale down. A crude analogy of that might be RT on a slow CPU with a 40xx GPU, where BVH creation, and thus overall performance, is slower than the console, so RT resolution is ramped up beyond the console's. But lowering RT resolution in that scenario gains you nothing, and so the option isn't presented to the user.

His findings don't really match the comparisons DF has done through the years, at least. It's possible to match console settings closely enough not to see much of a difference (DF/Alex optimized settings) with ballpark the same performance on PC hardware of a class matching the consoles.
 
This will be a topic that ends badly.
All I'll say is that many people obviously disagree considering how well it's selling.
And I don't think it's only because of availability either.
What does this mean? The Series S is a great bargain buy, but I don't think sales say anything about the PS5 not being worth its price.
 
This is why I'm genuinely inquiring how you would have phrased Alex's critique in that section - like I'm not asking for a form letter we can all spam MS with, but instead of saying "Games should provide all resolutions and refresh rates", something like...?
Games should provide good performance and resolution scaling options. Doing this via monitor mode changes is pure legacy at this point.

And indeed the OS could virtualize this to some extent especially for older games. That is - after all - exactly what the various IHV driver control panel overrides and settings do in a crude way.
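
As a purely illustrative sketch of that last point (every name and value below is hypothetical, not any real engine's or driver's API): the idea is that the game keeps its swapchain at the desktop's native mode and exposes resolution and frame-rate scaling as ordinary in-game options, so no legacy monitor mode change is ever involved.

```python
# Purely illustrative sketch (all names and values hypothetical):
# keep the presentation at the desktop's native mode and expose scaling
# as in-game options, instead of switching monitor modes.
from dataclasses import dataclass
from typing import Optional


@dataclass
class DisplayMode:
    width: int
    height: int
    refresh_hz: int


@dataclass
class ScalingOptions:
    render_scale: float        # fraction of native resolution, e.g. 0.5-2.0
    fps_cap: Optional[int]     # None = uncapped / left to VRR


def internal_resolution(native: DisplayMode, opts: ScalingOptions):
    # Only the internal render target changes size; the game always
    # presents at the native mode and upscales to it.
    return (round(native.width * opts.render_scale),
            round(native.height * opts.render_scale))


native = DisplayMode(3840, 2160, 120)
opts = ScalingOptions(render_scale=0.67, fps_cap=60)
print(internal_resolution(native, opts))   # (2573, 1447)
```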
 
I can at least envision a scenario though where for example a specific sub component of an effect is more costly on PC and thus the effect at any quality level will be more costly, and therefore on PC the developer chooses to increase the quality level of a different sub component of that effect because it's effectively free.
This is precisely what I'm talking about. In almost all cases games already provide quality settings that span both above and below the consoles. But coming up with "optimal" settings is always coming up with a tradeoff of the best use of performance across *all* of the various quality knobs... they are not really independent in practice. Thus if something is far more expensive on one platform it is likely to be turned down/disabled in favor of something that is relatively cheaper but provides a bigger visual impact.

These sorts of situations are actually fairly common, and thus while it's intellectually interesting to ask for "console settings" where it makes sense, IMO it's more important to get presets that are optimized *for PCs* on PCs. Of course this is an imperfect process in and of itself because PC configurations vary widely, but there are definitely systematic cases where something will be worse or better on all PCs just due to the nature of the available code paths on each platform.
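
To make that tradeoff concrete, here is a minimal, entirely hypothetical sketch (made-up setting names, costs and impact numbers): a preset is one joint choice across many knobs, and which knob "wins" depends on each platform's relative costs.

```python
# Hypothetical sketch: build a preset by greedily spending a frame budget
# on whichever quality upgrade buys the most visual impact per millisecond
# *on this platform*. All numbers and setting names are invented.
FRAME_BUDGET_MS = 16.7


def build_preset(costs):
    """costs: (setting, level) -> (visual_impact, cost_ms) for one platform."""
    preset = {}
    spent = 0.0
    candidates = sorted(costs.items(),
                        key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
    for (setting, level), (impact, cost_ms) in candidates:
        if setting not in preset and spent + cost_ms <= FRAME_BUDGET_MS:
            preset[setting] = level
            spent += cost_ms
    return preset


# On this imaginary PC code path, RT object distance is expensive while
# RT resolution is cheap, so the "optimal" preset lands on a different
# combination than a console might.
pc_costs = {
    ("rt_object_distance", "10"): (3.0, 6.0),
    ("rt_object_distance", "8"):  (2.5, 3.5),
    ("rt_resolution", "full"):    (2.0, 2.0),
    ("shadows", "high"):          (1.5, 3.0),
}
print(build_preset(pc_costs))
# -> {'rt_resolution': 'full', 'rt_object_distance': '8', 'shadows': 'high'}
```

Run the same routine against a cost table where the relative prices differ and it picks a different combination, which is exactly why a preset optimized for PC code paths can legitimately diverge from the console's settings.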
 
I think the extra $100 for a PS5 digital edition is well worth it.
Yeah, but for an extra $100 I think the Xbox Series X is well worth it :runaway:

But regardless, the Xbox Series S has been on sale for as low as $200-$225 in the States, so sometimes it's half the price of the PS5 Digital. For some people that's a lot of money and for some people it isn't. A family with multiple kids could walk away with a Series S for each of two kids, or one PS5 that they have to share and fight over. $150-$200 is a bunch of games, or what, over a year of Game Pass Ultimate?

Some people might have a base PS4 or Xbox One/One S, may be casual gamers who dropped $200-$250 on that system, and don't want to spend more than that on a new one.

So there are value props to be made everywhere.


I think that with a real price drop to $250, and All Access moving to, say, $20 a month for two years, it could become really popular in the current economy.
 
I think the extra $100 for a PS5 digital edition is well worth it.

That may be true (it's subjective) but for people on a budget because they need to be on a budget, budgets don't work like that.

If you have a budget that allows for X amount spent over Y amount of time, you do not go over that at all even if you could get A thing that is 5x better than B thing for only 20% more. Your budget says you can afford B thing but not A thing so B thing is what you are getting regardless of how good A thing is as long as B thing still does the job required. And if B thing doesn't? Well, you still can't afford A thing, so you don't get either A or B thing in that case.

Regards,
SB
 
Do we have any info on how well Series S is selling?
How well it's selling, and that it's also bringing in a lot of people who've never owned an Xbox before, is based on comments by Phil, earnings calls, whenever there are sales breakdowns, etc.
Pretty well documented.
What does this mean? The Series S is a great bargain buy, but I don't think sales say anything about the PS5 not being worth its price.
I never said either of the PS5s (or any console) isn't worth the price.
I'm saying he may not think the XSS is worth it, but a lot of people do based on how it's selling.
 
How well it's selling, and that it's also bringing in a lot of people who've never owned an Xbox before, is based on comments by Phil, earnings calls, whenever there are sales breakdowns, etc.
Pretty well documented.
I never said either of the PS5s (or any console) isn't worth the price.
I'm saying he may not think the XSS is worth it, but a lot of people do based on how it's selling.
Sorry I misunderstood your post 😂

I don't really see how anyone can view the Series S as anything but an attractive purchase, tbh. It's mandated to run every game for the next half decade or more, at almost half the price of the Series X. I could buy one right now for $260 and it's not even on sale.
 
This is precisely what I'm talking about. In almost all cases games already provide quality settings that span both above and below the consoles. But coming up with "optimal" settings is always coming up with a tradeoff of the best use of performance across *all* of the various quality knobs... they are not really independent in practice. Thus if something is far more expensive on one platform it is likely to be turned down/disabled in favor of something that is relatively cheaper but provides a bigger visual impact.

These sorts of situations are actually fairly common, and thus while it's intellectually interesting to ask for "console settings" where it makes sense, IMO it's more important to get presets that are optimized *for PCs* on PCs. Of course this is an imperfect process in and of itself because PC configurations vary widely, but there are definitely systematic cases where something will be worse or better on all PCs just due to the nature of the available code paths on each platform.

Thanks for clarifying, I understand exactly where you're coming from now. It's funny, as I was thinking this over offline before I saw this post, and it occurred to me that you're talking about presets as opposed to individual settings (at least for the most part), and yes, I agree the argument around presets makes sense.

To give a real world example of this we can look at Spiderman on the PC. The RT object distance is very costly on PC vs console and while the PC does match the console setting (10) it doesn't go any higher. Meanwhile, things like RT resolution are much cheaper on many PCs compared with the consoles. Therefore, if you had a "console" preset you would end up in the odd situation of having a preset that looked worse than "Very High" but also performed worse than "Very High", which might set every setting to very high but RT object distance to 8 (hypothetical example).

I think in this situation you could still have a console preset that just advised the user that "some users may experience degraded visuals and/or performance in this mode vs the more optimised PC presets". However, I concede that the utility of such a preset would be pretty much limited to academic purposes, and although I would certainly appreciate that, I'm probably in a very small minority here.

However, where I still think we absolutely should have console equivalents is in the individual settings themselves (and even better if they're labelled as such). Taking Spiderman again, the object detail goes all the way up to 10 despite being very expensive on PC, which is an absolute must regardless of the cost of an effect IMO - it's unacceptable for a game on PC to be unable to at least match the visual fidelity of the console version, even if it's more expensive to achieve. However, they also have things like crowd density, where the console setting is higher than low but lower than medium. I'd prefer to see a setting in between the two which exactly matches the console one and presumably gives what, on the console dev side at least, was considered the best performance bang for buck for that particular effect. With that level of effect it's pretty insignificant, granted, but then you have something like the Witcher 3, where PCs are forced into a much higher, and much more expensive, quality setting than the consoles with no option to turn things down, which simply makes no sense.

EDIT: also, there is a benefit to this from the developer's point of view I think, as there are plenty of channels out there that like to compare PC and console performance in specific games. If you make it easy for them to do that, you're more likely to get screentime from them and more likely to get people talking about the game - free publicity.
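
A hypothetical sketch of the individual-settings suggestion above (the labels and density values are invented, not Spiderman's actual tuning): expose the console-matched value as its own labeled step in a setting's range, so console quality is always reachable from the menu.

```python
# Hypothetical sketch (invented values): the console-matched value gets
# its own labeled step between "Low" and "Medium".
CROWD_DENSITY = {
    "Low":     0.35,
    "Console": 0.50,   # the value the console build is assumed to ship with
    "Medium":  0.65,
    "High":    1.00,
}


def resolve_crowd_density(choice: str) -> float:
    # Unknown labels (e.g. from an old config file) fall back to the
    # console-matched value rather than an arbitrary extreme.
    return CROWD_DENSITY.get(choice, CROWD_DENSITY["Console"])


print(resolve_crowd_density("Console"))   # 0.5
print(resolve_crowd_density("Potato"))    # 0.5 (fallback)
```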
 
To give a real world example of this we can look at Spiderman on the PC. The RT object distance is very costly on PC vs console and while the PC does match the console setting (10) it doesn't go any higher.
10 on console is only for the Fidelity setting, which maxes at 40 fps - if you are willing to do 40 fps on PC, then it is very "cheap"! The performance mode (targeting 60 fps+) is between 7 and 8 on PC.

@Andrew Lauritzen
Regarding "console settings" - I think you are missing my point in the video by focusing on the concept that single system knowledge allows for more specific optimisation with regard to overlapping for async compute, etc. That is not what I am talking about in the video.

What I am talking about in the video are settings in the menu which expose the levels of samples, quality, etc. for a specific effect, with an appropriate name, which tend to be the best visuals for performance on the graphics hardware out there *in general*. Consoles tend to use that setting. This is essentially the "bare minimum" art quality that the game has been deemed able to have on consoles. And 90% of the time, excluding things like anisotropic filtering or CPU-limited scenarios where inherent differences between platforms come to bear, this applies universally to all other GPUs and not just the console.

Having reviewed literally hundreds of games and gotten the "console settings" from those games that match the console in terms of visual output, it is often really obvious that the setting with the best visual return in terms of samples per pixel etc. for the most reasonable performance cost, even on completely different PC GPU hardware, is indeed the one we find in use on console. But on PC it is instead given some arbitrary name like "Low" or "High", or maybe it does not exist at all because it is some "in-between" setting that is rather dumbly missing.

Yes - there are inherent differences with regard to optimising for a single platform, for things like async compute overlapping, the usage of anisotropic filtering, or which CPU-related settings are chosen to ensure some arbitrary frame-rate target, but that is not at all what I am talking about in the video. I am talking about a clear labeling scheme: the samples per pixel etc. that the art team deems reasonable in terms of performance to hit the game's artistic vision should be labeled as "console" or "original" settings. It just so happens that these "original" or "console" settings generally turn out to be the visual/performance sweet spots on nearly every GPU out there.

A really good example of this is in the video, but not explicit. On PC, in the Witcher 3 Complete edition there is only one quality level for DDGI available. That preset of "on" is of way higher quality than what is available on console. Why? Why is the "console" setting of lower quality not available on PC? Surely the artists and tech teams thought that the console's lower quality setting was "good enough" for visuals and performance to ship it there. Why is the same not afforded to lower end GPU owners on PC?

Because of the lack of proper naming in the menu and of offering the lower "artistic minimum" setting that the consoles use, people either resort to not turning DDGI on at all, or worse, modding the game to get a setting similar to the one the consoles use. It is the developer's job to provide a useful plurality of settings for a game, not the user's.
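
As a minimal sketch of that labeling idea (the DDGI parameters below are invented for illustration, not CDPR's actual values): each effect exposes an explicit "Original (console)" level equal to the shipped console tuning, rather than a single all-or-nothing "on" that only exists at a higher quality.

```python
# Minimal sketch of the labeling scheme above; all parameter values are
# invented and only illustrate the structure of the menu, not real tuning.
from dataclasses import dataclass


@dataclass(frozen=True)
class GILevel:
    label: str
    probe_spacing_m: float   # hypothetical DDGI knobs
    rays_per_probe: int


DDGI_LEVELS = [
    GILevel("Off",                0.0,   0),
    GILevel("Original (console)", 2.0, 128),   # the "artistic minimum" assumed to ship on console
    GILevel("High",               1.0, 256),   # the only non-off level the PC port exposes today
]

for lv in DDGI_LEVELS:
    print(f"{lv.label:>18}: probe spacing {lv.probe_spacing_m} m, "
          f"{lv.rays_per_probe} rays/probe")
```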
 
Alex hit the nail on the head; the most recent example of a game needing a console-equivalent setting is the Witcher 3 ray tracing update.

That game would have really benefitted from a 'low' RT option that offered the same reduced ray count and BVH distance used by consoles and then a 'high' option for those with the extra GPU power.
 
Yup, I'm playing the game with the "optimized ray tracing" mode and it still ENHANCES the image quality a lot while bringing huge uplifts in performance. I don't care if the higher settings look better; I just want settings tailored for my hardware. I could understand the sentiment if the said optimized settings weren't worth the hassle in terms of image quality improvement, but they are.

"Low", in most cases NOWADAYS, has become a semantics trap laid out by developers to lure people into thinking they need better and more expensive hardware to play games. Let's be real here; if something is labeled low, it must look hideous/bad quality. Yet most modern games do not even look "that" much different at such low settings, which is funny. Back then, I mean pre-2010, low settings really meant "low": graphics would go haywire. I remember setting GTA IV to low and watching it turn into something completely different, hideous and ugly. Nowadays I set something like GOTG or RDR2 to low EXCEPT texture settings, and the game looks... 70% the same. Oh, a couple of trees and grasses are not there; oh, that shadow is just a bit more jagged. Nowadays the steps between low and ultra are too granular. Back then, low meant really low: shadows would be gone, ACTUAL performance gains were there, stuff looked ugly, draw distance would be limited to 3 meters in front of you, etc. Nowadays one would expect RDR2's draw distance to go haywire at low, but no, all the mountains, trees etc. present themselves in all their glory even at the lowest preset, bar texture quality (crucial).

Nowadays, most settings are minuscule changes that you need an 800% zoom to find differences in. This is why I personally think the term "low" should be used sparingly. Call it "original", "normal", or at least "medium" or something. Low makes people think it is a bad thing. For example, "low" crowd density in Spider-Man makes no sense, as the streets are still full of NPCs. I think I can add one more to Alex's list: name the settings PROPERLY. If you call a setting LOW, it MUST adhere to the semantics of that word. It must bring the ramifications it represents. Same goes the other way; sometimes ultra is just a buzzword in certain games.

I'm saying this not because I dislike such settings. It's just that people who NEED to use those settings cannot bring themselves to use them due to the naming scheme. They automatically think of low settings as hideous things that should not be touched, despite having low-tier or mediocre hardware. I cannot blame them.

God of War proves my theory on this. My friend happily played that game on the "original" preset with great performance. He has a 1070, mind you, and is an addict of "high/ultra" settings, often cursing at games for bad performance on such settings. The thought of using medium settings is a sin to him. Even though I urged him to turn certain settings down to medium or low in RDR2 for great performance gains, he couldn't bring himself to do so and forced himself to play at 45 FPS, even though 60 FPS would have been attainable with the optimized settings. Yet the term "original" was enough of a safe haven for him to consider those settings playable. See? Psychology is in play here.

I agree with them too. "Low" and "medium" carry some serious connotations, at least to the regular end user, especially if said user experienced what low/medium meant back in the 2000s. I was able to enlighten some of my friends, and I owe this to DF's analysis of optimized settings. Seeing what these so-called "ultra" and "medium" settings do in close-ups was a great help for some of them.

But sadly, they can only reach so many people. I think the underlying mentality behind naming schemes has to change. Calling reasonable medium settings "Normal" would be a great way of doing it. I've seen some games do this, and people were happy with such settings. You can always call the higher tiers ultra, extreme, awesome, etc. Just call the reasonable settings normal, original, or something that sounds positive rather than bad. Neither low nor medium sounds positive. They sound negative to most users. They sound like "hehe, you HAVE to use this hideous setting to get playable performance, you cannot play high, hehe".
 
Yup, I'm playing the game with the "optimized ray tracing" mode and it still ENHANCES the image quality a lot while bringing huge uplifts in performance. I don't care if the higher settings look better; I just want settings tailored for my hardware. I could understand the sentiment if the said optimized settings weren't worth the hassle in terms of image quality improvement, but they are.

"Low", in most cases NOWADAYS, has become a semantics trap laid out by developers to lure people into thinking they need better and more expensive hardware to play games. Let's be real here; if something is labeled low, it must look hideous/bad quality. Yet most modern games do not even look "that" much different at such low settings, which is funny. Back then, I mean pre-2010, low settings really meant "low": graphics would go haywire. I remember setting GTA IV to low and watching it turn into something completely different, hideous and ugly. Nowadays I set something like GOTG or RDR2 to low EXCEPT texture settings, and the game looks... 70% the same. Oh, a couple of trees and grasses are not there; oh, that shadow is just a bit more jagged. Nowadays the steps between low and ultra are too granular. Back then, low meant really low: shadows would be gone, ACTUAL performance gains were there, stuff looked ugly, draw distance would be limited to 3 meters in front of you, etc. Nowadays one would expect RDR2's draw distance to go haywire at low, but no, all the mountains, trees etc. present themselves in all their glory even at the lowest preset, bar texture quality (crucial).

Nowadays, most settings are minuscule changes that you need an 800% zoom to find differences in. This is why I personally think the term "low" should be used sparingly. Call it "original", "normal", or at least "medium" or something. Low makes people think it is a bad thing. For example, "low" crowd density in Spider-Man makes no sense, as the streets are still full of NPCs. I think I can add one more to Alex's list: name the settings PROPERLY. If you call a setting LOW, it MUST adhere to the semantics of that word. It must bring the ramifications it represents. Same goes the other way; sometimes ultra is just a buzzword in certain games.

I'm saying this not because I dislike such settings. It's just that people who NEED to use those settings cannot bring themselves to use them due to the naming scheme. They automatically think of low settings as hideous things that should not be touched, despite having low-tier or mediocre hardware. I cannot blame them.

God of War proves my theory on this. My friend happily played that game on the "original" preset with great performance. He has a 1070, mind you, and is an addict of "high/ultra" settings, often cursing at games for bad performance on such settings. The thought of using medium settings is a sin to him. Even though I urged him to turn certain settings down to medium or low in RDR2 for great performance gains, he couldn't bring himself to do so and forced himself to play at 45 FPS, even though 60 FPS would have been attainable with the optimized settings. Yet the term "original" was enough of a safe haven for him to consider those settings playable. See? Psychology is in play here.

I agree with them too. "Low" and "medium" carry some serious connotations, at least to the regular end user, especially if said user experienced what low/medium meant back in the 2000s. I was able to enlighten some of my friends, and I owe this to DF's analysis of optimized settings. Seeing what these so-called "ultra" and "medium" settings do in close-ups was a great help for some of them.

But sadly, they can only reach so many people. I think the underlying mentality behind naming schemes has to change. Calling reasonable medium settings "Normal" would be a great way of doing it. I've seen some games do this, and people were happy with such settings. You can always call the higher tiers ultra, extreme, awesome, etc. Just call the reasonable settings normal, original, or something that sounds positive rather than bad. Neither low nor medium sounds positive. They sound negative to most users. They sound like "hehe, you HAVE to use this hideous setting to get playable performance, you cannot play high, hehe".
Thousands of games do that. In FIFA 23, going from Ultra to Low the difference is not that big - in fact the game is CPU limited; at Ultra it maxes out at 115-119 fps on my computer, and so it does at Low settings. Playing with optimised settings in Elden Ring doesn't make much of a difference compared with having everything set to Maximum. The most noticeable difference to me is the grass density, and even that is not easy to notice.

Details like what you mention, plus Microsoft having a standard "Windows gaming hardware seal of quality 2022-2023" for hardware ready for any modern game of a certain time span, would make people's lives easier, but MS has been sucking at PC gaming when they had a gold mine there...

The only thing they've done in that sense as of late is adding extra info on games telling you whether your computer is up to the task with certain titles, but I'm not seeing that nowadays.

That wouldn't detract from the good things about PC gaming, like openness, mods, "perfect" emulation, and thousands of games that would never be published on consoles because of regulations and such.
 