Digital Foundry Article Technical Discussion [2023]

How would that work in practice? As an end user I may want different games to render at different resolutions. E.g. in older games I may want 4xDSR. Also how would it correlate to render resolution given more powerful hardware can aim higher. It seems this would require universal dynamic resolution and universal dynamic upscaling at the OS and api level adopted by all games.
The same way it already works today, just with the UI in the OS not in the application. Many monitors don't actually support the modes you are setting (especially LCDs); they are already getting a dumb rescale from Windows, the GPU display hardware or in some cases the monitor's rescaling. These are almost always just simple bilinear resamples so it's not as if the quality target is high. No need for any API to the game - the swap chain API is already present and sufficient.

The point is implementing this in each and every game is redundant and inconsistent. Better to have a consistent general interface at the OS level for the rare cases where you need actual mode changes. Worth noting a very similar thing has happened for "exclusive" sound output/mixing and it's past time we do the same for display.
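To make the "swap chain API is already sufficient" point concrete, here's a rough sketch of what that path looks like today (illustrative D3D11/DXGI only, not any particular engine's code): the game creates its swap chain buffers at whatever render resolution it wants, asks for stretch scaling, and the flip-model presentation path / compositor handles the rescale to the window. No display mode change is requested anywhere.

```cpp
// Sketch: render at a lower resolution and let the presentation path scale it.
// Assumes an existing Win32 window (hwnd); link against d3d11.lib and dxgi.lib.
#include <d3d11.h>
#include <dxgi1_2.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

HRESULT CreateScaledSwapChain(HWND hwnd, UINT renderWidth, UINT renderHeight,
                              ComPtr<ID3D11Device>& device,
                              ComPtr<IDXGISwapChain1>& swapChain)
{
    ComPtr<ID3D11DeviceContext> context;
    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                   nullptr, 0, D3D11_SDK_VERSION,
                                   &device, nullptr, &context);
    if (FAILED(hr)) return hr;

    // Walk back up to the DXGI factory that owns the adapter.
    ComPtr<IDXGIDevice>   dxgiDevice;
    ComPtr<IDXGIAdapter>  adapter;
    ComPtr<IDXGIFactory2> factory;
    device.As(&dxgiDevice);
    dxgiDevice->GetAdapter(&adapter);
    adapter->GetParent(IID_PPV_ARGS(&factory));

    DXGI_SWAP_CHAIN_DESC1 desc = {};
    desc.Width  = renderWidth;       // e.g. 1920x1080 buffers under a 4K window
    desc.Height = renderHeight;
    desc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 1;
    desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount = 2;
    desc.SwapEffect  = DXGI_SWAP_EFFECT_FLIP_DISCARD;
    desc.Scaling     = DXGI_SCALING_STRETCH; // OS/compositor stretches to the window

    return factory->CreateSwapChainForHwnd(device.Get(), hwnd, &desc,
                                           nullptr, nullptr, &swapChain);
}
```

The argument is really only about who owns renderWidth/renderHeight and the resample: today every game exposes its own knob for it; the suggestion is that Windows could own that knob (and the quality of the rescale) behind one consistent interface.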
 
The same way it already works today, just with the UI in the OS not in the application. Many monitors don't actually support the modes you are setting (especially LCDs); they are already getting a dumb rescale from Windows, the GPU display hardware or in some cases the monitor's rescaling. These are almost always just simple bilinear resamples so it's not as if the quality target is high. No need for any API to the game - the swap chain API is already present and sufficient.

So where does the bottleneck lie, in your view, as to why render resolution scalers in games are far less ubiquitous than we would all like? Is this an education issue with developers, or something inherently more complex in Windows than it has to be?

I guess what I'm getting at is: how would you have rephrased Alex's point about desiring games to support every possible resolution and separate refresh rates?

Also not sure you addressed it earlier, but - DLDSR? You can of course force that to run on the desktop and have it work with any borderless window games, but when I alt-tab out of a game running at 5K DLDSR, I want my desktop UI at native res. How would that work?
 
So where does the bottleneck lie, in your view, as to why render resolution scalers in games are far less ubiquitous than we would all like? Is this an education issue with developers, or something inherently more complex in Windows than it has to be?
I mean I'd argue that a lot of new games do have resolution sliders or equivalent. In terms of automatic dynamic resolution, it's certainly not as reliable on PC as on console. The video indeed pointed out several examples of it working reasonably well in DX11 games, but ironically in this case there are a few reasons it can be slightly easier there than in DX12 (which dropped "disjoint" timer queries for a number of reasons). That's not to say it's impossible, but it's definitely more complicated.

But yeah simple resolution scale sliders are fairly common I think... like anything it takes a *long* time for something to become ubiquitous of course.
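For reference, the DX11 mechanism I mean is the timestamp/disjoint query pair. A deliberately simplified sketch (illustrative only; a real renderer buffers the queries and reads the results a frame or two later instead of spin-waiting):

```cpp
// Sketch: measure GPU frame time on D3D11 with timestamp queries, then feed
// the result into a dynamic resolution scale. Names here are illustrative only.
#include <d3d11.h>
#include <algorithm>

// Returns the GPU time spent in drawFrame() in milliseconds, or -1 if the
// sample is unusable (the clock was "disjoint", e.g. a frequency change).
float MeasureGpuFrameMs(ID3D11Device* device, ID3D11DeviceContext* ctx,
                        void (*drawFrame)(ID3D11DeviceContext*))
{
    D3D11_QUERY_DESC qd = {};
    ID3D11Query *disjoint = nullptr, *tsBegin = nullptr, *tsEnd = nullptr;
    qd.Query = D3D11_QUERY_TIMESTAMP_DISJOINT;
    device->CreateQuery(&qd, &disjoint);
    qd.Query = D3D11_QUERY_TIMESTAMP;
    device->CreateQuery(&qd, &tsBegin);
    device->CreateQuery(&qd, &tsEnd);

    ctx->Begin(disjoint);
    ctx->End(tsBegin);            // timestamp before the frame's GPU work
    drawFrame(ctx);               // submit the frame
    ctx->End(tsEnd);              // timestamp after
    ctx->End(disjoint);

    // Spin until the results are ready (real code double-buffers across frames).
    D3D11_QUERY_DATA_TIMESTAMP_DISJOINT dj = {};
    UINT64 t0 = 0, t1 = 0;
    while (ctx->GetData(disjoint, &dj, sizeof(dj), 0) != S_OK) {}
    while (ctx->GetData(tsBegin,  &t0, sizeof(t0), 0) != S_OK) {}
    while (ctx->GetData(tsEnd,    &t1, sizeof(t1), 0) != S_OK) {}

    disjoint->Release(); tsBegin->Release(); tsEnd->Release();
    if (dj.Disjoint) return -1.0f;                 // discard this sample
    return 1000.0f * float(t1 - t0) / float(dj.Frequency);
}

// A trivial controller: nudge the resolution scale toward a ~16.6 ms target.
float UpdateResolutionScale(float scale, float gpuMs)
{
    if (gpuMs < 0.0f)  return scale;                           // bad sample
    if (gpuMs > 16.6f) return std::max(0.5f, scale - 0.05f);   // over budget
    if (gpuMs < 14.0f) return std::min(1.0f, scale + 0.05f);   // headroom
    return scale;
}
```

(DX12 has no disjoint query; you get the timestamp frequency from the command queue and have to deal with clock instability yourself, which is part of what makes it fiddlier.)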

Also not sure you addressed it earlier, but - DLDSR? You can of course force that to run on the desktop and have it work with any borderless window games, but when I alt-tab out of a game running at 5K DLDSR, I want my desktop UI at native res. How would that work?
As I said, the exact same way it happens today. Windows virtualizes the game's query of resolution and swap chain or similar and handles the rescale in the compositor. There's no technical magic here; I'm literally just saying the UI and ultimate control should lie with Windows rather than expecting every game to redundantly implement similar things with various quirks.
 
Great discussion here.

I am appreciative that Andrew is taking the time to engage with us from his "other side of the coin" perspective.

I also appreciate Digital Foundry for using their influence to raise concerns regarding problems with ports that have started to become a trend. If even half of this list is used as reference for PC porting houses we will be in a better place than we are today.

Having said that, while watching the video, some of the points came across as "make my life/workflow easier as a tech reviewer" as opposed to improving the quality of ports. There were also a couple of points that seemed to ignore/handwave the differences between a fixed platform and a modular one, and the implementation challenges that may bring.

(Personally, my biggest pet peeve and probably unique to less than 1% of the PC market is #11. It speaks to Andrew's point about OS-level responsibilities, as it isn't exclusively a gaming thing; it's a Windows/IHV thing, so it probably shouldn't have been included on this list. The HDR and surround sound situation on PC is terrible if you are using it as an HTPC within an environment with HDMI ARC. However, I have low expectations and don't expect that to be addressed anytime in the near future given the extremely small number of users it may affect.)
 
It's not necessarily that it's "completely impractical"; it's more that the optimal tradeoff will be different. If settings A and B in different subsystems have a similar cost on one platform but B costs 2x as much as A on another, that will often affect the choice of best settings for a given performance target. Async compute is the one example I gave where the performance overhead of certain things can be fairly different on PC (where work cannot be reliably overlapped across arbitrary SKUs since the tradeoffs are so different) vs console (where simple static scheduling is often sufficient). There are other cases where shaders can be tuned using platform-specific features or assumptions that can affect performance by a fairly significant amount. Even tuning shader codegen to fit into various occupancy buckets is fairly common on console, but not really possible on PC. As a final example, being able to bake more things in advance on console (whether it be shaders or stuff like raytracing BVHs) can definitely affect the tradeoffs as well.

I'm still not sure I follow. What you're describing above seems to be the more general ability of consoles to extract more performance from a given set of hardware due to their fixed nature, which is fair enough. Translated to a real-world example, we could imagine a shadow setting which on medium looks worse than the console, while on high it looks better. But medium has the same cost on PC as the better-looking setting does on console, while high has say a 50% higher cost. I think that's what you're describing above, which is fair enough, but then surely there is still a place for a setting between the two that is as good as the console setting but only costs say 20% more performance on the PC? That way people can get at least console-level quality for the minimum possible performance outlay.

I'm struggling to think of a scenario like the one above where you have settings both above and below the consoles where something in the middle that matches the console doesn't offer a compromise between the two.

I can at least envision a scenario though where, for example, a specific sub-component of an effect is more costly on PC and thus the effect at any quality level will be more costly, and therefore on PC the developer chooses to increase the quality level of a different sub-component of that effect because it's effectively free. Therefore you have what appears to be a very costly effect on PC that looks better than the console version but doesn't scale down. A crude analogy of that might be RT on a slow CPU with a 40xx GPU, where the BVH creation and thus overall performance is slower than the console, and so RT resolution is ramped up beyond the console. But lowering RT resolution in that scenario gains you nothing and so the option isn't presented to the user.
 
At this point in time there isn't a reason to buy a PS4/Pro/Xbox One/One X with the Series S out there. You might have an occasional game that performs slightly better on the Pro or X, but going forward there won't be new games for them. You also lose out on the NVMe drive in the S.

If you have a PS4/Xbox One, the Series S is the way to go for a low-price upgrade unless you really want the best possible performance or Sony exclusives.

If you have a PS4 Pro/Xbox One X you might want to buy the PS5 or Series X, but if you want to save money the Series S is a good pickup.


The closer to $200 the S stays, the better a deal it is. I think this will become even more evident when we get redesign shrinks on 5nm. The Series S is going to start getting really small.
 
But yeah simple resolution scale sliders are fairly common I think...

Resolution sliders are just that though - 'fairly' common. Of the games I've been playing through the past year:

Master Chief Collection: Scaling slider, but no downsampling - slider maxes out at 100%.
Hellblade: No scaling.
Little Nightmares 2: No scaling.
Super Mega Baseball 3: No scaling.
Deathloop: No scaling (but DRS)
Sackboy: No scaling.
God of War: Scaling + over 100% (downsampling!)
Metro Exodus Enhanced: Scaling + (downsampling, but only with TAAU - no option when using DLSS.)
Days Gone: Scaling + downsampling
Death Stranding: No scaling.
Witcher 3 (with patch): No scaling (+ DRS)

It's still pretty hit or miss unfortunately.

(edit: used "DRS" for some games when I meant downsampling)

like anything it takes a *long* time for something to become ubiquitous of course.

...but that's exactly my point! In the context of what is basically a year-end technical review video of the recent state of PC ports, a wish for games to provide refresh/resolution features that many have included for years and that still impact the player experience today doesn't necessarily seem in conflict with the desire to see a more systemic solution to these issues going forward. What we want is for these features to become ubiquitous, and a method that handles res/refresh with full flexibility for the gamer but that the developer never even has to pay attention to would certainly help them become so. But it's a matter of timescales.

This is why I'm genuinely inquiring how you would have phrased Alex's critique in that section - like I'm not asking for a form letter we can all spam MS with, but instead of saying "Games should provide all resolutions and refresh rates", something like...? It might indeed be more effective if Alex could have focused his concern on one company directly (MS) for not providing the tools/directive to enable this more easily, but like I said, I'm not sure it would really fit into the context of a short list of checkbox items devs can use to "make your ports better in 2023" vs a more generalized "Here's what the PC as a platform needs in order to evolve" video.

No one is going to seriously advocate against a solution that's ubiquitous, standardized and takes any guesswork out of the developers' hands. I'm all for systemic solutions, which is why I really want to see Alex/DF make a video that truly gets into the nitty-gritty of the shader compilation bottleneck and speaks to industry players (hardware, OS, API/game devs) to get a clearer picture of the full scope of the issue and potential ways to address it going forward (and their benefits/drawbacks) - it's not just Unreal, and clearly continually pleading with developers to properly handle this issue with the tools currently at their disposal has led to somewhat middling uptake.
 
The closer to $200 the S stays, the better a deal it is. I think this will become even more evident when we get redesign shrinks on 5nm. The Series S is going to start getting really small.
I've not seen anyone complain about the size, if anything the opposite.
Price is already competitive. Especially during sales.
I think the further we get into the gen, the more titles will not be released on last gen; that's when it gets to be even more of a good deal.
At the moment people can still play OK on last gen, but soon that won't be the case.
 
I've not seen anyone complain about the size, if anything the opposite.

I think that's more in reference to process node shrinks, if they're economical. Of course the size of the S is already a selling point, but the way you could absorb the higher cost of a node shrink is not only the increased number of functional chips you can get per wafer, but also further reducing overall BOM and shipping costs.

So while a process node jump per wafer might be more expensive, making the dimensions of the actual unit even smaller may end up making even a slight price cut feasible, although who really knows in today's economic/literal climate. Smaller PSU, smaller heatsink, etc - reducing the physical box by half saves you quite a bit. Generally with modern process nodes the results seem to be that the lower your power target, the greater the benefit.
 
At this point in time there isn't a reason to buy a PS4/Pro/Xbox One/One X with the Series S out there. You might have an occasional game that performs slightly better on the Pro or X, but going forward there won't be new games for them. You also lose out on the NVMe drive in the S.

If you have a PS4/Xbox One, the Series S is the way to go for a low-price upgrade unless you really want the best possible performance or Sony exclusives.

If you have a PS4 Pro/Xbox One X you might want to buy the PS5 or Series X, but if you want to save money the Series S is a good pickup.


The closer to $200 the S stays, the better a deal it is. I think this will become even more evident when we get redesign shrinks on 5nm. The Series S is going to start getting really small.
I agree. I have been waffling on whether or not to buy a Series S to replace my 360, on top of getting access to the new Xbox exclusives coming out... in the end I would ideally like all 3 systems in some way, with the PS5 as my main unit, but that won't be for a while.

My main issue with the S is that I have a lot of 360 games on disc and would have to rebuy them on the MS store for a lot more than I bought them for elsewhere.

I am really annoyed. If only the Series S had a disc drive or at least could connect one remotely, I wouldn't even have to think about it... I'm not in the running for a Series X and the Series S is right in my price range. It definitely would be impulse-buy territory for me.


Is there any chance at all MS will allow for disc compatibility for the Series S, or am I screwed?
 
I think that's more in reference to process node shrinks, if they're economical. Of course the size of the S is already a selling point, but the way you could absorb the higher cost of a node shrink is not only the increased number of functional chips you can get per wafer, but also further reducing overall BOM and shipping costs.
I've seen many people say it should be smaller to make it even more appealing.
Considering they're still losing a fair bit on consoles, I wouldn't expect any BOM savings to automatically be passed on to consumers.
 
Considering they're still losing a fair bit on consoles, I wouldn't expect any BOM savings to automatically be passed on to consumers.

Yeah that's true, I immediately think of SX/PS5 when it comes to the margins issue but as Spencer implied this applies to the S as well - it was designed to deal with the increasing wafer costs but that was also before covid. We're probably not going to see any price cuts for quite a while for any SKU, if ever during this gen.
 
Is there any chance at all MS will allow for disc compatibility for the Series S, or am I screwed?
There is a patent, but as with any patent, that doesn't mean it will be made a thing. Would be cool though.
Can't remember if it would work for pre-X1 though.
 
I've not seen anyone complain about the size, if anything the opposite.
Price is already competitive. Especially during sales.
I think the further we get into the gen, the more titles will not be released on last gen; that's when it gets to be even more of a good deal.
At the moment people can still play OK on last gen, but soon that won't be the case.

The smaller they can make it the better. It will appeal to more and more people that way, in more and more countries. I am sure there are a lot of mothers out there who may not want something PS5 or Xbox Series X sized in the living room, or heck, even Series S sized. The smaller it gets the better. The same goes for bedrooms; not everyone has big bedrooms. I remember my time in Japan, and as beautiful a country as it was, there was very limited space. I already have friends who have had me send an Xbox Series S there just because of the size. I think cutting it down even more would be beneficial in places like that.

Not to mention the side benefits of a smaller/lighter console to produce, ship and stock.

I think that's more in reference to process node shrinks, if they're economical. Of course the size of the S is already a selling point, but the way you could absorb the higher cost of a node shrink is not only the increased number of functional chips you can get per wafer, but also further reducing overall BOM and shipping costs.

So while a process node jump per wafer might be more expensive, making the dimensions of the actual unit even smaller may end up making even a slight price cut feasible, although who really knows in today's economic/literal climate. Smaller PSU, smaller heatsink, etc - reducing the physical box by half saves you quite a bit. Generally with modern process nodes the results seem to be that the lower your power target, the greater the benefit.
We could actually see some improvements also.

The Xbox One received a small GPU speed bump from 853 to 914 MHz, so it's not out of left field that we could see some slight improvements to the Series S.
 
The Xbox One received a small GPU speed bump from 853 to 914 MHz, so it's not out of left field that we could see some slight improvements to the Series S.
Microsoft claimed they needed that increase to enable HDR (base Xbox One didn't support it).
I'm not aware of any bullet-point features that Series S completely lacks. And I don't think a 7% speed increase would make any difference when it comes to developers enabling ray tracing on Series S.
However, it also wouldn't break compatibility with base Series S, so why not?
 