Digital Foundry Article Technical Discussion [2022]

Status
Not open for further replies.
Where can I download this Intel SFS demo?
It's on GitHub, though you'll need Visual Studio to compile it. I've also uploaded the compiled version here if you're willing to trust a stranger on the Internet. I don't think you should run into any errors, but you can just post them here if you do.

The only thing you need to be aware of is that the demo-hubble.bat script needs to point to the correct location. I have it set to run off the root of C: so you'll want to adjust it accordingly.
 
Thanks :)
 
I can easily see it just being LZ compressed still, as most games are still cross-gen. Install sizes seem to be similar, and any difference could be asset de-duplication if there have been any differences in the odd title. It also probably makes it easier for Smart Delivery not having to mess with multiple package formats yet, both for the devs doing the packaging and for Smart Delivery itself.
Good point. It would be interesting to have more information about how Smart Delivery works, and whether any effort is required by devs to support it. In an ideal world, you'd click build+publish and the SDK would produce a bespoke build for each supported platform, i.e. the appropriate textures and data packages using the appropriate compression according to the available hardware decompressor.
 
No, I think he meant the 5700XT does not have DP4a, and wants to see the results vs a DP4a card.
But as you noted, it can't happen, because XeSS will not run without DP4a.
That's correct.

But Intel said it's running on all Shader Model 6.4 cards, which the 5700XT supports.

There's probably a FP16 fallback.
 
That’s interesting. Okay looking forward to it as well then!
 
Good point. It would be interesting to have more information about how Smart Delivery works, and whether any effort is required by devs to support it. In an ideal world, you'd click build+publish and the SDK would produce a bespoke build for each supported platform, i.e. the appropriate textures and data packages using the appropriate compression according to the available hardware decompressor.

Along those lines, I was wondering about SFS. If during testing you could determine which textures are being sampled, at which mip levels, and across what proportion of their mip tiles, you could maybe build up a good idea of which texture LODs you actually needed, and what the impact of reducing texture resolution for each texture on each platform would be. This would reduce the load on developers having to physically eyeball everything under all circumstances.

So an example would be for Series S. It has a tiny SSD. So you test away gathering data and determine that at Series S resolutions some textures never need their highest resolution level under normal conditions, some are needed but rarely, and some are needed but only a few tiles of the highest mip used. So you build up a kind of map of importance of texture detail direct from the samplers themselves.

You'd then use this data to make a Series S build for Smart Delivery where asset quality reductions on disk are masked by the resolutions used. This way, Series S installs could be smaller and it might even be able to store more than 1.7 non-indie games on the internal SSD.

I've decided to call this "B3DPack", because I'm shit at acronyms.
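The "B3DPack" idea above could be sketched roughly like this: aggregate sampler-feedback-style residency logs across test runs, then decide per texture which mip levels actually need to ship on disk. This is just an illustration; all names, data, and the 5% threshold are made up.

```python
# Hypothetical sketch of the idea above: aggregate residency logs from test
# runs, then build a per-texture "importance map" of mip levels. Everything
# here (function names, sample data, threshold) is invented for illustration.
from collections import defaultdict

def build_mip_importance(samples):
    """samples: iterable of (texture_id, mip_level, tiles_touched, tiles_total)."""
    importance = defaultdict(lambda: defaultdict(float))
    for tex, mip, touched, total in samples:
        # Record the highest fraction of a mip's tiles ever resident at once.
        frac = touched / total
        importance[tex][mip] = max(importance[tex][mip], frac)
    return importance

def prune_plan(importance, min_fraction=0.05):
    """Drop the top mip from the on-disk package if it was barely sampled."""
    plan = {}
    for tex, mips in importance.items():
        top = min(mips)  # mip 0 is the highest-resolution level
        plan[tex] = "drop mip 0" if mips[top] < min_fraction else "keep all mips"
    return plan

samples = [
    ("rock_albedo", 0, 2, 256), ("rock_albedo", 1, 40, 64),
    ("sky_albedo",  0, 200, 256),
]
print(prune_plan(build_mip_importance(samples)))
# rock_albedo's top mip was only ever 2/256 tiles resident, so it gets dropped
```

A real pipeline would work per-tile rather than per-mip, but the shape of the data (texture, mip, tile residency) is what sampler feedback gives you.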
 
That's correct.

But Intel said it's running on all Shader Model 6.4 cards, which the 5700XT supports.

There's probably a FP16 fallback.
If I remember correctly it requires SM6.4 AND DP4a, not either one.
So it won't run on it.

I can't find where that was said, maybe in one of the videos.
The only thing I really found was:
XeSS is implemented using open standards to ensure wide availability on many games and across a broad set of shipping hardware, from both Intel® and other GPU vendors.**

Additionally, the XeSS algorithm can leverage the DP4a and XMX hardware capabilities of Xe GPUs for better performance.

**The main requirement for the XeSS algorithm is graphics drivers with support for Shader Model 6.4, making it compatible with a broad set of shipping hardware.
This does make it sound like it can work without DP4a, which I'm sure wasn't the case originally.
Are there many cards that support SM6.4 without DP4a?
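For anyone wondering what DP4a actually is in this discussion: it's a 4-element dot product of packed 8-bit integers accumulated into a 32-bit integer, done in one instruction on hardware that supports it. A Python sketch of the semantics (not of any XeSS code, which isn't public):

```python
# What DP4a computes, sketched in Python: unpack four int8 lanes from each
# 32-bit word, dot them, and accumulate into a 32-bit integer. XeSS's DP4a
# path relies on this being a single fast instruction; hardware without it
# would need some other path (e.g. FP16 math) to run the same network.
import struct

def dp4a(a_packed: int, b_packed: int, acc: int) -> int:
    """Signed dp4a on two packed 32-bit words plus an accumulator."""
    a = struct.unpack("4b", struct.pack("<I", a_packed & 0xFFFFFFFF))
    b = struct.unpack("4b", struct.pack("<I", b_packed & 0xFFFFFFFF))
    return acc + sum(x * y for x, y in zip(a, b))

# Four lanes of 1 dotted with four lanes of 2, plus accumulator 10 -> 18
print(dp4a(0x01010101, 0x02020202, 10))
```

In HLSL terms this corresponds to the packed integer dot product intrinsics added in Shader Model 6.4, which is why the SM6.4 requirement and the DP4a question are so entangled.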
 
Small correction:

I was quoting the wrong numbers. The relative differences are still the same but a 360° rotation read ~250MB at 1080p, ~600MB at 1440p, ~900MB at 4k, and 2GB+ at 8k. Those other numbers are still accurate but they're from a different test I made using the built-in rolling demo.
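Those corrected numbers can be sanity-checked against raw pixel counts with quick arithmetic. The read sizes are the ones quoted above; everything else is just ratios, with no claim about what the demo is doing internally:

```python
# How the measured per-rotation reads scale against pixel count.
# Read sizes (MB) are the corrected figures from the post; pixel
# counts are the standard 16:9 resolutions.
measured_mb = {"1080p": 250, "1440p": 600, "4k": 900}
pixels = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4k": 3840 * 2160}

for res in ("1440p", "4k"):
    read_ratio = measured_mb[res] / measured_mb["1080p"]
    pixel_ratio = pixels[res] / pixels["1080p"]
    print(f"{res}: reads x{read_ratio:.2f} vs pixels x{pixel_ratio:.2f}")
```

Interesting that the reads don't track pixel count exactly in either direction, though tile granularity alone could explain that.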
Thanks.
Yeah, when it comes to relative IO it's a monster compared to the XSX when you consider the sizes of the textures it should be using, etc.

Does the demo have a software virtual texturing implementation to compare it with?
 
Continuing with the XeSS talk. I know that Intel wants to focus on Intel products, but if this is performant and produces excellent quality output, it would be massive for the industry IMO. I look forward to DF messing around with it on the Steam Deck. The open-source nature of this might have significant implications when it comes to the next Nintendo product. Adding CUDA cores to the next Switch might end up being an issue for Nintendo when it comes to die space, but they might be able to add DP4a without giving up much. Things might already be set in stone, but bear with me. Seeing that Nintendo is willing to use FSR in their games, I would not be surprised if they still go with an Nvidia chip but leverage XeSS and FSR 2.0 as upscaling solutions instead of DLSS. If XeSS does an excellent job of upscaling images from, let's say, 480p to 720p and 1080p, the possible power savings might be huge. Life for developers might also be better if you could leverage it to hit 60fps more frequently on what everyone believes will be their next handheld.
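The rough arithmetic behind that "power savings" point: shading cost scales roughly with pixel count, so upscaling from 480p rather than rendering natively cuts the shaded pixels by a large factor. This assumes 16:9 frames and ignores the upscaler's own cost:

```python
# How many fewer pixels a handheld shades per frame if an upscaler takes a
# 480p input instead of the game rendering natively at the output resolution.
# Assumes 16:9 resolutions; real savings depend on the upscaler's overhead.
def pixel_ratio(src, dst):
    return (dst[0] * dst[1]) / (src[0] * src[1])

p480, p720, p1080 = (854, 480), (1280, 720), (1920, 1080)
print(f"480p -> 720p:  {pixel_ratio(p480, p720):.2f}x fewer shaded pixels")
print(f"480p -> 1080p: {pixel_ratio(p480, p1080):.2f}x fewer shaded pixels")
```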

If it turns out to be a good solution, I'm hoping Dictator goes back and revisits his DLSS on a handheld video. XeSS on a handheld?
 
I will say that I've always believed it will have tensor cores, and that hasn't changed.

But if it didn't have it, then XeSS running on DP4a could be a good solution depending on performance overheads.
I would expect XeSS to generally have better IQ than FSR2 at lower input resolutions much like DLSS. We'll see this week hopefully.

In terms of Nintendo using it, that wouldn't even cross their minds as being an issue.
All games can use what tech they want.
The only reason DLSS would be used more would be because it has tensor cores.
Without them, you'd use the next best thing.
 
I think nVidia has an interest in Nintendo using an nVidia-specific technology to lock Nintendo into future nVidia products. If a future Switch launches and relies on DLSS, then to maintain compatibility future consoles will need either tensor cores from nVidia or a license paid to nVidia for compatibility.
 
That's a legit concern for Nintendo when considering Switch 3 BC.
DLSS would mean you're locked in, as Nvidia won't allow it on non-Nvidia hardware, even if that hardware had its own version of tensor cores that could run it.
There's nothing stopping Nintendo from implementing XeSS on tensor cores; it just doesn't make sense for Nvidia in the PC space.

Not sure I see Nintendo putting in that sort of work?
 
Considering the open-source nature of XeSS, as long as Nvidia doesn't do anything to stop XeSS from working on Nvidia's tensor cores, I fully expect someone to figure out how to get it running on the cores before Nintendo's next hardware comes out. In other words, someone will do the work for them.
 
I think nVidia has an interest in Nintendo using an nVidia specific technology to lock Nintendo into future nVidia products. Is a future Switch launches and relies on DLSS then to maintain compatibility future consoles will need either tensor cores from nVidia or a license paid to nVidia for compatibility.

Or would they?

Let's say Switch 2 uses DLSS: would they need it on a Switch 3, or could they disable it and brute force the resolution instead?
 
Obviously, if you had the power you could theoretically brute force your way through it. But we are talking about Nintendo here, so the likelihood they'll go with hardware fast enough to do that is low. Also, their history with backwards compatibility has mostly been one of maintaining original resolution and features to the best of their ability. GC games ran at their original resolution on Wii, and Wii games were still 480p on Wii U. GBA games were presented at their original resolution with black bars to maintain pixel-perfect output on DS. The only case I can think of where output resolution differed is that playing Game Boy and Game Boy Color games on GBA let you stretch the image horizontally to fill the screen.

Regarding the power required, the theories I've seen about a future DLSS-powered Switch still have it using a 720p screen and using DLSS to output 4K to a TV. The idea that Nintendo is going to have hardware one generation in the future with the performance to render those 720p-targeting games at 4K is not only out of step with Nintendo's history, it would also be one of the largest generational leaps in performance. Going from the 2013 Xbox One to Series X might be close, but even the One had some 1080p games, and few ran at 720p (most targeted 900p).
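The brute-force framing here comes down to plain arithmetic: rendering natively at 4K instead of at a 720p internal resolution is a 9x jump in pixel count, before any other per-pixel cost increases a new generation brings.

```python
# Pixel-count jump implied by "brute forcing" a 720p-targeting game at 4K.
# Nothing here is from the thread beyond the two resolutions themselves.
native_720p = 1280 * 720
native_4k = 3840 * 2160
print(native_4k / native_720p)  # 9.0
```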
 

Why would anyone even do that?

They wouldn't need to brute force to 4K, just brute force to a resolution that's acceptable.
 