> Where can I download this Intel SFS demo?
It's on GitHub, though you'll need Visual Studio to compile it. I've also uploaded the compiled version here if you're willing to trust a stranger on the Internet. I don't think you should run into any errors, but you can just post them here if you do.
> Was that a typo?
> Does the 5700XT have DP4a?
> If it doesn't, then XeSS will not be supported.
No, I think he meant the 5700XT does not have DP4a, and wants to see the results vs a DP4a card.
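For context on what's actually being asked here: DP4a is a packed dot-product operation that multiplies four pairs of 8-bit integers and accumulates the results into a 32-bit integer in a single instruction, which is what makes int8 neural-network inference (the kind XeSS relies on) cheap when the hardware supports it. A rough scalar sketch of what the instruction computes, in plain C++ purely for illustration (this is not how any GPU actually exposes it):

```cpp
#include <cstdint>

// Scalar reference for a DP4a-style operation: treat each 32-bit input as
// four packed signed 8-bit lanes, multiply lane by lane, and accumulate the
// products into a 32-bit integer. Hardware with DP4a does all of this in a
// single instruction; hardware without it has to emulate the same math.
int32_t dp4a(uint32_t a_packed, uint32_t b_packed, int32_t acc) {
    for (int lane = 0; lane < 4; ++lane) {
        const int8_t a = static_cast<int8_t>((a_packed >> (8 * lane)) & 0xFF);
        const int8_t b = static_cast<int8_t>((b_packed >> (8 * lane)) & 0xFF);
        acc += static_cast<int32_t>(a) * static_cast<int32_t>(b);
    }
    return acc;
}
```

In HLSL this corresponds to the dot4add_i8packed/dot4add_u8packed intrinsics added in Shader Model 6.4, which is why SM 6.4 keeps coming up in this discussion.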
> It's on GitHub, though you'll need Visual Studio to compile it. I've also uploaded the compiled version here if you're willing to trust a stranger on the Internet. I don't think you should run into any errors, but you can just post them here if you do.
> The only thing you need to be aware of is that the demo-hubble.bat script needs to point to the correct location. I have it set to run off the root of C:, so you'll want to adjust it accordingly.
Thanks.
> I can easily see it just being LZ-compressed still, as most games are still cross-gen. Install sizes seem to be similar, and any difference could be asset de-dup if there have been any differences in the odd title. Not having to mess with multiple package formats yet also probably makes it easier, both for the devs doing the packaging and for Smart Delivery itself.
Good point. It'd be interesting to have more information about how Smart Delivery works, and whether any effort is required by devs to support it. In an ideal world, you'd click build+publish and the SDK would produce a bespoke build for each supported platform, i.e. the appropriate textures and data packages using the appropriate compression according to the available hardware decompressor.
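To make the "bespoke build per platform" idea concrete, here is a purely hypothetical sketch in C++ (every name below is invented; this is not any real SDK's API) of a packaging step that picks a compression codec based on the target's hardware decompressor:

```cpp
#include <string>

// Hypothetical build targets and package settings; illustrative only.
enum class Target { XboxOne, XboxSeries, PC };

struct PackageSettings {
    std::string codec;    // compression used for the data packages
    bool hardwareDecode;  // can the target decompress this in hardware?
};

// A build+publish step could branch like this and emit one package per
// target; the delivery system then serves whichever package matches the
// console a user owns.
PackageSettings SettingsFor(Target t) {
    switch (t) {
        case Target::XboxSeries:
            return {"hardware-friendly texture codec (BCPack-style)", true};
        case Target::XboxOne:
        case Target::PC:
        default:
            return {"generic LZ", false};
    }
}
```

As the quoted post notes, though, as long as most games are cross-gen it's plausible everything simply ships with the lowest-common-denominator packaging instead.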
> No, I think he meant the 5700XT does not have DP4a, and wants to see the results vs a DP4a card.
That's correct. But as you noted, it can't happen, because XeSS will not run without DP4a.
> That's correct.
> But Intel said it's running on all Shader Model 6.4 cards, which the 5700XT supports.
> There's probably an FP16 fallback.
That's interesting. Okay, looking forward to it as well then!
> That's correct.
> But Intel said it's running on all Shader Model 6.4 cards, which the 5700XT supports.
> There's probably an FP16 fallback.
If I remember correctly, it requires SM6.4 AND DP4a, not either one.
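On the fallback question: if an FP16 path exists (the posts above are speculating, and nothing here is based on XeSS's actual internals), it would conceptually mean running the same accumulations in floating point instead of through the packed int8 instruction, along these lines:

```cpp
// Purely illustrative: the same 4-element dot product as the DP4a sketch
// above, but in floating point. Hardware without DP4a could run a network
// this way (typically in half precision on the GPU), trading speed for
// compatibility.
float dot4_float(const float a[4], const float b[4], float acc) {
    for (int i = 0; i < 4; ++i) {
        acc += a[i] * b[i];
    }
    return acc;
}
```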
> XeSS is implemented using open standards to ensure wide availability on many games and across a broad set of shipping hardware, from both Intel® and other GPU vendors.
> Additionally, the XeSS algorithm can leverage the DP4a and XMX hardware capabilities of Xe GPUs for better performance.
> The main requirement for the XeSS algorithm is graphics drivers with support for Shader Model 6.4, making it compatible with a broad set of shipping hardware.
This does make it sound like it can work without DP4a, which I'm sure wasn't the case originally.
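Since the stated requirement is driver support for Shader Model 6.4, that part at least is straightforward to check from an application. A minimal sketch using the standard D3D12 CheckFeatureSupport query (assumes a device has already been created and a Windows SDK recent enough to define D3D_SHADER_MODEL_6_4; error handling trimmed):

```cpp
#include <windows.h>
#include <d3d12.h>

// Ask the runtime/driver for the highest supported shader model at or below
// the level we pass in, then compare against 6.4.
bool SupportsShaderModel6_4(ID3D12Device* device) {
    D3D12_FEATURE_DATA_SHADER_MODEL sm = { D3D_SHADER_MODEL_6_4 };
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_SHADER_MODEL,
                                           &sm, sizeof(sm)))) {
        return false;  // older runtimes may not recognize the requested level
    }
    return sm.HighestShaderModel >= D3D_SHADER_MODEL_6_4;
}
```

Note that this only tells you the driver accepts SM 6.4 shaders; as far as I know, D3D12 doesn't expose a separate capability bit for whether the DP4a intrinsics run natively or are emulated, which is exactly the ambiguity being discussed here.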
> Small correction: I was quoting the wrong numbers. The relative differences are still the same, but a 360° rotation read ~250MB at 1080p, ~600MB at 1440p, ~900MB at 4K, and 2GB+ at 8K. Those other numbers are still accurate, but they're from a different test I made using the built-in rolling demo.
Thanks.
> Adding CUDA cores to the next Switch might end up being an issue for Nintendo when it comes to die space, but they might be able to add DP4a without giving up much. Things might already be set in stone, but bear with me.
I will say that I've always believed it will have tensor cores, and that hasn't changed.
> The only reason DLSS would be used more would be because it has tensor cores.
> Without them, the next best thing.
I think nVidia has an interest in Nintendo using an nVidia-specific technology to lock Nintendo into future nVidia products. If a future Switch launches and relies on DLSS, then to maintain compatibility, future consoles will need either tensor cores from nVidia or a license paid to nVidia for compatibility.
> I think nVidia has an interest in Nintendo using an nVidia-specific technology to lock Nintendo into future nVidia products. If a future Switch launches and relies on DLSS, then to maintain compatibility, future consoles will need either tensor cores from nVidia or a license paid to nVidia for compatibility.
That's a legit concern for Nintendo when considering Switch 3 BC.
> Or would they?
> Let's say Switch 2 uses DLSS: would they need it on a Switch 3, or could they disable it and brute-force the resolution instead?
Obviously, if you had the power, you could theoretically brute-force your way through it. But we are talking about Nintendo here, so the likelihood that they go with hardware fast enough to do that is low. Also, we are talking about Nintendo, and their history with backwards compatibility has mostly been one of maintaining original resolution and features, to the best of their ability. GC games were original resolution on Wii, and Wii games were still 480p on Wii U. GBA games were presented in their original resolution using black bars to maintain pixel-perfect output on DS. The only case I can think of where the output resolution was different is that playing Game Boy and Game Boy Color games on GBA allowed you to stretch the image horizontally to fill the screen.
The idea that Nintendo is going to have hardware one generation in the future with the performance to render games that target 720p at 4K is not only out of step with Nintendo's history, it would also be one of the largest generational leaps in performance (720p to 4K is a 9x jump in pixel count alone).
> Why would anyone even do that?
Don't know. Brute-forcing it would be silly.
> Don't know. Brute-forcing it would be silly.
But maybe it's the only option if you move to a different GPU maker and need to ensure you offer at least matching image quality compared to the older vendor.