DirectStorage GPU Decompression, RTX IO, Smart Access Storage

Can't link to it directly at the moment but at 7:08 in DFs video Alex shows the same issue on the 4070, just not as pronounced as the 4060. It clearly dips every time he turns though.
 
Yeah, that happens on all hardware. The large spike causing a freeze, though, seems to happen exclusively on Nvidia... and like I said, even the 4070 will spike as well (and dip even further) if he turns the camera faster than he does there, which is very common when someone is using a mouse and keyboard setup. He's turning the camera at a very conservative speed in that video as well.
 
Shame he didn't test newer Radeons though, since they've been free of "DirectStorage stutters" in other DS games where Nvidia has them (not sure about Intel).
 
Monster Hunter Wilds stutters badly on the 9070 XT due to VRAM spillover (yes, at the highest texture settings the game can fill more than 16GB of VRAM for PS4-like textures). The game is an absolute piece of crap optimization-wise.

Is there evidence this game uses GPU decompression? In the settings menu under PC Specs it shows DirectStorage "CPU".

[attached screenshot: PC Specs panel showing DirectStorage "CPU"]
 
The menu always says CPU no matter what. According to Kaldaien, it actually is GPU decompression; you can verify that yourself using Special-K.
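For anyone who wants the background: with DirectStorage 1.1+ the CPU/GPU split is decided per request, by the compression format and the destination the engine asks for, which is why a hook like Special-K can see what is really happening regardless of what the menu reports. Below is a minimal sketch of what the GPU decompression path looks like; the file name, sizes, and resources are placeholders, not anything taken from Wilds.

```cpp
// Minimal DirectStorage 1.1+ GPU decompression sketch (dstorage.h / dstorage.lib).
// The device, buffer, file name and sizes are placeholders; error handling omitted.
#include <d3d12.h>
#include <dstorage.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void LoadCompressedAsset(ID3D12Device* device, ID3D12Resource* bufferResource)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    // Queue tied to the D3D12 device so requests can land directly in GPU resources.
    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.Device     = device;
    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"asset.gdeflate", IID_PPV_ARGS(&file)); // hypothetical file

    DSTORAGE_REQUEST request{};
    // This is what makes it "GPU decompression": the payload is GDeflate and the
    // runtime/driver decompresses it on the GPU on the way into the destination buffer.
    request.Options.CompressionFormat   = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE;
    request.Options.SourceType          = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType     = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Source.File.Source          = file.Get();
    request.Source.File.Offset          = 0;
    request.Source.File.Size            = 4 * 1024 * 1024;    // compressed size (assumed)
    request.Destination.Buffer.Resource = bufferResource;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = 16 * 1024 * 1024;   // uncompressed size (assumed)
    request.UncompressedSize            = 16 * 1024 * 1024;

    queue->EnqueueRequest(&request);
    queue->Submit();
}
```

If a title leaves CompressionFormat at DSTORAGE_COMPRESSION_FORMAT_NONE, or stages everything into system memory and decompresses there, the work stays on the CPU even though the DirectStorage runtime is loaded.
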
Shame he didn't test newer Radeons though, since they've been free of "DirectStorage stutters" in other DS games where Nvidia has them (not sure about Intel).
I want to make a video on it whenever I get time, testing the most similar GPUs that I own. I have far fewer Radeons than Nvidia GPUs, so perfect like-for-likes aren't possible.

RX 5700
RX 6800 XT
RX 7900 XTX
RX 9070 XT

RTX 2060
RTX 2060 Super
RTX 2070 Super
RTX 3060
RTX 3080
RTX 3080 Ti
RTX 3090
RTX 4060
RTX 4070
RTX 4070 Super
RTX 4090
RTX 5080
RTX 5090
 
I want to make a video on it whenever I get time, testing the most similar GPUs that I own. I have far fewer Radeons than Nvidia GPUs, so perfect like-for-likes aren't possible.
IGN's analysis of the game reveals it has major texture streaming problems on consoles (low-res textures, textures failing to load, frame spikes, stutters, traversal stutters...), as the game is a texture streaming nightmare, constantly loading and unloading data... their AMD GPU (RX 6800) suffers massive stutters as well (timestamped). Their CPU (5800X3D) is maxed out streaming data in and out; even a 4090 wasn't enough, with the game using a gigantic 18GB of system RAM and an additional 18GB of VRAM!

 
Half-Life 2 RTX uses RTX IO (which shares the same functionality as DirectStorage). It was already featured in Portal RTX and Portal RTX Prelude, but this time it implements the first-ever game application of Sampler Feedback Streaming, resulting in huge VRAM savings (almost 4GB on a 4090).

Sampler feedback streaming intelligently loads and evicts data on demand in order to reduce memory consumption
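
For anyone wondering what "loads and evicts data on demand" means at the API level, here is a minimal sketch of the D3D12 sampler feedback loop that SFS builds on. The function and resource names are placeholders and nothing here is taken from HL2 RTX or the RTX IO SDK; it only shows the write/resolve/remap cycle.

```cpp
#include <d3d12.h>

// Rough shape of a D3D12 sampler feedback pass. Every argument is a placeholder
// supplied by the caller; barriers/state transitions are omitted for brevity.
// The shader side writes feedback with something like (HLSL, SM 6.5):
//   FeedbackTexture2D<SAMPLER_FEEDBACK_MIN_MIP> g_feedback;
//   g_feedback.WriteSamplerFeedback(g_texture, g_sampler, uv);
void SetupAndResolveFeedback(ID3D12Device8* device,
                             ID3D12GraphicsCommandList1* cmdList,
                             ID3D12Resource* streamedTexture,   // reserved (tiled) texture
                             ID3D12Resource* feedbackTexture,   // SAMPLER_FEEDBACK_MIN_MIP_OPAQUE
                             ID3D12Resource* decodedFeedback,   // small R8_UINT target
                             D3D12_CPU_DESCRIPTOR_HANDLE uavHandle)
{
    // Pair the opaque feedback map with the texture being sampled (done once at setup).
    device->CreateSamplerFeedbackUnorderedAccessView(streamedTexture, feedbackTexture, uavHandle);

    // After the frame's draws, decode the opaque feedback into a readable MIN_MIP map:
    // each texel reports the minimum mip level actually sampled in that texture region.
    cmdList->ResolveSubresourceRegion(decodedFeedback, 0, 0, 0,
                                      feedbackTexture, 0, nullptr,
                                      DXGI_FORMAT_R8_UINT,
                                      D3D12_RESOLVE_MODE_DECODE_SAMPLER_FEEDBACK);

    // A later CPU or compute pass reads decodedFeedback, evicts tiles for mips that were
    // never touched (UpdateTileMappings), and streams in tiles that are now needed
    // (e.g. via a DirectStorage request) before mapping them.
}
```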

 
resulting in huge VRAM savings (almost 4GB on a 4090).
It was Xbox's hope for the XSS, yet they haven't got a single title out using it.
Nice to see it actually in action; I wonder if it's already being superseded by Neural Texture Compression.

Handhelds, Switch 2, cloud streaming and storage could all do with a lower storage footprint, and once engine streaming gets updated, VRAM as well.

Are the two mutually exclusive, or can SFS be used to stream in NTC?
 
It was Xbox's hope for the XSS, yet they haven't got a single title out using it.
It's likely they never will, either.
Xbox has given up on any custom features for their consoles since all their games going forward need to be portable to Playstation.
 
SFS is not the massive custom thing they promoted it as. Just some minor bits are; all DX12U GPUs can use Sampler Feedback.
You can just use the SSD speed to swap in larger textures, or even get away with lower-quality textures 🤷‍♂️
It's just making use of the hardware, the same way they do with the PS5 Pro.

Talking multiplat, it could also be useful on Switch 2.
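
For what it's worth, the "all DX12U GPUs can use SF" part comes down to a single capability query. A minimal sketch, assuming an already-created ID3D12Device (nothing here is engine-specific):

```cpp
#include <d3d12.h>

// Sketch: checking whether a GPU exposes Sampler Feedback at all.
// DX12 Ultimate parts report TIER_0_9 or higher.
bool SupportsSamplerFeedback(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                           &options7, sizeof(options7))))
        return false;
    return options7.SamplerFeedbackTier != D3D12_SAMPLER_FEEDBACK_TIER_NOT_SUPPORTED;
}
```
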
 
Half-Life 2 RTX uses RTX IO (which shares the same functionality as DirectStorage). It was already featured in Portal RTX and Portal RTX Prelude, but this time it implements the first-ever game application of Sampler Feedback Streaming, resulting in huge VRAM savings (almost 4GB on a 4090).



It's a good title for SFS to be implemented on. It's an older engine, so I'm not sure how much work they wanted to redo to get texture streaming going. Hardware-based texture streaming with SFS is a good bolt-on here. Unfortunately, I'm not sure how many other titles will need it, considering most are on software-based VT systems today, having built up a majority of that work over the last generation.
 
There is actually another title with SFS: it is Indiana Jones on the Series S, but there is not much out there about it atm!
It's only on the Series S version?
So it's not something that requires a ground-up reengineering of the renderer?
(or maybe it does and they actually did that because it was a first party game that MS insisted works without compromise on Series S)
 
I am actually not sure if it is on XSX, but I know 100% it is on XSS at least, lol, sorry. It should also be there for Doom: The Dark Ages when that comes out, I imagine.
 
It's only on the Series S version?
So it's not something that requires a ground-up reengineering of the renderer?
(or maybe it does and they actually did that because it was a first party game that MS insisted works without compromise on Series S)
We are more likely to see its adoption in engines that already support Tiled Resources. Gears 5 and Halo Infinite are the only ones that come to mind for me that aren't part of the above list. The list of games that leverage Tiled Resources is not very extensive. It's a topic that has been covered here before: hardware virtual texturing has very specific constraints compared to software VT, which has pushed most developers to choose their own VT path.

I would be curious to know if it's leveraged in the Creation Engine. SFS generally cannot be leveraged without TR; the API doesn't allow for it, at least that's my understanding.
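
For context on why TR is treated as the prerequisite: once sampler feedback says which tiles are needed, the actual streaming and eviction comes down to UpdateTileMappings on a reserved resource. A minimal sketch with placeholder coordinates, heap, and resources (not from any particular engine):

```cpp
#include <d3d12.h>

// Sketch of mapping a single 64KB tile of a reserved (tiled) texture into heap memory,
// which is what per-tile streaming ultimately boils down to. Unmapping a tile is the
// same call with D3D12_TILE_RANGE_FLAG_NULL instead of a heap offset.
void MapOneTile(ID3D12CommandQueue* queue,
                ID3D12Resource* reservedTexture,   // created with CreateReservedResource
                ID3D12Heap* tilePoolHeap,
                UINT heapTileOffset)
{
    D3D12_TILED_RESOURCE_COORDINATE coord = {};
    coord.X = 2; coord.Y = 3;          // which tile within the mip (illustrative)
    coord.Subresource = 0;             // mip 0

    D3D12_TILE_REGION_SIZE region = {};
    region.NumTiles = 1;

    D3D12_TILE_RANGE_FLAGS rangeFlags = D3D12_TILE_RANGE_FLAG_NONE;
    UINT rangeTileCount = 1;

    queue->UpdateTileMappings(reservedTexture,
                              1, &coord, &region,          // one region of one tile
                              tilePoolHeap,
                              1, &rangeFlags, &heapTileOffset, &rangeTileCount,
                              D3D12_TILE_MAPPING_FLAG_NONE);
}
```
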
 
I'm kind of worried... TLOUP2's recommended specs have been announced, and in their tweet they also confirm that TLOUP2 will use DirectStorage... and considering Nixxes is either helping with or doing this port, I'm left worried that the game is going to have the same issues as all of Nixxes' other games which utilize DirectStorage... and by that I mean stuttering going in and out of cinematics and on camera cuts, both of which basically affect Spider-Man 2 to a ridiculous degree... as well as general frametime shittiness. But hey, here's hoping it's just Insomniac's engine.

Tweet:

Spec sheet


Link to blog post:
 
Only if they use GPU decompression, which they've only used on two ports.
 