Digital Foundry Article Technical Discussion [2022]

Status
Not open for further replies.
Given how deeply ingrained the I/O and asset management would be in a game like Spider-Man, which is predicated on streaming in the city around you at all times, there would have to be two entire independent code bases in the one binary, switchable at startup by a hex edit. Chance of that? 0.0%. It makes literally no sense to build and distribute a game this way.

If this were 1970s-complexity I/O (a call to read_io hex-edited to point at read_io_ds), sure. But things have not worked like that for four decades.
That's a strong rebuttal, but let me try to counter. If I were building a game that targets two different IO stacks I would build an abstraction layer around them, i.e., a shim/wrapper that provides a uniform interface to the rest of the game and hides the complexity of the underlying APIs. It's possible that the nature of the game and the difference between the two APIs may be so vast that building such an abstraction layer may be hard (i.e., it would either not be much of an abstraction or would invalidate the advantages of DirectStorage). But in my experience it's always possible to tease something out. In any case this is all hyper-speculative, I'm basically imagining things now.
 
That's a strong rebuttal, but let me try to counter. If I were building a game that targets two different IO stacks I would build an abstraction layer around them, i.e., a shim/wrapper that provides a uniform interface to the rest of the game and hides the complexity of the underlying APIs. It's possible that the nature of the game and the difference between the two APIs may be so vast that building such an abstraction layer may be hard (i.e., it would either not be much of an abstraction or would invalidate the advantages of DirectStorage). But in my experience it's always possible to tease something out. In any case this is all hyper-speculative, I'm basically imagining things now.
If your aim is to reduce I/O overhead, building a wrapper around the I/O subsystem is only going to hurt performance. That said, like you, I would definitely take the same approach if I needed to write software that has to support two very different software stacks.

Where I think this becomes moot is that Spider-Man Remastered only supports 64-bit Windows 10 and Windows 11, which both support DirectStorage, and from what has been published, DirectStorage is no slower than the legacy I/O stack, which means supporting two I/O stacks on a common API platform is completely unnecessary.

If Spider-Man works using DirectStorage then it works and there is no need to include the legacy Windows 10/11 I/O model.

I do appreciate you responding :yes:
 
I wonder why Spider-Man would reference DirectStorage at all in the first place though?
If you've experienced the way code is maintained and shipped in a lot of software, you may find it surprising how many references to unsupported and even deprecated APIs still exist in the binary, even though that specific code is never run and never causes problems.

A bunch of software has been inadvertently shipped with full debug references, which makes reverse engineering it way easier.

This is probably a sign that Insomniac have experimented, or are experimenting, with DirectStorage.
 
I wonder why Spider-Man would reference DirectStorage at all in the first place though?

In programming it's common during development to have a global variable that can switch on/off/between different methods, functions, or libraries as programmers implement and experiment with various ways to accomplish something, or attempt to include features that may or may not be finished in time for initial release.

When finalizing shipping code, it's not uncommon for that global variable to be set (hard-coded, or removed entirely) such that there are no longer references to the pieces of code that aren't going to be used in the shipped product.

If there is time, attempts will be made to remove unnecessary code. This also assumes that good coding practices have been maintained and everything is well documented such that it is relatively easy to find and remove said code.

If there isn't time or code maintenance and documentation isn't the best, then remnants of unused code (including such simple things as references or pointers to external libraries) will remain.

Regards,
SB
 
How have you managed to flip this on to me?
You're...literally the person who proposed this, and when given pushback as to how feasible it likely is, you act like your honor has been assailed. His initial explanations of why swapping over to DirectStorage via a simple hex edit likely isn't possible were completely polite. This isn't complex (unlike integrating a new filesystem API).

I have no words.

If only.
 
I think his hypothesis is that an entire DirectStorage based IO stack has already been implemented into the game as an alternate path, but isn’t being used for whatever reason. E.g., perhaps the implementation is complete, but testing/validation was incomplete and the devs decided to play it safe and use the legacy stack in the shipping game. But the code paths still exist and the hex edit is meant to switch to them (per the hypothesis).

It’s certainly plausible, but is predicated on a number of independent assumptions being true. Occam’s Razor tells me that this is unlikely.
Most games are built from a single multiplatform code base, released as one binary per platform. So you'll see something like
#if defined(DIRECTX)
    // run this code
#elif defined(GNM)
    // run this code
#elif defined(VULKAN)
    // run this code
#endif
and I think a bunch of graphics code is written in this block format; the compiler will only compile the single version needed. I think abstractions are much too slow, and I largely suspect it's just conditional compilation.
It could definitely be there, but it could never be enabled without a recompile.
 
If you've experienced the way code is maintained and shipped in a lot of software, you may find it surprising how many references to unsupported and even deprecated APIs still exist in the binary, even though that specific code is never run and never causes problems.

I know it's not binary, but if anyone was to judge my code by the comments they'd think that // TODO: was an awesome new type of functionality that had been fully integrated across the codebase.

This is probably a sign that Insomniac have experimented, or are experimenting, with DirectStorage.

Maybe, or I guess it could also be from Nixxes, and they have been experimenting with it in their software. Not everything in their port of Spider-Man needs to have been built from the ground up for this game; they could be pulling in their previous code to make the best port they can with the time and resources available. Hopefully we'll see it in a patch down the line. Might make an interesting early comparison!
 
Most games are built from a single multiplatform code base, released as one binary per platform. So you'll see something like
#if defined(DIRECTX)
    // run this code
#elif defined(GNM)
    // run this code
#elif defined(VULKAN)
    // run this code
#endif
and I think a bunch of graphics code is written in this block format; the compiler will only compile the single version needed. I think abstractions are much too slow, and I largely suspect it's just conditional compilation.
It could definitely be there, but it could never be enabled without a recompile.
And that, my friend, is why I can never find work at a place that needs to write optimized-to-the-metal code. I like to live in beautiful ivory towers of abstractions :cool:
 
Maybe, or I guess it could also be from Nixxes, and they have been experimenting with it in their software. Not everything in their port of Spider-Man needs to have been built from the ground up for this game; they could be pulling in their previous code to make the best port they can with the time and resources available. Hopefully we'll see it in a patch down the line. Might make an interesting early comparison!

Yup, this could be something that Nixxes have in their own common set of I/O libraries as the basis for all of their Windows code, and this is a remnant that maybe shouldn't even be in the Spider-Man code. When your project is hundreds of thousands, or millions, of lines of code, nobody is going through that codebase to remove redundant unused code; it just isn't a good use of time.
 
Most games are built from a single multiplatform code base, released as one binary per platform. So you'll see something like
#if defined(DIRECTX)
    // run this code
#elif defined(GNM)
    // run this code
#elif defined(VULKAN)
    // run this code
#endif
and I think a bunch of graphics code is written in this block format; the compiler will only compile the single version needed. I think abstractions are much too slow, and I largely suspect it's just conditional compilation.
It could definitely be there, but it could never be enabled without a recompile.

Those are preprocessor commands for producing different binaries/compiling (not running) different code. Whenever you recompile you get a different binary (in general, not considering linked libraries/dlls etc).
 
Those are preprocessor commands for producing different binaries/compiling (not running) different code. Whenever you recompile you get a different binary (in general, not considering linked libraries/dlls etc).
Fairly certain what was implied was that it's a single binary for Platform X and a different single binary for Platform Y and another for Platform Z and not that a single binary would run for all platforms, as they have development experience.
 
If Spider-Man works using DirectStorage then it works and there is no need to include the legacy Windows 10/11 I/O model.
And for those running the game on Windows 10/11 from SATA III SSDs, which from my memory are not DS compliant?

Forspoken's minimum specs say it runs on Windows 7; I wonder how they've managed to support both legacy I/O and DS in that game.
 
And for those running the game on Windows 10/11 from SATA III SSDs, which from my memory are not DS compliant?

Forspoken's minimum specs say it runs on Windows 7; I wonder how they've managed to support both legacy I/O and DS in that game.

DirectStorage supports SATA SSDs and even HDDs. It's just that you only get the benefits (or at least the bulk of them) with an NVMe SSD.
 
Those are preprocessor commands for producing different binaries/compiling (not running) different code. Whenever you recompile you get a different binary (in general, not considering linked libraries/dlls etc).
Yes, precisely, that's what it is. A single code base that compiles into different binaries for the APIs they want to use.
 
And for those running the game on Windows 10/11 from SATA III SSDs, which from my memory are not DS compliant?

It works fine, it just doesn't benefit.

Forspoken's minimum specs say it runs on Windows 7; I wonder how they've managed to support both legacy I/O and DS in that game.

Luminous Productions did a 30-minute talk on Forspoken in their GDC presentation earlier this year. They cover DirectStorage.
 
Alex's tech interview with Nixxes is up on Eurogamer.net


The section about PSO/Shader compilation confirms what we already knew. It mostly comes down to how tedious it is for developers (QA) to collect PSO data so they can use it to pre-compile shaders during loading screens, or in the background when assets are loaded.

Hats off to Nixxes for doing it right, as much of a pain in the butt as it is, by having their QA teams go through and generate the best data possible to ensure this isn't an issue. I wish more studios took the issue as seriously as they do.
 