Originally Posted by Scali
Doesn't that have to do with the virtual video memory system in D3D10 though?
From what I understood, in DX9 all texture memory is mapped into the virtual address space at all times. But with DX10 they aren't mapped into the address space at all unless you specifically Map() them...?
That's pretty much what I meant. What I was getting at (poorly, now that I re-read the post) was that because the game features an in-engine texture streamer, you may have to control for it when testing.
From what I understand, this built-in streaming engine was introduced to keep the game from crashing into the 32-bit address-space limit in DX9 mode at the higher texture settings. I've never hit it myself, but I know some setups can't run the game at full detail with streaming disabled for exactly that reason. It's a fine workaround for a DX9 problem, but it's wholly unnecessary under DX10, and it's still enabled by default there. So, ideally, you'd want to disable in-engine texture streaming manually during testing (edit: setting textures to low or medium achieves the same result).
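For what it's worth, if you want to force it off explicitly rather than rely on the texture-setting side effect, CryEngine reads console variables from a config file in the game folder. A minimal sketch, with the caveat that I'm writing the cvar name from memory, so verify it in the in-game console before trusting benchmark numbers from it:

```
; autoexec.cfg (placed in the Crysis install root)
; cvar name is from memory -- confirm it exists via the in-game console first
r_TexturesStreaming = 0
```

You can also just type the same line into the console (~) to toggle it for a single session instead of editing the file.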
Don't get me wrong, I've never managed to get Crysis to run faster under DX10 than under DX9, and I think that's genuinely an engine issue. But if you want to test the actual rendering portion of the engine (especially since you're using custom shaders), you'd want to make sure the streaming portion is disabled, because it's a performance-affecting workaround for an altogether unrelated problem.