CouldntResist
Regular
Looks like there might be one in the works: http://www.winehq.org/?issue=320
> Big problems are the undocumented binary formats for shaders and effects.

Like this undocumented binary format: Shader Code Format?
> No, the D3D9 format was always documented. D3D10 has a new format that even contains a hash key. The WDK is even missing the header file that contains the structures and enumerations.

I don't see anything on that page that tells it's D3D9-only, or not D3D10, but it's in the Windows Vista Display Driver Model Reference. And as far as I know this format has been used from Shader Model 1.0 to Shader Model 3.0. I see little reason to use anything completely different for Shader Model 4.0.
What's the hash for?
> Believe me, I have checked this on my own with a hex editor and decoded some of it.

I believe you.
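Poking at an undocumented blob usually starts exactly like that. A minimal hex-dump helper makes the structure visible; the sample bytes below are made up for illustration, not the real D3D10 layout:

```python
import struct

def hexdump(blob, width=16):
    """Render offset, hex bytes and printable ASCII for a binary blob."""
    lines = []
    for off in range(0, len(blob), width):
        chunk = blob[off:off + width]
        hexpart = " ".join(f"{b:02x}" for b in chunk)
        asciipart = "".join(chr(b) if 32 <= b < 127 else "." for b in chunk)
        lines.append(f"{off:08x}  {hexpart:<{width * 3}} {asciipart}")
    return "\n".join(lines)

# Made-up sample container: 4-byte magic, a version dword, then payload.
blob = b"DXBC" + struct.pack("<I", 1) + b"fake payload"
print(hexdump(blob))

# Once a fixed-size header is recognizable, struct.unpack_from pulls it apart.
magic, version = struct.unpack_from("<4sI", blob, 0)
```

Recognizable magic strings and length fields in the ASCII column are usually the first foothold when decoding a format like this by hand.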
> Validation. The compiler signs the binary shader it generates and the runtime later checks it. If the hash doesn't match, it refuses to create the shader.

Sorry, I don't fully see the point of that. Is it a form of authentication (avoiding third-party tools), or maybe threading-related? Anyway, this doesn't seem like a hard reverse-engineering problem.
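The sign-then-verify scheme described above is easy to sketch in outline. This is only the concept, not the real D3D10 algorithm: the actual hash function and header layout are exactly the undocumented part, so MD5 and the 16-byte prefix here are stand-ins.

```python
import hashlib

def sign(payload: bytes) -> bytes:
    """'Compiler' side: prepend a digest of the payload (MD5 is a stand-in)."""
    return hashlib.md5(payload).digest() + payload

def create_shader(container: bytes) -> bytes:
    """'Runtime' side: recompute the digest and refuse the blob on mismatch."""
    stored, payload = container[:16], container[16:]
    if hashlib.md5(payload).digest() != stored:
        raise ValueError("hash mismatch: refusing to create shader")
    return payload

good = sign(b"fake shader bytecode")
assert create_shader(good) == b"fake shader bytecode"

# Patching the payload without re-signing makes create_shader raise ValueError.
tampered = good[:16] + b"patched bytecode!!!!"
```

Whatever the motivation, a check like this does lock out third-party bytecode generators until the hash algorithm itself is reverse-engineered.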
The problem actually lies somewhere else -- in the driver. Especially once DX10.1 comes out, drivers will need to support context switching on the GPU. In other words, the GPU will have to be capable of continuing some other work instead of waiting for that lengthy transfer from system memory to finish.
I don't see how something like that could be emulated without driver support.
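The scheduling problem being described, letting other queued work proceed while a long transfer is in flight, has the same shape as any asynchronous copy. A CPU-side analogy, with threads standing in for GPU contexts (nothing here talks to a driver):

```python
import threading
import time

done = threading.Event()
results = []

def long_transfer():
    # Stand-in for a lengthy system-memory -> GPU upload.
    time.sleep(0.2)
    results.append("transfer finished")
    done.set()

t = threading.Thread(target=long_transfer)
t.start()

# Without context switching the GPU would stall at this point; with it,
# other queued work keeps running until the transfer completes.
while not done.is_set():
    results.append("did other work")
    time.sleep(0.05)
t.join()
```

The point of the thread above is that this interleaving has to happen inside the driver and hardware scheduler; an API-level emulation layer has nowhere to put it.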
> Again, DX10 doesn't just bring a few new PS, VS and GS functions, it also brings very low-level performance enhancements that simply cannot be "emulated". And when a dev builds a DX10 render mode into their game, the very last thing you'll want to do is somehow build an emulator for it. Let alone the nightmare of the DX -> OGL instruction conversion, since a ton of them simply don't map logically 1:1 like that anyway.

It won't be that easy to map D3D10 to current OpenGL (+ extensions), right. But that doesn't mean OpenGL Mt Evans can't implement the same improvements D3D10 does. In fact, we already know the new object model will be much more tuned for performance.
> There continues to be some obvious confusion as to what DX10 is really giving you. It isn't just about a few extra features; it's also about a large number of efficiency enhancements. Sure, there is now geometry shader stuff and other pixel formats you can access (which wouldn't be "emulatable"), but there are also new memory access methods and driver state changes and a ton of other things that you simply could never emulate. Most of what current devs are doing with DX10 is adding extra graphics goodies, because the efficiency enhancements provide acceptable performance with them enabled. By trying to emulate DX10, you'd be shooting yourself in the foot TWICE: running code that needs the extra efficiency in an emulation layer that will be FAR less efficient than even native DX9.

Could you share links to the tests showing that DX10 on Vista is faster than OGL on XP?
I don't see why this would need to be emulated if you don't use it. I guess most people would just want to run DX10 games (no context switch required), and not hack/patch WinXP to use a 3D desktop.