DmitryKo
Veteran
Small text here since it's mostly off-topic.
It does support the fixed pipeline. That's why you have to implement DDI calls for state changes like fog (which you actually have to implement) or the color palette (I believe this one is a dummy). Some things are taken care of by the graphics stack, some aren't (or are terribly inefficient).
I'd guess fog table emulation in the DDI9 part of the driver has zero effect on DDI10 functionality used for Direct3D10/11 games.
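For illustration, here is a minimal sketch (not from the post, just an assumed example) of the kind of fixed-function fog state an old Direct3D 9 title sets; each of these SetRenderState calls eventually reaches the driver's DDI9 render-state handler, which is where the fog table emulation discussed above lives. The device pointer is assumed to come from a normal IDirect3D9::CreateDevice call.
Code:
#include <d3d9.h>

// Enable classic per-pixel (table) fog the fixed-function way.
void EnableLinearFog(IDirect3DDevice9* device)
{
    float fogStart = 10.0f, fogEnd = 200.0f;

    device->SetRenderState(D3DRS_FOGENABLE, TRUE);
    device->SetRenderState(D3DRS_FOGCOLOR, D3DCOLOR_XRGB(128, 128, 128));
    // This is exactly the state the driver has to honour in its DDI9 path,
    // even on hardware with no fixed-function fog units left.
    device->SetRenderState(D3DRS_FOGTABLEMODE, D3DFOG_LINEAR);
    // Float render states are passed as their raw DWORD bit patterns.
    device->SetRenderState(D3DRS_FOGSTART, *reinterpret_cast<DWORD*>(&fogStart));
    device->SetRenderState(D3DRS_FOGEND,   *reinterpret_cast<DWORD*>(&fogEnd));
}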
Currently none, mostly because current mobile graphics parts only support DDI9 and feature level 9_3.
And the alternative is?
My point was, the WDDM driver is still required to support DDI9 and DDI11 alongside DDI12 for that exact reason - to maintain compatibility with old games, since DDI9 is still used by the Direct3D 9 path and the 10Level9 path in Direct3D 11. This way, compatibility problems remain the responsibility of IHVs, and these problems seem to be huge.
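To make the 10Level9 point concrete, here is a hedged sketch of how a Direct3D 11 application lands on that path: it asks the runtime for feature level 9_3, and the runtime then maps its D3D11 calls onto the driver's DDI9 entry points. Function name and structure are just an assumed example.
Code:
#include <d3d11.h>

// Link with d3d11.lib. Returns true only if a 9_3 device was created.
bool CreateLevel93Device(ID3D11Device** outDevice,
                         ID3D11DeviceContext** outContext)
{
    const D3D_FEATURE_LEVEL requested[] = { D3D_FEATURE_LEVEL_9_3 };
    D3D_FEATURE_LEVEL got = {};

    HRESULT hr = D3D11CreateDevice(
        nullptr,                      // default adapter
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr, 0,                   // no software rasterizer, no flags
        requested, 1,                 // only ask for feature level 9_3
        D3D11_SDK_VERSION,
        outDevice, &got, outContext);

    return SUCCEEDED(hr) && got == D3D_FEATURE_LEVEL_9_3;
}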
I'm talking about a possible future update that takes advantage of modern hardware.
And they'd have to accept that only a fraction of machines would run Win10. Brilliant strategy, why haven't they thought about it?
That is seemingly the reason why Microsoft is unwilling to repeat what they did in the Vista timeframe - i.e. either remap Direct3D 9-11 on top of Direct3D 12, or remap Direct3D 3-9 to Direct3D 11 and reimplement the latter on top of DDI12 in WDDM 2.0. This would be a huge task on its own, but they would also need to maintain the compatibility logic - probably by way of the Direct3D compatibility layer mentioned above.
Compatibility paths will remain. Microsoft got rid of the XPDM path only in Windows 8.0 - you could still use XP drivers in Vista/7 (and lose Aero/DWM), and Windows 8.x/10 still runs with WDDM 1.0 and level 9_3 graphics (but they lost Aero in the process).
Microsoft does have an 11on12 layer in Windows 10 which runs on top of either the Direct3D 12 runtime or DDI12 in the WDDM 2.0 driver (the docs are not clear on this yet).
https://msdn.microsoft.com/en-us/library/windows/desktop/dn913195(v=vs.85).aspx
They could use this layer to emulate Direct3D 11 and below on WDDM 2.x hardware at some point.
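For reference, the public entry point for that layer is D3D11On12CreateDevice. A minimal sketch, assuming a stock D3D12 device and one direct command queue, with error handling reduced to early returns:
Code:
#include <d3d11.h>
#include <d3d12.h>
#include <d3d11on12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Link with d3d11.lib and d3d12.lib.
bool CreateD3D11On12(ComPtr<ID3D11Device>& d3d11Device,
                     ComPtr<ID3D11DeviceContext>& d3d11Context)
{
    // A plain Direct3D 12 device on the default adapter...
    ComPtr<ID3D12Device> d3d12Device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&d3d12Device))))
        return false;

    // ...and a direct command queue for the wrapped device to submit to.
    D3D12_COMMAND_QUEUE_DESC queueDesc = {};
    queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    if (FAILED(d3d12Device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue))))
        return false;

    // The returned interfaces look like ordinary D3D11 objects, but all
    // work is translated and submitted through the D3D12 queue above.
    return SUCCEEDED(D3D11On12CreateDevice(
        d3d12Device.Get(),
        D3D11_CREATE_DEVICE_BGRA_SUPPORT,
        nullptr, 0,                                             // default feature levels
        reinterpret_cast<IUnknown**>(queue.GetAddressOf()), 1,  // one queue
        0,                                                      // node mask (single GPU)
        &d3d11Device, &d3d11Context, nullptr));
}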
I see, doomed if you don't, doomed if you do. Excellent. So if MS ships stuff - bad. If applications do - bad. If stuff doesn't work - bad. I guess the answer is either "you shouldn't have made any code mistakes in the last 20 years" or "just give up". Great outlook!
Thirdly, they "solved" the problem of perceived "DLL hell" by requiring applications to install common libraries - which doubled OS storage requirements, multiplied the support matrix, and made further security updates and bugfixes complicated.
No, doomed if you 1) take a half-assed file versioning implementation from the awful Windows ME, 2) apply it to a completely different NT-kernel operating system so it can boot from FAT32 volumes, 3) build the whole Windows Update system on top of it, 4) let the user uninstall any update at any time, resulting in an unpredictable system state, 5) port it to NTFS using crazy stuff like hard links, which makes file size reporting go nuts (see the sketch below), and 6) despite the inherent instability of the store, provide no repair or maintenance tools.
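On point 5: a naive tool that sums file sizes under WinSxS counts the same bytes repeatedly, because many component-store files are NTFS hard links to the copies under System32. A small hedged sketch of how such double counting can be detected via the per-file link count Windows reports (the path is just an example):
Code:
#include <windows.h>
#include <stdio.h>

// Print size and hard-link count for one file. A link count above 1 means
// these bytes are shared with another path and shouldn't be attributed
// to WinSxS (or any single directory) alone.
static void ReportLinkCount(const wchar_t* path)
{
    HANDLE h = CreateFileW(path, 0,
                           FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,
                           nullptr, OPEN_EXISTING, 0, nullptr);
    if (h == INVALID_HANDLE_VALUE)
        return;

    BY_HANDLE_FILE_INFORMATION info = {};
    if (GetFileInformationByHandle(h, &info))
    {
        ULONGLONG size = ((ULONGLONG)info.nFileSizeHigh << 32) | info.nFileSizeLow;
        wprintf(L"%ls: %llu bytes, %lu hard link(s)\n",
                path, size, info.nNumberOfLinks);
    }
    CloseHandle(h);
}

int wmain()
{
    // Example path only; real tools would walk the component store.
    ReportLinkCount(L"C:\\Windows\\System32\\kernel32.dll");
    return 0;
}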
Only recently did Microsoft take some steps in the right direction: releasing "rollup" update packages that can't be uninstalled and are required for future updates, releasing ISO images of the updated installation media for the general public, and slowly moving to technologies like WIMBoot which make far more sense.
No no no, WinSxS in Windows ME was a terrible idea that tried to solve problems Microsoft had mostly created themselves - that is, introducing "bugfixes" and "feature enhancements" to common libraries which broke older applications! It was all their fault, and they blamed it on application developers.
Sure, it's their fault that they fixed bugs, but it'd also be their fault if said bug was exploited on your machine. Clever, but it doesn't work like this.
And the clever part of their solution was to let the bug reign if the application requested it. The not-so-clever part was to build WinSxS into Windows 2000, which was unnecessary as it already had a perfectly fine file security system.
OS X 10.3 worked like a charm on our Power Macs G4 and G5. Never had a virus infection or any problem with the system software.
Which piece of code in 2005 (or 2001 if we're discussing XP) was of much higher quality?
The Vista reset happened in late 2003 to mid-2004, the same time they prepared SP2 for Windows XP. Whatever their focus was before that, it failed, as witnessed by multiple accounts, including one made by Jim Allchin.
And it's not true that there was no focus on quality and security before the Vista reset. I know, because I interned before Vista shipped, and there were tons of threat analysis docs from the pre-Vista timeframe, plus procedures and tools aiding development. MS had static and dynamic analysis for ages. App Verifier and Driver Verifier existed in the XP timeframe. PREfix and PREfast (which was released as OACR) existed for some time too. I appreciate your opinion; the problem is it laughs in the face of the facts.
And the fact that lead developers have to perform support duties has absolutely nothing to do with the quality of MSDN documentation or the design of the API...
Most (if not all) Windows teams spend time with 3rd-party developers and help them use APIs and whatnot correctly. This was definitely true from Vista onwards, but my guess is this wasn't new (the WinSE guys had these processes as well, so these contacts applied to XP sustained engineering too). And you didn't have to be a huge software shop, you just had to be smart. We've had two- or three-person companies mailing us with questions and visiting us on-site once or twice a year.