No, DDI9 in WDDM 1.x does NOT support fixed pipeline. Microsoft emulates all fixed-function paths with Direct3D 9 shader code when you run WDDM drivers, and also remaps all Direct3D 3-8 functionality to Direct3D 9.
It does support fixed pipeline. That's why you have to implement DDI calls for state changes like fog (you have to implement it) or color palette (I believe this one is a dummy). Some things are taken care of by the graphics stack, some aren't (or are terribly inefficient).
My point was, a WDDM driver is still required to support DDI9 and DDI11 alongside DDI12 for that exact reason - to maintain compatibility with old games, since DDI9 is still used by the Direct3D 9 path and by the 10Level9 path in Direct3D 11. This way, compatibility problems remain the responsibility of IHVs, and these problems seem to be huge.
And the alternative is?
That is seemingly the reason why Microsoft are unwilling to repeat what they did in the Vista era - i.e. either remap Direct3D 9-11 on top of Direct3D 12, or remap Direct3D 3-9 to Direct3D 11 and reimplement the latter on top of DDI12 in WDDM 2.0. This would be a huge task on its own, but they would also need to maintain compatibility logic - probably by way of the Direct3D compatibility layer mentioned above.
And they'd have to accept that only a fraction of machines would run Win10. Brilliant strategy, why haven't they thought about it?
"Software hack" is a myth of the Direct3D 8 era, when we still had things like "hardware fog". In this day and age, if your processing unit does not have an opcode, operand, addressing mode, swizzle mode, page table or TLB that some feature requires, that is the end of the story. Trying to emulate these things will kill your performance and reduce the number of valuable resources ("slots") available to applications.
Man, I wish you knew how many waivers some pieces of HW get for validation tests they can't handle. It obviously gets better, but it's a myth that software workarounds are a myth. Read sebbbi's response above.
Then I finally upgraded to Windows 8.1, and the game became rock solid - all on the same hardware with the same Catalyst driver version. Doesn't really look like driver validation problem to me...
Sure, rearranging stuff in memory doesn't change stability of a poorly written driver (or any piece of code for that matter).
There's a chance that what you've experienced is a DXGK problem, but it's much less likely, since the same DXGK runs for everyone while your driver is run by a fraction of Windows users. Coverage matters - that's why AAA games ship with bugs that were never seen in QA (100 people, 40h/week, 3 months of stabilization is ~50k hours; 100k gamers playing 2h of the game on day one is 4x the time the code runs). I have to assume that you don't write systems code, correct?
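The coverage arithmetic in the parenthesis above can be checked as a back-of-the-envelope calculation. The numbers are the illustrative ones from the post, not measured data:

```python
# QA coverage: 100 testers * 40 hours/week * ~12 weeks (3 months) of stabilization.
qa_testers = 100
hours_per_week = 40
weeks_of_stabilization = 12
qa_hours = qa_testers * hours_per_week * weeks_of_stabilization
print(qa_hours)  # 48000 -> the "~50k hours" from the post

# Field coverage on day one: 100k gamers each playing 2 hours.
gamers_day_one = 100_000
hours_played_each = 2
field_hours = gamers_day_one * hours_played_each
print(field_hours)  # 200000

# A single day in the field exercises the code ~4x longer than all of QA.
print(field_hours / qa_hours)  # ~4.17
```

The point being made: rare driver and OS bugs surface in the field simply because total execution time there dwarfs anything a QA lab can accumulate.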
No no no, WinSxS in Windows ME was a terrible idea that tried to solve problems Microsoft mostly created themselves - that is, introducing "bugfixes" and "feature enhancements" to common libraries which broke older applications! It was all their fault, which they blamed on application developers.
Sure, it's their fault that they fixed bugs, but it'd also be their fault if said bug was exploited on your machine. Clever, but it doesn't work like this.
First of all, they didn't really follow secure software design practices until the Vista reboot, hence Windows before XP SP3 was full of security exploits, buffer overruns, and other dreaded bugs.
Dude, it's 2015, you're arguing the quality of 10-year-old software. Which piece of code in 2005 (or 2001 if we're discussing XP) was of a much higher quality? And it's not true that there was no focus on quality and security before the Vista reboot. I know, because I interned before Vista shipped, and there were tons of threat analysis docs from the pre-Vista timeframe, plus procedures and tools aiding development. MS has had static and dynamic analysis for ages. App Verifier and Driver Verifier existed in the XP timeframe. PREfix and PREfast (which was released as OACR) existed for some time too. I appreciate your opinion; the problem is it laughs in the face of facts.
Secondly, their MSDN documentation was not clear enough, because it was written by people who had access to the OS source code for developers who didn't, and it was a source of many misunderstandings.
Here you're arguing the pre-2000 state, since MS was forced (and rightfully so) to document everything they themselves use, so that e.g. Office can't get an unfair advantage over other software. This intensified in 2007, when the EU ordered production of absurdly detailed documentation for everything created from then on. Sure, some pieces of code on MSDN were and are crap. This has nothing to do with access to source and everything to do with the structure: documentation is written by technical writers. Some of them are great, some... not so much. But if you have some experience with Win32, then you'll be able to easily spot problems with sample code and documentation. It doesn't matter how good the documentation is; if someone doesn't care about code quality, things will break. And they did.
On top of that, a lot of partners have communication channels you're not aware of. Most (if not all) Windows teams spend time with 3rd-party developers and help them use APIs and whatnot correctly. This was definitely true from Vista onwards, but my guess is this wasn't new (the WinSE guys had these processes as well, so these contacts applied to XP sustained engineering too). And you didn't have to be a huge software shop, you just had to be smart. We've had 2-3 person companies mailing us with questions and visiting us on-site once or twice a year. Their code was being debugged, and problems with API usage and documentation were identified and fixed. There's only so much you can do when EVERYONE can code for your platform. And as much as you want to hate, a lot was done.
Thirdly, they "solved" the problem of perceived "DLL hell" by requiring applications to install their own copies of common libraries - which doubled OS storage requirements, multiplied the support matrix, and made further security updates and bugfixes complicated.
I see, doomed if you do, doomed if you don't. Excellent. So if MS ships stuff - bad. If applications do - bad. If stuff doesn't work - bad. I guess the answer is either "you shouldn't have made any code mistakes in the last 20 years" or "just give up". Great outlook!