"A higher feature level does not necessarily indicate more advanced features."
Feature levels directly expose very specific capabilities of the underlying hardware - these are "advanced" by definition, since working around unsupported capabilities could be very costly or outright impossible.
"Level 3 contains everything in level 1 and level 2, so just have level 3! Just because a version has certain features doesn't mean the game has to use them, and just because a game doesn't use certain features doesn't mean there needs to be a separate DX version with those features omitted."
We discussed this in the Direct3D feature levels discussion.
"Let's just have the highest level" logic doesn't work here: there is still graphics hardware that does not support the higher levels (and that also uses a simpler version of the driver API, WDDM/DXGK, which does not expose the advanced features of the runtime), and there is still code that targets the lower levels and would gain little from a higher level without significant refactoring and new graphics assets.
The feature levels were not designed from top to bottom. If you recall, DirectX 10 was designed as a clean break to fix the capability bits (CAPS) problem of DirectX 8.x-9.0, where a multitude of optional features made it hard to maintain code paths for different vendors. So Direct3D 10.0 eliminated the capability bits almost completely and required a strict set of features, including a set of supported DXGI texture formats; however, many operations on these formats (filtering, multisampling, MRT, etc.) still had to be queried with D3D10_FORMAT_SUPPORT.
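To make that concrete, here is a minimal sketch of such a per-format query, assuming a valid ID3D10Device pointer; the function name and the chosen flags are just illustrative:

```cpp
#include <d3d10.h>

// Even on hardware that meets the strict Direct3D 10.0 feature set,
// render-target and multisample usage of a given DXGI format still
// has to be checked at run time via the D3D10_FORMAT_SUPPORT flags.
bool SupportsMultisampledRenderTarget(ID3D10Device* device, DXGI_FORMAT format)
{
    UINT support = 0;
    if (FAILED(device->CheckFormatSupport(format, &support)))
        return false;

    return (support & D3D10_FORMAT_SUPPORT_RENDER_TARGET) != 0 &&
           (support & D3D10_FORMAT_SUPPORT_MULTISAMPLE_RENDERTARGET) != 0;
}
```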
As more capable hardware appeared with Direct3D 10.1, new "global" features had to be advertised for the programmer to discover. This is how feature levels first appeared, and there were only two of them: 10_0 for existing hardware and 10_1 as a strict superset that included the new capabilities. The set was further expanded to 11_0 and 9_x (10Level9) in Direct3D 11; level 11_1 and a few options were added in Direct3D 11.1 for Windows 8.0, and even more options came in 11.2 for Windows 8.1 and 11.3 for Windows 10.
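To show how a program discovers this today, here is a rough sketch of feature level negotiation in Direct3D 11: the application passes the levels it can handle, from highest to lowest, and the runtime reports which one the driver actually provides (variable names are placeholders):

```cpp
#include <d3d11.h>

const D3D_FEATURE_LEVEL requested[] = {
    D3D_FEATURE_LEVEL_11_1,  // note: on a plain 11.0 runtime this entry makes
                             // the call fail with E_INVALIDARG; retry from 11_0
    D3D_FEATURE_LEVEL_11_0,
    D3D_FEATURE_LEVEL_10_1,
    D3D_FEATURE_LEVEL_10_0,
    D3D_FEATURE_LEVEL_9_3,
    D3D_FEATURE_LEVEL_9_1,
};

ID3D11Device*        device  = nullptr;
ID3D11DeviceContext* context = nullptr;
D3D_FEATURE_LEVEL    obtained;

HRESULT hr = D3D11CreateDevice(
    nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
    requested, sizeof(requested) / sizeof(requested[0]),
    D3D11_SDK_VERSION, &device, &obtained, &context);
// On success, 'obtained' holds the highest level from 'requested'
// that the hardware and driver actually expose.
```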
Now, from the system architecture point of view, the device driver doesn't really have to support all of the lower levels when it supports the higher ones. It could advertise only the highest supported capabilities and let the Direct3D runtime handle the rest, since each higher level is a strict superset of the lower ones - and this is exactly how it works for levels 10_x and 11_x in Direct3D 11.1/11.2 (though the runtime still uses DDI9 for 10Level9 even on level 11_x hardware).
In Direct3D 12, developers have explicit control over this with the Direct3D 11on12 layer.
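As a rough sketch of that layering (assuming the D3D12 device and command queue were created beforehand; names and flags here are illustrative), D3D11On12CreateDevice wraps the existing D3D12 device and lets the caller state which feature levels the D3D11 side will be used at:

```cpp
#include <d3d11on12.h>

// Sketch: wrap an already-created D3D12 device with the 11on12 layer,
// so D3D11 code can render through a D3D12 command queue.
HRESULT WrapWith11On12(ID3D12Device* d3d12Device, ID3D12CommandQueue* queue,
                       ID3D11Device** outDevice, ID3D11DeviceContext** outContext)
{
    const D3D_FEATURE_LEVEL levels[] = { D3D_FEATURE_LEVEL_11_0 };
    IUnknown* queues[] = { queue };
    D3D_FEATURE_LEVEL chosenLevel;

    return D3D11On12CreateDevice(
        d3d12Device,                       // underlying D3D12 device
        D3D11_CREATE_DEVICE_BGRA_SUPPORT,  // flags for the wrapped D3D11 device
        levels, 1,                         // feature levels the caller will use
        queues, 1,                         // D3D12 command queue(s) to submit to
        0,                                 // node mask (single GPU)
        outDevice, outContext, &chosenLevel);
}
```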
"The only reason that makes sense to me as to why MS would not just have DX12 level 3 as DX12 with no feature levels is if an IHV put pressure on them, saying 'our GPU supports all of DX12 except for one or two features which devs can work around or aren't that important, and because of that we will have to market our GPUs as only being DX11 compliant and we will lose sales; you need to come up with a solution so that we can market our GPUs as DX12 compliant.' Hence this feature level nonsense."
I think the logic was quite different.
Level 12_0 is supported on the Xbox One.
Level 12_1 requires Conservative Rasterization and Rasterizer Ordered Views - these provide a very efficient way to implement occlusion culling, order-independent transparency and ambient shadows, techniques which otherwise take a lot of effort on current hardware.
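For completeness, a hedged sketch of how an application could verify those two capabilities on an existing D3D12 device before relying on level 12_1 (the helper name is made up for illustration):

```cpp
#include <d3d12.h>

// Sketch: check the two capabilities that distinguish feature level 12_1
// from 12_0 on an already-created D3D12 device.
bool HasLevel12_1Features(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS, &options, sizeof(options))))
        return false;

    // Level 12_1 guarantees rasterizer ordered views and at least
    // tier 1 conservative rasterization on top of level 12_0.
    return options.ROVsSupported &&
           options.ConservativeRasterizationTier !=
               D3D12_CONSERVATIVE_RASTERIZATION_TIER_NOT_SUPPORTED;
}
```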