AMD Mantle API [updating]

Inaction? OpenGL has lost all relevance in the PC gaming market since 2007, when the Khronos Group dropped the long-awaited Longs Peak (OpenGL 3.0) proposal - which was exactly what D3D12 is about right now (a totally new pipeline and object model with immutable pipeline states) - and instead went the way of maintaining hundreds of optional extensions, just because CAD vendors were not willing to rewrite their rendering engines, or maybe some major IHV did not want to rewrite their drivers, whatever. They were unable to address the threat from D3D10/11, and now with Mantle and D3D12 they have fallen behind with little chance of recovery.
True.
Some extra extensions for high-performance rendering are not going to change the direction the API is going.
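For anyone wondering what "immutable pipeline states" actually buys you, here is a toy C++ sketch (hypothetical names, not any real API): all state is gathered into one descriptor, baked into an object at creation time, and never mutated afterwards, so the driver can validate and compile everything once instead of patching state at draw time.

```cpp
#include <cassert>
#include <string>

// Hypothetical descriptor: all pipeline state gathered up front.
struct PipelineDesc {
    std::string vertexShader;
    std::string pixelShader;
    bool        depthTest  = true;
    bool        alphaBlend = false;
};

// Immutable pipeline state object: validated/compiled once at creation,
// never mutated afterwards (the D3D12/Mantle-style model).
class PipelineState {
public:
    explicit PipelineState(const PipelineDesc& d) : desc_(d) {}
    const PipelineDesc& desc() const { return desc_; }
private:
    const PipelineDesc desc_;  // const: no SetDepthTest()-style mutators
};

// A context binds whole pipeline objects instead of toggling dozens of
// individual states one call at a time (the D3D9/OpenGL-style model).
class Context {
public:
    void bind(const PipelineState* ps) { current_ = ps; }
    const PipelineState* current() const { return current_; }
private:
    const PipelineState* current_ = nullptr;
};
```

Changing state means creating and binding a different `PipelineState`, which is where much of the per-draw CPU overhead of the old mutable-state model goes away.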

Do you really think that if Khronos added new useful features and MS *refused* to adopt them, the tide wouldn't reverse?
 
Do you really think that if Khronos added new useful features and MS *refused* to adopt them, the tide wouldn't reverse?
But are there really that many API-only features that would make a major difference?

At the end of the day, APIs are used to control underlying hardware. I'm sure that there are some hardware features that are currently not accessible due to API limitations, but there can't be that many of them.

I look at Mantle and DX12 (even if the latter has additional mystery features) as a way to make some operations a bit more efficient, but not something that will revolutionize games in any meaningful way. 15% faster in some cases looks nice on a bar graph, but it's only a minor difference in experience.
 
Epic can't even sell/license console-compatible builds of UE4
Interesting... does UE4 work through standard D3D11 and GNM/GNMX, so this is part of their standard SDK licensing policy? Or maybe Microsoft and Sony did allow Epic to build their own custom renderer that has direct low-level access to the hardware, using some Microsoft/Sony code? Just wondering.

Do you really think that if Khronos added new useful features and MS *refused* to adopt them, the tide wouldn't reverse?
Uhm. How can Microsoft practically refuse anything proposed by Khronos when they do not control the implementation?

First of all, Microsoft left OpenGL ARB in 2003 even before it was dissolved. It did not join the Khronos Group, so it has no say over OpenGL matters anymore, and doesn't seem to care anyway.

Second, while the Windows SDK has supported OpenGL for Win32 apps (but not WinRT, AFAIK) since at least Windows 95, hardware-accelerated rendering works through a vendor-specific ICD (installable client driver) - a vendor-supplied proprietary implementation that talks directly to the same vendor's display driver. The ICD, if provided by the display card vendor, completely replaces Microsoft's version of OpenGL32.dll (which is a software-only OpenGL 1.1 implementation, AFAIK). So Microsoft has no control over any specific implementation; it's entirely the graphics card vendor's business how they implement OpenGL in their display driver.
 
Interesting... does UE4 work through standard D3D11 and GNM/GNMX, so this is part of their standard SDK licensing policy?

If there was no source code included in the SDK, I'm sure they could license it to anyone, but MS doesn't want a homebrew scene springing up, including games and maybe apps that could bypass the official store.
 
Interesting... does UE4 work through standard D3D11 and GNM/GNMX, so this is part of their standard SDK licensing policy? Or maybe Microsoft and Sony did allow Epic to build their own custom renderer that has direct low-level access to the hardware, using some Microsoft/Sony code? Just wondering.

Usually middleware is set up so that they can easily strip out any platform-specific code for licensees that aren't registered for a particular platform. The easiest way to do this is to use files: for instance, anything PS4-specific goes in a file with _ps4 at the end of it. Then when they give the code to licensees, they simply exclude those files. Surely Epic did something similar for the version of their code that's semi-public on GitHub, and if you're a PS4/XB1 registered dev you would get the extra code.
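The file-suffix convention described above is simple to sketch. Here is a minimal C++ filter (the `_ps4`/`_xb1` suffixes and platform list are illustrative, not Epic's actual layout) that decides whether a given source file must be stripped for a licensee:

```cpp
#include <cassert>
#include <set>
#include <string>
#include <vector>

// Returns true if 'file' (e.g. "renderer_ps4.cpp") is specific to a
// platform the licensee is NOT registered for and must be stripped.
bool mustStrip(const std::string& file,
               const std::set<std::string>& licensedPlatforms) {
    // Illustrative convention: a filename stem ending in "_<platform>"
    // marks the file as platform-specific.
    static const std::vector<std::string> platforms = {"ps4", "xb1", "win"};
    const std::string stem = file.substr(0, file.rfind('.'));
    for (const auto& p : platforms) {
        const std::string suffix = "_" + p;
        if (stem.size() >= suffix.size() &&
            stem.compare(stem.size() - suffix.size(),
                         suffix.size(), suffix) == 0)
            return licensedPlatforms.count(p) == 0;  // strip if unlicensed
    }
    return false;  // platform-neutral code always ships
}
```

Running this filter over the source tree when packaging an SDK drop is all it takes to produce a per-licensee distribution.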
 
If there was no source code included in the SDK, I'm sure they could license it to anyone, but MS doesn't want a homebrew scene springing up, including games and maybe apps that could bypass the official store.

So in other words, Microsoft and Sony have very strict developer licenses to disallow things like Linux distributions, Windows loaders, etc.

The console's firmware is only a boot loader, and your game code basically controls the whole machine using the statically linked runtime provided by the SDK (which is supplied as compiled object files, not as source code). And since the machine architecture is the very familiar x86_64 this time, there are also lots of technical barriers to enforce these restrictions - code signing, encrypted firmware, etc.

So Microsoft and Sony have absolute control, and of course AMD is tied and bound by these licensing restrictions and intellectual property agreements.

Usually middleware is set up so that they can easily strip out any platform-specific code for licensees that aren't registered for a particular platform.

But platform-specific code is useless if you don't have access to the console SDK in the first place. This looks like purely a licensing restriction to prevent anyone from sideloading custom code this time around - raising multiple barriers and placing "Confidential" stickers on every bit of documentation to prevent any non-licensee from getting even a glimpse of useful information.
 
Anyone have an idea how Nvidia have increased their DX11 performance so much?

http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Talks-DX12-DX11-Efficiency-Improvements

 
But platform-specific code is useless if you don't have access to the console SDK in the first place. This looks like purely a licensing restriction to prevent anyone from sideloading custom code this time around - raising multiple barriers and placing "Confidential" stickers on every bit of documentation to prevent any non-licensee from getting even a glimpse of useful information.

Of course, but Sony/MS also forbid access to any code using their SDK's if you're not a licensed developer. They probably do this to prevent non-NDA'd people from gleaning technical details from the code or the API's that they use.
 
I just took a deeper look into the Microsoft DirectX 12 blog post http://blogs.msdn.com/b/directx/archive/2014/03/20/directx-12.aspx and the first diagram shows how Direct3D 12 is multi-threaded not only at the runtime level, but also in the user-mode display driver!
More than that, kernel-mode driver seems to be completely absent in the Direct3D 12 workload!
What does it mean?

[Diagram from the DirectX 12 blog post: CPU workload comparison between Direct3D 11 and Direct3D 12]

I'd say such fundamental changes are incompatible with WDDM 1.x and require an entirely new driver model.
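The multithreading claim from the diagram can be mimicked with a toy model (hypothetical types, not the real D3D12 API): several app threads each record their own command list in parallel, and a single queue submission replaces the old serialized immediate context.

```cpp
#include <cassert>
#include <cstddef>
#include <string>
#include <thread>
#include <vector>

// Toy stand-in for a D3D12-style command list: recording is per-thread,
// so no lock is needed while threads build their lists in parallel.
struct CommandList {
    std::vector<std::string> cmds;
    void draw(int id) { cmds.push_back("draw:" + std::to_string(id)); }
};

// Toy command queue: submission is the only serialized step.
struct CommandQueue {
    std::vector<std::string> executed;
    void execute(const std::vector<CommandList>& lists) {
        for (const auto& cl : lists)
            executed.insert(executed.end(), cl.cmds.begin(), cl.cmds.end());
    }
};

// Record 'perThread' draws on each of 'n' threads, then submit once.
std::size_t recordAndSubmit(int n, int perThread) {
    std::vector<CommandList> lists(n);  // one list per thread, no sharing
    std::vector<std::thread> workers;
    for (int t = 0; t < n; ++t)
        workers.emplace_back([&lists, t, perThread] {
            for (int i = 0; i < perThread; ++i)
                lists[t].draw(t * perThread + i);
        });
    for (auto& w : workers) w.join();
    CommandQueue queue;
    queue.execute(lists);
    return queue.executed.size();
}
```

In the D3D11 model the equivalent of `recordAndSubmit` is effectively single-threaded in the driver; here all the expensive recording work scales with core count and only the final submit is serial.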

The AMD part of the GDC session included [post=1835443]a reference to the driver model[/post]; unfortunately no details were provided.
Also the Nvidia blog post http://blogs.nvidia.com/blog/2014/03/20/directx-12/ mentions a "new driver/application model" where the driver does not automatically manage any resources.

Sony/MS also forbid access to any code using their SDK's if you're not a licensed developer... to prevent non-NDA'd people from gleaning technical details from the code
That's what I'm basically trying to say.

Anyone have an idea how Nvidia have increased their DX11 performance so much?
I thought they explained it in the article cited?
 
I'd say such fundamental changes are incompatible with WDDM 1.x and require an entirely new driver model.
Indeed, if you check any of the Microsoft D3D talks from GDC there's a lot of content about WDDM 2, which will enable a lot of this stuff. Not sure if they are posted yet but I imagine they will be soon if not.
 
if you check any of the Microsoft D3D talks from GDC there's a lot of content about WDDM 2, which will enable a lot of this stuff. Not sure if they are posted yet but I imagine they will be soon if not.
No, the videos are not available yet, only [post=1835443]a live blog from one of the sessions[/post], which lacks any intricate details.


I'm only beginning to grasp the changes required. The current WDDM/DXGI model is built on two cornerstones. First, the kernel-mode driver (aka "display miniport driver") implements architecture-specific resource management and enumeration functions provided by DXGI. If you check the DDK documentation (with a quite bizarre flow chart diagram, or, alternatively, a much simpler flow chart from WinHEC 2005 sessions), this is the lowest level of the Windows graphics stack which serves as the foundation for the Desktop Window Manager (DWM), Direct3D and GDI (cdd.dll), and it mostly does low-level resource management tasks like memory copies, surface creation, swap chains and presentation, BitBlt for GDI acceleration, etc.

The user-mode display driver is where the architecture-specific 3D rendering work is done. Processing pre-compiled shaders, vertex buffers, managing the workload on processor cores, etc.


Now if you remove the kernel-mode miniport driver, it's much like a standard USB, SATA or Universal Audio Architecture driver stack in Windows: you have a Microsoft-supplied port-class driver or a kernel (win32k.sys) driver which handles all the low-level interactions and device discovery/enumeration, and any architecture-specific details are handled by registry tweaks in an .INF file.

And since you do not perform any automatic memory management or resource sharing anymore, the user-mode driver is only doing preparation work to batch the workload in an architecture-specific format, and the kernel driver simply sends the workload down to the actual GPU.
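That split can be sketched with a toy model (the byte-level "packet" format here is invented, purely illustrative): the user-mode part translates API-level commands into a hardware-style buffer, and the "kernel" part does nothing but hand the finished buffer to the GPU ring.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Invented "hardware" packet opcodes -- purely illustrative.
enum : std::uint8_t { PKT_BIND = 0x01, PKT_DRAW = 0x02 };

// The user-mode driver's job in the new model: translate API calls into
// the architecture-specific command format; no resource management.
std::vector<std::uint8_t> buildPacketBuffer(int pipelineId, int drawCount) {
    std::vector<std::uint8_t> buf;
    buf.push_back(PKT_BIND);
    buf.push_back(static_cast<std::uint8_t>(pipelineId));
    for (int i = 0; i < drawCount; ++i) buf.push_back(PKT_DRAW);
    return buf;
}

// The kernel side reduced to a dumb submit: no patching, no validation,
// no memory management -- it just forwards the bytes to the GPU ring.
std::size_t submitToGpu(const std::vector<std::uint8_t>& buf,
                        std::vector<std::uint8_t>& gpuRing) {
    gpuRing.insert(gpuRing.end(), buf.begin(), buf.end());
    return buf.size();
}
```

All the intelligence lives in `buildPacketBuffer` (user mode, per-thread, no global locks); `submitToGpu` is the thin, fixed-function piece that a generic Microsoft-supplied kernel component could plausibly provide.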


I can't imagine how these two different modes of operation can be implemented in the same display driver without turning it into an incoherent mess. It's either DXGI/WDDM 1.x, where all GPU resources are explicitly managed by the driver, or it's a completely new WDDM 2.0, where DXGI does not even exist.

And that would require a complete overhaul of the Windows graphics stack, which brings up compatibility issues. Will Microsoft convert Direct3D 10-11/DXGI 1.x and Direct3D 8/9/Ex to run as a compatibility layer on top of the Direct3D 12 runtime? Or rewrite the Direct3D 9 and 11 runtimes and DXGI 1.3 to run on top of WDDM 2.0? What about DWM, GDI and Direct2D/DirectWrite?

Doh.

//Build 2014, April 2-4.
 
Huh? AMD has no technical way to make Mantle available to console developers if Microsoft do not embrace it.

AMD has no control over either console beyond licensing the silicon and providing some engineering and software support. Microsoft control the firmware and the development environment, and they will implement only the APIs they see fit.

AMD cannot sidestep Microsoft and implement Mantle on their own by making it a display driver extension like they did in Windows. There are no "drivers" on consoles, i.e. DLLs which export some OS-defined device driver API. There is statically linked code which contains the current version of the OS loader and a console version of the Direct3D 11 runtime, which is statically linked to low-level code that talks to the actual GCN hardware.



Had you paid more attention to the post you quoted, you'd know that I never suggested that AMD would try to make Mantle available to the new-gen consoles by force.

What I did suggest is that 3rd party developers may put pressure on Microsoft and Sony to enable Mantle as a single API for both consoles in order to ease the transition between multiplatform ports, and that it could be an interesting option for both vendors.
 
Had you paid more attention to the post you quoted, you'd know that I never suggested that AMD would try to make Mantle available to the new-gen consoles by force.

What I did suggest is that 3rd party developers may put pressure on Microsoft and Sony to enable Mantle as a single API for both consoles in order to ease the transition between multiplatform ports, and that it could be an interesting option for both vendors.

Microsoft uses DirectX as a tool to boost OS sales. I don't see what pressure the 3rd parties could apply to force MS to adopt Mantle; they wouldn't threaten to stop making games for the Xbox One. MS won't budge and allow this to happen. They probably got wind of Mantle and its low overhead a year or two ago, and that motivated them to reduce their own API's overhead.
 
Not sure about your timing since most accounts have DX12 development work starting more than 4 years ago.

http://www.guru3d.com/news_story/nvidia_directx_12_is_huge_improvement_for_gaming.html

NVIDIA said at GDC, at some other point, that DX12 has been in development for about a year, not four. The "4 years ago" was probably nothing more than general talk at the coffee table, and obviously any company involved would say they suggested this low-overhead stuff already back then, since there's no one to question such claims and it's the "hot stuff" in the talks right now.
 
Nobody wants to acknowledge they do things in response to competition. Can you find any examples of tech companies admitting they are playing catch-up with a competitor? No company wants to acknowledge they are lagging behind in innovation because of a lack of competition. Intel isn't going to state they are keeping less competitive prices and taking their foot off the throttle on CPU improvements and release dates because there's no competition.
 
Whatever the situation of DX12 was or is, one thing is clear: looking at the conferences and talks at GDC, the big theme this year was API improvements along the lines AMD started with Mantle...
The OpenGL, DirectX, Nvidia, Intel, MS and AMD talks - everything was about new, improved API models with lower-level hardware access, lower overhead, increased draw calls and efficiency, etc.

I liked the way AMD did not try to bring Mantle into the sessions they did for OpenGL and DirectX 12. (Well, Nvidia spoke about Mantle enough, lol.)
 