DirectX 12: Its future in the console gaming space (specifically the XB1)

How can you say that?

Sure they were. Anyway, one of those was Keith Judge, a programmer on Unreal Engine 4 (which will support DirectX 12), and he was baffled by Wardell's statement.

Wardell has contradicted himself in recent days.
Responding to questions on Twitter, he is now suggesting that DirectX 12 won't close the gap between the PS4 and the Xbox One. That contradicts his statement that it would give the Xbox One 2x the performance for most games.

I think Wardell confused his specific case (thousands of individual entities on screen generating masses of draw calls) with the general case, where far fewer objects in a scene require draw calls. He's just unfortunate that his enthusiasm ran away with him, and now he is a trump card in fanboy wars even if he's been walking back his statements over the last few days.
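To make that distinction concrete, here's a minimal, hypothetical D3D11 sketch (function names and setup are my own, not from Star Swarm or any shipped engine): a scene with thousands of unique entities pays runtime/driver overhead on every per-entity draw, while a typical scene can batch identical meshes into a single instanced draw and sidestep most of that cost even under D3D11.
Code:
#include <d3d11.h>

// Worst case (Star Swarm-like): one draw call per entity, each with its own
// per-object state. Every iteration pays D3D11 runtime + driver overhead.
void DrawEntitiesNaive(ID3D11DeviceContext* ctx,
                       ID3D11Buffer* perObjectCB,
                       UINT entityCount, UINT indexCount)
{
    for (UINT i = 0; i < entityCount; ++i)
    {
        // Per-entity constant buffer update (Map/UpdateSubresource) omitted.
        ctx->VSSetConstantBuffers(0, 1, &perObjectCB);
        ctx->DrawIndexed(indexCount, 0, 0);
    }
}

// General case: identical meshes batched into one instanced call, so the CPU
// submits a handful of draws per frame instead of thousands.
void DrawEntitiesInstanced(ID3D11DeviceContext* ctx,
                           UINT entityCount, UINT indexCount)
{
    // Per-instance data lives in a second vertex buffer (binding omitted).
    ctx->DrawIndexedInstanced(indexCount, entityCount, 0, 0, 0);
}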
 
I think Wardell confused his specific case (thousands of individual entities on screen generating masses of draw calls) with the general case, where far fewer objects in a scene require draw calls. He's just unfortunate that his enthusiasm ran away with him, and now he is a trump card in fanboy wars even if he's been walking back his statements over the last few days.

That should have been exactly what any educated member of this forum was thinking from the get-go...
 
I think Wardell confused his specific case (thousands of individual entities on screen generating masses of draw calls) with the general case, where far fewer objects in a scene require draw calls.
Yes, but it's more complex than that. In the Star Swarm demo video they say they had to turn off motion blur to be able to achieve good framerates under DirectX 11 - so in the end Mantle can remove inefficiencies even for a GPU-limited motion blur effect (which is implemented in vertex/geometry shader or pixel shader, depending on the technique used).

http://www.youtube.com/watch?v=6PKxP30WxYM


And even when there are scenes with far fewer draw calls, you still typically get twice as much CPU time to spend in your game engine - things like better AI, physics simulation, network code, whatever.


PS. BTW, Brad Wardell is the CEO and founder of Stardock Corp., which was known for OS/2 shell utilities 20 years ago and currently for its Start8 and ModernMix utilities. I'd say he should know his PC programming stuff... probably more so than Tim Sweeney of Epic/Unreal Engine, who was predicting the end of the GPU and a return to software rendering.
 
And even when there are scenes with far fewer draw calls, you still typically get twice as much CPU time to spend in your game engine - things like better AI, physics simulation, network code, whatever.
Where does the "twice" come from? That would imply that the amount of CPU work freed up by switching APIs is going to be equal to the current non-graphics-API CPU allocation. Maybe it'll be like that in some particular cases, but I can't imagine why that would be a general expectation.
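As a back-of-the-envelope check of that point (purely illustrative numbers, not measurements from any game): the engine-side CPU budget only doubles if the API/driver share of the frame was already around half, and the gain shrinks quickly for smaller shares.
Code:
#include <cstdio>

int main()
{
    const double frameMs    = 16.7;               // illustrative 60 fps CPU budget
    const double apiShare[] = { 0.2, 0.4, 0.5 };  // fraction spent in runtime + driver

    for (double g : apiShare)
    {
        double engineBefore = frameMs * (1.0 - g); // engine time with the old API
        double engineAfter  = frameMs;             // best case: API cost goes to ~0
        std::printf("API share %.0f%%: engine budget %.1f ms -> %.1f ms (%.2fx)\n",
                    g * 100.0, engineBefore, engineAfter, engineAfter / engineBefore);
    }
    return 0;
}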
 
That would imply that the amount of CPU work freed up by switching APIs is going to be equal to the current non-graphics-API CPU allocation.
No, this is not correct.

The DirectX runtime and driver take a significant amount of CPU time in current games, something closer to 50%. For the 3DMark 11 scene 3 demo shown at the Direct3D 12 sessions at GDC and Build, graphics takes 40% of CPU time in D3D11 and 30% in D3D12.

D3D12 also uses about half the CPU time in the graphics stack (roughly a quarter of the time on the main rendering thread, but about twice as much on the other threads). Overall CPU time is about 1.5 times lower in this particular demo.

The demo was running on a Windows PC with Iris Pro Graphics 5200 and preliminary WDDM 2.0 driver.
Code:
Times, ms        Total              GFX-only
              D3D11   D3D12       D3D11   D3D12
Thread 0      7.88    3.80        5.73    1.17
Thread 1      3.08    2.50        0.35    0.81
Thread 2      2.84    2.46        0.34    0.69
Thread 3      2.63    2.45        0.23    0.65
Total        16.42   11.21        6.65    3.32


http://www.pcper.com/news/Graphics-Cards/Microsoft-DirectX-12-Live-Blog-Recap (scroll to 10:13)
https://channel9.msdn.com/Events/Build/2014/3-564 (rewind to 54:00)
http://blogs.msdn.com/b/directx/archive/2014/03/20/directx-12.aspx
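For illustration, here's a minimal sketch of the pattern that produces that thread distribution (not the demo's actual code; device, queue, root signature and resource setup are all assumed to exist elsewhere): each worker thread records its own command list in parallel, and the main thread only submits them, so the expensive part of the graphics work moves off thread 0.
Code:
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

void RecordWorkerCommands(ID3D12GraphicsCommandList* cl)
{
    // Record this worker's share of the scene here, e.g.
    // cl->SetGraphicsRootSignature(...); cl->DrawIndexedInstanced(...);
    cl->Close();  // the list is now ready for submission
}

void RenderFrameInParallel(ID3D12Device* device, ID3D12CommandQueue* queue,
                           unsigned workerCount)
{
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(workerCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workerCount);
    std::vector<std::thread>                       workers;

    for (unsigned i = 0; i < workerCount; ++i)
    {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        workers.emplace_back(RecordWorkerCommands, lists[i].Get());
    }
    for (auto& t : workers)
        t.join();

    // Submission is cheap; the heavy recording already happened on the workers.
    std::vector<ID3D12CommandList*> submit;
    for (auto& l : lists)
        submit.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(submit.size()), submit.data());
}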
 
How can you say that?

Sure they were. Anyway, one of those was Keith Judge, a programmer on Unreal Engine 4 (which will support DirectX 12), and he was baffled by Wardell's statement.

Wardell has contradicted himself in recent days.
Responding to questions on Twitter, he is now suggesting that DirectX 12 won't close the gap between the PS4 and the Xbox One. That contradicts his statement that it would give the Xbox One 2x the performance for most games.

It just depends on how bad or inefficient the current API and drivers are ;)
 
I remember some dev comment about DirectX 12 on Twitter; it went something like "current consoles have good GPUs and lots of weak CPU cores, so they'll naturally be optimizing for that case".
 
Yes, but it's more complex than that. In the Star Swarm demo video they say they had to turn off motion blur to be able to achieve good framerates under DirectX 11 - so in the end Mantle can remove inefficiencies even for a GPU-limited motion blur effect (which is implemented in vertex/geometry shader or pixel shader, depending on the technique used).

http://www.youtube.com/watch?v=6PKxP30WxYM


And even when there are scenes with far fewer draw calls, you still typically get twice as much CPU time to spend in your game engine - things like better AI, physics simulation, network code, whatever.


PS. BTW, Brad Wardell is the CEO and founder of Stardock Corp., which was known for OS/2 shell utilities 20 years ago and currently for its Start8 and ModernMix utilities. I'd say he should know his PC programming stuff... probably more so than Tim Sweeney of Epic/Unreal Engine, who was predicting the end of the GPU and a return to software rendering.
Pretty nice info, thanks for sharing.

Just wanted to add that, as the CEO of the company which created the wonderful Elemental: Fallen Enchantress (a turn-based strategy game with some similarities to Heroes of Might and Magic), he has my utmost respect and admiration.
 
The PS4 uses Shader Resource Tables, which are similar to Descriptor Heaps & Tables in DX12. There are some common features between GNM and DX12, and GNM is already out there.

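To put the DX12 half of that comparison in code, here's a hedged sketch (my own names and setup, not GNM and not from any shipped title) of a shader-visible descriptor heap plus a root-signature descriptor table pointing into it: binding becomes publishing descriptors into a table the GPU reads, rather than issuing per-draw Set* calls.
Code:
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// One shader-visible heap holding the frame's CBV/SRV/UAV descriptors.
ComPtr<ID3D12DescriptorHeap> CreateSrvHeap(ID3D12Device* device, UINT numDescriptors)
{
    D3D12_DESCRIPTOR_HEAP_DESC desc = {};
    desc.Type           = D3D12_DESCRIPTOR_HEAP_TYPE_CBV_SRV_UAV;
    desc.NumDescriptors = numDescriptors;
    desc.Flags          = D3D12_DESCRIPTOR_HEAP_FLAG_SHADER_VISIBLE;

    ComPtr<ID3D12DescriptorHeap> heap;
    device->CreateDescriptorHeap(&desc, IID_PPV_ARGS(&heap));
    return heap;
}

// The root signature's table parameter is just a GPU address into the heap,
// conceptually close to a pointer-sized "shader resource table" entry.
// (Root signature creation and the SRVs themselves are assumed elsewhere.)
void BindTable(ID3D12GraphicsCommandList* cl, ID3D12DescriptorHeap* heap)
{
    ID3D12DescriptorHeap* heaps[] = { heap };
    cl->SetDescriptorHeaps(1, heaps);
    cl->SetGraphicsRootDescriptorTable(0, heap->GetGPUDescriptorHandleForHeapStart());
}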
 
Yes, but it's more complex than that. In the Star Swarm demo video they say they had to turn off motion blur to be able to achieve good framerates under DirectX 11 - so in the end Mantle can remove inefficiencies even for a GPU-limited motion blur effect (which is implemented in vertex/geometry shader or pixel shader, depending on the technique used).

http://www.youtube.com/watch?v=6PKxP30WxYM


And even when there are scenes with far fewer draw calls, you still typically get twice as much CPU time to spend in your game engine - things like better AI, physics simulation, network code, whatever.


PS. BTW, Brad Wardell is the CEO and founder of Stardock Corp., which was known for OS/2 shell utilities 20 years ago and currently for its Start8 and ModernMix utilities. I'd say he should know his PC programming stuff... probably more so than Tim Sweeney of Epic/Unreal Engine, who was predicting the end of the GPU and a return to software rendering.
Actually... once again, this is an example of a draw-call-limited effect on a PC with DX11. I just did a little searching, and per-object motion blur can really ramp up draw calls.

http://www.lynda.com/3D-Animation-Games-tutorials/Applying-motion-blur/150613/165727-4.html
"This is about as far as I can look in almost 2000 draw calls. So keep in mind that these kind of image effects do drastically jump up the number of draw calls your graphics card is making. When I turn on the terrain in the grass we're going to see an even more drastic jump. I'll turn on those layers and see how all the image effects start to hold together. I'll turn on my plane for my water, and then finally the terrain. Now I"ll play again, and we can really watch those draw calls kick up. At the most, with everything blurred out, I'm at 2500. And this may be a little much for some mobile applications, so we really need to watch, not only where we are publishing to, but what image effects are we using, and how much blur are we putting in."

Here's a motion blur tool for mobile game developers, designed to create a simplistic, low-draw-call form of motion blur:
http://www.bento-studio.com/index.php/all-the-tools/10-mobilemotionblur
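For what it's worth, here's one hedged sketch of why a per-object blur can roughly double the draws the CPU has to submit (this is a generic velocity-buffer approach with made-up types, not the technique from the lynda.com clip or the Bento tool): every moving object gets drawn again into a velocity target before the full-screen blur pass.
Code:
#include <d3d11.h>

struct Mesh { UINT indexCount; };  // illustrative only

void RenderWithPerObjectBlur(ID3D11DeviceContext* ctx,
                             const Mesh* meshes, UINT meshCount)
{
    // Pass 1: normal colour pass - one draw per object.
    for (UINT i = 0; i < meshCount; ++i)
        ctx->DrawIndexed(meshes[i].indexCount, 0, 0);

    // Pass 2: velocity pass feeding the blur - another draw per moving object.
    // (Velocity render target, shaders and per-object matrices assumed bound.)
    for (UINT i = 0; i < meshCount; ++i)
        ctx->DrawIndexed(meshes[i].indexCount, 0, 0);

    // Pass 3: a full-screen triangle to apply the blur itself.
    ctx->Draw(3, 0);
}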
 
Actually... once again, this is an example of a draw-call-limited effect on a PC with DX11.
OK. So it seems like multiple draw calls are the easiest way to get it with correct alpha and without too much programming hassle, but when you need per-object motion blur in a third-person-view space game that has 4000 small objects, you have to multiply this by 10...
 
It should also be mentioned that Stardock owns a significant interest (40%) in neowin.net, which creates a conflict of interest, especially under these circumstances.

How does one know whether Wardell is simply stating what he thinks is true, or trying to drive hits to a website in which he has a financial interest with a pretty sensational article?

You can't, so you have to take whatever he said with a pinch of salt.
 
It should also be mentioned that Stardock owns a significant interest (40%) in neowin.net, which creates a conflict of interest, especially under these circumstances.

How does one know whether Wardell is simply stating what he thinks is true, or trying to drive hits to a website in which he has a financial interest with a pretty sensational article?

You can't, so you have to take whatever he said with a pinch of salt.

Or you could just evaluate it logically and realize that he's being ridiculous regardless of his motives.
 
The Forza demo was running on that machine. Those numbers are not from that demo (they are from the modified D3D12 3DMark, I believe).
Yep, this was a real-time (??!) demo of the new 3DMark, and the Microsoft DirectX blog says "Tested on GIGABYTE BRIX Pro (Intel Core i7-4770R + Iris Pro Graphics 5200)" in the caption.
At the same time, everyone is under the impression that only Nvidia has working WDDM 2.0 drivers right now; this even started [post=1836407]a little AMD flame war in the Mantle thread[/post]...
 
Ah sweet that is public knowledge
OK, I edited [post=1841814]my post above[/post] to reflect that the 3DMark11 D3D12 demo was running on Iris Pro Graphics 5200.

BTW, I haven't had much luck finding [post=1836292]anything about the WDDM 2.0 roadmap and/or implementation details[/post]. The Build session had two separate machines running Windows developer build 9697 dated Feb 27: one configured for D3D11 with a release WDDM 1.x driver, and another for D3D12 with a WDDM 2.0 driver. Does this mean anything in terms of backward compatibility - i.e. would DDI12 co-exist with DDI9 and DDI10/11 in the same driver, as was the case in WDDM/DXGI 1.x?
 
I'm a bit excited to see what the latest DirectX has to offer for next-gen games - I mean, the "game changer" thing that could bring some new stuff to the gaming industry.
 