Brad & Oxide are also at the "glNext" keynote; expect them to be showing off Nitrous on that low-level API as well.
I wasn't sure if his game is Star Control or if another company is making Star Control; that part wasn't clear. I thought he did say Star Control was on the way, and if not, then what have I been excited for? Oh well, GDC is soon enough. They are supposed to have an X1 game, but no clue what it is.
I do wonder what this would open up for open-world games: better foliage or trees?
He does play dumb on twitter in regards to this so far. lol
Is anyone else a little annoyed at Brad Wardell's disingenuous intentions with his DX12 statements?
He's making it sound like DX12 will lead to huge performance increases for all scenes being rendered ONCE game engines fully support DX12 and once the second round of DX12 games arrives circa 2017-18.
Draw calls aren't necessarily a bottleneck for the Xbox One; it depends on the API used for a game.
They did the same with Titanfall, and sebbbi indicates this is the fastest method of issuing draw calls on GCN hardware.
You are missing the fact that there are actually 3 separate OS environments running simultaneously:

Unless I'm missing something, the XB1 still has Win 8, which in turn has WDDM 1.3, which still has some of the CPU bottlenecks that DX12 + WDDM 2.0 resolve.
Earlier, in another thread, I posted an article from AMD research: they were testing HBM modules and found that 32 CUs running a compute algorithm required 700GB/s of bandwidth for bandwidth not to be the bottleneck.
If I were to extrapolate (poorly) the required bandwidth for 12 CUs:
853/1000 = 0.853 (clock ratio)
700 * 0.853 = 597GB/s
12/32 = 0.375 (CU ratio)
597 * 0.375 = 223GB/s
ESRAM can only provide 192GB/s theoretical with a write bubble. It can pull an additional 40GB/s from DDR, for a combined max bandwidth of 232GB/s. It's just enough. But you'd be doing a graphics test LOL.
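Just to make the arithmetic above explicit (the 700GB/s figure and the linear scaling with both clock speed and CU count are the assumptions, and the 1GHz clock for the test GPU is inferred from the 853/1000 ratio), the extrapolation looks like this:

```python
# Crude extrapolation, assuming bandwidth demand scales linearly with
# clock speed and CU count. Figures from the AMD HBM compute test.
TEST_BW_GBPS = 700.0     # bandwidth needed to saturate 32 CUs in the test
TEST_CUS = 32
TEST_CLOCK_MHZ = 1000.0  # assumed clock of the test GPU

XB1_CUS = 12
XB1_CLOCK_MHZ = 853.0

clock_ratio = XB1_CLOCK_MHZ / TEST_CLOCK_MHZ  # 0.853
cu_ratio = XB1_CUS / TEST_CUS                 # 0.375
xb1_needed = TEST_BW_GBPS * clock_ratio * cu_ratio
print(f"needed: {xb1_needed:.0f}GB/s")        # ~224GB/s

# Available on XB1: 192GB/s ESRAM (theoretical, with write bubble)
# plus 40GB/s pulled from DDR = 232GB/s combined.
available = 192 + 40
print(available >= xb1_needed)                # just enough
```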
You might have missed the text I'd quoted from the Metro devs earlier. They suggested that two APIs are available for the Xbox One; one being the fairly standard DX11 and the other being the "GNM style do-it-yourself" API.
Edit: I'll quote the text again in full:
"Oles Shishkovstov: Let's put it that way - we have seen scenarios where a single CPU core was fully loaded just by issuing draw-calls on Xbox One (and that's surely on the 'mono' driver with several fast-path calls utilised). Then, the same scenario on PS4, it was actually difficult to find those draw-calls in the profile graphs, because they are using almost no time and are barely visible as a result.
In general - I don't really get why they choose DX11 as a starting point for the console. It's a console! Why care about some legacy stuff at all? On PS4, most GPU commands are just a few DWORDs written into the command buffer, let's say just a few CPU clock cycles. On Xbox One it easily could be one million times slower because of all the bookkeeping the API does.
But Microsoft is not sleeping, really. Each XDK that has been released both before and after the Xbox One launch has brought faster and faster draw-calls to the table. They added tons of features just to work around limitations of the DX11 API model. They even made a DX12/GNM style do-it-yourself API available - although we didn't ship with it on Redux due to time constraints."
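To illustrate what "just a few DWORDs written into the command buffer" means, here's a toy sketch in Python (the opcode, packet layout, and function names are completely invented; real GNM/DX12 command encoding is hardware-specific). The point is that a draw call on a do-it-yourself API can be a handful of word writes, with no validation or kernel transition per call:

```python
import struct

# Made-up opcode for an indexed draw packet (purely illustrative).
DRAW_INDEXED = 0x2D

command_buffer = bytearray()

def draw_indexed(index_count: int, first_index: int) -> None:
    # Three little-endian DWORDs: opcode, index count, start index.
    command_buffer.extend(
        struct.pack("<III", DRAW_INDEXED, index_count, first_index)
    )

draw_indexed(3000, 0)
draw_indexed(1500, 3000)
# Each call cost only a pack-and-append; a DX11-style driver would
# instead validate state and do per-call bookkeeping at this point,
# which is where the CPU overhead the Metro dev describes comes from.
```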
I suspect none, actually, in the sense that if games were coded the same way today and just switched over to DX12. I think developers would have to purposefully exploit the draw call advantage: previously, a lot of small jobs (see Ubisoft's presentation, for example) would have been painful to run as compute shaders because of the overhead, so maybe they went through some other path on the GPU, or maybe it wasn't done at all.

I read that the Xbox One's bandwidth is actually proportionally greater than the optimal bandwidth needed to keep a 32CU GPU completely saturated. It's probably fairly unheard of for a GPU to have such proportionally high bandwidth.
What kind of an improvement would you expect that the Xbox One will get from the switch to DX12?
You are missing the fact that there are actually 3 separate OS environments running simultaneously:
1) Windows 8, which runs the UI and DVR apps in the background;
2) statically linked game code which loads when you start the game;
3) hypervisor kernel to manage virtual hardware access between the two.