Huddy: "Make the API go away" [Edit: He wants a lower level API available too.]

Forget about removing the API, getting developers simply to use the best iteration of it and not the one that's now about 8 years old and suffers from horrible performance pitfalls would be a start.
 
It certainly helps that the new baseline is no longer Intel's unspectacular GMA 4500HD but rather the HD Graphics 2000 and 3000 series.

DX9 going away? I think first the new consoles need to be released for that to happen. As it is it looks like mobile phones are about to further increase the popularity of this API.
 
Hey when is Tim Sweeney going to chime in and recommend we just do GPGPU-driven software rendering? ;)
 
http://www.extremetech.com/article2/0,2845,2277870,00.asp

But there's a lot of things with gaming and Vista that you just want to slap Microsoft and go, "What the hell were you thinking?"

When we created DirectX, there's a reason it's called DirectX. It was direct access to hardware acceleration for the developer, with very little abstraction and operating system nonsense in the way. So DirectX was meant to be very fast and low-level, and push all the OS bloat out of the way. Over the years that's been forgotten, so each subsequent generation of DirectX has had more value added from Microsoft, which makes the API more complex, more bloated, harder to understand, and so forth.
 
I disagree with that statement. There's little "bloat" in DirectX. It's all a matter of API design. DX10 has significantly lower overhead than DX9. DX11 provides the opportunity to thread the command buffer generation, which doesn't really reduce overhead but can improve throughput. For DX12 I hope the overhead can be further reduced, for instance by giving applications more direct control over some hardware resources, as in Nvidia's bindless graphics extensions to OpenGL.
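
For anyone who hasn't used it, the threaded command buffer generation mentioned above is exposed in D3D11 through deferred contexts. A minimal sketch of the pattern, assuming a device and immediate context already exist (resource setup and error handling omitted):

#include <d3d11.h>

void RecordAndSubmit(ID3D11Device* device, ID3D11DeviceContext* immediateContext)
{
    // One deferred context per worker thread (only one shown here).
    ID3D11DeviceContext* deferredContext = nullptr;
    device->CreateDeferredContext(0, &deferredContext);

    // ... record state changes and draw calls on the worker thread ...
    // deferredContext->Draw(vertexCount, 0);

    // Bake the recorded commands into a command list.
    ID3D11CommandList* commandList = nullptr;
    deferredContext->FinishCommandList(FALSE, &commandList);

    // Played back later on the immediate context (main thread).
    immediateContext->ExecuteCommandList(commandList, FALSE);

    commandList->Release();
    deferredContext->Release();
}

The recording itself scales across threads, but the playback is still serialized on the immediate context, which is why it helps throughput rather than per-call overhead.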
 
Why did he say something like that when it's not true?


I think AMD are finally becoming aware of, and upset by, the fact that the endless stream of console ports isn't helping them sell newer cards.
 
Why did he say something like that when it's not true?

You might know that he was responsible for DirectX some time ago. I'm not sure why he quit, but ever since he left he has kept claiming that Microsoft is doing it wrong. Maybe he really thinks so, or maybe he's just angry for a reason we don't know.
 
When is that interview from? The site seems a bit screwed up so I can't be sure... it doesn't sound recent. "Page 1" seems to say 2008, which makes it totally irrelevant in this space.

As Humus mentions, it's also so factually untrue that it's staggering if it's truly coming from an expert. DirectX 10/11 isn't exactly an "easy to use rendering API" with lots of helpful features these days. It's pretty much as close to the lowest-level abstraction over current hardware as possible... in fact it sacrifices several favorable abstractions to be "more low level", which restricts how future hardware can be built.
 
I remembered this thread when I was listening to the keynote speech by John Carmack at QuakeCon 2011.

This transcript of an interview also hits on some relevant points.
http://mudojuegos.com/?p=5421

I searched for 'pointer' on this page for the discussion about low-level access v. API and what AMD and NVIDIA were doing.

(hello everyone)
 
John Carmack's comments on Rage's development challenges and on fighting PC APIs and drivers are along the same lines.
 
asmara, sorry I somehow missed your post when I was posting from my phone. I didn't mean to bypass you.
 
Sorry for digging up an old thread; any news on the discussion?

There was a rather interesting panel discussion at Siggraph where this kind of topic was heavily discussed. The general consensus appears to be that making the API more low-level is the right way to go, with the main issue being how far you can go without constraining future hardware, breaking compatibility, or losing vendor neutrality. I think there are still things that can be made lower level than they are today without losing too much abstraction. For instance more freedom over memory use, with a more malloc-like approach rather than CreateXXX(). Being able to control whether textures are linear or swizzled. Command-buffer patching, etc.
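
To make the malloc-like idea concrete, here is a toy sketch in plain C++. None of these types or functions exist in any real graphics API; GpuHeap, gpuMalloc and PlaceTexture2D are made up purely to contrast "CreateXXX() and the driver decides everything" with "allocate raw memory and place resources in it yourself, including the layout":

#include <cstddef>
#include <vector>

enum class Layout { Linear, Swizzled };

// Style 1: CreateXXX() -- placement and layout are hidden inside the driver.
struct Texture { int width, height; };
Texture* CreateTexture2D(int w, int h)
{
    return new Texture{w, h};   // the app never sees where or how it lives in memory
}

// Style 2: malloc-like -- the app owns a raw block of "GPU" memory and
// explicitly places a texture at an offset, choosing the layout itself.
struct GpuHeap { std::vector<unsigned char> storage; };
struct PlacedTexture { GpuHeap* heap; std::size_t offset; int width, height; Layout layout; };

GpuHeap* gpuMalloc(std::size_t bytes)
{
    return new GpuHeap{std::vector<unsigned char>(bytes)};
}

PlacedTexture PlaceTexture2D(GpuHeap* heap, std::size_t offset, int w, int h, Layout layout)
{
    return PlacedTexture{heap, offset, w, h, layout};
}

int main()
{
    Texture* implicitTex = CreateTexture2D(256, 256);              // style 1

    GpuHeap* heap = gpuMalloc(4 * 1024 * 1024);                    // style 2
    PlacedTexture explicitTex = PlaceTexture2D(heap, 0, 256, 256, Layout::Swizzled);
    (void)explicitTex;

    delete implicitTex;
    delete heap;
    return 0;
}

The interesting part is obviously not the C++ mechanics but who gets to decide offset and layout; in the second style those become application decisions rather than driver decisions.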
 