Huddy: "Make the API go away" [Edit: He wants a lower-level API available too.]

It seems pretty amazing, then, that while PC games often look better than their console equivalents, they still don't beat console graphics into the ground. A part of this is undoubtedly down to the fact that many games are primarily developed for consoles and then ported over to the PC. However, according to AMD, this could potentially change if PC games developers were able to program PC hardware directly at a low-level, rather than having to go through an API, such as DirectX.

The whole premise is wrong. PC titles don't look similar to console ones because of DirectX, but because they're console ports. Unless I'm missing something, the lack of an API won't change that at all.

Surprised Huddy would say something so obviously off track.
 
The whole premise is wrong. PC titles don't look similar to console ones because of DirectX, but because they're console ports. Unless I'm missing something, the lack of an API won't change that at all.

Surprised Huddy would say something so obviously off track.

Well, I'm guessing his answers were tailored to fit the potential target audience, which may not have been all that interested in a deeper discussion of what devs are apparently asking for, or what a thinning of DX would involve. Luckily for everyone, I think I know a site where the target audience would be interested in something like that, hmm...
 
I just can't see us going back to the days when programmers had to hit the hardware for every possible piece of gear in someone's PC. Doesn't anyone remember Glide, when the reality was that whoever paid the bucks got decent support for their hardware, and everyone else got the minimum?

Today, with a couple of big players and marketing money involved, we'd very quickly get to the nightmare scenario for us customers where one company's hardware gets optimisations and extra fries with everything, and the other company gets an inferior product, sometimes even deliberately crippled.

Would game devs even want to have to hit the hardware nowadays, given the huge amount of gear out there, in weird configurations or frankensteined drivers? Would Nvidia or AMD give out the low-level information required, or just provide some kind of low level API to get closer to, but not directly hitting on the card? And how would that help if you can't do anything about the OS layers that sit around a graphics card driver, that were put there for stability in the first place?

It all seems really pie-in-the-sky stuff that sounds great in principle, but is full of the same real-world problems that caused APIs like DirectX to be needed (and to be very successful) in the first place. The reasons for the rise, use, and need of DirectX haven't gone away.
 
Would the lack of an API slow the release cycle for new hardware?

It would take a lot longer to ship a new GPU. One thing most people don't consider about what goes on inside a GPU driver: workarounds. Just about every GPU that has ever shipped has multiple hardware bugs when it goes to market. But, with a driver between the application and hardware, you get a second chance to make things right without having to spin the chip, forcing a delay of many months.

Without a driver in the way, you end up developing GPUs like CPUs: perfection, or very, very close to it. This comes at a huge cost in both QA time and additional silicon revisions. You should expect a GPU without a driver to take significantly longer to bring to market than one with a driver (easily a year or more).
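To make the "second chance" concrete, here's a minimal sketch of the kind of workaround a driver can apply behind the application's back. The GPU revision, the bug, and the 64 KiB limit are all invented for illustration; real driver workarounds are of course far messier.

```python
# Hypothetical sketch: a driver papering over a hardware bug.
# Assume revision "A0" of an imaginary GPU hangs on DMA copies
# larger than 64 KiB; the driver silently splits them up, so the
# application never sees the bug and the chip doesn't need a respin.

BUGGY_REVISIONS = {"A0"}
MAX_SAFE_COPY = 64 * 1024  # bytes; invented limit for the example

def driver_copy(chip_rev, size):
    """Return the list of copy sizes actually issued to the hardware."""
    if chip_rev in BUGGY_REVISIONS and size > MAX_SAFE_COPY:
        # Workaround: split into chunks the broken DMA engine can handle.
        full, rem = divmod(size, MAX_SAFE_COPY)
        return [MAX_SAFE_COPY] * full + ([rem] if rem else [])
    # Fixed silicon: issue the copy as-is.
    return [size]
```

Without that layer, the application would be issuing the big copy straight at the broken DMA engine, and the only fix would be new silicon.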

Console developers don't have this problem. Their GPUs have bugs too, but those bugs don't change every 6 months. They just learn "don't do that", and get on with their lives.
 
How is that any different from consoles?
(i.e. fixed hardware)

Well, you keep the openness of the personal computer and have the m&kb/freedom of choice of input devices. You could still upgrade, Amiga-style; a similar, updated modern version of that concept would be great if possible. You can write very optimized code for the base hardware, and when the user base of expansion cards/kits builds up, develop paths for them. In the meantime, those would run the existing code at higher res/PQ. This is just from a layman's perspective, of course.
 
Doesn't this concept also legitimise Nvidia efforts to provide proprietary access to their hardware's benefits outside of DirectX? Considering that Huddy now seems to be evangelising dropping cross-vendor compatibility, this could be seen as approval for the likes of CUDA, PhysX and OpenGL extensions.
 
Doesn't this concept also legitimise Nvidia efforts to provide proprietary access to their hardware's benefits outside of DirectX? Considering that Huddy now seems to be evangelising dropping cross-vendor compatibility, this could be seen as approval for the likes of CUDA, PhysX and OpenGL extensions.
That's sort of what I'm thinking too. :???:
 
DirectX has been the framework that has enabled cross-company agreement on what abilities should be in the next iteration of their hardware. Companies get together with MS, decide what should be included in DirectX, and then implement that feature set however they see best in their hardware. Without DX as a basic framework, we'd go back to the days when every company did different things, offered different functionality, and no game could rely on having a given feature be available in hardware.
 
This really does not give me any faith in AMD's decisions with regards to software of any kind. Nvidia may do shit that is at best anti-competitive, and at worst trying to engender a monopoly, but at least they aren't trying to destroy DirectX and thus plunge us back into the dark ages of 3D software development!

Why does AMD have such brilliant hardware engineers, and such idiotic people in charge of the software and ISV relations departments?

Edit: And, for that matter, why can't we get a third GPU maker that doesn't pull Nvidia's bullshit and AMD's idiocy?!
 
Let's be clear... I don't think anyone is saying we should get rid of DirectX. Rather, some are advocating providing APIs in addition to it that allow higher performance at a lower abstraction level. Presumably DirectX will continue as a "least common denominator" standard as it is now and games will continue to have a DirectX path. They just could also have faster/more optimized paths for specific architectures by coding more "to the metal".
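The "least common denominator plus faster paths" idea above can be sketched very simply. All names here are invented; real engines do this selection through capability queries and per-architecture shader/code variants rather than a dictionary lookup.

```python
# Hypothetical sketch of the "fallback plus fast path" idea: every
# engine ships a generic DirectX-style path, and may optionally add
# to-the-metal paths keyed on the detected GPU architecture.

def render_generic(scene):
    # The cross-vendor path every GPU can run.
    return f"generic path: {scene}"

def render_metal_gpu_x(scene):
    # A hand-optimised path for one specific (imaginary) architecture.
    return f"vendor-optimised path: {scene}"

# Registry of per-architecture fast paths; empty for most GPUs.
FAST_PATHS = {"GPU-X": render_metal_gpu_x}

def render(gpu, scene):
    # Use the optimised path if one exists for this GPU, otherwise
    # fall back to the least-common-denominator path.
    return FAST_PATHS.get(gpu, render_generic)(scene)
```

The point is that the generic path never goes away: unknown or future hardware still renders correctly, it just misses the hand-tuned speedup.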
 
Let's be clear... I don't think anyone is saying we should get rid of DirectX.

With a quote like "make the API go away", I think Huddy is :D

PS: does he give any idea of what speedup we could expect from going to the metal?

If it's more than 2x, that will be fun. Imagine the scenario: Crysis 3 comes out with a GF580 path and scores 60fps.
NV releases the GF580's successor (we never get a 2x improvement), which has to use the fallback DX path in Crysis 3 and scores lower.
 
By "Make the API go away" Richard is forcing a debate, stating an extreme view to grab the headline.

He well knows the pros and cons, and no one is seriously suggesting we drop a cross-platform path that is backward-, future- and sideways-proof, as the current OpenGL/DX is. The question is: *IF* there were a way, similar to (or lower than) CUDA for GPGPU, to drive the HW lower and more directly, would AAA PC devs use it and so make the PC platform shine, as we all know it's capable of?

As I mentioned in the earlier post, it's as much related to cost as to the API, but the debate is whether it's worth exploring. If IHVs made it so, and devs could get the funding, what could we do with a modern PC setup?
 
I somehow doubt that AMD wants to provide a second low level API that allows devs to hit the metal. AMD and Nvidia have enough trouble maintaining a robust DX API, and now they want a second one that will allow programmers to do all kinds of stuff that will make your PC unstable?

So what's Huddy really saying here? "Wouldn't it be great if we could get rid of API overhead?" Yeah it would, but then you wouldn't have a high level API. Huddy might as well say "wouldn't it be great if every programmer wrote in assembler and could hit the metal? Think of the extra speed you'd get without going through the API". Really who is going to do that? There are reasons programmers today use high level APIs instead of hand-tuned assembler to the metal, and those reasons haven't gone away.
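The trade-off in that last paragraph can be illustrated with a toy model: an API layer spends time validating arguments and checking state on every call before anything reaches the "hardware". Everything here is invented for illustration; real D3D runtime/driver validation is vastly more involved.

```python
# Toy model of API overhead: the raw path issues the call directly,
# while the API path pays for validation and state checks first.
# Skipping the layer skips the cost, but also skips the safety net.

def hw_draw(vertex_count, command_log):
    # "Hardware": blindly records whatever it is told to do.
    command_log.append(("draw", vertex_count))

def api_draw(vertex_count, state, command_log):
    # The API checks every call before forwarding it - this is the
    # per-call overhead that a to-the-metal path would eliminate.
    if vertex_count <= 0:
        raise ValueError("invalid vertex count")
    if not state.get("pipeline_bound"):
        raise RuntimeError("no pipeline bound")
    hw_draw(vertex_count, command_log)
```

A correct program gets identical results from either path; the difference is that the raw path happily forwards garbage to the hardware, which on a PC means hangs and bluescreens instead of a clean API error.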
 
Those systems that can outperform the consoles significantly may not be a big enough market to make it appealing...
 
Those systems that can outperform the consoles significantly may not be a big enough market to make it appealing...

The number of systems that could outperform them would increase tenfold though? ;) If you think about it, it would also broaden the range of PC games that can run console-level graphics or better.

How about if Crytek and Epic could code to the metal, or D.I.C.E., or id? People could make their own game engine APIs that are optimised for current GPUs. That could still work? How about people who would be able to write something that raytraces a lot, or is founded on smart tessellation, or voxels or whatever, not having DirectX get in the way of that performing half decently?

I can think of a lot of reasons why people would want it, without DirectX ever having to disappear completely. But instead of vendors making very complicated drivers with optimisation profiles for each individual game, created during or after the fact (= the release of an important game), we'd be back to developers finding their own way around the hardware.

And it's not like there are that many competitors on the market right now either.
 
This really does not give me any faith in AMD's decisions with regards to software of any kind. Nvidia may do shit that is at best anti-competitive, and at worst trying to engender a monopoly, but at least they aren't trying to destroy DirectX and thus plunge us back into the dark ages of 3D software development!
Or maybe AMD at the moment doesn't have a lot of faith in Microsoft's independence for the next DirectX?

Without a fair development model DirectX isn't all that useful.
 
And it's not like there are that many competitors on the market right now either.

And if everyone had to code to the metal, we would never have any more:
no one would code for a chip that had no market share, and a card would never gain market share if no one coded for it.
 