Is DirectX throttling Xbox 360 performance?

At the bottom it says 50K batches/sec using D3D vs. 1M batches/sec using his (closer-to-the-metal) lib.

Ah, thanks. I just saw the Russian and skipped it. :)

Anyway, 20us for just a draw call is mighty expensive. I have little experience on the 360, but I did profile a renderer on it once or twice and I have not seen anything close to 20us average draw call time.

I remember 1.6us average draw call time for the PS3 at some point and the 360 was not that much slower. Maybe 2us.

Could be that he is testing really expensive draw calls with lots of state changes, or something like that.
 

Google Translate link

If this doesn't show up for some reason: he says the overhead is due to load-hit-stores and lots of constant generation in the command buffer. I don't know what he is testing, though.
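In case the term is unfamiliar, here is a quick illustration of what a load-hit-store looks like on the 360's in-order PPC cores. This is my own sketch of the general pattern, not code from the article; both function names are made up:

```cpp
// Load-hit-store (LHS): a load from an address whose recent store hasn't
// drained yet stalls the in-order Xenon pipeline for roughly 40+ cycles.

// Classic case: float-to-int conversion. On PowerPC the float result is
// stored to the stack and immediately reloaded into an integer register.
int FloatToIntViaMemory(float f)
{
    return static_cast<int>(f);  // fctiwz -> store -> dependent integer reload
}

// The same hazard appears when patching constants into a command buffer and
// touching them again right away.
void PatchConstant(float* commandBuffer, int offset, float value)
{
    commandBuffer[offset] = value;           // store
    float readBack = commandBuffer[offset];  // load hits the pending store: stall
    (void)readBack;
}
```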
 
Mm... the Gamefest 08 presentation on the top 15 CPU boo-boos might be more relevant in that case. Or even the Multi-threaded Rendering presentation. (Just throwing out some ideas.)
 

Ok, ok, I did read it now. :)

Basically their engine uses the DX Effects framework for its render-state handling, which is where the costs come from. This should not be news to anyone who has used it on the PC, I guess, where it basically does a lot of GetRenderState/SetRenderState calls, and it probably does something comparable on the 'box...
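For readers who haven't touched it, a typical D3DX Effects draw loop on desktop D3D9 looks roughly like the sketch below; the parameter name, technique name, and draw counts are placeholders. The point is that every BeginPass pushes the pass's full state block through per-draw state calls:

```cpp
#include <d3dx9.h>

// Sketch of the usual ID3DXEffect render loop on desktop D3D9. Each BeginPass
// (re)applies the pass's render/sampler/shader state, once per draw.
void DrawWithEffect(IDirect3DDevice9* device, ID3DXEffect* effect,
                    const D3DXMATRIX& wvp, UINT numVerts, UINT numTris)
{
    effect->SetMatrix("WorldViewProj", &wvp);  // hypothetical parameter name
    effect->SetTechnique("Default");           // hypothetical technique name

    UINT passes = 0;
    effect->Begin(&passes, 0);
    for (UINT p = 0; p < passes; ++p)
    {
        effect->BeginPass(p);                  // applies the pass's state block
        device->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0,
                                     numVerts, 0, numTris);
        effect->EndPass();
    }
    effect->End();
}
```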

What he does now is pre-bake command-buffers with all the state in there and call them instead.
I guess he will pay for it by not having a minimal-size CB, as he cannot cache states. Not sure how Xenos reacts to redundant state changes; on RSX it's not a terribly good idea, in my experience.
Furthermore he seems to have moved most of his vertex shader constants into a separate vertex stream that he does a dependent fetch from in the VS. That certainly takes the constants out of the CB, which will help in his scheme, as it saves him from doing constant patching. I have no clue how expensive that read is for the shader, however.
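
In rough code terms, the scheme described in the last two paragraphs looks something like the sketch below. The record/replay entry points and the InstanceData layout are hypothetical stand-ins (the real 360 command-buffer API lives in the XDK), so treat this as the shape of the idea rather than working platform code:

```cpp
#include <cstring>

// Hypothetical stand-ins for the platform's command-buffer record/replay API.
struct Device;
struct CommandBuffer;
CommandBuffer* RecordBegin(Device* dev);
void           RecordEnd(Device* dev);
void           Replay(Device* dev, const CommandBuffer* cb);

// Illustrative per-object constants, moved out of the constant registers and
// into their own vertex stream; the vertex shader does a dependent fetch from
// this stream, so the baked buffer never needs constant patching.
struct InstanceData
{
    float worldMatrix[12];  // 3x4 world transform
    float tintColor[4];
};

// Bake once at load time. All state is written explicitly, redundant or not,
// which is the CB-size trade-off mentioned above.
CommandBuffer* BakeDraw(Device* dev /*, mesh, material, ... */)
{
    CommandBuffer* cb = RecordBegin(dev);
    // SetRenderState / SetTexture / SetStreamSource / DrawIndexedPrimitive ...
    RecordEnd(dev);
    return cb;
}

// Per frame: update the instance stream, then replay the baked buffer.
void DrawBaked(Device* dev, const CommandBuffer* cb,
               InstanceData* instanceStream, const InstanceData& object)
{
    std::memcpy(instanceStream, &object, sizeof(object));  // no CB patching
    Replay(dev, cb);  // one cheap call replaces many state/draw calls
}
```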

So all in all, D3D on the 360 is not that slow; the Effects system is. Not sure if many people are using it, though...
 
Sorry to disturb the nice Halo backwards compatibility thread :) Here's something related to the OP:

http://blog.gamedeff.com/?p=235

Brief translation: by dropping D3DX/FX and going to manually assembled precompiled command buffers, this guy achieves a significant increase in draw-call throughput, reaching an absolutely astronomical 1 mln draw calls/sec. Can't wait to hear if this passes certification...

Hmm, I've already done 1 mln draw calls/sec on a shipping 360 product through D3D. It's not as much as it seems, since at 30fps it's ~33k draw calls per frame. Most PIX/GPAD frame grabs I've seen have 10k to 30k draw calls anyway.
 
They recommend avoiding it too. :)

On multi-platform games there is usually an engine layer that checks for redundant state changes and tosses them away. So we don't call D3D directly to set states; we make an engine call, which in turn tells D3D to make the change only if it is truly needed. Sometimes the engine will go a step further and try to batch similar draw calls together to minimize state changes even further when possible.
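
As a rough sketch of what such an engine layer does (an assumed design, not any particular shipped engine's code):

```cpp
#include <d3d9.h>

// Minimal redundant-state filter: remember the last value pushed for each
// render state and only call into D3D when the value actually changes.
class StateCache
{
public:
    explicit StateCache(IDirect3DDevice9* dev) : m_dev(dev)
    {
        for (int i = 0; i < kMaxStates; ++i) m_known[i] = false;
    }

    void SetRenderState(D3DRENDERSTATETYPE state, DWORD value)
    {
        if (m_known[state] && m_cached[state] == value)
            return;                        // redundant: skip the API call entirely
        m_dev->SetRenderState(state, value);
        m_cached[state] = value;
        m_known[state]  = true;
    }

private:
    static const int kMaxStates = 256;     // D3DRENDERSTATETYPE values fit below this
    IDirect3DDevice9* m_dev;
    DWORD m_cached[kMaxStates];
    bool  m_known[kMaxStates];
};
```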
 
Actually, it was based on direct discussion with someone producing an Xbox 360 game in the here and now, and the follow-ups here are certainly interesting for how other developers perceive the API.

I realize this is ancient, but since this article just hit NeoGAF and people may wander here, I'll explain my main qualms with the blog post.

First, backwards compatibility is a red herring here. If Microsoft makes their next console backwards compatible, they won't do it at the API level. I think people might have this PC-like understanding of the Xbox 360, where there are vendor-supplied dynamic libraries and driver layers and everything is a magical happy land of forwards compatibility, but it's not like that. Look at original Xbox backwards compatibility... it also used Direct3D to some extent, but the emulation isn't perfect, so why would we assume it will be any better the next time around?

Second, the article's assumption seems to be that since some of the rendering code on the Xbox 360 goes through Direct3D, all of it must. It doesn't. Microsoft has allowed very close interaction with the hardware in a number of areas. Incidentally, this is another reason the "Direct3D is good for backwards compatibility" argument fails... you can avoid quite a bit of D3D's abstraction.

As a wider criticism, questions like "Is DirectX throttling Xbox 360 performance?" aren't really helpful. There's no single correct answer for every game, or even for every scene in a game. Like everything else, Microsoft's approach to GPU interaction has both advantages and disadvantages.
 
First, backwards compatibility is a red herring here. If Microsoft makes their next console backwards compatible, they won't do it at the API level. I think people might have this PC-like understanding of the Xbox 360, where there are vendor-supplied dynamic libraries and driver layers and everything is a magical happy land of forwards compatibility, but it's not like that. Look at original Xbox backwards compatibility... it also used Direct3D to some extent, but the emulation isn't perfect, so why would we assume it will be any better the next time around?

That would have been much less of an issue had they stuck with Nvidia, but they weren't happy with Nvidia's licensing model. Thus the switch to ATI, and no longer being able to fully support backwards compatibility without paying Nvidia tons of cash per title for BC emulation.

Going forward, if they are happy with ATI, it could simply be stated that the next GPU must be backwards compatible with Xenos to some extent, or that emulation must be fairly simple to provide. And unless for some reason they are unhappy with Xenos's performance, they are definitely much happier with the licensing, since they own it. :) Unless ATI performance suddenly falls off a cliff, I don't think it's very likely that MS will switch from them.

Then again, they may just scrap all that and say the heck with it, new stuff only.

Regards,
SB
 
That would have been much less of an issue had they stuck with Nvidia, but they weren't happy with Nvidia's licensing model. Thus the switch to ATI, and no longer being able to fully support backwards compatibility without paying Nvidia tons of cash per title for BC emulation.

That's the whole point! If every game (or even most games) on Xbox were written purely to the DirectX API, without any machine-specific optimizations, then the switch from nVidia to ATI would have been a non-issue. Instead, many Xbox games were written to the nVidia hardware, hence the need to emulate the nVidia GPU (and continue to pay royalties to nVidia) for Xbox 360.
 
That's the whole point! If every game (or even most games) on Xbox were written purely to the DirectX API, without any machine-specific optimizations, then the switch from nVidia to ATI would have been a non-issue. Instead, many Xbox games were written to the nVidia hardware, hence the need to emulate the nVidia GPU (and continue to pay royalties to nVidia) for Xbox 360.

Well, the whole "BC will be easier for the X360 going forward" idea was started by a rumor that MS was being stricter about approving games that went too "close to the metal."

So if MS were in fact encouraging devs to stick to the API, the only reason for it would be BC.

But there is doubt that MS is actually all that strict about enforcing API guidelines.

Regards,
SB
 