Report says the X1000 series is not fully SM3.0?

With a bit of luck, I'll have a few more details tomorrow about it ;)

Hmmm I just saw that they've replaced the diagram...

[Attached image: SGX.gif — PowerVR SGX block diagram]


http://www.powervr.com/Products/Graphics/SGX/Index.asp#
 
All this discussion about SM3 compliance just ignores the basic facts about how these standards were (and are) made. Microsoft AND the IHVs decide together what direction the hardware takes in the coming years, what kind of features are necessary, and what kind of features are possible in what timeframe.
Then the specs for the features are defined in a way that suits all major IHVs.
Then it is decided whether features are optional (caps bits) or mandatory.

This is all based on consensus and all decisions are made many years before the silicon reaches market, in fact these decisions are probably made in early design phases. Remember that the SM3 specs are older than SM2 hardware.

ATI probably discovered the big mistake in the SM3 spec too late: the silicon cost and the high latency make this feature unusable, so they found a loophole in the spec to safely drop it.

Yes, the feature was meant to be mandatory, but real-world issues are more important, and I bet M$ would never have made this feature mandatory had ATI discovered the problem early enough...
 
DemoCoder said:
Certainly if you read here

It seems like VT is a must. The loophole is that DX is allowed to report that NO texture formats are supported, but to me this is kind of a violation of the spirit of the shader model.

It's as if I shipped you a 3D accelerator that only supported Gouraud shading because it reports *no* texture formats as supported, even though PS2.0 suggests that the TEX instruction is a must.

So you support TEX vacuously. It can never be used, because no texture formats are supported, ergo, it passes the implementation unit test!
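The "vacuous compliance" argument above can be illustrated with a small sketch. This is not real D3D9 code; the format names mirror the float formats NV40 actually exposes for vertex texture fetch, but the `driver_caps` dictionaries and the helper function are hypothetical stand-ins for the driver's per-format capability query (in D3D9, roughly what `CheckDeviceFormat` with `D3DUSAGE_QUERY_VERTEXTEXTURE` answers):

```python
# Candidate vertex-texture formats an application might probe for.
# R32F and A32B32G32R32F are the two float formats NV40 supports for VTF.
CANDIDATE_FORMATS = ["R32F", "A32B32G32R32F", "A16B16G16R16F"]

def supported_vtf_formats(driver_caps):
    """Return the subset of candidate formats the driver claims to
    support for vertex texture fetch (hypothetical caps query)."""
    return [fmt for fmt in CANDIDATE_FORMATS if driver_caps.get(fmt, False)]

# An NV40-style SM3.0 part: exposes two float formats for VTF.
nv40_caps = {"R32F": True, "A32B32G32R32F": True}

# The loophole: the vertex shader model advertises the texture-fetch
# instruction, yet the driver reports zero supported formats, so the
# feature can never actually be used.
loophole_caps = {}

assert supported_vtf_formats(nv40_caps) == ["R32F", "A32B32G32R32F"]
assert supported_vtf_formats(loophole_caps) == []  # "compliant", but unusable
```

The point of the sketch: both drivers pass a naive "does the instruction exist" check, but an application that enumerates usable formats gets an empty list from the second one and has to fall back, which is exactly DemoCoder's complaint.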

Democoder, is Pacific Fighters the only game that uses some kind of VT right now, and isn't the performance "quite" good on the 7800 series compared to the 6800?
As you say there are limitations, but is the overall VS performance of the G70 compared to the NV40 a result of the extra VS units, or is it something else? Has the 40 MHz(?) clock difference been confirmed to apply to the vertex setup, or just to the VS units?

I'm not replying to your post, but rather just wondering about this, as I have been for quite some time.
 
Ailuros said:
Hmmm I just saw that they've replaced the diagram...
Should that be Coarse Grain Scheduler, or are the Course and the Grain separate terms?
 
Did the X1800 XL ship with WHQL drivers? That's what it comes down to for me, and I agree with NV's statement in that regard: MS owns it, and they have the right and ability to say "nice try, but homey don't play that". They didn't. Further, it sounds like even if they did, ATI would just deal with it as a driver route-around, so I'm still trying to figure out where the great sin here is.
 
Pete said:
Should that be Coarse Grain Scheduler, or are the Course and the Grain separate terms?

Probably just a typo; course doesn't make any sense to me.
 
I was under the impression that ATI couldn't actually "simulate" the full range of techniques one might centre upon VT with driver "kludges".

Nice diagram Ailuros! All we need is a nice article...

Jawed
 
Simon F said:
A solution that didn't hide vertex texture latency would be pointless.

Not for a solution that doesn't have vertex texture latency. Impossible, you say? Piffle, keep working at it. :LOL:
 
Pete said:
Should that be Coarse Grain Scheduler, or are the Course and the Grain separate terms?
oops. I've emailed the guys about this!
 
Aw, no more dispatching whole wheat to one shader unit, whole oats to the next.
 