ATi's R420 launch and Half-Life 2, some thoughts.

I was happily keeping everything over there until his last post. Now I'm just going to go play some Silent Storm or something.
 
Ratchet said:
So can you put that in perspective for me wrt the NV40 Shader abilities?

It means - don't fall for marketing. It rarely tells the whole story.

The pipelines have different capabilities and functionality, and will have stronger and weaker points dependent on the types of shaders being used.

From the outset it may seem like NV40's ALUs are more capable, but we don't know their exact capabilities - they are described as both being fully featured, but they are not: one certainly does MADs while the other does MULs (I'm wondering if it's more the case that these are NV30's full ALUs split in two, with some instructions distributed and some duplicated), and only one of them can be used alongside any texture operations. Conversely, ATI still won't say exactly what instructions are in their second ALU, other than the PS1.4 modifiers, but they maintain it does have some specific instructions, and they can hide texture latency with ALU operations because the separate texture address processor doesn't interfere with ALU operations.
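To illustrate the latency-hiding point, here's a toy model in Python (purely illustrative - the latency figure is made up and neither chip actually schedules work this simply): a shader with enough independent ALU work between issuing a fetch and using its result keeps the ALUs busy, while one without it leaves cycles idle.

    # Toy model only - not a description of how NV40 or R420 schedules work.
    # One texture fetch is issued at cycle 0 and takes tex_latency cycles to return;
    # ALU instructions that don't need the texel can issue while it is outstanding.
    def cycles_for_shader(independent_alu, dependent_alu, tex_latency=8):
        overlap = min(independent_alu, tex_latency)
        idle = tex_latency - overlap                   # cycles with nothing to issue
        total = independent_alu + idle + dependent_alu
        return total, idle

    print(cycles_for_shader(8, 4))   # (12, 0): 12 ALU ops done in 12 cycles
    print(cycles_for_shader(2, 4))   # (12, 6): only 6 ALU ops done, half the cycles idle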

WRT utilisation, I believe that NV40 has gone a long way to eliminating many of its long shader issues, but not entirely, and it does still appear to have a greater drop-off in performance than the instruction lengths would suggest, indicating that they have pipeline bubbles where fewer pixels are in flight and the resources aren't fully maximised - with a different pipeline organisation this takes time to analyse and understand.
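To put rough numbers on "a greater drop-off than the instruction lengths would suggest" (illustrative figures only, not measurements from either card):

    # Illustrative numbers only - not benchmark data. Doubling shader length should
    # roughly halve the pixel rate; if the measured rate falls further than that,
    # the shortfall points at cycles where fewer pixels were in flight.
    short_len, short_rate = 10, 1000.0   # instructions, Mpixels/s (made up)
    long_len, long_rate = 20, 400.0      # measured rate on the longer shader (made up)

    expected_rate = short_rate * short_len / long_len   # 500.0 if scaling were ideal
    utilisation = long_rate / expected_rate             # 0.8 -> ~20% of issue slots lost
    print(expected_rate, utilisation)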
 
DaveBaumann said:
WRT utilisation, I believe that NV40 has gone a long way to eliminating many of its long shader issues, but not entirely, and it does still appear to have a greater drop-off in performance than the instruction lengths would suggest, indicating that they have pipeline bubbles where fewer pixels are in flight and the resources aren't fully maximised - with a different pipeline organisation this takes time to analyse and understand.

Would it be possible to improve this situation with more mature drivers? I also wonder what NV will do with all the GeForce FX-specific optimizations that were being done in the driver. One would think that the 6 series requires some very different optimization techniques in the drivers than were used for the GeForce FX series cards.
 
DaveBaumann said:
indicating that they have pipeline bubbles where fewer pixels are in flight and the resources aren't fully maximised
Just checking, but by "pipeline bubbles" do you mean gaps with no information flowing thru the pipeline?
 
digitalwanderer said:
DaveBaumann said:
indicating that they have pipeline bubbles where fewer pixels are in flight and the resources aren't fully maximised
Just checking, but by "pipeline bubbles" do you mean gaps with no information flowing thru the pipeline?

No, AFAIK that is normally what's meant by stalls. Bubbles are where computational power is used on dummy instructions rather than legitimate ones (which the chip couldn't issue at that point, for whatever reason).

That's my understanding anyway. Could well be wrong, in which case someone will correct me.
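A crude way to picture the difference (purely illustrative - no real GPU scheduler works like this): a stall leaves an issue slot empty, a bubble fills it with a dummy op; either way the cycle does no useful work.

    # Purely illustrative. None = stall (nothing issued that cycle),
    # "nop" = bubble (slot occupied by a dummy instruction).
    def run(slots):
        useful = sum(1 for s in slots if s not in (None, "nop"))
        lost = len(slots) - useful
        return useful, lost

    print(run(["mad", None, None, "mul"]))    # stalled: (2, 2)
    print(run(["mad", "nop", "nop", "mul"]))  # bubbled: (2, 2) - same work, same lost cycles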
 
jimmyjames123 said:
Would it be possible to improve this situation with more mature drivers?

I'd guess so, but the main question (for me) is: can they improve it enough to match up to the X800? (Assuming dig's right about its shader performance; then again, ATi have had the last couple of years to optimise the hell out of its shader cores, both HW- and SW-wise.)
 
jimmyjames123 said:
DaveBaumann said:
WRT utilisation, I believe that NV40 has gone a long way to eliminating many of its long shader issues, but not entirely, and it does still appear to have a greater drop-off in performance than the instruction lengths would suggest, indicating that they have pipeline bubbles where fewer pixels are in flight and the resources aren't fully maximised - with a different pipeline organisation this takes time to analyse and understand.
Would it be possible to improve this situation with more mature drivers? I also wonder what NV will do with all the GeForce FX-specific optimizations that were being done in the driver. One would think that the 6 series requires some very different optimization techniques in the drivers than were used for the GeForce FX series cards.
Well that is simple to answer... they will just drop them entirely. As soon as the full NV4X line is out and most people have forgotten the whole NV3X debacle, they will be removed. As for the people that own those cards... well, if you are an enthusiast and bought one of those cards expecting not to upgrade next generation, that is your own damn fault. There was plenty of info pointing to the problems the NV3X had at that time. I would also hope that the NV4X doesn't require the radical "optimizations" that the NV3X needed. Then again, if ATI releases a card that starts kicking the NV4X's ass in certain benchmarks, those optimizations will return rather quickly, I believe. :(
 