It still comes down to the architecture of the family. If you want to pick on something, pick on that, not R580 per se (and yeah, I've said that a few times already). Given that architecture, what else could they have done? The answer seems to be, "add two more quads instead". Which would have given them not only 8 more texture units, but also 8 more ROPs... and they'd still have been math-underpowered per pipe. So they'd have had to add at least one more shader per pipe, instead of two. So now you have double the shaders per pipe *and* two more quads (8 more TMUs, branching units, and ROPs).
How big do you suppose that would have been?
And what would we have heard then? We'd have been inundated with ATI's silly and wholly unjustified decision to have a huge chip with 50% more ROPs than NV, and treated to analyses that showed they weren't getting nearly the advantage out of them that they should. This whole argument seems to boil down to R580 sucks because it isn't G71. Well, gee. You could turn that around, but that would be pretty silly too.
If you look at it from a consumer pov, X1900 right now is cheaper than G71 (though G71 is early enough in its lifecycle that it's hard to say for how long), performance competitive, and has feature advantages.
If you look at it from a stockholder perspective, ATI's non-chipset business margins are within their longtime corporate average (34-38%). NV has had a breakout upwards -- bully for them if you are looking to buy stock, but don't miss the fact that NV is the one who changed the status quo there, not ATI. And if you go back and put your consumer hat on, you could just as easily ask "hey, waitaminute, why is it that NV made a significant cost improvement and didn't pass on any of that love to me?"