GTX 460 Tested
Core Clock 900MHz
Shader Clock 1800MHz
Effective Memory Clock 4400MHz
http://www.coolaler.com/showthread.php?t=242673
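For reference, a quick sanity check on those numbers. This is a minimal sketch in Python, assuming the usual GTX 460 1GB configuration (256-bit GDDR5 bus) and its stock clocks of 675/1350/3600 MHz as the baseline:

```python
# Hedged sanity check on the leaked clocks. Assumes the 1GB GTX 460
# variant (256-bit memory bus) and stock clocks of 675 MHz core,
# 1350 MHz shader, 3600 MHz effective memory.

BUS_WIDTH_BITS = 256  # assumed bus width of the 1GB variant

def bandwidth_gbps(effective_mem_clock_mhz: float) -> float:
    """Memory bandwidth in GB/s: effective clock x bus width / 8 bits per byte."""
    return effective_mem_clock_mhz * 1e6 * BUS_WIDTH_BITS / 8 / 1e9

stock = {"core": 675, "shader": 1350, "memory_eff": 3600}
leaked = {"core": 900, "shader": 1800, "memory_eff": 4400}

for clock in stock:
    gain = leaked[clock] / stock[clock] - 1
    print(f"{clock}: {leaked[clock]} MHz ({gain:+.1%} vs stock)")

print(f"bandwidth: {bandwidth_gbps(leaked['memory_eff']):.1f} GB/s "
      f"(stock: {bandwidth_gbps(stock['memory_eff']):.1f} GB/s)")
```

Note the shader clock is exactly twice the core clock, as expected for GF104's hot clock, which is consistent with this being a heavily overclocked card rather than a new SKU.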
Local nVidia PR allowed pre-NDA leaking of test results, which show that the GTX 460 1024MB* is faster than the HD 5850**. I expect similar "leaks" will appear everywhere...
* significantly overclocked
** default clocks
Which essentially means that they have killed off the market for their top-end chips, unless of course they can come out with a substantially faster revision for those too. In any case, the important comparison may indeed be with the HD 6850, considering ATI's near-clockwork release schedule thus far.
Nvidia couldn't care less if their IHVs are left holding unsellable inventory. They already have the money in pocket.
That isn't reasonable. In order to survive, nVidia needs to cultivate and maintain its relationship with its relatively small number of customers. Piss them off, and they could easily find themselves with a fraction of the sales the next go-around. It is most definitely in nVidia's best interest not to leave board vendors holding unsellable product.
It's not reasonable, but it is reality. What do you think happens with GTX 465 sales now?
I'm not sure we have the information available to make firm conclusions on that. We'll see. But in any case my point was slightly different: nVidia most certainly does care whether its small number of customers are left holding unsellable product. Either that or they're stupid, and their business success to date indicates that isn't the case.
Exactly, regarding the top-end chips. I wonder about the performance of a fully enabled GF104 at high clocks. Would it be enough to beat GF100? I'd guess it could at least come close to the 5870.
1751.25 MHz × 384 SPs = 1401 MHz × 480 SPs.
So you'd need a full GF104 with shaders running around 1750MHz to be roughly equivalent to GF100 (geometry, cache, and bandwidth aside).
I doubt NVIDIA could make many of those, let alone within 300 W, especially when they don't seem able (or, optimistically, willing) to make even a 384-SP variant.
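That arithmetic can be checked directly. A minimal sketch, assuming ALU throughput simply scales with shader count times hot clock, taking the GTX 480 (480 SPs at 1401 MHz) as the target:

```python
# Rough ALU-throughput equivalence, assuming performance scales with
# shader count * shader (hot) clock, and ignoring geometry, cache and
# memory bandwidth as noted above.

GTX480_SPS = 480          # GF100 as shipped in the GTX 480
GTX480_HOT_CLOCK = 1401   # MHz
GF104_FULL_SPS = 384      # hypothetical fully enabled GF104

required_clock = GTX480_SPS * GTX480_HOT_CLOCK / GF104_FULL_SPS
print(f"Required GF104 hot clock: {required_clock:.2f} MHz")  # -> 1751.25 MHz
```

The leaked card above already runs 1800 MHz shaders, though with only 336 of the 384 SPs enabled.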
Which doesn't matter to Nvidia. They've already probably sold most of their original batch of Fermi chips.
Nvidia couldn't care less if their IHVs are left holding unsellable inventory. They already have the money in pocket.
And I'm sure Nvidia would prefer to sell a less costly chip with higher margins in the consumer space. Then push Fermi chips into the Quadro and Tesla lines where the margins are astronomical.
They have benchmarks showing the GTX 480 ahead of the 5870, so its job is done. Retiring it and moving on is probably what they're thinking, keeping just enough in the channel to make sure they can continue to lay claim to the fastest single GPU.
Regards,
SB
Well, that may not last long if Dave is being Dave and already has his refresh models coming down the pipeline. Failing that, there's always the dreaded price cut, and the HD 58xx models have been out long enough for the non-recoverable costs to have been paid back. They also seem to have built up quite a following in the absence of any real Nvidia competition. I suspect the shoe might be on the other foot here in terms of momentum, and for once ATI may be in the driver's seat.
I wouldn't count on that. Nvidia faced far tougher competition back when R300 was around and didn't lose many fans from it. If they can get back on track, I'm sure they'll be fine.
I dread the day Nvidia or ATI gets into enough trouble that one of them can no longer compete. That would be a bad day for 3D hardware no matter which side of the fence you sit on.
Regards,
SB