Absolutely nothing, but does that make the other predictions, despite being wrong, better?
If you are right more often than wrong, then yes, your "predictions" are valuable.
And is there any confirmation of this? Maybe Rys or anyone from the B3D staff can confirm or deny that this is, or has been, the case for them.
I remember one B3D member eBaying ex-review cards...
No. It might make sense if nVidia was more like Intel. But the GPU market is far too fluid for this strategy to work out.

Oh dear............
http://forum.beyond3d.com/showpost.php?p=1415696&postcount=533
http://forum.beyond3d.com/showpost.php?p=1415695&postcount=532
http://forum.beyond3d.com/showpost.php?p=1415692&postcount=529
http://forum.beyond3d.com/showpost.php?p=1415677&postcount=526
http://forum.beyond3d.com/showpost.php?p=1415668&postcount=522
http://forum.beyond3d.com/showpost.php?p=1415667&postcount=521
http://forum.beyond3d.com/showpost.php?p=1415666&postcount=520
http://forum.beyond3d.com/showpost.php?p=1415663&postcount=519
http://forum.beyond3d.com/showpost.php?p=1415660&postcount=516
http://forum.beyond3d.com/showpost.php?p=1415647&postcount=513
http://forum.beyond3d.com/showpost.php?p=1415634&postcount=510
Some of the posts above are partly directed at this question of yours.
Do you agree that all this adds up to at least one plausible motivation for taking a loss on GF100, assuming they are making one?
That's at least sorta kinda reasonable, but it rests on the assumption that nVidia would risk making a large order before being sure of the yields.

There was a story from Charlie that, apparently around the A2 tape-out timeframe, nVidia put in a large order for base-layer wafers on the assumption that when A2 came back, all would be good. The number quoted was in the range of $50 million, or roughly 10-15k wafers.
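For what it's worth, here's a quick back-of-the-envelope sketch in Python, using only the figures quoted in that story (whether they're accurate is a separate question), of what the order would imply per wafer:

```python
# Back-of-the-envelope check on the figures quoted above (purely illustrative).
order_value_usd = 50_000_000          # "in the range of 50 mil"
wafer_counts = (10_000, 15_000)       # "~10-15k wafers"

for wafers in wafer_counts:
    per_wafer = order_value_usd / wafers
    print(f"{wafers:,} wafers -> ~${per_wafer:,.0f} per wafer")

# 10,000 wafers -> ~$5,000 per wafer
# 15,000 wafers -> ~$3,333 per wafer
```

So the two numbers are at least internally consistent with a per-wafer price somewhere in the $3-5k range.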
"Not enough" is not accurate. Anandtech's 5870 crossfire test, showed that the 16X+16X vs 8X+8X configurations, have only 2-7% performance difference.
There are motherboards like the MSI Big Bang Trinergy (not to be confused with the Fusion), which sports an NF200 chip that gives a full complement of 32 PCIe 2.0 lanes, so the graphics cards can run at 16X+16X (there's also a 16X+8X+8X option); rough bandwidth numbers are in the sketch below. It works for both SLI and Crossfire, and I should know since I own one, as well as two 5850s.
The funny thing is that it's cheaper than quite a few vanilla high-end P55 mobos (vanilla = no NF200).
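To put the 16X+16X vs 8X+8X split into raw-bandwidth terms, here's a minimal sketch assuming PCIe 2.0's nominal 500 MB/s per lane per direction (theoretical peak only; the real-world impact is clearly far smaller, per the 2-7% figure above):

```python
# Nominal PCIe 2.0 throughput per lane, per direction: 5 GT/s with 8b/10b
# encoding works out to roughly 500 MB/s (theoretical peak, not real traffic).
PCIE2_MB_PER_LANE = 500

def per_card_gb_per_s(lanes: int) -> float:
    """Theoretical one-directional bandwidth per card, in GB/s."""
    return lanes * PCIE2_MB_PER_LANE / 1000

for label, lanes in (("16X+16X (NF200 board)", 16), ("8X+8X (plain P55)", 8)):
    print(f"{label}: ~{per_card_gb_per_s(lanes):.0f} GB/s per card")

# 16X+16X (NF200 board): ~8 GB/s per card
# 8X+8X (plain P55): ~4 GB/s per card
# A 2x gap on paper, yet only 2-7% in the CrossFire benchmarks cited above.
```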
That's at least sorta kinda reasonable, but it rests on the assumption that nVidia would risk making a large order before being sure of the yields.
No. It might make sense if nVidia was more like Intel. But the GPU market is far too fluid for this strategy to work out.
What did you think? That reviewers always kept the cards they were sent, free of charge?
A 2-7% difference isn't something to sneeze at. If the gap between the two chips is on the order of a few percent, then it does make a significant difference, especially when considering the HD 5970 vs., say, 2x GTX 470, and it becomes even more important in the next generation, when the cards are expected to be even faster. I would suggest that most P55 boards do not have the extra NF200 chip installed, am I right?
2-7% is not much at all. You can always clock the P55's PCIe bus to 102-107MHz; it handles that quite easily. I've seen boards that can do 150MHz on the PCIe bus.
Even without considering the OC option (and despite all the benefits of P55-based platforms compared to pretty much any other Intel chipset), 7% more performance would give about 3 extra frames on top of a 45fps framerate, for example, thus reaching 48fps (quick math in the sketch at the end of this post). No biggie!
I don't know if I understood correctly what you wanted to say with your 5970 vs GTX 470 SLI example, but if you meant the 5970 in a PCIe 16X slot (obviously) and the GTX 470s at 8X+8X compared to 16X+16X, then yes, I see your point. The GTX 470s would lose whatever little advantage they may have had. It is still small though, and especially in the GTX 470's case, with all its flaws, personally I wouldn't think twice about it. A review site should bench at 16X+16X though, as that is the norm.
P55 mobos, even high-end ones, do not have an NF200 chip in their default configuration, as you correctly state. A user who is putting together a high-end rig will not balk at the extra 50 euros he has to spend, though.
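A quick sketch of both points above (the 45fps example and the PCIe overclock), assuming the worst-case 7% penalty from the Anandtech numbers and that link bandwidth scales roughly linearly with the PCIe reference clock (which is the premise of the overclock suggestion):

```python
# Assumptions: worst-case 7% scaling penalty from the Anandtech figures above,
# and the hypothetical 45 fps baseline used in the example.
baseline_fps = 45.0
penalty = 0.07

print(f"8X+8X: {baseline_fps:.0f} fps -> 16X+16X: ~{baseline_fps * (1 + penalty):.0f} fps")
# 8X+8X: 45 fps -> 16X+16X: ~48 fps

# The OC route: link bandwidth scales with the 100 MHz PCIe reference clock,
# so a modest bump claws most of that difference back.
for pcie_clock in (102, 107):
    print(f"PCIe clock {pcie_clock} MHz -> ~{pcie_clock - 100}% extra bus bandwidth")
# PCIe clock 102 MHz -> ~2% extra bus bandwidth
# PCIe clock 107 MHz -> ~7% extra bus bandwidth
```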
Even if they managed to make a small profit on each GTX 480/460, it would probably be insignificant compared to design costs. I suspect the current models are mostly there to market later derivatives, and profit/loss on them is not a real concern. That may change if they can get a large number of them to market.
Does anyone know what proportion of previous-generation cards sold used the high-end die? Or what proportion of Evergreen cards sold have been Cypress?
Then they should grill beef and not egg.

Hey, that's not trying hard enough. They should overclock it and run Furmark; otherwise it's not a realistic grill test!