X800... at Meristation

Well, get your facts straight. I said 549€ for the 6800 Ultra, and that a certain online store always inflates its prices before release. What is that store's price? 589€. And what is the official price for the card (as you yourself pointed out)? 549€. So perhaps there's something fishy going on, don't you think?
My PMs are open if you want to continue this discussion, but I think it's useless...
 
http://www.meristation.com/sc/articulos/articulo.asp?c=HARD&cr=5218

Conclusions:

The first conclusion is that the most optimistic expectations we had built up have not been met. Before testing the cards we had heard claims that this new generation would be a qualitative leap like nothing seen before. Having looked at the new features, our impression is that, for example, the arrival of the first GPU (the jump from TNT to GeForce) on NVIDIA's side, or the later introduction of shaders, was far more significant. The new features presented by both manufacturers are similar, differing in name and in how well each has been polished, but always along the path laid out by DirectX. Most of them are small visual improvements and optimizations of effects that were already being done. The real difference is that each shader revision allows more complex effects, making better use of resources that, in turn, keep growing, which translates into an even bigger increase in performance.

That performance has not surprised us all that much either. We had access to performance sheets produced by the manufacturers themselves, in which the gains over the previous hardware ranged between 80% and 150%. In our tests we did see that, although the Radeon 9800 XT comes out seriously beaten, in some benchmarks it puts up a fight and even manages to beat the new chips. Which teaches us that, if we look hard enough, we can always find a test that gives us the results we want. But it is one thing for the expectations not to be fully met, and another to deny the obvious: both NVIDIA and ATi have put genuine beasts within our reach. The jump in raw power really is substantial, and we noticed it in several of the tests. Even using one of the most powerful machines that can be built today, a Pentium 4 at 3.4 GHz with 1 GB of RAM, bottlenecks appeared that we frankly thought impossible; that is, the graphics cards could handle more than the rest of the system was able to feed them.

The cards' architecture has a lot to do with this. NVIDIA's use of 16 real pipelines, or render channels, was a jump that put ATi's product in a difficult position, and the same goes for 32-bit floating-point operations. In the end ATi has put the doubts to rest and presents a product with those same capabilities. So both architectures are very evenly matched, and the differences have to be found elsewhere. NVIDIA never tires of talking about its exclusive support for Shader Model 3.0, while ATi tires even less of saying that it really adds nothing for now. The debate will no doubt drag on, and only time and sales will show which side users agree with. So, while in architecture and features the two chips run very even, looking to the future NVIDIA and its Shader Model 3.0 come out ahead, but that advantage is clearly outweighed by the problems the card's requirements can cause.

I have been looking at graphics cards for many years now, and the feeling I had holding the X800 in one hand and the 6800 in the other is one I already had long ago. Back then I was also holding an NVIDIA card, with a 3dfx in the other hand. Both performed more or less the same, but 3dfx was being forced to make PCBs so long they did not fit in many cases, or to resort to solutions like running two cards in parallel.
Meanwhile NVIDIA did the same thing, at the same price and without any of those problems. Now much the same is happening to me. Both cards are excellent, but the requirements of the new GeForce make me think ATi's technology is the better bet. Right now, having tested them, that is the detail I keep coming back to when deciding between one and the other. That said, if I bought an ATi I would not settle for keeping the cooling system fitted to the samples we analysed. With hardware like this and its very interesting GDDR3 memory, overclocking is an inevitable future, and that cooler does not allow it.

If we stick strictly to performance, the choice will depend on how we like to play. Although neither chip pulls far ahead in the tests, the NVIDIA suffered more as the filtering demands grew. Finally, I want to make it very clear that what we have analysed are the first samples the manufacturers offered to the press. Those samples still have to be optimized and, above all, they need drivers that squeeze out everything they carry inside. The comparison was complicated enough because both chips caused problems in certain situations; we could not take our eyes off the screen for fear that, in a given scene, one card was drawing a shadow or reflection more than the other. Everyone knows about NVIDIA's problem with Far Cry, already fixed by a patch, so you will know what we are talking about. This comparison has been a first contact with the chips, still very green, but we have already received confirmation that it will not be long before the first retail cards arrive. Then the tests will be approached differently, and perhaps the conclusions will change too. What we do not expect to change are the prices. Go buy yourselves a piggy bank.
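A rough way to picture the CPU bottleneck they mention (my own back-of-the-envelope simplification, not something Meristation states; t_CPU and t_GPU are just my shorthand for the time each side needs per frame):

FPS ≈ 1 / max(t_CPU, t_GPU)

Once the Pentium 4 takes longer to prepare a frame than the card takes to draw it, a faster GPU stops adding frames. That also fits their filtering results: the gap between these chips only really shows where t_GPU dominates, i.e. at high resolutions with AA and anisotropic filtering turned up.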
 