Nvidia is currently finalizing the clocks on the chip and is waiting to see the final performance of R700. Nvidia's chip is likely to be slightly faster than ATI's dual-chip offering, but we suspect the performance will be quite close.
Nvidia is certainly not giving up without a fight, as its entire marketing has been built on being the performance leader. It simply cannot let the crown slip out of its hands, and that is why it works so hard to maintain its performance dominance.
Since when does R700 scale 100% everywhere?
It will be AFR, so many people won't buy it simply because it's SLI-on-a-card.
That's true - but it's also true for all other SLI-on-a-stick solutions. You just get a wider range of usable options on the HD 4870 X2... so as long as you choose the proper settings to achieve a desirable frame rate, there's no reason not to consider R700.
Well, it's been indicated by choosing a setting which isn't prone to exhibit micro-stuttering, to be exact.
You say this as though there are settings which *will* exhibit micro-stuttering.
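To put the micro-stuttering argument in concrete terms - a minimal sketch with made-up frame times (not measured data), showing why the same AFR imbalance that is invisible at high frame rates turns into visible stutter once the settings get demanding:

def stats(frame_times_ms):
    # Average FPS vs. the FPS implied by the slow half of each AFR frame pair.
    avg = sum(frame_times_ms) / len(frame_times_ms)
    worst = max(frame_times_ms)
    return 1000.0 / avg, 1000.0 / worst

# Light settings: frames alternate 8 ms / 12 ms -> 100 fps average,
# and even the slow frames pace like ~83 fps, so the unevenness is invisible.
fast = [8, 12] * 10

# Demanding settings, same 40/60 imbalance: 20 ms / 30 ms -> 40 fps average,
# but the slow frames pace like ~33 fps, which is where stutter gets noticeable.
slow = [20, 30] * 10

for name, trace in (("light settings", fast), ("demanding settings", slow)):
    avg_fps, paced_fps = stats(trace)
    print(f"{name}: average {avg_fps:.0f} fps, paced more like {paced_fps:.0f} fps")

The relative imbalance is identical in both traces; only the absolute frame times change, which is exactly why the settings you test at decide whether you see it.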
Well, these "many people" (you're talking about a small subset of an already tiny group, aka hardware enthusiasts) will be missing out if they choose to believe R700 has the same limitations as previous multi-GPU boards. It's already been demonstrated that 48x0 hardware in CFX does not suffer from micro-stuttering.
When and where? From whom?
I don't see ANY review yet, just previews.
I've seen enough promises from ATi/AMD over the last few years - one successful chip is not enough to blindly believe anything they claim.
Of course there are.
http://www.pcgameshardware.com/aid,653711/
WTF? Perfect micro-stuttering? Sorry, I don't buy it for a second. Never since the introduction of AFR have I seen anything even remotely resembling that graph you linked. There is more going on here than meets the eye.
If you want to be a believer, fine, but don't just ignore facts contradicting the faith... like evidence that the Earth is round.
R7xx is just way more powerful than R6xx, hence, in order to observe the stutter, one needs more demanding situations and game settings.
Did AMD claim ANY fundamental changes to how frames are rendered? No, AFAIK.
Chip-to-chip communication was increased 2-3x, yet even then people have calculated that the bandwidth is nowhere near what would be needed to make a multi-GPU card act as a monolithic one.
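For what it's worth, here is the back-of-the-envelope version of that bandwidth argument, using the approximate figures that were circulating for the X2 (treat them as rough, not confirmed specs): roughly 115 GB/s of local GDDR5 bandwidth per RV770, about 8 GB/s per direction over the on-board PCIe 2.0 bridge, plus the reportedly ~5 GB/s CrossFire Sideport.

# Rough numbers only - commonly cited figures for the HD 4870 X2, not vendor-confirmed.
local_mem_bw   = 115.2   # GB/s per RV770: 256-bit GDDR5 at ~3.6 Gbps effective
pcie_bridge_bw = 8.0     # GB/s per direction, PCIe 2.0 x16 through the on-board bridge
sideport_bw    = 5.0     # GB/s, the extra CrossFire Sideport (reportedly unused at launch)

inter_chip_bw = pcie_bridge_bw + sideport_bw

print(f"Local memory bandwidth per GPU : {local_mem_bw:6.1f} GB/s")
print(f"Total chip-to-chip bandwidth   : {inter_chip_bw:6.1f} GB/s")
print(f"Shortfall vs. local memory     : {local_mem_bw / inter_chip_bw:.0f}x")

Even taking the optimistic total, each GPU would see the other chip's memory roughly an order of magnitude slower than its own, so the two chips cannot behave like one big GPU with a shared memory pool.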