Why did it take so long? a.k.a., RV770 done right! (an optimistic view)

Conspiracy theory: this was a quick fix after their attempts to make a 40nm version failed or proved not worth the bother.
Or maybe because the RV800 design was coming along so well on the 40nm node that bothering further with the current generation wasn't worth it either way.
Note though that xbitlabs is testing with "high quality" driver options. Not wanting to start a fight here over whether that's fair or not, but some other reviews will use default options, which tends to help Nvidia's cards more than ATI's (since they can't do "perfect" aniso filtering even with high quality). That might make just enough of a difference that, with default settings, even the OC cards can only compete with the GTX 285.

http://www.xbitlabs.com/articles/video/display/radeon-hd4890.html

"This is a very good result for the card that wasn't initially supposed to compete against the single-processor flagship product from the enemy camp."
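On the aniso point: the reason "high quality" filtering costs performance is that the number of texture probes grows with how stretched the pixel footprint is, and driver "optimizations" clamp that ratio. A rough, purely illustrative Python sketch -- no vendor's actual hardware LOD math is reproduced here, and the footprint numbers are made up:

    import math

    # Rough model: anisotropic filtering takes one (tri)linear probe per
    # step along the footprint's major axis; drivers can clamp the ratio.
    # Illustrative only -- not any vendor's actual LOD computation.
    def aniso_probes(len_major, len_minor, max_aniso):
        ratio = min(len_major / len_minor, max_aniso)
        return math.ceil(ratio)

    # A surface seen at a grazing angle: a 14x1-texel footprint (made up).
    print(aniso_probes(14.0, 1.0, 16))  # "high quality": 14 probes
    print(aniso_probes(14.0, 1.0, 4))   # clamped/optimized: 4 probes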
"Oh, that's interesting. Reviews now mention it supports memory bursts due to these docs."

Just a quick update: RV790 does *not* support burst memory reads; we had an error in our documentation that will be fixed and updated shortly, sorry about that. Y'all are just too quick scanning through things and finding nuggets like that. ;-) There are some tweaks in RV790, but burst reads is not one of them.
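For context on what burst reads would buy (had they been there): a memory controller services one wide burst much more cheaply than many scattered requests. A toy Python sketch of the idea -- the burst size and the model are made up and say nothing about RV790's actual memory controller:

    # Toy model: memory is fetched in fixed-size bursts; sequential
    # accesses share bursts, scattered ones each touch their own.
    # Made-up numbers -- not a model of RV790's memory controller.
    BURST_BYTES = 64
    ELEM_BYTES = 4

    def bursts_touched(addresses):
        """Count distinct bursts covered by a list of byte addresses."""
        return len({addr // BURST_BYTES for addr in addresses})

    n = 1024
    sequential = [i * ELEM_BYTES for i in range(n)]   # linear walk
    scattered = [i * BURST_BYTES for i in range(n)]   # one element per burst

    print(bursts_touched(sequential))  # 64   (16 elements per burst)
    print(bursts_touched(scattered))   # 1024 (one burst per element)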
http://www.xbitlabs.com/articles/video/display/radeon-hd4890.html
In this review the HD4890 OC (900/3900) beats the GTX285 in 11 of 16 tests, with one draw (2560x1600). Pretty impressive!
I guess there's a good reason that it doesn't support this (doesn't it have to do with the 256-bit memory bus, or what?).

There are no logic design changes to HD 4890; the changes are done from a physical design point of view.

Ok, thanks for the clarification. Still, considering the timeframe and the fact you need a new chip anyway, it seems slightly odd to me that you wouldn't also include some minor logic changes (especially those already implemented in other chips, which shouldn't be much risk). I guess that feature won't really help performance that much, then.
I'm seeing conflicting numbers on power consumption; some say it's (much) better than the 4870, some say worse. For example:

http://www.xbitlabs.com/articles/video/display/radeon-hd4890_6.html#sect0

vs

http://www.pcper.com/article.php?aid=684&type=expert&pid=11

Xbitlabs tested a PowerColor reference board. I'm not sure, but it seems that Asus, HIS and the other OC boards have a different BIOS and PowerPlay doesn't work at all. Even the OC versions of the HD4850 don't underclock fully in 2D, only down to 500MHz, so power consumption is much worse.
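For a sense of scale: dynamic power goes roughly as C * V^2 * f, so a board that only drops to 500MHz in 2D (without a deeper voltage/clock state) leaves a lot of idle savings on the table. A back-of-the-envelope Python sketch -- all voltages and the low-power state below are made-up values, not actual BIOS settings:

    # Back-of-the-envelope idle power using P ~ C * V^2 * f.
    # Voltages/clocks are made-up illustrative values, not real BIOS data.
    def rel_power(v, f, v_ref, f_ref):
        """Dynamic power relative to a reference voltage/clock point."""
        return (v / v_ref) ** 2 * (f / f_ref)

    V3D, F3D = 1.25, 850.0   # hypothetical full 3D state
    V2D, F2D = 1.05, 500.0   # partial downclock only (the OC-board case)
    VLO, FLO = 0.90, 240.0   # hypothetical full PowerPlay 2D state

    print(round(rel_power(V2D, F2D, V3D, F3D), 2))  # ~0.42 of 3D power
    print(round(rel_power(VLO, FLO, V3D, F3D), 2))  # ~0.15 of 3D power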
I've just read the Sacred 2 chapter of the Anandtech review. If nVidia asked for PhysX tests (and quite probably they did), that's an epic fail.
And ouch. Nvidia asked, quite profusely, for Anandtech to mention PhysX. And dock ATI points for not having it. Ouch. Totally opposite effect, I guess, be careful of what you ask for... You might just get it.
Hmmm... insanely good scaling at 1680x1050 and on up.
http://www.techpowerup.com/reviews/ATI/HD_4890_CrossFire/28.html
Would be more interesting if they also compared the scaling of the other multi-GPU setups.
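For anyone eyeballing those charts: "scaling" there is just dual-card fps over single-card fps, with 2.0x being perfect. A quick Python helper for comparing setups from published numbers -- the fps values below are placeholders, not figures from the linked review:

    # Multi-GPU scaling: dual-card fps relative to one card (2.0 = perfect).
    # The fps numbers are placeholders, not taken from the review above.
    def scaling(fps_single, fps_dual):
        return fps_dual / fps_single

    setups = {
        "HD4890 CrossFire (placeholder)": (60.0, 115.0),
        "GTX 275 SLI (placeholder)": (62.0, 105.0),
    }
    for name, (one, two) in setups.items():
        s = scaling(one, two)
        print(f"{name}: {s:.2f}x ({(s - 1) * 100:.0f}% gain)")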
Yeah, I don't get that. For the most part PhysX is still just a promise, and while it's fine to talk about its potential, it's way too early to be forcing it down reviewers' throats. If it were out there in a lot of games and people were ignoring it, that would be another story. But Mirror's Edge and Sacred aren't enough for Nvidia to be so belligerent about including it in reviews.