AMD confirms R680 is two chips on one board

If you believe FUDzilla, there are some new HD3870 X2 cards on the way using faster DDR4 memory.

Link.

A few ATI partners have confirmed that they are almost done with their GDDR4 R680 design. Radeon HD 3870 X2 cards are currently selling for around €400, but they are all based on slower DDR3 memory clocked at 1800MHz for the reference cards.

This card should be just a bit more expensive and it should launch in a few weeks, but we don’t expect any significant speed jumps with the new DDR4 card.

The point is, ATI can do it and the partners want it, so they will definitely follow up with a refreshed card.
 
It's not so unbelievable; the memory controller is 100% ready. I'm not really sure why they didn't go for GDDR4 in the first place, except maybe cost. But GDDR4 runs cooler and draws less power, so I'd think a dual-GPU board would benefit nicely from the upgrade.
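For a rough sense of what the memory swap alone buys, here's a back-of-the-envelope bandwidth comparison. This is just a sketch, assuming the X2 keeps the single HD 3870's 256-bit bus per GPU, with GDDR3 at the 1800MHz effective quoted above and GDDR4 at the 2250MHz effective the single HD 3870 ships at:

# Back-of-the-envelope peak memory bandwidth per GPU.
# Assumptions (not from the article): 256-bit bus per GPU, as on the
# single HD 3870; GDDR3 at 1800MHz effective (the reference clock quoted
# above); GDDR4 at 2250MHz effective (the single HD 3870's GDDR4 clock).

def bandwidth_gb_s(effective_mhz, bus_bits=256):
    """Peak bandwidth in GB/s: effective transfer rate times bus width in bytes."""
    return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

gddr3 = bandwidth_gb_s(1800)  # 57.6 GB/s per GPU
gddr4 = bandwidth_gb_s(2250)  # 72.0 GB/s per GPU
print(f"GDDR3: {gddr3:.1f} GB/s  GDDR4: {gddr4:.1f} GB/s  "
      f"uplift: {100 * (gddr4 / gddr3 - 1):.0f}%")  # roughly 25%

Even where games don't scale with that ~25% extra bandwidth, GDDR4's lower operating voltage should still help the heat and power picture on a two-GPU sandwich like the X2.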
 
Indeed, and in fact one drawback I have yet to see addressed is the inability to turn off CrossFire when it's dragging down your performance. With two separate cards I at least have the option of unticking the checkbox in the control panel if I run into a game that sucks canal water with CF enabled; an X2 owner has no such luck.

The X2, just like the upcoming GX2, will live or die by drivers. If they do a good job with drivers, the cards will be excellent for the price. If the drivers suck, these cards will get very little love.

Maybe I'm reading this bit wrong, so forgive me if I'm mistaken, but:

from: http://www.digit-life.com/articles3/video/rv670-3-part2.html

"The card from GeCube has another peculiarity - four DVIs instead of two (in the reference card). Moreover, non-standard BIOS in this card does not allow the driver to enable CrossFire by default (it cannot be disabled in the reference card). That is users can disable CrossFire and get just two cards, each with two DVIs. In this case you can plug four monitors to your card. So, flexibility of this card is praiseworthy."

and then from: http://www.digit-life.com/articles3/video/rv670-3-part3.html

"But don't forget about its power connectors on the rear, so the plugged power cables will make the card longer. And don't forget, this graphics card is currently the fastest of P¥2 products. We should also note four DVIs and an option to disable CF. In this case one can plug four monitors to this card (as a multi-monitor solution)."
 
Ahhh Tardzilla. Just love those inconsistencies.

Is it GDDR4 or DDR4? :rolleyes: (Rhetorical, that is; they did the same thing with the 2900.) Guess my decision to wait a few weeks before buying an X2 was a good one; this all reminds me of the x1900xt -> x1950xtx sidegrade. If the price barely moves, it should be a nice little bargain.
 
I think the odd wording in those digit-life quotes is a translation artefact. They're Russian, and the English translations of their articles are usually mediocre, although the articles themselves are quite good IMHO.
 
They are saying that the GeCube card, which doesn't use the reference design, has the quirk of letting you disable CrossFire in the control panel, a setting ATI's reference design doesn't expose. Their review is great IMO. I really like how they photograph the cards with the coolers removed and take a close look at the coolers themselves. I think their RightMark tests are probably more useful than 3DMark too, for synthetic testing.
 
The problem with the TechReport review is that I don't know which "Ultra" they used. Folks can buy an 8800GTX for under 400usd @ Newegg with a rebate, and those easily do 610MHz on the core. Many can do 630 or so with the stock cooler, and up to 650 with better cooling.

There is no reason to buy an Ultra IMO, unless you plan to water-cool it and shoot for 700 or more on the core.

Most folks @ 16x12 or less would do perfectly fine with an overclocked 8800GT or 8800GTS 512MB. In some cases they'd be GTG @ 1920x with just a GT too.

http://www.driverheaven.net/reviews/8800GTs/index.php
 
If you read the TechReport review fully, I believe he mentions using a stock Ultra; however, he quotes prices for overclocked GTXs, saying most can reach similar speeds to the Ultra at a much lower price. So while buying an Ultra is a bad idea, the stock Ultra's numbers are fairly representative of a typical overclocked GTX.
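To put rough numbers on that (a quick sketch, assuming the usual reference core clocks of 575MHz for the 8800GTX and 612MHz for the Ultra, against the 610-650MHz overclocks quoted above):

# How the overclocked-GTX core clocks quoted above stack up against a
# stock Ultra. Reference clocks are assumptions, not from the review:
# 8800GTX = 575MHz, 8800 Ultra = 612MHz.

GTX_STOCK, ULTRA_STOCK = 575, 612  # MHz, reference core clocks

for oc in (610, 630, 650):  # typical GTX overclocks mentioned above
    vs_gtx = 100 * (oc / GTX_STOCK - 1)
    vs_ultra = 100 * (oc / ULTRA_STOCK - 1)
    print(f"GTX @ {oc}MHz: +{vs_gtx:.0f}% over a stock GTX, "
          f"{vs_ultra:+.0f}% vs a stock Ultra's core clock")

Even the mild 610MHz overclock lands within a hair of Ultra core clocks; the Ultra's higher shader and memory clocks mean it's only an approximation, but close enough to make the stock Ultra a fair stand-in.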

Regards,
SB
 