AMD confirms R680 is two chips on one board

Nvidia's GeForce 7950GX2 utilized combined bandwidth, and it wasn't so bad.

I think my point may have gone unnoticed ;) Each GPU only has its own bandwidth, so combining the two for some utopian number doesn't have a lot of bearing on real life.

Effectively, the 7950GX2 and the upcoming 3870X2 don't have twice the bandwidth; instead they have two individual video cards on the same PCB. Does that really make much difference? There are probably a few corner cases in synthetic tests or theory where it does, but I don't know whether those apply in real life.
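The point above can be put into a toy calculation. The per-GPU figure here is illustrative (roughly what a 3870-class card with 256-bit GDDR3 at ~900MHz would offer), not an official spec:

```python
# Toy illustration: two GPUs on one board do not pool memory bandwidth.
# Each GPU can only read its own local memory, so the bandwidth any one
# GPU's workload sees is the per-GPU figure, not the spec-sheet sum.
PER_GPU_BANDWIDTH_GBS = 57.6  # illustrative figure for one 3870-class GPU

def marketing_bandwidth(num_gpus):
    """The 'utopian' combined number a spec sheet might quote."""
    return num_gpus * PER_GPU_BANDWIDTH_GBS

def effective_bandwidth():
    """What each GPU actually has available for its share of the frame."""
    return PER_GPU_BANDWIDTH_GBS

print(marketing_bandwidth(2))  # twice the single-card number on paper
print(effective_bandwidth())   # what each chip really works with
```

The gap between those two numbers is exactly why "combined bandwidth" marketing is misleading for AFR-style dual-GPU cards.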
 
Even better, if the thermal design power is well in excess of the true thermal output, that simply means less noise at that power level. So at idle, a chip putting out 45W under a heatsink designed for a 145W TDP may need only about 5% fan speed to keep cool ;)
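For what it's worth, the headroom in that scenario is easy to quantify. Note that fan duty cycle is not linear in this ratio (heatsinks dissipate a fair amount passively, and airflow vs. RPM isn't linear either), which is why the actual fan speed can sit far below the raw fraction:

```python
# Thermal headroom sketch: idle heat output as a fraction of what the
# cooler was dimensioned for. The needed fan speed is NOT proportional
# to this ratio; it only bounds how lightly loaded the cooler is.
idle_watts = 45        # idle heat output from the post above
cooler_rated_tdp = 145 # heatsink designed for a 145W TDP part

headroom_fraction = idle_watts / cooler_rated_tdp
print(round(headroom_fraction, 2))  # cooler is loaded to ~31% of rating
```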
 
I won't pretend to know much about GPU hardware, but GDDR3 doesn't make much sense at this point:
1) GDDR4 produces less heat and consumes less power, and that's got to be a concern for this card. Plus the increased cost would matter little to prospective buyers of such a monster, especially since one would expect that GDDR4 has gotten cheaper nowadays.

2) Why boost the GPU core frequency compared to the 3870 and yet lower the memory speed to 3850 levels? There was a mismatch for the 2900, but nobody complained about the 3870's core-memory pairing.

I would be happy if the board stayed below the 2900's power consumption numbers (225 watts, correct me if I'm wrong), considering they've outfitted it with the same blow-dryer cooler. At the same time, the heat production is spread over two distinct cores, so there's hope that will ease the cooling.
 
While I am posting:
I can understand variable heat output, but usually a component has only one TDP value, which is used for dimensioning the cooling system. There's definitely something strange about that info; I'd take it with a grain of salt.
 
I think my point may have gone unnoticed ;) Each GPU only has its own bandwidth, so combining the two for some utopian number doesn't have a lot of bearing on real life.

Effectively, the 7950GX2 and the upcoming 3870X2 don't have twice the bandwidth; instead they have two individual video cards on the same PCB. Does that really make much difference? There are probably a few corner cases in synthetic tests or theory where it does, but I don't know whether those apply in real life.

Your point is taken! :)

But I would rather have two horses pulling me & my wagon up the hill vs. just one.
 
According to local nVidia PR, GDDR4 is ~1/3 more expensive than GDDR3 and worsens the price/performance ratio.

But it consumes 50% less power with a lot more performance. You have to weigh both the good and bad sides of the technology.
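Those two claims can be combined into a quick perf-per-dollar check. The cost figure (+1/3) comes from the PR quote above; the performance uplift is purely an assumed number for illustration, since neither post quantifies it:

```python
# Rough price/performance comparison for the memory choice, using the
# ~1/3 cost premium claimed above. The GDDR4 performance uplift here is
# an assumption for illustration only, not a measured figure.
gddr3_cost = 1.0                     # normalized memory cost
gddr4_cost = gddr3_cost * (1 + 1/3)  # "~1/3 more expensive"

perf_gddr3 = 1.0   # normalized performance
perf_gddr4 = 1.10  # assumed +10% from faster memory (illustrative)

ratio3 = perf_gddr3 / gddr3_cost
ratio4 = perf_gddr4 / gddr4_cost
print(round(ratio3, 3), round(ratio4, 3))
```

Under these assumed numbers a +10% uplift doesn't pay for a +33% cost, which is presumably the "worsens price/performance" argument; the GDDR4 side only wins if the real uplift exceeds the cost premium.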
 
IIRC, per-pixel bandwidth demands drop at higher resolutions, and I expect AMD expects R680 to be aimed at high resolutions--for today's games, anyway.

Are people buying high-end cards that worried about power consumption? I figure the cost savings (and associated MSRP lowering, which may lead to higher sales) were worth more to AMD than the power savings, though I'm looking forward to OCing tests that'd show just how much performance they gave up by giving GDDR4 a pass.

As for the 110W TDP, I guess that rules out shutting down a core and its RAM when the card's basically idling (2D or OS chores)?
 
Well, I have to say that I really like how my new 725MHz 3850 uses the same or less power at idle as my 540MHz 8600GT did! I hope this is a new trend from the GPU folks!

I am not impressed with how my 8800GTX pulls something like 70W at idle. :)
 
I'm loving my 3870's, except for the 100% fan speed all the time :( This was PowerColor's answer to the complaints about the fans not spinning enough; go figure. I'm going to start experimenting with BIOS reflashes tonight in an attempt to correct it.

Of course, not before I make a solid backup flash of the original firmware. Good thing I have two of 'em :)
 
The Arctic Cooling Accelero S1 was my solution to the 3850's hair dryer. And a very effective solution it is. I'm going to be shopping for passively cooled cards in the future, or at least cards known to be quiet under load. Of course, everyone and his grandma was claiming that the 8800GTX's cooler was near silent too. And it is, until you play certain games like Oblivion and it spins up to 100%.
 
Hard launch may appear on 1/24/08
Source
Chipset ATI Radeon HD 3870X2
System Interface PCI-Express 2.0 x16
Core Clock 775MHz
Memory Clock 900MHz x2
Memory GDDR3 1024MB / 512-bit
RAMDAC 400MHz
Connectors HDTV, dual DVI-I w/HDCP (HDMI adaptor*)

specifications are subject to change without any notice / * optional

Wait, I thought this would be GDDR4 at 825MHz core? What gives :?:
 
Hard launch may appear on 1/24/08
Source


Wait, I thought this would be GDDR4 at 825MHz core? What gives :?:

Looks like the website is down :(

Anyway, there could be two flavors of the 3870X2:

1. Radeon HD3870X2 GDDR3 775MHz core 900x2 Memory 1024MB.
2. Radeon HD3870X2 GDDR4 825MHz core 1200x2 Memory 1024MB - "Extreme Edition Radeon ".
 
Looks like the website is down :(

Anyway, there could be two flavors of the 3870X2:

1. Radeon HD3870X2 GDDR3 775MHz core 900x2 Memory 1024MB.
2. Radeon HD3870X2 GDDR4 825MHz core 1200x2 Memory 1024MB - "Extreme Edition Radeon ".

Source
Looks like 777MHz is the stock core speed even with GDDR4 :cry:
 
Source
Looks like 777MHz is the stock core speed even with GDDR4 :cry:

My VisionTek Radeon HD3870 defaults to 777 core / 1126 mem GDDR4, but I easily clocked it to 825 core / 1206 mem for my everyday use.


I believe some ATI partners might clock the HD3870X2 at 825 core / 1206 mem with GDDR4.


Edit: 18K in 3DMark06 for the 3870X2 is pretty impressive compared to the GF8800Ultra, which gets 13K. "5,000 points over the G80Ultra :)"

It could be pushed to 19K in 3DMark06 if it were clocked at 825 core / 1206 mem.
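That 19K guess matches a simple linear clock-scaling estimate. Treat it as an optimistic upper bound, since 3DMark06 scores rarely scale one-for-one with core clock (CPU and memory limits eat into the gain):

```python
# Optimistic linear scaling of a 3DMark06 score with core clock.
# Real scores rarely scale linearly with clock alone, so this is an
# upper-bound estimate, not a prediction.
stock_score = 18000  # reported 3870X2 score at stock clocks
stock_clock = 775    # MHz, stock core clock
oc_clock = 825       # MHz, partner-card clock mentioned above

estimated = stock_score * oc_clock / stock_clock
print(round(estimated))  # a bit over 19K, in line with the guess above
```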
 
Given that the tester is on an XP box with a Wolfdale quad-core, I'm not that impressed. I'm on Vista64 with a Conroe dual-core and I hit 17.5K with my pair of 3870's in Crossfire. Give me a 45nm quad Extreme and a copy of WinXP32 and I'd be knocking on 20K myself.
 
I'm not that impressed...I hit 17.5k on my pair of 3870's in crossfire
Erm :???: Why should 3870s in crossfire getting about the same score as some other 3870s in crossfire be something to be impressed by :?:
I'd call it expected.
 
Because a large segment of fanbois, both on this forum and elsewhere, keep regurgitating their hopes that the 3870X2 is going to perform better than a standard pair of individual 3870's.

"Wow, look at the 3DMark performance... 18K!!!OMGWTFBBQSAUCE!"

I have the exact same feelings you do, hoom -- this isn't impressive, this is expected. I don't see why people are so hoorah over the performance, when we already knew approximately how it was going to perform as soon as they confirmed the chips in it.

I'm not saying it'll be crap when it comes out at ~$450 USD, as that's less than the price of two individual 3870's in most cases. And I wouldn't mind seeing some "quadfire" results either ;) But I think we need to squash the pie-in-the-sky performance expectations...
 