AMD confirms R680 is two chips on one board

So based on reviews, the HD3870 (aka RV670), clocked at 775MHz on the core, runs quite hot even with the dual-slot HSF combo.

Judging from the leaked pics, the HSF on the R680 doesn't have any heatpipes. So I'm assuming that in order to keep heat under control (not to mention that the heatsink seems to be shared between both cores), each core is going to be clocked lower than the RV670.
Well, it's hot not because the cooling is at its limit, but because the cooling is designed to only increase fan speed when it's REALLY hot. I wouldn't see any problem if those chips had the same clocks as on the 3870; it should still be comparable in power draw to a single HD2900XT... though I guess the cooling could definitely be noisier (at full load, at least).
But what's better: dual GPUs on one long PCB, or one GPU each on two short PCBs, à la the GX2?
Well, of course: two dual-GPU PCBs :)
Finally... things are getting exciting in the high end. It's been long overdue.
I dunno, I'm seriously unimpressed by SLI, Crossfire, or any on-PCB version of these. The mainstream performance parts, OTOH, are really impressive :)
 
So based on reviews, the HD3870 (aka RV670), clocked at 775MHz on the core, runs quite hot even with the dual-slot HSF combo.

Judging from the leaked pics, the HSF on the R680 doesn't have any heatpipes. So I'm assuming that in order to keep heat under control (not to mention that the heatsink seems to be shared between both cores), each core is going to be clocked lower than the RV670.

I guess its performance can be roughly matched by a pair of HD3870s, which also means it suffers from the same CrossFire limitations.

But what's better: dual GPUs on one long PCB, or one GPU each on two short PCBs, à la the GX2?

Finally... things are getting exciting in the high end. It's been long overdue.

HD3870_65.jpg

http://www.hardwarecanucks.com/foru...0-review-crossfire-performance-preview-9.html

You have to use a proper cooler for the Radeon HD3870.
 
A while back I read somewhere that R680, i.e. two RV670s on the same PCB, would be about 15% faster overall than Nvidia's GF8800 Ultra, due to the short distance signals have to travel between the two RV670 GPUs.

In the Radeon HD 3800 presentation AMD gave us, there was a very interesting slide at the end called 'And there's more...'. According to that specific slide, AMD hints that their upcoming high-end GPU will be faster than Nvidia's GeForce 8800 GTX/Ultra by 15%. The same slide also suggests a price tag of $399-499 USD and availability somewhere around January (New Year). According to the rumors around, AMD is going to name that product the Radeon HD 3870 X2, but only time will tell. Anyway, since the NDA on the Radeon HD 3800 series has expired, I have attached a screenshot from the presentation above.
Another source told us that R680 could reach 20K in 3DMark06, or rather that this is the target AMD is trying to achieve.

http://www.ngohq.com/news/12672-amd-upcoming-gpu-will-faster-than-8800-gtx-ultra-15-a.html
http://www.vr-zone.com/articles/R680_To_Score_20K_In_3DMark06/5325.html
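For what it's worth, the two rumours are at least self-consistent. A quick back-of-the-envelope check, using only the numbers from the rumours themselves (the implied Ultra baseline below is derived, not a measured score):

```python
# Sanity check: do the "15% faster than 8800 GTX/Ultra" and "20K in
# 3DMark06" rumours agree with each other? Solve for the implied
# 8800 Ultra baseline score.

r680_target = 20000   # rumoured 3DMark06 target for R680
claimed_lead = 0.15   # rumoured performance lead over the Ultra

implied_ultra_score = r680_target / (1 + claimed_lead)
print(round(implied_ultra_score))  # 17391
```

So the two rumours line up if the Ultra's baseline in whatever test setup AMD used is around 17.4K, which is at least in a plausible ballpark for a high-end rig of the time.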
 
I like AMD... err, ATI :p and all, but they've promised things in the past, how they'll beat certain GeForce cards and whatnot, and in the end they failed to deliver.

I don't expect this card to beat an 8800 Ultra. I'd be surprised if it beats a G92 GTS, mind you :cry:

edit lol
 

d31280.gif

http://www.firingsquad.com/hardware/amd_ati_radeon_hd_2900_xt_performance_preview/page16.asp

I think dual RV670s will do better on the same PCB.

Edit: If Nvidia doesn't deliver a dual G92 in about the same time frame as ATI, AMD could claim to have taken the crown from Nvidia and their top dog, the GF8800 Ultra. (I do think that R680 will beat the G80 Ultra.)
 

Well, the 3870s are hot because they tackled the noise issue in typical ATi fashion (and if you're thinking that means designing a far better cooling solution, you haven't been following ATi's approach to GPU cooling :)): they simply don't spin up the fan, not even when it hits 90 degrees. Thanks, ATi, it's much better to have a space heater than a bit of noise... no, really, I appreciate a thingie that dumps a metric fuckton of heat into my case (yes, I know it blows hot air out of it, but the back of the card sure as heck will get hot, and that heat will be dumped into the case).

People have experimented with it, and once you start spinning the fan up, via RivaTuner for example, temps drop nicely before it gets obnoxiously noisy (think X1900/HD 2900 noisy). But everyone's gushing over how silent the cooling is at stock and how great 55nm is as a process... well, sure it's silent; a 2900 stuck at a 9% duty cycle is also quite silent, and would probably achieve similar temperatures. So if it's a bug, please fix it.
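The two fan policies being argued about here can be sketched in a few lines. All temperature and duty-cycle points below are made-up illustrative numbers, not ATi's actual BIOS fan table:

```python
# Illustrative sketch of the two fan policies in this argument.
# The curves are hypothetical; the point is only the shape:
# one curve stays quiet until the core is very hot, the other
# ramps earlier and trades noise for lower temperatures.

def stock_policy(temp_c):
    """'Quiet first' curve: barely spins up until the core is very hot."""
    if temp_c < 90:
        return 25   # % duty cycle: near-silent, but temps climb
    return 100

def aggressive_policy(temp_c):
    """RivaTuner-style custom curve: ramps earlier for lower temps."""
    if temp_c < 50:
        return 30
    if temp_c < 70:
        return 50
    if temp_c < 85:
        return 75
    return 100

for t in (45, 65, 80, 92):
    print(t, stock_policy(t), aggressive_policy(t))
```

With the stock-style curve the card sits at a fixed low duty cycle right up to the threshold, which is exactly the "silent but 90 degrees" behaviour people are seeing.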
 
and if you're thinking that means designing a far better cooling solution, you haven't been following ATi's approach to GPU cooling :)

Partners are free to do as they choose. However, if you want heatpipes, etc., you'll be paying extra for them. That's the end user's choice.

As for "dumping a load of heat in the case", you'll find that the HD 3870 is dissipating much less heat than other solutions out there, certainly less than the HD 2900.
 
Representatives from AMD would not confirm that the R680 is essentially two RV670 GPU cores on the same board, though the company did confirm that each core has the same specifications of an RV670 processor.
http://www.dailytech.com/article.aspx?newsid=10033

Exactly!
RV670 may be based on, or share similarities with, R680, but most likely there is also more to R680 than meets the eye.

A while back I postulated that RV670 is R680 with something disabled, or that R680 has some extra/added specs/features.
 
I'm still holding out hope that R680 will use some form of ring-bus stop (or some kind of extension of the internal bus out to the second chip) for the multi-chip interconnect, one that doesn't see the same performance drop as when games aren't CrossFire 'compatible'. If it doesn't, then I wonder how ATI/AMD will survive next year...
 
Partners are free to do as they choose. However, if you want heatpipes, etc., you'll be paying extra for them. That's the end user's choice.

As for "dumping a load of heat in the case", you'll find that the HD 3870 is dissipating much less heat than other solutions out there, certainly less than the HD 2900.

You're probably right, given the target market of the 3870... but given the target market the X2 is supposedly aimed at, asking for heatpipes etc. isn't too much, nor is worrying about paying extra a primary aspect of the purchase decision. So I've got my fingers crossed that you won't take the same approach with the X2, and will actually make it an enthusiast board in all respects :D
 
That thing won't fit into most of the more compact midi-towers (just like the 8800 GTX), and there are a lot of those out there. I'm really beginning to get late-3dfx vibes from ATI: trying to remain competitive by tacking more GPUs onto one PCB. And if there's any truth to this, then that's all they'll have to offer throughout 2008.

Kind of like how Nvidia brought back SLI after assuring us for years that dual-card solutions were not in the cards, you mean? ;) Those kinds of 3dfx vibes? Frankly, if they can do CrossFire X on a single PCB, that seems vastly preferable to me to one GPU on each of two twinned PCBs. But we'll see. So far, the CrossFire/SLI-on-a-card concept hasn't really worked very well for anyone who's tried it, including both 3dfx and ATi. I would think, though, that since the technology behind the concept today is far better than it was then, there's at least a much better chance of seeing much better results. It'll be kind of cool if they can pull it off.

Wanted to add that what would be really cool is if they could figure out how to make the CrossFire scaling on a single PCB transparent to the OS and drivers, so that you could effectively get the benefit of 2x the performance without having to worry about custom CrossFireX driver support, which I think should only be necessary anyway if you are trying to twin two or more independent PCBs occupying separate slots.
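The dispatch side of that is conceptually simple. Here's a toy sketch of alternate-frame rendering (AFR), the usual way two GPUs split work; this is not ATi's actual driver logic, just an illustration of why the scheduling itself could in principle be hidden below the driver/OS boundary:

```python
# Toy model of alternate-frame rendering (AFR): frames are handed
# out round-robin across the available GPUs. Real drivers also have
# to manage inter-frame dependencies and synchronization, which is
# where the per-game profile work actually comes from.

from collections import defaultdict

def dispatch_frames(num_frames, num_gpus=2):
    """Round-robin frames across GPUs; returns gpu -> list of frame ids."""
    schedule = defaultdict(list)
    for frame in range(num_frames):
        schedule[frame % num_gpus].append(frame)
    return dict(schedule)

print(dispatch_frames(6))  # {0: [0, 2, 4], 1: [1, 3, 5]}
```

The hard part isn't the round-robin; it's that frames aren't independent (render-to-texture, queries, etc.), which is why scaling falls apart when a game isn't profiled.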
 
Did not quite understand here....

I was being sarcastic. The benchmarks you pointed out are rather irrelevant, given the fact that every solution tested passes with flying colors, and that's not a very demanding test to begin with. Find a test that shows a game is playable on one solution and unplayable on another, and then we'll talk.
 
Did not quite understand here....
Point being that the lowest score there is 82.4fps. Who can tell the difference between 82.4fps and 144.9fps?
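Put in frame-time terms (plain arithmetic, nothing vendor-specific), the gap is a handful of milliseconds per frame:

```python
# Convert the two benchmark numbers into per-frame render times.
# At high frame rates, large fps differences shrink to small
# millisecond differences, which is why they're hard to perceive.

def frame_time_ms(fps):
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

print(round(frame_time_ms(82.4), 1))   # 12.1 ms per frame
print(round(frame_time_ms(144.9), 1))  # 6.9 ms per frame
```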

dumps a metric fuckton of heat into my case
You're not understanding the difference between the equilibrium temperature of the cooler and the thermal output of the chip.

Say you have two chips that both do the same thing:
A 100W chip with 'poor' cooling that runs at 90 degrees.
A 200W chip with 'good' cooling that runs at 40 degrees.

I'll take the 100W, 90 degree chip any day as long as that chip has been designed to run at that temperature.
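The distinction can be sketched with the standard steady-state model, where die temperature is ambient plus power times the cooler's thermal resistance. The thermal resistances below are made-up illustrative numbers, not real cooler specs:

```python
# Steady-state model behind this argument:
#   T_chip = T_ambient + P * R_theta
# where R_theta is the total thermal resistance of the cooling
# solution in degC per watt. All figures here are illustrative.

T_AMBIENT = 30.0  # assumed case ambient, degC

def equilibrium_temp(power_w, r_theta):
    """Steady-state die temperature for a given dissipation and cooler."""
    return T_AMBIENT + power_w * r_theta

# 'Poor' cooler on a 100 W chip: high thermal resistance.
chip_a = equilibrium_temp(100.0, 0.60)   # 30 + 60 = 90 degC
# 'Good' cooler on a 200 W chip: much lower thermal resistance.
chip_b = equilibrium_temp(200.0, 0.05)   # 30 + 10 = 40 degC

print(chip_a, chip_b)  # 90.0 40.0
```

Note that the heat dumped into the room is set by P (100 W vs 200 W), regardless of how cool the die itself runs; the die temperature only tells you how good the cooler is, not how much heat is coming out.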
 

Ya think? Maybe I was exaggerating just a bit :D But if you're arguing for a scenario where a chip builds up its 100W of heat, and the heat stays there because the fan that's supposed to exhaust it is spinning too slowly, versus one with a 200W chip whose cooling solution properly exhausts heat and allows no buildup, I'm not sure I agree. A stationary heat pocket is nothing to be uber happy with. Nor do I find it likely that the design goal for the RV670 was to run at 90+ degrees under load, IMHO.

Most probably this was done so that everyone and their dog would harp about how silent the new Radeons are and how the curse of noisiness has finally been broken, with future driver/BIOS patches (the former being far less likely) changing its behaviour and warranting some more hi-5s along the lines of: good going guys, you handled the temperature issues in great fashion, please ATi, be the father of my child :D. Which is a great strategy after all, and I'm not really ragging on it... I actually appreciate it, to be frank.
 
I was being sarcastic. The benchmarks you pointed out are rather irrelevant, given the fact that every solution tested passes with flying colors, and that's not a very demanding test to begin with. Find a test that shows a game is playable on one solution and unplayable on another, and then we'll talk.

The point I was trying to make is that AMD could claim they have the fastest video card on the planet. :) I mean R680.

Edit: If you would like to talk based on playable frame rates and not the fastest card:

coh2560.gif


GF8800 Ultra: 59.4 FPS (playable, yes)
R600 CrossFire: 99.3 FPS (playable, yes)

I'm pretty sure if you crank it to 8xAA and 16xAF at 2560x1600x32, the GF8800 Ultra would be in a tougher spot to keep frame rates smooth the whole time over a long gaming session.
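Rough framebuffer-footprint arithmetic shows why 8xAA at 2560x1600 is demanding. This assumes 4 bytes per colour sample and 4 bytes per depth/stencil sample, and ignores the colour/Z compression real GPUs use, so it's an upper bound on the raw multisample surfaces alone:

```python
# Raw multisample surface footprint for 8xMSAA at 2560x1600.
# Illustrative upper bound: no compression, RGBA8 colour and
# 4-byte Z/stencil per sample.

width, height, samples = 2560, 1600, 8
bytes_per_sample = 4  # RGBA8

color_mb = width * height * samples * bytes_per_sample / (1024 ** 2)
depth_mb = color_mb  # 4-byte Z/stencil per sample, same footprint

print(color_mb, color_mb + depth_mb)  # 125.0 250.0
```

A quarter gigabyte of multisample surfaces, before textures or render targets, is a big bite out of a 768 MB card, let alone anything smaller, and all of it has to be resolved and bandwidth-fed every frame.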
 

This is DX9 CoH, if I'm not mistaken... again, not something truly taxing, and cranking up this and that in ancient games bears little relevance, IMHO. Also consider that the R680 will most likely be up against D9E almost as soon as it's released, so...
 