R580+

Ah, forgot about that. Maybe they're using the move to lower power consumption rather than increase performance = quieter cooler.
 
no-X said:
HIS and Sapphire produce cards clocked over 650MHz. So why stay at 650MHz if the core has a lot of reserves and the card will be equipped with a new (probably better) cooler?

The cooler isn't bad, just noisy.
 
karlotta said:
So 80nm won't be a help... I'm sure it will.
Just look at how 110nm didn't help clocks at all. In fact, it hurt them a lot. It was used because smaller chips = more per wafer = cheaper. And because it reduced power consumption.
 
Sorry, but is this the R580 at 90nm with added respins, or an 80nm chip? Didn't they cancel the 80nm node for R580, or am I wrong about that?
 
swaaye said:
Just look at how 110nm didn't help clocks at all. In fact, it hurt them a lot. It was used because smaller chips = more per wafer = cheaper. And because it reduced power consumption.
The problem wasn't 110nm, but the missing low-k dielectric. ATi went from 130nm with low-k to 110nm without low-k. nVidia never used low-k in the first place (130nm w/o low-k -> 110nm w/o low-k), so for them the smaller process actually helped raise clocks.
 
trinibwoy said:
Doesn't look like the R580 core has a lot of reserves. Especially not enough to produce 700MHz parts consistently.
It would certainly worsen their yields, while at the same time they had planned to introduce it relatively cheaply. So it's not so much about how capable the core actually is, but about their goal of positioning those cards very competitively, more quietly, and still with acceptable margins. I actually wrote a post some time ago in our R600 thread that outlines pretty much all of this.

Whether they've decided to ramp up the R580 as an 80nm shrink is a secondary matter to me, because they will pair it with GDDR4, which, as already quoted here, has drastically lower power consumption. That gives them some room to use their "new" cooler design, which should be done by now.
 
Sobek said:
The Inquirer eh? Hmmm.

So the R580+ is a refresh of current boards, fitted with GDDR4? Any idea whether an R580+ would be at all compatible in a Crossfire setup with a standard X1900XT/X?
I see no reason why it wouldn't be, considering you can already pair different speeds of X1900s asynchronously.
 
The Baron said:
I see no reason why it wouldn't be, considering you can already pair different speeds of X1900s asynchronously.

Wouldn't the different type of memory cause a problem? Since, on an X1800 master card, pairing it with an X1800 that has only 256MB of memory cuts the master card's memory in half, I figured memory would have something to do with Crossfire and could therefore cause problems with an R580 and R580+ running in Crossfire together.
 
Skrying said:
Wouldn't the different type of memory cause a problem? Since, on an X1800 master card, pairing it with an X1800 that has only 256MB of memory cuts the master card's memory in half, I figured memory would have something to do with Crossfire and could therefore cause problems with an R580 and R580+ running in Crossfire together.
Well, I'm assuming that you can run X1900s with different memory speeds. I don't see why that would be a problem--it's still a 512MB card, just with faster RAM.
 
The Baron said:
Well, I'm assuming that you can run X1900s with different memory speeds. I don't see why that would be a problem--it's still a 512MB card, just with faster RAM.

I suppose that makes sense in a way. One would probably expect the X1950XT(X?) to just cut its memory speed down to match the X1900XT if necessary.

I hope ATI explains this at some stage. I'm looking at replacing one of my X1900s, and an R580+ looks tasty enough :p
 
Less heat, better cooler... I'm glad I resisted the impulse to buy an X1900XT for €340 last month.
 
DailyTech has snapped the first picture.

[image: DailyTech photo of the card]


Groovy!
 
I immediately found myself humming the "Jetsons!" theme song. Groovy, Daddy-o. Hopefully it actually works, too.
 