The LAST R600 Rumours & Speculation Thread

I think it's insane to think the chip was delayed for anything other than technical issues, whether it be yields, performance, etc. And the 8800 doesn't have lower power draw than the R580, under any circumstances.
 
I think it's insane to think the chip was delayed for anything other than technical issues, whether it be yields, performance, etc. And the 8800 doesn't have lower power draw than the R580, under any circumstances.
I beg to differ:
http://techreport.com/reviews/2006q4/geforce-8800/index.x?pg=16
http://www.neoseeker.com/Articles/Hardware/Reviews/bfg8800gts/10.html

Some dissenting reports:
http://www.bit-tech.net/hardware/2006/11/08/nvidia_geforce_8800_gtx_g80/18.html
http://www.anandtech.com/video/showdoc.aspx?i=2873&p=7

In either case, power consumption has been measured to be very similar to the R580 under load, higher at idle, with one outlier (the bit-tech.net article).
 
I think it's insane to think the chip was delayed for anything other than technical issues, whether it be yields, performance, etc. And the 8800 doesn't have lower power draw than the R580, under any circumstances.

I went over to a few sites that did do power draw tests, and you are right: the 8800 does not have lower power consumption than the R580 (I am not sure why I thought it did). Thank you for the correction. They are fairly close, though, with Anandtech showing a 14-to-20-watt difference depending on the application. Very respectable considering the size and complexity of the G80 compared to the R580.
 
zealotonous said:
ATI took the more aggressive approach and went with GDDR4 memory and a 512-bit bus.

They've been using GDDR4 for a while now. I don't think continuing to use it would create a supply issue. More chips, maybe, but supply of GDDR4 doesn't seem to have been a problem.

Also, the power requirements don't seem to be that extreme: marginally more than G80, but not necessarily in unheard-of territory. And if I recall, according to measurements it wouldn't be any larger than an 8800GTX.

Also, your power figures are incorrect. G80 uses about 40W more than an X1950XTX while idling, and about 15W more under load. Given the connectors R600 was likely to use, it couldn't draw more than ~40W more than an 8800GTX under full load.
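As a rough sanity check, here's that bound as a minimal Python sketch, assuming the standard PCIe limits of 75W from the slot and 75W per six-pin connector, and an approximate ~170W load figure for the 8800GTX:

```python
# Power ceiling implied by a card's connector configuration, assuming the
# standard PCIe limits: 75 W from the slot, 75 W per six-pin connector.
PCIE_SLOT_W = 75
SIX_PIN_W = 75

def board_power_ceiling(six_pin_connectors: int) -> int:
    """Maximum power (watts) a card with this many six-pin plugs may draw."""
    return PCIE_SLOT_W + six_pin_connectors * SIX_PIN_W

g80_load_w = 170                         # approximate 8800GTX draw under load
r600_ceiling_w = board_power_ceiling(2)  # rumored two six-pin connectors

print(r600_ceiling_w)               # 225
print(r600_ceiling_w - g80_load_w)  # 55 W of raw headroom over G80,
                                    # i.e. ~40 W once you leave some padding
```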
 
The R600 isn't broken and neither is its support of DX10, there are working cards out there.

I don't see this as being a performance (heat, noise) issue; they would have known sooner. I also don't think it's likely that they just got the latest spin back and found it broken when they were planning to launch at the end of March.

There should be no problems with RAM availability either...

Not sure what it was at this point...
 
Chalnoth said:
I beg to differ:
http://techreport.com/reviews/2006q4/geforce-8800/index.x?pg=16
http://www.neoseeker.com/Articles/Hardware/Reviews/bfg8800gts/10.html

Some dissenting reports:
http://www.bit-tech.net/hardware/2006/11/08/nvidia_geforce_8800_gtx_g80/18.html
http://www.anandtech.com/video/showdoc.aspx?i=2873&p=7

In either case, power consumption has been measured to be very similar to the R580 under load, higher at idle, with one outlier (the bit-tech.net article).

Chalnoth, thank you! I knew I had seen a graph showing that the G80 had lower power consumption; I just couldn't remember the review site(s). The only site I looked at for power consumption was Anandtech, and their numbers showed the G80 consuming just a bit more power than the R580, which is an amazing feat considering the power/performance ratio.
 
They've been using GDDR4 for a while now. I don't think continuing to use it would create a supply issue. More chips, maybe, but supply of GDDR4 doesn't seem to have been a problem.

Also, the power requirements don't seem to be that extreme: marginally more than G80, but not necessarily in unheard-of territory. And if I recall, according to measurements it wouldn't be any larger than an 8800GTX.

Also, your power figures are incorrect. G80 uses about 40W more than an X1950XTX while idling, and about 15W more under load. Given the connectors R600 was likely to use, it couldn't draw more than ~40W more than an 8800GTX under full load.

Well, I don't know, as I am merely speculating and do not have insider info. I am fairly certain that this is not the result of a change in strategy; I'm only throwing out some possibilities. Chalnoth posted some alternative results on power numbers which seem to redeem my earlier statement. Hopefully we'll know soon enough.
 
[Image: amdr600mi5.png]


They discovered something wrong with the engineering sample board?
 
Chalnoth said:
I beg to differ:
http://techreport.com/reviews/2006q4/geforce-8800/index.x?pg=16
http://www.neoseeker.com/Articles/Hardware/Reviews/bfg8800gts/10.html

Some dissenting reports:
http://www.bit-tech.net/hardware/2006/11/08/nvidia_geforce_8800_gtx_g80/18.html
http://www.anandtech.com/video/showdoc.aspx?i=2873&p=7

In either case, power consumption has been measured to be very similar to the R580 under load, higher at idle, with one outlier (the bit-tech.net article).

I'm kind of confused; every link shows the 8800 GTX drawing more power than the R580 under all circumstances, unless you're talking about the GTS.
 
Maybe ATI still could not produce enough working GPU silicon to meet demand, and they just found out that a lot of chips came back bad in the final production run.
 
I'm kind of confused; every link shows the 8800 GTX drawing more power than the R580 under all circumstances, unless you're talking about the GTS.

The TechReport link shows the 8800GTX below the X1950XT under load. Either way, the two cards are very close in power consumption, and the 8800GTX is a lot bigger chip.
 
Almost equal according to HFR

[Image: IMG0018436.gif]


The R600 isn't broken and neither is its support of DX10, there are working cards out there.

I don't see this as being a performance (heat, noise) issue; they would have known sooner. I also don't think it's likely that they just got the latest spin back and found it broken when they were planning to launch at the end of March.

There should be no problems with RAM availability either...

Not sure what it was at this point...

Well, just because you have working silicon doesn't mean it isn't bugged...
 
Icecold, I was looking at the X1900XT. That is R580, is it not? Guess I got them confused. Either way, the watt difference between the X1900XT, X1950XT, and 8800GTX is minimal. Making a fuss over it is splitting hairs, and I really don't think it changes the point that the 8800GTX is pretty dang power efficient compared to the R580.
 
Icecold, I was looking at the X1900XT. That is R580, is it not? Guess I got them confused. Either way, the watt difference between the X1900XT, X1950XT, and 8800GTX is minimal. Making a fuss over it is splitting hairs, and I really don't think it changes the point that the 8800GTX is pretty dang power efficient compared to the R580.

Not making a fuss, I'm just nitpicky; obviously the 8800 is a fantastic product.
 
[Image: amdr600mi5.png]


They discovered something wrong with the engineering sample board?

Good idea, but now it's truly time for AMD to find the people responsible for everything that has happened at ATI over the last 18 months.
I don't think the shareholders are happy with this situation.
I know the desktop GPU section is only a small part of the company, but it still gets the biggest hype: no one talks about the 10-million-plus Xenos GPUs, yet plenty of users are talking about the R600 delay, and now it reflects negatively on the whole company.
 
trinibwoy said:
You might want to recheck that math. You're assuming that G80 at full load uses all 225 watts available to it.
Just like our pal Fud-o does....

Fud-o said:
Two power six-pin connectors means that the card gets 2x75W from the cables plus an additional 75W from the PCIe bus. This brings total power consumption to an earth-heating 225W.

I can't help feeling the Inquirer is going to lose much of what makes it special when Fud-o leaves them at the end of the month.
 
trinibwoy said:
You might want to recheck that math. You're assuming that G80 at full load uses all 225 watts available to it.

Where did I assume G80 uses all 225W? The only thing I assumed is that the power connectors rumored to be on R600 wouldn't let it exceed 225W. G80 uses ~170W; allow 15W of padding under the 225W max, and R600 can't be more than ~40W more. Where was I messing up with that math?
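Spelled out as a minimal sketch (same assumed figures as above: a 225W connector ceiling, an approximate ~170W G80 load draw, and 15W of padding):

```python
ceiling_w = 225   # 75 W slot + 2 x 75 W six-pin, per the rumored R600 connectors
g80_load_w = 170  # approximate 8800GTX load draw
padding_w = 15    # margin a design keeps below the hard limit

print((ceiling_w - padding_w) - g80_load_w)  # 40 -> at most ~40W above G80
```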
 