icecold1983
I think it's insane to think the chip was delayed for anything other than technical issues, whether it be yields, performance, etc. And the 8800 doesn't have lower power draw than the R580 under any circumstances.
zealotonous said: ATI took the more aggressive approach and went to DDR4 memory and a 512-bit bus.
I beg to differ:
http://techreport.com/reviews/2006q4/geforce-8800/index.x?pg=16
http://www.neoseeker.com/Articles/Hardware/Reviews/bfg8800gts/10.html
Some dissenting reports:
http://www.bit-tech.net/hardware/2006/11/08/nvidia_geforce_8800_gtx_g80/18.html
http://www.anandtech.com/video/showdoc.aspx?i=2873&p=7
In either case, the 8800's power consumption has been measured to be very, very similar to the R580's under load, and higher at idle, with one outlier (the bit-tech.net article).
They've been using GDDR4 for a while now; I don't think continuing to use it would create a supply issue. More chips, maybe, but the supply of GDDR4 doesn't seem to have been a problem.
Also, the power requirements don't seem to be that extreme: marginally more than G80, but not necessarily in unheard-of territory. And if I recall, according to measurements it wouldn't be any larger than an 8800GTX.
Also, your power figures are incorrect: G80 uses about 40W more than an X1950XTX while idling, and about 15W more under load. Given the connectors R600 was likely to use, it couldn't draw more than ~40W more than an 8800GTX under full load.
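To spell out the connector arithmetic behind that ceiling (a rough sketch in Python; the 8800GTX load figure is an assumed placeholder, not a number taken from this thread):

SLOT_W = 75                              # a PCIe slot supplies up to 75W
SIX_PIN_W = 75                           # each 6-pin cable supplies up to 75W
r600_ceiling = SLOT_W + 2 * SIX_PIN_W    # two 6-pin plugs -> 225W maximum within spec
gtx_load = 185                           # assumed 8800GTX full-load board draw (placeholder)
print(r600_ceiling - gtx_load)           # ~40W of possible headroom over the GTX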
I'm kind of confused; every one of those links shows the 8800 GTX drawing more power than the R580 under all circumstances, unless you're talking about the GTS.
The R600 isn't broken, and neither is its support of DX10; there are working cards out there.
I don't see this as being a performance (heat, noise) issue; they would have known sooner. I also don't think it's likely that they just got the latest spin back and found it broken when they were planning to launch at the end of March.
There should be no problems with RAM availability either...
Not sure what it was at this point...
Icecold, I was looking at the X1900XT. That is R580, is it not? Guess I got them confused. Either way, the watt difference between the X1900XT, X1950XT and 8800GTX is minimal. Making a fuss over it is splitting hairs, and I really don't think it changes the point that the 8800GTX is pretty dang power-efficient compared to the R580.
They discovered something wrong with the engineering sample board?
Almost equal according to HFR
Well, just because you have working silicon doesn't mean it isn't bugged...
Given the connectors R600 was likely to use, it couldn't draw more than ~40W more than an 8800GTX under full load.
trinibwoy said: You might want to recheck that math. You're assuming that G80 at full load uses all 225 watts available to it.
Just like our pal Fud-o does....
Fud-o said: Two six-pin power connectors means that the card gets 2x75W from the cables plus an additional 75W from the PCIe bus. This brings total power consumption to an earth-heating 225W.
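Putting trinibwoy's objection in numbers (a throwaway Python sketch; the measured figure is a placeholder, not something reported here): the 225W Fud-o quotes is what the connectors allow, not what the card actually draws.

budget = 75 + 2 * 75                     # slot + two 6-pin cables = 225W available to the card
g80_load = 185                           # hypothetical measured full-load draw (placeholder)
print(f"{g80_load / budget:.0%} of the available budget used")   # well short of the full 225W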