The LAST R600 Rumours & Speculation Thread

Discussion in 'Architecture and Products' started by Geo, Jan 2, 2007.

Thread Status:
Not open for further replies.
  1. icecold1983

    Banned

    Joined:
    Aug 4, 2006
    Messages:
    649
    Likes Received:
    4
    I think it's insane to think the chip was delayed for anything other than technical issues, whether it be yields, performance, etc. And the 8800 doesn't have lower power draw than the R580 under any circumstances.
     
  2. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,902
    Likes Received:
    218
    Location:
    Seattle, WA
    I beg to differ:
    http://techreport.com/reviews/2006q4/geforce-8800/index.x?pg=16
    http://www.neoseeker.com/Articles/Hardware/Reviews/bfg8800gts/10.html

    Some dissenting reports:
    http://www.bit-tech.net/hardware/2006/11/08/nvidia_geforce_8800_gtx_g80/18.html
    http://www.anandtech.com/video/showdoc.aspx?i=2873&p=7

    In either case, the measured power consumption is very similar to the R580's under load and higher at idle, with one outlier (the bit-tech.net article).
     
  3. zealotonous

    Newcomer

    Joined:
    Feb 23, 2007
    Messages:
    29
    Likes Received:
    0
    I went over to a few sites that did power draw tests, and you are right: the 8800 does not have lower power consumption than the R580 (I am not sure why I thought it did). Thank you for the correction. They are fairly close, though, with Anandtech showing a 14 to 20 watt difference depending on the application. Very respectable considering the size and complexity of the G80 compared to the R580.
     
  4. Anarchist4000

    Veteran Regular

    Joined:
    May 8, 2004
    Messages:
    1,439
    Likes Received:
    359
    They've been using GDDR4 for a while now. I don't think continuing to use it would create a supply issue. More chips, maybe, but GDDR4 supply doesn't seem to have been a problem.

    Also, the power requirements don't seem to be that extreme. Marginally more than G80, but not necessarily unheard-of territory. And if I recall correctly, according to measurements it wouldn't be any larger than an 8800GTX.

    Also, your power figures are incorrect. G80 uses about 40W more than a 1950XTX while idling, and about 15W more under load. Given the connectors R600 was likely to use, it couldn't draw more than ~40W more than an 8800GTX under full load.
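
    To spell that budget out (a rough sketch: it assumes 75W from the x16 slot and 75W per 6-pin PEG connector, per the PCIe spec, plus the ~170W G80 load figure and 15W padding cited in this thread):

        # Back-of-the-envelope PCIe power budget for a card with two
        # 6-pin connectors (the configuration implied by the 225W cap
        # discussed in this thread). Assumptions: 75W from the x16 slot
        # and 75W per 6-pin PEG connector, per the PCIe spec; ~170W
        # load draw for the 8800GTX, as cited above.
        SLOT_W = 75
        SIX_PIN_W = 75
        G80_LOAD_W = 170
        PADDING_W = 15  # safety margin assumed in the post

        r600_ceiling = SLOT_W + 2 * SIX_PIN_W  # 225W total budget
        extra_over_g80 = r600_ceiling - PADDING_W - G80_LOAD_W
        print(r600_ceiling, extra_over_g80)  # 225 40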
     
  5. ants

    Newcomer

    Joined:
    Feb 10, 2006
    Messages:
    44
    Likes Received:
    3
    The R600 isn't broken, and neither is its DX10 support; there are working cards out there.

    I don't see this as a performance (heat, noise) issue; they would have known sooner. I also don't think it's likely that they just got the latest spin back and found it broken when they were planning to launch at the end of March.

    There should be no problems with RAM availability either...

    Not sure what it was at this point...
     
  6. zealotonous

    Newcomer

    Joined:
    Feb 23, 2007
    Messages:
    29
    Likes Received:
    0
    Chalnoth, thank you! I knew I had seen a graph showing the G80 had lower power consumption; I just couldn't remember the review site(s). The only site I looked at with power consumption numbers was Anandtech, and their numbers showed the G80 consumed just a bit more power than the R580, which is an amazing feat considering the power/performance ratio.
     
  7. zealotonous

    Newcomer

    Joined:
    Feb 23, 2007
    Messages:
    29
    Likes Received:
    0
    Well, I don't know, as I am merely speculating and do not have insider info. I am fairly certain that this is not the result of a change in strategy; I'm only throwing out some possibilities. Chalnoth posted some alternative power numbers which seem to redeem my earlier statement. Hopefully we'll know soon enough.
     
  8. Arty

    Arty KEPLER
    Veteran

    Joined:
    Jun 16, 2005
    Messages:
    1,906
    Likes Received:
    55
    [image]

    They discovered something wrong with the engineering sample board?
     
  9. icecold1983

    Banned

    Joined:
    Aug 4, 2006
    Messages:
    649
    Likes Received:
    4
    I'm kind of confused; every link shows the 8800 GTX drawing more power than the R580 under all circumstances, unless you're talking about the GTS.
     
  10. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    3
    Maybe ATI still couldn't produce enough working GPUs to meet demand, and they just found out that a lot of the silicon came back bad in final production.
     
  11. ChrisRay

    ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    The techreport link shows the 8800GTX under the X1950XT when under load. Either way, the two cards are very close in power consumption, and the 8800GTX is a much bigger chip.
     
  12. icecold1983

    Banned

    Joined:
    Aug 4, 2006
    Messages:
    649
    Likes Received:
    4
    [image]
     
  13. Evildeus

    Veteran

    Joined:
    May 24, 2002
    Messages:
    2,657
    Likes Received:
    2
    Almost equal, according to HFR:

    [image]

    Well, just because you have working silicon doesn't mean it isn't bugged...
     
  14. ChrisRay

    ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    Icecold, I was looking at the X1900XT. That is R580, is it not? Guess I got them confused. Either way, the wattage difference between the X1900XT, X1950XT, and 8800GTX is minimal; making a fuss over it is splitting hairs, and I really don't think it changes the point that the 8800GTX is pretty dang power-efficient compared to the R580.
     
  15. icecold1983

    Banned

    Joined:
    Aug 4, 2006
    Messages:
    649
    Likes Received:
    4
    Not making a fuss; I'm just nitpicky. Obviously the 8800 is a fantastic product.
     
  16. vertex_shader

    Banned

    Joined:
    Sep 8, 2006
    Messages:
    961
    Likes Received:
    14
    Location:
    Far far away
    Good idea, but if it's true, the time has come for AMD to find who is personally responsible for everything that has happened at ATI over the last 18 months.
    I don't think shareholders are happy with this situation.
    I know the desktop GPU section is only a small part of the company, but it still gets the biggest hype: no one talks about the 10 million+ Xenos GPUs, yet many users are talking about the R600 delay, and now it reflects negatively on the whole company.
     
  17. vertex_shader

    Banned

    Joined:
    Sep 8, 2006
    Messages:
    961
    Likes Received:
    14
    Location:
    Far far away
    Maybe no respin is needed, and AMD will start with the A15 silicon?
     
  18. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,430
    Likes Received:
    433
    Location:
    New York
    You might want to recheck that math. You're assuming that G80 at full load uses all 225 watts available to it.
     
  19. Coz

    Coz
    Newcomer

    Joined:
    Dec 16, 2004
    Messages:
    36
    Likes Received:
    1
    Location:
    Kent, England
    Just like our pal Fud-o does....

    I can't help feeling the Inquirer is going to lose much of what makes it special when Fud-o leaves them at the end of the month.
     
  20. Anarchist4000

    Veteran Regular

    Joined:
    May 8, 2004
    Messages:
    1,439
    Likes Received:
    359
    Where did I assume G80 uses all 225W? The only thing I assumed is that the power connectors rumored to be on R600 wouldn't let it exceed 225W. G80 uses ~170W; allow 15W of padding under the 225W max, and R600 can't draw more than ~40W more. Where was I messing up with that math?
     