Radeon 2900 Pro?

Discussion in '3D Hardware, Software & Output Devices' started by pelly, Jul 31, 2007.

  1. GrapeApe

    Newcomer

    Joined:
    Apr 3, 2004
    Messages:
    57
    Likes Received:
    2
    Location:
    Calgary, Canada
    What is your link supposed to prove?
    The review shows the HD2600XT drawing 6W less than the GF8600GT, all with more transistors and a higher clock speed. So the power consumption figures were correct according to your link, with the HD2600Pro drawing less than the GF8500GT, which has half the transistors.

    Would also be more convincing if it were the same platform for the testing, or if the test measured the power draw of the card alone, not the entire system.

    Seriously, of all the reviews out there, you chose to focus your tangent on one that doesn't even really support your argument?

    Strange choice.
     
  2. Rand

    Newcomer

    Joined:
    Jan 7, 2005
    Messages:
    12
    Likes Received:
    0
    That link hardly justifies tremendous concern over the power consumption of the RV630, indeed it looks extremely good IMO.

    Under load the 2600XT consumes less power than the 8600GT, and the 2600Pro consumes almost 10W less than the 8500GT. The 2400T consumes far less than any of them.

    Those aren't figures that are going to make the RV630 look poor relative to its immediate competition.

    The idle power consumption tends to be higher, but it's the peak power consumption that people are going to be worried about when purchasing a PSU.

    The power requirements of any of those GPUs aren't inordinately harsh. Remember, the listed power consumption is for the whole system, not just the GPU alone. Under 200W for even a system built around an X6800 + 4GB RAM + 8600GTS should be more than adequate for your average gamer, and 200W isn't going to require an incredibly expensive PSU.
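    The 200W point above can be sanity-checked with a line of arithmetic. The 30% headroom figure in this sketch is an illustrative rule of thumb, not a measured requirement, and the wattages are the example numbers from the post, not bench results.

    ```python
    # Rough PSU sizing sketch: whole-system load plus a safety margin,
    # so the supply never runs right at its rated limit.
    # The 30% headroom value is an assumed rule of thumb, not a spec.

    def recommended_psu_watts(system_load_w, headroom=0.3):
        """Rated PSU capacity that leaves `headroom` margin over peak load."""
        return system_load_w * (1 + headroom)

    # Example: the ~200 W full-load system described above.
    print(recommended_psu_watts(200))  # ~260 W rated capacity is plenty
    ```

    Even with generous headroom, that lands well inside what an ordinary mid-range PSU of the era could deliver.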
     
  3. vertex_shader

    Banned

    Joined:
    Sep 8, 2006
    Messages:
    961
    Likes Received:
    14
    Location:
    Far far away
    It proves the RV630 doesn't have the great performance/watt ratio it's hyped for, and all this at 65nm against the 80nm G84. I'm talking about the HD2600XT and 8600GT; the lower cards are unusable for gaming.
    Reviewers always show the whole-system power usage, and that's enough to prove the watt usage, because the system and the application running the card under load are the same.
     
  4. Snyder

    Regular

    Joined:
    Feb 10, 2002
    Messages:
    609
    Likes Received:
    8
    Location:
    Vienna, Austria
    Where?
     
  5. GrapeApe

    Newcomer

    Joined:
    Apr 3, 2004
    Messages:
    57
    Likes Received:
    2
    Location:
    Calgary, Canada
    Which is also 65nm vs 80nm, and is still more transistors and higher clocks versus fewer transistors and lower clocks. And for every review that shows the GF8600GT performing better than the HD2600XT, there's another that shows the inverse; it's greatly dependent on the app/game and the settings.

    No, reviews don't always show system power; some do and some don't, like Xbit Labs, which created its own test rig for measuring the power consumption of the card alone. "The mainboard in this testbed was specially modified: we connected measurement shunts into the power lines of the PCI Express x16 slot and equipped them with connectors to attach measuring instruments. We also added such a shunt to a 2xMolex → PCI Express adapter. The measurements were performed with a Velleman DVM850BL multimeter (0.5% accuracy)."

    http://www.xbitlabs.com/articles/video/display/r600-architecture_14.html#sect0

    Also, your assumption that two separate motherboards and setups will draw the same power is a strange one. On the very same MoBo it would be 'safer' to assume that, but it still adds an additional variable to the equation. So as Xbit points out, some people naively assume that it doesn't matter, but they proved otherwise last week:
    http://www.xbitlabs.com/articles/mainboards/display/ga-p35c-ds3r_17.html#sect0

    And that's the same chipset showing 20-30W difference, whereas the Techreport review uses two completely different chipsets (intel / nVidia) and MoBos (Asus / XFX).
    So how accurate can that system figure be for comparing the cards, when the differences between the graphics cards at idle are so close, and when, as the Xbit review shows, the difference between MoBos, let alone chipsets and MoBos, can be so high?

    So once again the link doesn't provide much evidence for power consumption itself, and it certainly wouldn't support the claims you make; if anything, it shows the opposite.
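    The card-only method quoted above comes down to Ohm's law: a low-value shunt resistor sits in each supply rail, the voltage drop across it gives the current, and current times rail voltage gives power. The sketch below shows that arithmetic; the shunt resistance and voltage-drop values are assumed examples, not Xbit's actual figures.

    ```python
    # Card-only power from shunt measurements (Xbit Labs-style method).
    # A shunt in a supply rail drops a small voltage proportional to the
    # current: I = V_drop / R_shunt, then P = I * V_rail.
    # Shunt value and readings below are illustrative assumptions.

    def rail_power_w(v_drop_mv, shunt_mohm, rail_v):
        """Power drawn through one rail, from the shunt voltage drop."""
        current_a = (v_drop_mv / 1000.0) / (shunt_mohm / 1000.0)
        return current_a * rail_v

    # Example: 50 mV across a 10 mOhm shunt on the 12 V rail
    # -> 5 A through the shunt, roughly 60 W on that rail.
    print(rail_power_w(50, 10, 12.0))
    ```

    Summing the per-rail figures (slot 3.3V/12V plus the auxiliary connector) gives the card's total draw, independent of whatever the rest of the system is doing.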
     
  6. AnarchX

    Veteran

    Joined:
    Apr 19, 2007
    Messages:
    1,559
    Likes Received:
    34
    2900Pro gets a little brother/sister:

    AMD preps Radeon HD 2900 Pro and GT for fall
    The question is, what will the 2900GT be? 4 SIMDs of 60 SPs each and 12 TMUs, or 3 full R600 SIMDs and 16 TMUs? I would bet on 3 SIMDs/16 TMUs with low clocks (~500MHz) and low voltage, to make a single-slot SKU or a cheaper dual-slot one.
     
  7. fellix

    Veteran

    Joined:
    Dec 4, 2004
    Messages:
    3,552
    Likes Received:
    514
    Location:
    Varna, Bulgaria
    The real question here is how far you can slice down the R600 core without breaking into the mainstream line with such a huge GPU. And yes -- R600 is considerably more flexible in scaling compared to G80, but I don't think it has enough units to maneuver across a much wider range. Huh!
     
  8. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,426
    Likes Received:
    10,320
    Here's something that's been bugging me. What did they do with the R600 Pro to cut the board's power consumption so much that it is rated at <150 watts, compared to the R600 XT(X), which is rated at <225 watts?

    As far as I've seen the only difference between the two is that the R600 Pro only has a 256-bit memory interface. It still uses GDDR3.

    Does the memory interface take up THAT much juice? Or is it more that they'll be massively cutting down on the speed of both the GPU and memory? And thus probably also cutting the voltage to GPU and memory?

    Actually, now that I think about it, it's probably going to be a fair cut of speed and voltage that'll do the trick. Although I still wonder if the memory interface eats up a fair chunk of power.

    Regards,
    SB
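    The clock-and-voltage-cut reasoning above can be put into first-order numbers with the usual CMOS rule of thumb: dynamic power scales roughly linearly with clock and with the square of core voltage. The baseline wattage and scaling ratios in this sketch are illustrative assumptions, not AMD figures.

    ```python
    # First-order CMOS dynamic power model: P ~ f * V^2.
    # Baseline power and the scaling ratios are assumed example values.

    def scaled_power(p_base_w, f_ratio, v_ratio):
        """Dynamic power after scaling clock by f_ratio, voltage by v_ratio."""
        return p_base_w * f_ratio * v_ratio ** 2

    # Example: start from ~160 W, drop the clock 20% and the voltage 10%.
    print(scaled_power(160.0, 0.8, 0.9))  # roughly 104 W
    ```

    The squared voltage term is why even a modest voltage cut buys a disproportionate power saving compared to trimming the clock alone.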
     
  9. nicolasb

    Regular

    Joined:
    Oct 21, 2006
    Messages:
    421
    Likes Received:
    4
    As I recall, the actual power consumption of an R600 card (as distinct from the theoretical peak power that could be drawn through all of its connectors) is around 160W. So you wouldn't actually have to reduce it all that much in order for 150W theoretical peak draw to be safe.
     
  10. GrapeApe

    Newcomer

    Joined:
    Apr 3, 2004
    Messages:
    57
    Likes Received:
    2
    Location:
    Calgary, Canada
    If you look at the link to the XbitLabs review I posted above, you'll see their tests showing 161.1W:
    http://www.xbitlabs.com/images/video/radeonhd2000/test_results/2900xt_power.gif

    So I would consider that the 150W number is achieved by using GDDR4 instead of the GDDR3 of the XT-512, which we already know is clocked at 800MHz to achieve the 51GB/s bandwidth. Add a slower core and you'd get under the 150W barrier.
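    The quoted 51GB/s follows directly from bus width and data rate, assuming the 256-bit bus mentioned earlier in the thread. GDDR3 is double data rate, so an 800MHz command clock moves 1600MT/s; the sketch below just multiplies that out.

    ```python
    # Peak memory bandwidth = bus width (bytes) * effective transfer rate.
    # DDR-style memories transfer twice per clock, hence data_rate=2.

    def bandwidth_gb_s(bus_bits, clock_mhz, data_rate=2):
        """Peak bandwidth in GB/s for a DDR-style memory interface."""
        return (bus_bits / 8) * (clock_mhz * data_rate) * 1e6 / 1e9

    # 256-bit bus with 800 MHz GDDR3, per the figures in the thread:
    print(bandwidth_gb_s(256, 800))  # 51.2 GB/s
    ```

    The same formula with the XT's 512-bit bus shows why the Pro's narrower interface halves the bandwidth at the same memory clock.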
     
  11. vertex_shader

    Banned

    Joined:
    Sep 8, 2006
    Messages:
    961
    Likes Received:
    14
    Location:
    Far far away
    Link
     
  12. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    4,309
    Likes Received:
    1,107
    Location:
    35.1415,-90.056
    So the next question -- Can we get a 2900GT on AGP?

    :twisted: Sorry, couldn't resist... :twisted:
     
  13. AnarchX

    Veteran

    Joined:
    Apr 19, 2007
    Messages:
    1,559
    Likes Received:
    34
    :lol:

    From what I've heard, ATI's DX10 AGP cards don't actually support DX10, so why would we need a cut-down R600 when there is already the R580XT, which should be faster and have lower power consumption. :wink:
     
  14. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,426
    Likes Received:
    10,320
    True, and considering they don't have to worry about overclocking and systems hosting a Workstation class card generally have more predictable and stable power feeds, I can see where they might not have to leave as much of a buffer.

    Although considering in the past, all their FireGL cards have been basically clones of their consumer cards, I'd think they're still planning on leaving quite a bit of leeway there.

    I'm guessing somewhere in the general area of 100-120 watts power draw for the R600 Pro. Especially if it's only going to rely on power provided by 1x6-pin power connector (as the <150 watts power consumption would lead me to believe.)

    Regards,
    SB
     
  15. vertex_shader

    Banned

    Joined:
    Sep 8, 2006
    Messages:
    961
    Likes Received:
    14
    Location:
    Far far away
    Link

    CJ hasn't hinted at 160 SPUs anywhere; I have no idea why they write this :smile:
     
  16. vertex_shader

    Banned

    Joined:
    Sep 8, 2006
    Messages:
    961
    Likes Received:
    14
    Location:
    Far far away
    Link
     
  17. Vincent

    Newcomer

    Joined:
    May 28, 2007
    Messages:
    235
    Likes Received:
    0
    Location:
    London

    Two versions of R600 will be available during the next months:


    R600GT: 240 SPs with a 256-bit bus, in comparison to the R600XT/XTX.

    R600 Pro: will retain the 320 SPs, but with a 256-bit bus.

    The two SKUs will be marketed as the hottest products for a while, and will later be replaced by the RV6XX line.

    The most profitable moment for both AMD and Nvidia is at the beginning of next year.

    If the rumor is right that G92 is an ASIC containing 64 TCPs, then depending on the price and cost across the mainstream, performance, and high-performance segments, Nvidia could launch the product at under 200mm^2 while keeping a 256-bit bus enabled.


    Last but not least, as the history from NV42 to G72 shows, it is hard to predict what the configuration of G92 could be. Barely any leaked information is available. Hopefully G92 is not just a half-scale G80.
     
  18. AnarchX

    Veteran

    Joined:
    Apr 19, 2007
    Messages:
    1,559
    Likes Received:
    34
    http://geizhals.at/eu/a280299.html
    Available early October
    :wink:
     
    #98 AnarchX, Sep 14, 2007
    Last edited by a moderator: Sep 14, 2007
  19. vertex_shader

    Banned

    Joined:
    Sep 8, 2006
    Messages:
    961
    Likes Received:
    14
    Location:
    Far far away
    "Maximaler Verbrauch: 150W" :runaway:

    This means 1x6pin or 1x8pin power connector.
    When its have only 1x6pin than its can't be clocked to xt level.

    Looks like the 512mb pro is in the same price segment than the 8800gts320mb, only performance numbers missing, i take my quess some weeks ago, i still keep it.
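    The connector inference above follows from the PCI Express power-delivery budget: the x16 slot supplies up to 75W, each 6-pin auxiliary connector another 75W, and each 8-pin connector 150W. The sketch below adds those spec limits up; the connector combinations shown are examples, not confirmed board configurations.

    ```python
    # PCI Express board power budget per the CEM spec limits:
    # x16 slot = 75 W, 6-pin aux = 75 W, 8-pin aux = 150 W.

    SLOT_W = 75
    AUX_W = {"6pin": 75, "8pin": 150}

    def board_budget_w(connectors):
        """Maximum spec-compliant draw for a card with these aux connectors."""
        return SLOT_W + sum(AUX_W[c] for c in connectors)

    # One 6-pin caps the card at exactly the quoted 150 W maximum:
    print(board_budget_w(["6pin"]))           # 150
    # A 6-pin + 8-pin pair gives the headroom an XT-class board needs:
    print(board_budget_w(["6pin", "8pin"]))   # 300
    ```

    So a single 6-pin connector and the <150W rating are two sides of the same constraint, which is why it argues against XT-level clocks.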
     
  20. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    4,309
    Likes Received:
    1,107
    Location:
    35.1415,-90.056
    :???: There is no interface specification for DX10 -- where did you get that idea? In theory, an ISA-based video card could support DX10, albeit very slowly. AGP 8x and PCI-E 16x benchmarks show about a 15% difference in maximum transfer speeds; real-world applications show almost no difference at all.

    So, again, there is no interface specification for DX10 or DX10.1.
     