AMD: R7xx Speculation

Discussion in 'Architecture and Products' started by Unknown Soldier, May 18, 2007.

Thread Status:
Not open for further replies.
  1. 2senile

    Regular

    Joined:
    Feb 19, 2003
    Messages:
    317
    Likes Received:
    0
    Location:
    Fantasy Land
    Oooopps!

    Missed it. That's the trouble with a big thread. *runs away with embarrassment*
     
  2. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    4
    ATI RV770 = ATI R650

    I remember, a long time ago, Dave Orton saying the R6xx GPU generation would go up to 96 SPs, while R600 had 64 SPs.

    I believe R650 is the renamed RV770, with DX10.1 added on.
     
  3. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,237
    Likes Received:
    4,260
    Location:
    Guess...
    Even if 1000 W were needed for the highest of the high end (which, looking at Brit's post, isn't even close), that wouldn't mean that's what it takes to play the latest PC games.

    My system runs fine with a C2D and an 8800 GTS 640 MB on a 430 W PSU, and it runs all the latest games just fine. A Penryn coupled with a 9600 GT would draw even less power for a similar experience.

    It looks as though ATI is trying to take power efficiency into account with RV7xx as well, so hopefully we won't see any serious increases in power requirements this generation.
     
  4. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    4,027
    Likes Received:
    90
    Yeah but not for performance, at least not in the sense that NV uses clock domains. ATi only does it to lower power consumption/heat output (lower clocks in "2D" mode).
     
  5. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    That's not differing clock domains, that's engine scaling. Different things.
     
  6. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    4,027
    Likes Received:
    90
    So do ATi GPUs use multiple clock domains as others have claimed?
     
  7. w0mbat

    Newcomer

    Joined:
    Nov 18, 2006
    Messages:
    234
    Likes Received:
    5
    Yes, they have a higher clock delta for the ALUs AFAIK.
     
  8. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    I can say for a fact that ATI has been using multiple clock domains since the early nineties.

    VGA interface clock domain, PCI clock domain, memory clock domain, GPU engine clock domain. :wink:

    So when SirEric claims in an interview that they have a whole bunch of them, he's exactly right. He's also not saying anything informative, as long as he doesn't say whether or not the shader clock is decoupled from the rest of the GPU engine clock.
     
  9. R300King!

    Newcomer

    Joined:
    Aug 4, 2002
    Messages:
    231
    Likes Received:
    5
    http://beyond3d.com/content/interviews/39/7

     
  10. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    Exactly my point: a completely meaningless remark that pretends to answer the question. Clock domains are well understood by everyone. These days, there's not a chip in the world that doesn't have multiple clock domains.
    As long as he doesn't specify exactly what they are used for, he might as well have said that the sky is blue: same amount of information content.

    Edit: by not being specific enough, the interviewer made it extremely easy to answer the question without revealing anything useful. :wink:
     
  11. AlexV

    AlexV Heteroscedasticitate
    Moderator Veteran

    Joined:
    Mar 15, 2005
    Messages:
    2,535
    Likes Received:
    144
    Chances are high that was his goal (or the goal of those who combed the answers). :D
     
  12. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,426
    Likes Received:
    10,320
    Well if you consider 300ish USD for the 9800 GTX a truly top end price, then I'd have to agree with you. :)

    If not for ATI and their renewed focus on performance in the mid-range, that 9800 GTX would probably have shown up as an 8900 GTX, with a price much, much closer to the 8800 GTX Ultra's.

    It can be argued that ATI is only focused on performance in the Mid-Range because they can't currently compete in the high end enthusiast space. But regardless of WHY they are focused there, it HAS brought down prices of all video cards quite significantly.

    Something I'm sure Nvidia isn't very fond of.

    And considering ATI appears to be sticking to this focus for their cards (affordable performance) it should keep prices for video cards down in general. Although I'm sure there will always be room for the Uber Special Edition Super Expensive video card. I suppose a GX2 or X2 could be considered in this segment, but meh.

    Yes, I do somewhat miss the heady days of constant and rapid releases of video cards that pushed the performance envelope. But my wallet and energy bill are thanking ATI for pushing prices back down, even if it may not have been by choice.

    Regards,
    SB
     
  13. Valzic

    Newcomer

    Joined:
    Aug 1, 2005
    Messages:
    168
    Likes Received:
    0
    Location:
    Ontario, Canada
    A lot of people know that. However, when you've got money to burn and want the best, 1000 watts sounds uber.

    In the last year, this kid had a 2900 XT replaced by three 3870 XTs (which lasted long enough for a few weeks of that uber game the internet is enamored of: 3DMark, LOL), and then replaced by an Nvidia card. Throw in a new case, RAID Raptors, a few TB of hard drives, a Q6600... you're getting the picture, right? You just MUST have one of those 1000 W power supplies.

    It's his hobby and he doesn't have to pay for it.

    Me? I have a pretty decent system (for work and play), but I have, gasp, a 2600 XT... something I bought as a fill-in for the new motherboard until I decided what I wanted. I haven't been playing any games since I got tired of LOTR, so there is absolutely nothing I need a good video card for. However, I might be playing Age of Conan, so I'll order up a card if and when I decide, and move the 2600 XT to my wife's machine.

    Maybe there will be a different choice in May... hopefully the RV770 is out by then, and that may be the choice, or whatever is ideal at the time.
     
  14. Slyne

    Newcomer

    Joined:
    Jul 26, 2004
    Messages:
    101
    Likes Received:
    3
    The presence of umlauts (on vowels) and the eszett (characters almost exclusive to German) makes it quite obvious this is German. Even if the umlauts were replaced with an "e" following the vowel, and the eszett with a double "s", those letter combinations would be totally out of place in French text.

    You're rigorous regarding hardware design; please do not be dismissive of other disciplines. Thank you.
     
  15. Pete

    Pete Moderate Nuisance
    Moderator Legend

    Joined:
    Feb 7, 2002
    Messages:
    5,777
    Likes Received:
    1,814
    Heh, I was (gently!) mocking the poster I linked. I'm aware of the difference b/w French and German, and not aware that I was dismissing either language. (Heck, I'm not entirely lost with one of them.) If your point is that I was favoring one to the detriment of the other, that wasn't my intention, either.

    I'm not sure I have something to apologize for, but I was never aware of the Eszett, so thanks for that. :) Lexical_quiver++!
     
  16. Karma

    Newcomer

    Joined:
    Jul 28, 2004
    Messages:
    36
    Likes Received:
    0
    I'm not sure if this was covered, and I hate to compare cards that haven't been released. But assuming the R700 is two smaller RV770 chips on one card, wouldn't it be cheaper and better business for AMD to make their high-end card that way than it is for nVidia to make their 1+ billion transistor GT200? Or would there be some other manufacturing aspect of putting the two chips together that would cancel out the advantage of placing one chip on one card?
     
  17. nexus_alpha

    Newcomer

    Joined:
    Nov 25, 2006
    Messages:
    44
    Likes Received:
    0
    It all depends on yields, which 55 nm process version they are using, and how well two RV770s scale.
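
    The yield argument can be sketched with the classic Poisson die-yield model. This is a back-of-the-envelope illustration only: the defect density and die areas below are made-up numbers for the sake of the comparison, not real foundry or GPU figures.

    ```python
    import math

    def poisson_yield(area_mm2: float, defects_per_mm2: float) -> float:
        """Poisson yield model: probability a die of this area has zero defects."""
        return math.exp(-area_mm2 * defects_per_mm2)

    D0 = 0.002        # illustrative defect density (defects/mm^2), not a real figure
    big_die = 576.0   # illustrative area of a large monolithic GPU
    small_die = 256.0 # illustrative area of a smaller GPU used in pairs

    y_big = poisson_yield(big_die, D0)
    y_small = poisson_yield(small_die, D0)

    # Silicon cost per good die scales roughly with area / yield (ignoring
    # wafer-edge effects, packaging, and the board cost of mounting two chips).
    cost_big = big_die / y_big
    cost_two_small = 2 * small_die / y_small

    print(f"yield, big die:    {y_big:.2%}")
    print(f"yield, small die:  {y_small:.2%}")
    print(f"relative cost, one big die:    {cost_big:.0f}")
    print(f"relative cost, two small dies: {cost_two_small:.0f}")
    ```

    Because yield falls off exponentially with area, two small dies can come out cheaper in raw silicon than one die of twice the size, which is exactly the trade-off being weighed here against the extra cost of wiring two chips together.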
     
  18. Karma

    Newcomer

    Joined:
    Jul 28, 2004
    Messages:
    36
    Likes Received:
    0
    Do we know how the chips are going to be placed onto the one card yet? Is it an MCM, or more like the 3870 X2?
     
  19. jamis

    Newcomer

    Joined:
    Oct 10, 2006
    Messages:
    127
    Likes Received:
    4
    Is it 100% certain that GT200 is a single chip with 1+ billion transistors? I mean, there is this "Tegra" thing floating around. Isn't R700 supposed to be some new kind of multi-core thing: dual-core but not CrossFire? Maybe NV is doing something similar, i.e. Tegra = multi-core but not SLI?
     
  20. nexus_alpha

    Newcomer

    Joined:
    Nov 25, 2006
    Messages:
    44
    Likes Received:
    0
    OK, one of the downfalls of the 3870 X2 was inter-GPU communication, and the fact that textures had to be copied twice because memory was not shared, leading to very low minimum FPS. If this is solved, ATI MAY HAVE a chance.

    Tegra is something else.
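
    The texture-duplication point can be put in numbers with a toy model. All figures here are illustrative, and `effective_memory_mb` is a hypothetical helper written for this sketch, not a real API:

    ```python
    # With alternate-frame rendering and private per-GPU memory pools, every
    # texture must be resident in BOTH pools, so capacity does not add up.

    def effective_memory_mb(per_gpu_mb: int, num_gpus: int, shared: bool) -> int:
        """Usable texture capacity: the sum of pools if shared, one pool if duplicated."""
        return per_gpu_mb * num_gpus if shared else per_gpu_mb

    # 3870 X2-style private pools: 2 x 512 MB on the card, but only 512 MB usable.
    print(effective_memory_mb(512, 2, shared=False))  # 512
    # Hypothetical shared pool: the full 1024 MB would be usable.
    print(effective_memory_mb(512, 2, shared=True))   # 1024
    ```

    With private pools, the second 512 MB buys potential frame rate but never capacity, which is one reason minimum frame rates collapsed once the working set outgrew a single pool.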
     