AMD: R8xx Speculation

Discussion in 'Architecture and Products' started by Shtal, Jul 19, 2008.

How soon will Nvidia respond with GT300 to upcoming ATI-RV870 lineup GPUs

Poll closed Oct 14, 2009.
  1. Within 1 or 2 weeks

    1 vote(s)
    0.6%
  2. Within a month

    5 vote(s)
    3.2%
  3. Within couple months

    28 vote(s)
    18.1%
  4. Very late this year

    52 vote(s)
    33.5%
  5. Not until next year

    69 vote(s)
    44.5%
  1. elsence

    Newcomer

    Joined:
    Aug 31, 2009
    Messages:
    80
    Likes Received:
    0

I certainly agree with that.
The thing is, right now we only have data for AMD vs NV, so anyone who wants to speculate has to work from those data.
Is the speculation going to be accurate?
Probably not.
But we do this for fun; it doesn't matter at this stage if we are 100% correct.
Figuring out whether the 5870 is going to be bandwidth limited is good enough.



About DX11 and DX10.1, I knew that, depending on how DX11/DX10.1 is used, it can yield a performance gain.

I don't know if future DX11/DX10.1 use will somehow alleviate some of the need for more bandwidth (I mean by an appreciable amount, like -20%).
If it does, then it is a good thing for ATI.

I just thought that since we don't know by what percentage this will help, it will only confuse us more.

    For example:

Take an HD4850, which is already a little bandwidth limited, and downclock the memory to 700MHz.

So for this part:
if a developer makes a future DX11/DX10.1 codepath and the game performs a little better relative to the DX10 codepath (let's say +20%), does this mean the 4850 with 700MHz memory has stopped being bandwidth limited?

Then try overclocking the memory to 1.2GHz and see the new results.
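For anyone who wants to run that experiment, here is a rough sketch of how the two results could be compared (all numbers and the helper name are made up, purely to illustrate the idea):

```python
# Rough bandwidth-sensitivity check: compare fps at two memory clocks.
# All figures below are hypothetical, just to illustrate the test.

def bandwidth_sensitivity(fps_low, fps_high, clk_low_mhz, clk_high_mhz):
    """Fraction of the memory-clock gain that shows up as an fps gain.
    ~1.0 -> almost fully bandwidth limited, ~0.0 -> not limited at all."""
    clk_gain = clk_high_mhz / clk_low_mhz - 1.0   # 1200/700 - 1 = ~0.71
    fps_gain = fps_high / fps_low - 1.0
    return fps_gain / clk_gain

# Hypothetical HD4850 results at 700 MHz vs 1.2 GHz memory:
print(bandwidth_sensitivity(50.0, 60.0, 700, 1200))  # ~0.28 -> mildly limited
```

If the fps barely moves when the memory clock jumps by ~70%, the card wasn't bandwidth limited in that test, regardless of what the DX10 vs DX10.1/DX11 codepaths do.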

Also, the potential +20% (for example) in performance that DX10.1/DX11 may bring in the future could be unrelated to memory bandwidth.

I don't have a technical background, so it is very hard for me to predict by what percentage DX11/DX10.1 use will help with the need for memory bandwidth.

If you are a programmer in the games industry, or a graphics card engineer, or you happen to know about this stuff and can make a prediction, I respect your opinion.
     
    #3321 elsence, Sep 18, 2009
    Last edited by a moderator: Sep 18, 2009
  2. rpg.314

    Veteran

    Joined:
    Jul 21, 2008
    Messages:
    4,298
    Likes Received:
    0
    Location:
    /
Wouldn't the deferred shading and lighting features of DX10.1/11 help here? At any rate, we have seen shaders become more ALU-oriented over the years in view of the memory wall. Maybe the game devs will realize this too and implement deferred techniques.

Games that use DX10.1 already see a ~10-20% increase in fps from the same bandwidth on a supposedly limited 48xx.

In the leaked slides from the Sept 10 event, we also saw a deferred lighting demo using DX compute shaders. So there is a good chance that games will start using deferred techniques to save bandwidth.

    The situation might be a precursor to a giant bandwidth fubar, but there is some hope yet.
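A toy model of why deferred techniques save bandwidth with many lights (all constants are invented; real costs depend heavily on the engine and GPU):

```python
# Back-of-envelope: forward shading re-shades every overdrawn pixel against
# every light, while deferred lighting pays a fixed G-buffer cost once and
# then shades each visible pixel per light. All byte counts are hypothetical.

def forward_cost(pixels, lights, overdraw, bytes_per_shade=16):
    # Every (over)drawn pixel is shaded against every light.
    return pixels * overdraw * lights * bytes_per_shade

def deferred_cost(pixels, lights, overdraw, gbuffer_bytes=32, bytes_per_light=8):
    # Write the G-buffer once per overdrawn pixel, then light each
    # visible pixel once per light from the G-buffer.
    return pixels * overdraw * gbuffer_bytes + pixels * lights * bytes_per_light

px = 1920 * 1200
print(forward_cost(px, 8, 3) / 1e9)   # ~0.88 GB of traffic per frame (toy model)
print(deferred_cost(px, 8, 3) / 1e9)  # ~0.37 GB per frame
```

The crossover point depends on light count and overdraw; with one light and no overdraw, deferred can easily cost more, which is why it only pays off in heavily lit scenes.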
     
  3. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    17,884
    Likes Received:
    5,334
Whoa, they don't? Didn't someone post benchmarks of the new ATI card being +40% to +60% (can't remember the exact figure)?
     
  4. psolord

    Regular

    Joined:
    Jun 22, 2008
    Messages:
    444
    Likes Received:
    55
I wonder why we haven't seen a single picture of the 5850!

Does anyone have any info about it? Will it be the same as the 5870, just with a different sticker, or different hardware? I mean, what kind of cooler will it have?
     
  5. Lightman

    Veteran Subscriber

    Joined:
    Jun 9, 2008
    Messages:
    1,969
    Likes Received:
    963
    Location:
    Torquay, UK
You're right!
The HD5850 is above the GTX285 according to leaked benches. The HD5870 was compared against the GTX295 and won in most cases as well, but we all know how some games scale on SLI/CF...

I'm personally looking forward to professional benchmarks like SPECviewperf! Judging by the OpenGL game performance of these cards, a FireGL based on RV870 should be mighty! :twisted:
     
  6. DeF

    DeF
    Newcomer

    Joined:
    May 3, 2007
    Messages:
    162
    Likes Received:
    20
  7. psolord

    Regular

    Joined:
    Jun 22, 2008
    Messages:
    444
    Likes Received:
    55
    @Def

    Wow! My deepest respect! Thanks a billion!
     
  8. rpg.314

    Veteran

    Joined:
    Jul 21, 2008
    Messages:
    4,298
    Likes Received:
    0
    Location:
    /
    IIRC, the 4850 was a single slot part. Clearly, power budgets have shot up with this gen. Not good. :(
     
  9. sc3252

    Newcomer

    Joined:
    Jun 6, 2008
    Messages:
    36
    Likes Received:
    3
    Has power use ever gone down from one generation to the other?
     
  10. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    83
Depends, really. Would you be willing to pay 1.5 times more in power to get 2 times the performance? Especially if idle power draw has been dropped right down, as leaks suggest. Although your peak power usage may go up for more performance, your overall power usage would go down because idle draw has dropped a lot more, and idling is probably what your card spends most of its time doing.

    That would sound like a good deal unless you absolutely cannot afford more power under any circumstances.
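The arithmetic behind that trade-off is easy to sketch (every wattage and duty-cycle figure below is invented for illustration):

```python
# Toy average-power comparison: a card with a higher peak draw but a much
# lower idle draw can still use less energy overall. Numbers are made up.

def avg_power(idle_w, load_w, load_fraction):
    """Time-weighted average draw for a card that is under
    load `load_fraction` of the time and idle the rest."""
    return idle_w * (1 - load_fraction) + load_w * load_fraction

old_card = avg_power(idle_w=90, load_w=160, load_fraction=0.2)  # 104.0 W
new_card = avg_power(idle_w=27, load_w=190, load_fraction=0.2)  # 59.6 W
print(old_card, new_card)
```

With a gaming duty cycle around 20%, the hypothetical new card averages lower power despite a 30 W higher peak, because the idle savings dominate.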
     
  11. Pressure

    Veteran

    Joined:
    Mar 30, 2004
    Messages:
    1,655
    Likes Received:
    593
    In that case you should probably not be looking at high-end graphic cards or a high-end computer to begin with.
     
  12. XMAN26

    Banned

    Joined:
    Feb 17, 2003
    Messages:
    702
    Likes Received:
    1
You do realize that the 5870's lead increase isn't so much because it isn't bandwidth limited, but rather because it simply isn't nearly as limited as its competition. That, and the FACT that ATI has some nice 4x and 8x AA algorithms in place right now to reduce the hit at those settings.
     
  13. XMAN26

    Banned

    Joined:
    Feb 17, 2003
    Messages:
    702
    Likes Received:
    1
I'd like to know where the complainers are about how they all have dual-slot coolers. Even the 5770 has one; if the cards have dual-slot coolers, you know they are gonna run warm to hot.
     
  14. Pressure

    Veteran

    Joined:
    Mar 30, 2004
    Messages:
    1,655
    Likes Received:
    593
    Or cool and quiet.
     
  15. XMAN26

    Banned

    Joined:
    Feb 17, 2003
    Messages:
    702
    Likes Received:
    1
Dual-slot coolers mean single-slot ones were not good enough to keep them cool.

And the 5870X2 looks like it is an inch longer than the 5870. Gonna need bigger cases.
     
  16. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
The reference card will have all those connections. The two-slot design is not because of TDP requirements, but because of the connections. Expect cards with fewer than 2 DVI ports.
     
  17. psolord

    Regular

    Joined:
    Jun 22, 2008
    Messages:
    444
    Likes Received:
    55
I've always preferred dual-slot coolers, especially the ATI ones that blow all the hot air out of the case. That is the most important thing for me.

After selling my 4870X2, I got a GTX 260, just to have a decent card that can play some games until the new ones arrive. The problem is that my case's temperature rose about 5°C, just because too much air is blown back into the case, and that comes from a card that is much less of a power hog than the 4870X2 was.
     
  18. gamervivek

    Regular

    Joined:
    Sep 13, 2008
    Messages:
    805
    Likes Received:
    320
    Location:
    india
Or perhaps they've decided to do away with 90°C idling GPUs :razz:
     
  19. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    83
I've always been willing to give up that unused slot if it makes the difference between my card running cool and quiet or hot and noisy. The days when people stood for noisy PCs are long gone.

In fact, I've just sent back a 4890 with a non-standard cooler (PowerColor changed the cooler as some kind of differentiator, but the retailer didn't change the description) because it was far too noisy under load. It wasn't terrible by the standards of a few years back, but it's awful by today's standards, where everyone uses slow, quiet 120mm+ fans in even the most modest builds. Graphics cards can't be so much noisier than everything else, even though they generate so much heat — people just won't give them a pass on it.
     
  20. XMAN26

    Banned

    Joined:
    Feb 17, 2003
    Messages:
    702
    Likes Received:
    1

What card did you go with, is what I gotta ask. I've got BFG and eVGA cards, and both blow the air completely out of the case; none goes into the case.
     
  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.