[B3D Analysis] R600 has been unleashed upon an unsuspecting enthusiast community

Discussion in '3D Hardware, Software & Output Devices' started by Farid, May 14, 2007.

  1. Julidz

    Newcomer

    Joined:
    May 5, 2007
    Messages:
    57
    Likes Received:
    0

    But the XT beats the GTS in almost all games, and even reaches the GTX in some situations.

    I haven't seen a review with this kind of performance from the XT yet.
     
  2. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    Depends on which games you benchmark and what settings
     
  3. dnavas

    Regular

    Joined:
    Apr 12, 2004
    Messages:
    375
    Likes Received:
    7
    :lol: Very droll. I think, perhaps, when I said "i'd be interested in more noise" what I meant was "I'd be interested in more signal" :) Obviously using a test drawn up by a biased party has certain ... disadvantages. I am merely recognizing some old G80 FUD and voicing the implicit accusation.
    :shrug:

    -Dave
     
  4. Cuthalu

    Newcomer

    Joined:
    Oct 28, 2006
    Messages:
    116
    Likes Received:
    1
    Look at the XP benches too, not just the Vista ones.
     
  5. Brian118

    Newcomer

    Joined:
    Apr 28, 2007
    Messages:
    14
    Likes Received:
    0
    Location:
    Florida
  6. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,361
    Likes Received:
    3,940
    Location:
    Well within 3d
    It's possible the compiler for G80 has a bug, or there is a deliberate design or compiler emphasis on ops that go to memory.

    VLIW and other statically scheduled architectures are not often instruction-limited, but data-constrained.

    It seems optimally filling instruction slots is not enough, and that other factors that are less simple to compile for are pushing R600's utilization way below the peak instruction issue in the sample programs.
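
    The point above, that a statically scheduled VLIW machine can be data-constrained even when its instruction slots could in principle be filled, can be sketched with a toy scheduler. This is purely illustrative: the 5-wide issue width loosely mirrors R600's ALU arrangement, and the op names and dependency sets are made up.

```python
# Toy model of VLIW instruction-slot packing (illustrative only).
# Each op is a (name, set_of_dependency_names) pair.

def pack_vliw(ops, width=5):
    """Greedily pack ops into VLIW bundles. An op may only issue
    once everything it depends on has issued in an earlier bundle."""
    done = set()
    bundles = []
    remaining = list(ops)
    while remaining:
        bundle = []
        for op in list(remaining):
            name, deps = op
            if deps <= done and len(bundle) < width:
                bundle.append(name)
                remaining.remove(op)
        done.update(bundle)
        bundles.append(bundle)
    return bundles

# Ten independent ops fill the slots well: two full 5-wide bundles.
independent = [(f"mul{i}", set()) for i in range(10)]
print(pack_vliw(independent))

# A serial dependency chain forces one op per bundle, so a 5-wide
# machine runs at 20% of peak issue regardless of the compiler.
chain = [("op0", set())] + [(f"op{i}", {f"op{i-1}"}) for i in range(1, 5)]
print(pack_vliw(chain))
```

    The same op count gives very different utilization depending on the dependency structure, which is the sense in which utilization can fall well below peak instruction issue.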
     
  7. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    9,051
    Likes Received:
    2,925
    Location:
    Finland
  8. IbaneZ

    Regular

    Joined:
    Apr 15, 2003
    Messages:
    743
    Likes Received:
    17
    I thought it was weird that some sites benched with 8x AF. Now I know why. :wink:
     
  9. PSU-failure

    Newcomer

    Joined:
    May 3, 2007
    Messages:
    249
    Likes Received:
    0
    I'll try to explain why I seriously doubt it; perhaps it's the same for the others having the same doubt.

    In the G80 reviews, it was explained that the SIMD units just compute the same thing for, say, 16 pixels' components. So, what's the typical use of a SIMD unit? Isn't it the execution of the same Op on multiple data? If that weren't the case, I think they wouldn't call it SIMD but rather VLIW, even if VLIW is somewhat a particular SIMD implementation...

    Now, if 16 identical shader instances are run, it's clearly near 100% efficiency, as all use exactly the same Ops in the same order, right?

    And if the shaders just look like "bunches of Ops" as I stated earlier, resulting in, say, 20 MUL, 50 ADD, 5 SQRT and so on, with an average amount of dependencies? Of course, either piece of hardware will suffer from that, each being SIMD, but there the higher number of simultaneously executable Ops should increase efficiency. I know it's not what we find in a GPU at the moment, but it could be the key to explaining why they chose such an organization.

    This strange thing looks more like a general-purpose, massively parallel processor, but that could be what AMD wanted after all... I don't think they would launch a pure GPU with such tradeoffs in production when they could simply have doubled the R580's units and improved some aspects; that would have had better performance under DX9 at the time, which would have been sufficient to refine their DX10 GPU while we wait for DX10 apps, just as ATI did with R420.
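
    The lockstep model described above can be captured in a minimal sketch. The width of 16 and the MUL/ADD op mix are illustrative assumptions, not the actual arrangement of R600 or G80.

```python
# Toy SIMD model: one instruction stream applied to every lane in
# lockstep -- the "same Op on multiple data" idea from the post.

def simd_exec(program, lanes):
    """Apply each op of `program` to all lanes before moving on."""
    for op in program:
        lanes = [op(x) for x in lanes]  # same Op, multiple data
    return lanes

pixels = [float(i) for i in range(16)]  # 16 pixel components
program = [lambda x: x * 2.0,           # MUL
           lambda x: x + 1.0]           # ADD
print(simd_exec(program, pixels))       # 2*x + 1 for each lane
```

    Identical work per lane keeps every lane busy on every instruction, which is why 16 identical shader instances approach 100% efficiency in this model.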
     
  10. leoneazzurro

    Regular

    Joined:
    Nov 3, 2005
    Messages:
    518
    Likes Received:
    25
    Location:
    Rome, Italy
    So Oblivion is SO filtering-limited?

    http://techreport.com/reviews/2007q2/radeon-hd-2900xt/index.x?pg=5

    Seems a little strange; it's almost the same scaling.

    But anyway, it could even be an issue with the way Oblivion applies AF and the R600 drivers.
    The "new-old" release triples performance in some areas, but even removing AF should not lead to this kind of result.
    And the IQ loss should be immediately recognizable in that case.
     
    #210 leoneazzurro, May 14, 2007
    Last edited by a moderator: May 15, 2007
  11. SugarCoat

    Veteran

    Joined:
    Jul 17, 2005
    Messages:
    2,091
    Likes Received:
    52
    Location:
    State of Illusionism
    Well, it's not just AF; AA modes, too, badly damage performance in far too many titles. I have a feeling that, given time, we'll see this card matching GTX performance for the most part, but right now it looks like the card isn't optimized much at all. The problem is they need that performance now.

    The performance fluctuates far too much, from excellent in the synthetics, OpenGL games, and a few select DX titles, to just plain abysmal. Usually those situations involve high settings of AA/AF. These are two areas that should be strengths given the MC and the bandwidth available, and also two areas R520/R580 excelled at, which again leads me to believe we'll see some real tinkering in the drivers, especially over the next six months.

    I found one of the Inq's graphs in particular to be the most funny/telling of exactly how, for lack of a strong enough word, broken the drivers are.

     
  12. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    If you read carefully, 3DMark scores are up 3.6%, R6: Vegas about 10%.
    My girlfriend's asleep now, so she can't translate everything right away... now where is Mao5?
     
  13. Brian118

    Newcomer

    Joined:
    Apr 28, 2007
    Messages:
    14
    Likes Received:
    0
    Location:
    Florida
    I think we got him mad...now he doesn't talk to us anymore :(
     
  14. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,361
    Likes Received:
    3,940
    Location:
    Well within 3d
    If the theory that there are hardware faults being worked around in the driver holds water, then there will be a limit to the driver's ability to improve.

    AMD can't sell a version with fixed hardware as a 2900 XT either.

    It would be a huge PR blow to sell R600-based chips, then later start selling fixed versions that don't require driver workarounds, and the low launch price is a strong hint that the card is priced to move, perhaps prior to a refresh.
     
  15. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know

    At first I thought the Inq just messed up, but the numbers seem to be correct. Performance is all over the place in some synthetic tests.
     
  16. Star_Hunter

    Newcomer

    Joined:
    Jan 21, 2003
    Messages:
    77
    Likes Received:
    0
    Location:
    Michigan
    So it's either a bad driver, or the AA is done by the shaders because they couldn't get their AA hardware to work right, as Anandtech suggests?

    Also I am wondering what everyone thinks about the performance of these different drivers.



    I get the feeling these are either not real or some kind of error (like forgetting to turn on AA/AF), although it would be cool if they were real.
     
  17. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    213
    Location:
    Uffda-land
  18. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    he's there as PII Maomao
     
  19. Brian118

    Newcomer

    Joined:
    Apr 28, 2007
    Messages:
    14
    Likes Received:
    0
    Location:
    Florida
    I don't know why, but I feel like it would be bad if I bought an 8800 GTX now instead of the XT.
     
  20. SugarCoat

    Veteran

    Joined:
    Jul 17, 2005
    Messages:
    2,091
    Likes Received:
    52
    Location:
    State of Illusionism
    I don't agree at all. What you're describing is exactly what happened with the GeForce FX, and I don't see that here, because the card's performance is so random, even in the same games. As I said, most of the performance loss comes with HQ AF and some modes of AA applied, which isn't what you'd expect with all that bandwidth.

    Also, the price isn't set to move lemons out as fast as they can; it's set because that's realistically the most they could ask. It's late and not performing up to par, quite probably due to very immature drivers. If they had gone for $499 they would have been blasted by every sane review site on the net; instead, at $399, they actually got praise from most, which is exactly what their goal was.

    I wouldn't be surprised if we saw a completely different card, performance-wise, between this XT and the same one in 6-8 months, but there's no reason you the consumer should have to wait that long, so personally I'd tell you to go for the GTX or, if you're in the US, one of the $330 640MB GTS cards. Another plus is the lower power consumption and much lower dB level, which bugs me the most about the 2900.
     