ATI Radeon X800 XT Platinum Edition / PRO Review

Discussion in 'Beyond3D Articles' started by Dave Baumann, May 4, 2004.

  1. PatrickL

    Veteran

    Joined:
    Mar 3, 2003
    Messages:
    1,315
    Likes Received:
    13
I think the Orton interview, with this review in mind, will be very, very interesting for getting a better understanding of ATI's plans for the next 12 months.
     
  2. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
Yep. It is strange that a lot of people seem to neglect this rather important issue.
For those who wanted SM 3.0 from ATI... they will have it when there are SM 3.0 titles around. Made up my mind, too. :p
     
  3. Richard

    Richard Mord's imaginary friend
    Veteran

    Joined:
    Jan 22, 2004
    Messages:
    3,508
    Likes Received:
    40
    Location:
    PT, EU
    Again, best review of the bunch. Very impressed with what ATi has managed to do with basically two-year old technology.

You mention a couple of times that the R420 doesn't have any specific z/stencil optimizations such as the NV3x/NV40's dual-'pumped' pipelines, remaining at 16x1. However, other p/reviews mention that while using AA the R420 is capable of outputting 32 'zixels' per clock, and the B3D review points out the dual pixel ops per clock when using AA. Question: does the R420 behave like a 32-pipeline part for z/stencil ops when AA is enabled?
     
  4. anaqer

    Veteran

    Joined:
    Jan 25, 2004
    Messages:
    1,287
    Likes Received:
    1
  5. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
Does anyone know what the X stands for in X800?

How about the 800?
     
  6. Ostsol

    Veteran

    Joined:
    Nov 19, 2002
    Messages:
    1,765
    Likes Received:
    0
    Location:
    Edmonton, Alberta, Canada
Card looks great in terms of performance, but it seems more like a bigger, faster R300 than anything else. More tech advances would have been nice. There's really nothing that makes me want to upgrade quite yet. The raw speed makes it tempting, but my Radeon 9700 Pro still runs games very well at the resolution I play at, and beyond that, I had been expecting a technological leap more like R200 to R300 was. Given that I don't have the money to upgrade anyway, I think I'll wait till the next generation at least, then do a full system upgrade.
     
  7. nelg

    Veteran

    Joined:
    Jan 26, 2003
    Messages:
    1,557
    Likes Received:
    42
    Location:
    Toronto
Thetechreport.com has confirmed the power draw for themselves, it seems. Their numbers are in line with what ATI told Dave. :D
     
  8. Natoma

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,913
    Likes Received:
    84
    I just upgraded my IQ levels from my standard 4xAA/16xAF to 6xAA/16xAF at 1024 a couple of days ago. After seeing what the X800XT is capable of, especially wrt Temporal AA, and the :shock: speed :shock:, my 9800 Pro 256MB, purchased just 10 months ago, feels so small. :lol:

    Great review. Now comes the time for convincing my bf to let me have the X800XT. :D Here's hoping Nvidia comes out with something truly competitive to the XT in order to get prices down.

The way I see it, according to the reviews I've seen around the web, the X800 Pro = 6800U at similar IQ levels, and the XT is in a class all its own, justifying that extra $100. My only question, though: is this round of videocards going to be the last AGP generation? Because if not, I might just wait for the six-month refresh in time for the Christmas rush. I don't intend to upgrade to PCIe until next year when BTX is available.

p.s.: Does anyone else find it frightening that these cards are CPU limited even at 1600x1200 with a P4 3GHz? Where's that 3.4GHz EE price drop when you need it? :?
     
  9. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
This was my first impression as well. Nothing earth-shattering or ground-breaking here.

My gut tells me ATI thought: if it ain't broke, don't fix it. Saves money. Good foundation to build upon. Less risk. Why not?

All good for ATI.

However, I still cannot get past the clock speed difference and performance. To me, it still seems Nvidia is ahead, both in features and performance this time around, when all things are considered.

Yet Nvidia dropped the ball on power requirements and size. Nothing really huge in the big picture - yet a blemish nevertheless.
     
  10. ram

    ram
    Newcomer

    Joined:
    Feb 6, 2002
    Messages:
    218
    Likes Received:
    0
    Location:
    Switzerland
    R420 can not output 16 pixels / clock

    http://www.beyond3d.com/reviews/ati/r420_x800/index.php?p=14
    http://www.beyond3d.com/previews/nvidia/nv40/index.php?p=18

Something is odd here with the R420 pixel throughput. There is a clear difference between what the diagrams suggest and what is being measured with MDolenc's fillrate tester: R420 only writes 12 pixels/clock (peak).

As both the R3x0 (8/clk) and NV40 (16/clk) reach their theoretical peaks in this test (and R420 and NV40 have the same bandwidth), something else must be limiting the R420's pixel throughput. Any ideas?
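A back-of-the-envelope sketch of the arithmetic behind ram's observation. The 520 MHz core clock is the commonly quoted X800 XT PE launch spec and is assumed here, not stated in the thread; the 6.24 Gpix/s figure is simply what a 12 pixels/clock sustained rate would read as in a fillrate tester.

```python
# Hypothetical sanity check of the "12 pixels/clock" reading.
# Assumption: X800 XT PE core clock of ~520 MHz (launch spec, not from the thread).

def pixels_per_clock(measured_gpix_s, core_mhz):
    """Infer sustained pixels/clock from a measured fillrate."""
    return measured_gpix_s * 1e9 / (core_mhz * 1e6)

# 16 pipes at 520 MHz -> theoretical peak fillrate in Gpix/s:
theoretical = 520e6 * 16 / 1e9   # 8.32 Gpix/s

# A tester reporting ~6.24 Gpix/s would imply ~12 pixels/clock,
# i.e. only 75% of the 16-pipe theoretical peak:
print(pixels_per_clock(6.24, 520))  # -> 12.0
```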
     
  11. Sage

    Sage 13 short of a dozen
    Regular

    Joined:
    Aug 22, 2002
    Messages:
    935
    Likes Received:
    15
    Location:
    Southern Methodist University
Where did the SPEC benchmarks go!?

SPECviewperf 7.1 was listed as the very first benchmark on the "Test Setup" page of both the GeForce 6800 and the Radeon X800 articles, but the results are in neither one! And after I PM'd Dave about it on the 6800, he said he didn't run them... I want my SPEC benchmarks!
     
  12. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    Re: R420 can not output 16 pixels / clock

I don't see anything that can't be explained by "simple" bandwidth limitations.

The NV40 Ultra and R420 XT do have about the same absolute bandwidth, and both get similar "pure fillrate" scores. The R420 XT is more bandwidth limited (bandwidth per pixel) than the NV40, and much more bandwidth limited than the R3x0.

This would suggest that the NV40 Ultra is more "balanced" than the R420 XT in terms of fillrate, whereas the XT is sacrificing that balance (cost) for more pixel shading ops/sec.
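Joe's "bandwidth per pixel" argument can be sketched numerically. The core clocks and memory bandwidths below are the commonly quoted launch specs for each card and are assumptions here, not figures from the thread; the sketch only illustrates the ratio he is describing.

```python
# Rough bandwidth-per-pixel comparison under the "simple bandwidth
# limitation" reading. All clock/bandwidth figures are assumed
# launch specs, not taken from the thread.
BYTES_PER_PIXEL = 4  # 32-bit colour write, no Z

cards = {
    # name: (core MHz, pixel pipes, memory bandwidth GB/s)
    "X800 XT PE": (520, 16, 35.8),   # 1120 MHz effective GDDR3, 256-bit
    "6800 Ultra": (400, 16, 35.2),   # 1100 MHz effective GDDR3, 256-bit
    "9800 Pro":   (380,  8, 21.8),   # 680 MHz effective DDR, 256-bit
}

for name, (mhz, pipes, bw) in cards.items():
    peak_fill = mhz * 1e6 * pipes                 # pixels/s at theoretical peak
    needed = peak_fill * BYTES_PER_PIXEL / 1e9    # GB/s needed to sustain it
    print(f"{name}: needs {needed:.1f} GB/s of {bw} GB/s "
          f"({needed / bw:.0%} of available)")
```

Under these assumed specs the X800 XT needs roughly 93% of its bandwidth just for colour writes at peak, versus about 73% for the 6800 Ultra and 56% for the 9800 Pro, which is consistent with the R420 being the most bandwidth-starved per pixel of the three.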
     
  13. ram

    ram
    Newcomer

    Joined:
    Feb 6, 2002
    Messages:
    218
    Likes Received:
    0
    Location:
    Switzerland
    Re: R420 can not output 16 pixels / clock

That was my first thought too. But have a look at these new X-bit results:

http://www.xbitlabs.com/articles/video/display/r420_26.html
and the next page.

Enabling Z writes (= more bandwidth used) doesn't decrease the pixel throughput; to me this indicates that bandwidth isn't responsible for this low result.
     
  14. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    Re: R420 can not output 16 pixels / clock

Hmmm... that is interesting. Though the NV40 has the same behaviour when Z writes are enabled/disabled (no appreciable change in fillrate).

On a related note... I also thought I saw another site show pretty much the opposite effect with respect to multitexturing from what is shown at X-bit... the X800 pulled far ahead of the 6800 as you increased the number of textures. I'll try to find it to see if it was a completely different test or if I'm remembering incorrectly...

    Edit:

    Here it is: http://techreport.com/reviews/2004q2/radeon-x800/index.x?pg=7

    (Scroll down to the bottom of the page)

    Tech Report is using D3D RightMark.

Curious... Is MDolenc's fillrate tester D3D or GL?
     
  15. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
    Re: R420 can not output 16 pixels / clock

Those results seem to show that R420 is a rather inefficient architecture compared to NV40, seeing as it offers a much higher peak theoretical fillrate in colour + Z rendering (+30%) and has the same bandwidth available to its core, but falls short of the slower NV40 by 10-20%.

From the R420 / R3x0 comparison, it seems like memory efficiency is the culprit, as the older cores, having more bandwidth available per pipe, get much closer to their theoretical peaks. I'd speculate that nVidia implemented a much better memory subsystem.

    cu

    incurable
    ________________________________________________________________________________________
    The 1st was a surprise, the 2nd a disturbance, the 3rd caused irritation and now I'm done with them. For good.
     
  16. THe_KELRaTH

    Regular

    Joined:
    Dec 9, 2002
    Messages:
    471
    Likes Received:
    0
    Location:
    Surrey Heath UK
Excellent review, Dave... as per usual.

While I'm impressed with the power requirements / performance increase, I can't help but be a little disappointed that there's no support for supersampling AA. I was hoping for something like 2xMS/2xSS and 4xMS/2xSS mixed-mode formats to be added. Other areas such as an enhanced version of "hardware" TruForm and hardware techniques to compensate for stuttering when framerates are low would have been a good advance too.

I think it's too early to say which card is the real performance winner till DX9.0c arrives and we see the performance effects, if any, from games adding PS 3.0 (Far Cry / UT2004 & games based on the Unreal Warfare engine) etc. (Though if there are good benefits in Far Cry, then surely there would be even greater benefits in replacing 1.1 with 1.4.)
     
  17. THe_KELRaTH

    Regular

    Joined:
    Dec 9, 2002
    Messages:
    471
    Likes Received:
    0
    Location:
    Surrey Heath UK
  18. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    Re: R420 can not output 16 pixels / clock

I still don't see that in B3D's results.

The fact that they have similar absolute bandwidth and similar absolute fillrate results doesn't have to say much wrt efficiency. If we could overclock the NV40 core and see whether that impacts fillrate, that would be a good indicator, though.

The X-bit results do in fact indicate relative inefficiency of the R420 memory interface... but then, the results from Tech Report that I linked to seem to show the opposite.

Again, is MDolenc's fillrate tester GL or D3D? I'm wondering if it's a driver efficiency thing rather than a hardware efficiency thing...
     
  19. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,022
    Likes Received:
    122
    Nice cards, nice review. My 9000pro feels so old :)
    This round seems to be a close battle.

Basically, IMHO ATI has the advantage of lower power consumption (should be more OEM friendly), whereas Nvidia has SM 3.0 (and I guess for this reason developers are likely to use Nvidia hardware as their primary development card). Oh, and ATI needs to rewrite its OpenGL driver if it ever wants to compete in that area...

Nvidia (again) deserves special negative mention for handing out "Review Edition" cards with unknown and limited availability, as well as for providing beta drivers which have the convenient "bug" that trilinear optimizations can't be disabled. Why are they doing this? It's not exactly like the NV40 can't compete with the R420 XT at all with regards to performance. The 6800 GT version especially looks very interesting, if Nvidia can actually deliver it in quantity (hopefully their yields are good enough) in a reasonable timeframe.
     
  20. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
    Re: R420 can not output 16 pixels / clock

    Hi Joe,

One could also argue for downclocking the R420 to match the Ultra and seeing what happens. I'd rather go down than up.

It still appears that the NV40 is doing more with fewer MHz. Just my opinion.
     