MSAA + HDR benchmarks?

^eMpTy^ said:
Overall I don't think the XL is comparable to a GTX at any resolution...it loses rather decidedly to the GT...the GTX beats it out pretty much without question.

Except if you look at the Quake 4 benchmarks linked earlier in the thread, where with AA/AF the XL thrashes the GT and is very close to the GTX.
 
Pete said:
Ugh, so much politicking. Why don't we figure out the technical reasons, if any, for the R520's apparently disappointing HDR performance before resorting to breaking out the f-word and weak historical comparisons?

I'll throw in a theory here ...
It doesn't match anything I've seen; I don't think that cost is normal. In the HDR sample in the SDK the cost of 6x MSAA was 14% last time I checked, and that's with fairly cheap shading, so the relative cost of MSAA should be comparably high. I don't know how the shader load in Far Cry compares, though.

Anyway, my theory is that the memory controller could be tuned a bit for this case. I don't know the state of the driver in this respect, so I'm only speculating, but given that I've seen some strange performance characteristics I find it plausible. For instance, 2x is slower than both 4x and 6x in the HDR sample, which doesn't make much sense, but it shows there's at least some room for improvement.
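For anyone wondering what's actually being measured: in Direct3D 9 terms the combination in question is an FP16 colour buffer with multisampling, which an application has to query for explicitly. This is just a minimal sketch (the helper name is mine, not from the SDK sample or Far Cry); the 64bpp target is also why bandwidth and the memory controller matter so much here:

```cpp
#include <d3d9.h>

// Sketch: ask the runtime whether the adapter can multisample an FP16 (64bpp)
// colour buffer, i.e. the HDR + MSAA combination discussed in this thread.
bool SupportsHdrMsaa(IDirect3D9* d3d, D3DMULTISAMPLE_TYPE samples)
{
    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,   // FP16 render target: twice the footprint of a 32bpp one
        FALSE,                  // fullscreen
        samples,                // e.g. D3DMULTISAMPLE_4_SAMPLES or D3DMULTISAMPLE_6_SAMPLES
        &qualityLevels);
    return SUCCEEDED(hr) && qualityLevels > 0;
}
```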
 
bdmosky said:
Honestly I'm beginning to think Nvidia was right about waiting for HDR + MSAA. I can't afford the X1800 XT, but even if I could, I sure as hell wouldn't want to run it at 1024x768 or less just to get MSAA on my HDR applications (if Far Cry is any accurate indication of anything). Hell... even the 1024x768 numbers are borderline playable.
Are you nuts!?! I don't think you're looking at the numbers very well.

Even in Far Cry, where the implication in this thread is that ATI has a different workload, the XT at 1024x768 w/4xAA is as fast as the GTX at 1280x1024 w/o AA. Only an idiot would think ATI's gamma-correct, rotated-grid 4xAA looks worse than one resolution step higher without any AA whatsoever.

Most people would gladly take 1024x768 with 4xAA (rotated grid) over 1600x1200 without AA, let alone 1280x1024.

I mean really, the X1800XT drops by 30% when 4xAA is added in HDR Far Cry. The 7800GTX drops 40% in Doom 3 (NOTE: NO HDR) when AA is added. Are you saying you wouldn't enable AA on the 7800GTX?

Just wait until Splinter Cell enables AA with HDR...
 
Mintmaster said:
Most people would gladly take 1024x768 with 4xAA (rotated grid) over 1600x1200 without AA, let alone 1280x1024.

Depends on the size of the monitor. Irrespective of anything else, you don't get the same dpi values at the same resolution on a 17", 19", or 21" monitor, as examples.
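For rough numbers (my own back-of-the-envelope sketch, assuming typical CRT viewable diagonals of roughly an inch less than the quoted tube size):

```cpp
#include <cmath>
#include <cstdio>

// Rough pixels-per-inch for the same resolution on different monitor sizes.
double Ppi(int width, int height, double viewableDiagonalInches)
{
    return std::sqrt(double(width) * width + double(height) * height) / viewableDiagonalInches;
}

int main()
{
    // Assumed viewable diagonals: ~16" on a 17" CRT, ~18" on a 19", ~20" on a 21".
    std::printf("1280x1024 on 17\": %.0f ppi\n", Ppi(1280, 1024, 16.0));  // ~102 ppi
    std::printf("1280x1024 on 19\": %.0f ppi\n", Ppi(1280, 1024, 18.0));  // ~91 ppi
    std::printf("1280x1024 on 21\": %.0f ppi\n", Ppi(1280, 1024, 20.0));  // ~82 ppi
}
```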
 
radeonic2 said:
On my 19" .27 (I think) dotpitch I need 1280x960 or above for it look decent to me;)

On a 15" monitor though 1280 would be a quite high resolution, while on contrary on a 24" it would look like a relatively low resolution.
 
Ailuros said:
On a 15" monitor though 1280 would be a quite high resolution, while on contrary on a 24" it would look like a relatively low resolution.
Indeed, 1200x900 was the biggest res I used on my 17", but for most of its life it was at 1164x864; then in one ATI driver set I discovered that res was available and it worked at 75 Hz.
When I first got my 19" CRT I set it to 1600x1200, and it looked really small to me at first, but I got used to it over a week.
 
But he, like Hellbinder, simply ignores FSAA benchmarks on the XT because of the high-clocked memory, while forgetting the XL is competitive with the GT.
If you ignore it, it will go away ;)
 
Hellbinder said:
Yes, the numbers for MSAA once the hit for HDR is factored in are good, but the hit for HDR is SEVERE, and it's 10+ FPS slower than a GTX.

That may be true, but as a "whole" I think it's a good card. Unfortunately for ATI, the XT will be 6 months later to the game, here in Sweden at least (12/01 + about 100 Euro more), compared to the instant launch of the 7800GTX (06/22). The XL really confuses me, because I would have a very hard time justifying it compared to an XT if I were to spend so much money anyway.

Those who waited just because it's ATI made a bad decision, I think, but the others not planning to buy a new card until Christmas anyway have good options, especially if the price falls a bit.

If I had to choose a card I would take the X1800XT 512MB over the 7800GTX.
But I'm not planning to buy any new parts until Vista, so then it should stand between the R600 and G80/90(?).
 
Hellbinder said:
winning the AA benchmarks because of the 10GB/s bandwidth advantage due to the max-clocked RAM
"With 6x FSAA we can see similar trends, but again the performance difference is widened - this is not just due to the large availability of bandwidth the X1800 XT has because the same holds true for the X1800 XL, it would appear to be the case that R520 is better at handling, and optimising, high bandwidth situations, bearing out the work done on the memory controller and related elements."

http://www.beyond3d.com/reviews/ati/r520/index.php?p=24

"Underlying that there are further optimisations are the differences in performance impacts between the X1800 XL and X800 XT as the XL has a significantly lower performance hit from going from 4x to 6x FSAA."

http://www.beyond3d.com/reviews/ati/r520/index.php?p=28#aa
 
no-X said:
check AA/AF @ 1600*1200 and higher (if available)

HalfLife2
FarCry
Splinter Cell 3
Colin McRae 2005
Battlefield 2
Perimeter

and possibly others


Don't handpick results, it could end up misleading. Xbit-labs usually writes quite extensive shootouts with a large number of applications; I'm looking forward to seeing such a shootout with the new final ATI drivers that contain the OGL hotfix. That said, I'd say that if you draw an average, the 7800GT and X1800XL would most likely end up more or less on par, with the X1800XT taking a lead over the 7800GTX (something like 15% on average, maybe?).

What I'd personally love to do is a high image quality shootout between an X1800XT and a 512MB G70, if there's ever going to be one. Of course the former would be tested strictly by itself with 6xAAA and HQ AF enabled, given the lack of comparable modes from the other side.
 
Ailuros said:
Don't handpick results, it could end up misleading.
You can read the whole review here. All tested games except OpenGL titles (Doom3 + IL2, both tested without the fix) run equally on the X1800XL and 7800GTX at high resolutions with AA enabled. It's good verification (at least from my point of view) that R520's performance advantage with AA enabled comes mostly from improved core design and not, as Hellbinder claimed, from higher memory bandwidth (the GTX has a ~20% bandwidth advantage over the XL) or fillrate (the GTX has a ~30% fillrate advantage over the XL).
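For reference, those percentages fall out of the commonly quoted specs (my own arithmetic, not from the review): GTX 600MHz GDDR3 vs XL 500MHz on a 256-bit bus, and 24 pipes at 430MHz vs 16 at 500MHz.

```cpp
#include <cstdio>

int main()
{
    // Memory bandwidth: 256-bit bus = 32 bytes per transfer, GDDR3 transfers twice per clock.
    double gtxBandwidth = 32.0 * 2 * 600e6 / 1e9;  // ~38.4 GB/s (7800 GTX, 600MHz memory)
    double xlBandwidth  = 32.0 * 2 * 500e6 / 1e9;  // ~32.0 GB/s (X1800 XL, 500MHz memory)

    // Texel fillrate: pipelines * core clock.
    double gtxFill = 24 * 430e6 / 1e9;             // ~10.3 Gtexels/s (7800 GTX)
    double xlFill  = 16 * 500e6 / 1e9;             // ~8.0 Gtexels/s (X1800 XL)

    std::printf("GTX bandwidth advantage: %.0f%%\n", (gtxBandwidth / xlBandwidth - 1) * 100);  // ~20%
    std::printf("GTX fillrate advantage:  %.0f%%\n", (gtxFill / xlFill - 1) * 100);            // ~29%
}
```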
 
no-X said:
You can read the whole review here. All tested games except OpenGL titles (Doom3 + IL2, both tested without the fix) run equally on the X1800XL and 7800GTX at high resolutions with AA enabled. It's good verification (at least from my point of view) that R520's performance advantage with AA enabled comes mostly from improved core design and not, as Hellbinder claimed, from higher memory bandwidth (the GTX has a ~20% bandwidth advantage over the XL) or fillrate (the GTX has a ~30% fillrate advantage over the XL).

You used results from two different reviews and handpicked results from both; the highlight above verifies it. As I said, we need a full-blown shootout with the OGL fix and from as many games as possible, preferably with Q4, F.E.A.R. (final game), SS2 and whatnot results.
 
I mean games that are playable at HR (1600, 1920) + AA. F.E.A.R. and similar games are mostly not playable at these settings, so I wouldn't consider those results conclusive.

But I agree that an extensive article about HR gaming with a G70/R520 comparison (using the latest drivers) would be advisable :)
 
Chalnoth said:
Er, the X1k series can do more AA samples per clock, which may well explain these results with 4x AA and higher.
How? All current architectures do 2xAA per clock, don't they? And all of them max out at 16 ROPs.

Jawed
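Taking the 2-samples-per-ROP-per-clock figure at face value (a back-of-the-envelope sketch using the usual clock figures, not something from Dave's review), the raw AA sample rate difference is just ROP count times clock:

```cpp
#include <cstdio>

int main()
{
    // Assumption from the post above: every current part has 16 ROPs, each writing
    // 2 AA samples per clock, so a 4x MSAA pixel costs 2 ROP clocks and 6x costs 3.
    const int rops = 16, samplesPerClock = 2;

    double xtSamples  = rops * samplesPerClock * 625e6 / 1e9;  // X1800 XT @ 625MHz -> ~20.0 Gsamples/s
    double gtxSamples = rops * samplesPerClock * 430e6 / 1e9;  // 7800 GTX @ 430MHz -> ~13.8 Gsamples/s

    std::printf("X1800 XT AA fill: %.1f Gsamples/s\n", xtSamples);
    std::printf("7800 GTX AA fill: %.1f Gsamples/s\n", gtxSamples);
    // Any extra difference with AA enabled would then have to come from bandwidth or
    // the memory controller rather than per-clock sample rate.
}
```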
 
Jawed said:
How? All current architectures do 2xAA per clock, don't they? And all of them max out at 16 ROPs.

Jawed
Hrm, I thought I had read in Dave's review that the X1k series supported more AA samples per ROP, but I can't find it now, so I removed the comment.
 