MSAA + HDR benchmarks?

overclocked said:
I am rather impressed with the numbers with 4xAA + HDR.
The R520 is a really interesting part from a tech POV; the only problem is that it's really late to the table.

One thing I wonder, Hellbinder, is why you praised ATI tech before and then made a U-turn. Did you lose money buying stock?

Yes, the numbers for MSAA, once the hit for HDR is factored in, are good. But the hit for HDR is SEVERE, and it's 10+ FPS slower than a GTX.
 
I still like ATi tech.

What I am not happy with is yet again the justification for not having a basic feature in SM3. Constantly following the leadership of Nvidia and then making excuses about how the market is not ready for feature "X" yet. The five-year-long complete ignoring of OpenGL performance that we are all suddenly supposed to forget because they finally did something for one game engine, with specific settings, on one series of cards that isn't even released yet. Not sticking with the design philosophy that made them great with the 9700, and instead building a part that is exactly like the 5800 they made fun of so much, etc. etc.

I am not a blind ******.
 
Hellbinder said:
I still like ATi tech.

What I am not happy with is yet again the justification for not having a basic feature in SM3. Constantly following the leadership of Nvidia and then making excuses about how the market is not ready for feature "X" yet.

What feature is that?

Not Sticking with the design philosophy that made them great with the 9700....

I don't see any change in design philosophy.

Design in features that, when used, will have a high possibility of actually having the performance to be, well, useful.

I am not a blind ******.

No, apparently, you are a vengeful ******. Apparently, you want ATI's design philosophy to be "vindicate everything I've prematurely hyped about their parts before they were released".

All IMO, of course.
 
This is totally OT, but the question was raised.

If you are going to have a product with insanely high power consumption and heat generation that takes up two slots, then you should get some payoff for it. This card should have been clocked at around 700 MHz and been a 16-1-3-1. Then it would have some seriously leading performance to justify its existence.

Losing all the non-AA benchmarks, winning the AA benchmarks because of the ~10 GB/s bandwidth advantage from the max-clocked RAM, losing most of the shader tests, still being clocked *almost* through the roof with all the things that go along with that, and not being SM3 feature-complete: that is the second coming of the NV30 series to anyone who pays attention and is not a blind ******.
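As a sanity check on those numbers (a back-of-the-envelope sketch, assuming the commonly quoted stock specs: 625 MHz core and 750 MHz / 1.5 GHz effective GDDR3 on the X1800 XT, and 600 MHz / 1.2 GHz effective memory on the 7800 GTX, both on 256-bit buses):

$$\text{bandwidth} = \frac{\text{effective memory clock} \times \text{bus width}}{8}$$

$$\text{XT: } \frac{1500\,\text{MHz} \times 256\,\text{bit}}{8} = 48\,\text{GB/s} \qquad \text{GTX: } \frac{1200\,\text{MHz} \times 256\,\text{bit}}{8} = 38.4\,\text{GB/s}$$

That is where the roughly 10 GB/s advantage comes from. Likewise, the hypothetical 700 MHz part above would raise peak pixel fillrate from $16 \times 625 = 10.0$ Gpixels/s to $16 \times 700 = 11.2$ Gpixels/s, about a 12% bump.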
 
Joe DeFuria said:
No, apparently, you are a vengeful ******. Apparently, you want ATI's design philosophy to be "vindicate everything I've prematurely hyped about their parts before they were released".

All IMO, of course.

Simply not true.
 
Hellbinder said:
IF you are going to have an insanely high power consumption and heat generation product that takes up 2 slots then you should get some payoff for it.

Hmmm... the fastest and most feature-rich card is not enough?

This card should have been clocked at around 700 MHz and been a 16-1-3-1. Then it would have some seriously leading performance to justify its existence.

Why don't we wait and see what nVidia can put out on 90 nm tech with 512 MB of the fastest RAM available before we start complaining about power consumption?

Losing all the non-AA benchmarks...

Simply false....

(EDIT: for example: http://www.firingsquad.com/hardware/quake_4_high-end_performance/page5.asp)

...winning the AA benchmarks because of the ~10 GB/s bandwidth advantage from the max-clocked RAM...

Which contributes to power consumption...you can't have it both ways.

...losing most of the shader tests, still being clocked *almost* through the roof with all the things that go along with that, and not being SM3 feature-complete: that is the second coming of the NV30 series to anyone who pays attention and is not a blind ******.

Anyone who could remotely compare the R5xx series to the NV3x series is just a blind idiot with some axe to grind.
 
I agree 120% with Joe DeFuria on HB.
Like OpenGL guy says, are you certain the Nvidia HDR method and the ATI method are comparable?
If they're not doing the same thing, it's not really fair to compare numbers.
You shouldn't compare apples to oranges.
Also, the Far Cry HDR method is a hack.
I think SC-CT is a much better way to measure HDR performance, and there the X1800 XL matches the 7800 GT and the XT beats the GTX.
 
Honestly, I'm beginning to think Nvidia was right about waiting for HDR + MSAA. I can't afford the X1800 XT, but even if I could, I sure as hell wouldn't want to run it at 1024x768 or less just to get MSAA with my HDR applications (if Far Cry is an accurate indication of anything). Hell... even the 1024x768 numbers are borderline playable.
 
Pete said:
The X1800s appear to lag behind the 7800s in Far Cry HDR performance. Dave's test system was pretty similar between the two reviews, with just the motherboard (and PSU) changing.

Except that they are not running the same version of Far Cry. Until you see the 7800 and X1800 XT running on the same release and have quality comparisons, it's a bit premature.

Dave Baumann said:
The Far Cry patch is a new, unreleased patch which alters the engine to allow ATI's HDR method; as such, results may not be comparable to previous versions, and we don't yet know whether the path will be doing the same thing from vendor to vendor.
 
bdmosky said:
Honestly, I'm beginning to think Nvidia was right about waiting for HDR + MSAA. I can't afford the X1800 XT, but even if I could, I sure as hell wouldn't want to run it at 1024x768 or less just to get MSAA with my HDR applications (if Far Cry is an accurate indication of anything). Hell... even the 1024x768 numbers are borderline playable.
Are you certain Far Cry is a good benchmark for HDR, given its add-on nature?
 
radeonic2 said:
I agree 120% with Joe DeFuria on HB.
Like OpenGL guy says, are you certain the Nvidia HDR method and the ATI method are comparable?
If they're not doing the same thing, it's not really fair to compare numbers.
You shouldn't compare apples to oranges.

Yes, that has to be investigated first, and maybe it's very raw and unoptimized in the first place.

Also, the Far Cry HDR method is a hack.
I think SC-CT is a much better way to measure HDR performance, and there the X1800 XL matches the 7800 GT and the XT beats the GTX.

But the visual difference between HDR and no HDR in Far Cry looks way bigger than in SC-CT (not saying Far Cry's HDR is superior).
 
bdmosky said:
Honestly, I'm beginning to think Nvidia was right about waiting for HDR + MSAA. I can't afford the X1800 XT, but even if I could, I sure as hell wouldn't want to run it at 1024x768 or less just to get MSAA with my HDR applications (if Far Cry is an accurate indication of anything). Hell... even the 1024x768 numbers are borderline playable.
Crossfire? :devilish:
 
Skinner said:
Yes, that has to be investigated first, and maybe it's very raw and unoptimized in the first place.

But the visual difference between HDR and no HDR in Far Cry looks way bigger than in SC-CT (not saying Far Cry's HDR is superior).
Far Cry's is much more in-your-face; it's way too much even at level 7 IMO, which is where the "community" likes it best.
It's not worth the FPS hit IMHO. :D
However, SC-CT is much too dark without HDR.
 
Hellbinder said:
Losing all the non-AA benchmarks, winning the AA benchmarks because of the ~10 GB/s bandwidth advantage from the max-clocked RAM
Really? So why is the 32 GB/s XL comparable with (or better than) the 38 GB/s GTX at high res + AA?

Hellbinder said:
losing most of the shader tests
...and winning all the shader-heavy games

:rolleyes:
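For reference on where those bandwidth figures come from (assuming the stock X1800 XL memory clock of 500 MHz, 1.0 GHz effective, on a 256-bit bus):

$$\text{XL: } \frac{1000\,\text{MHz} \times 256\,\text{bit}}{8} = 32\,\text{GB/s}$$

versus roughly 38.4 GB/s for the GTX, so the XL holds its own at high res + AA despite having less raw bandwidth, which is the point of the comparison.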
 
I can't find which HDR rendering level Dave used. 7?

Level 2 performs a lot better (on a 6800 GT), and looked better too IMO (less overdone).
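In case anyone wants to reproduce this: Far Cry's HDR intensity is selected from the in-game console. A minimal sketch, assuming the r_HDRRendering cvar name from the 1.3-era patches (worth double-checking against the patch notes):

\r_HDRRendering 7

where 0 turns HDR off and higher values (2, 7, 11) select progressively stronger variants of the effect.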
 
AlphaWolf said:
Except that they are not running the same version of Far Cry. Until you see the 7800 and X1800 XT running on the same release and have quality comparisons, it's a bit premature.
Thanks, Alpha, I overlooked that. Still, I hope the rendering differences aren't just ATI at FP32 and NV at FP16.

HB, isn't HDR still above and beyond SM3 spec? "SM3 done right" appears to refer solely to a low branching penalty (and this was hinted at back with Huddy's PDF note). Putting the HDR performance aside--which may be due to different paths, etc., as Alpha's quote and OGL's posts indicate--I'm not sure how HDR + AA isn't SM3 done right. It may not be fast enough for you at high res, but NV doesn't even offer it! And until we understand the reason why R520 takes a bigger hit with HDR than NV, at least in FC, then IMO all we have to go on is the performance hit of using AA with HDR. According to Dave's numbers, it's not that much higher than using it with other (non-HDR) titles.

Ugh, so much politicking. Why don't we figure out the technical reasons, if any, for the R520's apparently disappointing HDR performance before resorting to breaking out the f-word and weak historical comparisons?
 
radeonic2 said:
Far Cry's is much more in-your-face; it's way too much even at level 7 IMO, which is where the "community" likes it best.
It's not worth the FPS hit IMHO. :D
However, SC-CT is much too dark without HDR.


I agree it's way overdone and out of proportion in Far Cry, but that alone doesn't make it a lesser way to measure HDR performance. But I don't have the knowledge to evaluate that ;)

But was the 7800 GTX also a strong performer in Lost Coast?
 
Skinner said:
I agree it's way overdone and out of proportion in Far Cry, but that alone doesn't make it a lesser way to measure HDR performance. But I don't have the knowledge to evaluate that ;)

But was the 7800 GTX also a strong performer in Lost Coast?
Well, I think that since the engine wasn't designed for HDR out of the box, it isn't a good benchmark of typical HDR performance.
 
no-X said:
Really? So why is the 32 GB/s XL comparable with (or better than) the 38 GB/s GTX at high res + AA?

...and winning all the shader-heavy games

:rolleyes:

Overall, I don't think the XL is comparable to a GTX at any resolution... it loses rather decidedly to the GT, and the GTX beats it out pretty much without question.
 