The X1800s appear to lag behind the 7800s in Far Cry HDR performance. Dave's test system was pretty similar between the two reviews, with just the motherboard (and PSU) changing.
Far Cry HDR, fps (XT/GTX, XL/GT):
8x6:   91/77, 83/77
10x7:  78/77, 67/74
12x10: 52/67, 43/57
16x12: 36/47, 28/39
The 7800s are basically a res above the X1800s at the higher reses. It's interesting to see the 7800s capped (by their FP blend "fillrate"?*), whereas the X1800s scale more--or, rather, drop more. Is more memory controller (MC) tweaking in order?
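To make that scaling claim concrete, here's a quick Python sketch using only the fps figures from the table above, printing each card's percentage drop at every resolution step. It shows the GTX/GT sitting flat (capped) through 10x7 while the X1800s shed frames from the start:

```python
# Far Cry HDR fps from the table above: res -> (XT, GTX, XL, GT)
fps = {
    "8x6":   (91, 77, 83, 77),
    "10x7":  (78, 77, 67, 74),
    "12x10": (52, 67, 43, 57),
    "16x12": (36, 47, 28, 39),
}

cards = ("XT", "GTX", "XL", "GT")
rows = list(fps.items())

# Percentage drop per card at each resolution step
for (res_hi, hi), (res_lo, lo) in zip(rows, rows[1:]):
    drops = ", ".join(
        f"{card} -{100 * (a - b) / a:.0f}%" for card, a, b in zip(cards, hi, lo)
    )
    print(f"{res_hi} -> {res_lo}: {drops}")
```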
On the plus side, the X1800s don't seem to drop any further from HDR than they do when adding AA+AF to non-HDR Far Cry. Of course, that's judging only by the XL's 16x12 figures (with and without AA+AF, and with and without HDR), as both the XT and XL are capped at 100fps up to that res: -27% w/o HDR, -26% with.
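A throwaway sketch of that arithmetic. The non-HDR 16x12 figures aren't quoted here, so the only hard number is the XL's 28fps HDR result from the table; the AA+AF value below is just the quoted -26% hit applied to it:

```python
def pct_change(base, new):
    """Percent change going from `base` fps to `new` fps (negative = drop)."""
    return 100 * (new - base) / base

# XL at 16x12 with HDR is 28fps (table above); the quoted -26% AA+AF hit
# implies roughly 28 * 0.74 ~= 20.7fps with HDR + AA + AF enabled.
print(pct_change(28, 28 * 0.74))  # -26.0
```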
-----
* I'm not quite sure how to interpret this. IIRC, NV40+ can do half as many FP blends per clock as it has ROPs. I just realized R520 has the same number of ROPs as G70, and thus possibly the same blend/ROP ratio; only, R520 is clocked higher than G70: 400-430MHz for G70, 500-625MHz for R520. 90fps/77fps ~= 500MHz/430MHz, but that's the XL vs. the GTX, and I can't quite square the rest.

Kirk (tangentially) says FP rendering fillrate scales with bandwidth, but that doesn't square my guess that both parts can do 8 blends per clock with the reality that the XL scores higher than the GT (at low [non-ROP-limited?] res), despite (theoretically) having the same available bandwidth. Yes, the XL's core (and thus its ROPs?) is clocked higher than the GT's, but the GT is stuck at 77fps from 6x4 to 10x7, whereas the XL starts higher and loses fillrate throughout.

OTOH, I see from Dave's G70 preview that the GTX w/o HDR is locked at 81fps from 6x4 to 16x12, and up to 10x7 it only loses 4fps with HDR. Compare that with the XT, which takes a similarly slight hit from HDR immediately, yet a pronounced one starting at 10x7--one res before the GTX stumbles significantly.
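A back-of-the-envelope sketch of the blend-rate guess above. The 16-ROP count and half-rate FP blending are my assumptions from the footnote (IIRC-grade for NV40+, pure speculation for R520), not confirmed specs; the clocks are the ranges quoted above:

```python
# Rough theoretical FP16 blend rates, assuming 16 ROPs on both chips and
# half-rate FP blending (guesswork, per the footnote above).
ROPS = 16
BLENDS_PER_CLOCK = ROPS // 2  # 8 blends/clock under this assumption

clocks_mhz = {"7800 GTX": 430, "7800 GT": 400, "X1800 XT": 625, "X1800 XL": 500}

for card, mhz in clocks_mhz.items():
    print(f"{card}: {BLENDS_PER_CLOCK * mhz / 1000:.2f} Gblends/s")

# The ratio the footnote eyeballs: XL/GTX clocks vs. the low-res fps gap
print(f"clocks: {500 / 430:.2f}, fps: {90 / 77:.2f}")  # ~1.16 vs. ~1.17
```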
Where was I? Right, in the land of confusion.