R520 = Disappointment

trinibwoy said:
Thanks. Unfortunately, that just confuses things for me. In that BF2 bench where the XL > GTX, the GT and GTX are pretty close... why? Also, the GTX > XT without AA/AF in both FEAR and BF2.

I'm not sure if I missed anything in the Xbit Labs review, but I don't see any mention of how much RAM there is on each GPU. The X1800XT definitely has 512MB of RAM; what I'm unsure about is the X1800XL. If the latter has only 256MB of RAM, it could explain the GT and GTX being as close in BF2.

As for the XT pulling ahead in FEAR and BF2 with AA/AF enabled, it could be both the bigger framebuffer and the much higher memory bandwidth (48 vs. 38.4GB/s).
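
For reference, those two figures fall straight out of the memory clocks. Assuming the launch specs (750MHz GDDR3 on the XT, 600MHz on the GTX, both DDR, both on 256-bit buses):

$$\text{BW} = f_{\text{eff}} \times \tfrac{\text{bus width}}{8} \;\Rightarrow\; 1.5\,\text{GHz} \times 32\,\text{B} = 48\,\text{GB/s}, \quad 1.2\,\text{GHz} \times 32\,\text{B} = 38.4\,\text{GB/s}$$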

Now that I know HDR + AA is possible with the X1800XT, would I be able to play Splinter Cell: Chaos Theory with AA enabled (& HDR in SM3.0 mode, of course)? AOE3 is another game I've been anxiously waiting for, and it'd be great if I could enable both in it.

Yes.
 
Ailuros said:
I'm not sure if I missed anything in the Xbit Labs review, but I don't see any mention of how much RAM there is on each GPU. The X1800XT definitely has 512MB of RAM; what I'm unsure about is the X1800XL. If the latter has only 256MB of RAM, it could explain the GT and GTX being as close in BF2.

As for the XT pulling ahead in FEAR and BF2 with AA/AF enabled, it could be both the bigger framebuffer and the much higher memory bandwidth (48 vs. 38.4GB/s).
I'm pretty sure only a 256MB XL has been announced so far. I could be wrong, but I haven't seen a 512MB version mentioned in any of the previews. Many of the SKUs rumored days prior to the launch seem to have gone MIA.
 
Dave Baumann said:
Bear in mind that very much depends on what the game allows - as NVIDIA's hardware has been here for a long time and devs are only now starting to get X1000s, many titles may not opt to offer AA when HDR is enabled.

Far Cry is being patched (I now have a beta of that patch, but I'm not convinced it's changing AA depth) and Splinter Cell appears to do something, judging from the numbers; I'm just not sure what that something is.
 
Yeah, it should be absolutely trivial for current games that already use FP render targets to enable multisampling on them.
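
On the API side there really isn't much to it. Here's a minimal D3D9 sketch (the helper name and fallback policy are mine, not from any shipping game): ask the runtime whether the FP16 format can be multisampled, then allocate the render target accordingly:

```cpp
#include <d3d9.h>

// Hypothetical helper: try to allocate a 4xAA FP16 render target,
// falling back to a plain FP16 target where the combination is
// unsupported (as it is on NV4x/G70-class hardware).
IDirect3DSurface9* CreateHdrTarget(IDirect3D9* d3d, IDirect3DDevice9* dev,
                                   UINT width, UINT height)
{
    IDirect3DSurface9* rt = NULL;
    DWORD levels = 0;

    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_A16B16G16R16F,
        TRUE /* windowed */, D3DMULTISAMPLE_4_SAMPLES, &levels);

    if (SUCCEEDED(hr)) {
        // HDR + AA path: multisampled FP16 colour buffer, resolved
        // later with StretchRect before tone mapping.
        dev->CreateRenderTarget(width, height, D3DFMT_A16B16G16R16F,
                                D3DMULTISAMPLE_4_SAMPLES,
                                levels ? levels - 1 : 0,
                                FALSE /* not lockable */, &rt, NULL);
    } else {
        // HDR only: non-multisampled FP16 target.
        dev->CreateRenderTarget(width, height, D3DFMT_A16B16G16R16F,
                                D3DMULTISAMPLE_NONE, 0, FALSE, &rt, NULL);
    }
    return rt;
}
```

The query/fallback logic is already in most engines; a patch mostly has to stop hard-coding the "no AA while HDR is on" assumption.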
 
wireframe said:
I'm pretty sure only a 256MB XL has been announced so far. I could be wrong, but I haven't seen a 512MB version mentioned in any of the previews. Many of the SKUs rumored days prior to the launch seem to have gone MIA.

That actually supports my theory so far (though it doesn't mean I'm right either):

[Xbit Labs chart: FEAR with AA/AF ("eye candy" mode)]

http://www.xbitlabs.com/articles/video/display/radeon-x1800_9.html

7800GTX (256MB) = 31 fps
7800GT (256MB) = 30 fps
X1800XL (256MB) = 27 fps
..............
X1800XT (512MB) = 41 fps
 
Dave Baumann said:
Far Cry is being patched (I now have a beta of that patch, but I'm not convinced it's changing AA depth) and Splinter Cell appears to do something, judging from the numbers; I'm just not sure what that something is.

LOL @ "something". If YOU can't be sure whether it uses AA or not, then the engine most likely tries to enable it, but it's not actually happening. Future patches, then...
 
trinibwoy said:
Thanks for the link. The GTX link was for TSAA, but it looks like QAF on the X1800 should never be turned off!

Yes, but I want Wavey to tell us why it is so: what's happening under the covers to make it possible when it wasn't before? He's already sided with Demirug on the texture array not being the thing. So is it the scheduler? How, exactly? Something else?
 
Can someone explain to me what Doom3 is doing that makes it always run better on NV hardware?

I was expecting this new X1K architecture to handle Doom3 better than it does. (Not that the framerate is really a problem)

Are the ATI and Nvidia cards running the same code path?
 
rwolf said:
Can someone explain to me what Doom3 is doing that makes it always run better on NV hardware?
First, it's using OpenGL. Second, Nvidia has optimised their drivers, via shader detection and replacement, to perform best on it. Sadly, you weren't the only one slightly disappointed by the X1800 series' lack of performance improvement in Doom3-type games.
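
Mechanically, "shader detection and replacement" is nothing exotic. A purely illustrative sketch (all names hypothetical; real driver internals aren't public): the driver fingerprints each shader the application submits and, if it recognises one, silently substitutes a hand-tuned equivalent:

```cpp
#include <cstdint>
#include <map>
#include <string>

// FNV-1a: a cheap, well-known hash, used here to fingerprint shader source.
static uint32_t Fnv1a(const std::string& src)
{
    uint32_t h = 2166136261u;
    for (unsigned char c : src) { h ^= c; h *= 16777619u; }
    return h;
}

// Fingerprint of an app shader -> hand-optimised replacement. An entry
// might be, say, Doom3's interaction shader rescheduled to suit the
// vendor's ALU/texture balance.
static const std::map<uint32_t, std::string> g_replacements = {
    // { 0x12345678u, "!!ARBfp1.0 ... hand-tuned version ... END" },
};

std::string MaybeReplaceShader(const std::string& appShader)
{
    auto it = g_replacements.find(Fnv1a(appShader));
    return (it != g_replacements.end()) ? it->second : appShader;
}
```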
 
Jawed said:
That's at 1024x768. Do you think the AA is causing the game to overflow 256MB?

Jawed

I haven't tried the multiplayer demo yet, so I couldn't have known that it's limited to only 1024, unlike the single-player version.

Point taken, but that still doesn't explain why the X1800XL isn't faster in it than the 7800GT.
 
BRiT said:
First, it's using OpenGL. Second, Nvidia has optimised their drivers, via shader detection and replacement, to perform best on it. Sadly, you weren't the only one slightly disappointed by the X1800 series' lack of performance improvement in Doom3-type games.

It's a trend across OpenGL applications if you look closely at the recent Xbit Labs review.
 
Jawed said:
That's at 1024x768. Do you think the AA is causing the game to overflow 256MB? Jawed
No, it's the AA on top of the ridiculous texture memory requirements. The game hitches at 1024x768 at max quality even with no AA.
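
For scale: assuming 4xAA with 32-bit colour and 32-bit Z/stencil at 1024x768, the multisampled surfaces only come to about

$$1024 \times 768 \times 4\,\text{samples} \times (4+4)\,\text{B} \approx 25\,\text{MB},$$

so AA alone is nowhere near 256MB; it's that extra ~25MB landing on top of the texture load that tips the card into swapping.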
 
Ailuros said:
I haven't tried the multiplayer demo yet, so I couldn't have known that it's limited to only 1024, unlike the single-player version.

Point taken, but that still doesn't explain why the X1800XL isn't faster in it than the 7800GT.
No, the demo isn't limited to 1024 :oops:

Xbit Labs uses blue to indicate 1024, and sloppily didn't mark this on the page.

From other reviews it's apparent that FEAR is death to high-end GPUs.

Jawed
 
Prices are changing by the hour. Last nite Allstarshop had a retail ATI XL for $499, but it was down to $459 this afternoon. Last nite MonarchPC had an OEM Connect3D XL for $449, but it was down to $436 this afternoon, and is now up to $459. Laugh.

Availability is still about a week away, according to Monarch (ships 10/12).
 
Pete said:
Last nite MonarchPC had an OEM Connect3D XL for $449, but it was down to $436 this afternoon, and is now up to $459. Laugh.

Old codger that I am, I still shiver at ATI OEM parts until someone who has one in hand reports on the clocks/pipes/mem/etc.
 
Heh, that was a problem with their "value" parts, never their high-end ones (well, OK, their AIW line was also restored to its full glory starting with R300). I'll start shivering when we see X1800LEs. :)
 
Oh, your memory is going or you aren't going back far enuf. ;) The original Radeon OEM did not ship at the same clock as the retail box. But that was just off the top of my head.

But I'll readily agree that is mostly me "playing old tapes".
 