Catalyst AI

Things already suck. I find the performance of Doom3 on ATI cards awfully suspicious, considering development started on the 9700.

It is almost as if it was engineered to run better or worse depending on the card vendor.

Eh i dunno.

I think Doom3 is a case of hardware being very tuned for a certain game + shader replacements.

Look at the gains ATI made at driverheaven.net with the 4.10s and the AI. The X800 XT PE is only 8 fps behind the 6800 Ultra. 8 fps is not the end of the world. It's a good step forward.
 
Reverend said:
The solution to the problem of possible confusion is simple.

Read only Beyond3D reviews. We Never Get Things Wrong.

:)

You cannot get things wrong because you never compare different manufacturers.

Chickens!!!! /runs
 
rwolf said:
Things already suck. I find the performance of Doom3 on ATI cards awfully suspicious, considering development started on the 9700.

It did? JC had a 9700 to play around with 4 years ago?
No, seriously - IIRC JC stated years ago that D3 would be programmed around what's possible with GeForce1. Hardware requirements rose over the years, but still - development was heavily based on the nVidia architecture.
 
Tokelil said:
Mendel said:
Don't you mean it the other way around?

Look at the pipes; it's almost as if 4.10 didn't have gamma-corrected AA and 4.9 did, or 4.10 had 2xAA and 4.9 had 4x. Look just left of the soldier :oops:
No, take a look at those 3 images:
Cat. 4.9 No AA
Cat. 4.10 No AA
Cat. 4.10 4AA 8AF

Looking at pics 2 & 3 at the edge left of the soldier, the edge doesn't change much. It does between 1 & 2.

It looks like 4.10 runs Doom3 with 2xAA all the time.
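For anyone wondering what the "gamma corrected AA" difference mentioned above actually means numerically, here is a minimal sketch. It assumes a simple two-sample resolve and a 2.2 gamma curve purely for illustration; this is not ATI's actual resolve path.

```python
def naive_resolve(c1, c2):
    # average the stored (gamma-encoded) values directly
    return (c1 + c2) / 2

def gamma_correct_resolve(c1, c2, gamma=2.2):
    # decode to linear light, average, then re-encode
    linear = (c1 ** gamma + c2 ** gamma) / 2
    return linear ** (1 / gamma)

# A white pixel meeting a black one, e.g. a high-contrast edge:
print(naive_resolve(1.0, 0.0))          # 0.5
print(gamma_correct_resolve(1.0, 0.0))  # ~0.73, visibly brighter
```

The gamma-correct blend comes out noticeably brighter on bright-against-dark edges, which is exactly the kind of difference that shows up in side-by-side screenshots like the ones being compared here.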
 
rwolf said:
Things already suck. I find the performance of Doom3 on ATI cards awfully suspicious, considering development started on the 9700.

It is almost as if it was engineered to run better or worse depending on the card vendor.

JC had a 9700 Pro, but "Nvidia drivers are the gold standard" was his line. However, he failed to mention that he was running Nvidia at FP16 (not supported in DX9 until a later revision) while ATI was running at the precision required by DX9 (FP24).
 
YeuEmMaiMai said:
...he failed to mention that he was running Nvidia at FP16 (not supported in DX9 until a later revision) while ATI was running at the precision required by DX9 (FP24)
Which isn't relevant if one is developing an OpenGL renderer.
 
Actually, it is relevant hardware-wise, because one piece of hardware is doing FP16 internally while the other is doing FP24 internally, regardless of the choice of API. Go ask John exactly what he means by "NVIDIA drivers are the gold standard" wrt OGL -- is it stability or performance (and this is where this becomes relevant) or both? AFAIK he wasn't talking about performance at the time he said that now-trademarked phrase.
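To make the FP16-vs-FP24 point concrete, here's a small sketch of the same shader-style arithmetic carried out with every intermediate rounded to FP16 versus kept at full precision. FP24 has no standard software representation, so full precision stands in for "more bits" here, and the input values are made up, not taken from Doom3.

```python
import struct

def fp16(x):
    # round a float to the nearest IEEE 754 half-precision value
    # (struct format 'e' packs/unpacks binary16)
    return struct.unpack('e', struct.pack('e', x))[0]

def dot3(a, b, quantize=lambda x: x):
    # N.L-style dot product with an optional per-step precision clamp,
    # mimicking hardware that rounds after every operation
    acc = 0.0
    for ai, bi in zip(a, b):
        acc = quantize(acc + quantize(quantize(ai) * quantize(bi)))
    return acc

normal = [0.5773, 0.5773, 0.5773]   # illustrative unit-ish normal
light  = [0.1234, 0.9876, 0.0999]   # illustrative light direction

full = dot3(normal, light)
half = dot3(normal, light, quantize=fp16)
print(full, half, abs(full - half))
```

The two results agree only to a few decimal places; whether that drift is visible on screen is a separate question, but it's why "what precision is the hardware running at internally" matters independently of which API sits on top.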
 
YeuEmMaiMai's comments pertained to what DX9 required throughout its various revisions during development, and it was that to which I was responding.
 