OpenGL guy said: Tomb Raider: Angel of Darkness is DX9.

Fine. But one is almost as pointless as zero. A sample size of one means that this game tells us next to nothing.
Bouncing Zabaglione Bros. said: I don't agree with ATI renaming the 8500 core as a 9000 series card. I think that is misleading. However, AFAIK, they are not saying that it is a DX9 card...

ATI first launched their naming scheme to mesh with DirectX. Remember?
Chalnoth said (quoting OpenGL guy's "Tomb Raider: Angel of Darkness is DX9"): Fine. But one is almost as pointless as zero. A sample size of one means that this game tells us next to nothing.

First you say there are no DX9 games. I point out there is. Now you say that "one is almost as pointless as zero". Just can't make you happy.
Heathen said (quoting Chalnoth's "one is almost as pointless as zero... tells us next to nothing"): Along with all the DX9 benchmarks out there it tells us quite a lot actually, you just don't want to see it. Tell you what though, let's turn this question around: how many DX9 titles would it take to convince you either way?

Let's see quite a few, so we can:

1. Have a definition. It could mean a few different things. For the purposes of this thread I'll make it simple: games that make significant use of DX9-generation shaders (which will include, for example, DOOM3).

3. If a pattern does emerge, alternative explanations must first be ruled out before making a definitive conclusion about the trend of "DX9 games." One example would be lack of focus on the PC version of the game, in the context of games developed for both a console and the PC. Another issue may be just a low-budget, quick game.

Regardless, I do have to say one thing. If there is one DX9 game, while it means close to nothing, it's not quite nothing. How does the 5200 do in the latest Tomb Raider?
DaveBaumann said: Just an FYI: the DX9 sample being discussed at the moment chooses to disable all DX9 features by default for the 5200 and runs it as it would a 9000/9200 class board (PS1.4).

Which DX9 sample are we talking about? Tomb Raider?
RussSchultz said: Dave suggests they turn off the DX9 features if it's a 5200. Though I can't fathom why they'd use PS1.4 as a fallback, which is just as slow on the 5200... (or so I thought the data showed with the 5800, so I'm extrapolating a bit)

Yeah, read that afterwards, but that's only by default. Shouldn't there be a way to turn the features on?
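For what it's worth, here is a minimal sketch of how a title could pick its shader path at startup under D3D9: read the device caps, then special-case a board like the 5200 down to the PS1.4 path even though its caps report PS2.0 support. The detection logic below (vendor ID plus a string match on the adapter description) is purely hypothetical, not anything taken from Tomb Raider's code:

[code]
// Minimal sketch of caps/device-based shader path selection; hypothetical, not the game's code.
#include <d3d9.h>
#include <string.h>

enum ShaderPath { PATH_PS11, PATH_PS14, PATH_DX9 };

ShaderPath ChooseShaderPath(IDirect3D9* d3d)   // d3d from Direct3DCreate9(D3D_SDK_VERSION)
{
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    D3DADAPTER_IDENTIFIER9 id;
    d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);

    // Hypothetical special case: treat a GeForce FX 5200 like a 9000/9200-class
    // board even though its caps advertise PS2.0 support.
    bool isLowEndDx9Part = (id.VendorId == 0x10DE) &&               // NVIDIA
                           (strstr(id.Description, "5200") != NULL);

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0) && !isLowEndDx9Part)
        return PATH_DX9;   // full DX9 effects (depth of field, etc.)
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        return PATH_PS14;  // Radeon 8500/9000/9200-class fallback
    return PATH_PS11;      // DX8-class hardware
}
[/code]

In a setup like that, turning the DX9 features back on would just mean exposing a config switch that overrides the special case, which is presumably what people are asking for here.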
DaveBaumann said: I think the DX8 shaders may be simpler in operation than the DX9 ones.

I don't doubt that, but from my understanding of the benchmarks run to date, PS1.1 is considerably faster on the FX hardware than PS1.4.
Chalnoth said: For example, is more processing done with DX9 effects enabled? Or are the same effects done for higher performance?

Chalnoth said: Yeah, read that afterwards, but that's only by default. Shouldn't there be a way to turn the features on?

Regardless, I really don't like people using Tomb Raider in these comparisons of shader performance without any knowledge about how Tomb Raider implements the various technologies.
Joe DeFuria said: I don't know, but it's the "by default" that matters, IMO. That is, performance is slow enough that the developers don't recommend it. That's important for two reasons:

The only problem is, you don't state the reasons that developers don't recommend it. It could be a number of things. After reading the link Dave posted, I'm inclined to think that it's due to the driver automatically dropping to integer precision when applying the DOF effect, which screws that effect up. Again, I think Microsoft should have implemented an integer type so that nVidia was not forced to go the route of not following the spec for reasonable performance under DX9.
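As an aside on the precision point: ps_2_0 does have a sanctioned way to ask for reduced precision, the partial-precision hint, which a driver is free to honour with FP16 on FX hardware; what it does not have is any integer format, so dropping to integer is the driver going off-spec on its own. A rough illustration using the D3DX compiler flag (the shader source and entry point below are placeholders, nothing to do with Tomb Raider's actual DOF shader):

[code]
// Sketch: compiling the same ps_2_0 shader with and without the partial-precision
// hint. The hint lets the driver use lower precision (e.g. FP16) where it sees fit;
// DX9 defines no integer precision level below that.
#include <d3dx9.h>
#include <string.h>

// Placeholder pixel shader source, not the game's DOF shader.
const char* g_src =
    "float4 main(float2 uv : TEXCOORD0) : COLOR { return float4(uv, 0, 1); }";

LPD3DXBUFFER CompileEffect(bool partialPrecision)
{
    LPD3DXBUFFER code = NULL;
    LPD3DXBUFFER errors = NULL;
    DWORD flags = partialPrecision ? D3DXSHADER_PARTIALPRECISION : 0;

    D3DXCompileShader(g_src, (UINT)strlen(g_src),
                      NULL, NULL,        // no macros, no include handler
                      "main", "ps_2_0",  // entry point, target profile
                      flags, &code, &errors, NULL);

    if (errors) errors->Release();
    return code;                         // caller owns the compiled bytecode
}
[/code]

Whether the partial-precision path is fast enough on a 5200 to make the DOF effect usable is exactly the open question in this thread.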