ATI engineering must be under a lot of strain. MS funding?

StealthHawk said:
When using hardware T&L, 3DMark2000 was slower than using software T&L. Although I think that was the exception, not the rule. It caused quite a ruckus back in the day though ;)
If I'm remembering correctly, the GF was slower in 3DMark's T&L test than then-top-end Athlons. I don't see how this can be construed as 3D deceleration, as the whole point is that the GF frees up the CPU to do other things--you're not going to get those T&L scores out of a CPU while playing a game.

I don't ever remember seeing a game that ran slower with hardware T&L on...then again there weren't many back in that time frame. Savage2000 was the true hardware TnL decelerator anyway...lower performance when "hardware TnL" was enabled than without :D
As for the S2K, I think S3 never got TnL working--something about defective hardware. My memory of those cards is vague, though, as I wasn't paying as much attention as I do now. :)
 
Qroach said:
... What I'm saying is that I don't think spong actually has an impact on the market, despite what CBS Marketwatch mentioned. Now if all the news media outlets picked up on Spong, then I think more people would have known about it and there would be a greater chance of the stock price being affected. That's my take on it; I'm not saying that I know for certain one way or the other.

How do you think CBS Marketwatch gets their trading wrap-up stories? Do you think they troll the Internet (and all other public sources of information) looking for news and rumors which they can then match up to unexplained movements in the day's stock trading? :rolleyes:

They get their info by talking to traders and finding out why they made the moves they did. The fact that at least some traders cited the spong report as the reason for NVDA's move that day means that it was a reason for the move. The fact that there was no other likely cause indicates that it likely was the reason for the move.

With one caveat: some traders could have been playing the same game Dave Baumann was playing here: they probably had a more reliable source for the same information than just reading it on spong, but pointed CBS Marketwatch to spong because that was the only place the info was being reported publicly.

Of course the reality is likely somewhere in between: a very few traders had reliable inside sources that tipped them to the deal, and they reacted accordingly. The market as a whole (or at least a wider portion of it), seeing both those movements from traders with inside sources and the article at spong, put two and two together and realized that, unlike most spong articles, this one seemed to be validated by those who are likely in the know; thus they gave it more credence than they normally would and traded accordingly. Which is how a small market shift turns into a big one.

But all of that is what CBS Marketwatch means when it cites something like an article at spong.com as the source for a stock's movement. They most certainly aren't just guessing.
 
RussSchultz said:
Joe, I agree with you, except the "s" on "developers". All we have is one data point. Once we get some more, the picture will be less fuzzy.

Well, I meant "developers" as in "developers of Tomb Raider." ;) (As in multiple individuals).

Of course, the more data points we have, the better. But as I alluded to above (not directed at you personally)... first it was "Synthetic tests don't count...bring on real games!" Then it's "well, OK, this real game doesn't count too much because it's ONE real game, and we don't know all the nitty gritty details about it...." I suppose next it will be "well, ignore these 10 games, but THIS one counts, because X-Y-Z..."
 
Chalnoth said:
The only problem is, you don't state the reasons that developers don't recommend it.

That's not a problem, Chalnoth. That's my point.

It doesn't really matter what the reason is. What matters is the result: DX9 features are not recommended for the 5200 series. Is one data point enough to declare a definitive answer on this? Of course not.
 
Joe DeFuria said:
I suppose next it will be "well, ignore these 10 games, but THIS one counts, because X-Y-Z..."
I already posted in regard to this.

I'll post again:
In order for a general trend to be apparent, it must be visible in at least three games, primarily designed for the PC, that make use of DX9-level shaders (OpenGL or Direct3D, but since I expect differences between the APIs, it would be better if all three used the same API).

If the three games do not show the same trend, then we will require more games to make any sort of definitive judgement about what most DX9 games will show.

The best judgement would likely be made from games that are made off of heavily-licensed engines. The upcoming DOOM3, therefore, will be a huge compass (particularly because the shaders are not just "extra fluff," they are integrated deeply into the rendering).
 
Chalnoth said:
...If the three games do not show the same trend, then we will require more games to make any sort of definitive judgement about what most DX9 games will show.

No one is making any definitive judgements here, Chalnoth. We are merely speculating (based on synthetic DX9 tests) that DX9 games and the 5200 just don't mix. And so far the evidence we do have supports that speculation.

The best judgement would likely be made from games that are made off of heavily-licensed engines. The upcoming DOOM3, therefore, will be a huge compass....

I'll repeat again what has been said by more than just myself.

Doom3 is not DX9 shader based, so I don't see how that is a compass for DX9 shaders at all. Doom3 is a compass for global use of DX7 functionality. (Cube-maps / stencil.)
 
Joe DeFuria said:
Doom3 is not DX9 shader based, so I don't see how that is a compass for DX9 shaders at all. Doom3 is a compass for global use of DX7 functionality. (Cube-maps / stencil.)
If you want to attempt to use that definition of a DX7/DX9 game, then you're going to have to wait at least two years before any DX9 games appear.

See, DOOM3 was designed to use DX7 hardware as a minimum. JC looked at the original GeForce, and asked what he could do on that hardware that could not be done before. The shadow technology and per-pixel lighting was the outcome of that.

No game will use DX9 hardware as its minimum for quite some time. Games using DX8 hardware as a minimum still seem a bit far off.
 
Chalnoth said:
If you want to attempt to use that definition of a DX7/DX9 game, then you're going to have to wait at least two years before any DX9 games appear.

Which goes back to my gripe about your original 3 definitions: what defines "significant DX9" titles?

See, DOOM3 was designed to use DX7 hardware as a minimum.

Correct. Not DX9 shaders. It uses DX8- and DX9-level coding paths to reduce passes and improve performance, but it doesn't really "do" anything that isn't possible with DX7 features.

No game will use DX9 hardware as its minimum for quite some time.

Nor did I say that a game must use DX9 as a minimum to "qualify as a game that uses significant DX9 tech."

I would qualify TR as one title that includes "DX9 significance." Half-Life 2 will probably be another.
 
Chalnoth said:
No game will use DX9 hardware as its minimum for quite some time. Games using DX8 hardware as a minimum still seem a bit far off.

Doesn't Half-Life 2 claim to use many DX9 shaders when the card supports it? I suspect that's when people will find out that mid/low-end Nvidia DX9 hardware simply won't be usable with the graphical niceties enabled.
 
IMHO, there are enough DX9 shader data points to start forming conclusions about the hardware.

1. Numerous synthetic benchmarks of shader performance.
2. One shipping game (Tomb Raider)
3. One soon-to-ship game (HL 2)

Number 1 could be disputed because they aren't games. All I can say is, "A shader is a shader is a shader," regardless of what program it's running in.

Number 3 could be disputed since it's not out yet. But I trust that the developers of HL 2 know what cards are best for it, and that they've recommended radeons for good reason.
 
With synthetic shader benchmarks, we certainly know how cards will perform under certain loads relative to one another.

It doesn't tell us what the general usage model for shaders will be, which is somewhat required to determine whether a card is acceptable for use.

It may be that the 5200 is so ubiquitous that developers have to target it (by reducing the overall shader load by shortening shaders or by lessening the number of pixels shaded).

Or it may be that it performs so poorly that developers skip PS 2.0 for the 5200 and make it use DX8 paths.

I don't know one way or another, do you?
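
Purely for illustration, here's a rough sketch of the kind of runtime path selection being described. The caps-check approach, the helper name, and the "force DX8" switch are my own assumptions, not something any particular developer has confirmed:

[code]
// Hedged sketch: pick a shader path based on Direct3D 9 device caps.
// A 5200 reports PS 2.0 support, so a pure caps check would choose the
// DX9 path; a developer worried about its PS 2.0 throughput might
// override that and force the DX8-level path anyway.
#include <d3d9.h>

enum ShaderPath { PATH_DX8, PATH_DX9 };

ShaderPath ChooseShaderPath(IDirect3DDevice9* device, bool forceDX8OnSlowParts)
{
    D3DCAPS9 caps;
    if (FAILED(device->GetDeviceCaps(&caps)))
        return PATH_DX8; // be conservative if the query fails

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0) && !forceDX8OnSlowParts)
        return PATH_DX9;

    return PATH_DX8;
}
[/code]

Whether the 5200 ends up on the DX9 path or gets pushed down to the DX8 one is exactly the open question here.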
 
Chalnoth said:
Joe DeFuria said:
Doom3 is not DX9 shader based, so I don't see how that is a compass for DX9 shaders at all. Doom3 is a compass for global use of DX7 functionality. (Cube-maps / stencil.)
If you want to attempt to use that definition of a DX7/DX9 game, then you're going to have to wait at least two years before any DX9 games appear.
Huh? Didn't I state earlier that TRAOD is a DX9 game? Did you know the game uses PS 2.0 shaders? Did you know some of these shaders are 48 instructions long? (There may be others that are longer, I'm just going from memory.)

What more does it take to be called a "DX9 game"?
No game will use DX9 hardware as its minimum for quite some time. Games using DX8 hardware as a minimum still seem a bit far off.
I see. So having a fall-back path for older chips makes a game "non-DX9"? I don't see the logic here.
 
Joe DeFuria said:
Chalnoth said:
Yeah, read that afterwards, but that's only by default. Shouldn't there be a way to turn the features on?

I don't know, but it's the "by default" that matters, IMO.

Said game "by default" comes up with a blank screen for the main menu (NV cards), or a completely garbled menu screen (ATI cards).

To make a working main menu, you have to turn off a feature (same one for both cards), which is like the 134th checkbox in the completely useless settings screen.

If one can do that, why not set this? :)
 
RussSchultz said:
It may be that the 5200 is so ubiquitous that developers have to target it (by reducing the overall shader load by shortening shaders or by lessening the number of pixels shaded).

Or it may be that it performs so poorly that developers skip PS 2.0 for the 5200 and make it use DX8 paths.

I suspect it will be the latter. Look at what Epic did with UT2003. They put in a lot of work to get it running on cards like the V5500, but it ran slowly, at low resolution, and without a lot of eye candy. Running the same game on an R3x0 card is like night and day. In fact, I was prompted to do a full system upgrade because, although the game ran, the performance and quality were simply not acceptable on lower-end cards.

Sure, developers will try to make it possible to run DX9 games on the 5200, but they won't look as good, run as fast, or run at the resolutions gamers have come to expect. It will be like looking at a great work of art through a keyhole while wearing dark glasses. That's not acceptable performance in my book.
 
ZoinKs! said:
2. One shipping game (Tomb Raider)

I'd bet the current build of HL2 is
- more playable
- less buggy
than TR AOD.

So much for the shipping game.

Edit: I wonder if either Chalnoth or Joe actually tried to play TR ...
 
Hyp-X said:
ZoinKs! said:
2. One shipping game (Tomb Raider)
I'd bet the current build of HL2 is
- more playable
- less buggy
than TR AOD.
Note that none of this has any bearing on whether the game is DX9 or not. Plus, it's pure speculation, because no one has seen HL2.
 
OpenGL guy said:
Huh? Didn't I state earlier that TRAOD is a DX9 game? Did you know the game uses PS 2.0 shaders? Did you know some of these shaders are 48 instructions long? (There may be others that are longer, I'm just going from memory.)
There are two ways to look at what constitutes a "DX9" game. Does it use DX9-level shaders for gimmicks? Or does it use DX9-level shaders as a fundamental element of the game?

I see nothing to suggest that the Tomb Raider effects are anything more than a gimmick. So, by one definition, I could easily say that Tomb Raider is not truly a DX9 game.

By another, if it just uses any shaders that can only be done properly in DX9 (either for precision or other reasons), then I could call it a DX9 game. Anyway, you can't use both definitions at the same time. You have to use one or the other.

This is what I was calling Joe on. It seemed to me he was attempting to apply a double standard to two different games. I say that DOOM3 is at least as much a DX9 game as the new Tomb Raider will be. But it all depends on the definition you're using at the time. I hope that I'll always be specific about which definition I'm attempting to use.
 
Chalnoth said:
OpenGL guy said:
Huh? Didn't I state earlier that TRAOD is a DX9 game? Did you know the game uses PS 2.0 shaders? Did you know some of these shaders are 48 instructions long? (There may be others that are longer, I'm just going from memory.)
There are two ways to look at what constitutes a "DX9" game. Does it use DX9-level shaders for gimmicks? Or does it use DX9-level shaders as a fundamental element of the game?

I see nothing to suggest that the Tomb Raider effects are anything more than a gimmick. So, by one definition, I could easily say that Tomb Raider is not truly a DX9 game.
No offense here, but what makes you the expert in making this determination?
By another, if it just uses any shaders that can only be done properly in DX9 (either for precision or other reasons), then I could call it a DX9 game. Anyway, you can't use both definitions at the same time. You have to use one or the other.
Did I not say that the application uses shaders that are 48 instructions? Please let me know how this could be done without PS 2.0.
This is what I was calling Joe on. It seemed to me he was attempting to apply a double standard to two different games. I say that DOOM3 is at least as much a DX9 game as the new Tomb Raider will be. But it all depends on the definition you're using at the time. I hope that I'll always be specific about which definition I'm attempting to use.
Let me give you a better way of determining whether something is "DX9" (i.e. PS 2.0 or better): ALU ops vs. TEX ops. If the ratio is 1:1 or close to it, then it's not really DX9, sound fair? Let's see, if TRAOD used the maximum number of textures supported by PS 2.0 (16) then it'd still have an ALU vs. TEX ratio of 3:1 for at least some shaders. However, since it's not using this many textures, the ratio would be much larger.

I'd say it passes the test.

P.S. Before someone goes and says, "Well, its shaders are not optimized for nvidia hardware," I'll go ahead and tell you that the shaders use the _pp modifier in many places.
P.P.S. As I said earlier, there's just no pleasing you, Chalnoth. Can't please a moving target.
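
Purely as an illustration of that ratio test (the instruction counts below are made up for the example, not actual TRAOD shader statistics), the heuristic amounts to something like:

[code]
// Hedged sketch of the ALU-vs-TEX ratio heuristic described above.
#include <cstdio>

struct ShaderStats {
    int aluOps; // arithmetic instructions (mul, mad, dp3, ...)
    int texOps; // texture instructions (texld, ...)
};

// A ratio near 1:1 looks like DX8-style multitexturing; a heavily
// ALU-dominated shader is the kind of workload PS 2.0 is aimed at.
bool LooksLikeDX9Workload(const ShaderStats& s)
{
    if (s.texOps == 0)
        return s.aluOps > 0;
    return s.aluOps >= 3 * s.texOps;
}

int main()
{
    ShaderStats dx8Style   = { 4, 4 };  // simple multitexture blend
    ShaderStats longShader = { 48, 4 }; // long arithmetic shader, few lookups

    std::printf("dx8Style: %d, longShader: %d\n",
                LooksLikeDX9Workload(dx8Style),
                LooksLikeDX9Workload(longShader));
    return 0;
}
[/code]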
 
Let me put it another way, OpenGL Guy. There are two ways to define a "DX9 game."

1. The game uses some effects that can only be seen at their best when using PS 2.0 or higher.

2. The game requires PS 2.0 or higher because the developers decided to incorporate the use of PS 2.0 into the game in a fundamental way.

Classifying Tomb Raider: Angel of Darkness as a "DirectX 9 game" clearly relies upon the first definition.

Saying that DOOM3 is not a "DirectX 9 game" clearly relies upon the second definition.

Using both definitions at the same time is a fallacy. One must use one or the other. Throughout this thread I have been working from the first definition. It just really got to me when Joe brought DOOM3 into the mix and applied the second definition at the same time.
 
Pete said:
StealthHawk said:
When using hardware T&L, 3DMark2000 was slower than using software T&L. Although I think that was the exception, not the rule. It caused quite a ruckus back in the day though ;)
If I'm remembering correctly, the GF was slower in 3DMark's T&L test than then-top-end Athlons. I don't see how this can be construed as 3D deceleration, as the whole point is that the GF frees up the CPU to do other things--you're not going to get those T&L scores out of a CPU while playing a game.

As Chalnoth stated, there was also that Driver 6 something game. I had forgotten about that, but that was something that made hardware T&L look like a joke.

I don't ever remember seeing a game that ran slower with hardware T&L on...then again there weren't many back in that time frame. Savage2000 was the true hardware TnL decelerator anyway...lower performance when "hardware TnL" was enabled than without :D
As for the S2K, I think S3 never got TnL working--something about defective hardware. My memory of those cards is vague, though, as I wasn't paying as much attention as I do now. :)

Well, that's just the point. There was an option in later drivers that enabled "hardware TnL," and games like Q3 ran slower with it "enabled." Which signifies that it was indeed broken ;)
 