My response to the latest HardOCP editorial on benchmarks...

I find it quite illustrative of how easily some websites are influenced by IHV PR, and I wonder how many of the...umm..."nonsensical" is the polite term...benchmarking evaluations (Lars is the one who raises the most alarm bells for me) have been influenced by correspondence that never went public. These articles seem to illustrate that it's very easy to do. :-?

For these articles on the "evils of 3DMark03", I wouldn't mind their conclusions so much if their reasoning demonstrated an actual rationale and accurate information, instead of directly parroting exactly what Nvidia spoon-fed them! :rolleyes:
 
Well, at least it's pointing out who is actually attempting to think for themselves and who is just parroting a particular line. In this respect this whole thing has been quite useful...
 
Demalion: Agreed.

I wonder if Beyond3D is going to post their own full analysis of 3DMark (beyond their introduction). I honestly don't know what their full evaluation of it is, but this site's reputation for quality, technical competence, and fairness may have a bigger impact on "the public's" view of 3DMark than the review sites...

I would also love to see B3D analyze and give concluding thoughts not only on 3DMark...but also on NVIDIA's PR....
 
All I can say is: how can so many people like Lars and Kyle allow themselves to be spoon-fed to this extent??? Lars especially takes the time to re-question the PS 1.4 support in his conclusion. *sigh*

This whole thing is just simply too depressing. They don't understand the technologies well enough to spot a fraud when they see it. I wonder what the outcome of all this will be? Will sites like [H] and Tom's really help kill 3DMark, simply because it has equal representation of competing technologies and Nvidia doesn't like it?

For crying out loud, 3DMark has a detailed white paper that would put to rest a lot of the mud they are slinging. It appears they refuse to read it.

I am now wondering what the hell is wrong with so many people. It's simply shameful. I wonder what it would take for Nvidia, or these websites, to turn on id Software?

If John Carmack suddenly said that he was not going to do an NV30 code path, and made them and everyone else run the ARB and ARB2 paths...

Would Nvidia suddenly send out massive PR statements everywhere? Would they say *we don't think Doom III is a game worth buying*? Would they say *Doom III is poorly coded because it does not support us the way we want*? Would they put out statements condemning the ARB?

My guess is they would. I don't think they have a shred of decency in their entire company. Or loyalty.
 
Dave,
In this respect this whole thing has been quite useful...

Don't you see the problem??? There are only a handful of people, mostly here at B3D and perhaps one or two other places, who understand the issues, or have people capable of explaining them.

I am reading just horrible things being posted about Futuremark all over the place, and about ATI as well, even on Futuremark's own forum. People like Kyle and Lars have a gigantic influence on people. So do GameSpot and a few other sites that are parroting the same thing.

None of these people are even attempting to break down Nvidia's statement and answer it point for point. In the end the majority of websites are going to follow Nvidia's lead, or they won't get review hardware. What do you think is going to happen?

I just can't understand why people are not as upset about this as I am. Forget ATI, forget Nvidia, forget Matrox. This kind of disdainful behavior from anyone should not be tolerated. How can we sit around and calmly talk it over while Futuremark is being systematically attacked and destroyed???

If the information were true, then there would be no problem. But these people are only offering one side of the story. Even worse, the side they are pushing contains perhaps 10% valid, correct information. And even that 10% is complete hypocrisy compared to what was being preached 72 hours ago.
 
This is the quote that really makes me want to puke. So 3DMark03 would have been a more realistic benchmark if Nvidia had provided their input? Yes, Nvidia is the be-all and end-all of the 3D hardware industry.
Nvidia was a member of the beta program for 16 of the 18 months of the development cycle.

They had no input???
 
Kyle is almost a politician: he is brash, has bad timing, and often doesn't justify his actions. But I do not think there is an NVIDIA / HardOCP conspiracy. This just might be another effect of his bad timing, or of an inability to explain things in a way that over-interpreters such as ourselves can understand.

If we cannot trust Kyle, can we at least trust Brent, who has proved his good character not only on these very boards, but also in the quality of the articles he puts forth on a website most of us would otherwise not go to?
 
I like Brent, but I hate that site, so I no longer go to it, and none of my friends go to it. That's all. It's all I can really do.
 
Crusher said:
First, you can say that my computer getting 30 3DMarks shows how useless it's going to be on DX9 games. The fact that it's going to be useless on DX9 games is probably true. However, that score is based only on Game Test 1, since that's the only test my computer can run. Game Test 1 is supposed to be a DX7 benchmark. Well, last time I checked, the GeForce 2 was one of the best DX7 cards around. I think the peak framerate I saw displayed was 8 fps, and it was usually below 3 fps. That's not indicative of any DX7 game I know. Even UT2003 and NOLF 2 run much better than that on my computer, and they're about the most demanding DX7 games I've seen.

That tells me that Game Test 1 is clearly not indicative of the type of situation it's meant to portray. If GT1 can't properly judge the abilities of a DX7 card running a DX7 game, how is it going to properly judge the abilities of a DX8 or DX9 card running a DX7 game? Now you might say, "who cares? there are DX7 games out there to test with if you want to know how well a card handles DX7 games. 3DMark03 is supposed to compare cards with DX8 and DX9 level games." That's all well and good, except that Game Test 1 still counts towards the final score on DX8 and DX9 cards. So here we have a clear example of how the final 3DMark03 score is based in part on an irrelevant and inaccurate test.
This is a common misconception about what the target of a future-looking benchmark like 3DMark03 is or should be. That is, a lot of people look at 3DMark03 and think that the four tests are supposed to represent:
  1. a game that might be released today
  2. a game that might be released ~6 months from now
  3. a game that might be released ~12 months from now
  4. a game that might be released ~18 months from now
I can't claim to speak for Futuremark, but IMO that's not their aim at all. Instead, these four tests are supposed to represent four games that might be released around 18 months from now. That's why making GT1 a flight simulator is a great choice: it's quite likely one of the genres most likely to still not be using pixel shaders in games released in late 2004. Note further that while Nvidia's market research may give "flight simulators" only 1% of the market, there are a number of other genres (e.g. outer-space RTS) that are likewise characterized by most of the screen being covered by a single-textured skybox.

The next thing you need to realize about GT1 is that it's not a DX7 benchmark. It is a DX8 benchmark with a DX7 fallback: that is, it uses VS 1.1, but if the GPU does not do hardware vertex shading, the functionality can be emulated on the CPU. Moreover, this is almost entirely to blame for your score of 30 with a GF2 and (IIRC) a Celeron 5xx@8xx; your Celery is based on the Mendocino core and does not support SSE, which helps a great deal with vertex shading calculations. For example, my phat gaming rig (1.2 GHz Thunderbird, GF2MX 400) scores a towering 130, despite a GPU with half the fillrate and bandwidth of a GF2 GTS. (Yeah, 130! Suck it down!! 8) )
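To make the fallback concrete, here's a minimal sketch (my illustration, not Futuremark's actual code) of how a DX8 application typically picks hardware or CPU vertex processing at device creation. The shader is identical either way; only where it runs changes, which is why the CPU and its SIMD support matter so much here:

[code]
// Illustrative only -- not Futuremark's code. A DX8 app can query the
// caps and fall back to CPU ("software") vertex processing when the GPU
// lacks hardware VS 1.1, which is exactly a GF2's situation in GT1.
#include <windows.h>
#include <d3d8.h>

IDirect3DDevice8* CreateDeviceWithVSFallback(IDirect3D8* d3d, HWND hwnd,
                                             D3DPRESENT_PARAMETERS* pp)
{
    D3DCAPS8 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // Prefer hardware VS 1.1; otherwise the D3D runtime runs the very
    // same vertex shader on the CPU, where SSE makes a big difference.
    DWORD behavior = (caps.VertexShaderVersion >= D3DVS_VERSION(1, 1))
                         ? D3DCREATE_HARDWARE_VERTEXPROCESSING
                         : D3DCREATE_SOFTWARE_VERTEXPROCESSING;

    IDirect3DDevice8* device = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                      behavior, pp, &device);
    return device;
}
[/code]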

The use of VS 1.1 instead of fixed-function T&L was quite deliberate, and is discussed in the White Paper. GT1 is as much a DX7 benchmark as DoomIII is a DX7 game, which is to say, not much.

In any case other than your poor SIMD-lacking Celeron (well, on the bright side, it has MMX), the real beef Nvidia has with GT1 is that it's mostly single-textured (again, extremely appropriate both for the flight simulator genre and for the "type of game likely to use a simple, pixel-shader-less engine in late 2004" category), which puts their current lineup at a huge disadvantage. Every Nvidia GPU since the GF1 has had either a 2x2 or 4x2 pipeline organization. Meanwhile, every ATI GPU in their current lineup (with the exception of the 8500/9100, which is being phased out) is either 4x1 or 8x1. So obviously a mostly single-textured benchmark is going to make ATI's cards look better.

The question is, who's "right"? Well, considering that every future GPU core with known specs is nx1 (including NV3x, of course), and that loopback has removed nearly all the reasons for organizations other than nx1, it's safe to say ATI's lineup is the more forward-looking in this regard. It's also interesting to note that all of these nx1 GPUs from ATI have been released in the time since Nvidia last released a GPU.
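Some quick back-of-the-envelope math makes the point (rough published clocks and configurations, purely to illustrate the arithmetic):

[code]
// Rough illustration of why pipeline organization matters for GT1.
// Clocks/configs are approximate published figures, not a benchmark.
#include <cstdio>

int main()
{
    const int gf4_pipes = 4,  gf4_clock = 300;  // GF4 Ti4600: 4x2 @ ~300 MHz
    const int r300_pipes = 8, r300_clock = 325; // R9700 Pro:  8x1 @ ~325 MHz

    // Single texturing: the second TMU on a 4x2 part sits idle, so only
    // pipeline count * clock matters.
    printf("single-tex: GF4 %d Mpix/s vs R300 %d Mpix/s\n",
           gf4_pipes * gf4_clock, r300_pipes * r300_clock);

    // Dual texturing: the 4x2 part applies both textures in one clock,
    // while the 8x1 part loops back and takes two clocks per pixel.
    printf("dual-tex:   GF4 %d Mpix/s vs R300 %d Mpix/s\n",
           gf4_pipes * gf4_clock, r300_pipes * r300_clock / 2);
    return 0;
}
[/code]

Dual-textured, the two are near parity (1200 vs. 1300 Mpix/s); single-textured, the 8x1 part has better than twice the pixel rate, which is precisely the GT1 situation.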
What good does it do you to know that one card is faster than another at rendering stencil shadows while calculating the geometry redundantly at intermediate steps in the rendering process, if no game is ever going to do that? Does it tell you how fast their vertex shaders are relative to one another? Or does it tell you how much each card stalls when redundant processing is thrown in at different points? Or does it say something completely different? The one thing you know it doesn't say is how fast Doom 3's vertex processing will be.
First off, as I understand it, DoomIII will "calculate the geometry" just as redundantly (i.e. on every pass: 1 + 2-per-light), in the sense of doing T&L on every pass. It's just that it does the skinning once per frame (on CPU) and sends the info over the AGP bus (on every pass?). (Or does DoomIII do all geometry calculations, including transform (I guess there's no per-vertex lighting), on the CPU, i.e. no use of hardware vertex shaders at all?)

Multipass inherently means redoing the geometry for every pass. That's why DoomIII has such a low poly count (per scene) relative to other high-end games.
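To spell out the pass structure I'm describing, here's a paraphrase of public descriptions of the DoomIII approach (the types and function names are hypothetical stand-ins, not id's code):

[code]
// Paraphrase of a DoomIII-style multipass frame -- illustration only.
#include <vector>

struct Light {};
struct Scene { std::vector<Light> lights; };

void SkinMeshesOnCPU(Scene&) {}                    // once per frame, on the CPU
void DrawAmbientDepthPass(Scene&) {}               // pass 1: depth + ambient
void DrawStencilShadowVolumes(Scene&, Light&) {}   // per light
void DrawLitSurfaces(Scene&, Light&) {}            // per light

void RenderFrame(Scene& scene)
{
    SkinMeshesOnCPU(scene); // skinning happens once...

    // ...but T&L runs again on every one of the 1 + 2-per-light passes,
    // which is why multipass keeps the per-scene polygon count low.
    DrawAmbientDepthPass(scene);
    for (size_t i = 0; i < scene.lights.size(); ++i)
    {
        DrawStencilShadowVolumes(scene, scene.lights[i]);
        DrawLitSurfaces(scene, scene.lights[i]);
    }
}
[/code]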

As for reskinning with every pass:

As OpenGL Guy has pointed out, and as Ichneumon's with/without-PS1.4 tests have shown, the Radeon 9700 Pro isn't vertex shader limited running GT2 or GT3. (PS 1.4 only gives a ~23% boost over PS 1.1 on the 9700P, despite almost 100% more geometry.) And indeed, there's no good reason I can see why any modern card (or even a year-old one like the GF4) should be vertex shader limited by having to skin what is likely no more than a few hundred thousand polygons a second. After all, only dynamic joints (as in skeletal animation) need to be skinned, and the GF4 Ti4600 only gets 12.6 and 10.5 fps in GT2 and GT3 respectively. Skinning, as I understand it (and as Futuremark claims), is a very light workload, and at such low fps the number of skinned vertices per second should not put a serious damper on any GPU honestly designed for more than a trivial application of vertex shaders.
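For reference, "skinning" here just means blending each vertex by a couple of bone matrices, along these lines (a generic sketch of the technique, not Futuremark's implementation or data layout):

[code]
// Generic two-bone matrix-palette skinning -- illustration only.
#include <cstddef>

struct Vec3 { float x, y, z; };
struct Mat4 { float m[4][4]; };

// Transform a point by a 4x4 matrix (w assumed to be 1).
static Vec3 Transform(const Mat4& m, const Vec3& v)
{
    return Vec3{
        m.m[0][0]*v.x + m.m[0][1]*v.y + m.m[0][2]*v.z + m.m[0][3],
        m.m[1][0]*v.x + m.m[1][1]*v.y + m.m[1][2]*v.z + m.m[1][3],
        m.m[2][0]*v.x + m.m[2][1]*v.y + m.m[2][2]*v.z + m.m[2][3] };
}

// A handful of multiply-adds per vertex: at ~10 fps and a few hundred
// thousand skinned vertices per second, this is a light workload.
void SkinVertices(const Vec3* rest, const int* bone0, const int* bone1,
                  const float* weight0, const Mat4* palette,
                  Vec3* out, size_t count)
{
    for (size_t i = 0; i < count; ++i)
    {
        Vec3 a = Transform(palette[bone0[i]], rest[i]);
        Vec3 b = Transform(palette[bone1[i]], rest[i]);
        float w = weight0[i];
        out[i] = Vec3{ w*a.x + (1-w)*b.x, w*a.y + (1-w)*b.y,
                       w*a.z + (1-w)*b.z };
    }
}
[/code]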
With the amount of programmability in DX9, there's a vastly greater number of ways of performing the same type of task. A hardware light is a hardware light in any program, and the T&L functionality is the same across all cards; the only thing that differs is the performance. Pixel shading version 1.1 was also fairly limited in what it allowed. Pixel shading 2.0, vertex shading 3.0, and the various ways of breaking them down to support cards with lesser versions are completely different situations. To take the specific way 3DMark03 implements these features and performs fallbacks for older cards, and say it's indicative of the overall ability of one card compared to another, is ridiculous.
I strongly suspect that this is incorrect; because shaders are run on general-purpose rather than fixed-function hardware, a GPU's relative performance on a certain set of PS 2.0 shaders should be a very good proxy for all PS 2.0 shaders with a similar length and instruction mix. (And it's a good bet that the shaders in 3DMark03 are at least similar to what real shaders will look like w.r.t. length and instruction mix.) However, I'm not an expert on this, so I'll defer to someone who can shed a bit more light on it.
And that brings up the other twist: the fact that the majority of game developers inherently try their best to make sure games perform acceptably on all makes of video cards. If a developer does things the way 3DMark03 does them and finds tremendous discrepancies in how cards from different vendors and/or different generations perform those actions, they will probably change the way they're doing things. You can argue the extent to which things will change, or the number of developers who will ultimately make such decisions, but that doesn't change the fact that it's a relevant variable.
This is an interesting point, and quite rightly made. It brings up an interesting dilemma in any purpose-made benchmark software. Benchmarks are supposed to be impartial among IHVs. And benchmarks are supposed to reflect real software performance--in this case, "real" (projected) games circa 2004 or so. The problem is that real games are not impartial among IHVs; they will tailor their engines to run well on whatever cards have the largest installed base, even if this is "unfair" in the sense that it disadvantages GPUs with dissimilar designs.

An interesting dilemma but, in this case, not a relevant one. As I've said, these tests are all supposed to represent various games released in the late 2004 time frame. And, as Nvidia has so helpfully pointed out, the decisions taken in 3DMark03 which most directly disadvantage Nvidia hardware are to make GT1 mostly single-textured and to use PS 1.4 (with PS 1.1 fallback) heavily in GT2 and GT3. Happily, starting with NV30 these decisions will no longer disadvantage Nvidia hardware in any way. All future Nvidia cores (certainly all cores released between now and late 2004) are likely to be nx1 designs, and certain to support PS 1.4. Come to think of it, all near-future discrete GPUs are likely to be nx1 and support PS 1.4.

Yes, games released in late 2004 will still be written to advantage Nvidia cards (assuming, as is likely, they maintain their high market share). But this will no longer mean avoiding PS 1.4 or single texturing. By that time, the GF4 Ti will be over 2.5 years old. Would you expect a game released next month to significantly modify its engine in order to cater to GF2 GTS owners? No? Then you shouldn't expect a late-2004 game to make the changes Nvidia suggests, either.
 
DemoCoder said:
Now that 2.0 exists, I don't have much desire to code for 1.4, and if it gets supported, it will only be because the HLSL compiler generates it automatically, and it fits within the specs of 1.4

Well, if your handle was "GameCoder", I might have thought some more of this, but as a demo coder, it's obvious that you'd prefer to use 2.0 exclusively. Game coders might also, but they have other considerations.
 
As for the overall discussion: what does this have to do with ATI vs. nVidia? I don't understand the connection between the bench being dropped and ATI's and nVidia's performance on it. Is this yet another "nVidia PR is evil incarnate" thread, or is this supposed to be a discussion about the merits of the bench in video card reviews? I thought it was the latter; if it's the former, could someone clarify so I can remove all of my comments and not be associated with such fannish spew?

Boddo-

If you want to get into a discussion about gaming statistics, I'm always game. If you want to break out the month-by-month spreadsheets showing game sales statistics, long-term trends of particular genres or subgenres, franchise power in the marketplace, buying habits of different types of gamers, whatever you want to get into is cool :)

- The number of players playing Half-Life online at any time is greater than the number of online players for Unreal Tournament, Quake 3, RTCW, SoF2, JKII, and UT2k3 combined.
- The best selling game series of all time is Myst, followed by The Sims.

Point one is true; point two, you have the order reversed, and even then it applies exclusively to PCs. Super Mario Bros. sold more than Myst, all of its sequels, The Sims, and all of its expansion packs combined. It is nice that you brought up The Sims and Half-Life, however. At any given point you are likely to find ~50K-150K people online playing HL (the vast majority in CS) out of the 2-3 million people that own the PC version/s of the game (and that is, by far, the highest ratio). The Sims has sold five to six times as many units as HL with no online experience built into the base game at all. The Sims Online, meanwhile, has fallen a staggering amount below industry expectations, not even managing to crack 100K online users as of a week or two ago. Online gaming is a very small factor in the greater PC gaming market. Of the top ten PC games for '02, only two were capable of playing online at all (WC3 and MOH:AA). Anecdotal evidence from a small niche of PC gaming doesn't change the broader PC gaming market at all, nor what it is looking for.

Tell that to the millions of gamers who got GeForce4MX440s. That'll sure run "Quake 4" real well. More likely, they bought their NV17 so that they could run Warcraft III - a rendering engine that is basically DX6.

You are talking about people who don't read reviews at all; obviously, review methodology will have no impact on them in the least.

Again, if you were an average gamer, you wouldn't worry about the games you were going to buy. A Quake3 bench wouldn't do you any good, and neither would 3DMark03... in fact, you'd be quite fine buying a Xabre. Half-Life and The Sims aren't going out of style anytime soon, and neither of them requires a videocard more powerful than a Voodoo2. If you do buy a videocard to run a game, it's probably for Warcraft III, the fastest-selling PC game ever. War3 runs fine on a GeForce2 MX; a GeForce4 MX is more than good enough.

What drives the gaming industry: the guys and gals who buy maybe six games a year, or the ones who buy twenty to fifty? The people who stay online playing one game all the time are likely to fail to even qualify for the first group, let alone the second. You have very casual gamers, those that rarely play or play only a few games, and it escalates from there. The gamers who rely on video card reviews the most are those who drop $1K-$2.5K a year on games.

The reason hardware sites are aimed at hardware enthusiasts, not "real" gamers, is that the average gamer doesn't give a rat's ass about hardware. And he has no reason to: the most graphically intense thing he'll ever run is 2-3 generations behind the cutting edge. The average gamer bought his trusty GeForce2 MX back in 2001, and it serves him faithfully today. He might get occasional slowdowns while playing Tower Defense UMS'es, but that's fine with him. If anything, the major cause of his gaming slowdowns is his 56k AOL connection.

Not even an appreciable minority of gamers play online, let alone the absurd notion that the majority do. Do gamers think about hardware the way pure hardware enthusiasts do? Of course not; they want what works. Does the average gamer care if the game looks better with new hardware? Yes. Does the average gamer care that the game runs faster with new hardware? Yes. Does the average gamer care that his hardware can run the game at all? Yes. Will Quake3 give the slightest hint of the disparity between a GF2 Ultra and an R9500 running Doom3? Not even close, nor does UT2K3; at least 3DMark2K3 gives them a general indicator.

Is it any wonder that game companies don't push new technologies as fast as they can? It'll be fall 2004 before Joe Average has a videocard capable of running UT2k3 at a decent framerate; it's no wonder Half-Life remains the superpower of PC gaming.

Because they upgrade rather infrequently. You brought up the GF4MX as a good example of a gamer's style of board. Looking at reviews, the GF4MX may look like it is within range of the DX8 boards in most of the benches shown, but at a lower price point. 3DMark blows this possible misconception right out of the water, and if they based their buying decision solely on 3DMark they would actually have made a far better long-term choice than by using every 'current' gaming bench combined.

The average gamer is completely irrelevant to benchmarking. There's not a single "average gamer" on all of Beyond3D. Not a single average gamer will ever pay $200 for a videocard upgrade, and they're very unlikely to even spend $100.

When did I say anything about "average" gamers? I'm talking about gamers who actually read video card reviews and use them to make purchasing decisions. If you want to say that there is no such thing, or that they are irrelevant, then fine; have sites like [H] drop the laughable notion that they are aimed at gamers.

Joe-

I can hardly wait for Anand's "analysis"....

Anand doesn't use the actual 3DMark score in any of his video card reviews, nor do I recall him ever having done so. He uses the synthetic tests, but doesn't report the 3DMark score or the individual game test scores.
 
Hellbinder[CE] said:
All I can say is: how can so many people like Lars and Kyle allow themselves to be spoon-fed to this extent??? Lars especially takes the time to re-question the PS 1.4 support in his conclusion. *sigh*

This whole thing is just simply too depressing. They don't understand the technologies well enough to spot a fraud when they see it. I wonder what the outcome of all this will be? Will sites like [H] and Tom's really help kill 3DMark, simply because it has equal representation of competing technologies and Nvidia doesn't like it?

For crying out loud, 3DMark has a detailed white paper that would put to rest a lot of the mud they are slinging. It appears they refuse to read it.

I am now wondering what the hell is wrong with so many people. It's simply shameful. I wonder what it would take for Nvidia, or these websites, to turn on id Software?

If John Carmack suddenly said that he was not going to do an NV30 code path, and made them and everyone else run the ARB and ARB2 paths...

Would Nvidia suddenly send out massive PR statements everywhere? Would they say *we don't think Doom III is a game worth buying*? Would they say *Doom III is poorly coded because it does not support us the way we want*? Would they put out statements condemning the ARB?

My guess is they would. I don't think they have a shred of decency in their entire company. Or loyalty.

I have no interest in bashing 3DMark03. But I also can't ignore the statements made by NVIDIA on the topic. That's why I wrote a column with my own opinion and thoughts about it, and not an "article". You can be sure that we'll see tons of articles popping up on the web using NVIDIA's point of view. But I wanted to give Futuremark the chance to respond. That's why I quoted NVIDIA's opinions, to show how they see it. I'm in very close contact with Futuremark right now and there will be an answer from their side. I never said that I fully agree with NVIDIA's opinion.

Yes, I think there are some issues in 3DMark and things that could have been done better (or more cleverly). Why didn't they make the tests like this: GT1: DX7, GT2: DX8.0 (PS 1.1/1.3), GT3: DX8.1 (PS 1.4), and GT4: DX9? Then NVIDIA wouldn't have had any grounds for criticism, since it's their own fault for not implementing PS 1.4 in their hardware.

As for my criticism of the image quality tests: yes, it's great that you can now use the exact same frame for comparison. But differences between ATI (jittered) and NV (ordered) sample patterns, for example, can be seen very clearly only under certain conditions. I did not mean they should create some simple line tests. You can also create a game scene with pylons and things like that where you can see the differences very clearly. Tests with transparent textures etc. would also be possible. 3DMark has a special test for texture filtering, so why not one for AA?
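To make concrete what jittered vs. ordered means here, a rough illustration (made-up sample offsets, not the actual hardware patterns):

[code]
// Illustrative 4x AA sample offsets -- NOT the exact hardware patterns.
// An ordered grid aligns samples with the pixel axes; a rotated/jittered
// pattern spreads them so near-vertical edges get finer coverage steps.
#include <cstdio>

struct Sample { float x, y; }; // subpixel offsets in [0,1)

const Sample ordered4[4] = {
    {0.25f, 0.25f}, {0.75f, 0.25f}, {0.25f, 0.75f}, {0.75f, 0.75f} };
const Sample rotated4[4] = {
    {0.375f, 0.125f}, {0.875f, 0.375f}, {0.125f, 0.625f}, {0.625f, 0.875f} };

int main()
{
    // Sweep a vertical edge across one pixel and count covered samples:
    // the ordered grid only ever produces 0, 2 or 4 covered samples,
    // while the rotated one hits 0,1,2,3,4 -- hence smoother gradations.
    for (float e = 0.0f; e <= 1.0f; e += 0.125f)
    {
        int covO = 0, covR = 0;
        for (int i = 0; i < 4; ++i) {
            if (ordered4[i].x < e) ++covO;
            if (rotated4[i].x < e) ++covR;
        }
        printf("edge at %.3f: ordered %d/4, rotated %d/4\n", e, covO, covR);
    }
    return 0;
}
[/code]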

Another point that was discussed here: sure, I'm not a shader coder. Neither is Tom or Kyle; otherwise we would be game developers and not press. But it's also the fault of game developers that NVIDIA has the chance to criticize 3DMark in this way. At NVIDIA's Dusk To Dawn event in London I tried to ask game developers what they think about the FX's and R300's shader implementations. You just don't get any answers ("I can't talk about that...")!! They are all in close contact with those companies while developing their apps. I didn't see any official statements where game developers talk about how well or badly shaders run on hardware X or Y. They only say "we ported our app to DX9", but not that they had to implement different codepaths for NV and ATI, and WHY! J.C. is the only one who speaks about it, but he's also very careful. I would be VERY pleased to get comments from developers on what they think about the R300's and FX's shader implementations, to make their opinions public (anonymously if they like).

That's why I posted that column. I wanted people to think about what's happening here. How can you create a "standard" benchmark to test shader performance when you won't see that kind of code in real games, since developers have to optimize the code in their apps for every chip to get optimal performance? NVIDIA gave us a new driver which increases FX performance by more than 50%! And they say: yes, we optimized this driver for 3DMark03. This makes 3DMark almost useless!! It's very clear that NVIDIA wants to destroy 3DMark with this. It's also very clear that NV wants to see the press doing that work for them. We often get "inside" information from manufacturer X or Y about issues in a competitor's hardware. At THG we are very careful with things like that. You can also think about who provided the whole Q3 ATI texture cheat information.... That's why I didn't post it at THG at first. OK, I did not expect it to have such a huge quality impact at first, so not writing about it was a mistake.

I wanted to start a discussion about 3DMark before the "bashing" begins. I received the NV document and thought about how I could get people to think about the whole situation. That's why I wrote a column with some thoughts and not a simple article showing that 3DMark is a "bad benchmark". Now Futuremark has the chance to respond to NV's reproaches.

I hope you see my point.

Lars - Tom's Hardware Guide
 
Borsti said:
Yes, I think there are some issues in 3DMark and things that could have been done better (or more cleverly). Why didn't they make the tests like this: GT1: DX7, GT2: DX8.0 (PS 1.1/1.3), GT3: DX8.1 (PS 1.4), and GT4: DX9? Then NVIDIA wouldn't have had any grounds for criticism, since it's their own fault for not implementing PS 1.4 in their hardware.

PS 1.3 is part of DX8.1, not 8.0. :)
PS 1.3 would introduce basically no tangible improvement over 1.1.

So why even bother?
Besides, look at the difference in the R300 results when forcing PS 1.1 support: it doesn't hurt the score very much at all. www.rage3d.com

PS 1.4 is supported in some games today and will be supported in even more later this year. (If you've heard anything about how many licenses of the Doom 3 engine have been sold, you know what I mean.)

The one thing I could agree on is GT1: single-textured fillrates are simply not very good at indicating real-world performance.
The only thing nVidia really has to be upset about is GT1, and quite frankly that only hurts their suck-ass GF4MX cards right now; those cards should be killed on sight anyway. ;)

Sorry, I didn't bother reading the rest of the replies, so just ignore me if someone has posted similar views before me (I assume that's the case hehe).
 
LARS,

That's why I wrote a column with my own opinion and thoughts about it, and not an "article".

The problem is, "your own opinion and thoughts" are currently tainted, because you only have one side of the story.

It would be much more prudent, even if you posted a quick story that nVidia sent you the PR material, to hold off on YOUR opinions until you get a response from at LEAST Futuremark on the matter. That way, you can consider BOTH sides of the story (from people who are admittedly more technical than yourself) before putting out opinions on the matter.

Also, did you bother to ask ATI or other IHVs for their opinion on the matter?
 
It's very clear that NVIDIA wants to destroy 3DMark with this. It's also very clear that NV wants to see the press doing that work for them.

And yet no one is really asking "why?" What benefit would nVidia receive from "destroying" 3DMark? And why is the "press" seemingly so willing to go along with it?
 
Joe DeFuria said:
It's very clear that NVIDIA wants to destroy 3DMark with this. It's also very clear that NV wants to see the press doing that work for them.

And yet no one is really asking "why?" What benefit would nVidia receive from "destroying" 3DMark? And why is the "press" seemingly so willing to go along with it?

Personally I think it's because they didn't really investigate what nVidia says.
nV's claims seem to make sense from a theoretical standpoint. Problem is, even if Futuremark did fix all those things nV is whining about, it wouldn't affect current scores very much at all, so why bother crying their asses off about this?

Seems strange to me.

They could also alienate themselves a bit from developers. I sure as hell wouldn't appreciate it if some manufacturer started pointing fingers at me if I were developing software.
Also, judging from the response in the community, most people just seem to think that this is one desperate move from a company upset about the poor launch of the FX...

In the long run I think acting out like this will hurt nV more than it will do them good.
 
Joe DeFuria said:
LARS,

That's why I wrote a column with my own opinion and thoughts about it, and not an "article".

Also, did you bother to ask ATI or other IHVs for their opinion on the matter?

You really think we'd get the correct answers on these issues if we asked without making the discussion public first?! Now there's the chance to get the right (interesting) answers from ATI and others as well. I don't say that my opinion won't change in the next days... that's why I posted a column and not an article.

About the "why"... just think about where NVIDIA is making their money:Mainstream! This means GF4 Ti 4200 / MX440 (or below). And those cards suddenly look very bad compared to R8500/9000 in 3DM. Real games can get optimized by NV devs. 3DM cannot!

@Doomtrooper
Sure, we are not at DX9 yet, but who cares? How many DX9 cards have been sold so far? Do you really expect many DX9 games any time soon? We have now reached the point where a huge number of DX8 cards are in gamers' hands. And the Xbox is important because many games will also be launched on the MS console. Apart from some "big" titles, it doesn't make sense to develop a game for PS 1.4 and then port it back to 1.1.
 