3DMurk03 new cheats ?

I've not been around as long as some, but I've been around for a number of years at least. B3D isn't really in such bad shape as a lot of people seem to think. It's certainly not perfect, but I don't think it's actually much worse than it was back in the 3dfx vs nvidia days. I mean, we are still discussing interesting topics, doing research, learning from each other. It's the way this site has always been, and it still is now. Some complain that the signal-to-noise ratio is lower these days, but I really don't buy it. It's easy to think about things like the FSAA and anisotropic articles, the investigations and other interesting things about the old site, but there were a lot of flame-filled threads back then too. In 2-3 years we will probably be moaning about how great it was back in 2003, when we had all the interesting discussions about floating point precision modes, shaders, and investigations into cheating.

Really, instead of reminiscing about the past, get involved in the interesting conversations going on now.

Nite_Hawk
 
Well, let's end this, OK? It's not like it's useful in any way. I said what I said and I stand by it. Now, back to the topic of cheating. 3dm2003 has a major drawback that makes it the no. 1 target for optimization: it's used by every friggin' site and every forum poster (well, almost :)). And it's not really coded in a way that makes optimal use of what cards offer. While adding clipping planes is something that sucks, IMO, changing shaders that are "clumsy" with your own optimized ones is not that bad.
 
3dmark 03 is coded to the DX9 standard; the nVidia shader replacements ignored that standard. Requesting modifications within the DX standard might fit what you said... e.g., the NV35 could probably do without the clipping-plane cheats by using partial precision hints and FP16, except that adding a lower-precision output hint where it wasn't in the code is itself not following the standard.

However, and unfortunately, the NV35 isn't their only card, so your commentary doesn't apply very broadly until the other NV3x cards are replaced with cards more like the NV35. The picture 3dmark 03 paints of PS 2.0 performance (with detection methods defeated) is accurate, and that's all 3dmark 03 is trying to represent. That's it. It doesn't seem to be doing so poorly right now at all.

Allegations of "clumsiness" seem to be nVidia's marketing line for their hardware's inability to compete well on DirectX workloads, and a way to sell the idea of changing the workload when their competitors are not. Assuming your "downhill" evaluation is about a lack of technical discussion, perhaps you'd be better served by offering something beyond "And it's not really coded in a way that makes optimal use of what cards offer", and by not stating your opinions as fact (as in your earlier post), as the totality of your argumentative support when contributing to a thread. Perhaps revisit the discussion of the 3dmark/nVidia whitepaper article on the site if you feel it is warranted?

I also put it to you that complaints about an influx of Rage3D forum members and a general hostility towards nVidia are not very useful reasons for labelling the forums negatively, and are in fact flamebait. They seem pretty useless reasons to criticize the forums compared to the technical-discussion reason, unless you are proposing that people aren't entitled to form negative opinions about IHVs based on technical discussions and the actions IHVs take?
 
Testiculus Giganticus said:
While adding clipping planes is something that sucks, IMO, changing shaders that are "clumsy" with your own optimized ones is not that bad.
Nothing wrong with optimizing shaders... as long as the output is the same after optimization. This wasn't the case with nVidia's optimizations so I'd say that's bad.

-FUDie
 
Testiculus Giganticus said:
Well, let's end this, OK? It's not like it's useful in any way. I said what I said and I stand by it. Now, back to the topic of cheating. 3dm2003 has a major drawback that makes it the no. 1 target for optimization: it's used by every friggin' site and every forum poster (well, almost :)). And it's not really coded in a way that makes optimal use of what cards offer. While adding clipping planes is something that sucks, IMO, changing shaders that are "clumsy" with your own optimized ones is not that bad.
I think the main flaw is your so-called argument (I've seen no logic or reason) that 3dmark03 is "useless".
Why is it useless?
It offers a mix of technology to attempt to determine a card's strengths and weaknesses, and it does a pretty damn good job at it.
I've heard a lot of spouting about uselessness, and not one shred of reasoning that stands up to logic.
So you don't like the overall score? Why?
If you are gonna say it's useless, please say WHY, HOW, etc. - or it's just another case of parroting what you've been told.
 
Testiculus Giganticus said:
And [3dMark03's] not really coded in a way that makes optimal use of what cards offer.

To the extent that any of the PS 2.0 shaders in 3dMark03 (GT4 and PS2 tests) could have used FP16 without a loss of quality but didn't include the _pp hint, this is true. _pp is part of the PS 2.0 spec, and should be used whenever appropriate. NV3x offer (generally) faster performance in FP16 than FP32, and this should be taken advantage of where appropriate.
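As an aside, you can see the FP16 trade-off without any shader hardware: Python's struct module supports IEEE 754 half precision, which is essentially what a _pp-hinted FP16 register holds. A quick illustrative sketch (the sample values are mine, not taken from 3dMark03):

```python
import struct

def to_fp16(x):
    # Round-trip a value through IEEE 754 half precision -- roughly what
    # storing an intermediate result at _pp precision does to it.
    return struct.unpack('<e', struct.pack('<e', x))[0]

# Colour-range values in [0, 1]: the error is far below what an 8-bit
# framebuffer can even display (1/255 ~= 0.004), so _pp is harmless here.
c = 0.7
print(abs(to_fp16(c) - c))   # well under 0.001

# Large values (think texture coordinates across a big surface): FP16's
# absolute error grows with magnitude, so _pp can visibly break things.
t = 1024.7
print(abs(to_fp16(t) - t))   # roughly 0.3
```

Which is exactly why _pp is appropriate for colour arithmetic but dangerous for things like texture coordinates and positions.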

If, however, you're referring to the fact that NV30-34 offer much improved performance in FX12 mode, then your criticism is misguided. The problem there is not how 3dMark03 is coded, but rather that PS 2.0 is not really designed in a way that makes optimal use of what NV30-34 offer. This is either Microsoft's fault or Nvidia's; Futuremark has nothing to do with it.

IMO, if there is a sufficiently common class of shaders that does not suffer from low-precision artifacts with FX12 but cannot fit into the limitations of PS 1.4, then PS 2.0 should have included an integer precision hint, and thus the fact that it doesn't reflects poorly on Microsoft. If such a class of shaders doesn't really exist in any practical way, then Nvidia has poorly targeted NV30-34 for DX9-class pixel shaders (although just fine for DX8-class shaders). Any benchmark of DX9-class pixel shaders which did not reflect this would be deficient. (Luckily none of them seem to have much trouble...)
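To put a number on what an integer precision hint would have meant, here's a toy FX12-style quantizer in Python. The range and step (4096 levels over roughly [-2, 2)) are the figures commonly cited for NV30's FX12 registers; treat the exact layout as an assumption.

```python
def to_fx12(x):
    # Quantize to a 12-bit fixed-point value: 4096 steps over [-2, 2).
    # Uniform absolute precision everywhere in range, hard clamp outside it.
    step = 1.0 / 1024.0
    q = max(-2048, min(2047, round(x / step)))
    return q * step

print(to_fx12(0.5))     # exactly representable: 0.5
print(to_fx12(3.0))     # clamps to 2047/1024, just under 2.0
print(to_fx12(0.0001))  # rounds to 0.0 -- below the smallest step
```

The uniform absolute step is the whole story: ample for DX8-style colour math, but hopeless once values leave that narrow range - which is the practical test for whether the "sufficiently common class of shaders" exists.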
 
May I mumble incoherently? :)

All this integer vs. really-low-res floats vs. bottom-feeder single-precision floats is all about speed on one particular hardware implementation, right? The only reason anybody cares about using less than 32-bit or 24-bit is speed, or am I missing something? (Recalls some minor quibble about integer maybe having more precision on random Tuesdays when Halley's Comet is flying overhead.)

I would assume that the next chip from NVIDIA will be faster at 32-bit floats; that's their job, to make slower bits faster. This will probably happen before the year is out.

Considering all that, how evil is it of Futuremark, or by extension MS for creating the spec, to go with the precision that will most likely be the common denominator when PS 2.0+ shaders are actually used in games in a few years' time? It most likely won't be this year's cards running those games anyway, or they will deliver low- to mid-range performance.

NVIDIA has a habit of micro-optimizing too much, IMO. It's great so long as it doesn't require developer time to make it work, so I think they should keep it in their pants. :)
 
I'm not saying that FX12 should be used for PS 2.0 - it would be a major step in the wrong direction. BUT the _PP hint is in DX, and if it's there, why not use it, since I don't think the 3dm2003 shaders require 32-bit FP everywhere. I didn't bash these forums, and I don't intend to - it's not my place to do that. If it sounded like that, I apologize - I was merely expressing my sincere opinion. Why is 3dm2003 useless? Well, for two reasons: it's now only a bench of optimizations, of who can do more with less, and it's not a depiction of how a card will play a similar game engine, AFAIK and IMHO.
 
You can't deduce how a card will perform on game B from benchmarks of game A, so how is that a valid argument to discredit 3DM03's relevance? Hell, JK2 and RtCW are based on Quake 3, yet none of them perform the same.

As for it being a bench of optimizations, patch 330 solved that. And even if it does turn into that, if companies substitute optimized shaders (but NOT clip planes), then 3DM03 will end up being just like any other game.

Two bad arguments for irrelevance, IMO.
 
But you can deduce how a card will perform on game B from benchmarks of game B - that's the big idea - and no game is even remotely related to 3dm2003. And, as far as I can remember, JK2 and RtCW followed Quake 3 after a few years, no? And as to shader changing being a good thing, I think I already said that. BUT, that's not exactly the only thing that went on/goes on/will go on in 3dm. Trust me on this one ;)
 
Testiculus Giganticus said:
But you can deduce how a card will perform on game B from benchmarks of game B - that's the big idea - and no game is even remotely related to 3dm2003. And, as far as I can remember, JK2 and RtCW followed Quake 3 after a few years, no? And as to shader changing being a good thing, I think I already said that. BUT, that's not exactly the only thing that went on/goes on/will go on in 3dm. Trust me on this one ;)
According to your theory, all benchmarks are irrelevant.
So why even bench at all?
What you are saying is that benchmarks are only good for themselves - i.e., that only a Quake 3 benchmark will give an idea of Quake 3 performance.
The problem is, this idea of yours is simply CRAP.
It's untrue.
Look at performance on games, and compare to the 3dmark01 and 3dmark03 scores.
You'll find a pretty damn high correlation, with a few discrepancies.
Note that the correlation with 3dmark03 scores will RISE as games make more and more use of shaders - so future game performance can be predicted even more accurately.
I'm sorry, but try again.
 
Well, this will be the last post that I make in this thread, because it's tiring to be misunderstood at every step. I'm not saying that all benches are useless - god forbid I would dare commit the sacrilege - what I am saying is that game benchmarks are far more useful (guess what, you don't get to play 3dm2003). Anyhow, I wish you all an enlightening discussion and, people, CHILL - this is something like Aliens: flamethrowers everywhere :LOL:
 
Testiculus Giganticus said:
But you can deduce how a card will perform on game B from benchmarks of game B - that's the big idea - and no game is even remotely related to 3dm2003. And, as far as I can remember, JK2 and RtCW followed Quake 3 after a few years, no?

And do you know how much relation they bear to Quake 3? Very little, since JK2 and RtCW are totally CPU-bound at virtually all resolutions. To assume that any game will behave exactly the same just because it uses the same engine is foolish; each game has different properties from others using that engine and will stress the system in different ways.

And as to shader changing being a good thing, I think I already said that. BUT, that's not exactly the only thing that went on/goes on/will go on in 3dm. Trust me on this one ;)

I can guarantee you don't know the half of what's gone on with 3DM03.
 
Testiculus Giganticus said:
Well, this will be the last post that I make in this thread, because it's tiring to be misunderstood at every step. I'm not saying that all benches are useless - god forbid I would dare commit the sacrilege - what I am saying is that game benchmarks are far more useful (guess what, you don't get to play 3dm2003). Anyhow, I wish you all an enlightening discussion and, people, CHILL - this is something like Aliens: flamethrowers everywhere :LOL:


I think the point is that 3DMark03 is the ONLY DX9 test we have at the moment. There are NO games to use to bench DX9 capabilities. 3DMark03 is not meant to simulate a game, and not meant to be played. Duh! It is a battery of tests that test DX7, 8, and 9 functions. Is it going to tell you how XYZ game will run? No. But it will give some idea of the relative merits of the DX9 cards available right now. (That is, if the IHVs don't use cheating drivers to skew results and muddy the waters.)

No, we don't need to run out and buy DX9-capable cards right now - no games, remember - but we do anyway. To pick the best DX9 card at this point (still moot), we need a benchmark that gives us some clue to DX9 capabilities. 3DMark03 just tests DX9 functions as prescribed by the DX9 standard. If Nvidia's cards don't run worth shit on the bench, it is NOT Futuremark's fault but Nvidia's, for poor execution.

As for FP precision, IMHO, Nvidia screwed themselves when they had to one-up ATI with some more PR BS by using full 128-bit FP precision, instead of the 96-bit the DX9 standard calls for, as ATI did. Pure marketing hype on their part. Then, when it jumps up and bites them in the arse and they find out that full 128-bit precision is SLOW as hell, they come up with fallback settings that aren't really DX9-compliant. Less precision, less IQ... and no respect for their customers.
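For reference, the 96-bit and 128-bit figures are per four-component pixel: 4 x FP24 versus 4 x FP32. To get a feel for what FP24's shorter mantissa (16 bits, in the commonly cited s16e7 layout) costs versus FP32's 23 bits, here's a rough Python sketch; it ignores FP24's narrower exponent, so it's a mantissa-only illustration, not a faithful model of ATI's format.

```python
import struct

def truncate_mantissa(x, keep_bits):
    # Zero out the low (23 - keep_bits) bits of an FP32 mantissa,
    # crudely simulating a float with a shorter mantissa.
    bits = struct.unpack('<I', struct.pack('<f', x))[0]
    bits &= ~((1 << (23 - keep_bits)) - 1)
    return struct.unpack('<f', struct.pack('<I', bits))[0]

x = 1.2345678
fp24ish = truncate_mantissa(x, 16)  # FP24-style: 16 mantissa bits
print(abs(fp24ish - x))             # tiny: under 2e-5 for values near 1.0
```

An error bounded around 1e-5 near 1.0 is far below anything an 8-bit framebuffer can display, which is part of why FP24 qualifies as full precision under DX9.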
 
The very fact that 3DMark 2003 is NOT a game engine is what makes it so useful. Game benchmarks are useless for anything other than telling you how well the card is going to perform in that single game. An engine in one game has very little relation to the engine's use in the next game. 3DMark 2003 actually tells you what the card can do. Beyond that, it's up to the teams who write the game engines and graphics drivers.

3DMark 2003 is a graphics card test, not a system test, and not many people seem to be grasping that.
 
Quitch said:
The very fact that 3DMark 2003 is NOT a game engine is what makes it so useful. Game benchmarks are useless for anything other than telling you how well the card is going to perform in that single game. An engine in one game has very little relation to the engine's use in the next game. 3DMark 2003 actually tells you what the card can do. Beyond that, it's up to the teams who write the game engines and graphics drivers.

3DMark 2003 is a graphics card test, not a system test, and not many people seem to be grasping that.

I wonder about that. If it's just meant as a graphics test, why is there a CPU test and also sound tests? The one problem I've always had with 3DMark in general is that, for instance, in 3DMark2001 higher front-side bus speeds - and, ergo, memory bandwidth - were a big influence on scores. If it's a pure graphics bench, then why is it so sensitive to system variables? They should shitcan the game tests and just use the function tests to derive a score, or perhaps use all the tests for the aggregate score instead of just the game tests.
 
I wonder about that. If it's just meant as a graphics test, why is there a CPU test and also sound tests? The one problem I've always had with 3DMark in general is that, for instance, in 3DMark2001 higher front-side bus speeds - and, ergo, memory bandwidth - were a big influence on scores. If it's a pure graphics bench, then why is it so sensitive to system variables?

Sheesh... you complain that 3dmark2001 was too platform-dependent. Well, Futuremark solved that in 3dmark03 by making the game tests more GPU-limited and including a separate platform score (CPU tests). Then you complain about that. Can you be any more inconsistent?

Besides, what in the world is wrong with extra functionality? You can ignore the CPU and sound scores if you wish.
 
beyondhelp said:
I wonder about that. If it's just meant as a graphics test, why is there a CPU test and also sound tests? The one problem I've always had with 3DMark in general is that, for instance, in 3DMark2001 higher front-side bus speeds - and, ergo, memory bandwidth - were a big influence on scores. If it's a pure graphics bench, then why is it so sensitive to system variables? They should shitcan the game tests and just use the function tests to derive a score, or perhaps use all the tests for the aggregate score instead of just the game tests.
The answer is clear: video cards got faster. In fact, they got so fast that the rest of the system (CPU, RAM speed, AGP, etc.) became the bottleneck for many of the tests. I don't see this happening with 3DMark 2003 anytime soon, except that GT1, at least at default settings, is CPU-limited on faster video cards.
 
Bolloxoid said:
Sheesh... you complain that 3dmark2001 was too platform-dependent. Well, Futuremark solved that in 3dmark03 by making the game tests more GPU-limited and including a separate platform score (CPU tests). Then you complain about that. Can you be any more inconsistent?

Besides, what in the world is wrong with extra functionality? You can ignore the CPU and sound scores if you wish.


You missed the point of my comments. Quitch said "3DMark 2003 is a graphics card test, not a system test, and not many people seem to be grasping that."

I disagreed with that assertion and used 3DMark2001 for my argument. It most certainly has not in the past been a pure graphics benchmark, and his pontification that 3DMark03 is any different is what I'm taking issue with. Testing for graphics functionality doesn't need a CPU test or sound test.

WTF are YOU talking about? Where did you get the idea I was inconsistent? I essentially said I had reservations about calling 3DMark a graphics benchmark as opposed to a system bench, and that if it IS a GRAPHICS benchmark, then it doesn't need a CPU test or sound tests. I have specific benchmarks for those items.

You can stuff your extra functionality; I still have to sit through the CPU and sound performance tests - meaningless for graphics - whether I ignore them or not...
 
beyondhelp said:
You missed the point of my comments. Quitch said "3DMark 2003 is a graphics card test, not a system test, and not many people seem to be grasping that."

I disagreed with that assertion and used 3DMark2001 for my argument. It most certainly has not in the past been a pure graphics benchmark, and his pontification that 3DMark03 is any different is what I'm taking issue with. Testing for graphics functionality doesn't need a CPU test or sound test.
But the sound and CPU tests have no bearing on how many 3DMarks you get, except that a slow CPU may hurt your scores because you can't render stuff as fast.
You can stuff your extra functionality; I still have to sit through the CPU and sound performance tests - meaningless for graphics - whether I ignore them or not...
If you pay for it, you don't. Or, you can go get some beer (or a "bevvy" as Dio would say) while the tests are running.
 