Futuremark Announces Patch for 3DMark03

Update - Just had some information in from Futuremark: IQ differences from the fire, fog and hair effects should be discounted.
 
DaveBaumann said:
Update - Just had some information in from Futuremark: IQ differences from the fire, fog and hair effects should be discounted.
These effects are using random functions... right?
 
Yes, apparently FM aren't resetting them. So, the effect that Sascha was reporting from the guy on your forums was correct.
 
DaveBaumann said:
Yes, apparently FM aren't resetting them. So, the effect that Sascha was reporting from the guy on your forums was correct.
Hmmm. What does "resetting" mean? Are particle effects always, completely, random? Or do you have to reboot your system? o_O

Weird. Why would anyone implement an IQ test using scenes that heavily feature random elements? :?

93,
-Sascha.rb
 
It's actually a credit to nVidia that any remaining IQ inconsistencies are tiny and barely noticeable for the most part - all the major problems should now be "discounted".
 
PaulS said:
It's actually a credit to nVidia that any remaining IQ inconsistencies are tiny and barely noticeable for the most part - all the major problems should now be "discounted".

Agreed. And that'll make the guidelines even harder for FutureMark to apply, considering the IQ differences are so small. Sad, because I'd still prefer to get results like the ones v340 gives, instead of the ones v330 does.
Refusing to give the user this option, however, is to NVIDIA's dishonour.


Uttar
 
nggalai said:
DaveBaumann said:
Yes, apparently FM aren't resetting them. So, the effect that Sascha was reporting from the guy on your forums was correct.
Hmmm. What does "resetting" mean? Are particle effects always, completely, random? Or do you have to reboot your system? o_O

Weird. Why would anyone implement an IQ test using scenes that heavily feature random elements? :?

93,
-Sascha.rb

I agree... that seems like a rather large oversight. For a benchmark you'd figure they'd seed the generator with the same value for each run, or at least give you the *option* of doing multiple runs with the same seed value so you could get comparable output.
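As an aside, a minimal C++ sketch of the fixed-seed approach being suggested (purely illustrative - the names and the particle structure here are assumptions, not Futuremark's code): seed the generator with the same constant at the start of every run and the "random" fire/fog/hair effects become repeatable, so frame captures can be compared directly.

```cpp
// Hypothetical fixed-seed particle setup (illustrative only).
#include <cstddef>
#include <cstdint>
#include <random>
#include <vector>

struct Particle { float x, y, z, life; };

// Same seed in -> same particle positions out, run after run.
std::vector<Particle> spawn_particles(std::size_t count, std::uint32_t seed)
{
    std::mt19937 rng(seed);
    std::uniform_real_distribution<float> pos(-1.0f, 1.0f);
    std::uniform_real_distribution<float> life(0.5f, 2.0f);

    std::vector<Particle> particles(count);
    for (auto& p : particles)
        p = { pos(rng), pos(rng), pos(rng), life(rng) };
    return particles;
}

int main()
{
    // Reuse one constant seed per benchmark run: two runs now produce
    // identical particle data and therefore identical frames to compare.
    constexpr std::uint32_t kBenchmarkSeed = 0x3D03;
    auto run_a = spawn_particles(1024, kBenchmarkSeed);
    auto run_b = spawn_particles(1024, kBenchmarkSeed);
    return (run_a[0].x == run_b[0].x) ? 0 : 1;  // identical because the seed is fixed
}
```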
 
Reason #100 to leave the benchmark off your system....

I might add that hand-coding shaders for every application out there makes their compiler bloatware... games like Half-Life 2 are just one example.
 
Doomtrooper said:
Reason #100 to leave the benchmark off your system....

I might add that hand-coding shaders for every application out there makes their compiler bloatware... games like Half-Life 2 are just one example.

No it doesn't. I made a few posts regarding this on Page 16.
NVIDIA PR and Marketing adding illegal features to the otherwise legit "Unified Compiler" system means nothing more than that they're trying to mislead people by claiming:
a) Their automatic optimizations are better than they really are.
b) Their shader replacements are part of their Unified Compiler, which does not degrade IQ, according to NVIDIA's own public PDFs.

There is such a thing as a real, legit "Unified Compiler" in the Det50s, but it's sad NVIDIA tries to put "illegal" optimizations in the same category as the "legal" ones. Not a good move either IMO.


Uttar
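A hypothetical sketch of the distinction being drawn above - purely illustrative, not NVIDIA's driver code; the application names and the lookup mechanism are assumptions. A generic compiler pass works on whatever shader the application submits; app-specific replacement keys off which executable is running and substitutes a hand-written shader.

```cpp
// Illustrative only: generic optimization vs. app-specific replacement.
#include <string>
#include <unordered_map>

// Generic path: application-agnostic, operates on the submitted shader.
std::string compile_shader(const std::string& source)
{
    std::string optimized = source;
    // ... instruction scheduling, register allocation, dead-code removal ...
    return optimized;
}

// Replacement path: the controversial one. The driver recognizes a known
// application and returns a pre-written shader regardless of what the
// application actually asked for.
std::string compile_or_replace(const std::string& source,
                               const std::string& app_name)
{
    static const std::unordered_map<std::string, std::string> replacements = {
        { "3DMark03.exe", "/* hand-tuned replacement shader */" },
    };
    auto it = replacements.find(app_name);
    if (it != replacements.end())
        return it->second;           // app-specific substitution
    return compile_shader(source);   // legit, generic path
}
```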
 
There is such a thing as a real, legit "Unified Compiler" in the Det50s, but it's sad NVIDIA tries to put "illegal" optimizations in the same category as the "legal" ones. Not a good move either IMO.

I agree. And I don't want, in any way, to dismiss the good work Nvidia's driver team did on the 52.16 Detonators. And I don't think any poster here wants to dismiss any legal optimization because it's Nvidia doing it. But hand-coded shader replacement just because something is a widely used benchmark is not part of good compiler technology, IMHO. I just want Nvidia to stop the FUD.
 
CorwinB said:
And I don't think any poster here wants to dismiss any legal optimization because it's Nvidia doing it.

On the contrary, that's exactly what's happening with some people on the forums (not just here). I've said it before, and I'll say it again: This constant black & white, nVidia = evil, ATi = good, has got to stop.
 
I've said it before, and I'll say it again: This constant black & white, nVidia = evil, ATi = good, has got to stop.

Well, I agree somewhat to that also. First, there's no such thing as "evil" in this market. There are companies doing this to protect their marketshare. That some companies (*cough* Nvidia *cough*) are going waaaay over the top and trying to deceive the consumer, I don't think anyone in his/her right mind can deny that. But that's not "evil" for the sake of it. That's bad for the market, and it's not acceptable, but I don't believe people like Jen-Hsun, or even BB or the other PR guys, are thinking "Ah! Let's screw the consumers one more time by spewing mindless BS, that will teach them". Those guys are sincerely scared for their company/jobs, and are acting in a misguided way because of it. Does that have to stop? Yes, I do believe so. Does that make some kind of "higher level" moral point? No, I don't believe so.

Back on topic, I don't think even the strongest ATI supporters on this board (or Nvidia haters, but that somewhat amounts to "ATI supporter" nowadays, because Matrox will only carry you so far :p) would object to Nvidia genuinely improving its shader compiler technology and providing its end-users with better speed and identical IQ. People posting at B3D are generally far too fond of technology to go that far. Or so I hope. :)
 
Considering the last several pages have barely mentioned ATi at all except in reference to what they don't do regarding FM, and it's ALL been about the latest FM/nVidia revelations and all the various and sundry PR spewing out in their wake, I don't understand why some feel the need to imply a "black and white" comparison. Heck, this is pretty much just a continuation of the same complaining that went on before regarding nVidia and FutureMark, only with all-new information (and more PR and more denials and more finger-pointing...) to hash out! I've never owned an ATi card, yet I feel entirely justified in bashing nVidia for what I find to be craptacular stands. Imagine that. :rolleyes:
 
I don't have any problem with Nvidia improving their hardware, drivers, performance and behaviour. More good products in the marketplace benefit customers like me.

However I don't believe in patting Nvidia on the head for improving their compiler, when at the same time they are still hacking in app-specific shader replacement in order to give misleadingly high scores, and once more spreading FUD in order to try and hide it. When they are being honest in *all* they do, they can be congratulated for it. It just so happens that ATI has been setting a particularly good example in all the areas where Nvidia has been falling short.

You don't forget that a man robbed you just because he does one honest day's work, and neither should we forget what Nvidia has been doing for the last 18 months until they have earned it.
 
DaveBaumann said:
Update - Just had some information in from Futuremark: IQ differences from the fire, fog and hair effects should be discounted.

WTF!?? Are there similar conflicting results with the R300/R350 cores? What is the cause of the massive drop in the NV core scores, then? Even if the pseudo-random output (which is something I find strange, BTW) is perfectly fine, why do the scores drop so much?
 
PaulS said:
On the contrary, that's exactly what's happening with some people on the forums (not just here). I've said it before, and I'll say it again: This constant black & white, nVidia = evil, ATi = good, has got to stop.
You know what has got to stop? Take a look at the kind of misleading PR nonsense nVidia has released in just the last 48 hours. :!:
 
Sabastian said:
WTF!?? Are there similar conflicting results with the R300/R350 cores?

Two consecutive benchmark runs on a 9800 Pro, using the 340 patch both times:

[Image: anisohair.jpg - hair-effect captures from the two runs]


I didn't look at any of the other shaders in question, but this one pretty much proves that what FutureMark said is correct, and true of both ATi and nVidia boards.
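For anyone who wants to check this themselves, a minimal sketch of the verification: dump the same frame from two consecutive runs and count the bytes that differ. The file names and the raw-dump format are assumptions for illustration only.

```cpp
// Illustrative frame-diff check between two benchmark runs.
#include <cstddef>
#include <cstdio>
#include <fstream>
#include <iterator>
#include <vector>

int main()
{
    std::ifstream a("run1_frame.raw", std::ios::binary);
    std::ifstream b("run2_frame.raw", std::ios::binary);
    std::vector<char> fa((std::istreambuf_iterator<char>(a)),
                         std::istreambuf_iterator<char>());
    std::vector<char> fb((std::istreambuf_iterator<char>(b)),
                         std::istreambuf_iterator<char>());

    if (fa.empty() || fa.size() != fb.size()) {
        std::puts("missing capture or size mismatch");
        return 1;
    }

    std::size_t diff = 0;
    for (std::size_t i = 0; i < fa.size(); ++i)
        if (fa[i] != fb[i]) ++diff;

    // With unseeded random effects the count is large even on the same card
    // and driver; with a fixed seed it should be zero.
    std::printf("%zu of %zu bytes differ\n", diff, fa.size());
    return 0;
}
```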
 
CorwinB said:
I've said it before, and I'll say it again: This constant black & white, nVidia = evil, ATi = good, has got to stop.

Well, I agree somewhat to that also. First, there's no such thing as "evil" in this market. There are companies doing this to protect their marketshare. That some companies (*cough* Nvidia *cough*) are going waaaay over the top and trying to deceive the consumer, I don't think anyone in his/her right mind can deny that. But that's not "evil" for the sake of it.

You nicely exemplify a standpoint I find fascinating - that actions performed by individuals for the material gain of a corporation are devoid of moral implications, either for those individuals or for the corporations, whereas if the same actions were performed for purely personal gain, they would be regarded as morally unacceptable.

Seems strange to me, even though it fits nicely into the tradition that killing on command is fine, whereas killing of your own volition is unacceptable. The moral privileges that were once the domain of nations (for the greater common good, presumably) are now extended to corporations. Hmm.
 