Is PS 1.4 better than PS 1.3?

I may sound insane saying this, but I believe nVidia doesn't care about 3DMark 2003 using PS1.4.

Apparently you have not read any of their press statements, then? They dedicated whole paragraphs to this specific subject, completely and intentionally confusing the issue by equating PS 1.3 and PS 1.4, which has been proven (and which many people already knew) to be completely false.

The truth is, and they knew and know this: PS 1.3 simply does not benefit 99% of the kinds of effects pixel shaders are used for now, or will be used for in future games.
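To put the version gap in concrete terms (a minimal sketch of my own, assuming the DirectX 8.1 SDK headers, not code from any benchmark): an application can query at runtime whether the installed card exposes PS 1.4 at all, which is how a benchmark decides which shader path to run.

    // Minimal sketch: check whether the HAL device reports PS 1.4 support.
    // Assumes a DirectX 8.1 build environment; error handling trimmed.
    #include <windows.h>
    #include <d3d8.h>

    bool SupportsPS14(IDirect3D8* d3d)
    {
        D3DCAPS8 caps;
        if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
            return false;
        // PixelShaderVersion packs major/minor; D3DPS_VERSION(1, 4)
        // builds the value to compare against.
        return caps.PixelShaderVersion >= D3DPS_VERSION(1, 4);
    }

A GeForce 4 Ti reports 1.3 here, while a Radeon 8500 or 9700 reports 1.4 or higher, so a PS 1.4 test simply falls back or gets skipped on the former.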
 
Re: My god.

Uttar said:
Evildeus said:
binmaze said:
Thanks guys for enlightening me.

Then what in the world is all this fuss about the new 3DMark?
I see no unfair, invalid, or biased featuring of PS 1.4. Am I biased or what?
A hint: Nvidia, until the GFFX, doesn't support PS 1.4 ;)

I may sound insane saying this, but I believe nVidia doesn't care about 3DMark 2003 using PS1.4.

Well, when they release comments specifically criticizing ps 1.4, I hope you understand why people would get that impression.

In barely a few months, the whole GF4 Ti family will no longer be manufactured. The NV31 and NV34 are to replace the GF4 Ti and GF4 MX respectively.
So, if their GF4 looks worse than it actually is, and they soon have the best mainstream product out, that means people will be more interested in upgrading to their new products.

I think the problem is how their competitors' products look in comparison to theirs, now, and likely how that will continue to look with the NV31 and NV34 out.

And that's good for them.

If they didn't have any competitors with parts that look likely to be ahead of their own products...

I think that what nVidia doesn't like about 3DMark 2003 is the time they'll need to optimize their drivers for it (their driver team is already VERY busy with the NV30 optimizations & NV34 software shader system).

Hmm. Yeah, you think optimizing for one benchmark like they've done in the past is harder than optimizing for general usage? I think it is more logical that they are upset that shaders are harder to optimize for without producing noticeable graphical deficiencies, especially on their hardware. If they can't compete by optimizing, they have to compete on general hardware functionality, and this picture is not looking too rosy at the moment. I think this is why they wanted to control Cg...and if it isn't, trying to maintain final control of the specification of the language was absolutely the wrong way to go about getting it widely adopted as was their stated goal.
I think it is more reasonable to think these observable factors are related to their disapproval of 3dmark03 than to ignore them and try to rationalize a new one.

Also, they don't like that so few different shader programs are used. The NV30 gets a slight advantage when more shader programs are used, because they're stored in video memory (the R300, I think, has to send them through AGP, and thus with higher latency, every time).
Hmm...I don't see the basis for this conclusion?


Also, no 3DMark 2003 shader needs multiple passes on DX9 hardware. nVidia would have loved to have their instruction limits used a little more.


Uttar

Their instruction limit advantages compared to the DX 9 spec aren't practical for gaming...that is a bit TOO forward-looking. Also, I don't think avoiding multipass would give them a significant advantage with shader lengths that high in any case.
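To make the multipass point concrete (a sketch of my own with round numbers: the DX9 PS 2.0 baseline guarantees roughly 96 instruction slots, which is what D3DCAPS9's PS20Caps.NumInstructionSlots reports at minimum, and the 1024 figure for NV30 is the commonly cited extended limit):

    // Sketch: how many passes a shader of a given length needs on
    // hardware with a given instruction-slot limit (ceiling division).
    int PassesNeeded(int totalInstructions, int hwInstructionLimit)
    {
        return (totalInstructions + hwInstructionLimit - 1) / hwInstructionLimit;
    }

    // If every 3DMark03 shader fits within the ~96-slot PS 2.0 baseline:
    //   PassesNeeded(50, 96)   -> 1  (baseline DX9 part, e.g. R300)
    //   PassesNeeded(50, 1024) -> 1  (NV30's longer limit buys nothing)
    // The longer limit only pays off once shaders exceed the baseline.

In other words, a larger instruction budget is only an advantage on shaders long enough to force competitors into a second pass, and 3DMark03 doesn't contain any.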
 
Re: My god.

demalion said:
I think it is more logical that they are upset that shaders are harder to optimize for without producing noticeable graphical deficiencies, especially on their hardware. If they can't compete by optimizing, they have to compete on general hardware functionality, and this picture is not looking too rosy at the moment.

This is interesting. Can you be more specific?
 
Re: My god.

TheMightyPuck said:
demalion said:
I think it is more logical that they are upset that shaders are harder to optimize for without producing noticeable graphical deficiencies, especially on their hardware. If they can't compete by optimizing, they have to compete on general hardware functionality, and this picture is not looking too rosy at the moment.

This is interesting. Can you be more specific?

The ability to optimize for games presents a complex opportunity for "optimization". For my discussion, optimization can be thought of as either "invisible cheating" or "removing inefficiency". What is undesirable is when "benchmark specific" optimizations occur that are specifically of the "invisible cheating" variety and are exclusively applicable to the benchmark alone. "Invisible cheating" can be valid, IMO, if it is general and not intended to distort comparisons (think of hidden surface removal); targeting a program whose only function is to provide benchmark results is exactly that kind of distortion.

Many vendors have been guilty of this, and some vendors don't even have a firm grasp on the invisible bit (no, I wasn't thinking of ATI, but I'm sure Quack jokes are on the minds of many. :LOL: ), and this is the undesirable "focus" nvidia is suddenly decrying.

All well and good, but they are the ones who released "3dmark boosting" drivers at the 9700 launch that created lots of problems for many users (it looked very much like a benchmark-specific "invisible cheating" release). They also released "3dmark03"-boosting drivers, but my personal opinion (except for the issue Wavey brought up of their statements that 16-bit-per-component fp precision is enough to meet the DX 9 specification) is that these are very likely of the "removing inefficiency" variety (if their "DX 8" benchmark figures increased, the precision issue does not seem like it should apply to them, at least).
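(A tangent on that precision point: the gap between 16-bit components and the higher-precision formats DX 9 expects is easy to demonstrate. A crude sketch of my own, approximating fp16's 10-bit stored mantissa; QuantizeFp16 is a hypothetical helper, not a D3D call:)

    // Crude sketch: keep ~10 mantissa bits of a float, drop the rest,
    // to show the rounding behind the "16 bits per component" debate.
    #include <cmath>
    #include <cstdio>

    float QuantizeFp16(float v)
    {
        int exponent;
        float mantissa = std::frexp(v, &exponent);           // v = mantissa * 2^exponent
        mantissa = std::floor(mantissa * 1024.0f) / 1024.0f; // truncate to 10 bits
        return std::ldexp(mantissa, exponent);
    }

    int main()
    {
        // A coordinate addressing the last texel of a 2048-texel surface
        // no longer survives the round trip at this precision:
        float v = 2047.0f / 2048.0f;
        std::printf("%.8f -> %.8f\n", v, QuantizeFp16(v));
        return 0;
    }

In this crude model, values near 1.0 snap to steps of about 1/1024, which is exactly the kind of error that shows up once shaders start chaining dependent texture reads.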

For ATI, they seem to have learned since their rocky start with the 8500 to optimize for general cases, and for the longest time their 3dmark performance (on my system) has stayed the same, or worsened (for 3dmark 2001 I consider that a good thing :p), except when improvements were observed in other applications as well. The Catalyst 3.x series has also exhibited, to my mind, a very strong "removing inefficiency" trend.

Now, what strikes me about shaders is that it should be hard to perform "invisible cheating" optimizations...perhaps there are some very specific cheating opportunities, but they are rarely likely to be ones that can be done invisibly (which causes me to recall an odd thought that popped into my mind when viewing the "nvidia 3dmark03 whitepaper" pictures of shadow problems in 3dmark03...I suppose some examination of 9700 and 8500 screenshots is in order...).

I've looked forward to 3dmark03 as an improvement over 3dmark 2001 for a while, as I was mentioning last month, I think. This is because with a focus on shaders, the "invisible" part of "invisible cheating" seems less likely to succeed when targeting the benchmark alone, and therefore the optimizations that would impact the benchmark most would likely impact shaders in general too. In my view, with shaders, the GPU still has to do the work in the end (which is why I like the idea of time-to-completion results for the frame-based rendering mode). I am also pleased that reviewers have improved image quality tools at their disposal to investigate the benchmarks (it maps very closely to some of what I hoped for, except for online reference images and comparison tools).
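(The comparison tooling I'm wishing for is not exotic, by the way. A minimal sketch of my own, with all names hypothetical, assuming two same-sized RGBA8 screenshots already loaded into memory:)

    // Sketch: count pixels whose per-channel difference against a
    // reference render exceeds a threshold -- a crude version of the
    // image-quality diffing reviewers can do on benchmark output.
    #include <cstdlib>
    #include <cstddef>

    std::size_t CountDifferingPixels(const unsigned char* a,
                                     const unsigned char* b,
                                     std::size_t pixelCount,
                                     int threshold)
    {
        std::size_t differing = 0;
        for (std::size_t i = 0; i < pixelCount; ++i) {
            const unsigned char* pa = a + i * 4; // RGBA, 8 bits/channel
            const unsigned char* pb = b + i * 4;
            for (int c = 0; c < 4; ++c) {
                if (std::abs(pa[c] - pb[c]) > threshold) { ++differing; break; }
            }
        }
        return differing;
    }

Anything beyond a handful of differing pixels against a reference render would flag a driver "optimization" worth a closer look.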

What I think nvidia's comments reflect is that their hardware lineup is not up to competing as they'd like against their competitors, their driver efficiency lead is eroding, and without the ready opportunity to "optimize" around these shortcomings, it is a convenient time to denounce something people have complained about (even though, to my mind, they do this just when the nature :p of the benchmark's applicability has changed for the better), and to try to appear to be taking the moral high ground.

The misinformation and hypocrisy around this are rather staggering in my opinion, as is the stated reasoning they propose for why they are now taking the moral "high ground" in the "interests of consumers"; but this isn't to say the comments can be dismissed out of hand (3dmark 2001 certainly had those flaws, I think, and my opinion that 3dmark03 does not is open to dispute).
It also brings to mind yet again certain comments that were made about how we should expect "creative marketing" from nvidia with the launch of the GF FX. :-?
 