What do you guys think of this Anand quote?

Dean

"Speaking of video card stuff, we're hoping to have the Half Life 2 benchmark soon to do an updated comparison with NVIDIA's latest drivers. I'm hearing that the latest Detonators NVIDIA is working on fix the issues that Gabe outlined in his list of complaints and improves performance significantly. It seems like the 5900 Ultra is now slightly slower than the Radeon 9800 Pro with AA/Aniso disabled, and slightly faster with it enabled; I'm hoping to see for myself shortly. I haven't heard anything about how the 5600/5200 perform with the latest drivers, but I'm guessing the performance isn't as competitive. "

Read the rest here: http://www.anandtech.com/weblog/index.html

If Nvidia can pull that huge rabbit out of a hat without butchering image quality, it would be a miracle.
 
Given the performance disparities in the HL2 rendering paths, and the types of performance gains shown under ShaderMark 2.0 with the 52.xx's, my guess would be that he's talking about the standard path for ATI but the mixed mode for NV3x.
 
Yup, I was thinking the same. Just make HEAVY use of low-precision calculations and throw in some "automatic shader optimizer" shaders throughout the scene, and you can gain a lot more frames, at the cost of image quality of course.
 
Going to partial precision doesn't necessarily mean sacrificing image quality.

Adding 2+2 gets you 4 in 8, 16, 32, 64, or 128 bits. If your shaders don't need the extra precision, you're not gaining anything by having it.
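That point is easy to sketch in a few lines of Python, using the struct module to round values down to IEEE half and single precision (the helper names here are mine, purely for illustration):

```python
import struct

def to_fp16(x: float) -> float:
    # Round-trip through IEEE 754 half precision (struct format 'e').
    return struct.unpack('<e', struct.pack('<e', x))[0]

def to_fp32(x: float) -> float:
    # Round-trip through IEEE 754 single precision.
    return struct.unpack('<f', struct.pack('<f', x))[0]

# 2 + 2 is exactly 4 no matter how many bits you carry:
assert to_fp16(2.0) + to_fp16(2.0) == 4.0

# But a shader that relies on tiny increments loses them at half precision:
print(to_fp16(to_fp16(1.0) + to_fp16(0.0001)))  # 1.0 -- the increment is rounded away
print(to_fp32(to_fp32(1.0) + to_fp32(0.0001)))  # ~1.0001 -- still distinguishable
```

So whether dropping to partial precision hurts depends entirely on whether the shader's math actually exercises those extra mantissa bits.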

But, I'm suspicious of the numbers, also. None of the shader benchmarks seem to suggest that the 5900 should perform equal to or better than a 9800, at any precision.
 
Agreed, he's talking about differing code paths and of course can't be bothered by such a "technical" distinction.

Also, this statement:

Anand said:
...I'm hearing that the latest Detonators NVIDIA is working on fix the issues that Gabe outlined in his list of complaints and improves performance significantly....

...simply means to me that he's "hearing" the same things nVidia initially stated about the 50.xx Dets that they later recalled and decided weren't so great for general use after all (of course, people are still using them anyway.) IE, he's "hearing" the same thing we all heard initially about the 50.xx's. So of course he has no first-hand knowledge of whether any of that might be true, and I guess he hasn't "heard" yet about all of the tests people across the Internet have been doing with the 50.xx's nVidia tried to convince Valve to use--tests which have not been favorable. So my guess is what he's "hearing" is coming not from people using the Dets, but direct from nVidia. Of course we all know that nVidia always tells the Gospel truth about its drivers, don't we? And so, Anand, of course, would have no earthly reason to doubt what he's "hearing".

I'm glad Anand's hearing is OK, but this kind of statement really indicates to me how far he's slipped. I mean, unless he just completely avoids the Internet entirely with the exception of posting to his own site, I find it really baffling how he thinks that what he's "hearing" is anything especially newsworthy, since we've all been hearing it for the last couple of weeks, and nobody's seen anything definitive yet--except Valve--who was so impressed they refused to use the 50.xx's. Including Anand--who admits to having seen nothing himself--apart from what he's "heard," of course. I'm beginning to think I liked Anand much better when he was 16...:)

I don't mean to sound abrasive, but this is the kind of gossipy, "Gee, listen to what I've heard," frivolous remark that is more at home in the Drudge Report than in a "major" hardware news and review site like AnandTech. Things are changing these days, and the fellows running the "old guard" better sit up and take note, else they find their assumed positions eroding before their eyes.

"I'm hearing"....oh, good grief. He doesn't even do us the dignity of telling us the identity of the little bird chirping in his ear...OK, sorry guys...I guess I'm so sick of reading nothing but empty fluff all year from nVidia and *some* of the sites which discuss its hardware that my irritation is bubbling over...(It's a good thing I don't read Anand or THG anymore on any kind of regular basis--I'd probably have a stroke...:))
 
RussSchultz said:
But, I'm suspicious of the numbers, also. None of the shader benchmarks seem to suggest that the 5900 should perform equal to or better than a 9800, at any precision.

Unless, of course, there are lots of drop downs to integer precision....
 
DaveBaumann said:
There are reasons why it could be the case with FSAA enabled.

Wouldn't the higher bandwidth and that big clock speed help out in this situation?

Of course, with perf increases such as these, and from nvidia, I have to wonder what was done to achieve such an increase. I hope valve sticks to their guns and doesn't buckle under any nvidia pressure...
 
Anandtech seems to put a lot of effort into showing nV in the best light possible. I will give you an example that I commented on @ nVnews.....
nelg said:
What troubles me about that article is that Anandtech seems to use language that obfuscates the issue intentionally. Read the first two paragraphs and imagine that you have no knowledge of the current issues surrounding the GFX series of cards......

N.B. The bold parts are added by me and are my opinion as to how it should have been said..


quote:
--------------------------------------------------------------------------------
Well we are here at Computex, and the buzz surrounding NVIDIA and DX9 [particularly Half-Life 2] hasn’t let up. As a matter of fact, it is the hot topic of many under the table talks between video card manufacturers and their respective customers. Our latest mobile look into the latest and greatest for NVIDIA and ATI didn’t leave much room for argument on the DirectX 9 front (which is better ?). Meanwhile, the industry buzz is still debating about who has legitimate claim in this controversy. There are suggestions that ATI and Valve have been conspiring, and Valve specifically coded their image quality paths to ATI hardware (By whom other than nVidia?). This is really hard to validate since there are other DX9 games that show similar [though with less intense margins] results between NVIDIA and ATI hardware(isn't this the proof you are looking for?).

Due to all of this we turn back to the concept of our Forum articles: basically, inquiring about the thoughts and opinions of various manufacturers anonymously. While Half-Life 2 may be centralized toward the American market, DX9 is a concern for consumers, manufacturers, and programmers worldwide. Consumers are less likely to buy a certain graphic solution should he or she know that image quality and rendering abilities are inferior of its competitor. Those that have already chosen the “unfavorable” graphics solution will likely then be alienated. Manufacturers, therefore, are directly effected due to sales [or lack of] relating to the limitations of a graphic solution, in this case NDIVIA and ATI(why include ATI in this statement ?). Programmers, as we noted before, are frustrated on two different levels. First is the topic of resources, typically, developers have an optical frame per second range, which they try to hit on all graphic solutions. Regardless of the reasons behind it, ATI and NVIDIA graphic processing parts can’t hit the same frame per second range in an intensive image quality game, at least for now(should this not say that nVidia cards cannot reach the same performance as ATI ?). This means that to keep up to par with the competition, developers have resorted to coding special code paths for these frames per second ranges to be hit. Obviously, this means image quality settings need to be lowered and this directly points to our second conundrum: consumers not being able to enjoy the full DX9 experience the way the developer intended. Programmers are artists, and for this reason; they hate to see their effort and artistic talent go to waste. On the other hand, they also understand that not creating a special code path could possibly lead to low sales or even undercutting them.



--------------------------------------------------------------------------------

(which is better ?)

Why all this pussyfooting around?
 
Anand is almost the same crap as THG... ok, just almost.
Nevertheless, I stopped checking either of them a long time ago...
 
Nelg, that quote from AnandTech has got to be one of the worst written I've seen in recent memory...I can't recall the last time I've seen someone write so much and manage to say so little at the same time...:) (Oh, wait--I just read Kirk's interview on Firingsquad, so I should probably take that back.)

Besides the rather large inconsistencies in logic you point out, the general wording and phrasing of the quote indicate an author not really comfortable with English. I'm assuming Anand did not write this. Things like "effected" instead of "affected" and "optical" frame per second range instead of "optimal...," seem to bear this out as well. The phrasing is just stilted and pretty poor as a result. I'm really not sure that the author was intentionally being evasive--what comes out for me is that in English his thoughts are pretty muddy, basically, and he probably thinks he's said something much different from what he's written. Just a guess...Since the story was from Computex, it sounds like a stringer piece somebody sent them which they just printed verbatim without bothering to read or edit it first (they could have addressed the spelling errors and corrected some of the punctuation and polished up some of the phrasing, at the very least, before running it.) But I guess the AnandTech "editor," whoever that is, was indisposed at the moment--but apparently not the person who posted it to the site--who, I have to guess, didn't understand it any better than I did and so just reprinted it as it was received. (I *really* hope this was not an edited result...:))
 
nelg, Anand's first paragraph was pretty clear, as I think it was written right after Andrew Ku's review of ATi's and nV's mobile parts (9600 and 5600), in which ATi literally spanked nVidia. You're really picking nits there.

The second paragraph is somewhat evasive, though. Again, people like Anand can't really afford to malign a product without specific and multi-sourced proof. I don't think beta benchmarks constitute enough proof for him to firmly conclude that nVidia sucks all around and to recommend ATi to all his readers.

His review of the XT should be very interesting, though. I'm curious to see what lessons he's taken from his 5800 review, and how his new writer does. Not to mention we have a bunch of new benchmarks for him to try, and it appears he's updated his test suite significantly.
 
DaveBaumann said:
There are reasons why it could be the case with FSAA enabled.
Which reasons?

The ATI cards do that centroid mode of FSAA for free on HL2, while the NVidia cards need to do additional clamping work for FSAA in a DX9 shader. So the ATI cards should have a theoretical performance advantage in FSAA. What am I missing here?
 
madshi said:
DaveBaumann said:
There are reasons why it could be the case with FSAA enabled.
Which reasons?

The ATI cards do that centroid mode of FSAA for free on HL2, while the NVidia cards need to do additional clamping work for FSAA in a DX9 shader. So the ATI cards should have a theoretical performance advantage in FSAA. What am I missing here?

While I've read that there's a solution for Centroid AA for the FX range I've not seen any confirmation that Nvidia are implementing it. With Nvidia's reputation for cutting every corner they can just to get reasonable fps can you believe that they would then include a performance hit feature?
 
Pete said:
nelg, Anand's first paragraph was pretty clear, as I think it was written right after Andrew Ku's review of ATi's and nV's mobile parts (9600 and 5600), in which ATi literally spanked nVidia. You're really picking nits there.

Only because you know the relative performance between the two parts. Try reading it with the perspective of having no prior knowledge.
 
THe_KELRaTH said:
While I've read that there's a solution for Centroid AA for the FX range I've not seen any confirmation that Nvidia are implementing it. With Nvidia's reputation for cutting every corner they can just to get reasonable fps can you believe that they would then include a performance hit feature?
I believe the application should be sending down the appropriate shader (i.e. one that clamps texture samples) as the driver doesn't export this feature.
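For what it's worth, the extra work such a clamping shader does can be sketched in a few lines; the function name and the patch-bounds parameters here are hypothetical, just to show the per-sample arithmetic that centroid-capable hardware gets to skip:

```python
def clamp_to_luxel_centers(u: float, lo: float, hi: float, luxel: float) -> float:
    # Pull a lightmap coordinate back to the centers of the edge texels
    # of its patch, so a multisample position that lands outside the
    # polygon never fetches a neighbouring (invalid) lightmap texel.
    # Real code would do this per pixel inside the pixel shader -- that
    # is the cost hardware with centroid sampling avoids.
    half = luxel / 2.0
    return min(max(u, lo + half), hi - half)

# A sample that drifted past the patch edge is pulled back inside:
print(clamp_to_luxel_centers(0.99, lo=0.0, hi=1.0, luxel=1.0 / 16))  # 0.96875
# A sample safely inside the patch is untouched:
print(clamp_to_luxel_centers(0.5, lo=0.0, hi=1.0, luxel=1.0 / 16))   # 0.5
```

With centroid sampling the hardware guarantees the interpolated coordinate lies inside the primitive, so none of this shows up in the shader at all.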
 
nelg said:
Only because you know the relative performance between the two parts. Try reading it with the perspective of having no prior knowledge.

True, I bet anyone without prior knowledge of what's happening would think:

"Hmmm, this FX costs $1100 and this Ati card costs $700, I think I read about this at *insert fanboi site*. The FX was kicking ass and considering it's more expensive than the Radeon how can I possibly go wrong?"

This is the type of attitude a majority of consumers have, believe it or not. When I ask a friend or whoever why they bought an nVIDIA card (NV3x), that would be the usual response, or sometimes this one:

"nVIDIA is the best".

Lame responses eh?

These guys have no knowledge of what has happened to nVIDIA nor what they have done. These guys think nVIDIA's drivers are perfect and have awesome IQ, performance and are bugless. :LOL:
 
K.I.L.E.R said:
Lame responses eh?

Sadly, no. I could almost ACCEPT that as it's just typical "uninformed consumerism." (Of course if they kept you aware of upcoming purchases, you may have wanted to inform them more. ;) )

The worst I get are from people who have no interest in reading up or checking anything, yet somehow remain die-hard fans. "I had a problem with an ATi card, therefore ATi sucks." "ATi makes horrible drivers (referring to, like, Rage128-era stuff, even)." "I know a guy who had problems in a MMORPG with his ATi card, so ATi sucks." "nVidia has more marketshare so they're the best!"

<sigh>

Funny thing is, I would debunk their comments and explain how one SHOULD make smart hardware-buying decisions, and all of this was easily done WITHOUT touching anything from the FutureMark issues on up.

I'm pained a lot more by people who do no research, don't care, yet can be amazingly stubborn regardless.
 