Half Life 2 Benchmarks (From Valve)

WaltC said:
If their products aren't number 1, then they won't be, either. And for once I say it's about time to see the companies stand or fall on the products they make--and to hell with PR.
Eloquently put.

Somewhere along the line nVidia stopped listening to what their customers wanted and instead tried to tell their customers what they wanted, and they still haven't figured out why the customer isn't happy with that. :(
 
Pete said:
Just to be clear: pixel shaders (meaning hardware) aren't communal like vertex shaders, right? AFAIK, one pixel shader will work on one pixel to be shaded--you can't use two pixel shaders to halve the time of computation on a single pixel, because of the exclusive nature of the pipelines and thus pixel shaders, correct?

Yes. However, the assumption is that you are shading enough pixels at a time that the distinction becomes pretty much irrelevant. (Although this is not quite the case when certain pixel pipelines sit idle at the edges of a polygon--a case which becomes more and more frequent as polygons get smaller and smaller.)
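To put a rough number on that edge-of-polygon effect, here's a toy Python model (purely illustrative -- the "8 pipelines working in lockstep" setup is my own assumption, not how any particular GPU actually rasterizes): a triangle covering P pixels occupies ceil(P / 8) pipeline passes, so tiny triangles leave most of the slots idle.

Code:
import math

def utilization(pixels_covered: int, num_pipelines: int) -> float:
    # Toy model: all pipelines advance in lockstep, so a triangle covering
    # 'pixels_covered' pixels consumes ceil(pixels_covered / num_pipelines)
    # full-width passes. Utilization = useful pixels / pipeline slots consumed.
    passes = math.ceil(pixels_covered / num_pipelines)
    return pixels_covered / (passes * num_pipelines)

for pixels in (5000, 500, 50, 6, 3, 1):
    print(f"{pixels:5d}-pixel triangle, 8 pipelines: "
          f"{utilization(pixels, 8):.0%} utilization")

Big triangles come out near 100% utilization; a 3-pixel triangle only keeps 3 of the 8 pipelines busy.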
 
incandescent said:
eh? sure is a lot of fuss over something so trivial. All NVIDIA has to do is release the NV40 on time, and such that it is the undisputed perf leader. Things changed in an instant for ATi when the 9700 was launched --- and they can just as quickly change for NVIDIA.

But:

1. A lot of people who might have bought NV40 have already spent this cycle's upgrade money on R3x0 and won't need to upgrade so soon.

2. A lot of people who might have bought NV40 will wait for R420 (especially those from 1 above who now have R3x0)

Why? Because of Nvidia's recent bad behaviour and inability to deliver a competitive product. All the lies and cheating that Nvidia put into place as a stopgap in the face of overwhelming competition have been paid for with the trust of their potential customers. A sizable number of people won't forget the last year just because NV40 comes out - they will remember the Nvidia PR war, and will elect to wait a few more months and go with the new company they have gained trust in. If the delays rumoured on NV40 are correct, the wait will be a matter of weeks.

I guess what I'm trying to say here is that brand loyalty is a powerful thing. Nvidia had loads of it, and pissed away a sizable proportion of it to its competitors in less than a year. All to prop up a substandard bit of hardware they will be superseding in another few months.

It will take a lot longer and a lot more for Nvidia to get back that loyalty and that market than simply bringing out an NV40 that is slightly better than the R3x0. For people like me who were originally looking to buy an NV30 and switched to ATI, Nvidia will have to bring out something that totally blows the doors off current cards - and even then, I'll wait to see if R420 is better again, because I have faith (or brand loyalty) in ATI because of what Nvidia have done over the last year. So NV40 will in fact have to be several times better than R420 for me to switch back - being a little bit better, or just as good, won't cut the mustard - and that's the corner Nvidia has painted itself into.
 
@bzb, I agree with most of what you said. Performance and price will be the deciding factors for me. Right now ATI hands down beats NV. If I remember correctly NV has a butt load of cash in reserves and I'm sure they are willing to spend it on bribes^H^H^H^H^H^H PR, R&D, and whatever else they need to get back on top. :)

later,
 
epicstruggle said:
@bzb, I agree with most of what you said. Performance and price will be the deciding factors for me. Right now ATI hands down beats NV. If I remember correctly NV has a butt load of cash in reserves and I'm sure they are willing to spend it on bribes^H^H^H^H^H^H PR, R&D, and whatever else they need to get back on top. :)

later,

Are they going to bribe each customer? Do you think they'll be putting out R420 killers at $100 to do it? It's either that or spend a long time behaving well and putting out good products for the next couple of cycles before I'll consider buying their graphics cards again. Nvidia have screwed themselves by giving faithful customers whatever the exact opposite of "brand loyalty" is.
 
BRiT said:
Tahir said:
I want the old NVIDIA of the TNT days through to the GF3 days. I don't like the post-3dfx NVIDIA.

What makes you think today's Nvidia is any different from yesteryear's Nvidia? The only difference I see is yesteryear's Nvidia was never caught...

and I get abused for saying this:


Payback's a bitch...... nVidia is now reaping just what it has sown!

nVidia's been cheating for years. I believe someone here once quoted a former 3DFX employee, who went to work for nVidia, as saying that they knew nVidia cheated on benchmarks, but never realized by just how much!
 
incandescent said:
eh? sure is a lot of fuss over something so trivial. All NVIDIA has to do is release the NV40 on time, and such that it is the undisputed perf leader. Things changed in an instant for ATi when the 9700 was launched --- and they can just as quickly change for NVIDIA.

Just to expand on a couple comments already made.

First, things did NOT change quickly for ATI. I would only say that right now, with the Half-Life 2 benchmarks, ATI is finally pretty much accepted as the leader by the masses.

Flash back to a year ago. R300 is released. The "technically inclined, non-fanb*ys" recognized the technical superiority immediately, and wondered if nVidia could respond with NV30.

The masses, however, took the "Well, just wait for NV30....it's obviously going to be better...mostly because, well, it's nvidia. ATI's drivers suck...blah blah blah..." line. I agree that technical leadership can turn on a dime. NV40, when released, can be the indisputable technical leader at that time, as perceived by non-biased observers.

However, mass customer sentiment doesn't turn on a dime. They are usually dragged kicking and screaming. ;) Customer sentiment started to turn toward ATI with the R300. The shift is just about complete now with the Half-Life 2 benchmarks. Customer sentiment can start to turn back to nvidia with the next gen...but it would likely take a year of continued leadership to complete the shift.
 
martrox said:
I believe someone here once quoted a former 3DFX employee, who went to work for nVidia, as saying that they knew nVidia cheated on benchmarks, but never realized by just how much!

Hmmm, the quote I remember was more along the lines of ex-3dfx folks being surprised by nVidia's drivers. At the time I assumed that they were surprised by the quality and/or quality assurance of the drivers. But today it is of course a possibility that they were talking about some sort of cheating. :rolleyes:
 
Come on, although I am happy with my Radeon 9700, ATi is also cheating in their drivers. A recent article proved that both nVidia and ATi use the same kind of cheats on 3DMark 2001 SE. I really dislike people looking only through their colored glasses, 'forgetting' this fact, and just stating that nVidia is the cheater. They both play that cheating game if it fits their needs. Or in other words: YOUR money. Don't be so naive as to think ATi is the saviour, because you will only look like a fool in the end. ;)
 
Anyone who thinks there is evidence showing anything remotely close to the same level of cheating in ATI's drivers vs. nVidia's drivers is taking the position of a fool, IMO.
 
yes, Joe, you are absolutely right. The only reason to imply that ATI cheats is to justify nVidia's cheating.
 
LeStoffer said:
martrox said:
I believe someone here once quoted a former 3DFX employee, who went to work for nVidia, as saying that they knew nVidia cheated on benchmarks, but never realized by just how much!

Hmmm, the quote I remember was more along the lines of ex-3dfx folks being surprised by nVidia's drivers. At the time I assumed that they were surprised by the quality and/or quality assurance of the drivers. But today it is of course a possibility that they were talking about some sort of cheating. :rolleyes:

No, it was supposedly the level of cheating in the drivers once they started working for Nvidia. The quote, which I heard before it was circulating on the 'net, was something like, "You know, we always suspected they'd been cheating but we had no idea it was this bad."
 
sonix666 said:
Come on, although I am happy with my Radeon 9700, ATi is also cheating in their drivers. A recent article proved that both nVidia and ATi use the same kind of cheats on 3DMark 2001 SE. I really dislike people looking only through their colored glasses, 'forgetting' this fact, and just stating that nVidia is the cheater. They both play that cheating game if it fits their needs. Or in other words: YOUR money. Don't be so naive as to think ATi is the saviour, because you will only look like a fool in the end. ;)

Don't you understand the difference between the cheating on benchmarks that Nvidia does with the intention of producing inflated and misleading scores, and the genuine optimisations that ATI implement to increase performance for their customers? You don't appear to. :rolleyes:
 
Sharkfood said:
The single most important points to be discovered are generally what you DON'T see in the bars and graphs. It's interesting that some read this as being another "quack" issue when there was no mention of such a thing. The only point being: without some solid, conclusive analysis, what has been provided is still mostly useless and meaningless until peer review and some research/logical progression of findings can be provided.

I can't really agree with this assessment in this case, Shark, mainly because this is not "bars and graphs" released by an IHV or by a prejudiced web site. This is damage-control information released to the public as a self-defense measure by a premier software developer who wants people to understand why their software runs as it does on various hardware. The precise purpose of the information was to release performance data relative to DX9, and for Valve to detail and explain that it spent 5x the development effort on the NV3x path as it spent on the DX9 path in an attempt to get the software to run competitively.

The problems Valve has talked about are the same ones Carmack has talked about relative to Doom3 rendering precision--over and over again--and are the same problems brought to light in 3dMk03, ShaderMark, Tomb Raider, et al. There's *no difference.*

I really think it's somewhat unfair to impugn Valve's comments as commercially motivated because Valve isn't ATi--Valve doesn't benefit from the sale of ATi hardware directly--Valve wants to sell its software to everybody it can. What they were doing here was a little preemptive damage control prior to shipping--better to get it out in the open now as opposed to dealing with it after the software ships--I agree with that approach 100%.

Basically, Valve is privy to everything you "don't see" in the bars & graphs, and probably what you do see is only a part of the story from Valve's perspective. I'll wager Gabe didn't say everything he could have said, and was trying to frame the issue as conservatively as possible. Valve, on the eve of shipping what is likely to be a blockbuster of a game the likes of which we haven't seen for years, is very unlikely to put its credibility on the line by misrepresenting anything. I think, if anything, they may have understated their case, not wanting to alienate people with nVidia products while still letting them know what's going on.

Edit: typos
 
martrox said:
yes, Joe, you are absolutely right. The only reason to imply that ATI cheats is to justify nVidia's cheating.

I do want to make one thing clear:

If we ripped ATI's drivers apart, do I think we might find some "questionable" optimizations? Yup. (And I'd guess the same would be true with everyone's drivers.)

Do I think they would be remotely anything like what the public has seen in nvidia's drivers? Nope.
 
John Reynolds said:
No, it was supposedly the level of cheating in the drivers once they started working for Nvidia. The quote, which I heard before it was circulating on the 'net, was something like, "You know, we always suspected they'd been cheating but we had no idea it was this bad."

Thanks for the clarification John; I guess I just kinda repressed this info back then (don't ask why!). 8)
 
Pete said:
Just to be clear: pixel shaders (meaning hardware) aren't communal like vertex shaders, right? AFAIK, one pixel shader will work on one pixel to be shaded--you can't use two pixel shaders to halve the time of computation on a single pixel, because of the exclusive nature of the pipelines and thus pixel shaders, correct?
Yes, but this doesn't matter. Since both process in parallel, the throughput is doubled. The latency remains the same. The pipeline is optimised such that the latency is largely irrelevant and throughput improvements lead directly to realised better performance.
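For what it's worth, here's a back-of-the-envelope sketch of that throughput-vs-latency point in Python (the cycle counts and pixel counts are invented purely for illustration): adding pipelines doesn't change how long one pixel takes to get through the shader, but it divides how long the whole frame takes.

Code:
SHADER_LATENCY_CYCLES = 20      # made-up: cycles for ONE pixel to traverse the shader
PIXELS_PER_FRAME = 1_000_000    # made-up frame workload

def frame_cycles(num_pipelines: int) -> int:
    # Pipelined execution: after the initial fill latency, each pipeline
    # retires roughly one pixel per cycle.
    return SHADER_LATENCY_CYCLES + PIXELS_PER_FRAME // num_pipelines

for n in (1, 2, 4, 8):
    print(f"{n} pipeline(s): per-pixel latency {SHADER_LATENCY_CYCLES} cycles, "
          f"frame ~{frame_cycles(n):,} cycles")

Doubling the pipelines roughly halves the frame time even though each individual pixel still takes the same 20 cycles, which is why the per-pixel exclusivity doesn't matter in practice.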
 
sonix666 said:
Come on, although I am happy with my Radeon 9700, ATi is also cheating in their drivers. A recent article proved that both nVidia and ATi use the same kind of cheats on 3DMark 2001 SE. I really dislike people looking only through their colored glasses, 'forgetting' this fact, and just stating that nVidia is the cheater. They both play that cheating game if it fits their needs. Or in other words: YOUR money. Don't be so naive as to think ATi is the saviour, because you will only look like a fool in the end. ;)

As a vpu benchmark, 2001SE is nowhere near as relevant as 3dMK03. What cheating is being done by your drivers in '03? The thing to do is to firmly pull yourself into the present and stop confusing past events with present relevance.
 
Ha ha ha,

this is really hilarious. Don't get me wrong, I am very, very, very happy with my Radeon 9700. But the lame excuses for ATi's cheating are really stupid. Be responsible and just say: yes, ATi also has been, or is, cheating. Don't try to change the subject or try to make one company's cheating out to be worse than the other's. The simple fact is, both companies have been found cheating in the current crop of drivers.
 
I don't see anyone making excuses for ATI here. However, saying "both companies have been found cheating in the current crop of drivers" makes it sound like a toss-up between the two, while nVidia clearly deserves more recognition for their efforts in the cheating department.
 