A note on percentages for you guys here.

Nvidia has heard the same criticism since the V5 days. It's possible they just ignore it and figure that the average consumer isn't making his buying choices based on hard-to-market IQ differences.

There has never been a time when NVidia had the best FSAA IQ (V5, 8500 > GF2/GF3/GF4), and ATI's been beating them at performance AF since the 8500.

NVidia may be banking on the fact that they can sell the GFFX into the DCC/Workstation market which will like the shaders, and that many people will look at the non-AA benchmarks, or many people will just consider that even though the 9700's 4x FSAA looks much better, the GFFX's 4xS AA looks "good enough". It's hard to tell the difference unless you can switch between them side by side, and how many Best Buys, Circuit Cities, and Frys are going to run demos to show this?

Perhaps ATI needs to develop and push Demo Kiosks into computer retailers like Nintendo/Microsoft/Sony do, and show the GFFX/Radeon9700 side-by-side. IQ is hard to "sell" to non-highend computer users.
 
It's not about bagging on Nvidia, and it should be pretty easy for anyone with an open mind to see that. It's about making people see things for what they are.

It simply gets really old that people continually ignore the details as to what is going on, that's all. The big brush of Nvidia's "gold standard" drivers, for instance, instead of looking at each driver release, what it offered, how it offered it, issues it had, complaints made, etc.... Like the fact that GF3/4 suffered for almost a year with unoptimized vertex/pixel shaders, until Nvidia was forced to deal with it. Or their broken S3TC offering that's lasted through every single GeForce card released, from the first one. Or the game issues that are reported fixed in driver release notes and game patches. Or how about issues where developers code to Nvidia hardware first, and then test the app on everything else? Of course they reported any bugs they had with the Nvidia hardware along the way, and they got resolved.

I am not saying Nvidia is the *Devil* or anything. Simply that there is a rather large group of people with rose-colored glasses on.
 
Typedef Enum said:
Unless nVidia can completely overhaul their AA logic, I think it's all a moot point. I would take the one with significantly higher IQ _and_ performance over the other guy that could bring high performance without the IQ settings.

I really would like to know what it's like to sit in on some meetings within the halls of nVidia right about now... just to see what they're saying, with respect to all the commentary made about their new flagship product.

One could only hope that they're getting their act together. Perhaps some new project managers need to be appointed? Some layoffs in order? If I were nVidia I certainly wouldn't want something like this happening again. I imagine they're quite fragmented as far as development teams go as well. I can't imagine how many engineers it took to try and develop 3 completely different chips (nv30, nv31, and nv34); not only that, but they're also dedicating engineers to the nv40, nForce3, pure R&D, various govt. projects, and PDA/mobile phone R&D. Maybe they'll do a little more consolidation on the nv50? Whoever decided on .13u and DDRII for the nv30 should get the can in my book, or at least get demoted to the mailroom.

It could be that the company is a mess of fragmentation right now. No one high up could be very pleased with the performance of their stock options.
 
Hellbinder[CE] said:
It's not about bagging on Nvidia. It simply gets really old that people continually ignore the details as to what is going on, that's all... issues where developers code to Nvidia hardware first, and then test the app on everything else. Of course they reported any bugs they had with the Nvidia hardware along the way, and they got resolved.

Then why don't you say that, instead of name-calling them overrated and the most pandered to? Why use inflammatory words when you could simply say: NVIDIA has developer mind share right now and is getting the benefit of it by appearing to have fewer problems in their drivers, because everybody works around them from the start?

And Rev: I chose the complete lack of substance for my response for a reason.
 
Then why don't you say that, instead of name-calling them overrated and the most pandered to?

Probably because he doesn't think anyone should get so personally offended by such comments that are directed at a graphics card company? Perhaps "overrated" is exactly the best term that he feels describes nVidia's driver team?

Why do you get so offended by someone having the opinion that nVidia's driver team is overrated and pandered to?

And Rev...perhaps you should just avoid Russ from now on. ;)
 
Oh goodness, caught again defending my chosen company. :rolleyes:

Look, I'm not personally offended Joe, it's just that we can aspire to more noble things than name-calling, especially since we'd like to have more employees from all vendors frequent this board and contribute to the discussions.

Do you really have to wonder why we don't have any NVIDIA employees 'out of the closet' and contributing to the discussions? The ones that used to be here have either changed their name or quit contributing.

From now on, let's just assume that we're all in the same room and make our comments accordingly. ATI, Matrox, NVIDIA, etc. are not some nameless entities--they're made up of people who have feelings also.
 
Surely nVidia's mind share with developers was earned? Unless subliminal brainwashing is in the developer toolkits/SDKs nVidia supplies? :p

I'm quite certain JC prefers to code with nVidia OGL drivers because they are more mature, and his needs are different from gamers' after all.
 
Like the fact that GF3/4 suffered for almost a year with unoptimized vertex/pixel shaders.
Even in their "unoptimized" state they seemed to outperform the competition.
 
Look, I'm not personally offended Joe, it's just that we can aspire to more noble things than name-calling...

I fail to see how calling a driver team "overrated" is name-calling. It's an opinion. Did he say they "sucked" or anything like that?

Yes, the people at these companies are people. They have feelings. That doesn't excuse them from being criticized if someone feels it's warranted. Not that calling a driver team "overrated" is actually criticizing the DRIVER TEAM, as much as it is a criticism toward those who keep perpetuating the "myth."
 
Well I just had a look at Nvnews and guess what? 50% increase in 4x AF, 7% in 2x AA, and 30% in 4x AF & 2x AA on Quake 3 performance.

I don't consider improvements in Quake 3 (which is approaching 4 years old) really a big accomplishment; now MOHAA or RTCW would be different.
I thought the readers would be smart enough to see it, but the only REAL large improvements in Detonator drivers have been on popular review benchmarking games and APPZ... they even talked about it in the Detonator 40.41 release, saying that 3DMark got improvements :LOL:

The people on this forum that used the 40.41 drivers claimed Quake 3 looked washed out, same for RTCW, and very dark...

In Warcraft 3 they are slower:

[image: benchmark_aa_w3.gif]


In Quake 3 they are faster (with graphic anomalies):

[image: benchmark_aa_q3.gif]


People are still reporting flashing textures in 3DMark; it's all relative... if any IHV optimizes for ONE title or a handful, of course there can be improvements... but there are tradeoffs (IQ), and I've never seen a Detonator release give performance across the board; some drivers improve some areas yet lower performance in others.
 
Gotta love percentages. They are a great statistical tool. News headline: "Studies show that molodinium294 in the ground water increases cancer risk by 50%." The data says the group in the molodinium-affected area has 90 cases of cancer in a population of 10,000; in the general population, a similar-sized group has 60. An increase of 30 out of 10,000 = 50% or 0.3%, depending on how you look at it.
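
To make that arithmetic explicit, here is a minimal sketch in Python, using only the hypothetical numbers from the headline example above:

```python
# Hypothetical numbers from the "molodinium294" headline example.
exposed_cases, exposed_population = 90, 10_000
baseline_cases, baseline_population = 60, 10_000

exposed_rate = exposed_cases / exposed_population      # 0.009 -> 0.9% of the exposed group
baseline_rate = baseline_cases / baseline_population   # 0.006 -> 0.6% of the general population

# Relative increase: the "50% higher risk" the headline quotes.
relative_increase = (exposed_rate - baseline_rate) / baseline_rate   # 0.5

# Absolute increase: the extra 0.3 percentage points of risk.
absolute_increase = exposed_rate - baseline_rate                     # 0.003

print(f"Relative increase: {relative_increase:.0%}")   # Relative increase: 50%
print(f"Absolute increase: {absolute_increase:.1%}")   # Absolute increase: 0.3%
```

Same data, two very different-sounding percentages, which is exactly the point.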
 
What you really mean Doomtrooper is that you discount games that scale with GPU performance (GPU bound) and only count games that have a heavy CPU component.

I don't know about you, but games like MOHAA, RTCW, and Jedi Knight were CPU bound on my system. And look at Warcraft, the game gets 46fps @ 1024x768 no-AA. Not exactly a brilliant example of a GPU limited game.
 
Who really cares?? These are modern games... if optimizations are being made, make sure they optimize on CURRENT titles people play and buy video card upgrades for... if you want to play Q3 fast you can get a $60 SIS Xabre.

MOHAA is not CPU bound, nor is RTCW (both are based on a tweaked and modern Q3 engine)... maybe get yourself a better platform.
 
Doom, I still can't understand why you don't think major driver gains are possible for the GFFX? Apologies if that's not what you're saying, but that's what it sounds like.
 
Nvidia has a good driver team and they will get more performance, but 40-50% is stretching it big time IMO on modern games... 10% is my guess.
I'm also basing my opinion on facts from previous driver comparisons; of course drivers get better, but 50% is just too much IMO.

Somebody here named Kristof made a great statement many years back, "Nothing is free in 3D"... and I stand by it.
 
Doomtrooper said:
Nvidia has a good driver team and they will get more performance, but 40-50% is stretching it big time IMO on modern games... 10% is my guess.
I'm also basing my opinion on facts from previous driver comparisons; of course drivers get better, but 50% is just too much IMO.

Somebody here named Kristof made a great statement many years back, "Nothing is free in 3D"... and I stand by it.

Very well said DT. :)
 
Heathen said:
Doom, I still can't understand why you don't think major driver gains are possible for the GFFX? Apologies if that's not what you're saying, but that's what it sounds like.

just history.
Look at the detonator database driver comparisons.
If you see performance increases that are over 10-15%, let me know!
 
None of the online databases go back far enough to make a valid assumption for a brand new architecture. OK, the int part may be based on the GF4 core, but the FP part is brand new. To me it seems like either a) the hardware isn't up to it, b) the hardware is bust, or c) the drivers are bust and not just unoptimised.

Look at the 8500: that didn't have SmoothVision enabled in the release drivers (IIRC). Couldn't a similar situation be occurring wrt the FP DX9 component of the GFFX drivers?

PS: Doom man, I just love those smilies of yours.
 
Althornin said:
Heathen said:
Doom, I still can't understand why you don't think major driver gains are possible for the GFFX? Apologies if that's not what you're saying, but that's what it sounds like.

just history.
Look at the detonator database driver comparisons.
If you see performance increases that are over 10-15%, let me know!
Look at the graph I posted earlier in this thread ;)
 