State of 3D Editorial

Just wondering: how confident are people that the Quack thing was the only "issue" with ATi back in the day? Driver scrutiny by the enthusiast community is 10x what it was back then, so it stands to reason that some "optimisations" may have slipped under the radar. Hell, it's only relatively recently that we've been getting detailed IQ comparisons in reviews.

Has anyone investigated it? Can't remember seeing anything anywhere.
 
Well, seeing that NVIDIA really only released the 5x series of drivers, it appears that their driver and software technology at the time was nowhere near mature enough to adequately feed the NV3x processors.

Josh, I don't understand that. Did you really miss the entire 4x series of drivers over the past nine months? It's things like this that are hard to understand in your writing. You post something that is blatantly false, and easy to check for anyone who has owned a video card since February, and yet you base your argument on it.
 
Bouncing Zabaglione Bros. said:
What you're suggesting is like saying that a DVD with Dolby Digital isn't a "true" DVD if you can play it on your TV with two speakers and get stereo instead of the 5.1 soundtrack.

As always, analogies are risky: while they may sound smart, they may also be off base.

What I was saying is that if a game is written with DX9 in mind from the beginning, then it is a DX9 game, unlike Halo, where they just threw in some effects. If a movie is filmed with stereo sound and later converted to 5.1 (but it is still just stereo in AC3 format), then I would compare that to a few DX9 effects thrown on top.

That analogy is flawed too, but you can see what I mean. A DX9 game is one where a significant part of the engine is built around DX9 as a minimum, even if a workaround may be possible. My point is that nothing out there, not even 3DMark03, is mostly DX9 features.
 
JoshMST said:
I am taking the same stance on this as I do on the human rights violations in China and the treatment of their prisoners. Do I condone the Chinese actions? No. Will they continue to treat their prisoners in this way even though I write an article exposing this? Yes. Will they change just because I want them to? No.

NVIDIA will not change the way they do business even though it is well known that they cheated. Yes, NVIDIA took this a step above what ATI did in the past with its 8500. ATI corrected their "optimizations" as NVIDIA is correcting theirs now. All I am saying is that nobody has their hands clean.
And again you do it. :rolleyes:

"Yes, nVidia took this a step above what ATi did in the past"

No, no, nVidia took this to a whole lower level than ATi ever dreamt of! And why do you keep going on about how "nVidia is correcting theirs now"... what signs of this are you talking about? The Det5s provide no magic bullet; there ain't no huge performance gains.

What are you talking about Josh?
 
I meant to say "seeing that NVIDIA only RECENTLY released the 5x series of drivers."

Sorry for the confusion. And yes, the lovely clipping planes, forcing FP16 and partial precision, and other "optimizations" were included in the 4x series of drivers. So my basic point was that until the honest optimizations and compilers were ready, NVIDIA gave us the "questionable" stuff with the 4x drivers.
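
(For anyone unfamiliar, here is roughly what "partial precision" means at the API level. A minimal, hypothetical D3D9-era sketch; the shader and the helper below are my own illustration, not anything from NVIDIA's drivers. The point is that FP16 is supposed to be something the application opts into, not something forced on it.)

```cpp
// Minimal, hypothetical sketch of the *legitimate* way an application
// opts into FP16. In HLSL, "half" marks values the developer is happy
// to compute at FP16 precision; "float" requests full FP32.
#include <d3dx9.h>
#include <cstring>

const char* kShader =
    "float4 main(float2 uv : TEXCOORD0) : COLOR0\n"
    "{\n"
    "    half4 tint = half4(0.5, 0.5, 0.5, 1.0); // FP16 is acceptable here\n"
    "    float4 pos = float4(uv, 0.0, 1.0);      // full FP32 requested\n"
    "    return pos * tint;\n"
    "}\n";

HRESULT CompileWithPartialPrecision(LPD3DXBUFFER* ppCode)
{
    LPD3DXBUFFER errors = NULL;
    // D3DXSHADER_PARTIALPRECISION forces *every* operation down to
    // FP16 (_pp) -- but here it is the application consenting to the
    // precision loss. The 4x-driver complaint was that this downgrade
    // happened silently, with no consent from the application at all.
    HRESULT hr = D3DXCompileShader(kShader, (UINT)strlen(kShader),
                                   NULL, NULL, "main", "ps_2_0",
                                   D3DXSHADER_PARTIALPRECISION,
                                   ppCode, &errors, NULL);
    if (errors) errors->Release();
    return hr;
}
```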

Additionally: there appear to be no magic-bullet drivers in the 5x series, in that they do not take performance to the level ATI is at. From all appearances, though, NVIDIA is keeping the same level of performance as with the cheats, except they now look to be rendering everything correctly (and without clip planes, etc.). I could be wrong, but that is how it appears.

And for the last time, I am not saying cheating is acceptable! I also don't like the way that God has set up the laws of physics so that I can't run around the universe in my spaceship! Until we improve our technology, I am stuck here on Earth. I don't like it, but that is the way it is! I am not God, nor am I the management at NVIDIA, so I can't change the way they act. I can only look at the situation, say it sucks, and then move on.
 
Josh, I have a hard time following your logic. You seem to be saying that after ~six months nV caught up to the performance they promised (implied with cheats), so it's OK. If that is the case, then ATI should have cheated also; their drivers have improved performance over time too. It also opens up another can of worms: what if future drivers without cheats (please stop calling them optimizations) cannot deliver on the promise of performance that the cheating drivers made? Will these cheats (trilinear filtering comes to mind*) become entrenched?

* overriding user-selected in-game settings
 
Doomtrooper said:
The latest 52.16 drivers are advertised to support FP render targets:

http://download.nvidia.com/Windows/...raphics_drivers_Release_Notes_v0.2.v52.16.pdf

Yet they DO NOT ;)

The 52.16 drivers do not allow trilinear AF..at ALL.

I see nothing changing..At ALL.

Nice exaggeration. We'll ignore the other problems that the old drivers had which aren't in the new ones, instead sticking to 2 examples and using them as a basis to claim that nothing has changed whatsoever.
 
JoshMST said:
NVIDIA will not change the way they do business even though it is well known that they cheated.

Gawd I hope you are wrong. I think/hope they will be very careful if this type of situation comes up again.

JoshMST said:
Additionally: there appear to be no magic-bullet drivers in the 5x series, in that they do not take performance to the level ATI is at. From all appearances, though, NVIDIA is keeping the same level of performance as with the cheats, except they now look to be rendering everything correctly (and without clip planes, etc.). I could be wrong, but that is how it appears.

*cough* Pseudo-Trilinear Filtering *cough* ;)

PaulS said:
Nice exaggeration. We'll ignore the other problems that the old drivers had which aren't in the new ones, instead sticking to 2 examples and using them as a basis to claim that nothing has changed whatsoever.

Sure... the nature of the cheating has changed, but it's STILL a cheat. If I pay $500 for a fricken card, it better have the FULL features of a $500 card, regardless of whether the "optimisation in question" is hard to see unless you "zoom in on screenshots". It's not a FAIR benchmark against competition that is "doing more work". /shrug


Regards,

Taz
 
Doomtrooper said:
The 52.16 drivers do not allow trilinear AF..at ALL.
If I'm not mistaken (and I very well might be), they do proper trilinear if application preference is selected and the application chooses to do it.

But forcing it on in the control panel always ends up as pseudo-trilinear, doesn't it?

Or am I thinking about ATI...
 
RussSchultz said:
Doomtrooper said:
The 52.16 drivers do not allow trilinear AF..at ALL.
If I'm not mistaken (and I very well might be), they do proper trilinear if application preference is selected and the application chooses to do it.

But forcing it on in the control panel always ends up as pseudo-trilinear, doesn't it?

Or am I thinking about ATI...

You're thinking about ATi, yes.
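
(To make the pseudo-trilinear complaint concrete, here is roughly what an application asks for through D3D9. A minimal, hypothetical sketch using the standard sampler states; the helper function is my own illustration, not driver code.)

```cpp
// Hypothetical sketch of what an application requests through the
// standard D3D9 sampler states. Anisotropic minification plus linear
// mip blending is what people mean by "trilinear AF".
#include <d3d9.h>

void RequestTrilinearAF(IDirect3DDevice9* dev, DWORD maxAniso)
{
    dev->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
    dev->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    dev->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, maxAniso);

    // D3DTEXF_LINEAR asks for full blending *between* mip levels.
    // The pseudo-trilinear allegation: the driver only blends in a
    // narrow band around each mip transition -- cheaper, but not what
    // the application (or the user, via the control panel) asked for.
    dev->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
}
```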
 
RussSchultz said:
Serves me right for thinking, doesn't it?
Down that path, trouble does lay... ;)


Josh-

I'm still not sure that the jury is in as to whether they've fairly increased their performance with the Det5s or whether they've just learned to cheat differently; it takes time to discover cheats... I find it mildly amusing that you're assuming nVidia has changed their ways.
 
digitalwanderer said:
I'm still not sure that the jury is in as to whether they've fairly increased their performance with the Det5s or whether they've just learned to cheat differently; it takes time to discover cheats... I find it mildly amusing that you're assuming nVidia has changed their ways.

Whilst I'm fairly sure their latest speed increases are a result of both compiler optimisations and the usual "cheats", the IQ is clearly superior to what it used to be. Many people have commented that the 52.16 drivers are the best nVidia have done in a while, and that they feel reasonably solid. So whilst they may well still be cheating, at least the IQ isn't suffering to the same extent it once was.
 
PaulS said:
Nice exaggeration. We'll ignore the other problems that the old drivers had which aren't in the new ones, instead sticking to 2 examples and using them as a basis to claim that nothing has changed whatsoever.

Exaggeration about what? What has changed? Is HDR supported? NO. Are FP render targets supported? NO. It is written in black and white in the release notes, but the support is not there.

:LOL:
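
(The render-target half of that claim is testable from any app, for what it's worth. A minimal, hypothetical sketch of the standard D3D9 capability check; the default adapter and 32-bit desktop-format choices are my assumptions.)

```cpp
// Hypothetical sketch: asking D3D9 whether the driver actually exposes
// floating-point render targets, regardless of what release notes say.
#include <d3d9.h>
#include <cstdio>

bool SupportsFPRenderTarget(IDirect3D9* d3d, D3DFORMAT fmt)
{
    return SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET, D3DRTYPE_TEXTURE, fmt));
}

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // FP16 and FP32 RGBA formats typically used for HDR rendering.
    std::printf("FP16 RGBA render target: %s\n",
        SupportsFPRenderTarget(d3d, D3DFMT_A16B16G16R16F) ? "yes" : "NO");
    std::printf("FP32 RGBA render target: %s\n",
        SupportsFPRenderTarget(d3d, D3DFMT_A32B32G32R32F) ? "yes" : "NO");

    d3d->Release();
    return 0;
}
```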
 
Sxotty said:
That analogy is flawed too, but you can see what I mean. A DX9 game is one where a significant part of the engine is built around DX9 as a minimum, even if a workaround may be possible. My point is that nothing out there, not even 3DMark03, is mostly DX9 features.


You didn't answer the question: Do you consider HL2 to be a DX9 game, even though it will run on DX7 & 8 cards?
 
JoshMST said:
I am not God, nor am I the management at NVIDIA, so I can't change the way they act.

Does that mean that if everyone pretends either of us is God, NVIDIA's management will radically change their ways? Cool! 8) :D

---

Perhaps NVIDIA has been better at external communication than ATI - but then again, the problem is that pretty much everything they say is a lie :LOL:
And when you realize that employees, including management, get similar treatment... :oops: (I'm not saying they always get exactly the same BS, but they get a large amount of BS too.)

---

And if you try to turn this thread over to ULEditorial even though it's about Josh's editorial, I'll simply reply:
THings have been insane in real life this week; Unfortunately i'm thinking Monday or Tuesday for my next ediit if I can't get it out Saturday.

So no thread hijacking, thank you :) (And yes, notice there are two typos in that part of the message alone, and a lot more in the rest. And yes, he's going to edit my editorial - I assume he had to type it in a hurry though; he generally types an awful lot better than that :p)

---

Regarding the "NV30 is more flexible" stance...
I wouldn't talk of flexibility or programmability or anything here. Instead, I'd simply talk of finesse.

Problem is, there's finesse that works and finesse that makes it worse than brute force. And then there's the from-paper-to-silicon transition, which can be either good or bad.
My take on the NV30 is simple: finesse that makes it worse than brute force, coupled with a catastrophic paper-to-silicon transition.

The paper-to-silicon transition part is the worst one IMO, but that's debatable. The reason for this failure seems to have been exaggerated optimism about just about everything.

Whether this gives any "experience" advantage is doubtful, considering ATI worked on the R400, which is significantly more flexible than the NV30 - okay, so it was scrapped (and redesigned into the "new R500") - but that doesn't mean they couldn't have learnt a lot from it.


Uttar
 
JoshMST said:
Well, seeing that NVIDIA really only released the 5x series of drivers, it appears that their driver and software technology at the time was nowhere near mature enough to adequately feed the NV3x processors.

Josh, with regard to drivers, I think you could also question whether Cg was any kind of good move at all.

When I recently spoke with David Kirk, I asked what had taken them so much time to get a compiler optimiser into the drivers, and his response was "Who'd have thought we'd need compilers in drivers?". Well, clearly they knew they would need some method of turning shader assembly into something at least vaguely optimal for their hardware, and Cg does that. However, with MS pushing HLSL, did anyone really think that Cg would become the de facto standard?

I think it is a valid question whether the effort they put into Cg would not have been better spent on supporting HLSL fully and creating a driver compiler much earlier than they did.
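
(To make the "compilers in drivers" point concrete: on the DX9 path a shader is effectively compiled twice, and the second stage is the one NVIDIA was late with. A minimal, hypothetical sketch; the helper is my own illustration.)

```cpp
// Hypothetical sketch of the two compile stages on the DX9 path.
#include <d3dx9.h>
#include <cstring>

HRESULT CompileAndCreate(IDirect3DDevice9* dev, const char* hlslSrc,
                         IDirect3DPixelShader9** ppShader)
{
    LPD3DXBUFFER code = NULL, errors = NULL;

    // Stage 1 (Microsoft's compiler): HLSL -> generic ps_2_0 bytecode.
    HRESULT hr = D3DXCompileShader(hlslSrc, (UINT)strlen(hlslSrc),
                                   NULL, NULL, "main", "ps_2_0", 0,
                                   &code, &errors, NULL);
    if (errors) errors->Release();
    if (FAILED(hr)) return hr;

    // Stage 2 (the driver): the bytecode handed over here is where a
    // driver-side optimiser gets to reschedule instructions for the
    // actual hardware -- the "compiler in the driver" Kirk was asked
    // about, which NV3x owners waited until the 5x series to get.
    hr = dev->CreatePixelShader(
        (const DWORD*)code->GetBufferPointer(), ppShader);
    code->Release();
    return hr;
}
```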
 