ATI and App Specific Optimisations

Must you turn everything into ATI vs. Nvidia?

All I want is a checkbox that lets us disable ATI's optimizations. It's a checkbox, not an outright removal of the feature. Give us that option and put an end to all the shit-slinging.
 
Smurfie said:
Must you turn everything into ATI vs. Nvidia?

All I want is a checkbox that lets us disable ATI's optimizations. It's a checkbox, not an outright removal of the feature. Give us that option and put an end to all the shit-slinging.

And all I want is a checkbox to disable Nvidia's optimizations. It's a checkbox, not an outright removal of that "feature".

Give us that option. God only knows how long they have been doing it.
 
ATI becoming Nvidia

Many of you doubted this, but it's coming true more and more each month. They can't compete, so they pull the same dirty crap Nvidia has done.

They are just doing it in public so they won't get as many nasty emails and people claiming that they are cheating.

No, it doesn't make me feel good, and no, it's not right, but ATI is using the same tactics because, like Nvidia, they are spread too thin and are making the exact same mistakes.

It pains me to no end. Some of you will say they have to do this to keep up with Nvidia's own modifications, but then what the hell was that interview all about, where ATI said Nvidia was doing optimisations for the NV4x? Which is bullcrap.

ATI is repeating Nvidia's mistakes, and this time the shoe is on the other foot. This isn't about trolling or fanboys; it is what it is.
 
ATI and why they are now being run by monkeys.

jvd said:
Smurfie said:
Must you turn everything into ATI vs. Nvidia?

All I want is a checkbox that lets us disable ATI's optimizations. It's a checkbox, not an outright removal of the feature. Give us that option and put an end to all the shit-slinging.

And all I want is a checkbox to disable Nvidia's optimizations. It's a checkbox, not an outright removal of that "feature".

Give us that option. God only knows how long they have been doing it.

How do you watch a company drive itself into the ground and then take your own company and do the same thing? ATI is run by a bunch of moronic monkeys. They watched Nvidia do it in the past with the NV30, so guess what, let's be as stupid as they are and do the same friggin thing.

The idea is to learn and better yourself, not make the same mistakes as your competition. Anyone in Canada need some business sense?
 
Re: ATI and why they are now being run by monkeys.

Proforma said:
jvd said:
Smurfie said:
Must you turn everything into ATI vs. Nvidia?

All I want is a checkbox that lets us disable ATI's optimizations. It's a checkbox, not an outright removal of the feature. Give us that option and put an end to all the shit-slinging.

And all I want is a checkbox to disable Nvidia's optimizations. It's a checkbox, not an outright removal of that "feature".

Give us that option. God only knows how long they have been doing it.
How do you watch a company drive itself into the ground and then take your own company and do the same thing? ATI is run by a bunch of moronic monkeys. They watched Nvidia do it in the past with the NV30, so guess what, let's be as stupid as they are and do the same friggin thing.
"Moronic monkeys"? Maybe you should take a look in the mirror. Tell us, what multibillion dollar company do you run that makes you such an authority?
The idea is to learn and better yourself, not make the same mistakes as your competition. Anyone in Canada need some business sense?
What mistake has ATI made? Aren't they profitable? Aren't they growing revenues every quarter? Application-specific optimizations don't have to be cheats.

-FUDie
 
martrox said:
So... in this case, chalk one up for nVidia... :rolleyes:

Point being, ATI has no choice but to do this in order to compete. It sucks, and it's the major reason the BS nVidia pulled over the last two years will have a lasting effect on the industry. And the real effects aren't even here... yet. As the manufacturers start spending time, money, and resources optimising for the really big games, the smaller ones will lose out. There will be less innovation... the money will be spent on optimisations.

Thank you, nVidia... we all lose!
 
jvd said:
Smurfie said:
Must you turn everything into ATI vs. Nvidia?

All I want is a checkbox that lets us disable ATI's optimizations. It's a checkbox, not an outright removal of the feature. Give us that option and put an end to all the shit-slinging.

And all I want is a checkbox to disable Nvidia's optimizations. It's a checkbox, not an outright removal of that "feature".

Give us that option. God only knows how long they have been doing it.

I repeat: "Must you turn everything into ATI vs. Nvidia?" Does ATI not do well enough to stand on its own merits that you have to resort to putting down competitors?

I don't care what Nvidia does in its drivers anyway; if they want to cheat, it's their problem. But I am back to using my 9600, since I RMA'ed my 6800nu due to its in-game freezing problems. Nvidia's drivers have two checkboxes that disable the trilinear and anisotropic filtering optimizations. Whether they really disable them or not is for your conspiratorial minds to decide. But the option is there, and ATI staunchly refuses to put in a checkbox to disable its own optimizations. ATI's refusal does nothing more than add fuel for the critics.

Look, ATI has played a lot of filtering tricks in the past. They have been doing it for a long time, with some proof here. This optimization lives in the driver, so why don't we just put in the option and silence the critics?

If we want ATI to lead, we have to get them to avoid mistakes and stay clean and ahead of the pack. Your continual defense of their mistakes does no good at all.
 
Can someone answer the question of why driver-based optimisations are considered bad? Forget for a moment basing your argument on which IHV you perceive to be doing them, and concentrate instead on explaining what is inherently wrong with them. To keep things simple, let's confine this to optimisations that have no discernible impact on IQ (by discernible I mean observable to the untrained naked eye).

One of the main arguments I have seen against optimisations is that they skew benchmark results. So what?

The trouble is that games are not made for the purpose of being tools to benchmark video cards fairly. For a long time now, games have been tweaked to follow slightly different rendering paths depending on the hardware detected. If Valve, for instance, choose a lower-IQ rendering path for the NV3x to keep performance playable, do we lambast them for screwing up our precious benchmarks? Or do we praise them for attempting to make the game playable for the widest number of customers?

So, if Nvidia or ATI do the same thing via the drivers, is it suddenly wrong, even if they manage it without lowering IQ? If Nvidia do shader replacement in D3 for NV3x cards, is that really so terrible? If you are an NV3x owner playing D3, I don't think you would think so. If you are an ATI owner, are you going to turn round to the people enjoying D3 and say, "Sorry, you must suffer worse performance because it pisses me off that you are getting better FPS than me on my ATI card"? Are you Nvidia owners going to whinge at ATI for coming up with a frankly very clever way of optimising filtering? Why shouldn't they enjoy this benefit?
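For anyone unclear on what driver-level shader replacement actually involves, here is a minimal sketch in C++ (every name is invented for illustration - no vendor's driver internals are public): the driver recognises the running game, hashes the shader the game submits, and substitutes a hand-tuned equivalent if it has one.

[code]
#include <cstdint>
#include <functional>
#include <string>
#include <unordered_map>

// Hypothetical replacement table: hash of the shader the game submitted,
// mapped to a hand-tuned equivalent written by the driver team.
using SwapTable = std::unordered_map<std::uint64_t, std::string>;

std::string SelectShader(const std::string& exeName,
                         const std::string& submittedSource,
                         const SwapTable& swaps)
{
    if (exeName == "doom3.exe") {  // application detection (illustrative)
        const std::uint64_t key = std::hash<std::string>{}(submittedSource);
        const auto it = swaps.find(key);
        if (it != swaps.end())
            return it->second;     // same output, cheaper instruction mix
    }
    return submittedSource;        // any other app gets the generic path
}
[/code]

Nothing in the mechanism itself forces an IQ loss: if the replacement genuinely computes the same image, the only thing affected is the benchmark's value as a proxy for other games.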

The fact is that benchmarks from any single game should never be relied on as indicative of a piece of hardware. Every game is different and places different demands on the hardware. UT2004 makes very different demands than Doom 3 - you can't necessarily rely on benchmarks of the former to predict performance in the latter. As shaders become more complex and hardware gains more features, you will inevitably see different rendering paths - meaning benchmarks are never going to be totally accurate for apples-to-apples comparisons anyway.

Sure, this makes life more difficult for people like Dave who have to benchmark cards, but perhaps people care too much about benchmarks anyway. Who really cares if Card X gets 3 FPS more than Card Y in Game Z, so long as the game is playable on both cards? Answer: only people who enjoy enacting vicarious pissing contests.
 
I don't see why this seems so complicated.

Nvidia:
-uses brilinear
-used clipping planes in a benchmark to inflate the results; obviously clipping planes are not applicable in real-world scenarios (real games, etc.)

ATI:
-uses brilinear
-says the world isn't ready for SM3 yet
-...

Maybe I'm missing something, but of the two evils, isn't it obvious which has shown the more disturbing behavior?
 
I don't mind application detection or trylinear or all the other tweaks, the isssue is whether they come defaulted to off and you have to do some work to enable them, or they come defaulted to on and you have to do some work to disable them.

You would assume the former is the better to protect people not interested in the ins and outs of graphics cards and just want to play on a new PC for Christmas, however which would these guys really notice more, silky fast frmae rates or IQ optimisations ? So maybe they should be on by default, even for the ignorant ?
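As a sketch of what is actually being argued over (invented names - neither vendor's control panel is structured like this), the whole opt-in vs opt-out question is just which value ships as the default:

[code]
// Illustrative only: the debate is which boolean ships as the default.
struct FilteringOptions {
    // Opt-out model: tweak ships enabled; enthusiasts must switch it off.
    bool adaptiveTrilinear   = true;
    // Opt-in model: tweak ships disabled; enthusiasts must switch it on.
    bool appSpecificProfiles = false;
};
[/code]

Either default is a one-line change; the argument is purely about whom it should favour, the Christmas-PC crowd or the people running comparisons.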

Dave Baumann is right, though: it's yet another thing to be aware of when reading (or writing) a review, trying to work out whether the playing field was level.

It's becoming a minefield. Good job these website reviewers are highly paid, or else they would have something to complain about :D
 
Diplo said:
Can someone answer the question of why driver-based optimisations are considered bad?

I know what you mean. I don't consider driver optimizations bad. When I had my 6800nu, I left all the optimizations on and didn't really mind them.

The problem isn't whether I think they're bad. The problem is that there ARE people who think they're bad. And because those people exist, there must be an option to disable the optimizations, so that comparisons can be made without giving anyone cause to gripe over enabled optimizations.

My opinion: driver optimizations are fine.
Fact of life: some people can't live with an IHV having a faster product and will find any excuse to sling mud at the other. We saw it with the NV25 vs the R200, then the NV3x vs the R3x0, and now the NV40 vs the R420. Driver optimizations are merely a source of mud.
 
Smurfie said:
My opinion: driver optimizations are fine.
Fact of life: some people can't live with an IHV having a faster product and will find any excuse to sling mud at the other. We saw it with the NV25 vs the R200, then the NV3x vs the R3x0, and now the NV40 vs the R420. Driver optimizations are merely a source of mud.

Amen. What I find most stupid about all these arguments is that even if Nvidia and ATI provided us with a plethora of options to enable/disable all their cheats/optimizations/shader replacements, 100% of people would enable all of those that cause no discernible visual difference when actually playing games.

With regard to shader replacements being an 'unfair' advantage - why should a consumer give a rat's ass if the result is equivalent IQ at higher performance? Personally, I welcome all optimizations/cheats/replacements/voodoo magic that allow me to run my games faster, as long as IQ is maintained.
 
Can someone answer the question of why driver-based optimisations are considered bad?

They aren't.

Being honest, I don't care how the drivers generate the image I see on screen, as long as it looks good (only my own definition matters here) at the speed I want it displayed. If ATi or nvidia want to sacrifice goats in order to give me the image I see on screen, who am I to care?
 
Diplo said:
Can someone answer the question of why driver-based optimisations are considered bad?

IMO, optimizations done for specific games are in fact good for those individual games, but they have the side effect that the game is then no longer as useful as a general performance benchmark, because other games will not have exactly the same optimizations (if any).

The real trouble comes because only those games which are regularly used as benchmarks tend to receive optimizations! (Talking about Nvidia only here, obviously.)

ERK
 
ERK said:
The real trouble comes because only those games which are regularly used as benchmarks tend to receive optimizations! (Talking about Nvidia only here, obviously.)

ERK

Good observation. However, benchmark titles are usually very demanding on the system and hence would benefit more from such optimizations. And of course, it is better to have a few optimized games than none at all.
 
ERK said:
The real trouble comes because only those games which are regularly used as benchmarks tend to receive optimizations! (Talking about Nvidia only here, obviously.)

ERK

So you don't think ATI targets the top benchmarked titles with their optimizations?
 
Bjorn said:
ERK said:
The real trouble comes because only those games which are regularly used as benchmarks tend to receive optimizations! (Talking about Nvidia only here, obviously.)
So you don't think ATI targets the top benchmarked titles with their optimizations?
Please show us where ATI has optimizations for "top benchmarked titles". I mean in today's released drivers, not in some future driver.

-FUDie
 
Diplo said:
Can someone answer the question of why driver-based optimisations are considered bad?

IMO, the whole point of app-specific optimizations is to win benchmarks no matter how good or bad the architecture actually is, and in fact to mislead buyers.
Another thing is general optimization, which ALL games can benefit from, but there should still be a way to turn it off.
 
Tweaker said:
Another thing is general optimization, which ALL games can benefit from, but there should still be a way to turn it off.
Completely absurd. If the optimization is general, adding code to turn it off would probably slow everything down. For example, do users need a way to disable shader compilers/optimizers (I am talking about real optimizers, not shader replacement)? No.
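To illustrate the distinction with a sketch (invented names and assumed structure - no driver source is public): an app-specific swap sits behind one cheap branch the driver can skip when a checkbox is off, whereas a general optimiser is woven into the compiler itself, so "turning it off" means maintaining a second, slower backend:

[code]
#include <optional>
#include <string>

struct DriverOptions { bool allowAppSpecificSwaps = true; };  // the checkbox

// Stand-in for the per-game replacement table lookup.
std::optional<std::string> LookupReplacement(const std::string&) {
    return std::nullopt;  // stub: no replacement known
}

// Stand-in for the normal compiler pipeline. Constant folding, register
// allocation and scheduling happen in here, interleaved with code
// generation; there is no clean seam at which to "disable" them.
std::string CompilePipeline(const std::string& src) { return src; }

std::string LowerShader(const std::string& src, const DriverOptions& opts)
{
    // The app-specific path is trivially gated by one cheap branch...
    if (opts.allowAppSpecificSwaps)
        if (auto repl = LookupReplacement(src))
            return *repl;
    // ...but a checkbox for the general optimiser would mean shipping a
    // whole second, de-optimised pipeline just to satisfy benchmarkers.
    return CompilePipeline(src);
}
[/code]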

-FUDie
 