ATI and App Specific Optimisations

Listen, I bet 95% of the forum members are just skipping your posts.
Optimizations and cheating were talked to death for months on this forum, and with far more thought than in your posts.
Read the archives and you may learn something, tweaker, proforma, ondaedg and so on; no one will waste their time arguing it all again.
 
PatrickL said:
Listen, I bet 95% of the forum members are just skipping your posts.
Optimizations and cheating were talked to death for months on this forum, and with far more thought than in your posts.
Read the archives and you may learn something, tweaker, proforma, ondaedg and so on; no one will waste their time arguing it all again.
Thanx for your advice, but I have read B3D since 2003 or so. Maybe you should read the thread before you post something :LOL:
 
Tweaker said:
Btw, is texture compression OK in terms of IQ degradation? Just want to know your opinion. Please, no flame ;)

If the app is not asking for texture compression, using a lossy format is cheating. Using some kind of lossless compression to improve performance is OK.
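To be concrete about where I'd draw that line, here's a rough sketch in C++ (purely hypothetical driver logic, made-up names, not anyone's actual code):

```cpp
#include <cstdio>

// Hypothetical texture formats, for illustration only.
enum class Format { RGBA8, DXT1_Lossy, Packed_Lossless };

// Sketch of a driver-side decision: which format to really store a texture in,
// given what the application asked for.
Format ChooseStorageFormat(Format requested, bool appAskedForCompression) {
    if (appAskedForCompression)
        return Format::DXT1_Lossy;        // fine: the app opted in to lossy compression
    // Re-packing the data without losing any bits is a legitimate optimisation.
    // Silently swapping in a lossy format behind the app's back would be the
    // "cheat" case described above (left commented out on purpose):
    // return Format::DXT1_Lossy;
    return Format::Packed_Lossless;
}

int main() {
    Format f = ChooseStorageFormat(Format::RGBA8, /*appAskedForCompression=*/false);
    std::printf("stored as format %d\n", static_cast<int>(f));
}
```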
 
ondaedg said:
Ok, let's sum up:

Nvidia:
1. Nvidia is evil.
2. Nvidia "may" be doing app specific optimizations (how evil!) but we have no proof
3. Nvidia cheated in 3dMark!
4. Nvidia gives the end user the option to use brilinear filtering! (They are evil!)
5. Nvidia may be doing shader replacements that I can't notice but that make the game faster (how evil!) and no, we have no proof of this either...
6. Once again, Nvidia is evil?

ATI:
1. Nvidia is evil!
2. ATI "may be doing app specific optimization (this is good!) but we have no proof
3. Nvidia cheated in 3dMark!
4. ATI uses a brilinear filtering without the option to turn it off unlike Nvidia therefore, Nvidia is evil!
5. ATI may be doing shader replacements that I can't notice but makes the game faster (good work ATI!) and no, we have no proof of this either..
6. ATI's execs are being investigated for insider trading so....Nvidia is an immoral company!
7. ATI is being sued by its stockholders for unethical accounting practices and inaccurate financial reports (As long as it doesn't hurt us PC enthusiasts!)
8. ATI has been reducing filtering quality to enhance performance since the r200 era. (Nvidia is evil!)
9. Therefore, Nvidia is evil?

Where did you read that? It is obviously not in this thread. Maybe you should try actually reading the thread; if you have already done that, try looking past your own bias. That might help you understand what people are actually writing.
 
Re: Ati becoming Nvidia

Proforma said:
Many of you doubted this, but it's coming true more and more each month.
They can't compete, so they're doing the same dirty crap Nvidia has done.

They are just doing it in public so they won't get as many nasty emails and people claiming that they are cheating.

No, it doesn't make me feel good, and no, it's not right, but ATI is using the same tactics because they are currently spread too thin, like Nvidia, and going through the exact same mistakes.

It pains me to no end. Some of you will say they have to do this to keep up with Nvidia's own modifications, but then what the hell was that interview all about, where ATI said Nvidia was doing optimisations for the NV4x, which is bullcrap?

ATI is making Nvidia's same mistakes, and this time the shoe is on the other foot. This isn't about trolling or fanboys; it is what it is.

What the hell are you talking about?? I get the feeling some people skim-read the first post and delve no deeper, just posting their first thought.
 
ondaedg said:
... some peeps will sit in front of a computer for hours a day attacking other people and defending a vendor just because they have adopted some strange fascination with that particular IHV.

:LOL: :LOL: I like your style. Friendly word of warning, though: criticism of ATI's products or any of its supporters on these boards is not taken lightly! :devilish: Sure, there is a lot of hypocrisy from both camps, but it's obvious enough, so just take the forum for what it is and try to have some fun ;)
 
So, does anyone actually have an example yet of ATI's application "optimizations," or are we just having simultaneous circle-jerks by each of the fanboy camps?
 
The Baron said:
So, does anyone actually have an example yet of ATI's application "optimizations," or are we just having simultaneous circle-jerks by each of the fanboy camps?

Nope, but you would think the many people with anti-ATI agendas would have found something in the last year or so that ATI has had it in use.

Funny how we weren't even looking for Nvidia's brilinear, and yet the image quality loss was so apparent that Beyond3D staff members found it in a preview of the card.
 
jvd said:
The Baron said:
So, does anyone actually have an example yet of ATI's application "optimizations," or are we just having simultaneous circle-jerks by each of the fanboy camps?

Nope, but you would think the many people with anti-ATI agendas would have found something in the last year or so that ATI has had it in use.

If they have done their optimizations right, as in no loss of IQ, then why should we notice them?
 
The Baron said:
So, does anyone actually have an example yet of ATI's application "optimizations," or are we just having simultaneous circle-jerks by each of the fanboy camps?

As far as I know, there are no applications that have been proven to be app-specifically optimized by ATI, or NVidia for that matter. That is the reason for my posts in this thread. There is no conclusive proof (yet) that any IHV is doing app-specific optimizations.

Quite honestly though, I don't care if they do. As long as the image quality and the gameplay don't suffer, let them have their fun!
 
Tim said:
ondaedg said:
Ok, let's sum up:

Nvidia:
1. Nvidia is evil.
2. Nvidia "may" be doing app specific optimizations (how evil!) but we have no proof
3. Nvidia cheated in 3dMark!
4. Nvidia gives the end user the option to use brilinear filtering! (They are evil!)
5. Nvidia may be doing shader replacements that I can't notice but that make the game faster (how evil!) and no, we have no proof of this either...
6. Once again, Nvidia is evil?

ATI:
1. Nvidia is evil!
2. ATI "may be doing app specific optimization (this is good!) but we have no proof
3. Nvidia cheated in 3dMark!
4. ATI uses a brilinear filtering without the option to turn it off unlike Nvidia therefore, Nvidia is evil!
5. ATI may be doing shader replacements that I can't notice but makes the game faster (good work ATI!) and no, we have no proof of this either..
6. ATI's execs are being investigated for insider trading so....Nvidia is an immoral company!
7. ATI is being sued by its stockholders for unethical accounting practices and inaccurate financial reports (As long as it doesn't hurt us PC enthusiasts!)
8. ATI has been reducing filtering quality to enhance performance since the r200 era. (Nvidia is evil!)
9. Therefore, Nvidia is evil?

Where did you read that? It is obviously not in this thread. Maybe you should try actually reading the thread; if you have already done that, try looking past your own bias. That might help you understand what people are actually writing.

Hi Tim. Perhaps that post was a summation of a lot of posts. I expected you to see that. :D
 
ondaedg said:
The Baron said:
So, does anyone actually have an example yet of ATI's application "optimizations," or are we just having simultaneous circle-jerks by each of the fanboy camps?

As far as I know, there are no applications that have been proven to be app-specifically optimized by ATI, or NVidia for that matter. That is the reason for my posts in this thread. There is no conclusive proof (yet) that any IHV is doing app-specific optimizations.

Quite honestly though, I don't care if they do. As long as the image quality and the gameplay don't suffer, let them have their fun!

Really? I seem to remember (probably fuzzy memory, BTW) that a Russian guy picked apart the Nvidia drivers, even going as far as to decrypt them when they tried to cover up their dirty work, and happened to find references to practically every decent game under the sun. I'd go find a link, but the search button is too far away. :can'tbebotheredsmilie:
 
ondaedg said:
As far as I know, there are no applications that have been proven to be app-specifically optimized by ATI, or NVidia for that matter. That is the reason for my posts in this thread. There is no conclusive proof (yet) that any IHV is doing app-specific optimizations.
nV has admitted this to Derek at AT, and I believe it was Unwinder who found tons of references to specific games in nV's drivers. So they are doing it.

Quite honestly though, I don't care if they do. As long as the image quality and the gameplay don't suffer, let them have their fun!
Agreed, but the sticking point is that nV used some blatantly unhelpful-to-the-gamer tweaks to gain standing in benchmarks like 3DM03. Both ATi and nV have admitted to optimizing for 3DM01SE Nature, but I *believe* those optimizations increase speed without noticeably impacting IQ in a legitimate way (i.e., no custom clip planes or omitted z clears).

Anyone who tries to downplay or excuse nV's 3DM03 "app-specific optimizations" must not understand what they did and why. It's that simple, IMO. Substituting optimized shaders that speed up gameplay--as opposed to just benchmark runs--without noticeably degrading IQ is still OK, IMO. It ain't ideal, but then what is in this crazy world of ours? :p

I'm not sure we can expect mathematically-equivalent optimizations, as games will render differently on different cards simply because of differently-capable hardware (DX7 vs. DX8, FX8 vs. FX12, FP16 vs. FP24, ATi's 5-bit vs. nV's 8-bit texture filtering, etc., etc.). In general, I'd only consider a mathematically-equivalent code substitution an "optimization." If the drivers can find a shortcut to the right answer (without knowing it beforehand [nV + 3DM03]), I'd consider that a "generic optimization." But we're probably not going to see much of either with games, as games still prefer speed over pixel-perfect rendering. IMO, the best we can hope for are optimizations that increase speed without noticeably decreasing IQ. Mathematical equivalence isn't realistic for games that have to run on several generations of hardware from several different manufacturers.
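To make the distinction concrete, here's a toy example in C++ (just an illustration of the idea, not anything pulled from an actual driver):

```cpp
#include <cmath>
#include <cstdio>

// Toy stand-in for a per-pixel computation a driver might see in a shader.
// Original form: specular = pow(ndoth, 2.0)
float SpecularOriginal(float ndoth) {
    return std::pow(ndoth, 2.0f);
}

// Mathematically-equivalent substitution: same answer, cheaper instructions.
// This is the kind of replacement I'd still call an "optimization".
float SpecularOptimized(float ndoth) {
    return ndoth * ndoth;
}

// NOT equivalent: dropping precision (think FP16 vs. FP24/FP32) changes the
// result, even if it's faster and often hard to see.
float SpecularReducedPrecision(float ndoth) {
    // crude simulation of a lower-precision unit
    float truncated = static_cast<float>(static_cast<int>(ndoth * 1024.0f)) / 1024.0f;
    return truncated * truncated;
}

int main() {
    float n = 0.73f;
    std::printf("%f %f %f\n", SpecularOriginal(n), SpecularOptimized(n),
                SpecularReducedPrecision(n));
}
```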
 
Pete said:
Agreed, but the sticking point is that nV used some blatantly unhelpful-to-the-gamer tweaks to gain standing in benchmarks like 3DM03. Both ATi and nV have admitted to optimizing for 3DM01SE Nature, but I *believe* those optimizations increase speed without noticeably impacting IQ in a legitimate way (i.e., no custom clip planes or omitted z clears).

Yes! Finally!

IMO the clipping plane hack is *the* despicable act. Anything else (for both ATI and Nvidia: shader replacements, insider trading, SM2 vs. SM3, etc., etc.) is trivial compared to a hack that considerably tilts reviews in a favorable direction *while providing absolutely no benefit to the player*. At *least* shader replacements will improve performance across the line, even if they're mathematically incorrect or whatever.
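For anyone who missed the 3DMark03 mess, the reason the clip plane trick only helps on the benchmark's rail camera is easy to see in a simplified sketch (hypothetical C++ with made-up numbers; the real thing lived inside the driver):

```cpp
#include <cstdio>

struct Vec3  { float x, y, z; };
struct Plane { float a, b, c, d; };   // ax + by + cz + d >= 0 means "visible"

// A plane hard-coded for one known camera path: everything the benchmark's
// rail camera can never see falls behind it, so skipping that work is "free"
// -- but only for that exact camera path.
const Plane kHardCodedPlane = {0.0f, 0.0f, 1.0f, -50.0f};   // made-up numbers

bool BehindHardCodedPlane(const Vec3& p) {
    return kHardCodedPlane.a * p.x + kHardCodedPlane.b * p.y +
           kHardCodedPlane.c * p.z + kHardCodedPlane.d < 0.0f;
}

// Sketch of a draw path that silently drops geometry behind the baked-in plane.
// On the benchmark's rail this just inflates the score; the moment someone moves
// the camera off the rail (as reviewers did), chunks of the scene vanish.
void DrawObject(const Vec3& objectPos) {
    if (BehindHardCodedPlane(objectPos))
        return;                      // skipped: no work done, no pixels drawn
    std::printf("drawing object at (%.1f, %.1f, %.1f)\n",
                objectPos.x, objectPos.y, objectPos.z);
}

int main() {
    DrawObject({0.0f, 0.0f, 100.0f});  // in front of the plane: drawn
    DrawObject({0.0f, 0.0f, 10.0f});   // behind it: silently culled
}
```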
 
Pete said:
ondaedg said:
As far as I know, there are no applications that have been proven to be app-specifically optimized by ATI, or NVidia for that matter. That is the reason for my posts in this thread. There is no conclusive proof (yet) that any IHV is doing app-specific optimizations.
nV has admitted this to Derek at AT, and I believe it was Unwinder who found tons of references to specific games in nV's drivers. So they are doing it.

I also have text from an MSN conversation with Derek Perez, who openly stated that part of their compiler optimiser's task is to "replace shaders where they feel they need to" (I wasn't going to get into a discussion with him about what a compiler actually is), and that they would keep replacing shaders in their drivers every time 3DMark issues another patch to defeat their detections. We also have the case where we found that in the initial 5900 drivers merely renaming UT would offer full trilinear. ATI have also openly stated to me that they have some detections in R200 drivers - there are things like vertex buffers that get reassigned in order to better suit how they handle the vertex shaders.
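The UT case is trivial to picture, by the way - detection of that sort doesn't need to be any cleverer than something like this (a hypothetical sketch; the exe name is just an example and this is obviously not the actual driver code):

```cpp
#include <cstdio>
#include <string>

enum class Filtering { FullTrilinear, Brilinear };

// Hypothetical sketch of executable-name detection: the driver looks at which
// process loaded it and quietly changes behaviour. Rename the .exe and the
// match fails - which is exactly why renaming UT restored full trilinear.
Filtering PickFiltering(const std::string& exeName) {
    if (exeName == "ut2003.exe")
        return Filtering::Brilinear;      // detected: quietly drop to cheaper filtering
    return Filtering::FullTrilinear;      // everything else gets what it asked for
}

int main() {
    std::printf("%d %d\n",
                static_cast<int>(PickFiltering("ut2003.exe")),
                static_cast<int>(PickFiltering("renamed.exe")));
}
```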

Over the course of many looks into IQ there were also issues with the lighting being off in Halo in certain releases, and Lars did an article pointing out the IQ issues in Aquamark - now, these may or may not have been a result of "optimisations", but because of what had already been seen, and Unwinder's work, mud was sticking. And because of the retraction Futuremark had to make, the word "optimisation" was applied and hence developed a nasty connotation.

Optimisation, per se, is never a bad thing - ask Intel or AMD; they have been optimising their CPUs and compilers for years. However, "optimisation" has a dark side associated with it, and it may be all too tempting to stray off and do things you shouldn't necessarily do in order to improve performance where you think people might not notice - so where does that line get drawn? Without people looking into these things, consumers would just have to have faith that what they are looking at is what the developer intended them to look at, and who can really say they have that faith in any of them?

As for mathematically equivalent shader optimisations, while I'm fairly ambivalent about their use, I would always point out that these tend to be more fragile - they're not generic, so a user may find good performance one day, download a patch that has some innocuous shader change, and find their performance sucks; alternatively, a shader that's used in an engine may be altered in another use of that engine, which means the IHV will need to optimise it again to get back to the performance of its initial implementation, and if the title is not a benchmark the question is whether or not that would get done. If you think about it, though, an optimal shader compiler should be able to compile to, or at least very close to, the best-case hand-tuned shader anyway, and while short-term quick fixes may be good for replaced shaders, overall it would be better to put the main effort into getting your shader compiler to hit as close to that theoretical maximum as possible on a generic basis.
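To illustrate why replaced shaders are so fragile: the detection usually amounts to matching the shader the game uploads against a stored fingerprint, conceptually something like this (a sketch only, nobody's real code):

```cpp
#include <cstdio>
#include <functional>
#include <string>
#include <unordered_map>

// Conceptual sketch of driver-side shader replacement: fingerprint the shader
// the game uploads and, if it matches a known one, swap in a hand-tuned version.
std::unordered_map<size_t, std::string> g_handTunedShaders = {
    { std::hash<std::string>{}("mul r0, v0, c0  add r0, r0, c1"),
      "hand-tuned replacement A" },
};

std::string CompileShader(const std::string& gameShader) {
    size_t fingerprint = std::hash<std::string>{}(gameShader);
    auto it = g_handTunedShaders.find(fingerprint);
    if (it != g_handTunedShaders.end())
        return it->second;                 // replacement kicks in
    return "generic compiler output";      // fall back to the real compiler path
}

int main() {
    // Matches the stored fingerprint: the hand-tuned path applies.
    std::printf("%s\n", CompileShader("mul r0, v0, c0  add r0, r0, c1").c_str());
    // A patch changes one constant register and the match silently fails,
    // which is exactly the fragility described above.
    std::printf("%s\n", CompileShader("mul r0, v0, c0  add r0, r0, c2").c_str());
}
```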

Both ATi and nV have admitted to optimizing for 3DM01SE Nature, but I *believe* those optimizations increase speed without noticeably impacting IQ in a legitimate way (i.e., no custom clip planes or omitted z clears).

IIRC both used compressed textures in Nature, which will have some effect on IQ.
 
cloudscapes said:
Pete said:
Agreed, but the sticking point is that nV used some blatantly unhelpful-to-the-gamer tweaks to gain standing in benchmarks like 3DM03. Both ATi and nV have admitted to optimizing for 3DM01SE Nature, but I *believe* those optimizations increase speed without noticeably impacting IQ in a legitimate way (i.e., no custom clip planes or omitted z clears).

Yes! Finally!

IMO the clipping plane hack is *the* despicable act. Anything else (for both ATI and Nvidia: shader replacements, insider trading, SM2 vs. SM3, etc., etc.) is trivial compared to a hack that considerably tilts reviews in a favorable direction *while providing absolutely no benefit to the player*. At *least* shader replacements will improve performance across the line, even if they're mathematically incorrect or whatever.

Personally, that is where we differ. Maybe you would feel differently if you were a shareholder.

Don't get me wrong, guys, I don't want to see Nvidia and ATI turn good-looking games into some alternate world just to speed up their benchmarks. Like I said, I am all for the IHVs making games faster as long as they maintain the image quality and playability. We should all demand that. Furthermore, I am not saying we should give them free rein to do what they want. That would be suicide as a consumer. We should continuously pick at and question everything they do. How else do you keep a company honest?

What I am sick of seeing is the same old bullcrap from the same old people who turn this into their own personal war just to satisfy their screwed-up obsession with their favorite IHV. It's the same old guys who jump into these threads not with anything constructive to say, but to knock the "enemy" IHV down with their own personal opinion, which they state as fact. It is annoying as hell. For example, if I were to start a thread praising Nvidia for making a damn good card this generation, I can pretty much count on several of the same posters entering the thread with their own personal take on why Nvidia sucks or why ATI's new card is better. My question to these loooosers is: who the hell cares? No one said anything bad about ATI, but you can damn well count on them coming in there.

Why do I actually give a shit? Well, I was around back in the day when Tom's Hardware had the most visited forum on the net. Right around when Nvidia came into the picture with the Riva128 and then the TNT, the 3dfx zealots (as they became known) turned the forums into an all-out war, prompting Tom to ban many members and go to a more bulletin-board-style system (which sucked). The good, logical discussions became pissing contests. This forum has had a good amount of intelligent posters contributing here. I have been coming here for a while, mostly as a lurker. What I have seen over the last year is what I saw happen over at Tom's. And what I see is mostly ATI fans who have this deep hatred for Nvidia that makes any good discussion about video cards almost impossible. It is your choice if you want to dislike them. Everyone is allowed an opinion. What I am saying is that you don't have to force it onto others. And when they don't agree with you, maybe we can respect their decision and carry on in a rational manner.

I am not a "fan" of Nvidia or ATI. But I do use their products and I find how they design their products intriguing which is why I lurk in these forums. I enjoy reading insightful posts. Not garbage...

But who knows. Maybe Beyond3d is seeing more traffic than it ever has before and all this ATI vs Nvidia stuff is netting Dave more ad revenue than he ever dreamed possible! ATI vs Nvidia last man standing! :devilish:
 
ondaedg said:
cloudscapes said:
Pete said:
Agreed, but the sticking point is that nV used some blatantly unhelpful-to-the-gamer tweaks to gain standing in benchmarks like 3DM03. Both ATi and nV have admitted to optimizing for 3DM01SE Nature, but I *believe* those optimizations increase speed without noticeably impacting IQ in a legitimate way (i.e., no custom clip planes or omitted z clears).

Yes! Finally!

IMO the clipping plane hack is *the* despicable act. Anything else (for both ATI and Nvidia: shader replacements, insider trading, SM2 vs. SM3, etc., etc.) is trivial compared to a hack that considerably tilts reviews in a favorable direction *while providing absolutely no benefit to the player*. At *least* shader replacements will improve performance across the line, even if they're mathematically incorrect or whatever.

Personally, that is where we differ. Maybe you would feel differently if you were a shareholder.
So you think it's OK for nV to lie and cheat in order to steal money from gamers to put in your pocket? Because that's all that I can infer from your differing from our opinion that custom clip planes are flat out bogus. Maybe that's why people haven't taken a shine to your arguments in this thread.

Don't get me wrong, guys, I don't want to see Nvidia and ATI turn good-looking games into some alternate world just to speed up their benchmarks. Like I said, I am all for the IHVs making games faster as long as they maintain the image quality and playability. We should all demand that. Furthermore, I am not saying we should give them free rein to do what they want. That would be suicide as a consumer. We should continuously pick at and question everything they do. How else do you keep a company honest?
I think everyone here wants that.

What I am sick of seeing is the same old bullcrap from the same old people who turn this into their own personal war just to satisfy their screwed-up obsession with their favorite IHV. It's the same old guys who jump into these threads not with anything constructive to say, but to knock the "enemy" IHV down with their own personal opinion, which they state as fact. It is annoying as hell. For example, if I were to start a thread praising Nvidia for making a damn good card this generation, I can pretty much count on several of the same posters entering the thread with their own personal take on why Nvidia sucks or why ATI's new card is better. My question to these loooosers is: who the hell cares? No one said anything bad about ATI, but you can damn well count on them coming in there.
I was one of ATi's and the 9700P's "forum advocates" in the last gen, but I can't think of a bad thing to say about the 6800 series. I honestly can't imagine any ppl here, even the most ardent ATi fans, finding much (if any) fault with the 6800 as it compares to the X800.

But when you try to substitute false arguments (no examples of nV hand-optimizing shaders) or strawmen (your list) for this discussion, then feign surprise and disgust at how childish we all are, it's a little hard to take you seriously. That shareholder comment really surprised me, given your indignation. If it's OK for nV to cheat other consumers in order for you as a shareholder to profit, then what's the problem with ATi's execs fudging numbers or trading on inside information for their own profit? They're just doing what they gotta do, right? (And wasn't nV accused of much the same?)

Most over-the-top ATi fans here and elsewhere became that way due to nV's PR and marketing stunts, which were a little heavy-handed even when nV was on top, but became egregious when their hardware was bumped from the top shelf. The 6800 is a different story, and tho ppl may still harbor resentment at nV's self-preservation mode last year, it's hard to argue with good, fast hardware.

Ugh, I really didn't want to get into this again. Look, just tell me you think clip planes are wrong on every level and we can agree to disagree on the technicalities. If you think they're acceptable, then just post a :p or something and we can both save ourselves some typing.
 
ondaedg said:
Personally, that is where we differ. Maybe you would feel differently if you were a shareholder.

This is true, I wouldn't have wanted to be an ATI shareholder at the time of that act. They were rightfully pissed. But there are far, far more ATI consumers than shareholders. So the shareholder thing cheated a few people out of huge sums of money, but the clipping plane issue cheated a huge number of people into buying the (at the time) lesser product. To me, that's the more serious of the two issues, by far. Neither Nvidia nor ATI is close to being a saint. But it is possible to choose the lesser evil without being a fanboi. ;)

That said, I totally agree with you on Nvidia and ATI turning what's supposed to be a service to the consumer into a race over who can beat/cheat the benchmark more. It's absolutely sad that they resort to such immoral tactics in order to get a ten fps gain on the competition. To me, getting 65 fps instead of 55 fps in Farcry means nothing. I know a lot of people do mind, but... well... I don't get them at all. :LOL:
 
cloudscapes said:
This is true, I wouldn't have wanted to be an ATI shareholder at the time of that act. They were rightfully pissed. But there are far, far more ATI consumers than shareholders. So the shareholder thing cheated a few people out of huge sums of money, but the clipping plane issue cheated a huge number of people into buying the (at the time) lesser product. To me, that's the more serious of the two issues, by far.


:oops: I know you don't honestly think that buying a video card compares to losing money on an investment. Some people need to get their priorities straight :LOL:
 
trinibwoy said:
cloudscapes said:
This is true, I wouldn't have wanted to be an ATI shareholder at the time of that act. They were rightfully pissed. But there are far, far more ATI consumers than shareholders. So the shareholder thing cheated a few people out of huge sums of money, but the clipping plane issue cheated a huge number of people into buying the (at the time) lesser product. To me, that's the more serious of the two issues, by far.


:oops: I know you don't honestly think that buying a video card compares to losing money on an investment. Some people need to get their priorities straight :LOL:

If it were an equal number of shareholders vs. card consumers, then of course not! ;)
What we're dealing with is many, many consumers vs. a few shareholders (relatively speaking). Big difference.
 