App Detection

Geo

sireric said:
Possibly on a per application basis, different tile sizes would benefit things. On our long list of things to investigate.

What?? Application detection! Heathen! Cheater! Unclean!!

Oh. Wait. That was last year's knee-jerk.

Never mind. :oops:
 
sireric said:
...
Performance issues are still very texture fetch bound (cache efficiency, memory efficiency, filter types) in modern apps, as well as partially ALU/register allocation bound. There's huge performance differences possible depending on how you deal with texturing and texture fetches. Even Shadermark, if I recall correctly, ends up being texture bound in many of its cases, and it's very hard to make any assumptions on ALU performance from it. I know we've spent many a time in our compiler, generating various different forms of a shader, and discovering that ALU & register counts don't matter as much as texture organization. There are no clear generalizable solutions. Work goes on.

This is very close to what I was about to say...;)

geo said:
What?? Application detection! Heathen! Cheater! Unclean!!

Oh. Wait. That was last year's knee-jerk.

I believe that last year's "knee-jerk" was a bit more complex than that...;) Application detection was merely the minor component, as even last year it was well known and generally accepted that application detection to address *game bugs* was not only necessary, but desirable. The problem last year had to do with what happened *after* application detection, primarily concerning reductions in normal rendering IQ for the purpose of inflating benchmark scores of all kinds, both "real game" benchmarks (such as the UT2K3 fly-by, for instance) and synthetics (such as 3dMk03).

The 3dMk03 scandal, in which nVidia cheated by way of only processing the work necessary to properly render 3dMk03 on the static camera path, which became apparent when the camera was moved off the path manually (just as would be the case in any "real" 3d game), is what started the understandable chain reaction last year. nVidia only compounded the problem for themselves by trying to pass off the 3dMK03 cheat as an "optimization" and declaring war on FutureMark and every other game (such as TR:AoD) and benchmark supporting "DX9" functionality that its nV3x chips either didn't support at all or couldn't run competitively.

Thankfully, we are emerging from the dark time last year (almost exclusively the product of nV's misleading nV3x PR campaigns) and terms such as "application detection" and "optimization" no longer elicit negative, unthinking, knee-jerk reactions from the public, and the truth about the situation is once again becoming visibly apparent. The truth is that there are both beneficial and deceptive optimizations, and beneficial or deceptive application detections.

Still, I think that some damage lingers in that some people unfortunately think that the only beneficial optimization or app detection is a "dead" one...;) To the degree that such misconceptions linger in the public mindset there is yet work to be done.
 
...BOMBASTIC TRAILER MUSIC BEGINS...

"IN A WORLD ON THE BRINK"

"IN THE YEAR 2050"

"ON THE EVE OF A HISTORIC ANNIVERSARY"

"ONE MAN..."

"WILL..."

"REMEMBER AN EVENT..."

"SO EVIL..."

"THAT HISTORY SHOULD NEVER FORGET..."

"THE NVIDIA 3DMARK03 FIASCO"

"WALTC *IS* 'THE VERBOSE 3DMARK03 REMEMBERER'"

Release day: Christmas 04, in all theaters everywhere. "The Verbose 3DMARK03 Rememberer"


:)
 
WaltC said:
geo said:
What?? Application detection! Heathen! Cheater! Unclean!!

Oh. Wait. That was last year's knee-jerk.

I believe that last year's "knee-jerk" was a bit more complex than that...;)

Of course it was, Walt. I was mostly funnin'. Tho I think it got to a point last year where more than a few people seemed to be ready to consign application detection to hell in all circumstances. Wasn't ATI forced at one point by the public outcry to take some optimizations out of 3dm03 that they felt were perfectly legitimate? I seem to recall they were.
 
Yes, some "mathematically equivalent" hand-coded shaders, swapped in for some less-efficient "native" code. Obviously the desire was to get ATi and nV to develop drivers that would output more efficient, mathematically-equivalent code on the fly, but I don't know how realistic that is with an 18- or even 36-month life cycle. Heck, how old is each Pentium generation, and how many more engineers do they have plugging away on their compilers?

I guess app-detection and hand-coded shaders are now shortcuts we'll have to live with. It just means reviewers have to study IQ with even keener eyes.
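
As a toy illustration of what "mathematically equivalent but more efficient" can mean (this is not any vendor's actual replacement shader, just the flavor of rewrite a driver compiler might do on the fly), consider a specular-style power term:

/* Toy C illustration only: two mathematically equivalent ways to evaluate
 * a specular-style power term.  A generic code path might call the library
 * pow(); a hand-tuned (or driver-rewritten) path gets the same value from
 * four multiplies. */
#include <math.h>

float specular_generic(float n_dot_h)
{
    return powf(n_dot_h, 16.0f);     /* straightforward, relatively expensive */
}

float specular_tuned(float n_dot_h)
{
    float p2 = n_dot_h * n_dot_h;    /* x^2  */
    float p4 = p2 * p2;              /* x^4  */
    float p8 = p4 * p4;              /* x^8  */
    return p8 * p8;                  /* x^16: same result, cheaper to evaluate */
}

Of course, not every "equivalent" rewrite stays equivalent at the precision the game was authored for, which is exactly why the keener-eyed IQ scrutiny Pete mentions still matters.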
 
Pete said:
Obviously the desire was to get ATi and nV to develop drivers that would output more efficient, mathematically-equivalent code on the fly, but I don't know how realistic that is with an 18- or even 36-month life cycle.
There's also the obstacle of compile times interfering with game performance. For this reason, I think it would be beneficial to have a GLSL-style programming language for GPUs that allows compilation to object files, and has various optimization options (just like normal compilers). Games could then, for instance, only recompile shaders when the driver changes, which would allow IHV's to support much longer compile times, leading to possibly higher performance without hand-tweaking or shader replacement.
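
For what it's worth, OpenGL did eventually grow exactly this kind of facility (glGetProgramBinary / glProgramBinary via ARB_get_program_binary). A rough C sketch of a driver-keyed binary cache built on it, where the cache_path() and compile_from_source() helpers, the on-disk layout, and the use of a loader like glad are all just assumptions for illustration:

/* Rough sketch: cache compiled program binaries on disk, keyed on the
 * driver (GL_VENDOR/GL_RENDERER/GL_VERSION strings), so shaders are only
 * recompiled when the driver changes.  Assumes a GL context that
 * supports ARB_get_program_binary. */
#include <glad/glad.h>   /* assumption: any GL function loader works here */
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical helper: builds a filename from the shader's name plus the
 * GL_VENDOR/GL_RENDERER/GL_VERSION strings, so a driver update changes
 * the path and naturally misses the old cache. */
void cache_path(char *out, size_t n, const char *shader_name);

/* Hypothetical helper: compiles and links from GLSL source.  It should set
 * GL_PROGRAM_BINARY_RETRIEVABLE_HINT to GL_TRUE before linking so the
 * binary can be read back. */
GLuint compile_from_source(const char *shader_name);

GLuint load_program(const char *shader_name)
{
    char path[512];
    cache_path(path, sizeof path, shader_name);

    FILE *f = fopen(path, "rb");
    if (f) {
        fseek(f, 0, SEEK_END);
        long size = ftell(f);
        fseek(f, 0, SEEK_SET);

        GLenum format = 0;
        fread(&format, sizeof format, 1, f);      /* stored binary format tag */
        long len = size - (long)sizeof format;
        void *blob = malloc((size_t)len);
        fread(blob, 1, (size_t)len, f);
        fclose(f);

        GLuint prog = glCreateProgram();
        glProgramBinary(prog, format, blob, (GLsizei)len);
        free(blob);

        GLint linked = GL_FALSE;
        glGetProgramiv(prog, GL_LINK_STATUS, &linked);
        if (linked)
            return prog;                          /* cache hit: no compile cost */
        glDeleteProgram(prog);                    /* stale binary: fall through */
    }

    /* Cache miss: pay the (possibly long) optimizing compile once, save blob. */
    GLuint prog = compile_from_source(shader_name);

    GLint len = 0;
    glGetProgramiv(prog, GL_PROGRAM_BINARY_LENGTH, &len);
    void *blob = malloc((size_t)len);
    GLenum format = 0;
    glGetProgramBinary(prog, len, NULL, &format, blob);

    f = fopen(path, "wb");
    if (f) {
        fwrite(&format, sizeof format, 1, f);
        fwrite(blob, 1, (size_t)len, f);
        fclose(f);
    }
    free(blob);
    return prog;
}

The point of keying the filename on the driver strings is exactly the behavior described above: the slow, heavily optimizing compile happens once per driver change instead of on every run.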
 
Pete said:
Yes, some "mathematically equivalent" hand-coded shaders, swapped in for some less-efficient "native" code. Obviously the desire was to get ATi and nV to develop drivers that would output more efficient, mathematically-equivalent code on the fly, but I don't know how realistic that is with an 18- or even 36-month life cycle. Heck, how old is each Pentium generation, and how many more engineers do they have plugging away on their compilers?

I guess app-detection and hand-coded shaders are now shortcuts we'll have to live with. It just means reviewers have to study IQ with even keener eyes.

Personally, I don't see very much difference between a card doing application detection. . .and an application doing card detection. In both cases we've lost the general case applicability that we're striving for, and expended resources that could have gone towards improvements for everyone. One hopes (sometimes in vain) that in the latter case the developer is more IQ conscious. . .but I think we've seen enough cases to know some of them are willing to make tradeoffs to get performance for a class of card that will affect a decent chunk of their target market.

But we're getting OT here.
 
Well, since hardware from different IHV's is inherently going to have different optimization characteristics, there's only so much optimization you can do "for everyone."
 
Chalnoth said:
Well, since hardware from different IHV's is inherently going to have different optimization characteristics, there's only so much optimization you can do "for everyone."

Well, there are two cases here. In the card-detecting-app case, "for everyone" would mean, say, all DX9 apps, as opposed to just the high-profile ones, so that a purchaser intending to play a down-market DX9 game, rarely benchmarked by the usual suspects, could still feel reasonably comfortable that performance- and reliability-wise he'll likely see similar relative results to the big titles.

In the app-detecting-card case, I'd say resources are fungible if you get improvements you can rely on over time. This means shorter development times, cheaper games, greater profit margins (which is indirectly good for gamers), better art, add-on packs, sequels, or some combination of all of the above. That kind of "for everyone".

And then I woke up. ;)
 
"WALTC *IS* 'THE VERBOSE 3DMARK03 REMEMBERER'"

I've booked my ticket already.

"Somebody" should remember tho.

Before the R300/nV30 I would never have imagined that the events that occurred ... uhm ... would occur. I still can't believe (to some extent) that FX owners are even now posting threads with plaintive cries of "what is wrong with my card".

With what went on I think there is a need for somebody like Waltc to remind us. Lest we forget. :(
 
DemoCoder said:
"THE NVIDIA 3DMARK03 FIASCO"

"WALTC *IS* 'THE VERBOSE 3DMARK03 REMEMBERER'"

Release day: Christmas 04, in all theaters everywhere. "The Verbose 3DMARK03 Rememberer"


:)

Actually, I had this in mind:

"Those who do not remember the mistakes of the past are doomed to repeat them," or something similar. I wish I could take credit for the sentiment, but of course I can't...;)
 
ondaedg said:
Walt, don't you think you are overdramatizing just a bit? Everyone does app-detection. Get over it.
Do NOT let the Dig or any of them other fanboys hear ya talking like that, they'll crucify ya! :oops:

More seriously, WaltC again sums it up well (and strangely enough succinctly too!):

WaltC said:
"Those who do not remember the mistakes of the past are doomed to repeat them," or something similar. I wish I could take credit for the sentiment, but of course I can't...;)
I think trying to re-write history and underplay nVidia's dishonesty is just as bad as demonizing it.
 
digitalwanderer said:
WaltC said:
"Those who do not remember the mistakes of the past are doomed to repeat them," or something similar. I wish I could take credit for the sentiment, but of course I can't...;)
I think trying to re-write history and underplay nVidia's dishonesty is just as bad as demonizing it.

Heh. Unfortunately, the industry is more like your typical family than the green eye-shade cold-blooded captains of industry that you'd expect.

This means a lot of histrionics, pouting, over-reaction, re-reaction, accommodation to new facts, etc., on the way to a higher level of understanding.

The model does work, but it is an emotional rollercoaster.
 
The thing is... does anyone know if NVIDIA was the first (which seems to be the assumption here) when it comes to "app detection" (whatever that term means, and however general its scope)?
 