Could someone link me a list of Nvidia and ATI OEM contracts

The root of "Demirjian" means "head of metal." (I'm not kidding, I promise you :) )
After reading that piece of "journalism" the name does seem very fitting.

EDIT: Here are some other "gems" -

Chuck said:
In the older engines, HL2, Q3, and others, ATI has a slim advantage.
Half Life 2 is running on an "old" engine? :oops: CS:Source was released about five days ago. Ancient.

Chuck also said:
ATI just rebadges the previous generation, and uses its numbering scheme to make it sound like they have something that it doesn't.
The Good Lord Above knows NV would never do a thing like that. No sir.

Chuck said:
ATI was rather smug, and pointed to benchmarks like Quake3. Yay, 450FPS if you spend $400 on ATI, and you only get 425FPS if you spend $400 on NVidia.
No comment needed....
 
digitalwanderer said:
pat777 said:
http://www.theinquirer.net/?article=18029
LOL, that's the funniest article I've read today. :LOL:
ATI has won a ton of OEMs.
The Inq article said:
Why is this so worrisome? Well, for two reasons, licensing and the future. Licensing is the worst one, Doom3 is an absolutely spectacular engine, head and shoulders above anything else out there. It is scalable, beautiful, has a marquee name, and is saddled with amateurish gameplay. Three out of four ain't bad, and luckily for Nvidia, gameplay is not its problem. In fact, only the first three are relevant to video.

So, one marquee game versus another, it would appear to be a tie. The thing is that the Doom3 engine will be licensed by just about everyone under the sun. The HL2 engine may be, or may not be, but people won't be clamouring for a Newell engine as rabidly as a Carmack one. While the proverbial 'people' may be wrong, they are the ones spending the money.
Unfortunately, if there IS shader replacement going on in D3 with nVidia's drivers, then all the games based on the D3 engine will either need custom shaders or won't perform as well.

I don't think D3 is a good indicator for how nVidia is going to run D3 based games, if that makes any sense.

The point is not consumers, but OEMs. They are more than smart enough to actually look at the capabilities before they buy the chips by the hundreds of thousands. Worse yet, many of them sell to Joe Six-Pack by means of spec sheets more than performance. Nvidia could hit those checkboxes with the last generation, but ATI still can't. The current generation gets worse, the high end can't offer the features at all, and the mid and low end get worse from there.

As the games start coming out that use PS3.0, and people start looking for those checkboxes, one company will be there. OEMs know this, and buy accordingly. That is where the money is, and it is where ATI isn't and won't be. The last time someone missed like this badly, it took Nvidia about 2 years to recover, and 3DFx never did. I'm not intoning anything here, simply stating it, just watch and see.
Has this guy looked at what cards the OEMs are actually buying?!?! :oops:

SLI alone will be enough to make Nvidia own every benchmark under the sun by simply abusive margins. The early benchmarks indicate that it really is everything Nvidia said it would be. ATI's response? Well, nothing at first, I think it realises that HL2 benchmarks won't cut it this time.

It gets worse though. There has been no response at all since the SLI announcement, and this is a classic sign of one side getting caught with its pants down.
Sweet jeeze! I guess those dual card rigs from Alienware that were announced before SLI don't count? :rolleyes:

I can't believe that article, I just can't....WHY would someone write such BS?

dude...why does everyone have this idea that nv is doing shader replacement in Doom3???
 
^eMpTy^ said:
dude...why does everyone have this idea that nv is doing shader replacement in Doom3???

Maybe because the lead programmer John Carmack said as much?
 
BRiT said:
^eMpTy^ said:
dude...why does everyone have this idea that nv is doing shader replacement in Doom3???

Maybe because the lead programmer John Carmack said as much?

That quote from Carmack? That's what you guys are talking about?

The benchmarking was conducted on-site, and the hardware vendors did not have access to the demo before hand, so we are confident that there is no egregious cheating going on, but it should be noted that some of the ATI cards did show a performance drop when colored mip levels were enabled, implying some fudging of the texture filtering. This has been a chronic issue for years, and almost all vendors have been guilty of it at one time or another. I hate the idea of drivers analyzing texture data and changing parameters, but it doesn't visibly impact the quality of the game unless you know exactly what to look for on a specific texture. On the other hand, the Nvidia drivers have been tuned for Doom's primary light/surface interaction fragment program, and innocuous code changes can "fall off the fast path" and cause significant performance impacts, especially on NV30 class cards.

This quote?

So Carmack goes off on ATi's AF optimizations for almost a paragraph...then makes a comment about how Nvidia drivers are "tuned" for doom's light/surface interaction fragment program...and you come away with Nvidia is doing shader replacement?

I don't see the word "shader" or the word "replacement" anywhere in that quote...I do see the word "hate" though...

And even if this was shader replacement...we've already gone over IQ on nv and ati cards and they have been judged by every source to be equal...if not slightly in favor of nvidia...

So ATi can do AF optimizations but it's ok cuz you can barely see them...but when someone vaguely alludes to something that can possibly be misconstrued as shader replacement even though there is absolutely no known visual impact...that's something to get excited over?

I mean seriously...how can you possibly care what Nvidia's driver is doing if it is providing the same IQ as ATi??? Isn't this the exact same argument applied to ATi's AF optimizations?

Or is there another quote?
 
^eMpTy^ said:
So Carmack goes off on ATi's AF optimizations for almost a paragraph...then makes a comment about how Nvidia drivers are "tuned" for doom's light/surface interaction fragment program...and you come away with Nvidia is doing shader replacement?

I don't see the word "shader" or the word "replacement" anywhere in that quote...I do see the word "hate" though...

You do see the word "fragment program", maybe you should look up the meaning before you make an ass out of yourself.
 
Tim said:
You do see the word "fragment program", maybe you should look up the
meaning before you make an ass out of yourself.

There is a difference between optimizing a specific path/fragment program and shader replacement.
 
Bjorn said:
Tim said:
You do see the word "fragment program", maybe you should look up the
meaning before you make an ass out of yourself.

There is a difference between optimizing a specific path/fragment program and shader replacement.

True optimizations would not falter due to innocuous code changes. Nor would they cause their cards to "fall off the fast path" and cause significant performance impacts.

Read the writing on the wall ... Nvidia is doing shader replacements.
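
For context on what the "fast path" talk could mean in practice: Doom 3 hands its fragment programs to the driver as plain ARB assembly text, so a driver could in principle fingerprint the exact text of the interaction program and swap in a hand-tuned equivalent. The sketch below is purely hypothetical Python, not anyone's actual driver code, and the program text in it is a stand-in rather than Doom 3's real interaction program; it only illustrates why a replacement keyed to exact program text survives an exact match and nothing else, while a generic optimization doesn't care about innocuous edits.

import hashlib

# Stand-in for the fragment-program text a game submits to the driver as
# plain ARB assembly. This is NOT Doom 3's real interaction program.
KNOWN_PROGRAM = """!!ARBfp1.0
# light/surface interaction stand-in
MOV result.color, fragment.color;
END
"""

def fingerprint(text):
    # Exact fingerprint of the submitted program text.
    return hashlib.md5(text.encode()).hexdigest()

# Hypothetical table of hand-tuned replacements, keyed on the exact text.
HAND_TUNED = {fingerprint(KNOWN_PROGRAM): "<hand-tuned native code>"}

def generic_compile(text):
    # Stand-in for the driver's ordinary compiler/optimizer: it handles any
    # program, and an innocuous source edit barely changes the result.
    return "<generically compiled code from %d bytes of source>" % len(text)

def driver_compile(text):
    key = fingerprint(text)
    if key in HAND_TUNED:
        return HAND_TUNED[key]    # exact match: the "fast path"
    return generic_compile(text)  # any edit at all falls off it

print(driver_compile(KNOWN_PROGRAM))                         # hand-tuned
print(driver_compile(KNOWN_PROGRAM.replace("MOV", "MOV ")))  # generic fallback

Whether NVIDIA's driver actually works like this is exactly what's being argued over; the sketch just shows why, if it does, Carmack's observation that innocuous code changes "fall off the fast path" is what you'd expect to see, whereas a general-purpose optimization wouldn't blink.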
 
Tim said:
^eMpTy^ said:
So Carmack goes off on ATi's AF optimizations for almost a paragraph...then makes a comment about how Nvidia drivers are "tuned" for doom's light/surface interaction fragment program...and you come away with Nvidia is doing shader replacement?

I don't see the word "shader" or the word "replacement" anywhere in that quote...I do see the word "hate" though...

You do see the word "fragment program", maybe you should look up the meaning before you make an ass out of yourself.
:rofl: :rofl: :rofl:


I don't know why, but this post absolutely slayed me!!! :LOL:
 
Bjorn said:
Tim said:
You do see the word "fragment program", maybe you should look up the
meaning before you make an ass out of yourself.

There is a difference between optimizing a specific path/fragment program and shader replacement.
That seems a very fine line to me, a very fine ethical line. In terms of end results (speed ups with the caveat that code changes could negate the optimization), though, they would appear to be the same.
 
Tim said:
^eMpTy^ said:
So Carmack goes off on ATi's AF optimizations for almost a paragraph...then makes a comment about how Nvidia drivers are "tuned" for doom's light/surface interaction fragment program...and you come away with Nvidia is doing shader replacement?

I don't see the word "shader" or the word "replacement" anywhere in that quote...I do see the word "hate" though...

You do see the word "fragment program", maybe you should look up the meaning before you make an ass out of yourself.

How does "tuned" equate to shader replacement? I just don't follow that logic.

Second question, how did this concept of shader replacement get such a bad name? Are there any reviews showing degraded image quality due to shader replacement? Has it caused any games to run improperly or crash or anything?
 
Bjorn said:
^eMpTy^ said:
Second question, how did this concept of shader replacement get such a bad name?

NV30 + 3D Mark 2003

Ah, good old build 340 of 3DMark03...so I spent an hour looking around on Google and all the usual sites for more information on shader replacement...and so far the only people who seem to think it's a bad thing are ATi (big surprise) and 3DMark03...I still haven't found any evidence that it ever degraded image quality...and that's pretty much all I care about...

Anyone have any links to anything showing that shader replacement hurt image quality in any way?
 
Can I ask you where you were the whole of last year?

You don't need to use Google, use the search function in these forums. And please read the thousands of threads we had last year before starting a new one :)
 
BRiT said:
True optimizations would not falter due to innocuous code changes.

LMAO, "true" optimization? Seems somewhat silly to say, considering that it is the NV owners who are benefitting from the "fake" optimization, without suffering from reduced image quality in Doom3 ;)
 