Kirk on Cg

nV News: How do you define "industry standard" and what steps are you taking to make Cg an industry standard?

David: "Industry Standard" means pervasively used, and universally available. NVIDIA's Cg compiler generates program code for both OpenGL and DirectX 8 vertex and pixel shaders. Cg is therefore available on any graphics hardware that correctly implements DirectX 8, or the OpenGL vertex programming extension as approved by the OpenGL ARB. Since Cg is compatible with Microsoft's High Level Shading Language as announced for DirectX 9, this compatibility and universal availability will be maintained through DirectX 9. Everyone can use Cg on any platform that runs either DirectX or OpenGL. Software vendors are also free to implement tools that take advantage of Cg, as well.

HMMMMmmmMMMmmMMMMm....

G~
 
This isn't really pertaining to Cg itself, but I am so bloody sick of this statement from Nvidia employees: "GPUs have been approximately doubling in performance every 6 months. . . ." What's the old adage? If you tell a lie often enough, eventually people will begin to believe it. A TNT2 could pull around 35fps in Q3 at 10x7x32, so taking Nvidia's mantra of roughly doubling performance every six months, a GF1 should've done 70, a GF2 140, a GF2 Ultra 280, a GF3 560, a GF3 Ti 1120, and a GF4 Ti 2240. Sorry, not even close--and Q3 has definitely scaled with new hardware better than a lot of other games--so could we please stop trying to blatantly BS the gaming and hardware enthusiast markets?

Sorry for the mini-rant, but I'm just really tired of seeing that statement in every other Nvidia interview.
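
For reference, that compounding arithmetic is easy to check. Below is a minimal Python sketch; the 35 fps TNT2 baseline and the list of six-month refreshes come from the post above, and everything else is just repeated doubling.

```python
# What "doubling every 6 months" would imply for Quake 3 framerates,
# starting from the ~35 fps a TNT2 managed at 1024x768x32.
baseline_fps = 35.0
refreshes = ["GF1", "GF2", "GF2 Ultra", "GF3", "GF3 Ti", "GF4 Ti"]

fps = baseline_fps
for card in refreshes:
    fps *= 2  # one claimed doubling per 6-month refresh
    print(f"{card}: {fps:.0f} fps projected")

# Projected: GF1 70, GF2 140, GF2 Ultra 280, GF3 560, GF3 Ti 1120, GF4 Ti 2240.
```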
 
We've usually seen raw speed increases of maybe 10-30% between Nvidia's 6-month cycles in most games; this double-the-speed stuff, taken by itself, is utter nonsense indeed.

But then again, the additional speed alone is not the only important part of a new card generation; what about the new features that are added over time? Let's talk about the "potential" performance. If each generation's features were actually exploited faster in games, we might see bigger speed increases than those 10-30% for sure, and sooner. A GF4 can not only do a higher framerate in all games than a TNT2, but it can also do a lot more stuff that would choke a TNT2 and cause it to crawl along. Quake 3 was programmed for TNT2-level hardware, and while it does of course scale with newer chips, it's still a friggin' old game and not made to show the strengths of newer hardware.

Let's compare performance of the new UT2003 between a TNT2 and a GF4 - from what Mark Rein posted about the engine, a GF4 will be a LOT faster than a TNT2 in that game. 7 generations, each doubling performance, would ask for 128x performance. That most likely won't exactly be the case, but it will certainly be like 20-30x faster, thus showing a LOT more scaling between the 7 generations than Q3 would, which at least makes that statement "less false", if there is such a thing, hehe ...

PS: Not trying to claim Nvidia's PR doesn't suck, 'cause it does... ;)
Just that I think things aren't so simple and there's always more than one way to look at such statements.
 
Though I personally like Cg (from a programming perspective), I must admit every time I see a statement from Kirk, the following song lyric comes to mind:

"We come in peace, shoot to kill, shoot to kill, shoot to kill them men!"

(From "Star Trekkin'")
 
PSarge said:
"Always going forwards, 'cos we can't find reverse." 8)
My first car was a used Mercury Lynx (Mercury version of the Ford Escort). The transmission was flakey and sometimes it would only go forwards! Even if you put it into reverse, it would still go forwards. Was a real pain if you went shopping and someone parked in front of you. :)
 
John Reynolds said:
This isn't really pertaining to Cg itself, but I am so bloody sick of this statement from Nvidia employees: "GPUs have been approximately doubling in performance every 6 months. . . ." What's the old adage? If you tell a lie often enough, eventually people will begin to believe it. A TNT2 could pull around 35fps in Q3 at 10x7x32, so taking Nvidia's mantra of roughly doubling performance every six months, a GF1 should've done 70, a GF2 140, a GF2 Ultra 280, a GF3 560, a GF3 Ti 1120, and a GF4 Ti 2240. Sorry, not even close--and Q3 has definitely scaled with new hardware better than a lot of other games--so could we please stop trying to blatantly BS the gaming and hardware enthusiast markets?

Well, your rant would have been a lot more sound if you hadn't forgotten that today's cards are running with max Q3 details, anisotropic filtering, and FSAA. Today, a GF4 can deliver over 100fps at 4X FSAA + anisotropic filtering + hi-res textures.

The TNT2 Ultra was being reviewed with Quake 3 in July 1999, I believe. That leaves roughly 36 months, or six 6-month periods, in which we should have had a 2^6 performance increase, or 64x performance. First, the GF4 runs at 4X FSAA, so already we have 4x (since it is rendering internally at 4x the resolution); that leaves 16x to account for. 35fps vs. about 100-110fps, that's another 3x. We now have 12x the performance, so all we need is another ~5x, and I'd claim that the usage of hi-res S3TC textures and 4-8 tap anisotropic filtering gives us that.

If a TNT2 Ultra had to render a 2048x1536 image, with large textures, and 8tap anisotropic filtering (assuming it could), there is no way it would get 35fps.

Moreover, you shouldn't expect that if you disabled all the features on a GF4 (choose bilinear, no AA, low-res textures) it would perform at 64x a TNT2 with the same settings, because the Q3 engine is fundamentally constrained by other bottlenecks. If you replaced OpenGL with a NULL driver that doesn't render at all, but returns instantly for any OpenGL call, you would not get 2240 FPS!


The truth is somewhere in between, and I think you need to rethink your argument a little.
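
One rough way to put DemoCoder's argument into numbers is to compare "work per second" rather than raw fps. The sketch below is only an illustration: the fps figures, the 4x supersampling factor and the ~5x texture/filtering factor are taken from the post, and treating them as simple multipliers is an assumption.

```python
# Back-of-the-envelope: effective work per second instead of raw framerate.
def work_per_second(fps, width, height, aa_samples=1, filtering_factor=1.0):
    """Samples shaded per second, scaled by an assumed cost factor for
    high-res textures / anisotropic filtering."""
    return fps * width * height * aa_samples * filtering_factor

tnt2 = work_per_second(35, 1024, 768)                  # bilinear, low-res textures
gf4  = work_per_second(105, 1024, 768, aa_samples=4,   # 4X FSAA = 4x the samples per pixel
                       filtering_factor=5.0)           # assumed ~5x for hi-res + aniso

print(f"Effective speed-up: {gf4 / tnt2:.0f}x")        # ~60x under these assumptions
```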
 
:eek:
 
OpenGL guy said:
My first car was a used Mercury Lynx (Mercury version of the Ford Escort). The transmission was flakey and sometimes it would only go forwards! Even if you put it into reverse, it would still go forwards. Was a real pain if you went shopping and someone parked in front of you. :)

Wow, I've got that same problem with my '91 Nissan Maxima. Most of the time a little jiggle gets it into the right gear. Sometimes it won't even start in park; I have to put it in neutral.
 
hehe... topping up, or changing the transmission fluid would prolly have limited the transmission problems most of the time. :)

Many years ago I worked for a computer supplier, and I delivered large orders of computers in an ancient Ford van... the Super Econoline 1500 something or other... that had a completely arsed-up transmission linkage. Sometimes if you put it in 1st gear, it would NOT come out... same with reverse. The only way to get it out of gear would be to get out of the van, climb underneath and use a broom handle or other long shaft to force the shift linkage to un-bind. hahaha, this led one day to a backup in a hospital parking lot while I climbed under the van to get it into reverse so I could back up to the loading dock... lol, the things I used to do 10 years ago to be in the computer biz. ;)

hehe, i'm in car-mode this week, hardly any computer stuffs... just car fixin. doing brakes and suspension on my lil bros car. should be done tomorrow if things go well (*crosses fingers*). :)
 
Ichneumon said:
hehe... topping up, or changing the transmission
hehe, i'm in car-mode this week, hardly any computer stuffs... just car fixin. doing brakes and suspension on my lil bros car. should be done tomorrow if things go well (*crosses fingers*). :)

The usual. First it's a done deal. Then you'll find a small problem, which will be fixed in 2 days. Then there is a little bit bigger problem, but it is fixed in 2 weeks. Then there is a bigger and nastier problem, which takes 2 months to fix, and then you realise that you have to buy a new car :)
 
Actually, looking back at the reviews at AnandTech, we saw the GeForce DDRs only pulling 23 fps in Q3 at 1600x1200 @ 32-bit with trilinear only. Today the GF4 Ti4600 is about 160ish at that same res and settings.

GF2 should have doubled this to 46, but they were stuck at 35-ish until the Det3/4 drivers
GF2 Ultra should have doubled that to 92, which it did not
GF3 should have doubled that to 184, which it did not
GF3 Ti500 should have doubled that to 368, again far from it
and finally the GF4 Ti4600 should be pulling 736 FPS :eek: :eek: :eek:

Again, this follows his claim of 2x performance every 6 months.



If you want to play word games, David said nothing about the conditions, so if you assume he was talking about anything special (AA, AF, tri, bi, etc.) then you are making an A$$ out of you and me (or so the saying goes). Of course I was assuming he was talking about FPS performance, so I am making an A$$ of myself :) Maybe he meant something else? He really did not give us much.

However, whatever he meant, we certainly have not seen a 2x increase in FPS numbers every 6 months. And thus JR has a valid point that his statement is not close to being accurate.
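
Taking jb's figures at face value, the gap between the projected and the observed number at 1600x1200 works out as below; the 23 fps baseline and the ~160 fps Ti4600 figure are his, and the doubling count simply follows the five refreshes listed above.

```python
# GeForce DDR baseline at 1600x1200x32, five 6-month refreshes later (per jb's post).
baseline_fps = 23        # GeForce DDR in Q3 at 1600x1200, trilinear only
observed_fps = 160       # rough GF4 Ti4600 figure at the same settings
doublings = 5            # GF2, GF2 Ultra, GF3, GF3 Ti500, GF4 Ti4600

projected = baseline_fps * 2 ** doublings
print(f"Projected: {projected} fps, observed: ~{observed_fps} fps "
      f"({projected / observed_fps:.1f}x short of the claim)")
```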
 
DemoCoder said:
Well, your rant would have been a lot more sound if you hadn't forgotten that today's cards are running with max Q3 details, anisotropic filtering, and FSAA. Today, a GF4 can deliver over 100fps at 4X FSAA + anisotropic filtering + hi-res textures.

I'm not really interested in arguing over this, but I think you should consider your own logic. By your words, if a GF4 is pulling, say, 120fps in Q3 at a specific resolution and with 4x AA + AF + hires textures, that means the original GF3 was only pulling 30fps with those settings because Nvidia had to have doubled that with the GF3 Tis and then doubled again with the GF4s. The #s don't even begin to add up, so much so that I'm baffled when I see people defend such statements.

But I had hardly forgotten about increased details and enabled features. In fact, Kirk often states doubled performance with new features enabled, which, to my way of thinking, only compounds the lie.
 
Huh, "approximately doubling in performance" is not neccessarily the same as simply 2x the fps in a certain quality setting and res in Q3A, why does everybody suddenly fixate only on the framerate in one old game? Its not as if they had said "we're doubling the framerate in Q3A", performance includes more than just that and you will need newer games to really show the difference.

Of course the PR statement in itself is exaggerated, and no matter how you twist it there will rarely be a non-synthetic case where it is true (especially at release of the card), but it does have a relation to reality as long as you don't oversimplify a doubling in performance and insist on interpreting it as "you get 2x the fps every 6 months in this one game at the same old quality settings this older card needs to run". Take a new game with high polycounts and multiple texture layers, shaders etc., run it at max quality, and you will probably see like 3 fps (if it will run at all) on a TNT2 against some 80 fps on a Ti4600, which is a frickin' huge difference...
 
Gollum,

he did not say what he meant by doubling performance, now did he? Did he say double the triangle throughput? Doubling the filtered pixel rate? Doubling the rate of AA pixels? What is the only universal way to gauge performance for a comparison?

Like it or not, the only measure used today to gauge performance is looking at FPS scores. I just used Q3 as an example, as it has stayed more or less the same over the last 3 years, so it provides a baseline. Plus we have numbers that go back that far :)


BTW, double does mean 2x, so we want a two-fold increase from each generation of GPU. Also, if we wanted to be picky, we can throw out any TNT scores as he said GPU :) :) :) :) :) :) :)
 
John Reynolds said:
I'm not really interested in arguing over this, but I think you should consider your own logic. By your words, if a GF4 is pulling, say, 120fps in Q3 at a specific resolution and with 4x AA + AF + hires textures, that means the original GF3 was only pulling 30fps with those settings because Nvidia had to have doubled that with the GF3 Tis and then doubled again with the GF4s. The #s don't even begin to add up, so much so that I'm baffled when I see people defend such statements.

When trying to see it only mathematically, of course you've got a point. You almost sound to me like one of those reviewers who only care about their fps in Q3A benchmarks. Granted, taking the GamePC Parhelia review benchmarks as an example, you can see the GF4 Ti4600 performing approximately 1.5x as fast as the GF3 Ti500 in most tests; it's certainly not twice the performance, only halfway there. It's even further off the mark if you compare it to an original GF3 (like 1.8x instead of the asked-for 4x). If Kirk had said "we're doubling performance with every generation", it would already be a lot more accurate; the 6-month claim gets ridiculous because of the thrown-in GF3 Ti refresh, especially if you look back one or two years. However, there's certainly some truth behind the claim once you figure in games making use of more geometry, advanced shading and other newer features than those games currently out. You'll almost certainly be able to experience something like 8x the performance of a GF2 from a GF4 Ti in Doom3 and other upcoming games.

If you want to limit yourself to taking such statements from a company literally, adding up numbers that don't tell the whole story to check the validity of said statement in order to make a fuss about it, then that's your problem. If others don't agree, try taking a little more open-minded approach to the situation, and never forget it's only PR, it's meant to exaggerate! I don't hear you complaining that your toothpaste isn't providing you with the freshest white smile in the world as it claims to ...
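
To put the two growth rates side by side, here is a quick sketch comparing the claimed 2x every 6 months against the roughly 1.5x per generation read out of the GamePC numbers above. The 12-month generation length is an assumption for illustration only.

```python
# Compounded growth: claimed 2x per 6 months vs. an observed ~1.5x per generation
# (generation length assumed to be ~12 months for this illustration).
months = 36
claimed  = 2.0 ** (months / 6)     # 64x over 3 years
observed = 1.5 ** (months / 12)    # ~3.4x over 3 years

print(f"Claimed over {months} months:  {claimed:.0f}x")
print(f"Observed over {months} months: {observed:.1f}x")
```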
 
jb said:
BTW, double does mean 2x, so we want a two-fold increase from each generation of GPU. Also, if we wanted to be picky, we can throw out any TNT scores as he said GPU :) :) :) :) :) :) :)

Hehe, I know double means 2x, but performance is not the same as only frames per second IMHO. ;)

What I am saying is that the real, or rather the potential, performance of each of these parts cannot be measured simply by the fps scores of a technologically outdated game. The new cards will only shine once more complex games show up, and then, and only then, will this claim actually come closer to being real. The 6-month figure makes it easy to pick at the comment, but start comparing between generations (12-18 months) and I think we can actually expect the 4x or 8x claims to come true in future titles...

As for the TNT, John picked it for his original argument; that's why I used it. In my recent post I used a GF2 instead of a TNT in the argument, for all us picky guys - a GF2 is a GPU, no? ;) ;)
 
What he really meant was double the marketing performance!!! :D

With the TNT2, Nvidia's marketing team could only produce 5 statistical charts/second, but every generation that has doubled!!!

wow, that means 5 -> 10 -> 20 -> 40 -> 80 -> 160 statistical charts/second!!! We are REALLY FLYING NOW!

:D :D :D

Nite_Hawk
 