The Nvidia driver chat transcript

pabst

I missed the first question, though it appeared to be something along the lines of what NVIDIA stands for.

The UT AF issue is at 18:25 (you can guess what they said) and the Ut2K3.exe rename issue is mentioned at 19:26 but not elaborated upon.

http://www.cs.uno.edu/~jpabst


pabst.
 
This was one of the responses posted in the chat regarding how driver improvements can help across all lines of their products

<nv_nick> (obviously, stuff like shader optimization is an exception, but things like texture placement, etc. help all)

My question is: what does "texture placement" mean?
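Purely a guess on my part, and not anything stated in the chat: "texture placement" presumably means the driver's decisions about where textures live, local video memory versus the AGP aperture, which would indeed help every product line since it isn't shader-specific. GL 1.1 already exposes the application-side knobs for it:

```c
/* Sketch built on that assumption: hint that a texture should stay in fast
 * (local) memory via its priority, then ask whether the driver actually
 * placed it there. Both calls are core since OpenGL 1.1. */
#include <GL/gl.h>

GLboolean keep_texture_resident(GLuint tex)
{
    GLclampf priority = 1.0f;          /* 1.0 = highest placement priority */
    GLboolean resident = GL_FALSE;

    glPrioritizeTextures(1, &tex, &priority);
    glAreTexturesResident(1, &tex, &resident);
    return resident;                   /* GL_TRUE if resident (e.g. in VRAM) */
}
```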


Here is another question regarding an answer they gave:

<Bantha> When will OpenGL 2.0 support be enabled in NVIDIA drivers? And if so, at the release of OpenGL 2.0, will full support for fragment and vertex shaders extensions be enabled?

&lt;nv_nick> "OpenGL 2.0" is a term that one of the other vendors uses, but right now, the only thing there is is OpenGL 1.4 + a bunch of extensions...

<nv_nick> We will support all of the extensions as soon as we can.

What is meant by Nick's first response? Another vendor uses the term, so they can't?
 
No. There is a certain level of historical enmity between NVIDIA and 3Dlabs, partly because of the role 3Dlabs played in the original OpenGL (it spawned from their API) and the weight they hold within the ARB because of it. They view the OpenGL 2.0 movement as a "3Dlabs" thing at the moment.

Technically it is correct, though: the OpenGL 2.0 functionality is a bunch of different extensions right now, but as they get ratified by the ARB they will become the new core and the basis for OpenGL 2.0.
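As a sketch of my own (not from the thread) of what that means for an application today: the "OpenGL 2.0" shading functionality has to be probed as ARB extensions rather than assumed from the GL version.

```c
/* Probing for the "OpenGL 2.0" shading functionality while it still lives
 * in ARB extensions. Requires a current GL context. The strstr() check is
 * a naive substring match, adequate for a sketch. */
#include <string.h>
#include <GL/gl.h>

/* Returns nonzero if 'name' appears in the driver's extension string. */
static int has_extension(const char *name)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, name) != NULL;
}

static int shading_language_available(void)
{
    /* The extensions expected to be folded into the core for OpenGL 2.0. */
    return has_extension("GL_ARB_shader_objects") &&
           has_extension("GL_ARB_vertex_shader") &&
           has_extension("GL_ARB_fragment_shader");
}
```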
 
Loved this bit:

<Bantha> Why does nVidia feel they have to spend time optimizing benchmark programs (that do NOT directly benefit the consumer) instead of spending their time optimizing games?

<nv_nick> hang on, typing...

<NV_Derek> we spend most of our time optimizing for games - but unfortunately our pc manufacturers make a lot of decisions based on these bmarks.

So all that stuff about not optimising for benchmarks was just what everyone suspected - B.S.?

The majority of the chat just seemed to be a PR stunt to blame their driver 'issues' on bugs. Pretty worthless.
 
cellarboy said:
Loved this bit:

<Bantha> Why does nVidia feel they have to spend time optimizing benchmark programs (that do NOT directly benefit the consumer) instead of spending their time optimizing games?

<nv_nick> hang on, typing...

<NV_Derek> we spend most of our time optimizing for games - but unfortunately our pc manufacturers make a lot of decisions based on these bmarks.

So all that stuff about not optimising for benchmarks was just what everyone suspected - B.S.?

The majority of the chat just seemed to be a PR stunt to blame their driver 'issues' on bugs. Pretty worthless.
Yup, but they stretched it out for an hour and a half by taking forever between non-committal responses at least... :rolleyes:
 
digitalwanderer said:
cellarboy said:
Loved this bit:

<Bantha> Why does nVidia feel they have to spend time optimizing benchmark programs (that do NOT directly benefit the consumer) instead of spending their time optimizing games?

<nv_nick> hang on, typing...

<NV_Derek> we spend most of our time optimizing for games - but unfortunately our pc manufacturers make a lot of decisions based on these bmarks.

So all that stuff about not optimising for benchmarks was just what everyone suspected - B.S.?

The majority of the chat just seemed to be a PR stunt to blame their driver 'issues' on bugs. Pretty worthless.
Yup, but they stretched it out for an hour and a half by taking forever between non-committal responses at least... :rolleyes:


This chat was supposed to be an opportunity to see if Nvidia were serious about cleaning up their act, being honest, and giving commitments to stop misleading their customers. Unfortunately, it looks like the same old, same old: PR spin pretending to be technical answers, dodging the difficult questions, and making lots of vague promises about how things will be improved in the future.

Nothing to see here, move along....
 
Brent said:
Here is another question regarding an answer they gave:

<Bantha> When will OpenGL 2.0 support be enabled in NVIDIA drivers? And if so, at the release of OpenGL 2.0, will full support for fragment and vertex shaders extensions be enabled?

&lt;nv_nick> "OpenGL 2.0" is a term that one of the other vendors uses, but right now, the only thing there is is OpenGL 1.4 + a bunch of extensions...

<nv_nick> We will support all of the extensions as soon as we can.

What is meant by Nick's first response? Another vendor uses the term, so they can't?

They're actually right not to use the term 'OpenGL 2.0', but I can't elaborate on why. All will be revealed in the fullness of time.
 
digitalwanderer said:
Yup, but they stretched it out for an hour and a half by taking forever between non-committal responses at least... :rolleyes:

I've often thought Perez has a framed sign hanging in his office which says: "Never answer a question when you don't have to." He's made a career out of talking about shoes when asked to describe a cravat...:)

Heh... I was surprised to see him slip a bit on the benchmark question, effectively admitting that nVidia butchers benchmarks in the hope of stimulating OEM sales [no comment on the logic of that proposition]. So who knows, maybe he's trying valiantly to reform? Habits die hard, though, and I can see how that might be very tough to do.
 
Hanners said:
What is meant by Nick's first response? Another vendor uses the term, so they can't?

They're actually right not to use the term 'OpenGL 2.0', but I can't elaborate on why. All will be revealed in the fullness of time.

Was OGL 1.5 not mentioned in the interview at all then?

MuFu.
 
Now that the cat's been let out of the bag part way, how about letting it run loose completely? What's this about OpenGL 1.5, and how does it differ from OGL 2.0?

I assume it's something less than 2.0, or perhaps 2.0 will grow into something more than originally intended.

I vaguely remember reading that 2.0 might have some features that even the r300 couldn't support in hardware. If true, does this have anything to do with the need for 1.5?

Or perhaps 1.5 is meant for older cards, such as the gf3/4ti and r2x0 series...

I've made enough blind guesses for now; help me out and let the cat out of the bag. Do it for the kittens!



edit: One other question: Why's it a big secret? Who's trying to keep the secret, and who are they trying to keep it from? (OK, so that was three questions...)
 
Contrary to Nvidia's claim, developers WILL have access to low-level hardware features from the assembly level if they so desire. Each hardware vendor will have the choice of supporting their own hardware-specific assembly language or the more common ARB_vertex_program assembly language extension, as they desire. These assembly level interfaces will work seamlessly with the OpenGL 2.0 high level shading language.

http://www.extremetech.com/article2/0,3973,183940,00.asp
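A hedged sketch of my own of the assembly-level path the article describes: loading a trivial ARB_vertex_program that transforms the vertex by the tracked modelview-projection matrix. It assumes GL_ARB_vertex_program is advertised and that prototypes are available (GL_GLEXT_PROTOTYPES on Linux, or resolve the entry points with wglGetProcAddress/glXGetProcAddress).

```c
#define GL_GLEXT_PROTOTYPES
#include <string.h>
#include <GL/gl.h>
#include <GL/glext.h>

static const char *vp_src =
    "!!ARBvp1.0\n"
    "# Transform the incoming vertex by the concatenated MVP matrix.\n"
    "PARAM mvp[4] = { state.matrix.mvp };\n"
    "DP4 result.position.x, mvp[0], vertex.position;\n"
    "DP4 result.position.y, mvp[1], vertex.position;\n"
    "DP4 result.position.z, mvp[2], vertex.position;\n"
    "DP4 result.position.w, mvp[3], vertex.position;\n"
    "MOV result.color, vertex.color;\n"
    "END\n";

void load_vertex_program(void)
{
    GLuint prog;
    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_VERTEX_PROGRAM_ARB, prog);
    glProgramStringARB(GL_VERTEX_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)strlen(vp_src), vp_src);
    /* On failure, the driver reports details via
     * glGetString(GL_PROGRAM_ERROR_STRING_ARB). */
    glEnable(GL_VERTEX_PROGRAM_ARB);
}
```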
 
From the 3Dlabs news release ( http://biz.yahoo.com/prnews/030724/sfth042_1.html ), here's a little ditty concerning OpenGL 1.5:

3Dlabs is the first company to ship a preliminary implementation of the OpenGL Shading Language, which was ratified by the OpenGL Architecture Review Board as an official extension to OpenGL 1.5 and is expected to form the foundation of the upcoming OpenGL 2.0 standard.
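For the curious, this is roughly what that extension-era shading language path looks like from code (a sketch of my own, not from the release, with the same prototype assumption as the block above): compiling a trivial fragment shader through GL_ARB_shader_objects before any of it is core.

```c
#define GL_GLEXT_PROTOTYPES
#include <stddef.h>
#include <GL/gl.h>
#include <GL/glext.h>

static const char *fs_src =
    "void main() {\n"
    "    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); /* solid red */\n"
    "}\n";

GLhandleARB compile_fragment_shader(void)
{
    GLhandleARB sh = glCreateShaderObjectARB(GL_FRAGMENT_SHADER_ARB);
    glShaderSourceARB(sh, 1, &fs_src, NULL);  /* NULL: string is NUL-terminated */
    glCompileShaderARB(sh);
    return sh;  /* caller checks GL_OBJECT_COMPILE_STATUS_ARB */
}
```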
 
ZoinKs! said:
Now that the cat's been let out of the bag part way, how about letting it run loose completele? What's this about OpenGL 1.5, and how does it differ from oGL 2.0?

OpenGL 1.5 is Ope...*gunshot* *Men in black trenchcoats run off*
 
StealthHawk said:
OpenGL 1.5 is Ope...*gunshot* *Men in black trenchcoats run off*

No, it's not. It's the basis for some elements of OpenGL 2.0. There is still a lot to be ratified and approved to move stuff from ARB extensions to core OpenGL (and then also to move other stuff out of the core).
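To put that in concrete terms (a sketch of my own, not from the thread): extension-to-core promotion means an application prefers the core feature when the reported version is new enough and falls back to the ARB extension otherwise. Occlusion query (GL_ARB_occlusion_query) really is one of the features promoted to core in OpenGL 1.5.

```c
#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

/* Returns nonzero when the context reports at least major.minor.
 * Version parsing is deliberately simplified. */
static int gl_version_at_least(int major, int minor)
{
    int maj = 0, min = 0;
    const char *ver = (const char *)glGetString(GL_VERSION);
    if (ver == NULL || sscanf(ver, "%d.%d", &maj, &min) != 2)
        return 0;
    return maj > major || (maj == major && min >= minor);
}

int occlusion_query_available(void)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return gl_version_at_least(1, 5) ||
           (ext != NULL && strstr(ext, "GL_ARB_occlusion_query") != NULL);
}
```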
 
Bah! And here I was waiting till November for the 1.4 red book!

Have they moved anisotropic filtering into the core yet? Can't believe that's not even an ARB extension by now!

Edit: Well well, Amazon.co.uk has renamed its catalogue entry to 1.5... but Amazon.com still says 1.4.
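On the anisotropic filtering question above, for what it's worth: it's exposed through GL_EXT_texture_filter_anisotropic, neither ARB nor core. A minimal sketch of using it, assuming the extension is advertised and a 2D texture is bound:

```c
#include <GL/gl.h>
#include <GL/glext.h>   /* GL_TEXTURE_MAX_ANISOTROPY_EXT et al. */

void enable_max_aniso(void)
{
    GLfloat max_aniso = 1.0f;
    /* Query the hardware's limit... */
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);
    /* ...and apply it to the currently bound 2D texture. */
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, max_aniso);
}
```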
 