NV30 Update

Joe DeFuria said:
How about I ask you this question then. If you were going to start testing performance for a GeForce4 review today, which Detonator driver would you use?

That's a good question, and one I have spoken about on this board previously. (I'll see if I can dig up my post...)

EDIT - Found it: http://www.beyond3d.com/forum/viewt...storder=asc&highlight=leaked&start=49

But in short: in any review, I would use TWO sets of drivers (assuming they are different).

1) The set that is officially supported by the board manufacturer.
2) The latest beta drivers available to the public either directly from the board manufacturer, or from the chip vendor.

Number 1 is the most important, but unfortunately gets the least attention. Presumably, these drivers should support the complete functionality of the board, including TV-OUT, etc.

It would also be interesting, though not necessary, to see any "leaked" (non-publicly available) drivers, if there is some claim of a major performance increase, bug fix, or new feature implementation. However, "leaked" drivers are not something I would use for overall testing and comparison: just brief and specific tests to validate any claims.

So in the case of these Dets, I would use them in a review (since they are publicly available)...along with the official drivers from the board vendor.

You bring up some good points Joe although we do share a difference of opinion. I thought about using the 40.41 Detonator drivers in a Gainward GeForce4 shootout I'm working on, but have decided to use the 30.82 set for the "official" benchmarks and image quality tests.

I certainly will mention the 40.41 drivers in the article, but based on user feedback, I don't feel they are ready for "prime time" usage. Plus I've got more than enough work to do on the article and I don't need any surprises :) I'm very conservative when it comes to beta drivers and normally avoid the rush to be the first person to try them out. Heck, I haven't even installed the 40.41 drivers yet!

I like the idea of providing a few performance results using the 40.41 drivers and supplementing that with screenshots and a brief discussion of the updated control panel applets. I had also planned on installing Gainward's drivers since their control panel suite is pretty impressive.
 
Joe, as a reviewer of any board manufacturer's product (i.e. not reference boards), my "priorities" are:

1) Use the drivers that came with the CD
2) If, at the time I received the product, the board manufacturer's website has newer drivers than the ones on the CD, I'd d/l and use those (and mention such in the review)
3) If the IHV has the very latest drivers (newer than the CD or the board manufacturer's website), I'd use those, and mention this as well

I remember telling VisionTek about a problem in one of their GF4 products using the CD drivers (can't remember which review, here at B3D), which were newer than the official NVIDIA reference drivers on NV's website. While they acknowledged the problem (only after testing various driver versions, like I told them to!) and said it would be solved in a forthcoming (i.e. not yet available) driver, I told them I had to report it. VT sounded like they didn't want my review "tainted" by this problem I found (it was with 3DMark), i.e. it sounded like they didn't want me to mention the problem, but I told them that I have to (a) use the drivers on the CD; and (b) report the problem.
 
duncan36 said:
Exactly. As I said, Nvidia has again and again said that the NV30 needs a 0.13 micron process because of its advanced functionality. And what if it turns out to be a <= 110 million transistor part? (Since we already have a 0.15 micron 110+ million transistor DX9 part out on the market.)

ATi has more experience with efficient designs than Nvidia does.



...And NVIDIA has always aimed for bad efficiency in their designs? :LOL: Where is the evidence of this?

My knowledge would indicate otherwise, considering NVIDIA chips have always offered more performance clock for clock.
 
ir.net/media_files/NSD/NVDA/presentations/gsasia2002.pdf

Mmmmm, so, now the official line appears to be 100+ million transistors, rather than 120. So it appears the configuration of NV30 hasn't changed, just that things were removed.
 
The whole point of the Det 40 beta drivers is to support Cg developers and deliver OpenGL 1.4 for testing. These drivers were announced a long time ago on the Cg site, and we were promised them for enabling Cg NV30 development and testing. They were released to the public instead of "leaked" because Cg developers are not required to be registered NVidia developers.


If you are a registered developer, you know that NVidia follows an open release pattern -- release early, release often, for the devs. Very few of the driver builds ever become WHQL cert'ed. The Det40 beta has major new functionality which devs have been demanding, so it wouldn't make sense to release it only to the small cadre of reg'ed developers. That's why it was pushed out for wide beta.
 
Galilee said:
Maybe, but I can't seriously remember NVIDIA having any big problems with their drivers at any time. I've had a GF1-DDR, GF2-PRO, GF3, GF3-Ti200 and now Ti4200.

I'll second that, although NVIDIA does have its share of driver-related problems. I play/test quite a few games, and the only one I recall ever having an issue with was the SOF2 multiplayer demo. Flashing textures, and my system locked hard after 30 seconds of gameplay. I was literally in shock when that happened :)
 
MikeC,

You bring up some good points Joe although we do share a difference of opinion.

Actually, our difference in opinion isn't that great. ;)

If you are only going to use a single driver (understandable because time=money), then I would go with the latest drivers officially supported by the board manufacturer. (I assume that's what the 30.82 drivers are? The latest official drivers supported by Gainward?)

Rev,

If by number 3 you mean "chip IHV", then I would disagree with that methodology of choosing those over the board vendor's drivers, if they are the only drivers you'd be using.

I would definitely like to see them, but only in addition to the board vendor's drivers.

Again, reason being that I think any "official product" review should at least have benchmarks using drivers that are officially supported.

We may just have to agree to disagree on that. ;)
 
DemoCoder said:
The whole point of the Det 40 beta drivers is to support Cg developers and deliver OpenGL 1.4 for testing. These drivers were announced a long time ago on the Cg site, and we were promised them for enabling Cg NV30 development and testing. They were released to the public instead of "leaked" because Cg developers are not required to be registered NVidia developers.

If you are a registered developer, you know that NVidia follows an open release pattern -- release early, release often, for the devs. Very few of the driver builds ever become WHQL cert'ed. The Det40 beta has major new functionality which devs have been demanding, so it wouldn't make sense to release it only to the small cadre of reg'ed developers. That's why it was pushed out for wide beta.
Hmm, I'm not sure if this is the "whole point". With the 9700 available now, perhaps there is another "motivation" for releasing them publicly IMO ;)

NV has always stuck by their "latest non-public drivers for registered developers" policy. They have their Cg website. Any programmer who is interested in Cg would know of that Cg website. NV can release these drivers on the Cg website while requiring programmers to register on the Cg website (but not necessarily on the NV dev site).
 
IF you consider speed as the only determining factor for superiority, then you have an argument...but looking at history, ATI makes more advanced cards...

Radeon 64 Meg Vivo
AIW
Supports all bump mapping modes
DVD features galore

Geforce 2 GTS has only one mode, Dot3


Radeon 8500
PS 1.4
Truform (really a building block for displacement mapping)
Higher Internal Precision
Supports all bump mapping modes again
DVD features galore

Geforce 3 and 4 support PS 1.1-1.3 and both bump mapping modes

Radeon 9000

1st value priced Dx 8.1 card

M9000

1st low power Dx 8.1 mobile chip

R300/9700
The most advanced chip/card ever made..

If you look at the trend here, ATI has been releasing more advanced hardware than Nvidia, not to mention the mobile market where low power is the key (i.e. the M9000).
 
Joe DeFuria said:
MikeC,

You bring up some good points Joe although we do share a difference of opinion.

Actually, our difference in opinion isn't that great. ;)

If you are only going to use a single driver (understandable because time=money), then I would go with the latest drivers officially supported by the board manufacturer. (I assume that's what the 30.82 drivers are? The latest official drivers supported by Gainward?)

Since I plan to install the Gainward drivers, I might as well include some benchmarks using them. I haven't checked the drivers that are on the CD yet, but as Anthony mentioned, I'll grab the latest ones from their web site.

As for the official benchmarks I mentioned, I'm also going to include performance results from a GeForce2 Ti, GeForce3 Ti 500, and GeForce4 Ti 4600 all of which are reference cards.

BTW, nV News remains a hobby, although some of the advertising revenue is used to pay for server expenses. I've been employed in the IT profession for over 20 years and my wife still tells me that I need to grow up :)
 
Does anyone remember the nForce2 launch days before the Radeon 9700 launch? Seems nvidia's taking their time bringing this product to the market. I suspect we will see the same sort of thing with the NV30; nvidia has gotten worse it seems. On that note, I wonder how long it will actually be until you can actually buy an nForce2..... If it follows the timeline that the first nForce took, it won't be available until Christmas. Anyhow, the thread starter claims to have sources close enough to nvidia to make the claim that the NV30 has in fact just taped out..... So is the NV30 shipping before Christmas a possibility?
 
Bjorn isn't assuming or inferring anything! You're reading too much into what he's saying.

If anything, he's exactly highlighting how ATI can be doing something that nvidia essentially claimed wasn't really possible. (Though basically, they have already). It's just that if NV30 comes in with LESS transistors than R-300, that's all the more ironic.

Been away for 2 hours and missed all the fun.
But you're right on the spot Joe..

Although I don't know about the "Though basically, they have already" part, since that of course depends on how big a difference there will be between the NV30 and the R300. Either way, it'll be fun to see what happens.
 
On the subject of not having many complaints about problems with GeForce cards in Nvidia-related forums:

That may be the case, and as others have stated, there is no one central Nvidia forum like Rage3D for ATI troubles. But really, all you have to do is look in the forums for any particular game and you will see plenty of problems from GeForce users. It still kills me when people talk about how bad ATI's drivers are and how perfect Nvidia's drivers are. I play a lot of games and visit the forums for most of the games I play. I can tell you there is definitely no shortage of people with GeForces having problems with the games they run. Many, many people are having to play the driver switch game. Use Det xxx for this game. When done playing, switch to Det yyyy for this other game. People really need to take the blinders off.

Typical game specific board:

20 topics from different GeForce users complaining of various problems getting the game to run correctly. The replies are usually very helpful: "switch drivers to Det xxx", "turn off xxx setting"

2 topics from ATI users complaining of various problems getting the game to run correctly. The replies usually look something like this: "get a GeForce man. ATI's drivers suck!"

Go look for yourself and you will see I'm not making this crap up. (be sure to remove the fanboy blinders first)
 
jjayb said:
Go look for yourself and you will see I'm not making this crap up. (be sure to remove the <bleep> blinders first)
I'm quite sure most people on this board read other, game-related boards, too. Still, they have many different opinions on this. So I don't think you can take what you posted as "proof" of anything.
 
Doomtrooper said:
Radeon 64 Meg Vivo
AIW
Supports all bump mapping modes
DVD features galore

Geforce 2 GTS has only one mode Dot3


Radeon 8500
PS 1.4
Truform (really a building block for displacement mapping)
Higher Internal Precision
Supports all bump mapping modes again
DVD features galore

Geforce 3 and 4 support PS 1.1-1.3 and both bump mapping modes
Though I agree that R8500 is "more advanced" than GF3, I think you're quite selectively picking features that prove your point while leaving out others...
 
OK. I'll bite.

I'm curious: in what way is the GeForce3 "technically more advanced" than the 8500? (What selective features would you choose to show the GeForce3 is more advanced?)
 