NV30 to break NVIDIA's Unified Driver Architecture?

alexsok

There has been speculation on some forums that NV30, since it's based on a completely new architecture, will break the Unified Driver Architecture.

I don't know whether that could actually happen, but I wouldn't want to see it happen!

Any suggestions as to why the architecture might be broken, or whether it would have to be?

Shortly before the R300 announcement, ATI released their CATALYST drivers, which were supposed to become ATI's unified driver architecture. But ATI said the R300 drivers are completely new, totally rewritten, with no connection at all to the CATALYST ones!

Since that continuity was broken by the new R300 core, do you think the same could happen to NVIDIA?

Any suggestions are welcome! :D
 
"Unified driver" just means that you can download one driver package and all products of the company are supported. It does NOT mean that the driver use the same driver path for every product, so ATI and NVIDIA can develop new NV30/R300 drivers from scratch and distribute them in the same package without a problem. Did you realy think a NV5 uses the same codepath than a NV25?
 
Actually, it does mean that every piece of hardware uses the same driver path, for at least some operations.

For example, I have a GeForce4. It should be possible for me to install the original Detonator drivers, and I should still be able to play games in 3D.

The reason this is possible is that every nVidia GPU apparently has a section on its die dedicated to the unified driver, which acts sort of like an instruction interpreter and lets nVidia keep its drivers very similar across different video cards (most of the code is shared).

However, there are certainly architecture-specific optimizations, so each architecture doesn't follow the exact same code path for every instruction, but most of it is shared.
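A minimal sketch of how that kind of sharing could work (my own illustration of the general technique, not NVIDIA's actual code): the shared path calls through a per-chip table of hooks, and only the code behind those hooks differs between generations.

```cpp
#include <cstdint>
#include <cstdio>

// Per-chip hooks: only these differ between, say, a TNT and a GeForce4.
struct ChipOps {
    const char* name;
    void (*emit_draw)(std::uint32_t vertex_count);  // chip-specific command format
};

// Stubs standing in for real, hardware-specific command writes.
static void tnt_emit_draw(std::uint32_t n)      { std::printf("TNT draw: %u verts\n", n); }
static void geforce4_emit_draw(std::uint32_t n) { std::printf("GF4 draw: %u verts\n", n); }

static const ChipOps tnt_ops      = { "TNT",      tnt_emit_draw };
static const ChipOps geforce4_ops = { "GeForce4", geforce4_emit_draw };

// Shared driver path: identical for every chip, so a fix or a new API
// feature added here reaches the whole product line at once.
static void draw_scene(const ChipOps& ops) {
    ops.emit_draw(36);
}

int main() {
    draw_scene(tnt_ops);       // same shared code...
    draw_scene(geforce4_ops);  // ...different hardware hook
}
```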

After all, how else could nVidia support every video card from the TNT to the GeForce4 on a single driver that is smaller than ATI's driver for the Radeon 8500?

Regardless, I don't know if they plan to drop the UDA or not. If they do, it wouldn't mean nVidia was stopping UDA altogether, but that they had decided it was time to break backwards compatibility and come out with a new UDA for future products.

Personally, I doubt that nVidia will change their UDA just yet.
 
Chalnoth said:
For example, I have a GeForce4. It should be possible for me to install the original Detonator drivers, and I should still be able to play games in 3D.

In fact, this does not work.
 
Mephisto said:
"Unified driver" just means that you can download one driver package and all products of the company are supported. It does NOT mean that the driver use the same driver path for every product, so ATI and NVIDIA can develop new NV30/R300 drivers from scratch and distribute them in the same package without a problem. Did you realy think a NV5 uses the same codepath than a NV25?

I think the reason to change would be the switch to a fully floating-point pipeline. Besides, the architecture is completely new!

A change that drastic has never occurred before!
 
alexsok said:
I think the reason to change would be the switch to a fully floating-point pipeline. Besides, the architecture is completely new!

A change that drastic has never occurred before!

There's no reason to handle floating-point data any differently at the driver level.
 
Chalnoth said:
Personally, I doubt that nVidia will change their UDA just yet.

I would hope that they would.

UDA has pros and cons. It's not all pros. And most of the pros come from economics, not from making a better driver. One big con is that you have to carry around a lot of baggage when you support multiple different architectures with one driver base. Ultimately, that tends to mean drivers that are either

1) "jack of all trades" but master of none
2) Heavily tuned toward one architecture...at the EXPENSE of others.

Either way...Not Good.

At some point, you need to make a fresh start so that you can fully optimize for one architecture, without regard to how it impacts others.

If NV30 is as "revolutionary" a break in technology as nVidia is barking about, trying to maintain "unified" drivers would be detrimental.

For the sake of delivery, I could foresee nVidia releasing early drivers based on previous Detonators, and then unleashing the "new" drivers once they're mature enough.

Chalnoth said:
After all, how else could nVidia support every video card from the TNT to the GeForce4 on a single driver that is smaller than ATI's driver for the Radeon 8500?

I hope you're not comparing Radeon 8500 drivers that include multiple-language support and new control panels to a few binary file updates in some Detonator releases... :rolleyes:
 
Doomtrooper said:
Ask GeForce 2 owners what they think of UDA ;)

I never owned a GeForce2, but I had a GeForce DDR up until a few months ago. I loved UDA. It allowed software implementation of certain features (such as vertex shaders), and I got to use nView. I couldn't be certain either would have been supported without UDA.
 
Yeah, I have a GeForce 2, and I don't dare upgrade to the latest drivers.

Every time I update my drivers, all my older games run slower.

Every driver after ~14.00 makes some older games, like Diablo 2, unplayable.
 
Chalnoth said:
Doomtrooper said:
Ask GeForce 2 owners what they think of UDA ;)

I never owned a GeForce2, but I had a GeForce DDR up until a few months ago. I loved UDA. It allowed software implementation of certain features (such as vertex shaders), and I got to use nView. I couldn't be certain either would have been supported without UDA.

Newer Detonator drivers ran slower on my old GTS; I always had to use older drivers. I'm talking from personal experience here, and looking at the nVnews forums you'll find almost no one there recommending newer Detonators on a GeForce 2.

UDA has its positives and negatives, and this is one of them.
 
Chalnoth said:
Doomtrooper said:
Ask GeForce 2 owners what they think of UDA ;)

I never owned a GeForce2, but I had a GeForce DDR up until a few months ago. I loved UDA. It allowed software implementation of certain features (such as vertex shaders), and I got to use nView. I couldn't be certain either would have been supported without UDA.

Just curious: In what game/application can you actually benefit from software vertex shaders?
I mean, any game that requires them today also has support for old-style TnL, and benchmarks like 3DMark 2001 will just do it in software whether you have a Voodoo 5 or a GeForce4 MX.

In any case, I have the same experience as most of the people I've talked to: newer Detonators perform worse than the old ones on the TNT 1 & 2 and GeForce 1 & 2.
It seems like the newer versions are optimized for the GF3 and GF4. Which isn't really surprising, but it's a UDA con nonetheless.

Personally, I don't see the point except for the "average Joe", the kind of guy who probably won't update his drivers anyway. ;)

I'd rather just have a solid driver that's optimized for my specific hardware. Does the driver work for cards other than mine? Who cares?
 
Ante P said:
Just curious: In what game/application can you actually benefit from software vertex shaders?

It's more of a programming curiosity for me. I don't program much, but I like to play around with things like that every once in a while. As a more theoretical argument, software vertex shaders should let game programmers avoid writing different vertex-processing codepaths for different hardware.
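Here's a hedged sketch of what I mean, using the real Direct3D 8 API (the helper name is mine, and the window handle and present parameters are assumed to be set up elsewhere): the game checks the reported vertex shader version and, if the hardware can't run VS 1.1, asks for software vertex processing instead of shipping a second codepath.

```cpp
#include <d3d8.h>

// Hypothetical helper: pick hardware vertex shading when the chip supports
// VS 1.1, otherwise fall back to software vertex processing. The rendering
// code above this layer stays identical either way.
IDirect3DDevice8* CreateDeviceWithVsFallback(IDirect3D8* d3d, HWND hwnd,
                                             D3DPRESENT_PARAMETERS* pp) {
    D3DCAPS8 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return NULL;

    DWORD behavior = (caps.VertexShaderVersion >= D3DVS_VERSION(1, 1))
                         ? D3DCREATE_HARDWARE_VERTEXPROCESSING   // GF3/GF4 class
                         : D3DCREATE_SOFTWARE_VERTEXPROCESSING;  // older chips

    IDirect3DDevice8* device = NULL;
    if (FAILED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                                 behavior, pp, &device)))
        return NULL;
    return device;  // same shader setup and draw calls run on either path
}
```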

Ante P said:
In any case, I have the same experience as most of the people I've talked to: newer Detonators perform worse than the old ones on the TNT 1 & 2 and GeForce 1 & 2.
It seems like the newer versions are optimized for the GF3 and GF4. Which isn't really surprising, but it's a UDA con nonetheless.

I've never had noticeably lower performance. Yes, maybe you'll lose a few 3DMark2k1 points, or a few fps in games, but I've never been able to tell the difference. Granted, there are some games that just plain didn't want to work on the newer drivers, but those were few and far between.

Update: Well, I'll have to revise that. There was one case where my GeForce DDR showed noticeably lower performance, and that was with some rather recent drivers that mangled texture management and thus caused significant slowdowns in UT.

Ante P said:
I'd rather just have a solid driver that's optimized for my specific hardware. Does the driver work for cards other than mine? Who cares?

The problem with that is that nVidia has a limited amount of manpower. Yes, given infinite manpower, a separate driver for each piece of hardware would be optimal. After all, even between, say, a GeForce3 Ti 200 and a Ti 500 there are probably some optimizations that would work better on one than the other.

But a UDA is far easier to develop, and it keeps nVidia from having to worry much about providing legacy support for their older video cards (i.e. if nVidia keeps the UDA up, your old TNT gets DX9 drivers "automatically").

Personally, I'm willing to sacrifice small bits of performance for compatibility with future games and other programs.
 
Anything beyond around 6.50 slowed my GF256 SDR down in most games. The only increase came from the Det XPs enabling hardware point-sprite support, which boosted that test in 3DMark. The same drivers killed Half-Life's D3D performance from a near-constant 100 fps to 20-30 fps (OpenGL crashed with all drivers).

Vertex shader performance didn't change with the Det XPs.
 
McElvis said:
Yeah, I have a GeForce 2, and I don't dare upgrade to the latest drivers.

Every time I update my drivers, all my older games run slower.

Every driver after ~14.00 makes some older games, like Diablo 2, unplayable.

Yep, same here...

For my Prophet DDR+DVI, 18.xx is the turning point. After that, things only get worse.
 