NV30 to break NVIDIA's Unified Driver Architecture?

Discussion in 'Architecture and Products' started by alexsok, Jul 21, 2002.

  1. alexsok

    Regular

    Joined:
    Jul 12, 2002
    Messages:
    807
    Likes Received:
    2
    Location:
    Toronto, Canada
    There has been some speculation on some forums that NV30, since it's based on a completely new architecture, will break the Unified Driver Architecture.

    I don't know whether that could actually happen, but I wouldn't want to see it happen!

    Any suggestions as to why the architecture might have to be broken, or whether breaking it is really necessary?

    Shortly before the R300 announcement, ATI released their CATALYST drivers, which were supposed to become ATI's unified driver architecture, but ATI said the R300 drivers are completely new, totally rewritten, and have no connection at all to the CATALYST ones!

    Since ATI's unified support was broken by the new R300 core, do you think the same thing could happen to NVIDIA for the same reason?

    Any suggestions are welcome! :D
     
  2. Mephisto

    Newcomer

    Joined:
    Feb 7, 2002
    Messages:
    200
    Likes Received:
    0
    "Unified driver" just means that you can download one driver package and all products of the company are supported. It does NOT mean that the driver use the same driver path for every product, so ATI and NVIDIA can develop new NV30/R300 drivers from scratch and distribute them in the same package without a problem. Did you realy think a NV5 uses the same codepath than a NV25?
     
  3. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Actually, it does mean that every piece of hardware uses the same driver path, for at least some operations.

    For example, I have a GeForce4. It should be possible for me to install the original Detonator drivers, and I should still be able to play games in 3D.

    The reason this is possible is that every nVidia GPU apparently has a section on its die dedicated to the unified driver, which acts sort of like an instruction interpreter and allows nVidia to keep its drivers very similar across different video cards (most of the code is shared).

    However, there are certainly architecture-specific optimizations, so that each architecture doesn't follow the same exact code path for each instruction, but most of it is shared.

    After all, how else could nVidia support every video card from the TNT to the GeForce4 on a single driver that is smaller than ATI's driver for the Radeon 8500?
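    For contrast with the sketch earlier in the thread, here is roughly what "mostly shared code with small per-architecture overrides" could look like; again, the class and method names are made up for illustration and are not real driver code:

    #include <cstdio>
    #include <memory>

    class GpuDriver {
    public:
        virtual ~GpuDriver() = default;

        // The bulk of the work is generation-independent.
        void draw(int triangle_count) {
            validate_state();                  // shared
            upload_triangles(triangle_count);  // per-chip hook
            kick_off();                        // shared
        }

    protected:
        void validate_state() { std::puts("shared: validate API state"); }
        void kick_off()       { std::puts("shared: submit to hardware"); }

        // Only this step differs between architectures.
        virtual void upload_triangles(int n) {
            std::printf("generic path: %d triangles\n", n);
        }
    };

    class GeForce4Path : public GpuDriver {
    protected:
        void upload_triangles(int n) override {
            std::printf("NV25-tuned path: %d triangles\n", n);
        }
    };

    int main() {
        std::unique_ptr<GpuDriver> drv = std::make_unique<GeForce4Path>();
        drv->draw(1000);  // shared front end, chip-specific upload step
    }

    Either structure fits in one downloadable package; the difference is how much of the middle is shared, which is what this thread is really arguing about.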

    Regardless, I don't know if they plan to drop the UDA or not. If they do, it wouldn't be that nVidia would stop UDA altogether, but that they decided it was time to break backwards-compatibility and come out with a new UDA for use in future products.

    Personally, I doubt that nVidia will change their UDA just yet.
     
  4. Mephisto

    Newcomer

    Joined:
    Feb 7, 2002
    Messages:
    200
    Likes Received:
    0
    In fact, this does not work.
     
  5. alexsok

    Regular

    Joined:
    Jul 12, 2002
    Messages:
    807
    Likes Received:
    2
    Location:
    Toronto, Canada
    I think the reason to change would be the switch to the architecture's fully floating-point pipeline. Besides, the architecture is completely new!

    Such a drastic change has never occurred before!
     
  6. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    There's no reason to deal with floating-point data any differently on the driver level.
     
  7. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    I would hope that they would.

    UDA has pros and cons. It's not all pros, and most of the pros come from economics, not from being a better driver. One big con is that you have to carry around a lot of baggage when you have to support multiple different architectures with one driver base. Ultimately, that tends to mean drivers that are either

    1) "jack of all trades" but master of none
    2) Heavily tuned toward one architecture...at the EXPENSE of others.

    Either way...Not Good.

    At some point, you need to make a fresh start so that you can fully optimize for one architecture, without regard to how it impacts others.

    If NV30 is as "revolutionary" a break in technology as nVidia is barking, trying to maintain "unified" drivers would be detrimental.

    For the sake of delivery, I could foresee nVidia releasing early drivers that are based on the previous Detonators, and then, once the "new" drivers are mature enough, unleashing those.

    I hope you're not comparing Radeon 8500 drivers, which have multiple-language support and new control panels, to a few binary file updates in some Detonator releases... :roll:
     
  8. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
    Ask Geforce 2 owners what they think of UDA :wink:
     
  9. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    I never owned a GeForce2, but I had a GeForce DDR up until a few months ago. I loved UDA. It allowed software implementation of certain features (such as vertex shaders), and I got to use nView. I'm not sure either would have been available without UDA.
     
  10. RussSchultz

    RussSchultz Professional Malcontent
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,855
    Likes Received:
    55
    Location:
    HTTP 404
    Uuuhhh. It works well?

    I"m not sure what you're getting at.
     
  11. McElvis

    Regular

    Joined:
    Apr 15, 2002
    Messages:
    269
    Likes Received:
    3
    Location:
    London
    Yeah, I have a GeForce 2, and I don't dare upgrade to the latest drivers.

    Every time I update my drivers, all my older games run slower.

    Every driver after ~14.00 makes some older games, like Diablo 2, unplayable.
     
  12. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
    Newer Detonator drivers ran slower on my old GTS...I always had to use older drivers. I'm talking from personal experience here, and if you look at the nV News forums you'll find almost everyone there recommending against newer Detonators on a GeForce 2.

    UDA has its positives and negatives, and this is one of them.
     
  13. RussSchultz

    RussSchultz Professional Malcontent
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,855
    Likes Received:
    55
    Location:
    HTTP 404
    Huh. Guess I never noticed. But then again, the GF2MX I have isn't really used for gaming.
     
  14. Crusher

    Crusher Aptitudinal Constituent
    Regular

    Joined:
    Mar 16, 2002
    Messages:
    869
    Likes Received:
    19
    I'm using the latest drivers on my GTS and they seem to be working fine.
     
  15. Ante P

    Veteran

    Joined:
    Mar 24, 2002
    Messages:
    1,448
    Likes Received:
    0
    Just curious: in what game/application can you actually benefit from software vertex shaders?
    I mean, any game that requires them today also supports old-style TnL, and benchmarks like 3DMark2001 will just do it in software whether you have a Voodoo5 or a GeForce4 MX.

    In any case, I have the same experience as most of the people I've talked to: newer Detonators perform worse than the older ones on TNT 1 & 2 and GeForce 1 & 2.
    It seems like the newer versions are optimized for the GF3 and GF4, which isn't really surprising, but it's a UDA con nonetheless.

    Personally, I don't see the point except for the "average Joe", the kind of guy who probably won't update his drivers anyway. ;)

    I'd rather just have a solid driver that's optimized for my specific hardware. Does the driver work for cards other than mine? Who cares?
     
  16. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    It's more of a programming curiosity for me. I don't program much, but I like to play around with things like that every once in a while. As a more theoretical argument, software vertex shaders should mean game programmers don't have to write different codepaths for vertex processing on different hardware.
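    A minimal Direct3D 8 sketch of that single code path (window setup and error handling omitted; the caps check is just one way an application might choose, not taken from any particular game):

    #include <windows.h>
    #include <d3d8.h>

    // Create a device that uses hardware vertex shaders when the GPU has them,
    // and lets the D3D runtime run them on the CPU otherwise. The application's
    // rendering code is identical in both cases.
    IDirect3DDevice8* CreateDeviceWithVS(IDirect3D8* d3d, HWND hWnd,
                                         D3DPRESENT_PARAMETERS* pp) {
        D3DCAPS8 caps;
        d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

        DWORD vp = (caps.VertexShaderVersion >= D3DVS_VERSION(1, 1))
                       ? D3DCREATE_HARDWARE_VERTEXPROCESSING   // e.g. GF3/GF4 Ti
                       : D3DCREATE_SOFTWARE_VERTEXPROCESSING;  // e.g. GF256/GF2

        IDirect3DDevice8* device = NULL;
        if (FAILED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                                     vp, pp, &device)))
            return NULL;
        return device;
    }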

    I've never had noticeably lower performance. Yes, maybe you'll lose a few 3DMark2k1 points, or a few fps in games, but I've never been able to tell the difference. Granted, there are some games that just plain didn't want to work on the newer drivers, but those were few and far between.

    Update: Well, I'll have to revise that. There was one case where my GeForce DDR showed noticeably lower performance, and that was with some rather recent drivers that mangled texture management and thus caused significant slowdowns in UT.

    The problem with that is that nVidia has a limited amount of manpower. Yes, given infinite manpower, a separate driver for each piece of hardware would be optimal. After all, even between, say, a GeForce3 Ti 200 and a Ti 500 there are probably some optimizations that work better on one than the other.

    But, a UDA is far easier to develop, and it keeps nVidia from having to worry much about providing legacy support for their older video cards (i.e. if nVidia keeps the UDA up, your old TNT gets DX9 drivers "automatically").

    Personally, I'm willing to sacrifice small bits of performance for compatibility with future games and other programs.
     
  17. Xmas

    Xmas Porous
    Veteran Subscriber

    Joined:
    Feb 6, 2002
    Messages:
    3,334
    Likes Received:
    162
    Location:
    On the path to wisdom
    Software VS processing is done by DirectX.
     
  18. Bambers

    Regular

    Joined:
    Mar 25, 2002
    Messages:
    781
    Likes Received:
    0
    Location:
    Bristol, UK
    Anything beyond around 6.50 slowed my GF256 SDR down in most games. The only increase was that the Det XPs enabled hardware point-sprite support, which bumped that test up in 3DMark. The same drivers killed Half-Life's D3D performance from a near-constant 100 fps to 20-30 fps (OpenGL crashed on all drivers).

    Vertex shader performance didn't change with the Det XPs.
     
  19. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    But not by OpenGL (At least, not yet).
     
  20. Nappe1

    Nappe1 lp0 On Fire!
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,532
    Likes Received:
    11
    Location:
    South east finland
    Yep, same here...

    For my Prophet DDR+DVI, 18.xx is the turning point. After that, things only get worse.
     