Can DX9 cards run DX10 with a new driver?

Discussion in 'Architecture and Products' started by ^eMpTy^, Jan 14, 2007.

  1. ^eMpTy^

    Newcomer

    Joined:
    Aug 22, 2004
    Messages:
    116
    Likes Received:
    2
    Location:
    My Office
    Can DX9 cards run DX10 with a new driver?

    At first this seemed like a simple question: of course they can't. But then I remembered that the card doesn't actually have to have unified shaders, it just has to have the driver.

    So what is stopping DX9 cards from being able to do DX10 that can't be done on the driver level? I'm sure there are lots of things, but I can't really say with any certainty precisely what those things are.
     
    #1 ^eMpTy^, Jan 14, 2007
    Last edited by a moderator: Jan 14, 2007
  2. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,580
    Likes Received:
    622
    Location:
    New York
    Considering hardware unification is not a DX10 requirement I would say that's a moot point. It's all the other API features that prevent DX9 cards from running DX10 code - they just can't.
     
  3. willardjuice

    willardjuice super willyjuice
    Moderator Veteran Alpha

    Joined:
    May 14, 2005
    Messages:
    1,385
    Likes Received:
    299
    Location:
    NY
    Well among the many obvious things, no DX9 card has Geometry Shaders.
     
  4. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    15,831
    Likes Received:
    3,020
    Maybe what empty is trying to ask is
"can DX9 cards benefit from the performance improvements in DirectX 10?"
     
  5. ^eMpTy^

    Newcomer

    Joined:
    Aug 22, 2004
    Messages:
    116
    Likes Received:
    2
    Location:
    My Office
    Well I kinda botched the thread title...but my question was just what stops DX9 cards from doing DX10 stuff...

    I had admittedly forgotten all about geometry shaders.

    The question was prompted by someone asking me, and I laughed and started explaining how different DX10 is and how it would be impossible for DX9 cards to run DX10 titles... and then I realized I didn't understand why that was true; I was just echoing what I had read elsewhere...

    So geometry shaders is a good one...
     
  6. Anarchist4000

    Veteran Regular

    Joined:
    May 8, 2004
    Messages:
    1,439
    Likes Received:
    359
    GS I'd say would be the big one, but then again how many apps in the near future are actually going to take advantage of them? Most games that release will probably have both DX9 and DX10 paths, so it's doubtful they'd use very many DX10 features that they couldn't otherwise use in DX9. Only recently have any developers started doing anything really meaningful with shaders of any form. For the most part, shaders are just another way of doing fixed function to a lot of developers.

    I'd say some of the cards, such as an X1900, could probably do a good job emulating DX10 to a degree. They might even be able to pull off some GS-style operations.
     
  7. AlexV

    AlexV Heteroscedasticitate
    Moderator Veteran

    Joined:
    Mar 15, 2005
    Messages:
    2,528
    Likes Received:
    107
    SM4.0 has different (higher) requirements than SM3.0, for one. The second thing is, you don't need to be unified at the hardware level, but you should be able to run the same code across vertex ALUs, pixel ALUs, etc., which no DX9 part can do.

    There should be some improvements for DX9 cards under Vista, due to MS cleaning up some overhead in the driver model and adding improvements in the form of DX9.L (or Ex, as DxDiag names it under Vista), but nothing earth-shattering IMO.
     
  8. Demirug

    Veteran

    Joined:
    Dec 8, 2002
    Messages:
    1,326
    Likes Received:
    69
    The only thing that really makes it impossible to write D3D10 drivers for pre-G80 chips is the removal of the caps bits. The API itself and the DDI could be implemented up to a certain level on older cards too, but without caps bits there is no way to report that level to the application.
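    [Editorial aside: the caps-bits point above can be sketched as a toy C++ model. The struct and function names below are invented for illustration only; they are not the real D3D9/D3D10 interfaces.]

    ```cpp
    #include <cassert>
    #include <cstdio>

    // Toy model of the D3D9 caps mechanism (hypothetical names, not the
    // real API): the driver reports per-feature bits and the app branches.
    struct Caps9 {
        bool vertex_texture_fetch;
        bool instancing;
    };

    // A hypothetical pre-G80 part: supports some features, not others.
    Caps9 query_caps_dx9_card() { return Caps9{false, true}; }

    // D3D10 removed caps bits: a conforming driver must expose the whole
    // feature set, so there is no partial level it could report.
    bool driver_can_expose_d3d10(bool supports_full_feature_set) {
        return supports_full_feature_set;  // all or nothing
    }

    int main() {
        Caps9 caps = query_caps_dx9_card();
        // Under D3D9 the app can fall back per feature...
        if (!caps.vertex_texture_fetch)
            std::puts("D3D9 path: skip vertex texture fetch");
        // ...but under D3D10 a partially capable chip simply cannot
        // be given a driver at all.
        assert(!driver_can_expose_d3d10(false));
        return 0;
    }
    ```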
     
  9. TG01

    Newcomer

    Joined:
    Dec 18, 2006
    Messages:
    40
    Likes Received:
    1
    ^eMpTy^'s point is actually quite good.
    95%+ of computers running Vista next year will be using DX9 cards.
    So I'm curious... does Vista support running DX10 and DX9 drivers side by side, or does DX10 automatically support DX9 hardware?
    If so, there would be no reason why DX10 can't be made to run under XP other than: we don't want to!
     
  10. Demirug

    Veteran

    Joined:
    Dec 8, 2002
    Messages:
    1,326
    Likes Received:
    69
    A Vista driver can support multiple driver interfaces at the same time. At a minimum it has to support the D3D9 interface, because that is needed for Aero and every D3D runtime up to version 9. The D3D10 runtime will ask the driver for a version 10 driver interface; if the driver doesn't support it, it will not work.
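    [Editorial aside: the runtime/driver handshake described above can be sketched in the QueryInterface style, as a toy model with invented names, not the real WDDM DDI.]

    ```cpp
    #include <cassert>

    // Toy sketch of the handshake: the runtime asks the driver for a
    // specific DDI version and gets a yes/no answer.
    enum class DdiVersion { D3D9, D3D10 };

    struct Driver {
        bool has_d3d10_ddi;
        bool query_ddi(DdiVersion v) const {
            if (v == DdiVersion::D3D9) return true;  // always required (Aero, D3D9 runtimes)
            return has_d3d10_ddi;                    // only D3D10-class drivers say yes
        }
    };

    int main() {
        Driver pre_g80{false}, g80{true};
        assert(pre_g80.query_ddi(DdiVersion::D3D9));    // DX9 path still works
        assert(!pre_g80.query_ddi(DdiVersion::D3D10));  // D3D10 runtime gives up here
        assert(g80.query_ddi(DdiVersion::D3D10));
        return 0;
    }
    ```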
     
  11. Andrew Lauritzen

    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,526
    Likes Received:
    454
    Location:
    British Columbia, Canada
    The answer is vacuously "yes", since we can theoretically do any bits that the hardware doesn't do in software (e.g. the DX10 reference rasterizer, or NVEmulate). However, for many of these features the speed would be so unacceptable as to make it worthless. I suspect that cross-API support will be done exclusively at the application level for this reason (although someone might write wrappers to make things easier to port).
     
  12. santyhammer

    Newcomer

    Joined:
    Apr 22, 2006
    Messages:
    85
    Likes Received:
    2
    Location:
    Behind you
    Yesterday a gnome behind my bed told me this:

    "Listen carefully. The GPUs are really hollow. There is only hot air inside. Everything is done in the graphics driver, in software, using your 8-core CPU. That's why XXMARK scales so well with the microprocessor. That also explains why I can unlock the GF7600 vs/ps pipelines using this hacking program called XXXTools to convert it into a GF7950GTX."

    So, according to that gnome, YES, it is possible.

    On the other hand, according to Dave, one of the DX dev team programmers, in http://letskilldave.com/archive/200...after-me_3A00_-No.-No.-No_2E00_.aspx#comments

    So, according to Dave ("repeat after me: No. No. No."), it is not possible.

    According to me... I have no idea, but it will be fun if some hacker manages it :p

    Hopefully I could use OpenGL extensions to get at the DX10 features under Windows XP (or have DX10 parsed by OpenGL 2.1, as I think somebody suggested... Ironic... two years ago OGL was going to be emulated by DX10, and now it appears DX10 is going to be emulated by OGL... fun fun). Whatever happens, I am not going to waste a month's salary to buy 0.75 of a GF8800/R600... and yes, int(0.75)=0 :D
     
    #12 santyhammer, Jan 15, 2007
    Last edited by a moderator: Jan 15, 2007
  13. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,580
    Likes Received:
    622
    Location:
    New York
    Sure it's doable as an academic exercise but what would be the purpose of a D3D10 driver for hardware incapable of most of D3D10's distinguishing features?
     
  14. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,580
    Likes Received:
    622
    Location:
    New York
    I don't see why not. Isn't that exactly what Nvidia did for its G80 demos?
     
  15. Demirug

    Veteran

    Joined:
    Dec 8, 2002
    Messages:
    1,326
    Likes Received:
    69
    For the same reason there are D3D9 drivers for older hardware. If we needed only one API for multiple generations of cards, developers' lives would be easier.
     
  16. santyhammer

    Newcomer

    Joined:
    Apr 22, 2006
    Messages:
    85
    Likes Received:
    2
    Location:
    Behind you
    Yep, but consider that DX10 is much more than SM4.0. It has, for example, virtualized texture management, which I bet is impossible to do in Windows XP due to the driver model design.
     
  17. JHoxley

    Regular

    Joined:
    Oct 18, 2004
    Messages:
    391
    Likes Received:
    35
    Location:
    South Coast, England
    These threads amuse me :)

    Unlike previous versions of D3D, the v10 specification gives the IHVs very little room to manoeuvre - not just on the features that marketing teams love to shout about, but on the really low-level details such as FP precision, accuracy and so on. More specifically, you can't be an "almost D3D10" part - it's one or the other: hardware is or is not capable of running D3D10 applications, and there is no middle ground.

    A similar example was some of ATI's later SM2 parts (the X800s, for example) that had most of the functionality of Nvidia's SM3-capable parts. Taken at a high level they were comparable. The problem was, as I remember it, that the ATI hardware used 24-bit internal precision - which fell foul of Microsoft's specification for SM3, which mandated 32-bit precision throughout the pipeline. Consequently it wasn't until the R500 generation that ATI could officially brand their hardware as SM3 capable.
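    [Editorial aside: the precision point can be made concrete with a rough numeric model. FP24 as ATI used it carries a 16-bit mantissa versus FP32's 23 bits; truncating the low 7 mantissa bits of a float crudely simulates that loss. This is a simplification: real hardware rounding and exponent handling differ.]

    ```cpp
    #include <cassert>
    #include <cstdint>
    #include <cstring>

    // Crude model of an FP24 (s16e7) pipeline: keep only the top 16 of a
    // float's 23 mantissa bits by zeroing the low 7 bits.
    float truncate_to_fp24(float x) {
        std::uint32_t bits;
        std::memcpy(&bits, &x, sizeof bits);
        bits &= ~std::uint32_t{0x7F};  // drop the 7 low mantissa bits
        std::memcpy(&x, &bits, sizeof x);
        return x;
    }

    int main() {
        float full = 1.0f + 1.0f / (1 << 20);  // representable at fp32
        float fp24 = truncate_to_fp24(full);
        assert(full != 1.0f);   // the small increment survives at fp32...
        assert(fp24 == 1.0f);   // ...but is lost at fp24 precision
        return 0;
    }
    ```

    Errors like this accumulate across long shader programs, which is why a spec that mandates 32-bit precision cannot simply wave a 24-bit pipeline through.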

    I'm in the lucky position of having the specifications in front of me, where these things become a lot more obvious. There's only so much that software can change - and if the basic silicon behind earlier hardware doesn't cut it then, well, game over :smile:

    I suppose the key point is that you might be able to hack the high-level features of D3D10 onto D3D9 hardware and Windows XP, but it'd only be skin deep. I think it was at GDC '06 that the MS developer days were running the D3D10 refrast on D3D9 hardware with partial hardware acceleration (IIRC paths with the GS were using software VS and GS). From what I heard this meant that the SDK samples were interactive/real-time, but whether such an approach scales up I don't know...

    Thing is, without the low-level parts of the hardware spec you'd end up with two similar but slightly different flavours of D3D10 and, as I'm sure any developers here will appreciate, that makes compatibility testing an absolute nightmare. One of the aims of D3D10's fixed caps and strict requirements was to make software development simpler - less of a burden on us developers to write a million similar-but-different rendering paths, reducing the amount of testing, etc., etc.

    Anyway, I'll go back to lurking again...

    Cheers,
    Jack
     
    Acert93 likes this.
  18. Demirug

    Veteran

    Joined:
    Dec 8, 2002
    Messages:
    1,326
    Likes Received:
    69
    I agree with this, but even so the hard cut was not necessary. It would have been a no-brainer to bind everything that defines a D3D10 card today to shader model 4 instead. With that solution it would be possible to have a caps-less shader model 4 and still support older shader models under D3D10 at the same time.

    Well, I still have some hope that Microsoft will release a solution to the two-API problem as part of D3DX.
     
  19. Anarchist4000

    Veteran Regular

    Joined:
    May 8, 2004
    Messages:
    1,439
    Likes Received:
    359
    Actually, with CTM/CUDA becoming available it might be possible to emulate the basic functionality fairly effectively. Besides, there's nothing stopping a DX10 app from creating a DX9 interface and running through some caps checks.
     
  20. DavidC

    Regular

    Joined:
    Sep 26, 2006
    Messages:
    347
    Likes Received:
    24
    According to that paragraph, then, the current GMA X3000 is not DX10 compatible: http://www.intel.com/cd/ids/developer/asmo-na/eng/recent/334680.htm?page=3

    Why? Because the FP precision of its pixel shader is only 24-bit.
     
    Jawed likes this.

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.