NVIDIA Maxwell Speculation Thread

Discussion in 'Architecture and Products' started by Arun, Feb 9, 2011.

  1. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    VLIW is still common since it's in APUs. Hell, dual-core Kaveri is still missing, so it makes more sense to buy a 50-euro APU than a 150-euro one; the A10-6790K is another option.

    So VLIW is still year-2014 hardware, even if it may be antiquated. The Radeon 6000 series looks more like the Radeon 2900 XT than like GCN, whereas Fermi looks more like Kepler than like G80.

    Fermi was when NVIDIA got serious about Tesla and compute; GCN is AMD's similarly modern architecture.
     
  2. UniversalTruth

    Veteran

    Joined:
    Sep 5, 2010
    Messages:
    1,747
    Likes Received:
    22
    That is not full hardware support, then.
    If so, which hardware will be fully compliant with DX12, as opposed to DX11 hardware merely running under the DX12 runtime in Windows?
     
  3. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,184
    Likes Received:
    1,841
    Location:
    Finland
    Either GCN, GCN 1.1, or something that isn't out yet.
    We don't know whether there's going to be a Feature Level 12_0; if not, at least GCN 1.1, and possibly all GCN, would support everything the API has.
    If there is, there's still a remote chance of GCN and/or GCN 1.1 supporting it, because the Xbox One has GCN 1.1.
     
  4. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,122
    Likes Received:
    2,873
    Location:
    Well within 3d
    Various sites have relayed that Raja Koduri announced at the Microsoft event that DX12 would be supported by all GCN hardware.
    Given the deep parallels between the two APIs, it seems consistent.
     
  5. DSC

    DSC
    Banned

    Joined:
    Jul 12, 2003
    Messages:
    689
    Likes Received:
    3
    NVIDIA hardware supports everything in DX11.1 and DX11.2, including Tiled Resources, except for the Direct2D non-gaming features.

    You're getting full DX12 support; it's hilarious that AMD can't even say at GDC whether the HD 5000/HD 6000 series will get DX12 support or not.
     
  6. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,184
    Likes Received:
    1,841
    Location:
    Finland
    UAVs in all shader stages, to my understanding, isn't a Direct2D feature; it's tied to D3D Feature Level 11_1. Even though NV hardware is capable of it, they can't use it in DirectX because they're limited to Feature Level 11_0.
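    The feature-level gating described above can be sketched numerically. This is a hedged illustration, not real Direct3D code: the helper names are made up, but the encoding matches how the SDK's D3D_FEATURE_LEVEL enum numbers its levels (11_0 = 0xB000, 11_1 = 0xB100), so a device capped at 11_0 compares below any capability gated on 11_1.

```python
def feature_level(major, minor):
    """Encode a level the way the D3D_FEATURE_LEVEL enum does."""
    return (major << 12) | (minor << 8)

FL_11_0 = feature_level(11, 0)   # 0xB000
FL_11_1 = feature_level(11, 1)   # 0xB100

def supports_uav_all_stages(device_level):
    # UAV access from every shader stage is gated on Feature Level 11_1,
    # so an 11_0-limited device fails this check regardless of what the
    # silicon could do outside DirectX.
    return device_level >= FL_11_1

print(hex(FL_11_0), supports_uav_all_stages(FL_11_0))  # 0xb000 False
print(hex(FL_11_1), supports_uav_all_stages(FL_11_1))  # 0xb100 True
```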
     
  7. zorg

    Newcomer

    Joined:
    Aug 1, 2012
    Messages:
    32
    Likes Received:
    0
    Location:
    Sweden
    This feature can be exposed through NVAPI.

    The bigger problem is the number of UAVs in the pipeline. NV only supports 8, while AMD GCN and Intel Haswell support 64. Technically, AMD GCN can support an unlimited number of UAVs.
     
  8. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,184
    Likes Received:
    1,841
    Location:
    Finland
    Yes, but then you're not using just DirectX anymore, and I doubt they'd touch the old feature levels in DX12.
     
  9. ams

    ams
    Regular

    Joined:
    Jul 14, 2012
    Messages:
    914
    Likes Received:
    0
    According to The Tech Report, AMD GPUs prior to the HD 7xxx series (i.e. non-GCN GPUs) will NOT support DX12.
     
  10. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,184
    Likes Received:
    1,841
    Location:
    Finland
    There has been some speculation that they meant GCN supports DirectX 12 "as a whole", which doesn't necessarily exclude the VLIW DX11 cards from supporting the API. The bigger question is where you draw the line today between what "supports the API" and what doesn't: NVIDIA claims DX11.2 (and 12) support for 11_0 hardware, so don't the VLIW DX11 cards support DX11.2 at 11_0, too?
     
  11. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    From my reading at TechReport, isn't DX12 simply a sideways move that only tackles performance/driver overhead, and nothing else? (Not that that's a bad thing.)

    If so, it's just DX11 with a lighter coat, and it should work for all GPUs that have previously supported DX11, including pre-GCN. IMO AMD simply chose not to support pre-GCN because of resource constraints, not for technical reasons. Which is a fair decision: nobody ever promised that three-year-old GPUs would be upgradable to the next DX level.
     
  12. ams

    ams
    Regular

    Joined:
    Jul 14, 2012
    Messages:
    914
    Likes Received:
    0
    It looks like you are trying to (intentionally?) obfuscate the issue here by talking about DX11 feature-level differences, which are largely irrelevant to a discussion of DX12 support. DX12 is obviously a more forward-looking API than DX11, even if much existing hardware can take advantage of it. Pre-GCN DX11 GPUs from AMD do NOT support Mantle, so it is not too surprising that pre-GCN DX11 GPUs from AMD will not support DX12 (and if that were not the case, AMD would surely have clarified by now). Like silent_guy said, this may be due to resource constraints rather than technical constraints on AMD's part.
     
    #1432 ams, Mar 20, 2014
    Last edited by a moderator: Mar 20, 2014
  13. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    How did you come to that conclusion? I haven't seen any pointers to new HW functionality. (All based on just one slide, unfortunately.) More CPU concurrency, a lightweight driver layer, etc. It all points to pure software to me.
     
  14. ams

    ams
    Regular

    Joined:
    Jul 14, 2012
    Messages:
    914
    Likes Received:
    0
    More forward-looking in the sense that the lower overhead will make the API more suitable for not just desktop graphics but also for notebook, tablet, and smartphone graphics. But you are right, in technical terms, the hardware requirements may not be any more strict than DX11.

    On a side note, even though NVIDIA's 8xxM mobile GPUs contain a mix of Fermi [820M], Kepler, and Maxwell chips, they will all support DX12!
     
  15. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,122
    Likes Received:
    2,873
    Location:
    Well within 3d
    It's not clear how the VLIW GPUs would be able to practically handle the sort of resource binding and indirection that Mantle or DX12 seem to require, but DX11 does not.
    The lack of a flexible memory architecture pre-GCN may also preclude internal emulation of some things like the promised programmable blending functionality.
    While potentially not ideal, a compute shader and the common read/write path can run through things even if the ROPs can't hack it.
     
  16. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    But was Fermi able to do all that kind of stuff? I believe bindless textures etc. were first introduced only on Kepler?
     
  17. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,122
    Likes Received:
    2,873
    Location:
    Well within 3d
    I don't think Fermi can directly handle bindless extensions. Its support for more general memory addressing and its memory subsystem might allow some hackish way of following pointers to similarly laid out regions of memory within a compute kernel's execution.

    I suspect having memory traffic, then a little math, then more memory traffic on a VLIW GPU is going to hit clause-switch penalties, since the operations are segregated. The general read-only memory pipeline would reduce the chances of emulating things with general compute code.
    AMD might also have given up trying to refactor the VLIW compiler and toolchain, finding the architecture too inflexible to support with enough performance to make it worthwhile.
     
  18. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,457
    Likes Received:
    580
    Location:
    WI, USA
    The 5000 series is also past the usual age at which ATI switches to legacy drivers. Though really, we have no way to know how much development still goes into the 5000-6000 series; they don't say much in their release notes. NV still mentions boosts for the 400/500 series.

    I have noticed a few recent bug fixes in games for my 6950, though. Earlier this year they fixed a repeatable black-screen crash in Metro Last Light, for example. However, considering the game is not exactly newly released, the lag time on that fix is not a good sign.

    There are still a number of rendering bugs with Crysis 3 on the 69xx. I finally beat it in January. Black squares with the fog-shadow effect (r_fogshadows must be manually disabled). Extreme model corruption when tessellation is set to very high. I don't know if Crytek is at fault, but these things work on other cards.
     
  19. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    Damn, I find this a bit unnerving. Do the games work without glitches if you simply set medium or low detail? I made a friend get an AMD A6, and I seriously wonder if an Athlon 370K + GeForce GT 610 would have been safer (yes, that CPU is not exactly great, but it works, and that's already something; ditto the GPU). Sometimes you just want stuff to run, with no bugs, no swapping, and at least 20 fps rather than 2 fps.

    AMD still has to support these things; at worst, some lag on game releases is not terrible, as the games are full price and version 1.0.0 anyway. They do sell it, after all.
     
  20. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,457
    Likes Received:
    580
    Location:
    WI, USA
    Crysis 3 is the only game I can think of that has always had bugs on my 6950. The tessellation issue isn't very important, because Very High object detail is too slow on anything less than a 6950. The black-squares issue is frequent but not continuous, and it can be manually fixed with r_fogshadows=0. I'm not sure if one of the detail presets disables that.

    My 6950 runs Thief quite well, so there's that. AMD still seems to be mostly on top of things with these cards.
     