AMD Mantle API [updating]

Discussion in 'Rendering Technology and APIs' started by MarkoIt, Sep 26, 2013.

  1. mc6809e

    Newcomer

    Joined:
    Jan 24, 2007
    Messages:
    46
    Likes Received:
    5
  2. Gipsel

    Veteran

    Joined:
    Jan 4, 2010
    Messages:
    1,620
    Likes Received:
    264
    Location:
    Hamburg, Germany
    If it is an additional source of fluctuations in the time differences between the snapshots of the game world and the finishing of the rendering/display on the monitor, it likely does.
     
    #1322 Gipsel, Feb 10, 2014
    Last edited by a moderator: Feb 10, 2014
  3. Andrew Lauritzen

    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,526
    Likes Received:
    454
    Location:
    British Columbia, Canada
    Yeah, on AMD the frame consistency is definitely improved with Mantle, but the interesting result from the TR review is that NVIDIA's DX driver is much better than AMD's in this regard (at least in BF4), and competitive with the Mantle result in smoothness.

    Don't want to overstate it though, as the variations are fairly small. It's the spikes that are more concerning, and Mantle doesn't yet seem to do away with those (they may be purely in the game/engine, it's impossible to know from this data).
     
  4. mc6809e

    Newcomer

    Joined:
    Jan 24, 2007
    Messages:
    46
    Likes Received:
    5
    I wonder if it's the Windows scheduler that's the problem.

    Since Mantle tries to use every available core/thread for rendering, any new process that needs CPU time is going to steal a bit of it from a thread possibly involved in rendering. With DX, there's usually a spare core or thread available for these interruptions. What happens when Windows has to rejuggle all those live threads? The Mantle threads/GPU seem to pause and do nothing for a moment. Perhaps Windows is taking its time to transfer control back to the rendering threads.

    The spikes seem very periodic, as if some background process is waking up every second or so.
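    To make the "every core busy" point concrete, here is a minimal sketch (not Mantle code; recordCommands() is a hypothetical stand-in for per-thread command buffer recording). One worker is spawned per hardware thread, and the frame can only finish when the slowest worker does, so if Windows deschedules any single worker to run a background task, the whole frame stalls with it:

    [code]
    #include <chrono>
    #include <cstdio>
    #include <thread>
    #include <vector>

    // Hypothetical stand-in for per-thread command buffer recording:
    // simulate a few milliseconds of CPU work building draw calls.
    void recordCommands()
    {
        auto end = std::chrono::steady_clock::now() + std::chrono::milliseconds(4);
        while (std::chrono::steady_clock::now() < end) { /* spin */ }
    }

    int main()
    {
        const unsigned workers = std::thread::hardware_concurrency();
        for (int frame = 0; frame < 10; ++frame) {
            auto start = std::chrono::steady_clock::now();
            std::vector<std::thread> pool;
            for (unsigned w = 0; w < workers; ++w)
                pool.emplace_back(recordCommands);
            for (auto& t : pool)
                t.join(); // frame time = the slowest worker's time
            auto ms = std::chrono::duration<double, std::milli>(
                          std::chrono::steady_clock::now() - start).count();
            std::printf("frame %d: %.2f ms\n", frame, ms); // spikes show up here
        }
    }
    [/code]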
     
  5. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    And it doesn't show up on every CPU, right?
     
  6. Skinner

    Regular

    Joined:
    Sep 13, 2003
    Messages:
    871
    Likes Received:
    9
    Location:
    Zwijndrecht/Rotterdam, Netherlands and Phobos
    My experience (290 CF) with the 14.1 drivers...

    BSOD after BSOD²....BSOD*10²²
    And I'm not even talking about Mantle, but every DX10 and DX11 game.
    Also, DX9 disables CF for now.

    It's nice that they're developing this CPU-friendly route, but it seems to come at the cost of resources for DX.
    The experience I have with 290 CF is mediocre at the very best.
    In November, Catalyst 13.11 beta V9.4 started to introduce a bug in conjunction with v-sync where performance plummets and the audio gets distorted.
     
  7. Andrew Lauritzen

    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,526
    Likes Received:
    454
    Location:
    British Columbia, Canada
    Always possible, but you're really just talking about the difference between being truly "CPU bound" on every thread vs. bound on a single thread with idle cores. It's not clear to me that even Mantle gets to the former case on quad-core CPUs, but maybe someone has tested that? In any case, it would still be surprising to see spikes of the magnitude shown here just due to some thread getting descheduled briefly, and it wouldn't be a situation that only affects Mantle.

    Hmm, sounds like the driver might have made non-trivial changes to the kernel mode driver (the UMD can't blue-screen the system). Brings us back to the point of wondering what exactly Mantle/the new driver is doing to interact with WDDM... my guess is some of the changes would have to be related to the GPU page table stuff in Mantle, but it's hard to know without better information on exactly how Mantle handles things like WDDM deciding to move pages/VAs around, etc.
     
  8. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    10,873
    Likes Received:
    767
    Location:
    London
    At those particular graphics settings.

    I'm still waiting for evidence across a range of graphics settings.
     
  9. DieH@rd

    Legend Veteran

    Joined:
    Sep 20, 2006
    Messages:
    6,202
    Likes Received:
    2,145
  10. Andrew Lauritzen

    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,526
    Likes Received:
    454
    Location:
    British Columbia, Canada
    The settings were pretty much the highest, IIRC... graphics settings other than shadows/view distance usually don't have a significant effect on CPU time. The more data the better, but do you have a strong reason to believe it will vary much?

    The higher the performance gets (say, on low settings), the less interesting it is as well. TR's 99th percentile metric is really a pretty good "single number"... variance between 1-2 ms is not really an issue, for instance, but 16-30 ms is a huge issue (and, depending on the frequency, should be considered as running at the slower speed, hence the metric).
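    For reference, a minimal sketch of how a TR-style 99th percentile frame time can be computed (the frame times here are illustrative, not measured data): sort the per-frame times and read off the value that 99% of frames come in under. A few 30 ms spikes barely move the average but dominate the percentile:

    [code]
    #include <algorithm>
    #include <cstdio>
    #include <vector>

    // 99th percentile frame time: 99% of frames complete at or below this value.
    double percentile99(std::vector<double> frameTimesMs)
    {
        std::sort(frameTimesMs.begin(), frameTimesMs.end());
        size_t idx = static_cast<size_t>(0.99 * (frameTimesMs.size() - 1));
        return frameTimesMs[idx];
    }

    int main()
    {
        // Mostly steady ~16 ms frames with three 30 ms spikes: the average
        // stays around 16.2 ms, but the 99th percentile reports 30 ms.
        std::vector<double> times(200, 16.0);
        times[50] = times[120] = times[180] = 30.0;
        std::printf("99th percentile: %.1f ms\n", percentile99(times));
    }
    [/code]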
     
  11. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,081
    Likes Received:
    651
    Location:
    O Canada!
    The Tech Report article used "High" settings rather than "Ultra" because he was trying to induce some extra CPU limitation. However, Mantle wasn't scaling as much on the faster CPUs, which indicates that while there is more CPU limitation than at the highest graphical settings, it wasn't completely CPU bound, since you see Mantle scale much more in the cases that are.
     
  12. HMBR

    Regular

    Joined:
    Mar 24, 2009
    Messages:
    417
    Likes Received:
    105
    It's a typical game in terms of CPU usage; BF4 is more unusual. That's why Intel CPUs are so good at gaming.

    [IMG]

    [IMG]

    But look at the Phenom II X4 and the G3220 (2 cores):

    NVIDIA's drivers suffer more with 2 cores (or should I say "threads", because the i3 with HT works great) but scale well with more cores (threads), while AMD's work great with 2 cores yet scale less with more cores!?

    The test from the link you posted was done using AMD graphics.
     
  13. eastmen

    Legend Subscriber

    Joined:
    Mar 17, 2008
    Messages:
    10,256
    Likes Received:
    1,711
    Anyone else drooling to see a Mantle build of Civ 5? Late in the game, my CPU (Bulldozer FX-8150) is dead trying to keep up with things on larger maps with tons of city-states and cities.
     
  14. Andrew Lauritzen

    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,526
    Likes Received:
    454
    Location:
    British Columbia, Canada
    Absolutely; I'm just not sure why Jawed thinks the variance would be different on Ultra. I doubt that introduces much more CPU load, and deferred MSAA tends to introduce a pile more GPU load.
     
  15. Dave Glue

    Regular

    Joined:
    Apr 25, 2002
    Messages:
    634
    Likes Received:
    25
  16. heli

    Newcomer

    Joined:
    Sep 27, 2013
    Messages:
    11
    Likes Received:
    0
    Yep that's the one I was referring to. I agree with you.

    That place seems to have a strange aura around it. It must be the feng shui, though not only that. AMD needs to re-architect their forum like they did with GCN.

    Great find. AC4 is an NVIDIA-sponsored game (a GameWorks title?); maybe that also has something to do with it.
     
  17. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    10,873
    Likes Received:
    767
    Location:
    London
    I'm curious to see evidence. Both at higher and lower graphics settings. A single data point is only mildly interesting.

    I expect more advanced graphics settings to increase draw calls per frame. There are too many variables in play to theorise about the net effect on absolute frame rate.

    And I'm curious to see if specific types of graphics settings favour one GPU architecture/driver.

    It's tempting to say that AMD needs Mantle because D3D is such a bad match for its GPU/driver architecture. But I think a bit more evidence is required.
     
  18. mc6809e

    Newcomer

    Joined:
    Jan 24, 2007
    Messages:
    46
    Likes Received:
    5
    Perhaps the best tests should involve complicated scenes with many textures and small objects. Not CPU bound, exactly, but calls-per-second bound.

    The Star Swarm demo seems to be one of the few examples of that. If Civ 5 gets Mantle, it might also be a good example.
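    As a rough sketch of what "calls-per-second bound" means in isolation (submitDrawCall() is a hypothetical stand-in for a minimal state change plus draw; with a real API the same loop would be measuring per-call driver overhead rather than GPU work):

    [code]
    #include <chrono>
    #include <cstdio>

    volatile int sink = 0;

    // Hypothetical stand-in for submitting one small draw call.
    void submitDrawCall(int object)
    {
        sink += object; // placeholder for the per-call CPU cost
    }

    int main()
    {
        const int calls = 100000; // many small objects, as in Star Swarm
        auto start = std::chrono::steady_clock::now();
        for (int i = 0; i < calls; ++i)
            submitDrawCall(i);
        double sec = std::chrono::duration<double>(
                         std::chrono::steady_clock::now() - start).count();
        std::printf("%.0f calls/second\n", calls / sec);
    }
    [/code]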
     
  19. Andrew Lauritzen

    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,526
    Likes Received:
    454
    Location:
    British Columbia, Canada
    I don't think I'd go that far even with "more evidence", but I also don't think it's unreasonable to conclude that NVIDIA's army of software folks earns their pay here.

    And again, obviously none of this is to say that Mantle doesn't reduce overhead, etc. Clearly it does (see Star Swarm, etc), but I still reject the notion that reducing any harmful frame variance in BF4 requires Mantle. I think NVIDIA has a fairly clear counter-example to that, and even a single data point throws that claim into question.
     
  20. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,254
    Likes Received:
    3,459