DirectX 12: The future of it within the console gaming space (specifically the XB1)

Discussion in 'Console Technology' started by Shortbread, Mar 7, 2014.

  1. MJP

    MJP
    Regular

    Joined:
    Feb 21, 2007
    Messages:
    566
    Likes Received:
    187
    Location:
    Irvine, CA
    That would be great, but they've already mentioned how they want it to run on existing hardware but also support new hardware features. Personally I'm totally fine with feature levels, but I suppose in practice they have a hard time fitting a wide array of hardware into such coarse buckets of functionality.
     
  2. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,245
    Likes Received:
    4,465
    Location:
    Finland
    Not to mention that it would mean endless compromises on what the API could actually feature. IIRC with DX10, for example, when we got rid of the "cap bits", there were supposed to be more features included than what we finally got, because NV didn't support some of the planned features on G80. Now, with feature levels, all those features can be made available even if one vendor doesn't support some of them (like feature level 11_0 vs 11_1).
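A toy sketch of the distinction being made here, with hypothetical capability names rather than the real Direct3D API: cap bits advertise features individually, while feature levels are coarse, ordered buckets where each level requires everything below it plus its own additions.

```python
# Toy illustration (not the real Direct3D API): feature levels are
# coarse, ordered buckets, while cap bits advertise features one by one.

# Hypothetical per-vendor capability sets.
CAPS_VENDOR_A = {"compute", "tessellation", "uav_slots_64"}
CAPS_VENDOR_B = {"compute", "tessellation"}  # lacks one capability

# Each level's full requirement set, lowest to highest.
FEATURE_LEVELS = [
    ("11_0", {"compute", "tessellation"}),
    ("11_1", {"compute", "tessellation", "uav_slots_64"}),
]

def highest_feature_level(caps):
    """Return the highest bucket whose whole requirement set is supported."""
    best = None
    for level, required in FEATURE_LEVELS:
        if required <= caps:  # subset test: every required cap present
            best = level
    return best

print(highest_feature_level(CAPS_VENDOR_A))  # 11_1
print(highest_feature_level(CAPS_VENDOR_B))  # 11_0
```

This is why the buckets feel coarse in practice: Vendor B loses the entire 11_1 bucket over a single missing capability, which is the trade-off against the old cap-bit soup.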
     
  3. Miksu

    Regular

    Joined:
    Mar 9, 2003
    Messages:
    997
    Likes Received:
    10
    Location:
    Finland
    Microsoft's Build Conference starts tomorrow and there seems to be one session dedicated for DirectX 12:

    Direct3D 12 API Preview
     
  4. Ethatron

    Regular Subscriber

    Joined:
    Jan 24, 2010
    Messages:
    948
    Likes Received:
    417
    Correct. JPEG has no random-access ability in the ISO definition; to get the last pixel you have to decode the whole serial bitstream.
    That said, it would be possible to do fixed-rate coding on DCT blocks in general, but that's not called JPEG, and my guess is it'd really suck quality-wise. :)
     
  5. DmitryKo

    Regular

    Joined:
    Feb 26, 2002
    Messages:
    967
    Likes Received:
    1,223
    Location:
    55°38′33″ N, 37°28′37″ E
  6. Starx

    Regular

    Joined:
    Sep 29, 2013
    Messages:
    294
    Likes Received:
    148
    #266 Starx, Apr 4, 2014
    Last edited by a moderator: Apr 4, 2014
  7. Starx

    Regular

    Joined:
    Sep 29, 2013
    Messages:
    294
    Likes Received:
    148
  8. NRP

    NRP
    Veteran

    Joined:
    Aug 26, 2004
    Messages:
    2,712
    Likes Received:
    293
    That article sounds like a bunch of hyperbolic crap. It's highly doubtful the Bone will get a 2X speedup from D3D12, especially if D3D12 is based on the Xbox API. Unless of course, MS really screwed the pooch with the XBone API.
     
  9. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,237
    Likes Received:
    4,260
    Location:
    Guess...
  10. DmitryKo

    Regular

    Joined:
    Feb 26, 2002
    Messages:
    967
    Likes Received:
    1,223
    Location:
    55°38′33″ N, 37°28′37″ E
    You cannot directly translate CPU utilization improvements into framerate improvements; that could only happen with a very slow CPU driving a very fast GPU.

    In reality the new API and driver model would free some additional CPU time for the developer. Those extra 3-5 ms per frame can be used to perform more AI tasks, to batch more workload to the GPU and better saturate its execution units, or just to get the same 60 fps on your TV (but in a more efficient way :) ). But it cannot magically make your GPU perform twice as fast and exceed its theoretical maximum performance. You will still be limited by the GPU in graphics-heavy workloads.
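The arithmetic behind this point can be sketched in a few lines (the millisecond figures are illustrative, not measurements): if CPU and GPU work overlap, frame time is set by the slower of the two, so a CPU saving only raises the framerate when the CPU is the bottleneck.

```python
def frame_rate(cpu_ms, gpu_ms):
    """Frames per second when frame time is bounded by the slower
    of CPU and GPU work (assuming the two fully overlap)."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound case: saving 4 ms of CPU time changes nothing.
print(frame_rate(cpu_ms=12.0, gpu_ms=16.7))  # ~59.9 fps before...
print(frame_rate(cpu_ms=8.0,  gpu_ms=16.7))  # ...and ~59.9 fps after

# CPU-bound case: the same saving shows up directly.
print(frame_rate(cpu_ms=20.0, gpu_ms=16.7))  # 50.0 fps
print(frame_rate(cpu_ms=16.0, gpu_ms=16.7))  # ~59.9 fps
```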
     
  11. Lalaland

    Regular

    Joined:
    Feb 24, 2013
    Messages:
    864
    Likes Received:
    693
    Yup, Mantle has always seemed to me like the ATI answer to the low IPC throughput of AMD CPUs, letting Kaveri and their future APU designs punch above their weight. It's a good thing that we're getting standards-based solutions to this, though, rather than a proprietary API (CUDA, how are ya).
     
  12. DmitryKo

    Regular

    Joined:
    Feb 26, 2002
    Messages:
    967
    Likes Received:
    1,223
    Location:
    55°38′33″ N, 37°28′37″ E
    Does "conservative rasterisation" really require new hardware? From what I understand, this can be implemented at the driver level on current architectures, since the GPU is basically a general-purpose many-core wavefront processor where the driver handles the native code that runs on the wavefronts. You would need to alter the algorithms for rasterisation or setup, but it doesn't require new instructions for operand swizzle, more physical registers, or virtual memory descriptor tables, as other level 11_1 features do.
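For what it's worth, the core of conservative rasterisation is just a different coverage rule, which is why a software implementation is at least conceivable. A toy sketch using the standard edge-function formulation (not any vendor's actual driver code): instead of sampling the pixel centre, evaluate each edge at the cell corner that maximises it, so any cell the triangle touches at all is reported covered.

```python
def edge_coeffs(p, q):
    """Coefficients (a, b, c) of the edge function E(x, y) = a*x + b*y + c,
    positive on the interior side of edge p->q for a CCW triangle."""
    a = p[1] - q[1]
    b = q[0] - p[0]
    c = p[0] * q[1] - p[1] * q[0]
    return a, b, c

def cell_overlaps(tri, x0, y0, size=1.0):
    """Conservative coverage: does the [x0, x0+size] x [y0, y0+size] cell
    touch the CCW triangle at all?  Evaluate each edge at the cell corner
    that maximises it; if any edge is negative even there, the whole cell
    lies outside that edge, hence outside the triangle."""
    for i in range(3):
        a, b, c = edge_coeffs(tri[i], tri[(i + 1) % 3])
        x = x0 + size if a > 0 else x0
        y = y0 + size if b > 0 else y0
        if a * x + b * y + c < 0:
            return False
    return True

# A small triangle that misses the cell's centre sample at (0.5, 0.5),
# so centre-sampled rasterisation would skip it, yet it overlaps the cell.
tri = [(0.1, 0.1), (0.3, 0.1), (0.2, 0.2)]
print(cell_overlaps(tri, 0, 0))  # True  (cell touches the triangle)
print(cell_overlaps(tri, 2, 2))  # False (cell is well outside)
```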
     
    #272 DmitryKo, Apr 5, 2014
    Last edited by a moderator: Apr 5, 2014
  13. zed

    zed
    Legend

    Joined:
    Dec 16, 2005
    Messages:
    6,415
    Likes Received:
    2,139
    Beat me to the punch.
    But if in a year xbone titles run faster than ps4 titles, I will admit to being wrong. Hell, even today, if he can prove that.
    Until then this guy should be ignored.
     
  14. mosen

    Regular

    Joined:
    Mar 30, 2013
    Messages:
    452
    Likes Received:
    152
    Ray-tracing isn't part of DX12, but the DX team is looking at some possibilities/opportunities for supporting non-hardware-accelerated ray-tracing. They have some projects with colleges that could bring benefits.

    http://channel9.msdn.com/Events/Build/2014/9-004 - (Start at 18:32)
     
  15. ramr

    Newcomer

    Joined:
    Jan 19, 2013
    Messages:
    169
    Likes Received:
    32
    That is Brad Wardell, so you have to assume that he knows exactly what he is saying. Brad is no MS fanboy, given how long he avoided working in Windows.
     
  16. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
    Again, it doesn't matter who's saying it. There are a couple of fundamental console engineering issues here that make that assertion nigh impossible. 1) For DX12 to give XB1 a 2x speed increase, its current API must be horribly inefficient and pretty crippled. We know that some aspects of DX12 are already present in XB1, so the likelihood of this is virtually nil. 2) Should XB1 become twice as fast, going by multiplatform games the hardware would equal or exceed PS4 in performance, meaning 12 CUs outrunning 18 CUs by way of an API and some ESRAM. Again, the chances of that happening are basically nil.

    I suppose, playing devil's advocate, Sony's API could be as gimped as XB1's, similarly reducing PS4's hardware to half its performance, and then when DX12 flies in to save the day, XB1 will be unlocked and reach its full potential, which PS4 can't hope to achieve. Realistically, though, it's bunk. Consoles have thin APIs (maybe not XB1 with its 3-flavour Windows base??) that let the hardware run full tilt. People shouldn't be looking for massive speed-ups. Optimisations, sure, but anyone claiming 2x the performance is either being taken out of context (talking about one single aspect being twice as fast) or babbling like a lunatic.
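The back-of-envelope numbers behind point 2, using the commonly cited public figures (12 CUs at 853 MHz for XB1, 18 CUs at 800 MHz for PS4, with 64 lanes and 2 FLOPs per lane per clock for GCN):

```python
def gcn_tflops(cus, clock_ghz, lanes=64, flops_per_lane=2):
    """Peak single-precision TFLOPS for a GCN-style GPU:
    CUs x 64 lanes x 2 FLOPs (one FMA) x clock."""
    return cus * lanes * flops_per_lane * clock_ghz / 1000.0

xb1 = gcn_tflops(12, 0.853)  # ~1.31 TFLOPS
ps4 = gcn_tflops(18, 0.800)  # ~1.84 TFLOPS

print(round(xb1, 2), round(ps4, 2))
print(2 * xb1 > ps4)  # True: a genuine 2x would exceed PS4's peak
```

A genuine 2x would mean the ~1.31 TFLOPS part sustaining ~2.6 TFLOPS of delivered work, well past the ~1.84 TFLOPS part's theoretical peak, which is exactly the implausibility being described.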
     
  17. Jwm

    Jwm
    Veteran

    Joined:
    Feb 27, 2013
    Messages:
    1,037
    Likes Received:
    155
    Location:
    Texas
    Where did all the cool tags go?
     
  18. MJP

    MJP
    Regular

    Joined:
    Feb 21, 2007
    Messages:
    566
    Likes Received:
    187
    Location:
    Irvine, CA
    There are still a handful of fixed-function hardware units that handle clipping, scan conversion, depth testing, and spinning up pixel shaders.
     
  19. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    15,134
    Likes Received:
    7,680
    He seems to believe that the gains are found on the CPU side, from better threading and efficiency gains in the D3D API. I don't buy it.
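The CPU-side threading gain he's referring to is the move from feeding one immediate context on a single thread to recording command lists on worker threads and submitting them in order. A toy model of that pattern in plain Python (not the real D3D12 API):

```python
from concurrent.futures import ThreadPoolExecutor

# Toy model (not real D3D): each worker records its own command list,
# then the main thread submits them in a fixed order -- the D3D12-style
# pattern, versus D3D11 where one thread feeds a single immediate context.

def record_commands(objects):
    """Record a command list for one slice of the scene."""
    return [f"draw {obj}" for obj in objects]

scene = [f"mesh{i}" for i in range(8)]
slices = [scene[i::4] for i in range(4)]  # 4 worker threads

with ThreadPoolExecutor(max_workers=4) as pool:
    command_lists = list(pool.map(record_commands, slices))

# Recording was parallel, but submission order stays deterministic.
queue = [cmd for cl in command_lists for cmd in cl]
print(len(queue))  # 8
```

The win is only on the recording side: the GPU still consumes one ordered stream, which is why this helps CPU-bound titles and does nothing for GPU-bound ones.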
     
  20. Lalaland

    Regular

    Joined:
    Feb 24, 2013
    Messages:
    864
    Likes Received:
    693
    Yup, I can't see a 100% performance boost on XB1 from DX12 unless MS gave the XB1 API development to 2 interns with a kegger and said 'have at it, your deadline is tomorrow by 5'.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.