Xbox One (Durango) Technical hardware investigation

Status
Not open for further replies.
GCN may have more features than what DX11/11.1/11.2 requires, but the "+" in "DX11.1+" refers to additional features in the XB1 HW/SW compared to standard DX11/11.1/11.2 (according to John Sell at the Hot Chips conference and his recent article):



Also, it may not be related to technical hardware discussions in this thread (since it won't add any technical knowledge to our discussion), but I think Spencer once said that there are some DX12 features in XB1 (he didn't want to talk about them, but somehow confirmed their existence).

(listen from 28:00)

The two bolded sentence fragments practically mean the same thing.

GCN has more features than DX11/11.1/11.2, so the "+" in "DX11.1+" means the features that GCN supports that DX11.1 doesn't require.

There are some software DX12 features in the XB1 API, and there have been for a while. This doesn't mean the GPU contains any DX12 hardware.
 
The two bolded sentence fragments practically mean the same thing.

GCN has more features than DX11/11.1/11.2, so the "+" in "DX11.1+" means the features that GCN supports that DX11.1 doesn't require.

No. Prior to Bonaire, GCN had no feature level 11_2 capabilities, so the first part of your sentence is incorrect, and as I said before, "+" means features additional/unique to XB1. The question is how those features can be unique to XB1 when they are already present in GCN; it seems you are saying that millions of GPUs and the PS4 are already using XB1's unique features. If you think they were only talking about SW features at the time, it's better to read/listen to their presentation again. "DX11.1+ graphics core" means that XB1's GPU has feature level 11_1+ capabilities, and if you think that's not what they were trying to say, you can read the quote from the article again:

The GPU contains AMD graphics technology supporting a customized version of Microsoft DirectX graphics features. Hardware and software customizations provide more direct access to hardware resources than standard DirectX. They reduce CPU overhead to manage graphics activity and combined CPU and GPU processing. Kinect makes extensive use of combined CPU-GPU computation.

It's very clear that those features are supported through some software and hardware customizations.

There are some software DX12 features in the XB1 API, and there have been for a while. This doesn't mean the GPU contains any DX12 hardware.

Where did you read that those DX12 features in XB1 are software only? I'm really interested to know your source.
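To make the feature-level argument above concrete: Direct3D's runtime negotiates a feature level by walking an ordered list the caller supplies and returning the highest one the hardware supports, which is why a pre-Bonaire GCN part caps out below 11_2 regardless of what the title asks for. A minimal sketch of that negotiation, with an illustrative capability table (not real driver data):

```python
# Sketch of Direct3D-style feature-level negotiation: the runtime walks the
# caller's ordered list and returns the first level the hardware supports.
# HW_CAPS is a hypothetical capability set, not data from a real driver.

HW_CAPS = {"11_0", "11_1"}  # what a hypothetical pre-Bonaire GCN part exposes

def create_device(requested_levels):
    """Return the first requested feature level the hardware supports."""
    for level in requested_levels:
        if level in HW_CAPS:
            return level
    raise RuntimeError("no requested feature level is supported")

# A title asking for 11_2 first still gets a device, just at a lower level.
print(create_device(["11_2", "11_1", "11_0"]))  # -> 11_1
```

This mirrors how `D3D11CreateDevice` takes a `pFeatureLevels` array and hands back the one actually granted; anything beyond the granted tier has to be exposed some other way.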
 

DX11.1+ doesn't mean features unique to the XB1; where has there been any indication of that? Also, we know that the XB1 is not a DX12 GPU, as they have previously mentioned it is the same generation of GPU as Sony's; the customisations no doubt centre more on the additional memory pools and things like having two GFX pipelines to draw quick overlays.

I suggest you read this.

https://forum.beyond3d.com/threads/...ectx-11-2-amd-gpu-whats-the-difference.54781/
 
DX11.1+ doesn't mean features unique to the XB1; where has there been any indication of that?

I told you here: John Sell said that at the Hot Chips conference. Those features are unique to XB1.

Also, we know that the XB1 is not a DX12 GPU, as they have previously mentioned it is the same generation of GPU as Sony's

They didn't say XB1 has no DX12 capability; they said the XB1 GPU is based on the Sea Islands family:

Just like our friends we're based on the Sea Islands family. We've made quite a number of changes in different parts of the areas.

They started their work from AMD DX11 design as the baseline requirement.

the customisations no doubt centre more on the additional memory pools and things like having two GFX pipelines to draw quick overlays.

You mentioned their customisations to support two independent graphics contexts (which help with system/title rendering, too), yet you ignore their comments about the highly customised command processors in XB1 (the biggest customisation in XB1):

we took that opportunity with Xbox One and with our customised command processor we've created extensions on top of D3D which fit very nicely into the D3D model and this is something that we'd like to integrate back into mainline 3D on the PC too - this small, very low-level, very efficient object-orientated submission of your draw [and state] commands.


I did and I don't think that it adds anything new to our discussion.
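The "very low-level, very efficient object-orientated submission of your draw [and state] commands" described in the quote amounts to recording state and draw calls once into small command objects and replaying them with one submission, instead of paying per-call CPU overhead as in classic immediate-mode D3D11. A hedged sketch of that model (all class and method names here are illustrative, not the actual XB1 API):

```python
# Sketch of object-orientated command submission: draws and state changes are
# recorded once into a command object and replayed with minimal per-call CPU
# work. Names are illustrative, not the real Xbox One extensions.

class CommandList:
    def __init__(self):
        self.ops = []                       # recorded (opcode, payload) pairs
    def set_state(self, state):
        self.ops.append(("STATE", state))
    def draw(self, vertex_count):
        self.ops.append(("DRAW", vertex_count))

class CommandProcessor:
    """Consumes recorded lists; the CPU cost of submit() is one call,
    not one call per draw."""
    def __init__(self):
        self.executed = []
    def submit(self, cmd_list):
        self.executed.extend(cmd_list.ops)

cl = CommandList()
cl.set_state({"blend": "opaque"})
cl.draw(36)
cl.draw(24)

cp = CommandProcessor()
cp.submit(cl)                # one submission covers many draws
print(len(cp.executed))      # -> 3
```

This is the same shape the mainline PC API later took with D3D12's command lists, which fits the quote's remark about wanting to "integrate back into mainline 3D on the PC".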
 

https://forum.beyond3d.com/posts/1757241/

Look, the computing cores are probably almost identical, if not identical, for both X1 and PS4, and the general feature set is definitely a close match in both cases (multitasking, 3D and compute, tessellation and everything one would expect from a modern GPU). Both are 11.2 for sure. Those pluses usually mean some extra stuff that's not exposed through regular 11.2 interfaces. Stuff like mappable SRAM would be that. Some new texture compression format would be in the "+" space. Custom filtering for, say, shadows' PCF would be in the "+" space. It's not PR, it's "these are 11.2 GPUs with some additional/custom stuff we can't talk too much about".

Some changes that happened recently were well grounded in math. If you bump up the core speed, you get extra power. The memory controller case is curious but not impossible - driver code* tends to abuse HW design in many cases. You do this whenever you can to gain some advantage. Programmers have done this forever - take the C64 and sprites on the screen border.

This has nothing to do with PR, but I'm pretty sure that you wouldn't have a clue how the HW is being exploited even if you got all the technical details on a silver platter. If it's PR for you - fine. But being ignorant in the "Console Technology" section is something that won't get you far around here.

* consoles have no drivers; they have a thin code layer that mostly hides HW issues from the game developers and exposes nice and familiar interfaces to them

Is pretty relevant
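The quoted post's framing of the "+" can be modeled simply: a device reports a standard feature level, and platform or vendor extras (the post's examples: mappable SRAM, a custom texture compression format, custom PCF filtering) are queried separately, outside the feature-level tiers. A sketch under that assumption, with hypothetical extension names:

```python
# Sketch of "11.2 plus extras": the standard feature level and a separate set
# of platform extensions queried outside the DX tiers. Extension names are
# hypothetical, taken only from the examples in the quoted post.

class Device:
    feature_level = "11_2"
    extensions = {"mappable_esram", "custom_pcf_filter"}

def supports(dev, cap):
    """True if cap is the reported feature level or a queried extension."""
    return cap == dev.feature_level or cap in dev.extensions

dev = Device()
print(supports(dev, "11_2"))            # True: the standard tier
print(supports(dev, "mappable_esram"))  # True: a "+" extension
print(supports(dev, "11_3"))            # False: not a higher DX tier
```

On PC, vendor libraries expose extras in roughly this two-step way, which is why "11.2 with custom stuff" and "feature level 11_2" are not contradictory claims.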
 
[attached image: PS4 GPU diagram]


PS4 has only one GCP.

[attached image: Xbox One GPU diagram]

Xbone has 2 GCPs.

right?
 
The PS4 has a second graphics front end. It seems like the exact availability of resources (possibly limited compute capability) and the priorities assigned to that front end might differ from how Microsoft has done it, but public details are sparse.
 
The PS4 has a second graphics front end. It seems like the exact availability of resources (possibly limited compute capability) and the priorities assigned to that front end might differ from how Microsoft has done it, but public details are sparse.

Can you show this second front end for the PS4? What is missing from googoo's picture for the PS4?
 
Can you show this second front end for the PS4? What is missing from googoo's picture for the PS4?

http://www.vgleaks.com/orbis-gpu-compute-queues-and-pipelines

The high-priority 3D pipe is the other front end, reserved for VSHELL, so it's a system-reserved front end. The diagrams do not show a direct link to a CS dispatch box, but some of the other text indicates there could be a reserved pipe for the system.
The exact nature of the front ends for Durango is not clear, although one of the more conceptually straightforward interpretations is that there is a similar game/non-game split for them. The relative priorities when it comes to allocating work, and the exact breadth of their access to the full resources of the GPU is unclear, so the two designs may have somewhat similar duplication but for different desired ends.
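The game/non-game split described above can be sketched as two submission queues of different priority feeding one GPU, with the system (shell/VSHELL) pipe served first. This is a hypothetical model of the scheduling, not the actual Durango or Orbis arbitration rules, which as noted are not public:

```python
# Sketch of a dual graphics front end: a high-priority system pipe (e.g. the
# OS shell overlay) and a normal-priority game pipe feeding one GPU.
# The priority scheme is illustrative only.

import heapq

SYSTEM, GAME = 0, 1            # lower number = served first

def run(gpu_work):
    """gpu_work: list of (priority, name) tuples. Returns execution order."""
    heapq.heapify(gpu_work)
    order = []
    while gpu_work:
        _, name = heapq.heappop(gpu_work)
        order.append(name)
    return order

work = [(GAME, "game frame N"), (SYSTEM, "shell overlay"), (GAME, "game frame N+1")]
print(run(work))  # the shell overlay is drawn before queued game work
```

The open question in the thread maps onto which resources (CUs, compute dispatch) each queue can actually reach, not onto the queueing itself.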
 
Does the number of GCPs equal the number of graphics contexts?

I think you guys may have some wrong info, or someone really did write wrong info on that site.
 
With regards to the comment earlier, even Tahiti class hardware has capabilities not exposed through DX... I'm sure the architects could point out capabilities in the VLIW chips that aren't exposed in DX yet. Just because a rev of DX is released on PC does not mean it is absolutely a direct mapping or superset of an individual IHVs capabilities.
 
Sounds like some API improvements in more recent SDKs have given devs a little more flexibility in how they use ESRAM. Interesting that there were restrictions on it in the first place.

http://www.polygon.com/2014/12/30/7471133/xbox-one-sdk-update-esram-dying-light-techland

I mean, it's memory. All you can do with it is read and write. What kind of restrictions could have been in place? Specific API calls that only allowed certain types of data/render targets, or format restrictions?
 
I mean, it's memory. All you can do with it is read and write. What kind of restrictions could have been in place? Specific API calls that only allowed certain types of data/render targets, or format restrictions?

Per the ESRAM thread, there were ways of using the ESRAM that were not being implemented in the early stages of the generation.

https://forum.beyond3d.com/posts/1614724/

Being able to split a buffer so that parts of it could reside outside of the ESRAM was not projected to be adopted until later software waves, and apparently more flexible streaming of data in and out of the ESRAM was going to come into its own later as well.

Part of that may be that the means to set up split targets and to flexibly move targets in and out as needed required validating the ways of defining these more complex behaviors and tracking of hazards. The more flexible allocation could shave off more significant portions of ESRAM consumption that would worsen capacity pressure, and if asynchronous streaming is implemented, it might allow for better utilization of space by reducing the necessary footprint at a given point in the frame, or by more rapidly freeing up space rather than waiting for more coarse synchronization points. This is likely more difficult to get right than the earlier stages that allowed for static allocation of whole resources and noting which ones could be reused.
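The split-buffer idea above is easy to model: a render target's most bandwidth-hungry portion goes into the 32 MB ESRAM and the remainder spills to DRAM, so the ESRAM footprint is a fraction of the target rather than all-or-nothing. A sketch with an illustrative placement policy (sizes and fractions are made-up examples, not real title budgets):

```python
# Sketch of split render-target placement: part of each buffer is resident in
# the 32 MB ESRAM, the rest spills to DRAM. Sizes in MB; the greedy policy
# and the example targets are illustrative only.

ESRAM_MB = 32

def place(targets, esram_budget=ESRAM_MB):
    """targets: list of (name, size_mb, esram_fraction), where esram_fraction
    is how much of the buffer the title wants resident in ESRAM.
    Returns ({name: (esram_mb, dram_mb)}, total_esram_used)."""
    placement, used = {}, 0.0
    for name, size, frac in targets:
        want = size * frac
        in_esram = min(want, esram_budget - used)  # clip to remaining budget
        used += in_esram
        placement[name] = (in_esram, size - in_esram)
    return placement, used

targets = [("color", 16, 1.0), ("depth", 8, 1.0), ("gbuffer", 24, 0.5)]
placement, used = place(targets)
print(used)  # -> 32.0: the gbuffer's 12 MB request is clipped to the 8 MB left
```

Compare that with whole-resource allocation, where the 24 MB gbuffer simply could not be placed at all once color and depth took 24 MB; that is the capacity-pressure relief the post describes.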
 
Since October it has been possible to use up to 80% of the 7th CPU core. 50% of the core is always available to the title; the other 30% is dynamically used by the OS for voice recognition of OS commands and is available to the title when no voice recognition is needed.

Well this is a small, but nice improvement.
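The arithmetic of that reservation, spelled out: the title always keeps 50% of the core, and gains another 30% whenever the OS isn't doing voice recognition, for the 80% maximum mentioned above.

```python
# The 7th-core budget described above: 50% is guaranteed to the title, and a
# further 30% is available whenever OS voice recognition is idle.

def seventh_core_share(voice_recognition_active):
    """Return the title's share of the 7th core, in percent."""
    guaranteed = 50
    dynamic = 0 if voice_recognition_active else 30
    return guaranteed + dynamic

print(seventh_core_share(False))  # -> 80
print(seventh_core_share(True))   # -> 50
```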
 