Xbox One November SDK Leaked

DieH@rd

Hacking group H4LT has announced that it is publicly sharing the Xbox One's SDK and documentation. They want to spread the SDK to the entire console hacking/homebrew community so that there is no bickering over, or withholding of, tools. They have already uploaded the SDK to sharing sites.


http://www.thetechgame.com/News/sid...kit-leaked-on-twitter-with-documentation.html
 
50-70% of 7th CPU core usable on X1 if title not using Kinect voice recognition according to these docs? http://www.neogaf.com/forum/showpost.php?p=145454380&postcount=10


Edit: Definitely seems legit

If your title does not utilize NUI title speech, we recommend that you enable the 7th processing core in order to gain access to additional processing power and to reduce underlying system overhead from the NUI title speech engine.

Enabling the 7th Processing Core

Access to the 7th core is enabled by setting the XboxSystemResources extension to 'extended' in the application package manifest.

Note that process lifetime management states will continue to operate at current levels. For example, when your application becomes constrained it will be limited to 4 CPU cores. When your application is suspended it will have access to 0 CPU cores. The 7th core is only available when your application is in the full state. As a consequence of disabling NUI title speech, other NUI services such as IR and depth are also disabled. Because of this, titles making use of the 7th core should also set the CoreApplication.DisableKinectGpuReservation property to true in order to make use of the 4.5% NUI GPU reserve.

Windows::ApplicationModel::Core::CoreApplication::DisableKinectGpuReserve = true;
Availability of the 7th Core

Because the 7th core is shared with the underlying system OS, titles will not be able to utilize 100% of the core. Titles are always guaranteed at least 50% of the core and will have at most 80% of the core. The amount of the core that is available will vary based on what is happening in the system at any point in time. For example, when the system must process commands spoken by the user (e.g. "Xbox Go To Friends"), it will take up to 50% of the 7th core. After the processing is done, the amount of the core made available to the title will increase again, up to the maximum of 80%.

Source: the CHM file that was also leaked.

Also a 4.5% GPU reserve number in there.
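For what it's worth, the two opt-ins described above might look something like this in practice. The XboxSystemResources extension name, the 'extended' value, and the DisableKinectGpuReservation property all come from the doc quoted above, but the exact manifest XML shape is a guess on my part, not something from the leak:

```xml
<!-- Hypothetical package manifest fragment; element/attribute names are guessed -->
<Extensions>
  <mx:Extension Category="xbox.system.resources">
    <mx:XboxSystemResources Configuration="extended" />
  </mx:Extension>
</Extensions>
```

paired with the DisableKinectGpuReservation property set in code, as shown in the quoted snippet.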

It seems like a dev would be crazy to rely on more than 50% of the 7th core given the guarantees above, but what do I know, maybe they can.
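That caution can be made concrete: since only 50% of the core is guaranteed, any required work pinned there should be budgeted against the guarantee, with the extra 30% treated as opportunistic. A minimal sketch (plain Python standing in for title code; nothing here is XDK API):

```python
def core7_frame_budget_ms(frame_ms, guaranteed_share=0.50, best_share=0.80):
    """Per-frame time available on the shared 7th core.

    Per the leaked docs, a title is guaranteed at least 50% of the core
    and gets at most 80%. Only the guaranteed share is safe to schedule
    required work against; the rest is opportunistic headroom.
    """
    safe = frame_ms * guaranteed_share
    opportunistic = frame_ms * (best_share - guaranteed_share)
    return safe, opportunistic

# At 30 fps (~33.3 ms frames): ~16.7 ms guaranteed, ~10 ms opportunistic.
```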

Now I'm wondering what bearing, if any, this had on that Ubisoft AI presentation that showed CPU numbers that didn't correlate with what we know of X1 vs PS4 (it was more than a 9% advantage for X1). But the numbers won't match adding half a core either, so maybe it was some oddball combination of things pro and con.

Also

In the default system configuration, titles have access to 6 of the 8 processing cores available on the Xbox One. In order to improve performance a game may choose to forego NUI title speech support in order to gain access to a 7th processing core. The 7th core will be available while the title is in the full state. While using the 7th core, users will still have access to Xbox System voice commands such as "Xbox Snap Achievements" or "Xbox Record That". Access to the 7th core requires the October 2014 XDK or newer.

So October 2014 for a date.
 
Now I'm wondering what bearing, if any, this had on that Ubisoft AI presentation that showed CPU numbers that didn't correlate with what we know of X1 vs PS4 (it was more than a 9% advantage for X1).

That is almost exactly what I was suspecting. The Ubisoft presentation (the CPU part) could only be explained if XB1 games could use a 7th core compared to only 6 for PS4.

And it's 50 to 80% of the 7th core, apparently depending on conditions; they give voice commands as an example in the doc (Ubisoft most probably didn't use any voice commands during their aforementioned CPU tests):

For example, when the system must process commands spoken by the user (e.g. "Xbox Go To Friends"), it will take up to 50% of the 7th core. After the processing is done, the amount of the core made available to the title will increase again, up to the maximum of 80%.

I also suspect that Sony may want to do a similar thing in the future (allocate more CPU resources to games).
 
That is almost exactly what I was suspecting. The Ubisoft presentation (the CPU part) could only be explained if XB1 games could use a 7th core compared to only 6 for PS4.

The dates don't work (I now realize), though. The Ubi presentation (http://twvideo01.ubm-us.net/o1/vault/gdceurope2014/Presentations/828884_Alexis_Vaisse.pdf) was in August, and this is the October SDK.

The Ubi difference was 15.3% and the X1 clock advantage is 9.4%, so it's almost explained there. In that thread a suggestion was that the remaining difference could be down to lower DDR3 latency. So basically I don't know that this extra core is needed to explain it.
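The arithmetic holds up. A quick check using the widely reported Jaguar CPU clocks (1.75 GHz X1, 1.6 GHz PS4 — figures from public spec discussions, not from this leak):

```python
x1_clock, ps4_clock = 1.75, 1.6          # GHz, CPU clocks
clock_ratio = x1_clock / ps4_clock       # 1.09375 -> the ~9.4% advantage
ubi_ratio = 1.153                        # 15.3% difference from the Ubi slides

# Portion of the measured gap the clock difference does not explain:
unexplained = ubi_ratio / clock_ratio - 1   # ~5.4%; DDR3 latency is one candidate
```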
 
The dates don't work (I now realize), though. The Ubi presentation (http://twvideo01.ubm-us.net/o1/vault/gdceurope2014/Presentations/828884_Alexis_Vaisse.pdf) was in August, and this is the October SDK.

The Ubi difference was 15.3% and the X1 clock advantage is 9.4%, so it's almost explained there. In that thread a suggestion was that the remaining difference could be down to lower DDR3 latency. So basically I don't know that this extra core is needed to explain it.

Yes, you are right. Too bad, because 15.3% is roughly 1/6...
 
Perhaps it was added before October, or there was a preview that bigwigs at places like Ubisoft's Assassin's Creed studio had access to...?

Anyway, it's interesting that they seem to describe this as freeing up some bandwidth (edit: presumably main memory?) for the GPU. Perhaps the X1 really is more "balanced" than initial performance seemed to indicate.

Another edit: as bottlenecks shift around many times during each frame, perhaps this CPU headroom has allowed developers more time to be bottlenecked on the GPU, and to start making up some of that 28% flop disadvantage.

It will be good to see what comes of the extra control over ESRAM in the December update...
 
The Ubi difference was 15.3% and the X1 clock advantage is 9.4%, so it's almost explained there. In that thread a suggestion was that the remaining difference could be down to lower DDR3 latency. So basically I don't know that this extra core is needed to explain it.
Well, MS has hinted that their cores are improved over baseline. The XB1's memory advantage is probably the reason, however: more bandwidth and lower latency for the CPU's memory allocation.
 
True, as is the lower-latency memory access (although due to the higher clocks, in terms of cycles it may not be that much lower), and they have a 30 GB/s bus, as opposed to a 20 GB/s bus, feeding the CPU.

Looking at some more comments about the docs, it now seems that if you ask for that extra half a core, you also get the option of disabling some of the NUI reservation. So a game can now be guaranteed 6.5 cores and about 95% of the GPU.
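Putting the quoted guarantees together (the 6 default title cores, the 50% floor on the 7th, and the 4.5% NUI GPU reserve are all from the doc; note that 100% minus 4.5% is actually 95.5%, so "95%" is a slight round-down):

```python
default_title_cores = 6
core7_guaranteed = 0.50   # floor on the shared 7th core
nui_gpu_reserve = 0.045   # reclaimable via DisableKinectGpuReservation

cpu_cores_guaranteed = default_title_cores + core7_guaranteed  # 6.5 cores
gpu_share = 1.0 - nui_gpu_reserve                              # 0.955 -> 95.5%
```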

Next year is looking up for X1 multiplats.

Does anyone know if one of the two unavailable PS4 cores is deactivated for yields, or are they both system-reserved like they are (or were) on the Xbox One?
 
Does anyone know if one of the two unavailable PS4 cores is deactivated for yields, or are they both system-reserved like they are (or were) on the Xbox One?

Same as the Xbone: the OS uses them. The KZ:SF presentation around the PS4 launch showed that developers had access to 6 full CPU cores at the time.

As for yields, the PS4's GPU has 20 CUs [same as the 7870], but 2 are shut down for yields. The GPU is very big, and the chance that a defect will land on the GPU is much larger than one hitting the CPU.
http://www.extremetech.com/wp-content/uploads/2013/11/ps4-reverse-engineered-apu.jpg
 
Tiled resources were a "preview" feature until October. That's interesting. DirectX 12-style descriptor tables have also been supported since October. The wording there is interesting because it mentions being "prepared" for DirectX 12 descriptor tables, which makes it sound like there is still going to be a shift from their DX11.1-based API over to generic DX12.
 
How long does it take a game to go from gold to market? Just curious when games using the October SDK would hit the market. A feature like tiled resources being preview-only is interesting, because I'd guess it will take a while for a game to ship using it; it seems like something that would have to be planned from the beginning, not added at the end of development.
 
The X360 had, throughout the generation, a much slimmer OS requirement than the PS3, and now here we are again. I wonder if it will remain that way for this gen too.

A lot of people knock the Xbox One's CPU advantage versus the PS4's GPU advantage, but there can be a lot of instances where a faster CPU will help them out.
 
HW-accelerated video encode/decode was added as a preview feature for the monolithic D3D runtime in March 2014? Asynchronous compute added as a preview feature in March 2014? So weird. The console had two runtimes, the standard D3D11 runtime and the mono runtime. The mono runtime basically deprecated the "stock" D3D11 runtime in May 2014. Really weird. I wonder if most games used stock or mono from release?

April (preview) and May (release) saw "fast semantics", which sounds like the low-level API. April also saw scaler control: bilinear, four-tap sinc, four/six/eight/ten-tap Lanczos, and six-tap band-limited Lanczos.
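For reference on those filter names: an n-tap Lanczos scaler samples the Lanczos kernel (a sinc windowed by a wider sinc) at n offsets around each output position. A minimal sketch of the kernel itself (generic signal-processing math, not XDK code):

```python
import math

def lanczos(x, a=3):
    """Lanczos kernel: sinc(x) windowed by sinc(x/a), support (-a, a).

    A 2a-tap filter (e.g. a=3 for six taps) evaluates this at the 2a
    sample offsets nearest each output pixel.
    """
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)
```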

The release notes are probably the most interesting part of the leak. I don't know why there are no "What's new" entries from August 2013 to February 2014. They have most months from March 2012 to August 2013.
 
Developers should not change the content update format for an already shipped package, as it will force a full re-download of the entire package.

Ubisoft?

I wonder if most games used stock or mono from release?

I got the impression it was essentially D3D11 used for launch.
 
I got the impression it was essentially D3D11 used for launch.

Yah, looking at the new features for each SDK, the changes since launch are massive. And it's odd to think that with DX12 coming, and rumours of Windows 10 on the Xbox One, there will probably be another big change happening. Strange times for Xbox. They basically launched the console when the low-level API was not ready, and by the look of it the profiling tools were not ready either. They've also totally redone the way multiplayer works, if you look at the notes for Multiplayer 2015.
 
Yup, the software side of the Xbox One was definitely very far from ready. As you said, the release notes are extremely interesting. There's obviously going to be a big update for DX12 and Win10 this year. It should be really interesting to see what the second big wave of exclusive games looks like compared to the first batch, given how dodgy the XDK was at launch (I'm still amazed by how Ryse and Forza 5 turned out, considering what we know now...).

BTW, I also noted a lot of talk about AA in the XDK, especially MSAA... which was, incidentally, put to really great use in Forza Horizon 2.
 
I guess the good comes with the bad: Microsoft is obviously taking the console very seriously, and they're working hard to improve it at a rapid pace. The positivity around DX12 may have some merit, considering the state of the API on the console as it is now. It looks like they can get things in order, and they're working hard at it.

It's kind of crazy to think that at the time Infamous Second Son was released, asynchronous compute had just been added to the XDK as a preview feature.
 