Xbox One (Durango) Technical hardware investigation

Status
Not open for further replies.
For the last 4+ years MS has been re-writing the .NET stack, making it more modern and adopting more modern developer patterns.

The same goes for WinRT; it, however, had the luxury of being born in the 'modern' age and so was designed from the ground up with modern developer patterns.

The same, I believe, is going on with DirectX: it too is probably going through a 'modern' rewrite based on modern hardware and modern developer patterns. And because so many MS products need DirectX, I bet they also influenced a 'modern' DirectX.

We had this discussion many times over the last few years at MVP Summit and //Build/ with various MS folk. DirectX is too core and crucial for MS, and it desperately needs some rethinking of some of its libraries; and yes, I too agree that it may introduce one or two new libraries for the modern age :)
 
I do not agree with your statements liquidboy. I would not say that MS has been re-writing the .NET stack. They have been evolving it. That is a substantial difference in meaning.
 
I was always under the assumption you didn't really use DirectX on Xbox 360 or Xbox One. You just used their similar-but-different low-level libraries specifically designed for the respective hardware. Have you ever seen anything in DirectX (as used on Windows) that has specific support for consoles?

Tommy McClain

I've never developed on Xbox hardware, but from what I understand the APIs are set up such that they superficially resemble the PC APIs on a code level. This lets you code the same way you would on PC (and perhaps even use the exact same code written for PC), and it will compile and run on the Xbox. However, on the Xbox they also expose lower-level details and interfaces that aren't available on PC, so that you can access hardware-specific functionality.
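That "same portable surface, extra console hooks" layering can be pictured with a toy sketch. This is illustrative Python only; the class names and methods are made up for the example and are not actual DirectX or Xbox interfaces:

```python
# Toy illustration (NOT actual DirectX) of the pattern described above:
# the portable API looks identical everywhere, while one platform
# additionally exposes lower-level, hardware-specific entry points.

class Device:
    """Portable surface: this is all PC-style code ever sees."""
    def draw(self, mesh):
        return f"drew {mesh}"

class ConsoleDevice(Device):
    """Same portable surface, plus hypothetical hardware-specific extensions."""
    def draw_with_command_buffer(self, raw_bytes):
        # Made-up low-level path that would not exist on the PC build.
        return f"submitted {len(raw_bytes)} raw bytes"

# Code written against the portable surface runs unchanged on either:
for dev in (Device(), ConsoleDevice()):
    print(dev.draw("quad"))
```

Console-specific code can then opt into the extra interface where it exists, which is consistent with the "compiles the same, but exposes more" description above.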

Anyway, the point is this: if they intend for DX12 to be the new de facto graphics API for Windows Desktop and Windows Store development, it might make sense for them to make the API work the same way (in the same superficially compatible manner) in order to allow developers to quickly port games and apps. But it depends, of course, on what exactly is planned for DX12, and what sort of hardware it will support.
 
I do not agree with your statements liquidboy. I would not say that MS has been re-writing the .NET stack. They have been evolving it. That is a substantial difference in meaning.

Not from what I've seen... big chunks of .NET code are now native; full BCL re-writes; the death of the massive, all-encompassing mscorlib/System dlls; and a move to more agile, NuGet-able libraries...

I started seeing this a while back with SL (Silverlight) as it moved from desktop onto Windows Phone. And I've seen it in WinRT (heavily influenced by SL) and .NET (recently with System.Web)...

Evolving or full-blown re-writes... I would say the latter in a lot of cases!

MS have shown over the last 4+ years that they are not afraid to break away from the past with libraries that are over a decade old and completely rethink them, even re-write them.

The question for me is: are they doing the same with DirectX? I would like to believe they are... time will tell.
 
Not from what I've seen... big chunks of .NET code are now native; full BCL re-writes; the death of the massive, all-encompassing mscorlib/System dlls; and a move to more agile, NuGet-able libraries...

That's efficiency improvements plus maintenance, delivery, deployment, and packaging sugar. NuGet sure is nice, as is the more periodic release schedule for .NET internals instead of everything being tied to a single once-a-year enterprise-like release. From a developer's point of view, .NET usage remains the same without major rewrites. The core principles remain the same. But I see what you mean now; you were talking about what's behind the curtain.

When people say major rewrite, I think of APIs and how everything plugs together. I was thinking more along the lines of the giant cluster-fucking that the Java core libraries are, with their Java IO, New IO, and New IO 2 packages. I'm sure they're working on a New IO 3 as well. Or the completely broken Thread stop/suspend/resume management since Java 1.1, or Java AWT 1.0 to 1.1, or even Java Swing. :devilish:
 
That's efficiency improvements plus maintenance, delivery, deployment, and packaging sugar. NuGet sure is nice, as is the more periodic release schedule for .NET internals instead of everything being tied to a single once-a-year enterprise-like release. From a developer's point of view, .NET usage remains the same without major rewrites. The core principles remain the same. But I see what you mean now; you were talking about what's behind the curtain.

When people say major rewrite, I think of APIs and how everything plugs together. I was thinking more along the lines of the giant cluster-fucking that the Java core libraries are, with their Java IO, New IO, and New IO 2 packages. I'm sure they're working on a New IO 3 as well. Or the completely broken Thread stop/suspend/resume management since Java 1.1, or Java AWT 1.0 to 1.1, or even Java Swing. :devilish:

I know what you mean about Java ...

Though I really do like what Oracle/Java are doing with HSA (http://regmedia.co.uk/2013/08/23/hsa_roadmap_large.jpg)...

That really interests me and may make me get back into Java; it's been over 6 years since I wrote Java code :(, but I have been keeping up with its advances and APIs/libs.
 
Physically, if the second eSRAM block were to be reserved, it wouldn't be placed so far away from the main eSRAM; heck, we shouldn't even be able to identify it except by area counting if it were reserved.
 
Well, there are a few interesting things. It seems the Kinect GPU clawback isn't in yet:

It's known that Microsoft is attempting to free up precious graphics resources. Last year, the Xbox One architects told us that the GPU "time-slice" - the amount of processing time reserved by the operating system for elements like Kinect - would be made available to game developers. Respawn confirms that this hasn't happened yet.


"They were talking about having it available for launch and I think there were some issues for how it was going to work," Baker tells us. "It's not available for launch but we're definitely going to take advantage of that if they give that as an option. And the plan that is they will make that an option, so when it's visible we'll enable it for our game and we should be able to crank up resolution proportionally."
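As a rough back-of-the-envelope check on "crank up resolution proportionally": assuming the commonly reported ~10% Kinect GPU reserve (a figure from other coverage, not from this interview), a purely GPU-bound 792p game would scale roughly like this:

```python
import math

# Sketch: if the OS time-slice freed ~10% more GPU throughput (assumed
# figure, see lead-in), how far could a GPU-bound 1408x792 game raise
# its resolution, scaling pixel count in proportion to GPU time?

base_pixels = 1408 * 792  # Titanfall's reported "792p"
extra = 1.10              # assumed 10% more GPU throughput

new_pixels = base_pixels * extra
# Solve w*h = new_pixels at a 16:9 aspect ratio (w = 16h/9):
h = math.sqrt(new_pixels * 9 / 16)
w = h * 16 / 9
print(f"~{w:.0f}x{h:.0f}")  # roughly 1477x831
```

In other words, a straight proportional bump is modest; it would not get a 792p title to 1080p on its own, which is perhaps why Baker frames it as one option among several.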
 
I felt this article was appropriate for this thread; I could be wrong though, maybe it needs to be moved.

It talks about how Respawn approached building Titanfall for the X1, with some interesting tidbits on ESRAM, core usage, and the GPU reserve for Kinect.

http://www.eurogamer.net/articles/digitalfoundry-2014-titanfall-tech-interview


Seems strange to me that they'd be able to consider an increase to 1080p if they deactivate MSAA, or an increase to 900p with FXAA. Surely 2x MSAA isn't equivalent to a 2.2x resolution jump?

Edit: probably closer to 2x resolution, I forgot the game is 792p.
 
Seems strange to me that they'd be able to consider an increase to 1080p if they deactivate MSAA, or an increase to 900p with FXAA. Surely 2x MSAA isn't equivalent to a 2.2x resolution jump?

Edit: probably closer to 2x resolution, I forgot the game is 792p.

It should be nowhere near that cost, since the shader/texture workload is per pixel, not per subsample (that's the point of MSAA versus supersampling). Also, MSAA color compression helps a bit with bandwidth consumption.

I do wonder if they're toying with lower-resolution alpha effects. A lot of the framerate drops happen with the Titans firing rockets everywhere.

But who knows.
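The pixel counts bear this out. A quick worked comparison, assuming the widely reported 1408x792 for Titanfall's "792p":

```python
# Worked pixel-count comparison for the resolutions discussed above.
# Assumes Titanfall's "792p" is 1408x792, the widely reported figure.

def pixels(w, h):
    return w * h

base = pixels(1408, 792)      # shipped resolution
full_hd = pixels(1920, 1080)  # 1080p target with MSAA off
hd900 = pixels(1600, 900)     # 900p target with FXAA

print(f"1080p vs 792p: {full_hd / base:.2f}x the pixels")  # ~1.86x
print(f" 900p vs 792p: {hd900 / base:.2f}x the pixels")    # ~1.29x

# With 2x MSAA, only depth/coverage work doubles per pixel; shading and
# texturing still run once per pixel, so dropping MSAA frees far less
# GPU time than a 1.86x increase in shaded pixels would demand.
```

So the trade only adds up if the frame was heavily bound by the parts MSAA does multiply (depth/stencil rate, bandwidth into the eSRAM render targets), not by shading.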
 
Also Respawn kinda damns the ESRAM with faint praise imo

Titanfall's '792p' resolution and frame-rate dips call into question - once again - the power of the Xbox One's hardware architecture. The GPU - based on AMD's Bonaire part - seems to be falling well short of the sort of performance we would expect from the architecture, making us wonder if the custom ESRAM is proving to be more of a problem than it should be.

"We're just trying to put as much trust in it as we can. It's an extra thing you have to manage but if it makes the game run better it's better than not having it, obviously," says Baker. "We've iterated on it - whether the shadow maps should be in there, whether it's better to have more render targets, whether some of the textures we use a lot should be in there. We played with it a lot and it definitely helps performance so if it wasn't there, it would be bad [laughs]."

I don't know, hardly a ringing endorsement. I do think those who make the One architecture sing in later years will definitely be the ones who crack the ESRAM.
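For what it's worth, the juggling Baker describes (shadow maps vs. render targets vs. hot textures in 32 MB) can be caricatured as a budgeting problem. This is a hypothetical sketch with made-up sizes and benefit scores, not Respawn's actual scheme:

```python
# Hypothetical sketch of the kind of iteration Baker describes: deciding
# which buffers earn a slot in the 32 MB of ESRAM. All sizes and the
# "benefit" scores below are invented for illustration.

ESRAM_BYTES = 32 * 1024 * 1024

# (name, size in bytes, rough bandwidth-benefit score) -- made-up numbers
candidates = [
    ("color_target_792p", 1408 * 792 * 4, 10.0),
    ("depth_target_792p", 1408 * 792 * 4, 9.0),
    ("shadow_map_2k",     2048 * 2048 * 4, 6.0),
    ("hot_textures",      12 * 1024 * 1024, 3.0),
]

def pack_esram(items, budget):
    """Greedy pick by benefit-per-byte until the budget runs out."""
    chosen = []
    for name, size, benefit in sorted(items, key=lambda i: i[2] / i[1], reverse=True):
        if size <= budget:
            chosen.append(name)
            budget -= size
    return chosen

print(pack_esram(candidates, ESRAM_BYTES))
```

A greedy heuristic like this is the crude version of what "we've iterated on it" implies: the real decision also depends on per-pass access patterns, which no static packer captures, hence the trial and error.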
 