Xbox One (Durango) Technical hardware investigation

Status
Not open for further replies.
The only semantics are MoSys calling their eDRAM implementation 1T-SRAM.
No, there was someone else with embedded DRAM calling it eSRAM or ESRAM IIRC. But at the moment I'm happy to consider it standard 6T SRAM, although that's massive for a cache and is going to cost (but probably cheaper than GDDR5).
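To put the "massive for a cache" point in rough perspective, here is a back-of-envelope comparison of cell transistor counts for 6T SRAM versus a 1T (eDRAM-based) design. The pool size used is purely illustrative, not a confirmed spec from this thread:

```python
# Rough transistor-count comparison for an on-die memory pool.
# 6T SRAM uses six transistors per bit; 1T-SRAM (MoSys' eDRAM-based
# design) uses one transistor plus a capacitor per bit.
MB = 1024 * 1024
size_bytes = 32 * MB          # illustrative pool size, not a confirmed spec
bits = size_bytes * 8

sram_6t_transistors = bits * 6
sram_1t_transistors = bits * 1

print(f"6T SRAM: {sram_6t_transistors / 1e9:.2f}B transistors")
print(f"1T-SRAM: {sram_1t_transistors / 1e9:.2f}B transistors")
# The 6x cell-level gap is why a large 6T pool dominates die area, even
# before real-array overhead (sense amps, decoders, redundancy).
```

The cell-level factor of six is why a large 6T pool is expensive in silicon, which is the trade-off against the latency and refresh complications of eDRAM-style cells.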
 
Well, Thuway from GAF said the current dev kit has a 7970 in it. What's the implication of this? Do dev kits tend to have more powerful GPUs than the retail one?
 
Well, Thuway from GAF said the current dev kit has a 7970 in it. What's the implication of this? Do dev kits tend to have more powerful GPUs than the retail one?

I read something similar, but without anything concrete to go on it will have to be treated as more background noise.
 
Well, Thuway from GAF said the current dev kit has a 7970 in it. What's the implication of this? Do dev kits tend to have more powerful GPUs than the retail one?

He also said beta kits were not really in the hands of 3rd parties. If true, what poker game is still going on, and why? Edit: make that alpha and beta kits.

Again, if true, either 1) a weird game of information control is being played, 2) there's no console this year, or 3) it is all false.
 
He also said beta kits were not really in the hands of 3rd parties. If true, what poker game is still going on, and why? Edit: make that alpha and beta kits.

Again, if true, either 1) a weird game of information control is being played, 2) there's no console this year, or 3) it is all false.

Well, we'll see. I will say this: Thuway is a MAJOR Sony loyalist over there at NeoGAF. It couldn't have been easy for him to post that information. lol.
 
He also said beta kits were not really in the hands of 3rd parties. If true, what poker game is still going on, and why? Edit: make that alpha and beta kits.

Again, if true, either 1) a weird game of information control is being played, 2) there's no console this year, or 3) it is all false.

Probably not a poker game, probably just a reflection of the state of the silicon.
MS has traditionally been rather late providing real devkits, and they may not be doing large runs of early silicon for devkits, instead keeping the distribution limited and resolving any bugs before distributing more broadly.
Typically publishers/devs pay for every revision of a devkit separately, so that could also be part of the issue.
If devkits aren't around in quantity in the June/July timeframe, I'd worry about a 2013 launch; practically, they would have to start QA for discs not long after that.
 
Well, we'll see. I will say this: Thuway is a MAJOR Sony loyalist over there at NeoGAF. It couldn't have been easy for him to post that information. lol.
Has it never occurred to you that he's making it up, if he is such a major Sony loyalist? I bet nothing would please him more than the cries of downgrade that would ensue.

Still don't know why NeoGAF is brought up here, though. Read a few threads there and it's exactly what you would expect from a gaming forum: full of fanboys and people messing with them for their own amusement.
 
Well, Thuway from GAF said the current dev kit has a 7970 in it. What's the implication of this? Do dev kits tend to have more powerful GPUs than the retail one?

Maybe you should post the whole quote. Also, I remember Proelite saying that the Xbox3 devkit had a 7970 a while ago.

The current devkits have 7970s inside them. If games were designed using that part, then when moving to beta devkits (which, btw, most third parties don't have yet, and are quite pissed about) they will receive a downgrade of some form.


BTW, I am predicting a downgrade based on what developers have at the moment. Most third parties don't have beta devkits yet and are using uber-powerful alpha kits with 7970s. You can make a damn gorgeous-looking demo using those kits. The next devkits will begin the optimization stage, a task that will last until release day.

Unlike MS, Sony worked in reverse. They had hardware that was underpowered and slowly made the kits better as time went on. We are now at a point where the specs are official and anyone with half a brain can approximate the results of what you can get.
 
Except Durango's ESRAM is not intended to be used for the framebuffer as in the 360, but as a general-purpose scratchpad.

I'm seeing that it's 4x 1T ESRAM (which itself is 10-15% bigger than eDRAM on the die).
http://en.wikipedia.org/wiki/1T-SRAM

It might not be intended to be a frame buffer per se, but it seems MS, at least at one time, may have considered using at least a portion of it as GPU cache for tile rendering.

SYSTEM AND METHOD FOR LAYERING USING TILE-BASED RENDERERS
http://appft1.uspto.gov/netacgi/nph...d+layer&RS=((AN/Microsoft+AND+tile)+AND+layer)

Some GPUs support tile-based rendering. Such GPUs may have a fast on-chip memory smaller than the memory used for storing the rendered content (i.e., the image plane), and this on-chip memory may be used to perform certain GPU operations more quickly. Accordingly, in tile-based rendering, content may be rendered in portions, referred to as tiles, such that the GPU may perform operations on each such portion by using the fast memory as part of the rendering process. The content may be rendered one tile at a time, with pixel values being calculated on a per-tile basis. The memory region storing pixel values may be organized as multiple tiles.

Improved content rendering techniques may improve utilization of resources, such as power and memory, in a computing device containing specialized graphics hardware. Techniques include more efficient tile-based rendering of content comprising multiple content layers by optimizing the order in which operations in the rendering process may be performed. Specialized hardware for content rendering, such as a GPU, may be configured to render more than one content layer corresponding to a tile before performing rendering of content corresponding to other tiles. As a result, the number of times pixel values associated with that single tile are brought into memory may be reduced. This may make the overall rendering process more efficient than a conventional approach of rendering the content one content layer at a time, each content layer organized as multiple tiles, which leads to cache thrashing and poor overall performance. A more efficient rendering process may lead to reduced or improved utilization of resources, such as power and memory, which may be beneficial for computing devices (e.g., laptops, mobile phones, devices having a slate form factor, other battery-powered devices, etc.) where such resources are limited.

The inventors have recognized and appreciated that greater utility can be derived from a GPU that supports tile-based rendering if the GPU may be configured to perform, more efficiently, tile-based rendering of content that comprises one or more content layers. In particular, the inventors have recognized that it may be advantageous to render such content one tile at a time, rather than render the content one layer at a time. In the "layer-then-tile" approach, the content may be rendered one content layer at a time, and each such content layer may be rendered one tile at a time. On the other hand, in the "tile-then-layer" approach, the content may be rendered one tile at a time, where calculating pixel values associated with each tile may comprise calculating pixel values from multiple content layers.

For example, commands and/or parameters 238 may configure the GPU to perform tile-based rendering using tiles of a particular size. As another example, commands and/or parameters 238 may configure the GPU to perform tile-based rendering of content, which comprises multiple layers, using one of the "layer-before-tile" approach or the "tile-before-layer" approach, as previously described above.

It should be recognized that process 300 is illustrative and that many variations of process 300 are possible. For example, in the illustrated embodiment, content to be rendered comprises two content layers. However, this is not a limitation of the present invention as the content may comprise any suitable number of content layers. Accordingly, process 300 may be modified to render any suitable number of content layers (e.g., at least three layers, at least four layers, etc.).

There is nothing to say Durango is slated to make use of this technique, but Durango's design at least seems suitable to employ such a technique. Memory bandwidth is an issue with Durango's design, and this technique supposedly relieves some of the pressure that limited bandwidth can present.

My opinion is that 6T SRAM may be the more optimal choice as fast on-chip RAM for this type of setup, as a 1T (eDRAM-based) cell may add additional latency to the graphics pipeline that can't be hidden by the GPU.
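The core claim of the patent excerpt, that visiting tiles in the outer loop and layers in the inner loop cuts the number of times a tile's pixel block is pulled into fast memory, can be sketched as a simple load counter. This is an illustrative model, not code from the patent; tile and layer counts are made up:

```python
# Sketch of the two rendering orders from the patent excerpt: counting how
# many times each tile's pixel block must be (re)loaded into fast on-chip
# memory. Tile/layer counts are illustrative only.

def tile_loads(num_tiles, num_layers, order):
    """Count loads of tile pixel blocks into fast memory for a given order."""
    loads = 0
    if order == "layer-then-tile":
        # Render one whole layer at a time; every layer touches every tile,
        # so each tile is brought into fast memory once per layer.
        for layer in range(num_layers):
            for tile in range(num_tiles):
                loads += 1
    elif order == "tile-then-layer":
        # Render one tile at a time, compositing all of its layers while
        # the tile is resident; each tile is loaded only once in total.
        for tile in range(num_tiles):
            loads += 1  # all layers composited during this single residency
    else:
        raise ValueError(f"unknown order: {order}")
    return loads

print(tile_loads(64, 4, "layer-then-tile"))  # 256 loads
print(tile_loads(64, 4, "tile-then-layer"))  # 64 loads
```

With 64 tiles and 4 layers, the tile-then-layer order does a quarter of the tile traffic, which is the cache-thrashing reduction the patent describes.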
 
Has it never occurred to you that he's making it up, if he is such a major Sony loyalist? I bet nothing would please him more than the cries of downgrade that would ensue.

Still don't know why NeoGAF is brought up here, though. Read a few threads there and it's exactly what you would expect from a gaming forum: full of fanboys and people messing with them for their own amusement.

I consider that every time I read something these days, even IN HERE. lol. The idea that someone would troll a particular thread and then sit back and laugh while watching it become an epic monstrosity is not lost on me.

However, I also have to consider that this is not in keeping with his usual mantra. He wouldn't have bothered posting it if he didn't actually believe it himself. Of course, him believing it doesn't make it true either; it's just that he actually has some credibility, and he's not given to posting false info.

So as I said before, we'll see.
 
Probably not a poker game, probably just a reflection of the state of the silicon.
MS has traditionally been rather late providing real devkits, and they may not be doing large runs of early silicon for devkits, instead keeping the distribution limited and resolving any bugs before distributing more broadly.
Typically publishers/devs pay for every revision of a devkit separately, so that could also be part of the issue.
If devkits aren't around in quantity in the June/July timeframe, I'd worry about a 2013 launch; practically, they would have to start QA for discs not long after that.

Are timely final dev kits really that relevant for the early unoptimized titles anyway? These consoles are PCs with a custom OS and a few tweaks. Getting early generation games running on a PC dev kit with less performance than the console should give a pretty close simulated environment for early titles.
 
BTW, I am predicting a downgrade based on what developers have at the moment. Most third parties don't have beta devkits yet and are using uber-powerful alpha kits with 7970s. You can make a damn gorgeous-looking demo using those kits. The next devkits will begin the optimization stage, a task that will last until release day.

Unlike MS, Sony worked in reverse. They had hardware that was underpowered and slowly made the kits better as time went on. We are now at a point where the specs are official and anyone with half a brain can approximate the results of what you can get.

I don't know how alpha->beta->final hardware works, but my logic tells me that final hardware must work, at least, as well as alpha hardware.
 
It might not be intended to be a frame buffer per se, but it seems MS, at least at one time, may have considered using at least a portion of it as GPU cache for tile rendering.

SYSTEM AND METHOD FOR LAYERING USING TILE-BASED RENDERERS
http://appft1.uspto.gov/netacgi/nph...d+layer&RS=((AN/Microsoft+AND+tile)+AND+layer)



There is nothing to say Durango is slated to make use of this technique, but Durango's design at least seems suitable to employ such a technique. Memory bandwidth is an issue with Durango's design, and this technique supposedly relieves some of the pressure that limited bandwidth can present.

My opinion is that 6T SRAM may be the more optimal choice as fast on-chip RAM for this type of setup, as a 1T (eDRAM-based) cell may add additional latency to the graphics pipeline that can't be hidden by the GPU.
That was a really good read, and it sounds reasonable that Durango might be a tile-based rendering machine, although that line of thought was debunked a month or so ago... well, sort of.

It's a good explanation and it makes sense to me. If the leaked specs were real, it's starting to look like the reason Microsoft went with DDR3 is that, if they are using TLR to an extent, you wouldn't need as much bandwidth as in the case of a traditional renderer.
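The bandwidth argument can be made concrete with a back-of-envelope on framebuffer traffic. All numbers here are illustrative assumptions (1080p60, 4 bytes per pixel of color, an average overdraw of 3), not leaked specs:

```python
# Back-of-envelope framebuffer color traffic at 1080p60, to show why tiled
# rendering eases external-bandwidth pressure. All numbers illustrative.
width, height, fps = 1920, 1080, 60
bytes_per_pixel = 4            # 8-bit RGBA color
overdraw = 3                   # average shaded fragments per pixel

# Traditional (immediate-mode) rendering: every overdrawn fragment does a
# read-modify-write against external memory (factor of 2 for read + write).
traditional = width * height * bytes_per_pixel * overdraw * 2 * fps

# Tiled: overdraw is resolved in on-chip memory, so external memory sees
# roughly one final color write per pixel per frame.
tiled = width * height * bytes_per_pixel * fps

print(f"traditional: {traditional / 1e9:.1f} GB/s of color traffic")
print(f"tiled:       {tiled / 1e9:.1f} GB/s")
```

This only models color traffic (no depth, textures, or geometry), but it shows the shape of the argument: the overdraw-dependent term moves into the on-chip pool, which is exactly the pressure a slower external DDR3 bus would want taken off it.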
 
The specifications of next-gen consoles remain unknown, according to Patrick Bach, Battlefield's executive producer. There are a bunch of possible reasons for this, and my guess is that the real reason is that they don't want to expose the hardware right now, just in case there are changes, and they may be deferring developers' requests for final hardware until they are totally sure.

This might also explain why Mark Rein was happy and very surprised with the 8GB GDDR5 announcement for PS4 and was wondering how Microsoft would answer that.

This is starting to feel pretty awkward if you ask me, an odd beginning for a new generation...

http://www.ubergizmo.com/2013/03/final-ps4-and-next-gen-xbox-hardware-specs-remain-unknown/

"I don't know if anyone has the next-gen hardware, to be honest, really. There are versions of it, but does anyone have the final hardware? Do we really know what the final hardware will be? There are specs and alpha hardware, but nobody knows exactly what it will be."
 
Maybe that can be translated to:

"My bosses are currently ironing out a deal with Microsoft so the safest thing I can do is deny all knowledge"

Everyone knows the final specs of PS4, give or take a few MHz, so I smell BS on his part.
 
The specifications of next gen consoles remain unknown, according to Patrick Bach, Battlefield's executive producer. There are a bunch of reasons for this to happen, and my guess is that the real reason is that they don't want to expose the hardware right now, just in case there are changes and they might ditch developers' petitions to send them the final hardware until they are totally sure.

This might also explain why Mark Rein was happy and very surprised with the 8GB GDDR5 announcement for PS4 and was wondering how Microsoft would answer to that.

This is starting to feel pretty awkward if you ask me, an odd beginning for a new generation...

No kidding, but it'll all be sorted out soon. One way or another.

Anyway, throwing more fuel on the fire. April reveal has been pushed back.

https://twitter.com/thurrott/status/317348357150482432
 