Xbox One (Durango) Technical hardware investigation

Status
Not open for further replies.
Dunno, buy them and tell us. They're advertised at the faster speeds, and JEDEC doesn't seem to care that vendors use the DDR3 name and specifications to sell this RAM.

Won't this be more expensive? You'd need specially binned versions of a specific RAM part that isn't usually produced in large quantities.
 
From my post earlier in this same thread:



On 28nm, 32MB of eSRAM would be about 33mm² for 6T, and roughly 6-10mm² for the 1T variant.

Nowhere near 120mm².

It takes more than the SRAM cells to make the memory useful; after all, you need circuitry to actually read and write the SRAM. I presented a real-life example based on the Jaguar L2 cache, which is a decent comparison for the die space 32MB of 6T SRAM might take, if the eSRAM is in fact 6T SRAM.
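For reference, a quick sketch of the cells-only numbers in question (the mm² figures are the ones quoted above; this deliberately excludes the read/write periphery the Jaguar L2 comparison accounts for):

```python
# Cells-only die-area figures from the post above (28nm, illustrative).
MB = 32
area_6t_cells = 33.0                # mm² for 32 MB of 6T-SRAM cells (quoted figure)
area_1t_lo, area_1t_hi = 6.0, 10.0  # mm² range quoted for the 1T variant

density_6t = area_6t_cells / MB           # ~1.03 mm² per MB, cells only
ratio_vs_1t = area_6t_cells / area_1t_hi  # 6T is at least ~3.3x the 1T cell area
```

The periphery (sense amps, decoders, routing) then adds on top of these cells-only numbers, which is the point being argued.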
 
It depends on yields, like everything else. DDR3-2133 was made a standard in 2009. I don't know what speeds DDR3 can run at at this point beyond a few retail kits, but it implies at least 2800 is available today. The fabs may be capable of faster speeds; the final speed would come down to the fabs' ability and MS's needs. That's if the year-old specs are correct and it's even DDR3 that is used.
 
What I find very sad is that for many it is easier to cling to hope and believe in the existence of a "super secret sauce" for Durango that nobody ever saw, rather than accept that Durango is not a bad machine at all.

Nobody ever saw the hardware accelerators? I think bkilian might disagree with you there, seeing as I believe he was the first to actually mention them. Since then we have seen plenty of leaked info on the DME's, display planes, and eSRAM.



The problem is SuperDaE told Kotaku he was going to release the new specs, and they merely confirmed what was already known, so I don't know why people still bang on about it. The specs are right; MS will either delay and redesign, or stick to their original plan and hope their added goodness such as DVR is enough.

SuperDaE was apparently just a hacker, never an actual developer. He had access to info from 2012, but it's not totally clear how up to date his other stuff was, AFAIK. It's one thing to say you have info from Jan 2013, but I dunno if he was able to prove as much when leaking to Kotaku.
 


I said that nobody saw the "super secret sauce".
We have VGleaks, DF, EDGE, and Kotaku as reliable sources, and yet even today some say that Durango has "super secret sauce" in store.
Really, the idea that MS can redesign Durango is nothing but the "super secret sauce" turned into a "super secret remedy": hopes created at every turn to make Durango look less bad in this frankly pathetic pre-release console war.
Also, is Durango bad in the first place?
The EDGE source (which was right about Sony doubling the RAM) said that the PS4 is slightly more powerful than Durango.
Nothing really to worry about for me, but that "slightly less powerful" is now turned into "the end of the world" by MS fanboys.

"Durango has super secret sauce/remedy" is now better accepted than "Durango is an excellent machine", which is depressing.

As for the hardware accelerators, DF said of Durango: "the GPU itself is supplemented by additional task-specific hardware".
They didn't say "Durango has super secret sauce".
As for SuperDaE and Kotaku, he said this.
 
This board and NeoGAF started the "secret sauce" stuff. It was all wishful thinking and handwringing in an attempt to justify MS's choice of a lower CU count for their GPU. There are hardware accelerators in both designs. I happen to think there are more of them in the MS console, some having an impressive effective value to the system. But there's no "secret sauce". There's nothing that will magically make a GPU perform 50% more calculations a second. What there are are blocks that can reduce the number of calculations the GPU and CPU need to perform for a specific function. The low-power companion chip on the PS4 is an example of this, as is the realtime video encoder/decoder.
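One way to frame the "reduce the number of calculations" point is Amdahl's law; this is my framing, not anything from the leaks, and the 20%/10x figures below are made up purely for illustration:

```python
def effective_speedup(offloaded_fraction, accel_speedup):
    """Amdahl's law: a fraction f of a frame's work moves to a
    fixed-function block that does that work s times faster."""
    f, s = offloaded_fraction, accel_speedup
    return 1.0 / ((1.0 - f) + f / s)

# Offloading 20% of a frame to a hypothetical 10x-faster accelerator:
gain = effective_speedup(0.20, 10.0)  # ~1.22x overall, nowhere near 10x
```

That is exactly why accelerators are valuable but not magic: they shave off whatever share of the workload they cover, no more.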
 

I'm waiting for someone to say it was actually Sony who sent the FBI to bust him, because he was about to release the new and improved Durango specs, and Sony knew it would make them look bad, plus they wanted revenge for the Orbis leaks.

Someone should make a thread for all the silly rumors that have been passed around on both sides.
 
It takes more than the SRAM cells to make the memory useful; after all, you need circuitry to actually read and write the SRAM. I presented a real-life example based on the Jaguar L2 cache, which is a decent comparison for the die space 32MB of 6T SRAM might take, if the eSRAM is in fact 6T SRAM.

The read/write part of the SRAM won't constitute 3 times the die area of the memory cells themselves. That's silly.

See here, specifically the table comparing memory cell sizes, which clearly shows the cell size WITH OVERHEAD.

For 6T-SRAM on a 45nm process node, the cell size is 0.32mm²/Mbit; it's 0.52mm²/Mbit with access overhead. So that's roughly a 60% increase in area for the read/write gubbinz.

If the cell area for just 32MB of 6T-SRAM on a 28nm process is 33mm², the actual silicon footprint on the die will be around 53-60mm² at most. Nowhere near 120mm². Your analysis is wrong, my friend.
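Sanity-checking those figures by scaling the linked 45nm numbers down to 28nm (ideal area scaling is an optimistic assumption; real SRAM shrinks worse than ideally, so treat these as lower bounds):

```python
# Scale the 45nm table figures (0.32 mm²/Mbit cells, 0.52 with overhead) to 28nm.
mbit = 32 * 8                # 32 MB = 256 Mbit
scale = (28.0 / 45.0) ** 2   # ~0.39, ideal 45nm -> 28nm area shrink

cells = 0.32 * mbit * scale          # ~31.7 mm², matching the ~33 mm² cells-only figure
with_overhead = 0.52 * mbit * scale  # ~51.5 mm², in the 53-60 mm² ballpark
overhead_pct = (0.52 / 0.32 - 1.0) * 100.0  # 62.5% added by read/write periphery
```

Either way the total lands at roughly half of the 120mm² claim.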
 
I said that nobody saw the "super secret sauce".
We have VGleaks, DF, EDGE, and Kotaku as reliable sources, and yet even today some say that Durango has "super secret sauce" in store.

That phrasing (secret sauce, wizard jizz) was invented by people with an agenda in order to mock the idea that hardware accelerators existed and were designed to improve how well Durango could perform in real-world applications. They were clearly wrong, as there really does seem to be extra kit that could help Durango punch above its weight. How much so is open for speculation and debate.

It is completely reasonable and fair for onlookers to cry foul when everyone wants to draw comparisons between things like flops and bandwidth on an apples-to-apples basis. It sounds like the Durango setup works to reduce the number of operations you'd need to get the same end result in the first place, and the DMEs and eSRAM sound like they do a lot to complicate the bandwidth comparisons.

Nothing really to worry about for me, but that "slightly less powerful" is now turned into "the end of the world" by MS fanboys.

I don't see anyone acting like you describe here. In fact, it's the opposite. I see tons of people all over the internet asserting that the PS4 has a gigantic advantage in power pretty much across the board. These people look at GPU specs only, dismiss the accelerators in Durango completely out of hand as "lol! secret sauce", ignore bottlenecks in the PS4 design (e.g. you can't feed 32 ROPs with 176GB/s), ignore dev comments saying the two are MUCH closer than presumed, and then tell us how Kinect somehow ruined Durango's GPU. People pushing ideological pseudo-dogma and memes as arguments, while cherry-picking information and dismissing reasonable tech whenever it's easy to mock with a magic phrase, don't deserve to be taken seriously.

"Durango has super secret sauce/remedy" is now better accepted than "Durango is an excellent machine", which is depressing.

I think you may be confusing people openly mocking the concept of "secret sauce" all over the internet with people actually expecting something radically new to show up. The eSRAM, DMEs, display planes, and the possibility of the design being built to leverage virtual assets in a fundamental way are the "secret sauce", in the sense that those items work together to make highly efficient use of the architecture as a whole and help it punch above its weight.

The EDGE source (which was right about Sony doubling the RAM) said that the PS4 is slightly more powerful than Durango.

Yup. The fact that people are eager to say the PS4 is dramatically more powerful, based on specs they got from a rumor that also told them devs say both are roughly on par, should tell you something about how these folks cherry-pick the information they allow to influence their worldviews. Every actual insider and/or dev I have seen commenting to date with vague hints has suggested the two are about the same.
 
It is completely reasonable and fair for onlookers to cry foul when everyone wants to draw comparisons between things like flops and bandwidth on an apples-to-apples basis. It sounds like the Durango setup works to reduce the number of operations you'd need to get the same end result in the first place, and the DMEs and eSRAM sound like they do a lot to complicate the bandwidth comparisons.

I fully intend to pick up the new Xbox on day 1 and have no intention of ever getting a PS4, so I certainly have no agenda against the Xbox, but I still disagree with what you're saying here.

I don't believe it's been firmly established at all that the eSRAM has any significant benefit for graphics rendering. The developer comments on here have ranged from "very little, if any" to "some, but probably not a huge amount" (paraphrasing, of course). It sounds like it will be beneficial in some measure for compute workloads, though, which may also translate to gaming performance in the console world.

I still think the primary driver behind the use of eSRAM is to allow the console to contain 8GB of main RAM cheaply. The DMEs are thus there to minimize as much as possible the disadvantages of having your graphics memory split into two separate pools, as opposed to the more efficient unified pool of the PS4.

And other than that, it looks like the additional helper resources in both consoles are pretty even.

So what you're left with is one console having slightly higher bandwidth and a likely more efficient/easier-to-use memory system, 50% more shader/texture resources, 100% more fill rate, and fewer computation resources dedicated to system-level tasks. That to me is a fairly significant advantage. In the real world, though, that probably only translates to somewhere between 25-50% additional performance in the best cases. That's a sufficiently small advantage for most developers and press to refer to the consoles, in general terms, as roughly even, bearing in mind the comparison points of the last-gen consoles and the Wii U. Hell, I'm sure many would describe them as roughly even with high-end PCs too. That doesn't mean exactly on par, though. The 25-50% advantage will probably be enough to fan the flame wars once the Digital Foundry articles start appearing.
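For what it's worth, the "50% more shader resources, 100% more fill rate" figures follow directly from the rumored CU/ROP counts; the 800MHz clock and GCN's 64 ALUs x 2 FLOPs per clock per CU are assumptions taken from the leaks, not confirmed specs:

```python
def gflops(cus, clock_ghz=0.8, alus_per_cu=64, flops_per_alu=2):
    # Peak shader throughput for a GCN-style GPU (rumored parameters).
    return cus * alus_per_cu * flops_per_alu * clock_ghz

durango = gflops(12)            # ~1228.8 GFLOPS (rumored 12 CUs)
orbis = gflops(18)              # ~1843.2 GFLOPS (rumored 18 CUs)
shader_ratio = orbis / durango  # 1.5x, i.e. "50% more"
rop_ratio = 32 / 16             # 2.0x fill rate, i.e. "100% more"
```

Peak ratios like these say nothing about real-world utilization, which is exactly where the 25-50% estimate comes from.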
 
Do you know about "overclocking"?

It's a console, not a high-end PC graphics card. The whole system has been designed for specific thermal limits. If you "overclock" it, then you're going to end up making the console considerably less reliable and cost yourself billions.

Plus, what was the last GPU you saw achieve a 50% overclock on stock cooling anyway?
 
I don't see anyone acting like you describe here. In fact, it's the opposite. I see tons of people all over the internet asserting that the PS4 has a gigantic advantage in power pretty much across the board.
Well, that's how the Internet works; look at the early days of this gen.
But I think there is a difference this time around: I would say that Sony "fans" are happy and are not in "war" mode.
On the other hand, the noise surrounding Durango, whether disappointment at the specs or, the other way around, minimizing the difference between the systems (based on possibly incomplete or inaccurate information, but what else do we have?), is coming from MSFT "fans".

NB: I don't really like the word "fan" and the baggage surrounding it, but I fail to find a better word, and I don't use it in a negative manner (as irrational bias, etc.).

Back on topic: without definitive information about how the systems work, how many resources are taken by the OS, etc., it is really difficult to assess the difference between the two systems. It could be worse than what one would guess on paper, or better.

I've entered wait-and-see mode. A few more months to go before we know the relevant info, and specs are indeed not everything.
 
Do you know about "overclocking"?

I don't think the definition of overclocking applies if the manufacturer sets the clock rate.
Overclocking only happens when someone takes a system and forces it beyond the manufacturer's specifications.

Obviously, if Microsoft is the party that raised the clock, that's the spec.
Now, if the clock is so high that it causes malfunctions or system failure, it's just a bad or faulty product.
 
It's a console, not a high-end PC graphics card. The whole system has been designed for specific thermal limits. If you "overclock" it, then you're going to end up making the console considerably less reliable and cost yourself billions.


He said "There's nothing that will magically make a GPU perform 50% more calculations a second", not a console GPU specifically, and I don't know why a console GPU can't have a high clock.

Plus what was the last GPU you saw achieve a 50% overclock on standard cooling anyway?

The GPU in the SoC in my Galaxy Nexus: from 1.2GHz to 1.9GHz, +58%.

And not only mine, and not only the Galaxy Nexus's.


Overclocking the CUs is easier because you don't have to OC all the logic, the SoC, and the board, only the math units.


Or you can obtain the same result by overclocking 30% and using more cache on the CUs (check) and some low-latency memory (check).


3dilettante said:
I don't think the definition of overclocking applies if the manufacturer sets the clock rate.
Overclocking only happens when someone takes a system and forces it beyond the manufacturer's specifications.

Of course, but this makes clear that the "There's nothing that will magically make a GPU perform 50% more calculations a second" statement is incorrect; there's no magic, just reality.
 