They could stick a bit more esram in there.
I'll go for slightly higher clocks all round if anything.
They could... stick... a bit more esram in there?
Not sure where to post this; I would have gone with the now-defunct "predict the next generation etc." thread, but reading an article about Haswell and the embedded eDRAM included with the GT3e version of the chip, I can't help but feel like somehow MSFT (but it applies to Sony too) fell on the wrong side of the technology curve.
They're on the "don't have Intel's engineering or process tech" curve, and they weren't going to spend the billions of dollars Intel invests in its 22nm process and fabs.
I think it was doable by involving IBM in the project; it seems they were willing to land a contract (on top of Nintendo) in the console realm. I often read that they have extra fab capacity, and even without managing to sell a POWER-based CPU they may have been OK with making some money out of those fabs and getting some extra money (R&D) to participate in the project.

What was doable? An IBM 32nm SRAM sized to 128MB, or 32nm eDRAM that is best known for being used for chips that cost as much as a small car?
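For a sense of scale on that 128MB figure, here's a quick napkin estimate in Python. The 6T bit-cell size and the overhead factor are my own assumptions for illustration, not numbers from any leak:

```python
# Napkin estimate of the silicon area a 128MB on-die SRAM would need.
# Assumptions (mine, not from the thread): a ~32nm-class 6T bit cell of
# roughly 0.17 um^2 and a 1.7x multiplier for periphery/redundancy overhead.
BITS = 128 * 1024 * 1024 * 8          # 128 MB expressed in bits
CELL_AREA_UM2 = 0.17                  # assumed 6T SRAM bit-cell area
OVERHEAD = 1.7                        # assumed array/periphery overhead

raw_mm2 = BITS * CELL_AREA_UM2 / 1e6  # um^2 -> mm^2
total_mm2 = raw_mm2 * OVERHEAD
print(f"cells only: {raw_mm2:.0f} mm^2, with overhead: {total_mm2:.0f} mm^2")
# -> roughly 183 mm^2 of cells alone, ~310 mm^2 with overhead, i.e. an entire
#    large die's worth of silicon before a single logic transistor is added.
```

Even with generous assumptions, the cells alone eat a big die's worth of area, which is presumably the point of the rhetorical question.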
I do agree upcoming memory technologies could be game changers, not to mention the possible jump "over" the ~22nm node to finer lithography + FinFET.

They fell on exactly the only spot they could, given the time-frame they're launching in. They picked the absolutely worst time window to launch in (2013). Significantly greater technology is so close on the horizon; I would have preferred a system launch in mid-2015.
So we're back to debating completely unsubstantiated spec-bump hypotheses?
If they release something above what vgleaks suggests, is that a spec bump or were they just wrong?
heh, it wasn't long ago that most people were attacking the rumored specs. "We know nothing, they're just rumors."
Now they're canonized I guess.
Guess that depends on how closely the majority of the system tracks. If, say, just the clocks or RAM quantity are different, it suggests a bump.
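To put rough numbers on what "just the clocks" would mean, here's a small sketch using the vgleaks-rumoured GPU config (12 CUs / 768 ALUs at 800 MHz) as a baseline; treat those inputs as rumour, and the result as theoretical peak only:

```python
# What a pure clock bump buys in theoretical GPU throughput, using the
# rumoured vgleaks Durango config (12 CUs = 768 ALUs at 800 MHz) as baseline.
ALUS = 768                    # rumoured shader ALU count (assumption: vgleaks)
OPS_PER_ALU_PER_CLOCK = 2     # one fused multiply-add counted as two FLOPs

def gpu_tflops(clock_mhz: float) -> float:
    """Theoretical single-precision TFLOPS at a given core clock."""
    return ALUS * OPS_PER_ALU_PER_CLOCK * clock_mhz * 1e6 / 1e12

for clock in (800, 900, 1000):
    print(f"{clock} MHz -> {gpu_tflops(clock):.2f} TFLOPS")
# 800 MHz -> 1.23, 900 -> 1.38, 1000 -> 1.54 TFLOPS: a clock bump moves the
# needle a little, but nowhere near as much as adding CUs or changing RAM would.
```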
Ok - another one from me:
You remember Master_Jo who was banned in the 6-Months-Delay thread?
He was banned because he added "This is not the full story" to Thuway's leak of heating problems with devkits.
He contacted me after his ban on Twitter and elaborated on his statement.
It's basically that he works for a big partner of Microsoft (nothing gaming related) and has heard from MS employees that there are indeed heating issues, but that the reason for this is an upgrade of the specs.
I don't know what to make of this, but I guess a mod could confirm whether he really works for an MS partner via the email address he used for registration.
Haswell should do quite well, especially taking into account the fact that power consumption was a really strong concern during the design (25 Watts for the GT3e BGA version?). Imo Intel GPUs are underestimated; I read the realworldtech articles about both the Sandy and Ivy Bridge GPUs again, and I think they have pretty awesome tech in their hands. I find the way the GPU can act either as SIMD2x4 or SIMD8 or SIMD16 and deal with different workloads efficiently, or the whole memory hierarchy/register files and how the shader cores communicate with it and/or the fixed function hardware, neat.

I guess you can afford to be more economical in your approach when there isn't much pressure on the graphics performance of your design.
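Going back to the SIMD modes mentioned above, here's a purely illustrative toy of the SIMD8-vs-SIMD16 dispatch trade-off; the per-thread register budget is a made-up placeholder, not a real Gen figure:

```python
# Toy model of the SIMD8 vs SIMD16 dispatch trade-off: wider dispatch covers
# the same pixels with fewer hardware threads, but each thread needs more
# register space per live variable.
# The per-thread register budget is an assumed placeholder, not a real Gen figure.
import math

REG_FILE_BYTES = 4 * 1024      # assumed per-thread register budget (placeholder)

def dispatch_cost(pixels: int, simd_width: int, live_vars: int):
    threads = math.ceil(pixels / simd_width)       # HW threads to cover the work
    bytes_per_thread = live_vars * 4 * simd_width  # 4-byte value per lane per var
    return threads, bytes_per_thread, bytes_per_thread <= REG_FILE_BYTES

for width in (8, 16):
    t, b, fits = dispatch_cost(pixels=4096, simd_width=width, live_vars=72)
    print(f"SIMD{width}: {t} threads, {b} B of registers per thread, fits={fits}")
# SIMD16 halves the thread count, but the register footprint doubles and here
# no longer fits the assumed budget, which is why being able to pick the
# dispatch width per workload is such a nice property.
```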
Truckload of salt aside, looking at the timeline and the issue being overheating, I would think of a beefy overclock rather than a more profound rework of the design.

Here's another from the "truckload of salt" dept, from ekim at GAF. I'm not sure what ekim does, but I want to say he's vaguely connected to the industry somehow, or he at least knows his tech stuff/does programming. Anyway he's one of the heavyweights in the spec threads.
http://www.neogaf.com/forum/showpost.php?p=56750682&postcount=1906
But yeah, that's about as hearsay as it gets; it's kind of debatable whether it's even worth posting. But it could give another possible angle on things.
The rumours from vgleaks are not canonized at all, but they are far more likely considering the PS4 specs turned out to be very accurate. Those rumours are at least a good starting point. If we're going to play the "what if?" game, then there's a million things we could talk about, mostly without reason.
I'm not going to say they're totally airtight, but faking the documentation we got so far would have been one hell of a lot of work, far beyond what you see with most leaks, which are either rumors that got mangled by several layers of poor communication or someone who doesn't know anything looking for attention. If they're fake, they were probably deliberately set loose by professionals, maybe MS themselves (although I have no idea why this would happen, except maybe to throw Sony off).
I don't think I've ever seen something like that happen before.
Is a SoC inside another SoC possible? It's something I have read...
Says who, and even if it were offered, why exactly would they have more to offer?

I think it was doable by involving IBM in the project; it seems they were willing to land a contract (on top of Nintendo) in the console realm.
Higher R&D for the on-die SRAM pool is a one-time cost, and pretty much everybody knows how to handle SRAM on-die. SRAM can be engineered for high redundancy, so yield impact is not as bad as raw die area would suggest. There's just one chip on the package, and the SRAM will shrink with any node transition.

It would not have been free, for sure; that is an R&D expense. Then you have the cost of the chip by itself, but I don't think that a +/- 300mm^2 chip on TSMC's 28nm process, a 256-bit bus, and all the R&D expense associated with the eSRAM + move engines came for free.
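To put a toy number on the redundancy point above, here's a simple Poisson yield sketch; the defect density, area split, and repairable fraction are all illustrative assumptions, not real foundry data:

```python
# Toy Poisson yield model for the redundancy point: defects landing in a
# repairable SRAM array can be mapped out with spare rows/columns, so only a
# fraction of the SRAM area actually kills a die. All inputs are assumptions.
import math

D0 = 0.25               # assumed defects per cm^2
LOGIC_CM2 = 2.2         # assumed logic area of a ~300 mm^2 SoC, in cm^2
SRAM_CM2 = 0.8          # assumed eSRAM area, in cm^2
REPAIRABLE = 0.9        # assumed share of SRAM defects fixed by redundancy

def poisson_yield(area_cm2: float, defect_density: float) -> float:
    return math.exp(-defect_density * area_cm2)

naive = poisson_yield(LOGIC_CM2 + SRAM_CM2, D0)                      # no repair
repaired = poisson_yield(LOGIC_CM2 + SRAM_CM2 * (1 - REPAIRABLE), D0)
print(f"yield ignoring redundancy: {naive:.0%}, with repairable SRAM: {repaired:.0%}")
# The big SRAM block costs far less yield than its raw area would suggest.
```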
This is correct; IBM has a yield tolerance that is buffered by selling its chips for that much money, and by selling services for more money than they'd get selling my organs on the black market. Microsoft wants an affordable component.

I'm not sure it is fair to compare the price IBM sells its CPUs at (top of the line at what they do, more often than not backed with proprietary software) with the price they could have sold an eDRAM chip at, in a sector they are not competing in, just to make some money out of what seem to be not-that-busy foundries (though not unprofitable, extra money doesn't hurt either).
Are you certain they'd want to use IBM's process?

Anyway, I'm just wondering; the solution may have proved less performant, IBM may have been unwilling to let either Sony or MSFT access its 32nm process (without at least securing a POWER-based CPU in next-gen consoles), etc.
So.... here's a conspiracy theory... (my favorite!)
Maybe AMD had surprise power issues with the final silicon, and MS had to redesign the box with a better heat sink, causing a delay. When Sony announced the PS4, MS people joked something like "they didn't show the box, I wonder why?". If MS had power consumption problems on their SoC, it's pretty likely Sony did too, and they obviously knew that, because the designs are so similar. I.e. both companies suffered a similar delay but reacted differently to it.
Does that really make sense to anybody for the vgleaks Durango specs? The 1GHz 7770/7790 TDPs are around 80/85W and the 7750 is at 55W, so I would assume Durango to be less than 100W TDP.
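A quick back-of-envelope along those lines; every figure here is an assumption based on public desktop-part TDPs and rough rules of thumb, not anything from a leak:

```python
# Back-of-envelope check of the "<100W" guess for the rumoured Durango SoC.
# Every number below is an assumption, not anything from a leak.
gpu_w    = 55   # 7750-class GPU power as a proxy for ~12 CUs at a modest clock
cpu_w    = 25   # eight Jaguar cores at ~1.6 GHz, assumed
mem_io_w = 15   # DDR3 + eSRAM + I/O + misc, assumed

print(f"rough SoC power estimate: ~{gpu_w + cpu_w + mem_io_w} W")  # ~95 W, under 100 W
```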