That EDRAM has gotta be pretty decent if it's running current multiplats decently.
Not necessarily; it only has to be comparable to the off-die bandwidth of the 360, ~25 GB/s.
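As a rough check on that figure, the 360's main GDDR3 pool is usually quoted at 22.4 GB/s peak; here is a minimal sketch of the arithmetic, assuming the commonly cited 700 MHz (1400 MT/s effective) GDDR3 on a 128-bit bus:

```python
# Xbox 360 main memory: GDDR3 at 700 MHz (1400 MT/s effective) on a 128-bit bus.
x360_main_bw = 1400e6 * (128 / 8) / 1e9
print(x360_main_bw)  # 22.4 GB/s -- the "~25 GB/s off-die" ballpark quoted above
```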
It just feels like some of the criticism is kind of overblown. I know the Wii U is really underwhelming in terms of processing power, but I think if developers had put the same effort and time into the Wii U port as into their PS3/360 versions, it would have been at least as good as the PS3/360 version, if not better.
Isn't matching 8 year old hardware setting the bar pretty low?
I believe Wii U games will surpass PS360, but it's pretty sad that it's up for debate at this point.
Exophase said:
Not necessarily. A shrink can enable improved frequencies by itself, as well as process maturity. Plus, you don't know that Wii was maxing out Broadway's frequency headroom... in fact we pretty much know it wasn't, because IBM sold the chip specified up to 1GHz.

Of course Nintendo didn't max Broadway's frequency... but considering that the Wii U is even more pathetic hardware-wise relative to its release date than the Wii ever was, it's obvious that Espresso is even less maxed out than Broadway was in 2006.
"That makes no sense. A design's age has nothing to do with how well you can shrink it."

But the fact that both Sony and Microsoft invest billions of dollars to make their chips powerful and scalable, while Nintendo spends only thousands grabbing the cheapest and most inefficient chips available, surely has something to do with it.
"Nah, that's just you making something up. You're claiming less than 40% average density improvement at each full node shrink. That's absolutely meager, especially if you go back that far. You'd be hard pressed to find an example in the industry of such poor scaling."

Yes, it was something I made up... but because I was being overly optimistic in Nintendo's favour!!!!
"Obvious how? You either support SMT or you don't. Obviously support had to be added. That doesn't mean it scales worse. Not just why would it, but how would it? How do you even plausibly justify this claim?"

Because it's not the same to add SMT support to a chip that wasn't designed with that feature in mind as it is to support SMT from the roots of the chip.
"On the other hand, AMD designed their GPUs for TSMC, so it would be a real design effort to port it elsewhere; anything else would be a big cost."

Never underestimate Nintendo when talking about hardware. A less powerful console means lower costs on game development, so it wouldn't be strange if they invested in making the console even less powerful than it would be if they just grabbed generic chips from IBM.
"Your stuff about DDR3 being higher latency than GDDR3 is also totally fabricated."

We aren't comparing only DDR3 to GDDR3, but Nintendo's cheapest, barely customized DDR3 to Microsoft's and Sony's highest-quality GDDR3, optimized for insane bandwidth and low latency.
Exophase said:
At 3.2GHz, 500 cycles is over 150ns. This isn't the latency of the DRAM in isolation, but getting the read back to the CPU over the memory controller.

Ok, that makes a lot more sense to me. But taking into consideration that the Xbox 360 was a highly advanced design optimized to the limit, this 150ns figure is surely much lower than what Nintendo offers in its new machine. Of course, since the CPU is much slower, in cycles we would be at similar figures (let's say 750 to 1000 cycles), but that would be much more time in ns.
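To make the cycles-versus-nanoseconds point concrete, here is the conversion as a small sketch; the ~1.24 GHz Wii U CPU clock used below is an assumption for illustration only, not a figure confirmed anywhere in this thread.

```python
# Convert memory latency between CPU cycles and nanoseconds.
def cycles_to_ns(cycles, freq_ghz):
    return cycles / freq_ghz

def ns_to_cycles(ns, freq_ghz):
    return ns * freq_ghz

print(cycles_to_ns(500, 3.2))    # ~156 ns: 500 cycles at the 360's 3.2 GHz
print(ns_to_cycles(156, 1.24))   # ~193 cycles: the same absolute latency at an assumed ~1.24 GHz
print(cycles_to_ns(1000, 1.24))  # ~806 ns: what 1000 cycles would mean at that clock
```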
Shifty Geezer said:
The CPU is not a 1997 CPU. It may be very close in design, or may be fairly different, but it's not 1997 tech (unlike Wii, which was a higher-clocked 1997 CPU).

Well, I rely on the rumours pointing at a 3-core overclocked Broadway as the Wii U CPU. If that same person nailed the core count and frequency well before they were leaked or confirmed, and considering that the die sizes also clearly point in that direction, I think we can speculate treating this as fact.
"Based on what? XB360/PS3 being more powerful ergo the numbers must be smaller? Wii uses a different RAM tech (1T-SRAM)."

Of course, Wii uses an older, much slower tech. So if those were the numbers for PS3 and 360, I can't even imagine how big they would be (we are speaking of latencies here) on the original Wii.
"We have the clocks (DDR3 1600), bus width (64 bit), and bandwidth (12.8 GB/s). And only half is available for games."

So the maximum bandwidth available for games is only 6.4GB/s then. Thanks, it's even worse than I expected.
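As a sanity check on the figures quoted above, the peak-bandwidth arithmetic is simple; the "half reserved" split is taken directly from the quoted post:

```python
# Peak bandwidth = effective transfer rate (MT/s) x bus width in bytes.
def peak_gb_s(mts, bus_bits):
    return mts * 1e6 * (bus_bits / 8) / 1e9

wii_u = peak_gb_s(1600, 64)   # DDR3-1600 on a 64-bit bus
print(wii_u)                  # 12.8 GB/s peak
print(wii_u / 2)              # 6.4 GB/s if half is reserved, as stated above
```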
"Of course, Wii uses an older, much slower tech."

You appear to believe older = slower. SRAM is far older than DDR5, so I guess it's far slower...
"So maximum bandwidth available for games is only 6.4GB/s then. Thanks, it's even worse than I expected."

This perfectly highlights how your biased thinking is affecting your understanding. Read what I wrote in context, as a response to a specific question:

"How much info is out there about 2GB of main memory on Wii U other than it is DDR3 based?"

"We have the clocks (DDR3 1600), bus width (64 bit), and bandwidth (12.8 GB/s). And only half is available for games."

Honestly, you are completely out of touch with the technological discussion, like theizzzeee, who is just as clueless in the other direction, imagining Wii U could have a DX11 APU and a CPU based on Power7/8 technology. Your rationale is just noise and you have a lot of learning to do.
Of course Nintendo didn't max Broadway's frequency... but considering that the Wii U is even more pathetic hardware-wise relative to its release date than the Wii ever was, it's obvious that Espresso is even less maxed out than Broadway was in 2006.
But the fact that both Sony and Microsoft invest billions of dollars to make their chips powerful and scalable, while Nintendo spends only thousands grabbing the cheapest and most inefficient chips available, surely has something to do with it.
Yes, it was something I made up... but because I was being overly optimistic in Nintendo's favour!!!!
I mean, I compared the shrink to the one the PS3 had, and I didn't even factor in the fact that Sony invested a lot to achieve those results, while Nintendo only invested in that granny-gimmick that is its controller, and not in the chips.
If we try to be realistic, we could assume a shrink from 19 to 16 mm^2 and we would still be really generous in that regard.
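To put that 19 to 16 mm² figure next to the "40% per full node" remark quoted earlier, the scaling arithmetic is below; treating 19 mm² as the starting die area and assuming a two-full-node path (e.g. 90 nm → 65 nm → 45 nm) are choices made for illustration only.

```python
# Compare the claimed shrink against a per-node density-improvement assumption.
start_area = 19.0      # mm^2, starting die area (assumed)
claimed_area = 16.0    # mm^2, the shrink claimed above

print(f"claimed total reduction: {1 - claimed_area / start_area:.0%}")  # ~16%

# Even a "meager" 40% improvement per full node, compounded over two nodes,
# would leave a far smaller die than 16 mm^2:
per_node = 0.40
print(f"after two 40%-per-node shrinks: {start_area * (1 - per_node) ** 2:.1f} mm^2")  # ~6.8 mm^2
```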
Because it's not the same to add SMT support to a chip that wasn't designed with that feature in mind as it is to support SMT from the roots of the chip.
I mean, which will be better: a freshly cooked meal served at the optimal temperature, or a meal that has been kept in the freezer for years and then warmed up in the microwave?
Never underestimate Nintendo when talking about hardware. A less powerful console means lower costs on game development, so it wouldn't be strange if they invested in making the console even less powerful than it would be if they just grabbed generic chips from IBM.
Nintendo can also customize its chips, and if less power means lower development costs, those modifications surely are done, and not in small measure, I presume.
We aren't comparing only DDR3 to GDDR3, but Nintendo's cheapest, barely customized DDR3 to Microsoft's and Sony's highest-quality GDDR3, optimized for insane bandwidth and low latency.
This is what we have, and I'm really sorry because I consider myself a great Nintendo fan (although not a fanboy, and this is why I have to defend the truth above any company or personal tastes).
Ok, that makes a lot more sense to me. But taking into consideration that the Xbox 360 was a highly advanced design optimized to the limit, this 150ns figure is surely much lower than what Nintendo offers in its new machine. Of course, since the CPU is much slower, in cycles we would be at similar figures (let's say 750 to 1000 cycles), but that would be much more time in ns.
Thanks for your information; that has helped us a lot in determining how inefficient the Wii U is compared to both the PS3 and the X360!
Well, I rely on the rumours pointing at a 3-core overclocked Broadway as the Wii U CPU. If that same person nailed the core count and frequency well before they were leaked or confirmed, and considering that the die sizes also clearly point in that direction, I think we can speculate treating this as fact.
It's like when the Wii was confirmed to be a 50% higher-clocked Gekko: it wasn't official, but that's what we had.
Of course, Wii uses an older, much slower tech. So if those were the numbers for PS3 and 360, I can't even imagine how big they would be (we are speaking of latencies here) on the original Wii.
Of course, on WiiU they would be a bit better, but still far worse than on PS360.
So the maximum bandwidth available for games is only 6.4GB/s then. Thanks, it's even worse than I expected.
The Wii U having the exact same memory bandwidth as the original Xbox.
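For reference, the original Xbox number usually cited is indeed 6.4 GB/s; a minimal check of that figure, assuming the commonly quoted 200 MHz DDR (400 MT/s effective) on a 128-bit bus:

```python
# Original Xbox unified memory: 200 MHz DDR (400 MT/s effective) on a 128-bit bus.
xbox_bw = 400e6 * (128 / 8) / 1e9
print(xbox_bw)  # 6.4 GB/s -- the same figure as the Wii U's "available for games" share above
```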