It is. You need more than just bit cells to build a DRAM array. Someone quoted the numbers for TSMC (with a comparably sized cell). You can obviously easily (more than) double the needed space.
1 square mm is 2 MByte in this case.
16 square mm is 32 MByte?
In that case 64-128 MByte should be the number with a 130+ square mm die size.
But somehow I feel that's quite unrealistic.
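For anyone who wants to redo that math, here it is as a quick Python sketch (the 2 MByte per square mm effective density is the assumption from the quote above, not a confirmed spec):

```python
# Back-of-envelope eDRAM capacity check.
# Assumed effective density from the quote above: overhead (sense amps,
# decoders, redundancy) more than doubles the raw cell area, leaving
# roughly 2 MByte per mm^2. This is an assumption, not a confirmed spec.
MBYTE_PER_MM2 = 2

def edram_mbyte(area_mm2):
    """Capacity in MByte for a given eDRAM macro area."""
    return area_mm2 * MBYTE_PER_MM2

print(edram_mbyte(16))                   # 32 -> a 16 mm^2 macro holds ~32 MByte
print(edram_mbyte(32), edram_mbyte(64))  # 64, 128 -> the 64-128 MByte range would
                                         # need 32-64 mm^2 of a 130+ mm^2 die
```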
It is. You need more than just bit cells to build a DRAM array. Someone quoted the numbers for TSMC (with a comparably sized cell). You can obviously easily (more than) double the needed space.
Argh, my mistake. Personally I see little benefit in 24bit floats given that comparable consoles work just fine at 16bit. 12bit floats or fixed-point values would be interesting, but I'm not sure developers would want to deal with those (though lower-precision arithmetic would fit well with the low-power concept, as min10float in DX11.1 should prove).
I said that already and asked for other ideas.
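For context, a rough sketch of what those precisions buy you (the bit splits are illustrative assumptions; only fp16 is a standardized layout, and min10float in DX11.1 is a *minimum* precision contract, so its real layout is up to the hardware):

```python
# Rough precision comparison of small float formats.
# Layouts are (sign, exponent, mantissa) bit counts; fp24/fp12 splits
# here are assumptions for illustration, not a spec for any GPU.
formats = {
    "fp16 (IEEE half)": (1, 5, 10),
    "fp24 (assumed)":   (1, 7, 16),
    "fp12 (assumed)":   (1, 5, 6),
}

for name, (sign, exponent, mantissa) in formats.items():
    # Epsilon: gap between 1.0 and the next representable value.
    print(f"{name}: {mantissa} mantissa bits, epsilon ~ {2.0 ** -mantissa:.1e}")
```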
Maybe the memory is fast compared to the CPU.
“The performance problem of hardware nowadays is not clock speed but ram latency. Fortunately Nintendo took great efforts to ensure developers can really work around that typical bottleneck on Wii U,” he said.
“They put a lot of thought on how CPU, GPU, caches and memory controllers work together to amplify your code speed...
The developer said the bottleneck applies to any hardware, but Nintendo's decisions regarding cache layout, RAM latency and RAM size prove an effective solution.
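A quick sketch of the point he's making (latency and clock figures are illustrative assumptions, not measured Wii U numbers):

```python
# A cache miss to main RAM costs roughly fixed wall-clock time, so the
# faster the core, the more cycles each miss burns. Latency and clocks
# below are illustrative assumptions, not measured Wii U figures.
miss_latency_ns = 100

for label, ghz in [("Wii Broadway", 0.729), ("rumoured Wii U", 1.6), ("360/PS3 class", 3.2)]:
    stalled = miss_latency_ns * ghz  # GHz == cycles per nanosecond
    print(f"{label} @ {ghz} GHz: ~{stalled:.0f} cycles lost per miss")

# Caches and memory controllers that avoid the miss help far more than
# raising the clock, which only makes each miss relatively costlier.
```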
The issue with memory bandwidth isn't just the CPU; in fact, it's primarily the GPU.
Even with eDRAM the textures have to come from main memory. In most of the recent performance captures I've seen for high-end GPUs, they are limited by texture memory bandwidth, not ALUs, and those GPUs have more than 10x the bandwidth the Wii U has.
The eDRAM alleviates the frame buffer bandwidth, and I'll be nice and assume it can be used for intermediate buffers, but it's still going to be an issue.
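Back-of-envelope numbers for what the eDRAM actually absorbs (resolution, overdraw and depth format are illustrative assumptions):

```python
# Rough frame buffer traffic estimate, to show what the eDRAM soaks up.
width, height, fps = 1280, 720, 60
bytes_per_pixel = 4 + 4  # RGBA8 colour write + depth traffic (simplified)
overdraw = 3             # assumed average overdraw

gb_per_s = width * height * bytes_per_pixel * overdraw * fps / 1e9
print(f"~{gb_per_s:.1f} GB/s of frame buffer traffic")  # ~1.3 GB/s

# Alpha blending, MSAA and intermediate render targets multiply that
# figure. Keeping it in eDRAM frees the external bus for texture fetches,
# which still have to stream from main memory either way.
```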
If the CPU is indeed the three enhanced Broadways at 1.6GHz or so, it's going to be an issue for some games regardless of the memory subsystem.
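For scale, the cycle budget that rumoured configuration implies (a rough sketch, assuming the numbers above are right):

```python
# Per-frame cycle budget for the rumoured "3 enhanced Broadways @ 1.6 GHz".
cores, clock_hz = 3, 1.6e9

for fps in (60, 30):
    cycles = cores * clock_hz / fps
    print(f"{fps} fps: ~{cycles / 1e6:.0f}M cycles per frame across all cores")

# 60 fps -> ~80M cycles, 30 fps -> ~160M. A better memory subsystem raises
# how much of that budget does useful work, but not the budget itself.
```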
Maybe the memory is fast compared to the CPU.
Or if you listen to the resident devs here, they are notorious for over-designing their memory solutions.
Almost every dev from Namco to Koei to Ninja Theory is pointing the finger of blame squarely at the CPU. I think that's because the low-bandwidth memory pool is not an issue. I can't tell you why, but if it were, developers would have said something by now. Historically, Nintendo are known for engineering good, efficient memory solutions into their consoles.
I don't think anyone ever considered turning Broadway from in-order to out-of-order execution; it has to be something else.
They are not exactly out of order; there is a catch to that. I don't remember exactly, maybe something about only the first stages of the pipeline being able to be fed out of order. The tech guys here will give you a proper answer, sorry for the failed attempt, but those CPUs are not out of order like, say, the Celeron in the first Xbox.
Gekko and Broadway were always out of order.
And the person who started the Broadway rumour has been pretty reliable so far. He's said that's what it's described as in the dev documents.
I wasn't aware of this. Do you have links to old posts? Because everything I've read says the chips are out of order, even the PPC 750 docs on IBM's site.
They are not exactly out of order; there is a catch to that. I don't remember exactly, maybe something about only the first stages of the pipeline being able to be fed out of order. The tech guys here will give you a proper answer, sorry for the failed attempt, but those CPUs are not out of order like, say, the Celeron in the first Xbox.
“What surprises me with Wii U is that we don’t have many technical problems. It’s really running very well, in fact. We’re not obliged to constantly optimize things. Even on the PS3 and Xbox 360 versions [of Origins], we had some fill-rate issues and things like that. So it’s partly us – we improved the engine – but I think the console is quite powerful. Surprisingly powerful. And there’s a lot of memory. You can really have huge textures, and it’s crazy because sometimes the graphic artists – we built our textures in very high definition. They could be used in a movie. Then we compress them, but sometimes they forget to do the compression and it still works! [Laughs] So yeah, it’s quite powerful. It’s hard sometimes when you’re one of the first developers because it’s up to you to come up with solutions to certain problems. But the core elements of the console are surprisingly powerful.
“And because we’re developing for Wii U, we don’t have to worry about cross-platform optimization.
“We can push what the console can do; push it to its limits. And of course, we have a new lighting engine. In fact, the game engine for Origins was mostly just classic sprites in HD, but now we can light them and add shadows and all these things. So there is some technical innovation with the engine itself.”
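To put the compression remark in perspective, some generic texture math (sizes are illustrative, not figures from the interview):

```python
# Why "forgetting the compression" normally hurts: uncompressed texture cost.
def texture_mib(side, bytes_per_texel, mipmapped=True):
    base = side * side * bytes_per_texel
    return base * (4 / 3 if mipmapped else 1) / 2**20  # full mip chain adds ~1/3

print(f"2048x2048 RGBA8, uncompressed: ~{texture_mib(2048, 4):.0f} MiB")    # ~21 MiB
print(f"2048x2048 DXT1 (0.5 B/texel):  ~{texture_mib(2048, 0.5):.1f} MiB")  # ~2.7 MiB

# An ~8:1 gap per texture is why shipping uncompressed assets usually blows
# the memory budget -- unless, as Ancel says, there is "a lot of memory".
```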