Wii U hardware discussion and investigation *rename

I think all the GB of memory reserved for the OS means is that Nintendo thinks the non-gaming functions are important.
Say they have a single memory channel (64-bit); that's not much bandwidth to feed the CPU and stream textures to the GPU.
If they were to use those 2 GB of memory, the effective bandwidth would be "halved". Not in real terms, but if you have twice the data to move over the same bandwidth, the result is the same.
Maybe Nintendo decided that 1GB was the sweet spot. Could it be that if they were to tell devs "you can access 2GB (or close to it) of memory, but through a straw (a 64-bit bus)", it could have ended up counterproductive?
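To put rough numbers on that "halved" bandwidth point, here's a back-of-the-envelope sketch of how long one full pass over the working set takes on a single 64-bit channel. The DDR3-1600 transfer rate is an assumption for illustration, not a confirmed Wii U spec:

```python
# Back-of-the-envelope: time to stream a working set over one 64-bit channel.
# Assumed figures (illustrative only): DDR3-1600 -> 12.8 GB/s peak.
bus_width_bytes = 8            # 64-bit channel
transfers_per_sec = 1600e6     # DDR3-1600 (assumed)
peak_bw = bus_width_bytes * transfers_per_sec  # 12.8e9 bytes/s

for working_set_gb in (1, 2):
    working_set = working_set_gb * 1024**3
    t = working_set / peak_bw
    print(f"{working_set_gb} GB working set: {t:.3f} s per full pass")
    # prints ~0.084 s for 1 GB, ~0.168 s for 2 GB
```

Peak bandwidth is unchanged either way; a doubled working set just means every byte gets touched half as often, which is the "effective halving" described above.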

On top of that, Nintendo is not MS. MS does great things with really little memory in the 360, and the 3DS does lesser things with 128MB. MSFT has a hell of an advantage on this matter over both Sony and Nintendo. It's not what bothers me about the system.

If I were speculating on N's decision making process, I'd suggest that their entire strategy is based on being on par with 360/PS3, but offering unique elements to differentiate their product.
Most likely yes.
But there are choices I don't understand, namely: why so much eDRAM? I mean, 32MB is enough for 1080p + 2x AA and 720p + 4x AA. It seems the GPU doesn't have what it takes to render high-profile games at 1080p. On top of that, I don't believe the GPU has the same kind of access to the eDRAM that Xenos has, mostly because it would already show (AA would be free).

So to me it's overkill. Half of that, or a bit more, would have been enough. It would have allowed Nintendo to fit a 1080p render target for their non-demanding games, and it would have given existing engines some room to fit their G-buffer (at 720p) and maybe some other render target(s).
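The framebuffer arithmetic behind the 32MB claim is easy to check. This sketch assumes 32-bit color plus 32-bit depth/stencil per sample, which is a common layout but an assumption here, not a confirmed Wii U configuration:

```python
# Does an MSAA framebuffer fit in 32 MB of eDRAM?
# Assumes 4 bytes color + 4 bytes depth/stencil per sample (common, not confirmed).
def fb_bytes(w, h, samples, bpp_color=4, bpp_depth=4):
    return w * h * samples * (bpp_color + bpp_depth)

EDRAM = 32 * 1024**2
for name, w, h, s in [("1080p 2xAA", 1920, 1080, 2),
                      ("720p 4xAA", 1280, 720, 4)]:
    size = fb_bytes(w, h, s)
    print(f"{name}: {size / 1024**2:.1f} MiB, fits: {size <= EDRAM}")
    # 1080p 2xAA -> ~31.6 MiB, 720p 4xAA -> ~28.1 MiB; both fit
```

Under these assumptions both targets squeeze into 32MB, consistent with the claim above, while a plain 1080p target with no AA would need only about 16 MiB.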

That's quite some silicon, 12-16MB of eDRAM, especially when you invest only 12mm^2 in your CPU cores.

Thing is, I believe that cutting the amount of eDRAM might have allowed Nintendo to actually outdo (even slightly) what the PS360 do, within a lesser silicon budget than MS's Valhalla (especially once you take into account the smart eDRAM) and within a significantly lower power envelope. It could have been quite an achievement.

The most bothersome part of the design for me is the number of cores; those are pretty weak. By lowering the amount of eDRAM they might have packed in a couple more cores, and I guess it would not have hurt.

I have fewer concerns about the GPU, as even a Caicos-class GPU running at high speed would do the RSX/Xenos job, provided it has enough bandwidth. Still, I wonder why Nintendo went for R700 hardware instead of the architecture in Cayman and Trinity. It's simply a bit better for mostly the same (silicon) price.

It makes me wonder about what somebody here said: did Nintendo go with that much eDRAM for the sake of backward compatibility (they could emulate the entire memory pool of the Wii within that space)? If yes, it's really a bad sign: they wasted silicon for the sake of not developing/buying a proper emulator. There are multiple emulators out there for the GC/Wii, and it doesn't require crazy hardware to run them, far from it.

I hope they don't get away with that crap. Thanks to their fans, they managed to get away with the 3DS and its mind-blowing lack of a second analog stick (not even speaking of sucky proprietary tech...).
They need a hard landing to come back to their senses.

I suspect they want the tablet to behave like a tablet when you're not playing a game, or perhaps even when you are; that means it has to run "apps" and be competitive with real tablets doing so.

Having said that I won't be buying one.
I don't think so, but maybe they want room; they are not MSFT, and memory chips are cheap.
The issue with that is that in-game they don't have many CPU cycles to throw at running background tasks :( (yes, here I go again: trading some eDRAM for more CPU cores would have balanced the system better).
 
Yes! But I don’t see the WiiU as a SoC. I see more signs that it’s a SiP.
Possibly even a 2.5 or 3D IC.
Recent statement by Robert Patti from Tezzaron:
But consider this: “Nintendo’s going to build their next-generation box,” said Patti. “They get their graphics processor from TSMC and their game processor from IBM. They are going to stack them together.


What does Tezzaron specialize in?
We specialize in 3D wafer stacking and TSV processes, cutting-edge memory products, and wide-ranging collaborations. In 2004 Tezzaron demonstrated the world's first successful wafer-stacked 3D-ICs with TSV...

Now, why would IBM be bragging about their new WiiU CPU if the CPU is not better than the 360's Xenon? Broadway x 3?

Here is what IBM said:
The all-new, Power-based microprocessor will pack some of IBM's most advanced technology into an energy-saving silicon package

Wiki:
TSVs are a high performance technique used to create 3D packages and 3D integrated circuits, compared to alternatives such as package-on-package, because the density of the vias is substantially higher, and because the length of the connections is shorter.

A 3D package (System in Package, Chip Stack MCM, etc.) contains two or more chips (integrated circuits) stacked vertically so that they occupy less space and/or have greater connectivity. An alternate type of 3D package can be found in IBM's Silicon Carrier Packaging Technology, where ICs are not stacked but a carrier substrate containing TSVs is used to connect multiple ICs together in a package.


Patti:
"SoCs put a huge burden on processing. If you need flash, DRAM, and a processor on the same chip, you have a fab nightmare." The cost burden is huge, he pointed out.

Another advantage is density: Four layers of 45nm circuitry take about the same space as one 22nm device. "By virtually any measure, development cost, fab facility cost, even piece part, 3D wins." Patti said.

A third advantage is power. "If we assume high-k gates, and lower transistor leakage, most of the power is left in the charging and discharging of the wire," Patti said. "Make shorter wires [and] you get lower power. In our memories we make the wires half as long and we get a 40% reduction in per bit power."
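Patti's density claim above is easy to sanity-check: die area scales roughly with the square of the feature size, so a 45nm-to-22nm shrink buys about a 4x density improvement per layer, which is roughly what four stacked 45nm layers give you in the same footprint. The quadratic-scaling rule is the idealized assumption here:

```python
# Sanity check of the "four layers of 45nm ~= one 22nm device" claim,
# assuming (idealized) that density scales with the square of feature size.
shrink = (45 / 22) ** 2
print(f"45nm -> 22nm buys ~{shrink:.1f}x density")  # ~4.2x
```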



If that's the case, and Nintendo has put a lot of development effort into making a cool-running console, how can the TDP be an accurate determinant of the console's performance? That's like saying the original Xbox 360 is more powerful than the Xbox 360 slim because it uses more watts, isn't it?

[Image: Xbox 360 power brick comparison]


We know that's not the case; as a matter of fact, the slim can outperform the original.

As a matter of fact, according to http://www.electroiq.com, the Vita can be seen as a SiP:
By combining the processor with the different memories in the same package in the Vita, Sony and Toshiba have produced one of the few true system-in-package (SiP) parts that we have seen. And I would call it 3D, even though industry convention is now restricting that term to TSV-based parts – so it’s not 3D, in our current argot.
http://www.electroiq.com/blogs/chip...vita-uses-chip-on-chip-sip-3d-but-not-3d.html

For now that kind of interconnect offers really low bandwidth, and parts have to be really low power too. In the mobile world 12.8 GB/s is a lot of bandwidth.

So if you were right, the thing would suck even more than most people already think it does. In my opinion, that would be terrible.
 
Say they have a single memory channel (64-bit); that's not much bandwidth to feed the CPU and stream textures to the GPU.
If they were to use those 2 GB of memory, the effective bandwidth would be "halved". Not in real terms, but if you have twice the data to move over the same bandwidth, the result is the same.
That's the sort of thinking behind my notion that Nintendo was designing a 1GB machine, then maybe found 2GB was no more expensive, and so decided to go that route without affecting their game design. Of course, RAM doesn't have to be for immediate assets. They could have more variety by storing more assets than the current screen needs. But one thing I've noticed in Nintendo's talk about their software is that they like limited expense. I wouldn't be surprised if they chose 1GB based on the cost of producing a typical game with that target for quality and variety of assets. More RAM would mean more expense. They could throw in more RAM open to devs but cap their own content to target only 1GB, but then their flagship first-party titles would look inferior to their rivals'. So the only way to maintain parity at a given cost is to lock everyone into the same position. Perhaps that's a conspiracy theory, but it registers as plausible from what I've seen of Nintendo over the past few years and their references to the cost of developing games.

So the 1GB split could be as much about preserving the game console design they wanted. I'll find it hard to believe otherwise until it's shown to me how the OS needs 1 GB. My smartphone is fully functional with 196 MB, and as mentioned in the XB3 rumour discussions, actually consuming lots of RAM for non-gaming functions is very hard outside of productivity software or lots of complex webpages.
 
I think all the GB of memory reserved for the OS means is that Nintendo thinks the non-gaming functions are important.

If I were speculating on N's decision making process, I'd suggest that their entire strategy is based on being on par with 360/PS3, but offering unique elements to differentiate their product.
I suspect they want the tablet to behave like a tablet when you're not playing a game, or perhaps even when you are; that means it has to run "apps" and be competitive with real tablets doing so.

Having said that I won't be buying one.
That is what I have been thinking from the start. I think they designed the system to run PS360 games while driving the extra screen. At this point they don't need the extra power. It's clear it won't be able to keep up with the PS4/720, so why even go much past the PS360? Third-party games will be designed around the specs of the lead platform, the PS360.

The problem with the tablet is that it has to stay near the base station, which really makes it almost useless as any kind of tablet replacement.
 
The denial surrounding this console/GPU is unbelievable. It just defies logic... only matched by the speculation around the last Wii. It's mainly driven by GAF, where stuff gets spun and copy-pasted everywhere to make things look really great for the Wii U, comparing some wishful GFLOP numbers that AMD/Nintendo will never publish for anyone, while completely ignoring every other thing that affects GPU performance. There are no specific numbers for a reason: it in no way benefits Nintendo.

It was hilarious when Epic made its statement that there is no Wii U support in the UE4 SDK, and it got spun as really good news for the Wii U because some dev can port it if they want. Yeah, that's really good news... as if Epic could forbid a studio from doing something like that when it has licensed the engine. When Epic doesn't think it's wise to put its engine on some console, it's because it isn't.

You can only do so much with the 45W average TDP / 75W PSU announced by Iwata and non-binned, cheaply made parts. Epic already made a statement: the Wii U will not be supported in the UE4 SDK. But of course this too got spun to look like a really good thing for the Wii U.

The main benefit is the 1GB of RAM, albeit at similar speeds to current-gen memory. If you just bolted PS4/720 memory bandwidth onto the Wii U or PS3/360, you would likely see a big difference in performance already.
 
Well, whatever it ends up being, at least it surely isn't going to be GameCube 3.0 HD. I hope not, anyway. ;)
 
The denial surrounding this console/GPU is unbelievable. It just defies logic... only matched by the speculation around the last Wii. It's mainly driven by GAF, where stuff gets spun and copy-pasted everywhere to make things look really great for the Wii U, comparing some wishful GFLOP numbers that AMD/Nintendo will never publish for anyone, while completely ignoring every other thing that affects GPU performance. There are no specific numbers for a reason: it in no way benefits Nintendo.

It was hilarious when Epic made its statement that there is no Wii U support in the UE4 SDK, and it got spun as really good news for the Wii U because some dev can port it if they want. Yeah, that's really good news... as if Epic could forbid a studio from doing something like that when it has licensed the engine. When Epic doesn't think it's wise to put its engine on some console, it's because it isn't.

You can only do so much with the 45W average TDP / 75W PSU announced by Iwata and non-binned, cheaply made parts. Epic already made a statement: the Wii U will not be supported in the UE4 SDK. But of course this too got spun to look like a really good thing for the Wii U.

The main benefit is the 1GB of RAM, albeit at similar speeds to current-gen memory. If you just bolted PS4/720 memory bandwidth onto the Wii U or PS3/360, you would likely see a big difference in performance already.

Yep. Even I am a bit surprised at the strength of the Wii U fanboy current. It has exploded just in the last few days with all this info.

The thing is, I'm not surprised in one sense. I've always known the Wii U would have its short 4-5 months in the sun: that window where it's coming out, or is on shelves, but we don't know anything about the PS4/720 yet.

But I don't know that it hasn't already been a little stronger than I expected. Gamers are an incredibly fickle, immature bunch (and they will all turn on the Wii U just as quickly when the shiny PS4 comes into view, sadly for it). Maybe it helps that a lot of them are 14, lol.
 
That's hardly surprising. Many current-gen games are less than 720p; 1080p is more than double that, and then there's the tablet.
 
That's hardly surprising. Many current-gen games are less than 720p; 1080p is more than double that, and then there's the tablet.


The tablet, if it's only showing 2D images like maps, inventory, etc., really doesn't take much GPU power. I think a guy did the calculations and found it would take 12 MFLOPS of GPU power.
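That ~12 MFLOPS figure is plausible if you assume the GamePad's 854x480 screen redrawn at 30 fps with roughly one shader op per pixel (plain textured quads). A quick check, with the frame rate and ops-per-pixel being illustrative assumptions:

```python
# Rough check of the "~12 MFLOPS for a 2D tablet overlay" figure.
# Assumptions (illustrative): 854x480 screen, 30 fps, ~1 op per pixel.
w, h, fps, ops_per_pixel = 854, 480, 30, 1
mflops = w * h * fps * ops_per_pixel / 1e6
print(f"~{mflops:.1f} MFLOPS")  # ~12.3 MFLOPS
```

Even at 60 fps with a few ops per pixel you'd stay well under 100 MFLOPS, i.e. a rounding error for any modern GPU.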
 
Good luck BG, I think you're going to need it. Maybe it's time to switch to a new handle if things don't play out well once the Wii U is out and the true specs come out...

One thing I've learned: if you get some inside info, or just info that's not out there, it's best to keep it to yourself. Going around with second- or third-hand info can lead to bad things. Keep your head up!

Like I said in the prediction thread, I couldn't care less about being right or wrong. And second, if it's something I can share and feel is worth sharing, then I will, for the sake of discussion. Right now I'm trying to learn more about the Xbox 3, and if I get something worth sharing, I will.

Or they may be keeping a healthy margin for future additions to their services, including what they may not be able to fully foresee. While keeping 1GB reserved for OS services seems extravagant, we need to remember that the reason you reserve memory space at all is to ensure dependably responsive and glitch-free operation. It's a worthy goal. And it still leaves three times more memory for games than the PS360 does.

Nintendo is bad about overcompensating. I see them reducing the amount allocated before they ever utilize more than whatever they're using right now.

The denial surrounding this console/GPU is unbelievable. Just defies logic..

Or... wait for it... some people are so biased (see your sig) that they see things that don't even exist.

Epic already made a statement: the Wii U will not be supported in the UE4 SDK. But of course this too got spun to look like a really good thing for the Wii U.

:LOL:

I'd tell you why I'm laughing, but it's a big secret. :yes:
 