Wii U hardware discussion and investigation

Wii U memory can't be GDDR5, because that would mean Nintendo put 32 MB of eDRAM in there just for the sake of it and wasted a good amount of money and transistor budget on something they don't need.

GDDR5 = high GPU bandwidth, no need for eDRAM for the frame buffer.
DDR3 = low GPU bandwidth, eDRAM needed as a frame buffer to work around the low bandwidth of main memory.

The Wii U doesn't have GDDR5, and if it did, then that would definitely bring TDP and costs up, and the TDP looks very low as it is now.

I would say that if Nintendo took an existing R7xx design and worked on that, it would be the RV730. It's a 128-bit card with DDR3 and 480 GFLOPs, so that's the ballpark they would go for. Its highest TDP is 60W and it's manufactured on a 55nm process. Shrinking and customizing would bring that power draw down to ~30-40W, and that's the max for the Wii U.


All of this sounds very reasonable to me.
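
As a rough sanity check on the bandwidth argument, here is a back-of-envelope calculation. The bus width and transfer rates below are illustrative assumptions, not confirmed Wii U specs:

```python
# Peak memory bandwidth = (bus width in bytes) * effective transfer rate.
# Rates below are assumed for illustration, not confirmed Wii U figures.

def peak_bandwidth_gbs(bus_bits: int, transfer_mtps: float) -> float:
    """Peak bandwidth in GB/s from bus width (bits) and transfer rate (MT/s)."""
    return (bus_bits / 8) * transfer_mtps / 1000

print(peak_bandwidth_gbs(128, 1600))  # hypothetical 128-bit DDR3-1600: 25.6 GB/s
print(peak_bandwidth_gbs(128, 4000))  # hypothetical 128-bit GDDR5-4000: 64.0 GB/s
```

With DDR3-class numbers the GPU is starved without an eDRAM frame buffer; with GDDR5-class numbers the eDRAM is largely redundant, which is the poster's point.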
 
Wii U memory can't be GDDR5, because that would mean Nintendo put 32 MB of eDRAM in there just for the sake of it and wasted a good amount of money and transistor budget on something they don't need.

GDDR5 = high GPU bandwidth, no need for eDRAM for the frame buffer.
DDR3 = low GPU bandwidth, eDRAM needed as a frame buffer to work around the low bandwidth of main memory.

The Wii U doesn't have GDDR5, and if it did, then that would definitely bring TDP and costs up, and the TDP looks very low as it is now.

I would say that if Nintendo took an existing R7xx design and worked on that, it would be the RV730. It's a 128-bit card with DDR3 and 480 GFLOPs, so that's the ballpark they would go for. Its highest TDP is 60W and it's manufactured on a 55nm process. Shrinking and customizing would bring that power draw down to ~30-40W, and that's the max for the Wii U.

1. The main function of the eDRAM could be to ease the porting of games from the Wii and Xbox 360 to the Wii U. I do agree, however, that Nintendo has likely gone with DDR3.

2. An embedded RV730 shrunk to 40nm would consume less than 30-40W. The RV730-based E4690 already consumes only 25W at 600MHz using GDDR3.
Swapping the 512MB of GDDR3 for 1GB of lower-voltage DDR3 will probably come out about even power-wise, and shrinking it to 40nm will reduce power consumption, perhaps down to around 18W.

At the lower end of the performance predictions, an APU containing three Broadway-like cores and an RV730-like GPU would likely use less than 25W.
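
A minimal sketch of the die-shrink power argument, using the first-order dynamic power relation P ∝ C·V²·f. Only the E4690's 25W at 600MHz comes from the post; the capacitance and voltage scaling factors are assumptions for illustration:

```python
# First-order dynamic power scaling: P ~ C * V^2 * f.
# Only the 25 W / 600 MHz starting point comes from the post above;
# the scaling factors are assumptions for illustration.

def scaled_power(p_old_w: float, cap_scale: float,
                 v_old: float, v_new: float, f_scale: float = 1.0) -> float:
    """Scale dynamic power by capacitance, voltage and frequency ratios."""
    return p_old_w * cap_scale * (v_new / v_old) ** 2 * f_scale

# Assume ~0.8x effective capacitance from the 55nm -> 40nm shrink and a
# drop from ~1.1 V to ~1.0 V at the same 600 MHz clock:
print(round(scaled_power(25, 0.8, 1.1, 1.0), 1))  # ~16.5 W
```

That lands in the same neighbourhood as the ~18W estimate above.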
 
I have also thought about the possibility that the OS RAM is not just a sectioned off part of the main memory and is a separate, and likely slower, pool.
That'd be a completely unnecessary extra cost. RAM is cheap; it would cost more to put two types of RAM on two buses than to just add 2 GBs of chips. Unless there's some very expensive RAM in there like GDDR5, but with the eDRAM there's no real point to that in this box. I expect 2 GBs of DDR3, unified.
 
What uses can the eDRAM (on the GPU) have other than as a framebuffer?
Everything. Assuming it's a fully featured eDRAM solution and not limited like Xenos's implementation, it's effectively 32 MBs of RAM with massive amounts of bandwidth, completely flexible for any use.
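
To put 32 MBs in perspective, a quick calculation of what typical 720p render targets would actually occupy. The buffer formats are assumptions for illustration; actual Wii U usage is unknown:

```python
# How much of a 32 MB eDRAM pool would common 720p render targets use?
# Formats are assumed for illustration; actual Wii U usage is unknown.

def buffer_mb(width: int, height: int, bytes_per_pixel: int) -> float:
    return width * height * bytes_per_pixel / (1024 * 1024)

W, H = 1280, 720
color = buffer_mb(W, H, 4)  # 32-bit RGBA colour buffer: ~3.5 MB
depth = buffer_mb(W, H, 4)  # 32-bit depth/stencil:      ~3.5 MB
hdr   = buffer_mb(W, H, 8)  # 64-bit FP16 HDR target:    ~7.0 MB

print(round(color + depth, 1))  # ~7.0 MB
print(round(hdr + depth, 1))    # ~10.5 MB
```

Even an FP16 HDR target plus depth uses barely a third of the 32 MBs, leaving plenty of room for other uses, hence "everything".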
 
1. The main function of the eDRAM could be to ease the porting of games from the Wii and Xbox 360 to the Wii U. I do agree, however, that Nintendo has likely gone with DDR3.

2. An embedded RV730 shrunk to 40nm would consume less than 30-40W. The RV730-based E4690 already consumes only 25W at 600MHz using GDDR3.
Swapping the 512MB of GDDR3 for 1GB of lower-voltage DDR3 will probably come out about even power-wise, and shrinking it to 40nm will reduce power consumption, perhaps down to around 18W.

At the lower end of the performance predictions, an APU containing three Broadway-like cores and an RV730-like GPU would likely use less than 25W.
1. No, the main function of eDRAM is to serve as a frame buffer when the memory you are using is a low-bandwidth one. That's what DDR3 is, and that's what Nintendo is using. No sane company would waste transistors (money) on something they don't need just to make things a "wee bit" easier for developers.

With more GPU grunt and twice the available memory, Nintendo doesn't have to worry about fast ports from the old 360; they need to worry about their development tools.

2. You are not shrinking much from 55nm to 40nm, so I wouldn't expect big gains. The HD4670 was a "nice" card with 8 ROPs, 320 stream processors and a max wattage of 60W. You just have to look at the Wii U console's size to see it's a perfect match. With a console PSU rated at 75W you are looking at this card. And we already know Nintendo uses an RV7xx-derivative card in the console, so either it is this or something akin to the HD4850. But if they used the HD4850 they would have to cut ROPs, use different memory and lower the clocks, and then they would get exactly the same GPU that the RV730 is.

Nintendo would get twice the performance compared with the 360, the card would use little power, and the problem of the "tiny box" would be irrelevant. So, since I'm a betting man, I'll bet on the RV730.
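
For what it's worth, the "twice the performance" claim lines up with a simple peak shader-throughput calculation from the public specs of each part (counting a MAD as 2 flops per ALU lane per cycle):

```python
# Peak programmable shader throughput: ALU lanes * 2 flops (MAD) * clock (GHz).

def gflops(alu_lanes: int, clock_ghz: float) -> float:
    return alu_lanes * 2 * clock_ghz

print(gflops(240, 0.50))  # Xenos (360): 48 shaders x 5 lanes @ 500 MHz = 240 GFLOPs
print(gflops(320, 0.75))  # RV730 (HD4670): 320 SPs @ 750 MHz = 480 GFLOPs
```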
 
....Unless there's some very expensive RAM in there like GDDR5, but with the eDRAM there's no real point to that in this box. I expect 2 GBs of DDR3, unified.


So what you are saying is that, because we know there is eDRAM being used, Nintendo wouldn't use GDDR5? Does it matter how much eDRAM they are using, though?
The only company that has said they were using eDRAM is IBM, to feed the CPU.

Everyone keeps assuming there is eDRAM for the GPU, but that has not been confirmed.

So the question is, if the GPU is also using eDRAM, is it sharing the same pool as the CPU, or is it split? Would Nintendo design the console so that the CPU has a small amount and the GPU has a small amount?

If not, why not have the CPU be fed by eDRAM and the GPU primarily use 1 GB of GDDR5?


that'd be a completely unnecessary extra cost. RAM is cheap; it would cost more to put two types of RAM on two buses than to just add 2 GBs of chips. Unless there's some very expensive RAM in there like GDDR5, but with the eDRAM there's no real point to that in this box. I expect 2 GBs of DDR3, unified.

So you speculate 2 gigs of DDR3, plus eDRAM for the CPU and GPU.
That ends up being more than two gigs. Though Iwata could have been rounding down.


Another question: why would Nintendo need a DSP if it's using a GPGPU? I recall in one of the posted videos the AMD rep says that the embedded GPU can replace the DSP.
 
1. No, the main function of eDRAM is to serve as a frame buffer when the memory you are using is a low-bandwidth one. That's what DDR3 is, and that's what Nintendo is using. No sane company would waste transistors (money) on something they don't need just to make things a "wee bit" easier for developers.

I didn't read any confirmation about DDR3 being used for games, but assuming that's true, Nintendo is a company of developers. I would think, and it's been stated by them, that they design their consoles around the needs of their developers.

Isn't Nintendo known for using expensive and exotic memory where they can?
Even the Wii uses the same memory types as the 360.
 
I didn't read any confirmation about DDR3 being used for games, but assuming that's true, Nintendo is a company of developers. I would think, and it's been stated by them, that they design their consoles around the needs of their developers.

Isn't Nintendo known for using expensive and exotic memory where they can?
Even the Wii uses the same memory types as the 360.
Look, you either go with DDR3 and eDRAM to work around the low bandwidth of DDR3, or you go with GDDR5, but in that case there is no need for expensive eDRAM when you already have the huge bandwidth of GDDR5.

It's a 2+2 situation. The Wii U has 2GB of RAM, with 1GB reserved for games that can be expanded. If they put 1GB of GDDR5 in there, then they couldn't simply expand into the other 1GB of DDR3 if they cut the OS's memory needs. Since they can expand, and since there is eDRAM in the console, it's safe to say that the chance of GDDR5 being in there is lower than zero.
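
A toy illustration of the expansion argument. The 1GB/1GB split is the publicly stated figure; everything else is just to show why a unified pool makes later expansion trivial:

```python
# With a unified 2 GB pool, "expanding" game memory is just moving the
# OS/game boundary. With split GDDR5 + DDR3 pools, DDR3 freed from the OS
# can't become fast game memory. The 1 GB / 1 GB split is the stated figure.

TOTAL_MB = 2048

def game_memory_mb(os_reservation_mb: int, unified: bool = True) -> int:
    if unified:
        return TOTAL_MB - os_reservation_mb  # the boundary can move freely
    return 1024  # games are stuck with their own dedicated pool

print(game_memory_mb(1024))         # at launch: 1024 MB for games
print(game_memory_mb(512))          # OS trimmed later: 1536 MB for games
print(game_memory_mb(512, False))   # split pools: still only 1024 MB
```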
 
A single GDDR3 chip per console wouldn't have broken the bank at the end of 2006, nor was it the highest speed available by then.

Yes, but they also used eDRAM (like the 360) and 1T-SRAM.

Both the PS3 and 360 only used 2 types of RAM, AFAIR, but an SD console like the Wii used three types.

Wouldn't the GDDR3 and the eDRAM have been enough for the Wii if they were good enough for the 360? Or the GDDR3 + 1T-SRAM? Why go for the extra expense?
 
Having 64-bit GDDR5 would not mean great bandwidth for the Wii U; it could lose to the Xbox 360.
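
The numbers behind that: the 360's 22.4 GB/s is the known figure, while the GDDR5 data rates are assumptions chosen to show where the crossover sits:

```python
# 64-bit GDDR5 vs the Xbox 360's 128-bit GDDR3 (22.4 GB/s is the known figure).
# The GDDR5 data rates are assumptions to show the crossover point.

def bw_gbs(bus_bits: int, transfer_mtps: float) -> float:
    return bus_bits / 8 * transfer_mtps / 1000

print(bw_gbs(128, 1400))  # Xbox 360 GDDR3: 22.4 GB/s
print(bw_gbs(64, 2800))   # 64-bit GDDR5 @ 2800 MT/s: 22.4 GB/s, mere parity
print(bw_gbs(64, 3600))   # 64-bit GDDR5 @ 3600 MT/s: 28.8 GB/s, no big win
```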

Another question: why would Nintendo need a DSP if it's using a GPGPU? I recall in one of the posted videos the AMD rep says that the embedded GPU can replace the DSP.

A GPGPU uses more power, is more inefficient, and takes away from gaming. If Nintendo used the GPGPU for encoding and streaming to the tablet, it would be a small disaster and the quality would be lower.
 
Look, you either go with DDR3 and eDRAM to work around the low bandwidth of DDR3, or you go with GDDR5, but in that case there is no need for expensive eDRAM when you already have the huge bandwidth of GDDR5.

It's a 2+2 situation. The Wii U has 2GB of RAM, with 1GB reserved for games that can be expanded. If they put 1GB of GDDR5 in there, then they couldn't simply expand into the other 1GB of DDR3 if they cut the OS's memory needs. Since they can expand, and since there is eDRAM in the console, it's safe to say that the chance of GDDR5 being in there is lower than zero.



Well, that brings me to this question.

The OS RAM, whatever it is: which chip would primarily use it? The CPU or the GPU? Or does Nintendo put in an additional chip to make use of it? Or maybe this is handled by an unspoken fourth core?

What if Nintendo wants developers to primarily use the GPGPU to make games, with its own pool of RAM, and to keep the CPU as free as possible to handle OS programs running in the background with its own pool of RAM?

So, something like:
GPGPU with 1 GB of GDDR5 dedicated for games.
CPU handling 1 GB of DDR3 for the OS.
Both the CPU and the GPGPU connected by a pool of maybe 32MB of eDRAM?
 
You're missing the whole point of eDRAM. The "e" is short for embedded (eDRAM = embedded DRAM): it is literally on the chip which is utilizing it. It is not, for any practical purposes, available to other components; it's not a pool that is generally available.

The whole point of embedding it is to leverage very high bandwidth and low latency by having it on the chip itself rather than accessing it via an external bus.
 
http://www.cinemablend.com/games/Wi...ble-DirectX-11-Equivalent-Graphics-47126.html
Recently, Unity Technologies announced that the Unity Engine will be supported by Nintendo and the Wii U across the globe, opening up development via the indie-friendly engine for developers both big and small.

In a pre-briefing interview with Helgason before the press announcement went live, Gaming Blend had the opportunity to ask a few questions about the jump from mobile, PC and current-gen consoles to the first next-gen console, and whether developers would be able to make use of all of Unity's latest high-end technology on Nintendo's newest console, including the ability to make use of Unity 4's DirectX 11 equivalent features and shaders. Helgason replied with the following...
Yeah. We'll do a -- we'll make it potentially possible to do.

What's interesting is that our philosophy is always this: We have a match work flow and I'm sure we can make a decent game and prototype, and they're fun. And then we have a shared system that basically allows you to access the full capabilities of the hardware you run. That's going to be good whether you're running [software] on an iPhone, the Wii U, a gaming PC or whatever.
 
(eDRAM = embedded DRAM): it is literally on the chip which is utilizing it. It is not, for any practical purposes, available to other components; it's not a pool that is generally available.
That's true... if all the components are on separate chips. If you have an SoC (all components on the same chip), the eDRAM can be made available to all the SoC components (if it brings some benefit to the system).
 
That's true... if all the components are on separate chips. If you have an SoC (all components on the same chip), the eDRAM can be made available to all the SoC components (if it brings some benefit to the system).

Yeah, I didn't want to throw too much into the mix, better for understanding to come in small bites :p

Besides, if I'd gone whole hog on the SoC explanation, next thing you know 'other' forums would be posting that the Wii U now uses a Power7-based SoC with an onboard HD9000 GPU with 32MB of eDRAM onboard. ;)
 
So what you are saying is that, because we know there is eDRAM being used, Nintendo wouldn't use GDDR5? Does it matter how much eDRAM they are using, though?
The only company that has said they were using eDRAM is IBM, to feed the CPU.

Everyone keeps assuming there is eDRAM for the GPU, but that has not been confirmed.
Nothing had been confirmed up to this point other than 2 GBs of RAM. ;) The eDRAM amount is a leak, but eDRAM for the CPU makes zero sense. That's for caching large datasets with unpredictable access patterns that can't be streamed/cached effectively, which is of virtually no use for a console CPU. Whereas eDRAM for the GPU makes a lot of sense. Hence the assumptions, as the alternative is poor engineering. And as such, GDDR5 is an unnecessary cost. If the rumours are wrong and the eDRAM is a small amount for CPU cache (making its advertising as a feature completely pointless), then GDDR5 for VRAM makes sense, at which point maybe a separate system RAM pool makes sense, as you say.

So you speculate 2 gigs of DDR3, plus eDRAM for the CPU and GPU.
That ends up being more than two gigs. Though Iwata could have been rounding down.
When specifying system RAM amounts, you typically only mention the major RAM pools and not the various caches and buffers. It would be very unconventional to say "2.01 GBs" or "2.032 GBs", not least because we don't have a representation for mixing base-two 'gigas' and 'megas'. I suppose they could say 2080 MBs of RAM (2048 MBs RAM + 32 MBs eDRAM) or whatever.
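
The arithmetic behind that awkwardness, for what it's worth:

```python
# Folding the eDRAM into the headline number gets ugly fast:
mb = 2048 + 32      # main RAM + eDRAM, in base-two MBs
print(mb)           # 2080 MB
print(mb / 1024)    # 2.03125 "GBs" -- hence nobody quotes it this way
```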

Another question: why would Nintendo need a DSP if it's using a GPGPU? I recall in one of the posted videos the AMD rep says that the embedded GPU can replace the DSP.
Efficiency. Every moment you have the GPU working on something that isn't graphics, you eat into graphics time. As a DSP can do audio work very efficiently, you get more bang-per-buck by mixing processor types.

Yes, but they also used eDRAM (like the 360) and 1T-SRAM.

Both the PS3 and 360 only used 2 types of RAM, AFAIR, but an SD console like the Wii used three types.
The Wii only really uses eDRAM because it's a GC. That is to say, an alternative Wii based on something like an ATi 9800 might have featured just a unified system RAM and no eDRAM, or split pools like the PS3 and PC.

Wouldn't the GDDR3 and the eDRAM have been enough for the Wii if they were good enough for the 360? Or the GDDR3 + 1T-SRAM? Why go for the extra expense?
BC. And it's not like Wii was expensive hardware. ;)
 