Wii U hardware discussion and investigation *rename

You can find a 2GB DDR3 1333MHz stick for $10 at retail, so I doubt the price at millions of units would be more than that. The safest bet would be around $6, but $10 is the absolute maximum, unless they're going with much faster DDR3.

Ok, I was just basing it off the chart. I also expect them to at least use faster DDR3 if they go this route.
 
I think it would be fun to see what people's lowball expectations for the WiiU are. At the very outside, I think it might be possible for Nintendo to get away with a SoC with something like 160 shaders, 12 TMUs and 8 ROPs; at something like 700 MHz that would fit the rumour of "fewer shaders" while still giving you about the same minimum performance. I'm hoping for something better though.

12 TMUs would make more sense with 240 shaders, i.e. half a "Turks" GPU. I think their "Caicos" GPU has 160 shaders but only 8 TMUs (and that's with a 67 mm² die). At least on PC, this tiny GPU seems to perform better than the 2005-era GPUs in newer games.
 
12 TMUs would make more sense with 240 shaders, i.e. half a "Turks" GPU. I think their "Caicos" GPU has 160 shaders but only 8 TMUs (and that's with a 67 mm² die). At least on PC, this tiny GPU seems to perform better than the 2005-era GPUs in newer games.

You guys need to wake up... you're being tricked!! A console can't be made to be less powerful than the 360 in 2012... it's impossible. :p

Seriously... 12 TMUs?? RSX had what, 24? Xenos 16? Seriously?
160-240 shaders?? No chance. How would that system compare to an S4 Pro with an Adreno 320 and 4x Kraits @ 2.5GHz? Don't get me wrong, I don't think it would beat the Snapdragon, but it wouldn't miss by a million miles either...

If Ninty was seriously going to be this stingy, they could have had Nvidia make them a custom Tegra 4 layout or something. Even that S4 Pro could be overclocked... I don't believe the rumours; it's nonsense for 2012.
 
Perhaps I'm looking at the 4cm fan and 45nm things a bit too hard, but given just how low I think Nintendo need the BoM to be, and how limited the cooling will be, I'm just not expecting that much. I think it can get into the same ballpark as the 360 while drawing substantially less power, because they should be able to save loads of power on the CPU by clocking lower.
-------------
BTW I'm expecting lower peak flops from the WiiU CPU, but that it will be more resistant to code that makes the 360 bog down (OoOE, bigger L2, possibly higher L2 bandwidth, lower-latency memory access, etc.). Super-optimised 360 stuff might pose a problem, though, I guess.
-------------
At the time I thought the tech demos looked nice but couldn't see what was "next level" about them. Screen grabs showed that Link in the Zelda demo was actually really simple and low detail - way, way below a Mass Effect character, for instance. It's also possible that early dev kits may have been more powerful in some ways than the final system. As with the Xbox 360, perhaps final clocks won't quite live up to early expectations.
-------------
Maybe it won't be as easy to just throw a PS360 game at the WiiU and have it gobble it up as people were expecting, but with sufficient work the ports will probably be fine. I guess that could put off some publishers that aren't expecting great market penetration with the hardcore.
-------------
I think it would be fun to see what people's lowball expectations for the WiiU are. At the very outside, I think it might be possible for Nintendo to get away with a SoC with something like 160 shaders, 12 TMUs and 8 ROPs; at something like 700 MHz that would fit the rumour of "fewer shaders" while still giving you about the same minimum performance. I'm hoping for something better though.
I think that you are right (after reading the rumors about Nintendo being out for cost reduction).

For the GPU I pretty much agree with Rangers that a downclocked RV730 would do the trick.
Actually, more like half an RV740 than an RV730, as the latter is a bit "strange" (eight 40-wide SIMDs instead of the usual 80-wide, with a lot of texturing power compared to the later architecture).

I would discard GDDR5 as it is 3 times more expensive than DDR3, at least in large quantities. So eDRAM may have a role to play in rendering (more on that later).

Overall I would agree with you that 2 SIMDs may be enough. Still, if it's a SoC, and assuming low-power CPU cores (four PowerPC A2 cores are ~9 watts, and a CPU derived from IBM's embedded parts should be in the same ballpark or lower), I would think that 700MHz is too high. Low-power Llano SKUs (using a better process) are clocked in the 450MHz range.

But overall, along with the fact that the specs seem to be a moving target, I'm close to dismissing a SoC altogether. SoCs are complex designs; there's no playing with the specs that much.

According to AMD's own numbers, the HD6450 has a BOM of $38 (I believe that's the version with 512 MB of DDR3). It's a tiny chip, 67 mm².

I can't see Nintendo going with a chip (for me a 45nm one, speaking of the CPU) big enough to include enough eDRAM to make it relevant for rendering. Not to mention the complication of having the GPU access this eDRAM.

As I see it, the CPU chip could cost as much as, if not more than, the GPU (I know that AMD's price is for the whole card including RAM). At this point I would assume that the CPU will be less than 100 mm² too. I would (again) discard eDRAM as being relevant to rendering.

So putting it all together, I could definitely see Nintendo shipping something like this:
CPU:
Tri-core from the embedded line.
OoO, no SMT
2MB of eDRAM (the L2 cache, basically)
CPU speed could be set anywhere between, say, 1.5 and 2GHz
64-bit bus to 512MB of RAM.
RAM would be DDR3, clock speed anywhere between 533 and 800 MHz
Bandwidth to the main RAM would be anywhere between 8.5GB/s and 12.8GB/s
Between 10 and 15 watts

(I'm using the info from this page because I believe that if that kind of budget ram is relevant to AMD low end gpu it is to Nintendo too).
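For what it's worth, that bandwidth range falls straight out of the bus width and DDR3's double data rate. A quick sketch (the function name is mine, purely illustrative):

```python
def ddr3_bandwidth_gbs(io_clock_mhz: float, bus_width_bits: int = 64) -> float:
    """Peak DDR3 bandwidth: bus width in bytes times the
    double-pumped (2 transfers per clock) effective data rate."""
    bus_bytes = bus_width_bits // 8
    return bus_bytes * 2 * io_clock_mhz * 1e6 / 1e9

print(ddr3_bandwidth_gbs(533))  # ~8.5 GB/s at the low end
print(ddr3_bandwidth_gbs(800))  # 12.8 GB/s at the high end
```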


GPU:

Caicos/ HD 6450 / HD 6400M
160 Stream Processing Units
8 Texture Units
16 Z/Stencil ROP Units
4 Color ROP Units.
64-bit bus to 256MB of VRAM
RAM would be GDDR5, clock speed 800MHz.
Bandwidth to the VRAM would be 25.6GB/s
I'm using AMD's own data (the same as above), and I chose the slowest option for the GDDR5 (for price and memory-controller power consumption).
Then there is the clock speed. Anand puts the HD6450's TDP at 27 watts @ 750MHz with 900MHz GDDR5. That's too high. So I've already cut the RAM speed; I would also cut the chip clock speed.
I may put the clock around 500MHz (it could end up lower).
I would not be surprised if the power budget for the GPU is between 15 and 20 watts. At 650MHz the HD5450 is almost 20 watts, and that's with DDR3. So taking into account the power cost of GDDR5, sadly ~500MHz sounds right. That's 160 GFLOPS.
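The 25.6GB/s and 160 GFLOPS figures check out with the same back-of-the-envelope math: GDDR5 moves four transfers per I/O clock, and each stream processor can retire one multiply-add (2 FLOPs) per cycle. A sketch with helper names of my own:

```python
def gddr5_bandwidth_gbs(io_clock_mhz: float, bus_width_bits: int = 64) -> float:
    # GDDR5 is quad-pumped relative to its I/O clock
    return (bus_width_bits // 8) * 4 * io_clock_mhz * 1e6 / 1e9

def peak_gflops(stream_processors: int, clock_mhz: float) -> float:
    # one fused multiply-add = 2 FLOPs per SP per cycle
    return stream_processors * 2 * clock_mhz * 1e6 / 1e9

print(gddr5_bandwidth_gbs(800))  # 25.6 GB/s on a 64-bit bus
print(peak_gflops(160, 500))     # 160.0 GFLOPS at ~500MHz
```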

Both chips could be connected by a PCI Express x8 link.
------------------------

That's pretty much it. eDRAM is too complicated for rendering, especially as the WiiU controller ups the framebuffer requirement (two of them: the TV screen and the WiiU controller screen).

I know fans will try to kill me, but I believe it makes a lot of sense. 256MB may make up for the higher cost of GDDR5 vs DDR3, and Nintendo needs that bandwidth.
512MB freed from the framebuffer requirement should be an improvement over the PS3.
The system would consist of two tiny, cool chips; max TDP would be 35 watts. Passive cooling and a single fan in the box for the power supply, the chips, etc. should do the trick.
The cost, using the numbers given by AMD and extrapolating for the CPU, would be north of $80 including the memory chips.
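Tallying those guesses (only the GPU BoM is AMD's published number; the CPU-side cost is my own placeholder extrapolation):

```python
# Speculative budget tally for the system sketched above.
gpu_bom_usd = 38        # AMD's figure for the HD6450 card incl. RAM
cpu_bom_usd_guess = 45  # hypothetical, picked so the total lands "north of $80"
gpu_tdp_w, cpu_tdp_w = 20, 15  # upper bounds of the ranges above

print(gpu_bom_usd + cpu_bom_usd_guess)  # total BoM, north of $80
print(gpu_tdp_w + cpu_tdp_w)            # 35 W max TDP
```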

Now I'm ready to face the fans hatred :)

EDIT
For the sake of a Beyond3D article I might push the number of SIMDs to three, but it would be "much ado about nothing" to go custom vs an off-the-shelf part for such low gains.
 
Target specs.
Sorry, but which target specs?
Because wrt the upcoming systems, the only thing we know for sure is that the WiiU CPU might be a tri-core made by IBM that includes eDRAM.
The rest (for Nintendo, Sony, Microsoft) is plain speculation and rumors for now.
 
Oh yes, from which reliable source? The same one(s) that touted that the GPU would sport 800 stream processors?

If this refers to me, that was me basing that on what the dev kit started out with. The 1-1.5GB is from Nintendo's actual target specs.

Also, it will have 3MB of L2 cache split asymmetrically among the cores, and 32MB of eDRAM capable of 720p w/ 4x MSAA, or 1080p in a single pass.

But yeah what you're saying spec-wise is very off.
 
If this refers to me, that was me basing that on what the dev kit started out with. The 1-1.5GB is from Nintendo's actual target specs.

Also, it will have 3MB of L2 cache split asymmetrically among the cores, and 32MB of eDRAM capable of 720p w/ 4x MSAA, or 1080p in a single pass.

But yeah what you're saying spec-wise is very off.
Really? Seeing that the trend is generally moving towards deferred rendering, that won't be enough for 4x MSAA.

Where did you even get those specs? Of the stuff shown at E3, nothing had any AA and all of it was 720p native, not 1080p.
 
If this refers to me, that was me basing that on what the dev kit started out with. The 1-1.5GB is from Nintendo's actual target specs.

Also, it will have 3MB of L2 cache split asymmetrically among the cores, and 32MB of eDRAM capable of 720p w/ 4x MSAA, or 1080p in a single pass.

But yeah what you're saying spec-wise is very off.
I was not particularly referring to you, hence the (s) at the end of "same" ;)

I would be surprised if that turns out to be true.

3MB of cache is not an issue with eDRAM. But I don't get the "split asymmetrically" thing. If there were more than 3 cores it could make sense, but it would still be weird.
IBM stated that there is eDRAM in the WiiU CPU; that's an official statement.

32MB of eDRAM I don't buy either. 1) It would be costly. 2) It's a flat-out bad decision vs more raw power (especially given their own recent comments on the matter). 3) 32MB is perfect for 4x AA @ 720p, but too bad there is also the WiiU screen. All this sounds like made-up numbers that ring well, to me at least. The 360 came with 10MB, which was 'odd' vs either the 720p or 1080p rendering requirements (for forward rendering). Like the Petrucci song "mind carrying the solo", I think "production cost carrying the design", not nice figures.

Another point: with the kind of power the WiiU may end up with, whether it's just below, on par with or just above the PS360, I'm not sure that 1 or 1.5GB would make a difference; it would just cost more.
A 256MB increase in main RAM vs the PS3 may do the trick as far as convenience for the devs is concerned.

I don't buy it, sorry. We should know soon. I would happily be proved wrong, as my wife has kind of green-lit buying anything Nintendo, but I won't; I'm picky ;) (and I don't care for their exclusives).


Overall, with that kind of specs and "target specs", I see no reason for a real Nintendo spokesperson, not a rumor or a leak, to come out of the woodwork and state what he stated.
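As a sanity check on the 4x AA @ 720p point: a simple framebuffer-size estimate does land just under 32MB for 720p with 4x MSAA, and 1080p without AA fits easily (assuming 32-bit colour plus 32-bit depth/stencil per sample, a common but unconfirmed configuration):

```python
def framebuffer_mib(width: int, height: int, samples: int = 1,
                    bytes_per_sample: int = 8) -> float:
    """RGBA8 colour + D24S8 depth/stencil = 8 bytes per sample."""
    return width * height * samples * bytes_per_sample / (1024 ** 2)

print(framebuffer_mib(1280, 720, samples=4))  # 28.125 MiB: fits in 32MB
print(framebuffer_mib(1920, 1080))            # ~15.8 MiB: 1080p, no AA
```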
 
I was not particularly referring to you, hence the (s) at the end of "same" ;)

I would be surprised if that turns out to be true.

3MB of cache is not an issue with eDRAM. But I don't get the "split asymmetrically" thing. If there were more than 3 cores it could make sense, but it would still be weird.
IBM stated that there is eDRAM in the WiiU CPU; that's an official statement.

32MB of eDRAM I don't buy either. 1) It would be costly. 2) It's a flat-out bad decision vs more raw power (especially given their own recent comments on the matter). 3) 32MB is perfect for 4x AA @ 720p, but too bad there is also the WiiU screen. All this sounds like made-up numbers that ring well, to me at least. The 360 came with 10MB, which was 'odd' vs either the 720p or 1080p rendering requirements (for forward rendering). Like the Petrucci song "mind carrying the solo", I think "production cost carrying the design", not nice figures.

Another point: with the kind of power the WiiU may end up with, whether it's just below, on par with or just above the PS360, I'm not sure that 1 or 1.5GB would make a difference; it would just cost more.
A 256MB increase in main RAM vs the PS3 may do the trick as far as convenience for the devs is concerned.

I don't buy it, sorry. We should know soon. I would happily be proved wrong, as my wife has kind of green-lit buying anything Nintendo, but I won't; I'm picky ;) (and I don't care for their exclusives).


Overall, with that kind of specs and "target specs", I see no reason for a real Nintendo spokesperson, not a rumor or a leak, to come out of the woodwork and state what he stated.

Haha. Ok then, as I was saying, 640-800 ALUs.

That said, if you don't want to believe actual target specs, I don't know what to tell you. But a good portion of that came from lherre, whom we know has a dev kit.

He told us the cache was split asymmetrically and that one core essentially works as a "master core" and has more cache than the other two. This means that core has at least 1.5MB.

He said it also had a memory range; based on what he said, it indicated they were going with at least 1GB, but aiming for the max. I got the actual numbers (1-1.5GB) from someone else.

The eDRAM amount came from multiple places and all said the same thing.

If you notice, Nintendo seems to be going for 3x the cache, eDRAM, and system memory of the 360.

The problem is that you're calling it unbelievable based on your assumption about the power, and saying they won't do it because you don't believe the system will have the power to justify it, when those figures came from Nintendo themselves. I can only say that, by your own logic, I guess it will have the power to justify those things.

The thing is, Nintendo didn't give GPU specs early on and still don't seem to have fully given them, if at all. They only gave a codename for the GPU. That I also confirmed from multiple places. But looking at the dev kit, you're proposing something weaker than the underclocked GPU in the early kit. At least Rangers was more in line with that.
 
Like Rangers, I was expecting something that roughly matches an RV730, or more like half an RV740; that was a year ago.
I find it tough to believe; it doesn't add up at all with the noise around it, including the noise made by Nintendo. That's all from me on the topic. W&S.

EDIT
Maybe if you come up with a nice photoshop I'll change my views (cf. the rumors of the day) :LOL:
 
Really? Seeing that the trend is generally moving towards deferred rendering, that won't be enough for 4x MSAA.

Where did you even get those specs? Of the stuff shown at E3, nothing had any AA and all of it was 720p native, not 1080p.

Oh I definitely agree those demos didn't have AA. As for the specs, I wasn't the first to post them in this thread, but I have been able to confirm them.

http://forum.beyond3d.com/showthread.php?p=1572779#post1572779

Like Rangers, I was expecting something that roughly matches an RV730, or more like half an RV740; that was a year ago.
I find it tough to believe; it doesn't add up at all with the noise around it, including the noise made by Nintendo. That's all from me on the topic. W&S.

EDIT
Maybe if you come up with a nice photoshop I'll change my views (cf. the rumors of the day) :LOL:

Haha. I'll get right on the 'shop. But it seems that with the GPU, the key is figuring out what kind of Nintendo tweaks we're looking at. It seems like we won't be able to gauge the GPU's power in a traditional sense.
 
Sorry, but which target specs?
Because wrt the upcoming systems, the only thing we know for sure is that the WiiU CPU might be a tri-core made by IBM that includes eDRAM.
The rest (for Nintendo, Sony, Microsoft) is plain speculation and rumors for now.

Actually, it's also known for a fact that the WiiU GPU is a custom AMD Radeon HD design ;)
 
For what it's worth, even an HD5450 with 256MB of VRAM would qualify as a custom part :LOL:

That's me being in a "bad spirit"; still, we haven't heard any noise about that part being taped out. At the same time, I can understand the usual PC rumor sites having no interest in the matter.
 
For what it's worth even a hd5450 with 256MB of vram would qualify as a custom part :LOL:
Yes, but an AMD spokesperson said in an interview with Golem (a large German tech site) at E3 2011 that the GPU was not based on any existing Radeon chip.
 