Wii U hardware discussion and investigation *rename

No idea why they'd reserve 1GB of actual RAM; I'm wondering if that 2nd GB isn't some kind of flash store or somesuch as it makes no sense to reserve 50% of the DRAM.
 
Come on, man. You've got to stop mentioning me with stuff I haven't said.

http://www.neogaf.com/forum/showpost.php?p=39504036&postcount=6121

http://www.neogaf.com/forum/showpost.php?p=39311994&postcount=4576

http://www.neogaf.com/forum/showpost.php?p=41938848&postcount=6761

http://www.neogaf.com/forum/showpost.php?p=37795868&postcount=7756

You may not think it's an exact e6760 (then again, everything you think is apparently some big secret; I can't read minds), but whatever. You are the genesis of where I have heard "e6760" and "Wii U" together repeatedly.
 
160-240 shaders at 500~700MHz would most likely be unable to produce even XB360-level detail while also rendering the 480p screen; it's higher than that for sure, especially taking into account the reports of several multiplatform titles actually running at 1080p + 480p on Wii U.

We've heard repeatedly that newer architectures are much more efficient than Xenos and that fewer shaders can yield much better results. The figure of twice as efficient has even been thrown around for the latest ATI architectures. Even lowballing it at 1.5x the efficiency, 240 shaders @ 600 MHz (for example) would easily swallow up a 360 port while rendering a completely different image on the Wumote, or could get you towards double the resolution.
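The shader arithmetic above can be sketched in a few lines. This assumes the usual 2 flops per ALU per cycle (one multiply-add); the 500 MHz Xenos clock and the 1.5x efficiency multiplier are the figures from the discussion, not confirmed specs:

```python
# Rough peak-FLOPS comparison behind the efficiency argument above.
# Assumes each shader ALU retires one multiply-add (2 flops) per cycle.

def peak_gflops(shaders, clock_mhz, flops_per_cycle=2):
    return shaders * flops_per_cycle * clock_mhz / 1000.0

xenos = peak_gflops(240, 500)   # Xbox 360 Xenos: 240.0 GFLOPS
wiiu  = peak_gflops(240, 600)   # hypothetical Wii U part: 288.0 GFLOPS

# Apply the post's lowball 1.5x architectural-efficiency multiplier
effective = wiiu * 1.5          # 432.0 "Xenos-equivalent" GFLOPS
print(xenos, wiiu, effective)
```

Even under this conservative assumption, the hypothetical part lands well above Xenos in effective throughput, which is the post's point.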

RSX creaks along happily at Xbox 360 levels despite having zero fillrate and triangle setup of 4 tris a calendar month, so Wii U should be able to push 360 graphics at higher resolutions and/or to the Wublet if it has the shader power.

The requirements of the second screen depend entirely on what you're doing on it, too.
 
No idea why they'd reserve 1GB of actual RAM; I'm wondering if that 2nd GB isn't some kind of flash store or somesuch as it makes no sense to reserve 50% of the DRAM.

If Nintendo want to allow off-telly play on the tablet while still allowing the full range of WiiU on-telly dashboard-esque services (movie streaming, web browser, the facebook etc), or if they wanted the tablet to be able to access all that stuff while playing a game on the telly, then they'd need a large chunk of constantly reserved memory.

I can see the nextbox reserving a GB or more memory too.
 
The requirements of the second screen depend entirely on what you're doing on it, too.
Important plot point. It'll be rare for a AAA game port to do a full 3D render on the second screen, so the impact shouldn't be too great on most titles. There's certainly no particular reason to shift resources away from the main screen and to the Wuublet when a simpler 2D representation can do the same job.
 
I thought the same thing at first, but perhaps Nintendo has bigger long-term plans for the second screen. Currently we are all probably imagining scenarios where it's one person using the system with one tablet. If Nintendo sees this as the central entertainment hub in the home for the entire family, they could try to promote multiple family members using the system at the same time. For example, one person using the tablet to browse the web while another uses a Pro controller to play a full game on the TV.
 
If Nintendo want to allow off-telly play on the tablet while still allowing the full range of WiiU on-telly dashboard-esque services (movie streaming, web browser, the facebook etc), or if they wanted the tablet to be able to access all that stuff while playing a game on the telly, then they'd need a large chunk of constantly reserved memory.
You don't need a GIGABYTE of memory to stream some effing video or show effing facebook. That's ridiculous.

Anyway, why should Nintendo reserve memory for these fringe activities on a GAMES device when these activities severely harm the performance of games on said GAMES device? I'm wholly 100% against that.

They didn't even allow DVD video playback on the Wii (nor on the Wuu, I believe) because they said the primary purpose of the Wii was games and people have DVD players already. Well, hell's bells: people have iPads and PCs and stuff for facebook, and DVRs and PS360s and Apple TVs and such for video streaming.

If a gig of actual RAM is reserved in Wuu then it's near 100% wasted.
 
I think the success of Netflix on Wii may have changed their mind. They seemed to put a lot of time into showing off TVii in their presentation.
 
Currently we are all probably imagining scenarios where it's one person using the system with one tablet. If Nintendo sees this as the central entertainment hub in the home for the entire family, they could try to promote multiple family members using the system at the same time. For example, one person using the tablet to browse the web while another uses a Pro controller to play a full game on the TV.

I think the success of Netflix on Wii may have changed their mind. They seemed to put a lot of time into showing off TVii in their presentation.
So Wii U is being sold as a £200 console and tablet replacement, instead of people buying a PS360 and a tablet? I don't think that's a good strategy, as it effectively renders the Wuublet controller irrelevant for games. That is, you'll have to buy a Pro controller to be able to play on the console without any Wuublet support while someone else is browsing the web on that capacitive screen. For any existing PS360 gamer, buying a £200 Nexus 7, a £130/160 Kindle Fire (HD), or even a generic £70 tablet off eBay is going to be a far better option.

I find it hard to believe that the Wuublet is intended to be used extensively for non-gaming. TBH I don't think Nintendo quite know how to use it. On the one hand it's there for interactivity with games, but on the other they also envisage playing the console game on the handheld, and then the controller being used by someone else while you need a conventional controller to play a conventional game on a TV. It's very inconsistent and vague. I wonder where developers are on this, and how much effort they'll put into supporting Wii U's USP when their games also have to work as ordinary games on a standard controller. Sinking a lot of effort into a feature that goes unused because the controller is being used to watch Netflix would be a frustrating waste of effort.
 
I don't know, Shifty. I'm as confused by the 1GB for OS as everyone, but it's there, so I'm just brainstorming and throwing out possible ideas, not defending Nintendo or taking a position.
 
http://www.neogaf.com/forum/showpost.php?p=39504036&postcount=6121

http://www.neogaf.com/forum/showpost.php?p=39311994&postcount=4576

http://www.neogaf.com/forum/showpost.php?p=41938848&postcount=6761

http://www.neogaf.com/forum/showpost.php?p=37795868&postcount=7756

You may not think it's an exact e6760 (then again, everything you think is apparently some big secret; I can't read minds), but whatever. You are the genesis of where I have heard "e6760" and "Wii U" together repeatedly.

Then say it started with me, even though someone else found it and told me about it. Don't sit there and say I said it would be an E6760. I'm glad you found those posts so you can see for yourself how you twist my posts (you still conveniently left out the posts where I tell people who say it's based on the E6760 that it's not). And now you're accusing me of things again, and I don't know where you get it from. What are all these things I'm supposedly treating as some kind of big secret? The things I get from people that I don't want to get in trouble? If you're going to keep doing this, you need to stop mentioning me in your posts. Being accused of things I don't do is probably my biggest pet peeve. For someone who, a long time ago, talked about how posting here is not the same as fanboy posting on GAF, you haven't done a good job living up to your own words.

Anyway, on topic: I made a post on GAF about what I think Wii U's clocks look like, with a small addition about the memory.

Well, to help you see where I'm coming from, I'm expecting the GPU to have 640 ALUs. I also expect Nintendo to continue to use clock multiples. Since we know the DSP (at least originally) is 120MHz, I expect the GPU to be 480MHz, which would put it at 614.4 GFLOPs. Applying that to all the components, I could see it like this:

DSP - 120MHz
GPU - 480MHz
DDR3 - 240MHz (or 960MHz depending on how you prefer to look at it; DDR3-1920)
CPU - 1440MHz, 1920MHz, or 2400MHz
eDRAM - ? (couldn't find clock speed info)
As for the memory split, considering how the target specs originally said 1 or 1.5GB for games I'd assume that Nintendo plans to cut the OS allotment in half down the road.
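The clock-multiple idea in the list above can be checked with a short sketch. These multiples and the 640-ALU count are the poster's speculation, not confirmed Wii U specs; the GFLOPS figure again assumes 2 flops per ALU per cycle:

```python
# Sketch of the clock-multiple speculation: every clock is an integer
# multiple of the 120 MHz DSP base clock. Figures are guesses from the
# post, not confirmed hardware specs.

BASE_MHZ = 120

clock_multiples = {
    "DSP": 1,    # 120 MHz
    "GPU": 4,    # 480 MHz
    "DDR3": 2,   # 240 MHz base clock (DDR3-1920 effective)
    "CPU": 12,   # one of the guessed options: 1440 MHz
}

for name, mult in clock_multiples.items():
    print(name, BASE_MHZ * mult, "MHz")

# 640 ALUs * 2 flops/cycle * 480 MHz = 614.4 GFLOPS, matching the post
gflops = 640 * 2 * (BASE_MHZ * clock_multiples["GPU"]) / 1000.0
print(gflops)
```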
 
We've heard repeatedly that newer architectures are much more efficient than Xenos and that fewer shaders can yield much better results. The figure of twice as efficient has even been thrown around for the latest ATI architectures. Even lowballing it at 1.5x the efficiency, 240 shaders @ 600 MHz (for example) would easily swallow up a 360 port while rendering a completely different image on the Wumote, or could get you towards double the resolution.

I think you're right. I think the WiiU's GPU performance comes from improved architectures and not raw power.

Consider this: the WiiU will draw around ~50W from the wall (based on the PSU numbers), whereas the PS360 are closer to 90-100W. Both (assuming things didn't change for the WiiU) are currently built on the 40/45nm node. If theoretical performance between the three platforms was 1:1, the WiiU would already have roughly double the performance/watt. If it's got 4x the performance/watt of the previous consoles on the same process node, that's a huge achievement in itself. That much may be possible, but I'm not sure anything beyond it is.
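The back-of-envelope perf/watt claim above works out like this. The wattages are the approximate figures from the post, and performance is set equal on both sides (the "1:1" case) purely to isolate the power ratio:

```python
# Perf/watt comparison from the post: ~50 W Wii U vs ~95 W PS3/360,
# both on the 40/45 nm node. Performance values are placeholders set
# equal (the hypothetical 1:1 case) to expose the power ratio alone.

wiiu_watts, ps360_watts = 50.0, 95.0
wiiu_perf = ps360_perf = 1.0   # assumed equal theoretical performance

ratio = (wiiu_perf / wiiu_watts) / (ps360_perf / ps360_watts)
print(round(ratio, 2))  # ~1.9: roughly double the perf/watt
```

So even before any architectural gains, matching PS360 performance at Wii U's wall draw already implies about a 2x perf/watt advantage, which is why the post treats 4x as the optimistic ceiling.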
 
Then say it started with me, even though someone else found it and told me about it. Don't sit there and say I said it would be an E6760. I'm glad you found those posts so you can see for yourself how you twist my posts (you still conveniently left out the posts where I tell people who say it's based on the E6760 that it's not). And now you're accusing me of things again, and I don't know where you get it from. What are all these things I'm supposedly treating as some kind of big secret? The things I get from people that I don't want to get in trouble? If you're going to keep doing this, you need to stop mentioning me in your posts. Being accused of things I don't do is probably my biggest pet peeve. For someone who, a long time ago, talked about how posting here is not the same as fanboy posting on GAF, you haven't done a good job living up to your own words.

Anyway, on topic: I made a post on GAF about what I think Wii U's clocks look like, with a small addition about the memory.
Good luck BG, I think you're going to need it. Maybe time to switch to a new handle if things don't play out well once the Wii U is out and the true specs come out...

One thing I've learned: if you get some inside info, or just info that's not out there, it's best to keep it to yourself. When you go around with second- or third-hand info, it can lead to bad things. Keep your head up!
 
As for the memory split, considering how the target specs originally said 1 or 1.5GB for games I'd assume that Nintendo plans to cut the OS allotment in half down the road.

Or they may be keeping a healthy margin for future additions to their services, including what they may not be able to fully foresee. While keeping 1GB reserved for OS services seems extravagant, we need to remember that the reason why you reserve memory space at all is to ensure dependably responsive and glitch free operation. It's a worthy goal. And it still leaves three times more memory for games than the PS360 do.
 
I think all that the GB of memory for the OS means is that Nintendo think the non-gaming functions are important.

If I were speculating on N's decision making process, I'd suggest that their entire strategy is based on being on par with 360/PS3, but offering unique elements to differentiate their product.
I suspect they want the tablet to behave like a tablet when you're not playing a game, or perhaps even when you are, that means it has to run "apps" and be competitive with real tablets doing so.

Having said that I won't be buying one.
 
Well, even if it is a "full tablet", I still think a gigabyte is a lot. Most (non-very-high-end) phones have less. Even the very high-end ones usually don't have that much, because they use some of that RAM exclusively for the GPU (there aren't many phones with more than a gigabyte of RAM). And those run a FULL OS underneath, not just an app-launcher frontend plus apps. Well... all in all, the Wii U does run a full OS too, but very likely a very limited one compared to Android.

As I've said before, I recently bought a Raspberry Pi and I have OpenELEC running on it (an HTPC OS with an XBMC frontend). And that means just 128MB of RAM for the OS (HD video decoding takes away 128MB of the RAM for the GPU). Sure, it's not as flexible (you can install plugins to view webpages etc, but it's very limited), but it's also a full OS.

And it's not like Nintendo needs the system to have the ability to open 8 tabs in the browser, which all have a flash player open etc...
 
I've been banging this drum since the IBM press release last E3, shouting into the wind about IBM's wording, the history of the players involved, the lack of GPU fab being touted while IBM were chest beating, the performance of the platform, the power envelope, the ... 4cm fan. Etc, etc.

160~240 shaders at 500~700 MHz on a SoC. It could still happen. It. Could. Still. It's the dream. Embrace everything that you are, Nintendo.

And a handful of us merrie men called it on the power consumption as soon as we saw the fan. Never deny the truth presented by the gloriously small Nintendo case fan.

Yes! But I don't see the WiiU as a SoC. I see more signs that it's a SiP.
Possibly even a 2.5D or 3D IC.


Recent statement by Robert Patti from Tezzaron:
But consider this: “Nintendo’s going to build their next-generation box,” said Patti. “They get their graphics processor from TSMC and their game processor from IBM. They are going to stack them together.”


What does Tezzaron specialize in?
We specialize in 3D wafer stacking and TSV processes, cutting edge memory products, and wide-ranging collaborations. In 2004 Tezzaron demonstrated the world's first successful wafer-stacked 3D-ICs with TSV,

Now, why would IBM be bragging about their new WiiU CPU if the CPU is not better than
the 360’s Xenon? Broadway x 3?

Here is what IBM said:
The all-new, Power-based microprocessor will pack some of IBM's most advanced technology into an energy-saving silicon package

Wiki:
TSVs are a high performance technique used to create 3D packages and 3D integrated circuits, compared to alternatives such as package-on-package, because the density of the vias is substantially higher, and because the length of the connections is shorter.

A 3D package (System in Package, Chip Stack MCM, etc.) contains two or more chips (integrated circuits) stacked vertically so that they occupy less space and/or have greater connectivity. An alternate type of 3D package can be found in IBM's Silicon Carrier Packaging Technology, where ICs are not stacked but a carrier substrate containing TSVs is used to connect multiple ICs together in a package.


Patti:
“SOCs put a huge burden on processing. If you need flash, DRAM, and a processor on the same chip, you have a fab nightmare.” The cost burden is huge, he pointed out.

Another advantage is density: Four layers of 45nm circuitry take about the same space as one 22nm device. "By virtually any measure, development cost, fab facility cost, even piece part, 3D wins." Patti said.

A third advantage is power. "If we assume high-k gates, and lower transistor leakage, most of the power is left in the charging and discharging of the wire," Patti said. "Make shorter wires [and] you get lower power. In our memories we make the wires half as long and we get a 40% reduction in per bit power."
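Patti's density claim above can be sanity-checked with first-order scaling: transistor area shrinks roughly with the square of the linear feature size, so stacking about four 45nm layers matches one 22nm layer in area. A rough approximation, ignoring real-world scaling losses:

```python
# First-order sanity check on the "four layers of 45 nm = one 22 nm
# device" density claim: area scales roughly with the square of the
# linear feature size.

layers_equivalent = (45 / 22) ** 2
print(round(layers_equivalent, 1))  # ~4.2 layers of 45 nm per 22 nm layer
```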



If that’s the case, and Nintendo has been putting a lot of development into making a cool-running console, how can the TDP be an accurate measure of the console’s performance? That’s like saying the original Xbox 360 is more powerful than the Xbox 360 slim because it uses more watts, isn’t it?

[Image: Xbox 360 power brick comparison]


We know that’s not the case; as a matter of fact, the slim can outperform the original.

As a matter of fact, according to http://www.electroiq.com the Vita can be seen as a SiP:
By combining the processor with the different memories in the same package in the Vita, Sony and Toshiba have produced one of the few true system-in-package (SiP) parts that we have seen. And I would call it 3D, even though industry convention is now restricting that term to TSV-based parts – so it’s not 3D, in our current argot.
http://www.electroiq.com/blogs/chip...vita-uses-chip-on-chip-sip-3d-but-not-3d.html
 