Wii U hardware discussion and investigation

As far as eDRAM goes I think Nintendo likes using it because it helps provide predictable performance, same reason they tend to like low latency memory and big caches.

simply put, you have to host the tablet framebuffer on it; that eats bandwidth too.
bandwidth is also damn important. some have said the Wuu has a 96-bit bus, but this isn't very clear.
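To put a rough number on the framebuffer cost mentioned above, here's a back-of-envelope sketch. The panel resolution, pixel formats, and overdraw factor are guesses for illustration only, not confirmed Wii U specs:

```python
# Back-of-envelope: bandwidth cost of hosting a second (tablet) framebuffer.
# Assumed figures, not confirmed specs: 854x480 panel, 32-bit color +
# 32-bit depth, 60 Hz refresh, and ~2x average overdraw.
width, height, refresh_hz = 854, 480, 60
bytes_per_pixel = 4 + 4      # color + depth
overdraw = 2.0               # average write passes per pixel (a guess)

gb_per_s = width * height * bytes_per_pixel * overdraw * refresh_hz / 1e9
print(f"~{gb_per_s:.2f} GB/s")   # ~0.39 GB/s: modest, but not free against
                                 # a main-memory budget of ~22 GB/s
```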

if you only have DDR3 you don't match the old consoles: the 360 has eDRAM and the PS3 has a pool of fast CPU memory. the newer GPU's efficiency redeems you, so in the end, yes, you can do a DDR3-and-nothing-else console. as you say, you'd have to pay attention to it.

as for why the Wii U has such a big eDRAM, I'd also say: because they can. transistors are cheaper than power or external bandwidth, and the tech isn't new; IBM was already using it.
 
if you only have DDR3 you don't match the old consoles: the 360 has eDRAM and the PS3 has a pool of fast CPU memory. the newer GPU's efficiency redeems you, so in the end, yes, you can do a DDR3-and-nothing-else console. as you say, you'd have to pay attention to it.

If we do assume a 96-bit bus paired with DDR3-1866, that gives roughly the same theoretical peak bandwidth as the 360. The texture cache in recent Radeons is already considerably larger than the paltry 32 kB in Xenos, so that should help alleviate some of the external bandwidth consumption (filtering, higher-res textures, texture ops, etc.).
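For what it's worth, the arithmetic behind that claim checks out. A quick sketch; the 96-bit/DDR3-1866 configuration is the speculation above, not a confirmed spec:

```python
# Peak theoretical bandwidth = transfer rate (MT/s) x bus width (bytes).
def peak_gb_s(mt_per_s: float, bus_bits: int) -> float:
    return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

wiiu_guess = peak_gb_s(1866, 96)   # speculative 96-bit DDR3-1866 setup
x360_gddr3 = peak_gb_s(1400, 128)  # Xbox 360: 128-bit GDDR3 at 700 MHz (1400 MT/s)

print(f"Wii U (guess): {wiiu_guess:.1f} GB/s")  # ~22.4 GB/s
print(f"Xbox 360:      {x360_gddr3:.1f} GB/s")  # ~22.4 GB/s
```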
 
I don't think it's in Nintendo's interest to stick a cut-down PC into their consoles; they're pretty paranoid about piracy, and they'd be plenty pissed over there in Kyoto if, soon after launch, you could emulate the Wuu on a desktop computer and play your Marios and Zeldas without paying Nintendo a penny. Regular PC CPUs don't support hardware encryption to the level Cell and Xenon do, and which the Wuu's CPU undoubtedly will as well, for starters.
Well, they might be paranoid, but their recent hardware was the easiest to hack.
Their hardware is also pretty standard (say, vs a PS2 or PS3), and there are plenty of emulators for the GC and the Wii; I saw a vid of a Wii emulator running on something as weak as an AMD E-450.

I was not saying that they should have used an existing AMD APU; I used such a piece of hardware (be it some Trinity, or the Llano in my laptop) as proof that DDR3 is all you need to achieve what the PS360 do; modern ROP efficiency is impressive. The main reason for me to pass on AMD is that Jaguar was not ready for launch.
 
You have no solid info on Wii U performance. You're putting a system together and making big assumptions, at times based on little to no fact (like chip size/expense, etc.), and looking at RAM prices on Newegg. Now, most of that's fine from a purely speculative standpoint (albeit flawed in certain areas, like quoting the TDP of mobile parts, which are usually binned). But certainly not when it's used to come to the conclusion that there's definitely something wrong with Nintendo's design decisions. I think the very least you should do is wait until you have all the facts before coming to such a certain conclusion.
I've no idea? I don't agree with that; I have the same info as everybody: I've read the comments of quite a few devs in the MSM, I've seen quite a few vids, etc. So far I see nothing that leads me to think there will be a significant perceived difference between the Wii U and the PS360. So my point, especially as Nintendo is not bent on communicating about its hardware prowess, is that they try to reinvent the wheel while dismissing proven solutions.
They thought IBM's standard CPUs (PPC 47x) were not good enough; they "need" something custom.
They thought AMD's GPUs were not good enough; they "need" something better.
DDR3 is not good enough either, no matter the performance of AMD's APUs.
At some point it's almost kind of disrespectful of the great companies you deal with and the best efforts they put into their products. It's their job, and you come along and say "I know better". They don't care, as they take Nintendo's money anyway, but still.
Then you can look at Nintendo's previous achievements, like the 3DS, and sorry, no matter how it sells, the hardware sucks. The fact that they were losing money at $150 is not a good sign.

As for the TDP, well, both Piledriver and the Stars cores are pretty high-power CPUs. The PPC 47x @ 1.6 GHz burns ~1 W. I don't believe TDP would have been an issue, especially if they had done a great integration job and included advanced power management features.

As far as eDRAM goes I think Nintendo likes using it because it helps provide predictable performance, same reason they tend to like low latency memory and big caches.
Well, as actual devs stated here regarding the GameCube, Nintendo is overly focused on that.
Ultimately the trade-off is eDRAM paired with only 3 pretty slow CPU cores; then there is the amount of RAM, and the possible lack of mass storage (even a few GB for caching would have helped). We are going to see how powerful the GPU is soon enough.
 
so, nintendo is disrespectful of IBM because it asks for something custom and embedded, which is exactly what those IBM cores are about?

what do we know about the console's CPU? I thought we knew it's three IBM low-power cores, and nothing else. could be a stock 470, could be a variant with wider SIMD (bobcat is SIMD-capable; that doesn't mean you want it in a console).

wikipedia says 1.6 W for a 476FP core. that doesn't mean a six-core CPU eats ~10 W, because that figure lacks all the glue: the L2 controller, cache, and interfaces.
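To put numbers on that, a small sketch; the uncore share is a pure assumption for illustration, since no figure for glue, L2, and interfaces is published for such a configuration:

```python
# Why "cores x 1.6 W" is only a lower bound for total CPU power.
core_w = 1.6            # Wikipedia's per-core figure for the 476FP
n_cores = 6             # the hypothetical six-core example above
uncore_share = 0.5      # assumed: glue, L2 + controller, interfaces add ~50%

cores_only = core_w * n_cores
total_est = cores_only * (1 + uncore_share)
print(f"cores only: {cores_only:.1f} W; with assumed uncore: {total_est:.1f} W")
# cores only: 9.6 W; with assumed uncore: 14.4 W
```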
 
so, nintendo is disrespectful of IBM because it asks for something custom and embedded, which is exactly what those IBM cores are about?
Not exactly; it's more about disregarding what the other companies come up with. You know, a bit like those people for whom it's never good enough at a restaurant.
what do we know about the console's CPU? I thought we knew it's three IBM low-power cores, and nothing else. could be a stock 470, could be a variant with wider SIMD (bobcat is SIMD-capable; that doesn't mean you want it in a console).
Well, we don't know for sure, but the word on the street (vs PR) seems to hint at either an overclocked Broadway or a modified PPC 47x.
Including wider SIMD ain't trivial; you have to rework a lot of the data paths.
wikipedia says 1.6 W for a 476FP core. that doesn't mean a six-core CPU eats ~10 W, because that figure lacks all the glue: the L2 controller, cache, and interfaces.
And it's still a far stretch from the CPU cores included in AMD's APUs (Stars or Piledriver), ain't it?

EDIT

I will add a bit more with regard to the first point. I think that Nintendo is not a hardware company and never has been, contrary to, say, Sony. They have their teams, but they are not working on real silicon that often: they worked on the GC, and the Wii was a mere evolution of the GC. They worked on the DS, and then we had even lighter evolutions up to the 3DS. I honestly think that can't fly.
On the other hand you have plenty of companies (TI, Qualcomm, AMD, Intel, Nvidia, Amlogic, etc.) that are working on multiple products on a yearly basis. Some design stuff from scratch, some only integrate different IP on a chip, some do both.
Then you have companies that don't really touch the silicon per se but put plenty of products together on a yearly basis too; here the list is long, with all the vendors of everything from desktops to laptops, sucky tablets to phones, etc.
I honestly question Nintendo's ability to maintain a high enough level of competence within its teams, in those circumstances, versus the competition in both those fields.
I'm close to believing that they would be better off pretty much completely outsourcing the whole process, a bit like Google did with their Nexus line of products.
Pretty much going from very specific requests with regard to the hardware to: here's the R&D budget and the BOM we want to hit; we want a cool system that fits in a tiny form factor; we have strict guidance for the exterior look of the device.
So you end up having the IBM and AMD teams working on their own on what they think is best, with a company like Asus in charge of putting the internals together, for example.

You may doubt my design choices and you might be right ;) (even though I don't think what I describe is a crazy proposal). Still, I'm confident that what AMD and IBM would have come up with would have handily beaten both what they built under what is likely pretty strict guidance from Nintendo and my cheap attempt at it. The reason is simple: they do this for a living on a daily basis; they have knowledge Nintendo can't touch. The same applies to putting the system together: a company like Asus (or others; it's just an example) does this for a living on many different types of devices. It's their business; they do it daily.
I do get that it's tough for Nintendo to give up control, but it's not like they could not hire a few experts (hardware people, but also high-profile game developers) to audit the project.
 
liolio

Before I reply I'd be interested to know what you're basing all of your angst on. You must have some pretty definite info, yes? I mean, you must know the Wii U's specs?
 
I'll be pleased with the Wii U's visuals provided they are on par with the E3 2011 demos: the "Nature/Bird" demo and the Zelda demo. What I am getting at is, I just hope the Wii U's graphical capabilities have not been downgraded from what we saw then. There's no reason to believe they have, correct?
 
I'll be pleased with the Wii U's visuals provided they are on par with the E3 2011 demos: the "Nature/Bird" demo and the Zelda demo. What I am getting at is, I just hope the Wii U's graphical capabilities have not been downgraded from what we saw then. There's no reason to believe they have, correct?

Downgraded? It cannot happen! (I guess.)
 
liolio

Before I reply I'd be interested to know what you're basing all of your angst on. You must have some pretty definite info, yes? I mean, you must know the Wii U's specs?
I know the same as you. It's no angst, by the way.
What I said in the last part of my post applies, I think, to different extents (not in the same amount for both) to Sony and MSFT. Things are getting crazy complex, and there are teams that do this for a living. I don't know if the noise about Oban/Durango is true or false either, but it fits with what I was saying. Putting together something custom that beats what people in the business do for a living (so, to the best of their abilities, as it's a life-or-death matter for their companies) becomes close to impossible. See last gen, with a lot of money invested in a dead end (the Broadband Engine) and an industrial disaster (the RRoD) big enough to kill quite a few companies.

By the way, if you read what I posted as angst, it may be because you have high expectations for the product or a slight bias in favor of Nintendo. Neither of those things is important, or wrong or right, by the way. I basically said that even if the "real Wii U" is in some regards more powerful than the SoC I described, and even has a significant edge over the PS360, it won't change perceptions much. The product is highly unlikely to land in between this gen and the upcoming one.
Core gamers are highly unlikely to jump on the system; the lack of CoD and BF3 will further cement that. The traditional Nintendo audience will, no matter what. And it's a gamble whether the audience they won over with the Wii is going to make the jump.

They could have gone with something more straightforward and most likely cheaper. That's all.

So, to your point, here is the kind of politically correct remark the system gets:
"This is the biggest title we've worked on in a long time. There's a significant technical challenge bringing the game from its original format to the Wii U. It's a new, different and sophisticated piece of hardware, so there's a good deal of technical energy that's been expended making that happen."
I think porting to a simple SoC as I described should have been straightforward. Either the system is overly complicated for what it is supposed to achieve, or they face some bottlenecks. Neither option sounds good, imo, when integrating existing parts from AMD's and IBM's IP catalogs on a SoC linked to DDR3 would have done the job.
 
I'll be pleased with the Wii U's visuals provided they are on par with the E3 2011 demos: the "Nature/Bird" demo and the Zelda demo. What I am getting at is, I just hope the Wii U's graphical capabilities have not been downgraded from what we saw then. There's no reason to believe they have, correct?
Reason to believe, hmm, that's not a nice way to put it.
There has been plenty of noise surrounding the project. After the lukewarm reception of the announcement, Nintendo went ahead to reassure investors that the product would be cheap.
There was a problem with how the Wiiumote connects to the console (solved).
We heard noise about change(s) in the dev kits, from some sources here or in the MSM.
Some leaks in the MSM stated that some devs had to scale down what they were planning to do, for example.
BgAssassin told us that the latest dev kit should be more powerful than the previous ones, etc.

Whether it's been upgraded, downgraded, or didn't move much, the only conclusion I can rationally draw is that the development of this product seems a bit chaotic and rushed.

EDIT
To put this in a more general form that doesn't focus only on Nintendo: this is my feeling about what console manufacturers have been doing since last gen, when every single actor knew they could not go further without real experts (that was IBM, ATI, and Nvidia; even Sony put down the gun).
More or less (it could be even truer this gen), what console manufacturers do is come to expert people/companies and ask them to put together a product that is better and cheaper than what those experts think is the best they can build and what they think is safe to produce (be it in yields, or in design choices like homogeneous vs heterogeneous).
There is definitely something in that approach that is really risky and not that rational.
Where we are now in technological advancement makes that approach even riskier.

Anyway, that's just me; I think that crazy technological bets and super-AAA games are just calling for their respective black swan events. What is unsustainable will fall.
Especially for a company like Sony (deep in financial trouble), I hope they make reasonable choices (sadly not the ones I'd bet on, as I don't expect them to be that reasonable this time either).
 
I'm sorry liolio, sometimes I can't understand exactly what you are trying to say. Must be my English :(

There is definitely something in that approach that is really risky and not that rational.
Where we are now in technological advancement makes that approach even riskier.

You mean it's risky and not rational for MS/Sony/Nintendo to outsource their designs and hope it turns out OK, or the other way around, them designing their own chips hoping it turns out better/cheaper than what AMD/Intel/Nvidia/IBM can give them? I suppose you mean the latter. There isn't really any point in designing your own chips anymore, I think. Releasing new hardware only once every ~6 years or so means you have to put in way more effort and money to come up with something yourself than when you just go shopping around. Besides, if you design something really different from current chips, devs won't like it because it's so different (new way of working, more effort in porting, etc.), and if you design something similar to what AMD/Intel/Nvidia/IBM can make, then why not go there in the first place? It will save you a lot of time and money.
 
No offense to him, but LMB has been "outed" a little bit on GAF. Someone found his GAF posts from 2005, basically going through the same song-and-dance denial that the Wii/Revolution was going to be underpowered. VERY reminiscent of his posts regarding the Wii U today...

Alright, let me clarify a few things. 7 years ago I was still at the undergraduate level at my university. Those posts that "outed" me: where did I claim insider knowledge of the Wii's HD capabilities? I was extrapolating from Nintendo's R&D budget expenditures, comments made by Iwata, Miyamoto, ATI, hardware rumors, etc. I even stated I would wait for the "official specifications sheet." I did not believe that Nintendo would abandon technical parity altogether, especially after that wonderful piece of kit known as the GameCube. I also knew they were partnering with MoSys again, & had initially inquired about 1T-SRAM-Q. It simply did not add up. As for my skepticism regarding Perrin Kaplan, I'm certain that you believe everything that Reginald Fils-Aime says as well. How a 7-year-old post debating the Wii's as-yet-undocumented specs proves that I'm a fraud today is ludicrous.


Where am I saying that the Wii U will be a behemoth hardware-wise? I've consistently said more capable than the "current generation" hardware, but most assuredly behind Orbis & Durango by several magnitudes. This is fact, not song & dance. The Wii U, however, is still a very capable machine; I was simply attempting to help illustrate why. When I stated that UE3 was up & running optimally on Wii U, I was questioned here prior to Batman: AE (Armored Edition) being demoed, iirc. Have I always posted in a manner from which others would clearly infer that I had access to developer contacts or industry information? Of course not; my post history clearly reflects this. Whatever I can access & share now, I try to do so. I've made no outlandish claims; I have no agenda, other than to try & nail down, as definitively as possible, the GPU & system capability. I'm far, far from anything resembling an insider.

The concept and design team at Monolith Soft are so talented that I simply had to share the small tidbit of information I received. Takahashi has even hinted more than once at a Xenoblade sequel, though I suspect he was merely being coy with the media. I stand by my statements; I'm not attempting to mislead anyone. Let the software confirm or refute my legitimacy.
 
Last edited by a moderator:
you don't have to take offense. i don't have a problem with someone being "optimistic" about nintendo hardware. it's just best to have all the info out there so people might know where others are coming from and make their own judgements.
 
Are you sure magnitudes is really the word you wanted to use there?

No, that is indeed too strong a word, especially since the GPU is of modern architecture, design, & capability. I was just making certain that people had no illusions: parity with the next generation is not what I'm attempting to show. No problem, Rangers, I'm not offended.
 
Last edited by a moderator:
Every generation that has an order of magnitude improvement in power gets a notable improvement over last-gen graphics, even if it's just an increase in resolution and texture detail.

That used to be the rule, but I don't think it's true any more. There were still huge strides being made in technology up until the 2005-2007 time frame that have since largely disappeared. I don't think we'll see such improvements, for the same reason that we haven't seen them on the PC. Put simply, even if the PS4 ends up comparatively more powerful than the Wii U, there is still no guarantee that its games will look significantly better. It may even boil down to one running at 720p and the other at 1080p with otherwise (relatively) equal content.

Just contrast a 5-year-old PC game today with a 5-year-old PC game in 2007. Age of Mythology, Morrowind, and BF1942 all looked ancient in 2007, while World in Conflict and Crysis have hardly aged a bit. Or just compare Far Cry to Crysis, with a scant 3 years between them.
 
That used to be the rule, but I don't think it's true any more. There were still huge strides being made in technology up until the 2005-2007 time frame that have since largely disappeared. I don't think we'll see such improvements, for the same reason that we haven't seen them on the PC.
I tend to agree, largely because assets are already so high-fidelity. But there still seems to be a lot of room to grow in open-world games. And the fact is that as machines become more and more powerful, economics, not hardware, will be the chief constraint on asset fidelity. That's already true to some extent this gen. There are lots of games that don't come close to using all the available graphical power of the machines they're on, because the game didn't have a $20m+ budget. You didn't see that so much last gen.
 