Wii U hardware discussion and investigation *rename

What about the possibility that devs have been making games without final silicon?
Well, there have been some comments from developers, more harsh than sweet; if non-final silicon were the issue they would have slipped some words about it.
Either way, Nintendo seems to have some kind of weird NDA that only allows negative comments about their system.
It would not be a hell of a statement for a dev to say, for example, that the CPU is weaker, the GPU a bit better, and AA should be pretty much free.
 
Nothing has been confirmed up to this point other than 2 GB of RAM. ;) The eDRAM amount is a leak, but eDRAM for the CPU makes zero sense. That's for caching large datasets with unpredictable access patterns that can't be streamed/cached effectively, which is of virtually no use for a console CPU.


You must be familiar with this statement:


Yes, Power-based Watson and the Wii U both use IBM’s small, stable and power-efficient on-chip embedded DRAM (eDRAM) on SOI, which is capable of feeding multi-core processors large chunks of data.



That's eDRAM for the Wii U CPU. How do you reckon that?
 
In the end, it doesn't look like Nintendo cares about the other two competitors when it comes to next-gen 3rd party games; they are out of that competition already. They want a year ahead on their own, they want 3rd party PS360 games, and they want to "hook" people with the new controller. Once Durango and the PS4 arrive, ports will be impossible. There will be a baseline, and that will be those two; anything considerably below that will simply be left behind.

I think you are not considering the business approach publishers have planned for next gen: as many ports as possible.

You are also not considering the impact that a year's head start will offer the Wii U.
It might be the baseline for next gen, where only a handful of developers would risk their budgets on pushing the consoles.
 
Once Durango and the PS4 arrive, ports will be impossible.
This isn't true. Never underestimate the ability of developers to port things. After all, Call of Duty got ported to the Wii. The kind of software developers will make for the Wii U depends largely on what kinds of games they want to make, and what kinds of games they believe will sell.

I think the potential here is that if, say, AC3 and COD: BlOps 2 do well (> 2m units for each game), it establishes that the Wii U is a viable platform for that kind of game. As much as devs seem to resent being compelled to make games for weak hardware, if the system has some big third-party successes in the first year, profit-conscious publishers will make the ports happen one way or another. (I don't believe this will happen, since I expect the Wii U to tank in 3DS-like fashion, but that is another story.)

This didn't happen on the Wii. Part of it was that everyone had gone all-in on HD games and expected the Wii to tank, so when it blew the doors out sales-wise, hardly anyone was in a place to capitalize on it. Launching a year after the 360 also meant that by the time it had proved to be a sales success, most studios had completely ended last-gen development. Hence, the relative successes of Resident Evil 4, Red Steel and Call of Duty 3 simply weren't enough to propel 3rd party development forward, and third parties' predictions became self-fulfilling prophecies.
 
The thing you want with porting is for it to be close enough that the primary team does it alongside the other versions.
Otherwise it's just getting farmed out to a port house, and may or may not ship with the primary SKUs.
My guess is that they'll do WiiU versions alongside 360/PS3 but probably not alongside 720/PS4.
If it has a decent market share it'll still get the ports.
 
There was an existing CPU which would have allowed that, the POWER A2: a Xenon improved in every way.
As for the balance between CPU and GPU in the next-generation systems, let's wait for the definitive specs. For example, if Sony uses four Steamroller cores, that's not exactly a light investment in the CPU either, in area or in power.
Either way there is no need to rationalize the matter for eons; Nintendo went for a pretty sucky CPU setup.

Well, I would still expect significant downgrades. And then there is the willingness of publishers to port. How many standard console gamers will the Wii U win over by providing lesser graphics than what has been available in the PC realm since the Evergreen era? For all we know it will take quite some sales, and to the proper audience, to convince the likes of DICE to port their engine to the system.

Well, the lack of AA is a really strong hint at that; either way it should deliver on the 360's promise of free AA.
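As a rough back-of-the-envelope illustration of why the 360's eDRAM made AA close to "free" but still needed tiling at higher sample counts, here is a quick sizing calculation. It is only a sketch: it assumes a typical 32-bit color plus 32-bit depth/stencil layout per sample, which real titles may not use.

```c
/* Rough framebuffer footprint vs. the 360's 10 MB of eDRAM.
 * Assumes 32-bit color + 32-bit depth/stencil per sample;
 * real titles vary, so treat the numbers as illustrative only. */
#include <stdio.h>

int main(void) {
    const double edram_mb = 10.0;          /* Xenos daughter-die eDRAM */
    const int width = 1280, height = 720;  /* 720p */
    const int bytes_per_sample = 4 + 4;    /* color + depth/stencil */

    for (int msaa = 1; msaa <= 4; msaa *= 2) {
        double mb = (double)width * height * msaa * bytes_per_sample
                    / (1024.0 * 1024.0);
        printf("720p, %dx MSAA: %5.1f MB -> %s\n", msaa, mb,
               mb <= edram_mb ? "fits in eDRAM"
                              : "needs tiling (multiple passes)");
    }
    return 0;
}
```

Under those assumptions, even 2x MSAA at 720p overflows the 10 MB and forces tiling, which is why the 360's "free AA" came with caveats in practice; a larger embedded pool would relax that constraint.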

I believe the latest rumor is that the CPU cores for the PS4/Durango will be based on Jaguar. Those processors may have similar strengths and weaknesses to the CPU in the Wii U, but I agree with you that we should wait for more solid info about those systems.

Well, the 360 is already GPU-centric and, as some of the guys here said, a lot of games are also CPU-bound. Durango and the PS4 will definitely have improved CPUs (the PS4 probably less so, since Cell is a weird beast) and the Wii U will be left lacking in performance on the CPU side.

The next-generation consoles will be the baseline; those baselines will be pushed to their limits like this generation's, and the Wii U would be left behind even if it weren't as much weaker as it actually is. If you build a game for 6-7 GB (Durango is rumored at 8 GB) then you are going to have a hard time downscaling that to 1 GB. Such a hard time that I assume developers won't even bother. Not to mention the serious implications of a CPU that is even a bit behind this generation, and a GPU nowhere close to what they are packing.

Durango alpha kits (if those are legit) pack a 6870-6950 class GPU. Considering when those were put together and when the consoles will actually be out, the difference between them will be huge. That's why I doubt the Wii U will get ports.

Due to the flexibility of "next-gen" engines and the increasing costs, I think there will be a few more down-ports than you are implying. Also consider that the PS4 is rumored to have 4 GB of RAM at most, and that is not including the RAM reserved for the system.
 
I've wondered this before, so while we are on the subject of ports, can anyone see MS or Sony actually downgrading their specs slightly as a result of the Wii U's specs? The conversation might be: "Do we really need to be this powerful when one of our competitors is already weaker? Could we not save a bit of money and still be more powerful?"
I think it's a fair question. I mean, how much of a power advantage do you really need, especially now that console success is based on the whole package, like online and other services?
It only took MS having a very small advantage over the PS3 this gen to get the best 3rd party versions of games.
What would be the point of spending the money to be, let's say, 5x as powerful when 2x as powerful would probably suffice?
 
Sony and Microsoft sit together at a tea party...
It would be even more awesome if both of them made consoles that could play each other's games.

The differentiator would be the 1st and 2nd party games locked to Sony/Microsoft, and the additional services.

The nightmare of porting games would vanish for those two consoles, and devs would have more resources for porting games to the Wii U.

Hmm, publishers would be happy too? They wouldn't need to release different products for one game; the same disc would be playable on Sony and MS hardware.
 
I was just thinking, and I'm sure others have said and thought about this before: the GPU may not even resemble an E6760 or RV730 by the time AMD has finished making the custom silicon Nintendo requested.

Hmm...
 
That wouldn't be awesome at all. Competition is what improves products; competition is what lowers prices for consumers.

Sony and MS designing one hardware spec means they will want to spend as little as possible and charge as much as possible. There is no incentive to offer customers the best deal they can if there is no competition.
 
To me it's likely/obvious now that the eDRAM (if it's there) is not tightly linked to the ROPs, which negates a lot of its advantages for rendering.

What are you smoking? If the eDRAM is a scratchpad then I'll take that over the 360's eDRAM any day of the week. At least that way it has more advantages in terms of rendering than the 360's approach. It will allow devs to use the memory in the ways they want, for the different types of buffers they want super-high bandwidth for; not just the framebuffer would benefit.


Megadrive, if the machine is an SoC and the eDRAM is hooked up to the GPU, and it's a scratchpad, then think of that as a good thing.
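To put the "different types of buffers" point into numbers, here is a purely hypothetical budgeting of the 32 MB figure rumored elsewhere in the thread. The buffer choices and formats are made up for the sake of the example, not taken from any Wii U documentation.

```c
/* Hypothetical budgeting of a 32 MB GPU scratchpad across several
 * bandwidth-hungry render targets.  All buffer choices and formats
 * are illustrative assumptions, not known Wii U details. */
#include <stdio.h>

typedef struct { const char *name; int w, h, bytes_per_px; } Buffer;

int main(void) {
    const double scratchpad_mb = 32.0;  /* rumored eDRAM amount */
    const Buffer buffers[] = {
        { "color (720p, RGBA8)",           1280,  720, 4 },
        { "depth/stencil (720p, D24S8)",   1280,  720, 4 },
        { "HDR light accumulation (FP16)", 1280,  720, 8 },
        { "shadow map (2048x2048, D32)",   2048, 2048, 4 },
    };

    double total_mb = 0.0;
    for (size_t i = 0; i < sizeof buffers / sizeof buffers[0]; ++i) {
        double mb = (double)buffers[i].w * buffers[i].h
                    * buffers[i].bytes_per_px / (1024.0 * 1024.0);
        total_mb += mb;
        printf("%-32s %5.1f MB\n", buffers[i].name, mb);
    }
    printf("%-32s %5.1f MB of %.0f MB\n", "total", total_mb, scratchpad_mb);
    return 0;
}
```

The point is that, unlike a pool hard-wired to the ROPs for the back buffer, a scratchpad lets the developer decide which of those buffers deserve the fast memory at any given moment.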
 
I was just thinking, and I'm sure others have said and thought about this before: the GPU may not even resemble an E6760 or RV730 by the time AMD has finished making the custom silicon Nintendo requested.

Hmm...

It may not. Someone here or somewhere else said Nintendo had AMD add some "effects" to be done in hardware, or something to that effect. I forget who it was, but it was within the last 6 months. Looking at the system's overall design, it may have been Nintendo wanting these things done in hardware in order to save power and be as efficient as possible. And that's assuming those rumors are even true.

But what has been customized from the initial stock GPU they started with will be interesting to find out. I'm thinking this thing could be an SoC, given the power budget: three Power-based cores and their 3 MB of cache, the GPU, and perhaps the 32 MB of eDRAM shared between both. I'm probably wrong. I can't wait for the teardowns when it finally launches.
 
I believe the latest rumor is that the CPU cores for the PS4/Durango will be based on Jaguar. Those processors may have similar strengths and weaknesses to the CPU in the Wii U, but I agree with you that we should wait for more solid info about those systems.
Well, for now I don't put much weight into those rumors. I tend to stick with the general idea of a quad core for Sony, possibly up to 8 cores for MSFT.
For MSFT, 8 cores pretty much prevent the use of high-power CPU cores, be it Piledriver or Steamroller. I could indeed see MSFT using Jaguar cores. For Sony it's more open, as both Jaguar and Piledriver/Steamroller should be doable (it's up to Sony to decide the power budget of the system; we can only guess whether they go above or below what most would vouch for as reasonable).
I would not say that Jaguar cores and the PPC 470x have the same strengths. I could see performance per cycle being in the same ballpark if you ignore the SIMD.
Jaguar's SIMD units have twice the throughput and they support integers. Then there is the cache architecture: the L2 is shared in Jaguar and should prove more flexible.
The safe bet for Jaguar is IMO MSFT, and they would more than double the number of cores.
Then if we compare to quad-core Steamroller, which is still a possibility in the Sony camp, well, if AMD delivers on the improvements they promised, it should be significantly faster per cycle.

Ultimately we don't really know what those enhanced Broadways are, but feedback from developers is pretty bad.

Due to the flexibility of "next-gen" engines and the increasing costs, I think there will be a few more down-ports than you are implying. Also consider that the PS4 is rumored to have 4 GB of RAM at most, and that is not including the RAM reserved for the system.
Actually the rumors have 2 GB of RAM for Sony, with Sony trying to make it 4. It could be cost-related, which would hint at GDDR5, which is expensive. Maybe Sony hopes for 4 Gb memory chips, but I have read nothing on the matter. So it's not completely impossible for Sony to be stuck with 2 GB if we go by the rumors. I would not expect them to reserve 1 GB of RAM.
When it comes to ports, I guess the baseline is low-end PC gaming and this gen of consoles, as ERP put it - most likely better than I could.

This choice of CPU is still, to me, the most disheartening part of the system. It could have allowed some kinds of games that don't map successfully to a pad to make it into the console realm (think RTS), and some of that genre can be quite CPU-intensive. I can live with a downgrade in graphics, but once you touch the core gameplay it's more bothersome.
 
I'm thinking more and more that the Wii U is an SoC, so maybe Nintendo looked at the work done on Vejle (the X360's SoC) and liked it very much. Vejle, from what I've read, had to be throttled to prevent higher performance, to maintain compatibility with older X360s.

The more I think about it the more it makes sense to me. Of course I could be totally wrong.
 
Yeah, I didn't want to throw too much into the mix, better for understanding to come in small bites :p

It's not about throwing too many things into the mix; it's about seeing where the facts lead without relying on the rumors.

For example:
We know the WiiU is using 2 GB of memory.
Iwata made a special distinction that 1 GB was for games, the other for the system.
Why? Was he implying that the memory was physically separated?

The only confirmation for eDRAM is for the CPU.
In the past Nintendo used eDRAM for the GPU.
So, is there eDRAM for the GPU as well, or
has Nintendo come up with another solution?

If so, then the next question is: is the eDRAM shared?
If it's shared, then, as pointed out, we are dealing with some type of SoC or SiP.
If that's the case, where does that lead us?

If not, would it even be necessary to have eDRAM for the GPU if the 1 GB of RAM is GDDR5 or some other type of fast memory? And would that conflict with the eDRAM for the CPU?

A lot of people have their minds made up about how the system is put together due to all the rumors of the last couple of years. But I think they are selling Nintendo short by not thinking outside the box, so to speak. Nintendo, with its customized chips, could have designed a (new) unique architecture for the console.
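To put the GDDR5-versus-eDRAM question above into rough numbers, here is a quick sketch of how peak bandwidth is usually estimated from bus width and data rate. The configurations are hypothetical examples for illustration only, since nothing about the Wii U's memory interface has been confirmed.

```c
/* Peak bandwidth estimate: GB/s = (bus width in bits / 8) * transfer rate in GT/s.
 * All configurations are hypothetical examples, not confirmed Wii U specs. */
#include <stdio.h>

static double peak_gb_per_s(int bus_bits, double gt_per_s) {
    return bus_bits / 8.0 * gt_per_s;   /* bytes per transfer * 1e9 transfers/s */
}

int main(void) {
    printf("64-bit DDR3-1600:        %6.1f GB/s\n", peak_gb_per_s(64, 1.6));
    printf("128-bit GDDR5 @ 4 GT/s:  %6.1f GB/s\n", peak_gb_per_s(128, 4.0));
    printf("(360's eDRAM, for scale: ~256 GB/s internal to the daughter die)\n");
    return 0;
}
```

Even a generous external GDDR5 setup sits well below what an on-die pool can provide, which is why a small, very fast pool next to the GPU keeps coming up in this discussion.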



I'm thinking more and more that the Wii U is an SoC, so maybe Nintendo looked at the work done on Vejle (the X360's SoC) and liked it very much. Vejle, from what I've read, had to be throttled to prevent higher performance, to maintain compatibility with older X360s.

The more I think about it the more it makes sense to me. Of course I could be totally wrong.

But you could be totally right.
This might have made the rounds here:


WIKI:
The Advanced Microcontroller Bus Architecture (AMBA) is used as the on-chip bus in system-on-a-chip (SoC) designs... AMBA was introduced by ARM Ltd in 1996. The first AMBA buses were Advanced System Bus (ASB) and Advanced Peripheral Bus (APB). In its 2nd version, AMBA 2, ARM added AMBA High-performance Bus (AHB) that is a single clock-edge protocol.
http://en.wikipedia.org/wiki/Advanced_Microcontroller_Bus_Architecture

Now, there is a LinkedIn profile of Pavan Kumar Madasu
http://se.linkedin.com/in/pavankumarmadasu

who worked on the Wii U GPU for approximately two years:
June 2009 – April 2011 (1 year 11 months)


ASIC design engineer
Advanced Micro Devices

Worked on the verification for some of the blocks in the Nintendo Wii-U Game Console. Did block level and system level verification for GPU blocks UVD(Universal Video Decoder) which can decode the H264, MPEG2 and VC1 streams and supports the decoding of encrypted streams


But also:
Worked on the verification of the AHB Subsystem blocks,AHMN bridge to connect the AHB bus to the Memory interface. AIMN another AHB-AHB bridge to connect SATA to the AHB IO bus.

Is this evidence of the GPU being part of a SoC or SiP?


IBM, of course, has its own equivalent:
CoreConnect is a microprocessor bus-architecture from IBM for system-on-a-chip (SoC) designs. It was designed to ease the integration and reuse of processor, system, and peripheral cores within standard and custom SoC designs. As a standard SoC design point, it serves as the foundation of IBM or non-IBM devices...

CoreConnect has bridging capabilities to the competing AMBA bus architecture, allowing reuse of existing SoC-components.

The CoreConnect is an integral part of IBM's Power Architecture
 
You must be familiar with this statement:


Yes, Power-based Watson and the Wii U both use IBM’s small, stable and power-efficient on-chip embedded DRAM (eDRAM) on SOI, which is capable of feeding multi-core processors large chunks of data.



That's eDRAM for the Wii U CPU. How do you reckon that?
Tweets also said it was POWER7 in the Wii U. That particular example doesn't say Wii U's CPU is using eDRAM either. PR one-liners aren't a sensible reference point (any more than emailing AMD's customer support; valid console info only comes from official technical releases or leaks!). E.g. perhaps IBM was asked to design the eDRAM in an SoC chip that'll be used for the GPU. A techno-illiterate then forwards to the public that the Wii U uses eDRAM, and from their crib sheet on "Benefits of POWER7" they read off that that's "good for large chunks of data." That's far more plausible than large amounts of eDRAM for the CPU, which is of little use in a console, and a bandwidth-starved GPU.

It's not 100% certain that it's eDRAM for the GPU, but it is extremely likely, to the point that nothing short of a Nintendo or IBM press release spelling out the internal system architecture will present a good case to the contrary. With all the info we've had, it is the most sensible explanation, and the most sensible explanation has tended to be the one that turns out true.
 
Its not about throwing too many things into the mix, its about seeing where the facts lead without relying on the rumors.
A lot of the rumours are coming from good sources. A lot of the 'facts' are coming from vague PR remarks.


For example:
We know the WiiU is using 2 GB of memory.
Iwata made a special distinction that 1 GB was for games, the other for the system.
Why? Was he implying that the memory was physically separated?
No. We have reserved RAM in every other console and it's not in a separate pool, so why think Wii U changes that?

The only confirmation for eDRAM is for the CPU.
In the past Nintendo used eDRAM for the GPU.
So, is there eDRAM for the GPU as well, or
has Nintendo come up with another solution?
They haven't said eDRAM on the CPU. They've said eDRAM in Wii U. Plus the sources for Wii U's CPU config are all over the place, so why trust one statement and ignore others instead of ignoring all of them as unreliable?

A lot of people have their minds made up about how the system is put together due to all the rumors of the last couple of years. But I think they are selling Nintendo short by not thinking outside the box, so to speak.
I'm not sure that's true.
The possibility of an SoC hasn't been disregarded. What are not being entertained are architectures that don't make a lot of sense, such as eDRAM for the CPU and the GPU running off an expensive RAM pool, with a second cheap pool being used for the OS. I mean, where does game code fit into that? OS operations have to use the CPU and GPU, so all memory must be addressable, meaning a unified RAM pool like the XB360's. Why would you want to throttle the GPU with CPU accesses to the same RAM just because the GPU hasn't got its own working space? Give the GPU a load of eDRAM and the CPU will have room to work too.
 
I'm not sure that's true. The possibility of an SoC hasn't been disregarded. What are not being entertained are architectures that don't make a lot of sense, such as eDRAM for the CPU and the GPU running off an expensive RAM pool, with a second cheap pool being used for the OS. I mean, where does game code fit into that? OS operations have to use the CPU and GPU, so all memory must be addressable, meaning a unified RAM pool like the XB360's. Why would you want to throttle the GPU with CPU accesses to the same RAM just because the GPU hasn't got its own working space? Give the GPU a load of eDRAM and the CPU will have room to work too.
There are some LinkedIn profiles by AMD and IBM engineers hinting at an SoC approach. And then we had that weird statement by the CTO of Tezzaron Semiconductors back in July that made it sound like the Wii U would be using a stacked chip with the CPU and GPU bolted on top of each other. Assuming that's the case (and I'm doubtful, as I believe 3D ICs are still pretty damn rare - but Tezzaron should know if they're actually involved, and the guy sure made it sound like they were), I would think the eDRAM is part of the CPU die. Dunno, maybe it could serve as a fast, low-latency buffer between the CPU and GPU to make GPGPU stuff more efficient? Either way, I don't think the eDRAM is an L3 cache or an embedded framebuffer.
 