Wii U hardware discussion and investigation *rename

MDX, I was just answering you about that post.

Now I have to say that I can no longer even tell what point you are trying to make. Maybe you should go lighter on the quotes and make the point you are trying to defend clearer. Honestly, I suspect I'm not alone; it's getting difficult to follow.
 
I have a question about the e6760, if the Wii U GPU is a custom version of it. Is it possible that the reason they (Nintendo or devs) didn't list the Wii U GPU as DirectX 11 level is because of Microsoft? Is it a business reason or a technical reason that they can't say it has DirectX 11 features? The CEO of Unity Technologies did say that the Wii U GPU is certainly capable of DirectX 11 "equivalent" features and shaders. I'm not really sure what he means by equivalent. Also, why even bother saying that the GPU is DX10.1 capable, as everyone is speculating, when it doesn't even use the API?
 
I have a question about the e6760, if the Wii U GPU is a custom version of it. Is it possible that the reason they (Nintendo or devs) didn't list the Wii U GPU as DirectX 11 level is because of Microsoft? Is it a business reason or a technical reason that they can't say it has DirectX 11 features? The CEO of Unity Technologies did say that the Wii U GPU is certainly capable of DirectX 11 "equivalent" features and shaders. I'm not really sure what he means by equivalent. Also, why even bother saying that the GPU is DX10.1 capable, as everyone is speculating, when it doesn't even use the API?

From Crytek (on Crysis 3 for PS3/360):

"We want to make sure as much as is humanly possible can translate from a DX11 variant into a DX9 variant, that will work almost as good on an Xbox console to whatever extent we can, because we don't want the experience to be different between the platforms,"

"It is very, very difficult, but it is possible. It just requires a lot of effort. Some of the stuff these guys are making work on consoles now is absolutely amazing. It's render features that shouldn't theoretically work on consoles, but they've managed to construct code that can emulate a similar thing from a… hack and slash sounds wrong, but they don't have the same streamlined pipeline you would have with a DX11 structure, but they can get to a similar result just by experimenting and using tips and tricks."
 
If Nintendo asked IBM to make the eDRAM accessible, it wouldn't have been for the CPU, as that's what IBM has been using its eDRAM for already; it would have been for the GPU.

We are assuming the GPU has access to the eDRAM based on the design of the current consoles, which is logical. But let's not fly in the face of the fact that the Wii U CPU, like the POWER7, has "a lot" of eDRAM.

My only question is: is it shared with the GPU, or is it separate? And if it's shared, what's stopping Nintendo from having IBM and AMD design their chips to have access to 32MB, 256MB, or even a gigabyte of eDRAM?

Not IBM; they claim they can offer more than 128MB of eDRAM. And AMD's stated goal for 2011-2012 was to have their CPUs and GPUs share fully coherent memory.

Here is what AMD said about GPGPUs not too long ago:



Wouldn't this be a problem for Iwata and team Nintendo, who have declared that the Wii U is using a GPGPU? Can you imagine the failure of going through the trouble of customizing a chip to be used as a GPGPU and then not having developers make use of it? Even now we are hearing developers complain about the weak CPU or the unknown architecture. Nintendo would have had to prepare for this; luckily they are working with AMD:


http://www.anandtech.com/show/5847/...geneous-and-gpu-compute-with-amds-manju-hegde
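To make the GPGPU point a bit more concrete, the kind of work developers would be expected to move off a weak CPU is per-element, data-parallel math. Here is a rough C sketch of such a workload; it's purely illustrative of the idea of offloading and isn't based on any actual Wii U API or SDK.

Code:
/* Illustrative only: a data-parallel particle update of the sort that maps
 * well onto GPU compute. On the CPU this runs serially; the same per-element
 * math could be expressed as a compute kernel and spread across the GPU's
 * ALUs, freeing the CPU cores for game logic. No Nintendo API is implied. */
#include <stddef.h>

typedef struct { float x, y, z, vx, vy, vz; } particle;

void update_particles(particle *p, size_t count, float dt, float gravity)
{
    for (size_t i = 0; i < count; ++i) {   /* each iteration is independent, */
        p[i].vy -= gravity * dt;           /* which is what makes it a good  */
        p[i].x  += p[i].vx * dt;           /* candidate for GPGPU offload    */
        p[i].y  += p[i].vy * dt;
        p[i].z  += p[i].vz * dt;
    }
}

Whether that pays off in practice depends on how cheaply the CPU and GPU can share the data, which is presumably where the coherent-memory work in the AnandTech piece above comes in.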



The logical conclusion, like you said, is that the eDRAM is for the GPU, or at least partly for the GPU. It makes little sense for the eDRAM to be a cache solely for the CPU, and if that is indeed the case then Nintendo's engineers should be fired immediately for incompetence. And when I say Nintendo engineers I mean Nintendo engineers alone, considering they have the final say in the hardware design. IBM and AMD are perfectly capable of connecting the eDRAM to the GPU, so that is the assumption I am making.

As for what is stopping Nintendo from putting in 64, 128, 256 MB, or a gigabyte of eDRAM, the answer is also obvious: cost.
 
We don't know the application of the eDRAM. There are three options:

  1. It's for the CPU
  2. It's for the GPU
  3. It's for both
1. This makes very little sense, and as the only remarks supporting the idea of the eDRAM being for the CPU seem to come from a PR article that clearly lacks an in-depth understanding of games, there's not much reason to ask for this much eDRAM in a CPU.



2. There's no mention of the GPU in the PR so far, suggesting the eDRAM isn't for the GPU. But then I'm not placing a lot of faith in these PR comments. ;) Would IBM want to post 'we are incorporating AMD's graphics hardware into the chip'? I don't honestly know.


3. Same sort of issue as 2. Where does the GPU fit in?

So we are left with the conundrum of which vital piece of information is being left out. Is the GPU included on the die but not being talked about? Or is the eDRAM for the CPU, with the GPU on a separate chip? The arguments in contention are PR vs. technical know-how, and I'm siding with technical know-how on this one. That, or the eDRAM in question is actually just a couple of megs of cheap L2 cache and Wii U hasn't even got eDRAM for its GPU, given that the 32 MB number has never been substantiated and appears to just be a self-reinforcing rumour (a quick framebuffer calculation below shows what a 32 MB pool would actually buy). But the EG article, which is based on devs giving info rather than PR copy-writers, says there's eDRAM on the GPU, so I have to believe that at this point, as it's the architecture that makes sense in every way.

4. There are separate eDRAM pools for the CPU and GPU
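For a sense of scale on that 32 MB figure, here's a quick back-of-the-envelope calculation. The formats are assumptions (32-bit colour plus 32-bit depth/stencil, no MSAA), not anything confirmed about Wii U, but they show why a pool of roughly that size is interesting as a framebuffer:

Code:
/* Rough framebuffer sizing, assuming 4 bytes/pixel colour and 4 bytes/pixel
 * depth/stencil. Purely illustrative; the actual render target formats and
 * any MSAA usage on Wii U are unknown. */
#include <stdio.h>

static double buffer_mb(int w, int h, int bytes_per_pixel)
{
    return (double)w * h * bytes_per_pixel / (1024.0 * 1024.0);
}

int main(void)
{
    /* 720p:  ~3.5 MB colour + ~3.5 MB depth  = ~7.0 MB  */
    printf("720p  colour+depth: %.1f MB\n", 2.0 * buffer_mb(1280, 720, 4));
    /* 1080p: ~7.9 MB colour + ~7.9 MB depth  = ~15.8 MB */
    printf("1080p colour+depth: %.1f MB\n", 2.0 * buffer_mb(1920, 1080, 4));
    return 0;
}

For comparison, Xenos in the 360 gets by with 10 MB for this kind of job (with tiling), so 32 MB would comfortably hold a 1080p colour + depth target, or a 720p one with room left over for additional render targets.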
 

But the EG article, which is based on devs giving info rather than PR copy-writers, says there's eDRAM on the GPU, so I have to believe that at this point as it's the architecture that makes sense in every way.

What article? Can you quote what they said?
 
As for what is stopping Nintendo from putting in 64, 128, 256 MB, or a gigabyte of eDRAM, the answer is also obvious: cost.

Okay, assuming then that Nintendo had a choice between GDDR5 and eDRAM: do you happen to know the margin of difference, maybe a ratio, between the two in terms of cost? Including the design changes that would have to be made to accommodate either one.
 
Very very very limited OOO capability. The FPU has delayed execution which can cover the latency of a D$ hit, but nothing else.

If the PPC 750 is considered OOO, then so is the ARM Cortex A8 (and it isn't).
I'd agree that the PPC 750 is a pretty weak out-of-order design, but it's certainly more OOO than the Cortex A8. All five of the 750's execution units are independent, each with one or two "reservation station" slots attached, not just the FPU. Besides, why would they bother with register renaming and a six-entry reorder buffer if it were an in-order design? ;)
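To picture the point about delayed execution covering load latency, here's a minimal C sketch: an integer load sitting next to FP work that doesn't depend on it. It only illustrates the kind of instruction-level independence being talked about; it says nothing about how the 750 (or the A8) actually schedules it.

Code:
/* Sketch of the latency-hiding case discussed above: an integer load next to
 * floating-point work that doesn't depend on it. A core whose FPU can issue
 * and complete independently of the integer/load units (e.g. via delayed
 * execution) can let the FP multiply proceed while the load's latency, even
 * a D$ hit, is still being paid. Whether and how far a given core overlaps
 * these is a microarchitecture question; the C code only sets up the
 * independence. */
struct node { int value; struct node *next; };

float walk_and_scale(const struct node *n, const float *coeff, int count)
{
    float acc = 0.0f;
    int   sum = 0;
    for (int i = 0; i < count && n; ++i) {
        sum += n->value;          /* integer load; its result isn't needed by the FP op */
        acc += coeff[i] * 1.5f;   /* independent FP work that can overlap the load      */
        n = n->next;
    }
    return acc + (float)sum;
}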

On your greater point though, I agree that there is no reason to use such a simple architecture in a next-gen console.
 
What article? Can you quote what they said?
http://www.eurogamer.net/articles/2012-08-30-how-powerful-is-the-wii-u-really

article said:
Now, though, as we near launch, final kits are in the wild. And, crucially, developers Eurogamer spoke to as part of a wide-ranging investigation into the innards of the Wii U now have final specifications. Here's what they told us.

The CPU: The Wii U's IBM-made CPU is made up of three Power PC cores. We've been unable to ascertain the clock speed of the CPU (more on this later), but we know out of order execution is supported.
RAM in the final retail unit: 1GB of RAM is available to games.
GPU: The Wii U's graphics processing unit is a custom AMD 7 series GPU. Clock speed and pipelines were not disclosed, but we do know it supports DirectX 10 and shader 4 type features. We also know that eDRAM is embedded in the GPU custom chip in a similar way to the Wii.
 
There are some Linkedin profiles by AMD and IBM engineers hinting at a SoC approach. And then we had that weird statement by the CTO of Tezzaron Semiconductors back in July that made it sound like the Wii U would be using a stacked chip with CPU and GPU bolted on top of each other. Assuming that's the case (and I'm doubtful as I believe 3D-ICs are still pretty damn rare - but Tezzaron should know if they're actually involved, and the guy sure made it sound like they were), I would think the eDRAM is part of the CPU die. Dunno, maybe it could serve as a fast, low latency buffer between CPU and GPU to make GPGPU stuff more efficient? Either way, I don't think the eDRAM is L3 cache or an embedded framebuffer.

You were referring to this post you made on GAF, wsippel:

wsippel said:
A statement by Bob Patti, CTO of Tezzaron Semiconductors, a few months ago:

“Nintendo’s going to build their next-generation box,” said Patti. “They get their graphics processor from TSMC and their game processor from IBM. They are going to stack them together. Do you think IBM’s going to be real excited about sending their process manifest—their backup and materials—to TSMC? Or maybe TSMC will send it IBM. Neither of those is ever going to happen. Historically they do share this with OSATs, or at least some material information. And they’ve shared it typically with us because I’m only a back-end fab. I can keep a secret. I’m not going to tell their competition. There’s at least a level of comfort in dealing with a third party that isn’t a competitor.”

http://semimd.com/blog/2012/07/31/th...e-of-the-osat/

He did sound fairly convincing, so I looked into Tezzaron and 3D-ICs, and found one particular design compelling:
the wafer-stacked SoC (FaStack): http://www.tezzaron.com/technology/FaStack.htm

"FaStack devices have many advantages over their single-layer counterparts. They are much more dense and their short vertical interconnects allow them to operate at higher speeds with a lower power budget. In addition, FaStack allows disparate elements to be processed on separate wafers for simpler production and greater optimization.

Unlike the separate chips in a "System-in-Package" (SiP) component, FaStack layers are fully integrated into a single IC by a dense system of through-silicon interconnects. FaStack devices match the tight integration of SoC devices while out-doing SiPs for high speed, low power budget, and tiny footprint."


Quite interesting considering the small form factor and power output of the Wii U. One more interesting fact: MOSIS, which provides access to IBM's fabrication processes, lists 45nm as the lower end of the range.

"The IBM fabrication processes available through MOSIS range from 45 nanometer to 0.25 µm in CMOS, and from 0.13 µm to 0.50 µm in SiGe BiCMOS."

http://www.mosis.com/products/fab-processes

I'm not saying Nintendo has gone this route, although it would seem to fit with their design philosophy. I am, however, sure that it's an SoC design. Also, the eDRAM is for the GPU; I do not understand the confusion.
 
I wonder, given that Nintendo made a deal with Unity and Epic for licensing their engines, do their internal teams want to use those too? I mean, this will be a huge change for Nintendo, going to the "HD era" of development. Assets, textures, voice acting etc. will all skyrocket from what they have done before, and licensing solid 3rd-party middleware could be a good thing for them.
 
I wonder, given that Nintendo made a deal with Unity and Epic for licensing their engines, do their internal teams want to use those too? I mean, this will be a huge change for Nintendo, going to the "HD era" of development. Assets, textures, voice acting etc. will all skyrocket from what they have done before, and licensing solid 3rd-party middleware could be a good thing for them.


There's a rumor that Retro Studios is working on game engines. The rumor then goes into near-crazy territory, saying that Epic was so impressed that they were convinced to bring UE4 to WiiU.
 
There's a rumor that Retro Studios is working on game engines. The rumor then goes into near-crazy territory, saying that Epic was so impressed that they were convinced to bring UE4 to WiiU.

It wouldn't be Retro; it would be dollar signs that would convince Epic to bring UE4 to lesser platforms than the ones currently publicly targeted (i.e. just PC).
 
How exactly does one cool a 3D IC?

I suppose you just build something that doesn't generate much heat.
In this context "tiny" and "low power" may mean something under 1 watt, e.g. a powerful microcontroller for industrial or field use.
Wii U is only low power in comparison to gaming PCs and the two HD consoles.
 
There's a rumor that Retro Studios is working on game engines. The rumor then goes into near-crazy territory, saying that Epic was so impressed that they were convinced to bring UE4 to WiiU.

Epic "impressed" by another engine and then "convinced" to bring UE4 to Wii U? It looks weird, Epic know what kind of hardware UE4 need, they know how to make engines, they don't need Retro "show them" anything :smile:
 
I wonder, given that Nintendo made a deal with Unity and Epic for licensing their engines, do their internal teams want to use those too? I mean, this will be a huge change for Nintendo, going to the "HD era" of development. Assets, textures, voice acting etc. will all skyrocket from what they have done before, and licensing solid 3rd-party middleware could be a good thing for them.

Indeed they are: http://www.eurogamer.net/articles/20...-house-and-out

I'm composing a Nintendo middleware SDK thread; it should be up soon.

XpiderMX said:
Epic "impressed" by another engine and then "convinced" to bring UE4 to Wii U? It looks weird, Epic know what kind of hardware UE4 need, they know how to make engines, they don't need Retro "show them" anything

Such is Epic's hubris that Rein believes his engine reigns supreme. steviep is correct: moneybags are their only motivation. Besides, Retro would not be showing their proprietary engine to Epic anyway. Retro "is" Nintendo.
 
How exactly does one cool a 3D IC?


Vertically?

Seriously though, the cooling solutions for these things are quite extravagant from what I've seen. Who knows what they'll come up with for this sort of application, though. If it happens, it'll be the first such application of a 3D IC in the consumer electronics space, right?

Indeed they are: http://www.eurogamer.net/articles/20...-house-and-out

I'm composing a Nintendo middleware SDK thread; it should be up soon.



Such is Epic's hubris that Rein believes his engine reigns supreme. steviep is correct: moneybags are their only motivation. Besides, Retro would not be showing their proprietary engine to Epic anyway. Retro "is" Nintendo.


Although the rumour is quite clearly guff, to be fair I think it's implying that Retro got UE4 up and running so impressively that it made Epic reconsider putting Wii U on their "officially supported" list.

Like I said, guff. And as we know, it matters not whether it's "officially supported"; developers are free to get the engine running on whatever they want to try once they're licensed to use it.
 
Though being officially supported probably comes with the advantages of 1. knowing the engine will run, as opposed to maybe having to change whatever to get it working, and 2. support from Epic. Sure, they will probably offer support on unsupported platforms as well, but I assume they charge a nice extra for that.
 