Wii U hardware discussion and investigation *rename

Wii U is an SoC:

A foundry might be willing to give a Tezzaron the information. But consider this: “Nintendo’s going to build their next-generation box,” said Patti. “They get their graphics processor from TSMC and their game processor from IBM. They are going to stack them together. Do you think IBM’s going to be real excited about sending their process manifest—their backup and materials—to TSMC? Or maybe TSMC will send it to IBM. Neither of those is ever going to happen. Historically they do share this with OSATs, or at least some material information. And they’ve shared it typically with us because I’m only a back-end fab. I can keep a secret. I’m not going to tell their competition. There’s at least a level of comfort in dealing with a third party that isn’t a competitor.”

http://semimd.com/blog/2012/07/31/the-changing-role-of-the-osat/
 
That used to be the rule, but I don't think it's true any more. There were still huge strides being made in technology up until the 2005-2007 time frame that have since largely disappeared. I don't think we'll see such improvements, for the same reason that we haven't seen them on the PC. Put simply, even if the PS4 ends up comparatively more powerful than the Wii U, there is still no guarantee that its games will look significantly better. It may even boil down to one running at 720p and the other at 1080p with otherwise (relatively) equal content.

Just contrast a 5-year-old PC game today with a 5-year-old PC game in 2007. Age of Mythology, Morrowind, and BF1942 all looked ancient in 2007, while World in Conflict and Crysis have hardly aged a bit. Or even just compare Far Cry to Crysis, with a scant 3 years between them.
Crysis 1 looks great, but judging by actual developers' words in this forum and by the evolution of Crytek's engine, the technology used in Crysis has quite a few limitations compared to today's standards.

I have Crysis 1 running beautifully on consoles, and Crysis 2 on both console and PC, and I think there are games that have surpassed Crysis 1's graphics, imho, like Red Dead Redemption, not to mention the new technologies used in engines like Frostbite 2.

It would be nice to see an engine like the one used in Crysis 1, which is not completely obsolete, put to use in some modern games, but it is certainly obsolete in some ways. The game, just like Doom 3, for instance, has aged well, but there are newer, more efficient technologies that have been developed since.

Finally, I don't expect the Wii U to be a stripped-down console, nor another step backwards in scope, and hopefully it won't become obsolete for certain games in a couple of years.
 
You should play Crysis 1 on the PC... DX10, Very High, at a high resolution... In my opinion it's still really impressive and better than most games, I actually prefer it as a game over Crysis 2 easily, and it looks more impressive than many games released in 2012...
 
That used to be the rule, but I don't think it's true any more. There were still huge strides being made in technology up until the 2005-2007 time frame that have since largely disappeared. I don't think we'll see such improvements, for the same reason that we haven't seen them on the PC. Put simply, even if the PS4 ends up comparatively more powerful than the Wii U, there is still no guarantee that its games will look significantly better. It may even boil down to one running at 720p and the other at 1080p with otherwise (relatively) equal content.

Just contrast a 5-year-old PC game today with a 5-year-old PC game in 2007. Age of Mythology, Morrowind, and BF1942 all looked ancient in 2007, while World in Conflict and Crysis have hardly aged a bit. Or even just compare Far Cry to Crysis, with a scant 3 years between them.

You just explained the reason why there will be a "huge" jump. You saw that in 2005-2007 because new consoles brought new baseline specs for engines.

The PC ports from the 360 now are similar to the PS2-era GTA ports to PC in 2004.

Now there are no PC-exclusive devs like in the late PS2 and early 360 days. The PC is utilized for better IQ and that's it. They are forced to a 360 spec in one way or another for financial reasons. Crytek was the last one.
 
I actually made a conscious effort to use PC exclusives in those examples and not console ports. The point was that these games weren't held back by console constraints.
 
So you should not mind, then, if the Wii U is basically just an X360. Which it seems to be.
 
Rumoured final Wii U specs:

CPU: “Espresso” CPU on the Wii U has three enhanced Broadway cores

GPU: “GPU7” AMD Radeon™-based High Definition GPU. Unique API = GX2, which supports Shader Model 4.0 (DirectX 10.1 and OpenGL 3.3 equivalent functionality)

Memory: MEM1 = 32MB, MEM2 = 1GB (the amount applications can use)

Storage: Internal 8GB with support for SD cards (SD cards up to 2GB / SDHC cards up to 32GB) and external USB-connected hard drives

Networking: 802.11 b/g/n Wi-Fi

Video Output: Supports 1080p, 1080i, 720p, 480p and 480i

Video Cables Supported:

Compatible cables include HDMI, Wii D-Terminal, Wii Component Video, Wii RGB, Wii S-Video Stereo AV and Wii AV.

USB: Four USB 2.0 Ports
http://www.vgleaks.com/world-exclusive-wii-u-final-specs/
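As a rough illustration of what that 32MB MEM1 / 1GB MEM2 split would mean for render targets, here is a back-of-envelope sketch in Python. It assumes plain, uncompressed 32-bit colour plus 32-bit depth/stencil buffers, which is purely illustrative; nothing here describes how the Wii U actually allocates MEM1.

```python
# Back-of-envelope: how much of a 32MB embedded memory pool do raw,
# uncompressed render targets consume at common output resolutions?
# Assumes 4 bytes/pixel colour + 4 bytes/pixel depth/stencil; real GPUs
# use tiling and compression, so treat these figures as an upper bound.

MEM1_MB = 32

def render_target_mb(width, height, bytes_per_pixel=4, targets=2):
    """Size in MB of `targets` full-resolution buffers (e.g. colour + depth)."""
    return width * height * bytes_per_pixel * targets / (1024 * 1024)

for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    used = render_target_mb(w, h)
    print(f"{name}: {used:.1f}MB of {MEM1_MB}MB MEM1")
```

On those assumptions a 720p colour+depth pair is about 7MB and a 1080p pair about 16MB, which is one reason the 720p-versus-1080p trade-off discussed earlier in the thread is not a trivial one.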

Is this really the same amount of information that Nintendo would give to developers? It would be difficult to develop for the machine with this little information. I guess this pretty much confirms that the GPU is an R700 derivative though.
 
If the CPU really has 3 enhanced Broadway cores, wouldn't that mean “Espresso” is still based on PowerPC G3 (or rather, 750CXe) like Gekko?

[image: Gekko (1999)]


hmmm...

btw, that pic is of Gekko from 1999. Just thought I'd throw it in there because I'm a little frustrated.



Anyway, "GPU7" sounds more interesting.
 
Enhanced Broadway cores could be PPC cores, and PPC cores could be POWER7 cores, to people who don't understand the difference. So there's no knowing what's going on. But if the Wii U's CPU really is 3x the GameCube's last-millennium technology, Nintendo will have set a new benchmark for cheapskate, underpowered hardware!
 
I'm still having an issue with the multipliers some people are using.
The GameCube's DSP was 81MHz. The Wii's was increased to about 122MHz.

Why are people assuming that the DSP for the Wii U will stay at around 120MHz?
Why wouldn't Nintendo use a DSP at 300 or 500MHz?

From the GameCube to the Wii the formula has been
1-2-3-4
1xDSP = DSP, 2xDSP = GPU, 3xGPU = CPU, 4xDSP = main memory
Why would they change this formula for the Wii U?

A DSP at 300MHz would give us
GPU = 600, CPU = 1800, Mem = 1200

A DSP at 400MHz would give us
GPU = 800, CPU = 2400, Mem = 1600

A DSP at 500MHz would give us
GPU = 1000, CPU = 3000, Mem = 2000

I'm betting Nintendo went with numbers similar to the middle option.
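Purely to make the arithmetic above explicit, here is a tiny Python sketch of that 1-2-3-4 scaling. The multipliers are this post's speculation, not confirmed Wii U clocks; the GameCube and Wii figures are just the starting points quoted above.

```python
# Speculative clock scaling from the post above: everything derives from
# the DSP clock (GPU = 2x DSP, CPU = 3x GPU, main memory = 4x DSP).
# These ratios are the poster's guess, not confirmed Wii U hardware data.

def derive_clocks(dsp_mhz):
    gpu = 2 * dsp_mhz          # GPU = 2x DSP
    cpu = 3 * gpu              # CPU = 3x GPU (i.e. 6x DSP)
    mem = 4 * dsp_mhz          # main memory = 4x DSP
    return {"DSP": dsp_mhz, "GPU": gpu, "CPU": cpu, "MEM": mem}

# GameCube (81MHz) and Wii (~121.5MHz) as sanity checks, then the guesses above.
for dsp in (81, 121.5, 300, 400, 500):
    print(derive_clocks(dsp))
```

Plugging in 81MHz reproduces the GameCube's 162/486/324MHz clocks, and 300/400/500MHz reproduce the three hypothetical Wii U configurations listed above; whether Nintendo actually kept this ratio is exactly what is in question.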
 
Enhanced Broadway cores could be PPC cores, and PPC cores could be POWER7 cores, to people who don't understand the difference. So there's no knowing what's going on. But if the Wii U's CPU really is 3x the GameCube's last-millennium technology, Nintendo will have set a new benchmark for cheapskate, underpowered hardware!

The Wii documentation also said "Enhanced CPU".
 
Rumoured final Wii U specs:

CPU: “Espresso” CPU on the Wii U has three enhanced Broadway cores.

So... the Wii's CPU was called Broadway.
Allegedly based on the PowerPC 750C.

The Wii U CPU is rumored to be called Espresso.
Allegedly based on the POWER7.
POWER7 systems include "Express®" models (710, 720, 730, 740 and 750).

Is there a connection?
 
It would be a relief, though I put the odds close to zero; this line of POWER7 is still pretty much plain, high-power POWER7.

Even at low clock speeds, to keep a POWER7 core within an acceptable power budget you would have to use binned parts of the highest quality, which would be super expensive.
 
Also interesting how a single forum post is enough for confirmation.

The confirmation is that it's the same info as from other sources, e.g. the Eurogamer article.

Especially as we near launch, the trustworthy leaks have converged. There's no real doubt.

Hard to believe Wii U-heads would still be fighting this data.

Anyway, it still doesn't give us the coveted GPU functional-unit/clock data (and much the same goes for the CPU), leaving us to argue over the term "enhanced Broadway".
 