Wii U hardware discussion and investigation *rename

Status
Not open for further replies.
A post I just read on NeoGAF:

Also, people need to stop worrying about power consumption and how it relates to the Wii U's technical power. The Wii U by all accounts has a dedicated DSP, an I/O controller, and I believe even an ARM CPU for some OS functions. Not to mention IBM's modern Power-based CPUs are incredibly efficient.

The reason consoles like the Xbox 360 and PS3 have had such high power draw is not just because they had very high-end parts for their time, but also because they went down a path where single components like the CPU did an awful lot of functions. The Xbox 360's CPU does I/O, sound, general processing, FPU tasks, SIMD tasks, etc. The Xbox 360's CPU is nowhere near as efficient as a dedicated DSP at doing sound, or as good as a dedicated I/O controller at handling I/O tasks. The Xbox 360 performs many tasks that a processor of significantly less wattage and clock speed could do, and do better. Nintendo have simply identified which tasks could be done better by a separate dedicated chip, and put it in their console.

With the Wii U using lots of dedicated chips for specific tasks, that boosts efficiency and reduces power consumption. Sure, the Wii U won't be a powerhouse, but it does seem like Nintendo have invested significant engineering into ensuring the Wii U's hardware is as power efficient as possible, and that consumers are getting maximum performance per watt. If BG's suggestion of the Wii U's GPU being around 6570 performance / 600 GFLOPS is true, the Wii U would actually be quite an engineering achievement. Not because it's a powerhouse of a console, but because it delivers brilliant performance per watt.

edit: I'm still expecting ~300 GFLOPS on the low end and ~500 GFLOPS on the high end.
 
Just to clarify, that 75W is the OUTPUT to the Wii U, not what it's drawing from the wall.


IdeaMan on NeoGAF has recently said that his sources, who are developing an exclusive IP and a multi-platform one, have seen close to a 2x framerate increase over a few dev kit revisions. On the slower kit they were still getting playable framerates (25-30, in his words) and were using the GamePad in intricate (again, his words) ways.

So I have a question. If people are thinking 400-500 GFLOPS for the Wii U, would that translate to a 60 FPS version of a 30 FPS current HD title? Is it as simple as that?

It's not that simple, but as far as the GPU is concerned, I think even a ~350 gigaflop GPU in the Wii U with a modern architecture should be a lot more powerful than what's in the consoles now.

Now this is my opinion, nothing more, but if I had to guess, I would guess that the Wii U as a whole can take a current-gen game that runs at 720p/60fps and run it at 1080p/60fps. I don't think it's powerful enough to take a 720p/30fps game to 1080p/60fps.
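A quick pixel-throughput comparison makes the gap between those two jumps concrete. This is just back-of-envelope arithmetic on raw pixel counts and framerates, ignoring every other bottleneck:

```python
p720 = 1280 * 720     # 921,600 pixels per frame
p1080 = 1920 * 1080   # 2,073,600 pixels per frame

# 720p/60 -> 1080p/60: only the resolution scales
res_only = p1080 / p720
# 720p/30 -> 1080p/60: resolution scales AND the framerate doubles
res_and_fps = (p1080 * 60) / (p720 * 30)

print(res_only)      # 2.25x the pixel throughput
print(res_and_fps)   # 4.5x the pixel throughput
```

So the first jump asks for 2.25x the pixel throughput, while the second asks for 4.5x, which is why the second seems out of reach.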
 
Just to clarify, that 75W is the OUTPUT to the Wii U, not what it's drawing from the wall.

PSU max output is 75W, which means that for a short amount of time the Wii U could actually draw very near that number.

However, consumer electronics PSUs deliver their best efficiency and lifetime at 50-65% load, which matches the 40-45W that Iwata says the Wii U actually draws normally.
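For what it's worth, the numbers check out. Here's the load math as a sketch; the 75W rating and the 50-65% sweet spot are the figures claimed in this thread, not datasheet facts:

```python
psu_max_output_w = 75  # rated output to the console, not wall draw

# typical best-efficiency load range claimed for consumer PSUs
low = psu_max_output_w * 0.50
high = psu_max_output_w * 0.65

print(low)   # 37.5 W
print(high)  # 48.75 W -- brackets the 40-45 W Iwata quoted
```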

McHuj said:
Now this is my opinion, nothing more, but if I had to guess, I would guess that the Wii U as a whole can take a current-gen game that runs at 720p/60fps and run it at 1080p/60fps. I don't think it's powerful enough to take a 720p/30fps game to 1080p/60fps.

We will see at launch. If there is not a single multiplatform retail game doing this, it just doesn't seem likely to be any kind of norm. The 480p tablet is not an insignificant burden, even if it's doing nothing special.
 
There are other possible bottlenecks, like raw fillrate (pixel/texel/zixel) or CPU-bound tasks.

I've been mulling this over in my head for awhile now.

IIRC Xenos and RSX both have the same raw fillrate: 4 Gpixels/sec (8 ROPs × 500 MHz). If the Wii U's GPU also has 8 ROPs, it would obviously need to be clocked significantly higher to have a major fillrate advantage over the current HD consoles.

I'm guessing the Wii U's GPU doesn't have a major advantage in fillrate, thus we are seeing games running in 720p. I know it's not that simple. I think the texel rate should be a lot better. We'll see.
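The fillrate math above can be sketched out. Pixel fillrate is just ROPs times core clock; the 8 ROPs / 500 MHz figures for Xenos and RSX come from this post, and the higher Wii U clock is purely hypothetical:

```python
def gpixels_per_sec(rops, clock_mhz):
    # one pixel written per ROP per cycle
    return rops * clock_mhz / 1000.0

xenos_rsx = gpixels_per_sec(8, 500)
wii_u_guess = gpixels_per_sec(8, 600)  # hypothetical 8 ROPs at 600 MHz

print(xenos_rsx)    # 4.0 Gpixels/s
print(wii_u_guess)  # 4.8 Gpixels/s -- only a modest bump over the HD twins
```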
 
I've been mulling this over in my head for awhile now.

IIRC Xenos and RSX both have the same raw fillrate: 4 Gpixels/sec (8 ROPs × 500 MHz). If the Wii U's GPU also has 8 ROPs, it would obviously need to be clocked significantly higher to have a major fillrate advantage over the current HD consoles.

I'm guessing the Wii U's GPU doesn't have a major advantage in fillrate, thus we are seeing games running in 720p. I know it's not that simple. I think the texel rate should be a lot better. We'll see.

Does anyone have a link so I can get my head around fillrates vs texel rates in order to understand performance bottlenecks? Thanks.
 
http://www.eurogamer.net/articles/2012-09-21-a-chat-about-the-power-of-the-wii-u-with-the-developer-of-a-wii-u-launch-title

Warriors Orochi 3 Hyper is on display at the Tokyo Game Show. Eurogamer reports that it features a slower frame rate and fewer on-screen enemies compared to other Dynasty Warriors games.

This can be attributed to the Wii U’s CPU. Dynasty Warriors producer Akihiro Suzuki says that its power “is a little bit less” compared to the technology in the PlayStation 3 and Xbox 360, and “the performance tends to be affected because of the CPU.”

“One of the weaknesses of the Wii U compared to PS3 and Xbox 360 is the CPU power is a little bit less. So for games in the Warriors series, including Dynasty Warriors and Warriors Orochi, when you have a lot of enemies coming at you at once, the performance tends to be affected because of the CPU. Dealing with that is a challenge.”

On the other hand, Hyper features overall improved visuals compared to what is offered on other consoles. That’s because of the Wii U’s larger amount of RAM.

“Developing on new hardware in itself was a challenge, and also making that launch date was a challenge. But from a visual standpoint, based on the performance of the Wii U, we knew the game had the capability of having much better graphics than games on PS3 and Xbox 360. Make no mistake, from a visual standpoint, it is able to produce better graphics. So our challenge was to make higher quality graphics. We were able to meet that.”

Suzuki promises that the development team will be touching things up before launch. The game is scheduled to arrive alongside Wii U in November, so the staff only has a couple of months left to improve performance.

“While the visuals are great, as is being able to improve them, we had to deal with the lower CPU power and how we can get around that issue. Actually, we’re still working on that. If you see the demo on the show floor and you try it, you’ll probably feel it’s not up to the PS3 level. But we’re working on it!”

Again, all of this comes back to the Wii U CPU. It’s new, and the team is inexperienced with this particular piece of hardware, “so there are still a lot of things we don’t know yet to bring out the most of the processing power.”

“For the PS3 it has multiple CPUs and an SPU, so you can calculate the various motions of the characters on the CPU so overall it runs smoothly. The Xbox 360 CPUs are formulated so they can spread out the processing power so things run efficiently. With the Wii U being new hardware, we’re still getting used to developing for it, so there are still a lot of things we don’t know yet to bring out the most of the processing power. There’s a lot that still needs to be explored in that area.”

Final nail in the coffin.

I'm starting to believe that anonymous Ubisoft employee from a while back who said that the CPU was on par (a little bit weaker), but that the GPU was 1.5x.
 
Apparently, a Spanish guy has received an email from AMD in which they have confirmed that the Wii U GPU is based on the discrete GPU model AMD E6760. Oh yeah, it ain't no fairy tale...

[image: screenshot of the claimed AMD email]


"The Wii U utilizes an AMD E6760 GPU, which is a specially-designed, embedded GPU inside the Wii U specifically. This is based around the 6xxx series of GPUs, but has obviously been modified for the Wii U and its specific needs and configuration."

A picture of the chip:

[image: photo of the AMD E6760 chip]


And these are the technical specifications:

[image: AMD E6760 specification sheet]


The GPU runs at 600 MHz and the TDP is 25W. Honestly, I think it is an improvement over the PS3 and Xbox 360 GPUs. Also, someone published this video a few days ago, which confirms this information.


Kudos to the guy who unveiled this, it seems authentic to me.
 
Runs contrary to information that we've gotten today that the Wii U's GPU is less than 600 GFLOPS (from StevieP of all people).

Also, how exactly is the E6760 even based on the RV7XX?
 
I went to the Trilateral Commission's website and dropped an e-mail, they told me the backstory about Kennedy's assassination and thanked me for my interest.
 
Likely fake as said above.

StevieP is one of the more optimistic people regarding Wii U hardware, so when he says that the Wii U GPU is less than 600 GFLOPS, you really know it is.
 
The GPU runs at 600 MHz and the TDP is 25W. Honestly, I think it is an improvement over the PS3 and Xbox 360 GPUs. Also, someone published this video a few days ago, which confirms this information.

Yeah, we've discussed that chip for months; your video confirms nothing except that it exists, which we already knew lol. Also, the TDP is 35 watts, not 25.

It also contradicts most sources; even the optimistic bg/StevieP ones state it's some sort of R700-based chip. And most rumors say it's mainly DX10 as well.

Also, even if it was the E6760, that wouldn't make it a beast. All they have to do to neuter the E6760 is cut the clocks to 300 MHz, 400 MHz, or whatever.
 
It could be the right chip if, say, it ran at 400 MHz. I think it is quite plausible if you want to merge the rumours of 400 GFLOPS with the rumours of the E6760 above.
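The numbers actually line up. AMD lists the E6760 with 480 stream processors, and peak single-precision throughput on these parts is stream processors × 2 FLOPs per cycle (one multiply-add) × clock. A quick sketch; the 400 MHz figure is just the hypothetical downclock from this thread:

```python
def peak_gflops(stream_processors, clock_mhz):
    # 2 FLOPs per stream processor per cycle (one fused multiply-add)
    return stream_processors * 2 * clock_mhz / 1000.0

print(peak_gflops(480, 600))  # 576.0 -- the E6760 at its stock 600 MHz
print(peak_gflops(480, 400))  # 384.0 -- right around the ~400 GFLOPS rumours
```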
 