Wii U hardware discussion and investigation

It's trivial to have only the games made after the policy change run faster (and only if they request it).
What we can certainly agree on is that there's no indication at all that such a scheme is planned.
 
The idea is not so outlandish when we have the precedent of Sony doing precisely the same thing, though tripling the clock is complete bunk (that 2GHz ARM figure too, no question).

Sony didn't do the same thing. They limited clock speed on PSP to improve battery life. Then they gradually increased what a game would ask for once there was enough of an install base that people wouldn't start making battery life generalizations based on some killer apps.

The power consumption on Wii U barely matters to anyone. Nintendo has no real incentive to artificially limit it.
 
If that's the case, then why haven't Sony, Microsoft, or Nintendo ever utilized that over the past decade on their consoles? Surely they would have released yearly models with updates for new and improved performance.
 
Are the PLLs in the Wii U even programmable? That has never been the case in previous consoles they've released; to upclock them you had to solder in a new oscillator...
 
I actually know this.

Espresso, like other members of the 750 series (probably GX/FX, judging by the cache size and the new 'unknown' registers identified by fo), has an incredibly short pipeline: something like four stages.

Even if suddenly bumping clock speeds by this magnitude weren't insane, you don't just clock an architecture with four pipeline stages to over 3GHz, no matter how much it has shrunk.

I was excited to see this thread update, but disappointed when I saw why.

I guess I'll have to see what Nintendo does with Espresso before we'll have anything interesting to talk about. If they can somehow cobble together a jerry-rigged 'dmm' solution on freaking Broadway, they'll be able to do some neat stuff with Espresso.
 
This is the silliest rumor I have heard in a long time. Anyone with any technical background can see that all the technical details are just wrong...

And there's zero business sense to do something like this: "Let's sabotage our own launch sales and reputation by making our 3x faster console look like it's only equal to competition" :)

I remember that there were similar rumors with the original Wii. It was supposed to have some extra hidden chips that would make it faster than x360/PS3 :)

Good laughs though :)
 
Sony didn't do the same thing. They limited clock speed on PSP to improve battery life.
So FOR A REASON, then. What's Nintendo's reason? To keep wattage down to 40 watts at launch, but increase it to 60/80 watts after six months, which would otherwise have been objectionably high but is now acceptable?

There are only three ways this can pan out:

1) TRUE - Nintendo deliberately gimped their console to lose early interest
2) TRUE - Nintendo's engineers are incompetent and didn't realise their hardware could perform much better than expected
3) FALSE

Option 3 is the only one that's positive for Nintendo.
 
Someone raised an interesting point on another forum that Nintendo's eventual goal will be to shrink down the size of the components enough that the whole circuitry can fit inside the tablet, so that it becomes truly portable.
 
Someone raised an interesting point on another forum that Nintendo's eventual goal will be to shrink down the size of the components enough that the whole circuitry can fit inside the tablet, so that it becomes truly portable.

Someone raised that interesting point in this forum over half a year ago.
 
Someone raised an interesting point on another forum that Nintendo's eventual goal will be to shrink down the size of the components enough that the whole circuitry can fit inside the tablet, so that it becomes truly portable.

And some mod is responding on this forum saying how that has nothing to do with their current non-portable offering. That would be more of a 3DS follow-up, and it will be at least three years out given Nintendo's current pace. Also, there's an existing thread talking about all of that here.
 
Are the PLLs in the Wii U even programmable? That has never been the case in previous consoles they've released; to upclock them you had to solder in a new oscillator...

The PLLs in anything recent are programmable. Wii and Wii U both have programmable PLLs; they are just integrated into the SoC. This is true for Nintendo and other systems; everyone does it. If you have the register spec and the ability to access those registers, you can set the clocks to just about anything.

All of those PLLs on the SoC are sourced from the same reference clock, so you can also change the frequency by changing the oscillator on the PCB. Then you're keeping the same multiplier relative to the reference frequency because you haven't changed the register settings. E.g., changing from a 27MHz reference to 30MHz changes the processor bus frequency from 243MHz (9x27) to 270MHz (9x30) on Wii. Software still thinks it is running at 243MHz because it expects the reference clock to be 27MHz. You can do the same thing in software by changing the registers to generate a 10x clock rather than a 9x clock, but now it knows it is running at 270MHz.
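The arithmetic above can be sketched in a few lines. This is just a sanity check of the figures in the post (27/30MHz references, 9x/10x multipliers), not real register code:

```python
# Sketch of the PLL arithmetic described above: the output clock is
# simply the reference crystal frequency times the programmed multiplier.
def pll_output_mhz(reference_mhz, multiplier):
    return reference_mhz * multiplier

# Stock Wii processor bus: 27 MHz reference, 9x multiplier.
print(pll_output_mhz(27, 9))    # 243

# Swap the crystal for 30 MHz without touching the registers:
print(pll_output_mhz(30, 9))    # 270, but software still assumes 243

# Or keep the 27 MHz crystal and reprogram the multiplier to 10x:
print(pll_output_mhz(27, 10))   # 270, and software knows it
```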

It doesn't have to be whole numbers either, though many PLLs will produce a better clock when using a whole multiple of the reference clock.

Changing the reference clock rather than the register settings can also cause problems. That same reference is used for the PLL that generates the display output. When it needs to generate a specific pixel clock to output something like 720p, the registers are set based on a 27MHz reference. If you move far enough away from 27 without the software adjusting the PLL programming, you'll eventually get far enough out of spec that the TV won't be able to make sense of your no-longer-720p signal.
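A rough illustration of that display problem, assuming the standard 74.25MHz 720p pixel clock (that exact figure is my assumption, not from the post): with the PLL still programmed against a 27MHz reference, a faster crystal drags the pixel clock proportionally out of spec.

```python
# Rough sketch: how an off-spec reference clock drags the display pixel
# clock out of spec when the PLL programming isn't updated to match.
# The 74.25 MHz 720p pixel clock is an assumed standard value.
TARGET_720P_MHZ = 74.25
NOMINAL_REF_MHZ = 27.0

# Effective multiplier programmed into the display PLL registers (2.75x).
multiplier = TARGET_720P_MHZ / NOMINAL_REF_MHZ

for ref_mhz in (27.0, 28.0, 30.0):
    pixel_clock = ref_mhz * multiplier
    error_pct = 100.0 * (pixel_clock - TARGET_720P_MHZ) / TARGET_720P_MHZ
    print(f"{ref_mhz:.1f} MHz ref -> {pixel_clock:.2f} MHz pixel clock "
          f"({error_pct:+.1f}% off 720p spec)")
```

At 30MHz the pixel clock lands around 82.5MHz, roughly 11% high, which is far outside anything a TV will lock onto as 720p.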
 
Overclocking rumours are funny :) I'm not sure why anyone would believe them, let alone repeat them.

The concept isn't out of the realms of possibility, however. (Didn't they do something similar with the 3DS? I guess they could have underclocked it due to OS inefficiency etc. and were struggling to keep the heat down.) It's just the extent to which the rumour went that made it laughable.

Had they said the clock speed had been increased by like 2-5%, I might have believed it. They went for broke, though, with an almost 200% increase, and as such this rumour should never have been reported by anybody.
 
They didn't change any clockspeeds in 3DS, they just released some of the previously reserved CPU time on one core.
 
They didn't change any clockspeeds in 3DS, they just released some of the previously reserved CPU time on one core.

Well, they put it to damn good use in Luigi's Mansion. Damn, DAMN impressive work they got out of that ARM11 and its incredibly picky VFPs. Props to Next Level Games.
 
They didn't change any clockspeeds in 3DS, they just released some of the previously reserved CPU time on one core.

Like I said, something similar ;) (as in changing what is available to developers through a firmware update). But yeah, it's totally different to upping the clock speed, for sure.
 
To get back on topic: I have been looking more at the GPU in the Wii U. Based on comparable cards, it seems Fourth Storm's finding of 160:8:8 might be the correct makeup.

Power consumption was talked about months ago, and under 15 W seems to be the range needed for the GPU.
HD 5550, 320-shader card @ 550 MHz: 33 W (40 nm)
HD 6450, 160-shader card @ 625 MHz: 13 W (40 nm)

http://www.techpowerup.com/reviews/Sapphire/HD_6450_Passive/25.html
http://www.techpowerup.com/reviews/HIS/Radeon_HD_5550/27.html

Performance: based on what we've seen in games, it looks to have a little extra performance compared to PS360.

Radeon HD 6450 (40 nm, 160:8:4): 293
Nvidia GeForce 7900 GT: 238

http://www.videocardbenchmark.net/gpu.php?gpu=GeForce+7900+GT/GTO&id=1253 http://www.videocardbenchmark.net/gpu.php?gpu=Radeon+HD+6450&id=267

I know function brought up these GPU benchmarks before.

The only thing that doesn't fit with 160 ALUs is the size of the ALU blocks themselves. What if they spread the ALUs out to reduce the heat?
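The comparison above boils down to two quick checks, using only the figures quoted in this post (the 13 W board power and the PassMark scores from the linked pages, plus the ~15 W GPU budget discussed earlier):

```python
# Sanity check on the figures quoted above: does a 160-shader part fit
# the ~15 W GPU power budget while still edging out PS360-era
# performance (approximated here by the 7900 GT's PassMark score)?
hd6450_watts = 13        # board power from the linked HD 6450 review
hd6450_score = 293       # PassMark score from the linked page
g7900gt_score = 238      # PassMark score from the linked page
power_budget_watts = 15  # rough Wii U GPU budget discussed earlier

fits_budget = hd6450_watts <= power_budget_watts
beats_7900gt = hd6450_score > g7900gt_score
print(f"Fits budget: {fits_budget}, beats 7900 GT: {beats_7900gt}")
print(f"Score per watt: {hd6450_score / hd6450_watts:.1f}")
```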
 
lol. Maybe if you had said 1.4GHz and 600MHz...

Even then it would remain closer to Vita than any console lol


(pedantic reply incoming)

Maybe... but if so, then only just. Vita's CPU is, what, 1GHz max, and its GPU ~400MHz?

PS4 is likely a 1.8GHz CPU and a ~800MHz GPU (if sources are to be believed).

So if the Wii U had received a bump to 1.4GHz/600MHz, it would sit perfectly in the middle.

But it hasn't. That rumour was drivel.

But my point is, comparing clock speeds against handhelds isn't really "lol" worthy going into the next generation. I know you were just making a joke but I was bored and feeling picky. I'm sure someone equally as pedantic will swoop in and point out that I'm wrong now :)
 
(pedantic reply incoming)

Maybe... but if so, then only just. Vita's CPU is, what, 1GHz max, and its GPU ~400MHz?

PS4 is likely a 1.8GHz CPU and a ~800MHz GPU (if sources are to be believed).

So if the Wii U had received a bump to 1.4GHz/600MHz, it would sit perfectly in the middle.

But it hasn't. That rumour was drivel.

But my point is, comparing clock speeds against handhelds isn't really "lol" worthy going into the next generation. I know you were just making a joke but I was bored and feeling picky. I'm sure someone equally as pedantic will swoop in and point out that I'm wrong now :)

I'm talking about performance, not clock speed. The damn thing can't run half the 2013 PS3/360 games: GTA5, SR4, Metro: Last Light, Crysis 3, the list goes on and on. The OUYA 2 and Nvidia Shield 2 will probably crap on it next year.
 