Nintendo's next gen strategy for home & mobile

Did IBM ever do any research on getting PowerPC on mobile devices during the past 10 years?
AFAIK, their last CPU to go into a laptop was the G4 in the PowerBooks.
How hard would it be for Nintendo to co-develop a PowerPC for a handheld?

Now they're claiming there will be more than two form factors: at least a 3DS-sized handheld, a tablet, and a home console.
I couldn't tell whether they're going to target the same performance characteristics for all three.
I think IBM still develops low-power PowerPC cores, though not at the same pace as what is going on in the ARM realm.

I hope that when they speak of something that sort of grows out of the Wii U, it is just programmable shaders. IMO sticking with IBM+AMD is a bad idea, as they acknowledged that developing and supporting two different architectures is no longer worthwhile from their POV.

AMD is an option if they can get ARM cores and GCN (and upcoming architectures) to "speak" to each other well. They should go for vendors that provide both the CPU and the GPU, and there are not many:
Nvidia, AMD, Qualcomm. Then there are the Chinese IHVs, but they are really low-end and I'm not sure they are interested in the kind of volume Nintendo represents. I'm not sure even MediaTek would be interested; anyway, looking at the political and historical tensions between Japan and either China or Korea, it is unlikely to happen.

Now, Nintendo designed the 3DS hardware mostly by themselves, so I think they could do their own SoC (and do it right if they keep it simple for themselves). I would favor ARM IP: ARM is really focused on providing easy-to-implement IP, with a strong focus on time to market, and their CPUs and GPUs are designed to work together, etc.
 
Or what about just using ARM for their consoles? Even a 1.6+GHz quad core Cortex-A9 would have given the Wii U's CPU a run for its money. Cortex-A15s even more so. While it may not be particularly cutting edge in 2016, something like a quad or maybe hex core 2.5GHz Cortex-A57 would be a big step up from where they're at CPU-wise, and they could use the same core count with a lower clocked A53 setup on handhelds. Then license some kind of GPU that scales well from handheld to console. Not sure what would be feasible here (if AMD could accommodate it or not), but of course nVidia would be a strong contender, since they're licensing now.

I thought I read somewhere that core-for-core the 750 would still have an edge at similar clock speeds. I can't find it at the moment though. I found a SIMD test posted by a user on NeoGAF showing the Wii's CPU beating out a higher-clocked A8 with NEON extensions. The Wii U CPU apparently has a different cache architecture and more cache, plus a higher clock speed. RAD Game Tools got Bink 2 (optimized for SIMD) running 1080p video on Wii U, while on ARM I only see 720p mentioned, but this is just going off of their development notes.

I dunno though, because I remember Marcan saying the Ouya's CPU is better than the Wii U's. I do agree that they should ditch PPC in future platforms even if they have to let go of BC. Emulation is expensive performance-wise, but they could always work on hacking an emulator together. I know many people don't care for the 360's BC, but I've always been impressed that they pulled it off (even though it only plays like half the games and many have issues).

The Wii U's CPU is sort of one-of-a-kind as far as 750s go. Multicore, over a gigahertz, the cache, etc. That kind of optimization was probably expensive. They may have intended to use the same architecture in future platforms, although going against the grain in that regard (relative to the industry) is probably more trouble than it's worth. I would like to see them do x86 on consoles and ARM on handhelds with AMD GPUs in an SoC, with the same OS on both. I don't know if that would allow them to release the same software titles on both the way they sound like they want to, though.
 
I thought I read somewhere that core-for-core the 750 would still have an edge at similar clock speeds. I can't find it at the moment though. I found a SIMD test posted by a user on NeoGAF showing the Wii's CPU beating out a higher-clocked A8 with NEON extensions. The Wii U CPU apparently has a different cache architecture and more cache, plus a higher clock speed. RAD Game Tools got Bink 2 (optimized for SIMD) running 1080p video on Wii U, while on ARM I only see 720p mentioned, but this is just going off of their development notes.

I dunno though, because I remember Marcan saying the Ouya's CPU is better than the Wii U's. I do agree that they should ditch PPC in future platforms even if they have to let go of BC. Emulation is expensive performance-wise, but they could always work on hacking an emulator together. I know many people don't care for the 360's BC, but I've always been impressed that they pulled it off (even though it only plays like half the games and many have issues).

The Wii U's CPU is sort of one-of-a-kind as far as 750s go. Multicore, over a gigahertz, the cache, etc. That kind of optimization was probably expensive. They may have intended to use the same architecture in future platforms, although going against the grain in that regard (relative to the industry) is probably more trouble than it's worth. I would like to see them do x86 on consoles and ARM on handhelds with AMD GPUs in an SoC, with the same OS on both. I don't know if that would allow them to release the same software titles on both the way they sound like they want to, though.

That was blu (darkblu around these parts) who posted those test results. They were quite surprising. From the looks of it, a higher-clocked Espresso with more cores should perform well enough to hang with the Jaguars in use by Sony/MS, even if they leave SIMD the way it is now. As you stated, they poured a lot of R&D into making that 750 multicore.

Also, they have been (and currently are) optimizing their emulators to run on the PPC cores. I imagine that this will be something they want to keep working long term. They've got Espresso emulating ARM9 now. They could probably get 3DS emulation to work the same way with their next hardware. Crazy crazy crazy.
 
That was blu (darkblu around these parts) who posted those test results. They were quite surprising. From the looks of it, a higher-clocked Espresso with more cores should perform well enough to hang with the Jaguars in use by Sony/MS, even if they leave SIMD the way it is now. As you stated, they poured a lot of R&D into making that 750 multicore.

darkblu's tests were pretty limited, just small matrix multiplication. While they're representative of something, they hardly give a broad picture of performance. And he didn't compare with Jaguar but with Bobcat, which has half the execution width. IIRC the assembly code was hand-rolled for PPC because the compiler wasn't too good with it, while the others used C. He checked the NEON code generated by the compiler to make sure it wasn't getting anything terribly wrong, but I'm sure hand-written code could have done better; NEON scheduling is extremely fiddly on Cortex-A8/A9.
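
For anyone unfamiliar with the kind of test being discussed, here's a rough sketch (my own illustration, not darkblu's actual code) of what a small-matrix NEON microbenchmark tends to look like: a 4x4 single-precision multiply in column-major layout, written with intrinsics and left to the compiler to schedule.

Code:
#include <arm_neon.h>

/* out = a * b; all are 4x4 column-major float matrices.
 * Purely illustrative -- the layout and naming are my assumptions. */
static void mat4_mul_neon(float *out, const float *a, const float *b)
{
    /* Load the four columns of a once. */
    float32x4_t a0 = vld1q_f32(a + 0);
    float32x4_t a1 = vld1q_f32(a + 4);
    float32x4_t a2 = vld1q_f32(a + 8);
    float32x4_t a3 = vld1q_f32(a + 12);

    for (int col = 0; col < 4; ++col) {
        /* Each output column is a linear combination of a's columns,
         * weighted by the corresponding column of b. */
        float32x4_t acc = vmulq_n_f32(a0, b[col * 4 + 0]);
        acc = vmlaq_n_f32(acc, a1, b[col * 4 + 1]);
        acc = vmlaq_n_f32(acc, a2, b[col * 4 + 2]);
        acc = vmlaq_n_f32(acc, a3, b[col * 4 + 3]);
        vst1q_f32(out + col * 4, acc);
    }
}

How well the compiler schedules that vmla chain (each multiply-accumulate here depends on the previous one) is exactly the sort of place where hand-written NEON tends to win on Cortex-A8/A9.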

All of these other platforms still benefit from integer SIMD, which can be a big deal for some workloads.

I think you guys are overestimating how much development effort went into making Wii U multicore. I don't think they moved beyond the coherency protocol that Broadway already implemented, and they're using separate caches.

Also, they have been (and currently are) optimizing their emulators to run on the PPC cores. I imagine that this will be something they want to keep working long term. They've got Espresso emulating ARM9 now. They could probably get 3DS emulation to work the same way with their next hardware. Crazy crazy crazy.

Releasing some games on a Virtual Console-like platform is not the same as offering general backwards compatibility. Of course, I could have told you they'd be able to emulate the DS to some extent on Wii U, although I didn't think they would.
 
I was just thinking it would be costly compared to using something more off-the-shelf. It looks like a one-off product created and customized specifically for Nintendo. IIRC, it's the highest-clocked 750 and the only one that's multicore (could be wrong though). The last time IBM did that with a 750, they released Broadway as a general-purpose CPU (the 750CL).

I wonder what Iwata was referring to here (from the earlier posted quote):

However, I think that we no longer need this kind of effort under the current circumstances. In this perspective, while we are only going to be able to start this with the next system, it will become important for us to accurately take advantage of what we have done with the Wii U architecture. It of course does not mean that we are going to use exactly the same architecture as Wii U, but we are going to create a system that can absorb the Wii U architecture adequately. When this happens, home consoles and handheld devices will no longer be completely different, and they will become like brothers in a family of systems.

Could be a translation issue too. What could "absorb the Wii U architecture adequately" mean? And why would that bridge handheld and home consoles? The context is about an iOS-like shared architecture among different devices, but this comment about the Wii U is kind of odd in that context. Unless it's referring to BC, but then he also says it doesn't mean they'll use the exact same architecture as Wii U. Then again, such a comment doesn't rule out using it, or some aspect of it, either. Or I could be reading too much into it and he's just referring to accounts (NNIDs).
 
I was just thinking it would be costly compared to using something more off-the-shelf. It looks like a one-off product created and customized specifically for Nintendo. IIRC, it's the highest-clocked 750 and the only one that's multicore (could be wrong though). The last time IBM did that with a 750, they released Broadway as a general-purpose CPU (the 750CL).

I wouldn't really read anything into the clock speed. 750CL could officially clock all the way up to 1GHz. That was way back on 90nm. Wii U's CPU is presumably at 45nm, which could have easily enabled the ~25% clock increase all by itself, without redesigning the uarch at all.

Whether or not we see a commercialized version of Wii U's CPU too remains to be seen. It's hard to really imagine the broader market appeal.
 
Could be a translation issue too. What could "absorb the Wii U architecture adequately" mean?

I think there's no interpretation other than backwards compatibility.
Emulating a 1.2GHz PPC on another low-power ARM or x86 CPU would be a difficult task, so I think this means one of three options:

- Next Nintendo consoles use behemoth CPUs so powerful that they can emulate PPC (unlikely)
- Next Nintendo consoles use PowerPC cores as main CPU
- Next Nintendo home console has a 3-core Espresso for tasks such as I/O and background work (stuff that won't be needed in handheld consoles) and also for emulation in Wii U mode, the same way the PS2 used the PS1's MIPS CPU for I/O. This way they can use ARM, x86 or even MIPS cores at will.
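
To give a feel for why option one is unlikely: a pure interpreter has to fetch, decode and dispatch every guest instruction in software, so the host burns many of its own cycles per emulated PPC instruction. The sketch below is purely illustrative (not taken from any real emulator) and only handles a single opcode:

Code:
#include <stdint.h>

typedef struct {
    uint32_t gpr[32];   /* general-purpose registers */
    uint32_t pc;        /* program counter */
} ppc_state;

/* Guest memory is big-endian; the host may not be. */
static uint32_t read32(const uint8_t *mem, uint32_t addr)
{
    return (uint32_t)mem[addr] << 24 | (uint32_t)mem[addr + 1] << 16 |
           (uint32_t)mem[addr + 2] << 8 | mem[addr + 3];
}

static void interp_step(ppc_state *st, const uint8_t *mem)
{
    uint32_t insn = read32(mem, st->pc);          /* fetch    */
    uint32_t opcd = insn >> 26;                   /* decode   */
    uint32_t rt = (insn >> 21) & 31, ra = (insn >> 16) & 31;

    switch (opcd) {                               /* dispatch */
    case 14: /* addi rt, ra, simm */
        st->gpr[rt] = (ra ? st->gpr[ra] : 0) + (int16_t)(insn & 0xffff);
        break;
    /* ...hundreds of other opcodes, plus paired-single FP, the MMU,
     * timing and the rest of the system would go here. */
    default:
        break; /* unhandled in this sketch */
    }
    st->pc += 4;
}

Real emulators get around most of this overhead with dynamic recompilation, but even then, keeping up with a 1.2GHz triple-core guest on a low-power host is a tall order.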
 
- Next Nintendo consoles use behemoth CPUs so powerful that they can emulate PPC (unlikely)
IIRC Arwin's wife owns one of those nice Windows 8 convertibles that are powered by a Kabini CPU.
It would be interesting to see how such a setup runs a GC/Wii emulator; I've no idea.
Now, the Wii U cores are twice as fast as the Wii's and there are three of them, but it would still be interesting to have a reference point.
I searched YouTube and could not find a single video of an A4-4500; on top of that, it would be interesting to see the CPU load under emulation.
 
I think this means one of three options:
Option three sounds the most likely, considering how teensy-tiny Espresso is even at 45nm. On whatever semiconductor process is current when the successor to the wuu is manufactured, it would almost literally disappear. Then again, Nintendo is Nintendo, and they could just as well decide on a whim to go with PPC as the main CPU again merely to be contrary, just as the rest of the console business rides the x86 train to success in a wholly different direction. *shrug*

Crazy thought:

'Absorbing the wuu' into the new console means the stationary console will include the portable console when purchased, for use as a second screen/wuublet.
 
FWIW, I think their next system should just be a tablet, one that comes with a tiny adapter that allows you to "AirPlay" to your TV if you want. But otherwise it is fully self contained and portable. If they don't want to compete on hardware then they should just stop competing on hardware and put all the software in one place. Give up on exotic multi-screen layouts. Make a high quality ~7 inch tablet with a nice screen, good controls and sell an official cover to keep it safe in a bag. Wait a couple years so the hardware is at least an upgrade over the WiiU and stuff it with RAM and expandable storage. Use cartridges if they need to keep retail happy, but otherwise make all games available as a download. Include virtual console compatibility with everything up to GameCube and make all that shit cross-buy and attached to a user account. Don't sell it for more than $250. Make sure Netflix, Amazon Instant and Hulu+ apps are there. Forget about backwards compatibility with Wii/WiiU. Do a couple high profile ports to the tablet if needed. Call it a day.
 
darkblu's tests were pretty limited, just small matrix multiplication. While they're representative of something, they hardly give a broad picture of performance. And he didn't compare with Jaguar but with Bobcat, which has half the execution width. IIRC the assembly code was hand-rolled for PPC because the compiler wasn't too good with it, while the others used C. He checked the NEON code generated by the compiler to make sure it wasn't getting anything terribly wrong, but I'm sure hand-written code could have done better; NEON scheduling is extremely fiddly on Cortex-A8/A9.

All of these other platforms still benefit from integer SIMD, which can be a big deal for some workloads.

I think you guys are overestimating how much development effort went into making Wii U multicore. I don't think they moved beyond the coherency protocol that Broadway already implemented, and they're using separate caches.



Releasing some games on a Virtual Console-like platform is not the same as offering general backwards compatibility. Of course, I could have told you they'd be able to emulate the DS to some extent on Wii U, although I didn't think they would.

Does this make the Espresso less flexible for developers? The only developer comment I can remember seeing was from Slightly Mad with their multi-threaded shadows. I haven't heard many comments about multithreading on the Wii U.
 
I think there's no interpretation other than backwards compatibility.

I disagree. Nintendo likes to use vague language that sometimes doesn't mean exactly what it sounds like. Or just makes things up entirely: remember the DS being a "third pillar"?

Absorbing the architecture could simply mean that they intend for the successor to use the same tablet-based controller.

Does this make the Espresso less flexible for developers? The only developer comment I can remember seeing was from Slightly Mad with their multi-threaded shadows. I haven't heard many comments about multithreading on the Wii U.

Having different cache sizes is certainly less flexible. Having separate caches does tend to be more restrictive, and does tend to mean that you lose some higher-speed communication between threads. Part of the efficiency depends on whether the cores can communicate directly somehow, or whether it just relies on the coherency protocol to invalidate cache lines in other cores, pushing communication back out to main RAM.
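
As a rough illustration of what that coherency cost looks like from software (generic C, nothing Wii U-specific, and the 64-byte line size is just an assumption for the example): two threads updating counters that share a cache line make the line bounce between cores, while padding each counter out to its own line avoids the ping-pong entirely.

Code:
#include <pthread.h>
#include <stdint.h>
#include <stdio.h>

#define CACHE_LINE 64   /* assumed line size, purely for illustration */

/* Problematic layout: both counters live in one cache line, so every
 * write by one core invalidates the line in the other core's cache. */
struct shared_counters { uint64_t a, b; };

/* Padded layout: each counter gets its own line, so the cores no longer
 * fight over ownership of a single line. */
struct padded_counters {
    uint64_t a;
    char     pad[CACHE_LINE - sizeof(uint64_t)];
    uint64_t b;
};

static struct padded_counters counters;

static void *bump_a(void *arg) { (void)arg; for (long i = 0; i < 100000000L; ++i) counters.a++; return NULL; }
static void *bump_b(void *arg) { (void)arg; for (long i = 0; i < 100000000L; ++i) counters.b++; return NULL; }

int main(void)
{
    pthread_t t1, t2;
    pthread_create(&t1, NULL, bump_a, NULL);
    pthread_create(&t2, NULL, bump_b, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    printf("%llu %llu\n", (unsigned long long)counters.a,
                          (unsigned long long)counters.b);
    return 0;
}

Swapping padded_counters for shared_counters typically makes the same program several times slower on a multicore part, which is the kind of cost you pay whenever threads have to talk through the coherency protocol instead of a shared cache.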
 
I disagree. Nintendo likes to use vague language that sometimes doesn't mean exactly what it sounds like. Or just makes things up entirely: remember the DS being a "third pillar"?

IIRC, the DS was supposed to be a third pillar and it wasn't originally meant to replace and discontinue the GameBoy line. But then the DS' astounding success made Nintendo focus on the new idea instead of launching a new GameBoy (for example, a GameBoy 3D coming out in ~2009?).
And as far as I recall, when the DS was announced, investors went batshit crazy because Nintendo didn't use the GameBoy name. This could be because Nintendo wasn't really sure about the DS' market reception, and they didn't want to screw up the GameBoy name with an unproven idea.
Just like the Virtual Boy wasn't named Famicom VR or GameBoy VR.



Absorbing the architecture could simply mean that they intend for the successor to use the same tablet-based controller.
I don't think so, because in that paragraph Iwata is talking about hardware architecture, since he specifically mentions CPU and GPU development:

Satoru Iwata said:
For example, currently it requires a huge amount of effort to port Wii software to Nintendo 3DS because not only their resolutions but also the methods of software development are entirely different. The same thing happens when we try to port Nintendo 3DS software to Wii U. If the transition of software from platform to platform can be made simpler, this will help solve the problem of game shortages in the launch periods of new platforms. Also, as technological advances took place at such a dramatic rate, and we were forced to choose the best technologies for video games under cost restrictions, each time we developed a new platform, we always ended up developing a system that was completely different from its predecessor. The only exception was when we went from Nintendo GameCube to Wii. Though the controller changed completely, the actual computer and graphics chips were developed very smoothly as they were very similar to those of Nintendo GameCube, but all the other systems required ground-up effort. However, I think that we no longer need this kind of effort under the current circumstances. In this perspective, while we are only going to be able to start this with the next system, it will become important for us to accurately take advantage of what we have done with the Wii U architecture. It of course does not mean that we are going to use exactly the same architecture as Wii U, but we are going to create a system that can absorb the Wii U architecture adequately. When this happens, home consoles and handheld devices will no longer be completely different, and they will become like brothers in a family of systems.

It really doesn't sound like he's talking about using tablets as controllers.
 
If Nintendo chooses to go with IBM again for the console CPU, one could only hope they'd select at least a PowerPC 74xx (introduced with the Power Macintosh G4 in 1999) as the starting point for the design.

If the console CPU had say, eight cores @ any reasonable clockspeed I would imagine it would easily be able to thrash the 8 core Jaguar CPUs in PS4 and X1.

As for graphics, if Nintendo decides not to stick with AMD, I sure would like the handheld and console to have PowerVR Series 6 GPUs (say, MP4 and MP16 respectively), especially given PowerVR's scalability in general. PowerVR would be a wise choice if the console uses, say, a 128-bit bus. Wii and Wii U both used a 64-bit bus if I'm not mistaken.

Aside from hardware, Nintendo's next generation platforms would be a perfect time to resurrect the Famicom / NES and GameBoy names. It's time Nintendo broke with DS and Wii branding.
 
IIRC, the DS was supposed to be a third pillar and it wasn't originally meant to replace and discontinue the GameBoy line. But then the DS' astounding success made Nintendo focus on the new idea instead of launching a new GameBoy (for example, a GameBoy 3D coming out in ~2009?).
And as far as I recall, when the DS was announced, investors went batshit crazy because Nintendo didn't use the GameBoy name. This could be because Nintendo wasn't really sure about the DS' market reception, and they didn't want to screw up the GameBoy name with an unproven idea.
Just like the Virtual Boy wasn't named Famicom VR or GameBoy VR.

You recall exactly the same thing I did, which is merely what Nintendo claimed at the time. There's zero evidence that Nintendo had any honest intention of developing some other handheld concurrently, and all things considered, I'd be surprised if they actually were. The whole thing was most likely just done to keep investors from panicking (oddly you say they panicked because it wasn't called Gameboy..)

I don't think so, because in that paragraph Iwata is talking about hardware architecture, since he specifically mentions CPU and GPU development:

It really doesn't sound like he's talking about using tablets as controllers.

Okay, but here's the thing. You're taking this to mean that Nintendo is going to use another PowerPC-based CPU in their next console and handheld. But in this regard Wii U already kept using a PowerPC-based CPU, in fact, one that barely changed vs the last gen one. Yet here we have Nintendo saying it was a totally new design from the ground-up that caused them a lot of migration problems. Does it not stand to reason that they're talking about something more than ISA here? What is it that you think really made migrating from Wii to Wii U difficult? It probably had something to do with not being able to effectively migrate the same SDKs and having to deal with three cores and a more modern unified shader-based GPU with eDRAM. Maybe if their next console still uses three cores and a modern unified shader-based GPU it won't really matter if they change the uarchs.

But maybe you can also see what I mean when I say Nintendo is being deliberately vague and full of double speak and why you shouldn't take any very specific meaning from this.
 
If Nintendo chooses to go with IBM again for the console CPU, one could only hope they'd select at least a PowerPC 74xx (introduced with the Power Macintosh G4 in 1999) as the starting point for the design.

Basically, the only good reason to keep using PPC is direct backwards compatibility. Once they move from paired singles to AltiVec they lose that.
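
For anyone wondering why the two don't line up: paired singles on Gekko/Broadway/Espresso pack two floats into the regular 64-bit FPRs and use their own instructions (ps_madd, plus the quantized psq_l/psq_st loads and stores), while AltiVec works on 128-bit vectors of four floats in a separate register file. A minimal sketch of the AltiVec side (my own illustration, not Nintendo/IBM code):

Code:
#include <altivec.h>

/* AltiVec fused multiply-add: four lanes per operation, operating on the
 * 128-bit vector register file. The paired-single equivalent (ps_madd)
 * handles only two lanes and lives in the scalar FPRs, so neither binaries
 * nor data layouts carry over between the two. */
vector float madd4(vector float a, vector float b, vector float c)
{
    return vec_madd(a, b, c);   /* a * b + c, per lane */
}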

If the console CPU had say, eight cores @ any reasonable clockspeed I would imagine it would easily be able to thrash the 8 core Jaguar CPUs in PS4 and X1.

I guess some people are expecting Nintendo to do a complete 180 on their power consumption stance - so tiny and quiet that mom won't mind having it run in the living room (not that mom ever minded before; I'm not sure what they're on about here). I really don't think anyone has been making PowerPC designs that are comparable to something like Silvermont, Cortex-A15, or even Jaguar in terms of peak perf in general-purpose code vs perf/W.

As for graphics, if Nintendo decides not to stick with AMD, I sure would like the handheld and console to have PowerVR Series 6 GPUs (say, MP4 and MP16 respectively), especially given PowerVR's scalability in general. PowerVR would be a wise choice if the console uses, say, a 128-bit bus. Wii and Wii U both used a 64-bit bus if I'm not mistaken.

I don't think Rogue scales the way you think it does; I don't think you can really do something resembling an MP16, or maybe even an MP4. And why not stick with AMD? I guess it depends on how much they're willing to license newer tech for some arbitrary SoC that may not include x86 cores.

Aside from hardware, Nintendo's next generation platforms would be a perfect time to resurrect the Famicom / NES and GameBoy names. It's time Nintendo broke with DS and Wii branding.

I think it'd be a big mistake to go back to using different names in different markets. And I really don't see something with "NES" in it going over well; it'll just make it sound archaic. Not sure about GameBoy, but 3DS is doing fine, and I don't think they really need to change the image here a lot.
 
As crazy as those "Fusion" specs seemed at the time, now that we know this is the direction Nintendo is going in the future, they did have the Wii U CPU listed as an additional CPU in those specs, so backwards compatibility would likely be handled by keeping the tiny little CPU in the loop. So Nintendo's options for the main CPU would be wide open. Not saying that we should believe those rumored specs, but because of the recent announcements, it's not 100% out of the question. Maintaining 100% backwards compatibility would make transitioning from Wii U earlier than expected a more seamless process. Who knows, but it would be funny if in a couple of years Nintendo reveals new hardware with those rumored specs.
 
As crazy as those "Fusion" specs seemed at the time, now that we know this is the direction Nintendo is going in the future, they did have the Wii U CPU listed as an additional CPU in those specs, so backwards compatibility would likely be handled by keeping the tiny little CPU in the loop. So Nintendo's options for the main CPU would be wide open. Not saying that we should believe those rumored specs, but because of the recent announcements, it's not 100% out of the question. Maintaining 100% backwards compatibility would make transitioning from Wii U earlier than expected a more seamless process. Who knows, but it would be funny if in a couple of years Nintendo reveals new hardware with those rumored specs.

No. Just no. Nothing Nintendo said gave any kind of credence to those BS specs. They're garbage for a whole bunch of reasons.

Backwards compatibility has nothing to do with what Nintendo said in the first place. Nintendo already has that with Wii U. Claiming that they'll attain backwards compatibility again by including the CPU separately is not exactly an inspired idea. But I think if they have any sense they'll do a real SoC and that may be challenging to do while keeping any kind of PPC cores, because IBM doesn't have experience porting them to other processes and GPU/SoC makers don't have experience with IBM's process.
 
The "fusion" specs have got to be trolling. "AMD" Adreno? 8-core Power 8? Can you imagine how expensive that would be? The die size would be enormous and power consumption would be through the roof. The shared architeture thing was already announced at the previous investor meeting and fusion was taken from an old URL they registered.

Nintendo should have considered changing to a newer PPC architecture with the Wii U. At least then, assuming it was clocked high enough, they could have tried emulating paired singles for Wii BC. There probably isn't a PPC CPU fast enough to do the same for Espresso. If PPC appears in their next console, I would bet on it being another 750/Gekko/Broadway/Espresso-derived design mostly for BC, as ToTTenTranz mentioned above, like the PS2 keeping the PS1 CPU for I/O, possibly replacing the ARM CPU if that doesn't break BC...

Nobody would like this, but I can also see them using another evolved 750 design again. They've been using this architecture for 14 years, which probably came in handy when they did the WWHD port in six months. As Fourth Storm mentioned, their emulators are optimized for PPC. They've promised N64, GCN, GBA, and now DS VC support, so I've got to believe it's being worked on (and hopefully not on a per-game basis as the Wii likely did it). I also think, even with the low shader count, the idea was to utilize the GPU for compute (personally I'm pessimistic about GPGPU outside of folding proteins and mining bitcoins, but I hope I'm wrong).
 
The whole thing was most likely just done to keep investors from panicking (oddly you say they panicked because it wasn't called Gameboy..)

Of course. GameBoy was a powerful brand that everyone recognized in 2004, yet Nintendo came in and announced the DS.
Imagine Apple announcing in their next keynote that they will discontinue the iPhone brand and replace it with "Apple PS". Do you think their investors would like that idea?



Okay, but here's the thing. You're taking this to mean that Nintendo is going to use another PowerPC-based CPU in their next console and handheld.
No. From "absorbing the Wii U architecture" in a CPU/GPU context, I take it to mean there will be backwards compatibility with the Wii U.



But in this regard Wii U already kept using a PowerPC-based CPU, in fact, one that barely changed vs the last gen one. Yet here we have Nintendo saying it was a totally new design from the ground-up that caused them a lot of migration problems. Does it not stand to reason that they're talking about something more than ISA here? What is it that you think really made migrating from Wii to Wii U difficult?

No one said the Wii to Wii U migration was difficult.
If you read Iwata's statement, you'll see that he mentions troubles with Wii -> 3DS ports and 3DS -> Wii U ports. PowerPC -> ARM and ARM -> PowerPC.
He's talking exclusively about handheld <-> home-console ports.


But maybe you can also see what I mean when I say Nintendo is being deliberately vague and full of double speak and why you shouldn't take any very specific meaning from this.

It doesn't seem vague to me. Iwata said:

1 - Future handhelds and home consoles will start using the same CPU architecture, and GPUs with at least compatible shaders between them.

2 - Next home console will be backwards compatible with the Wii U.

Whether they follow through on this or not, I can't promise. Neither can they, since a lot could happen between now and fiscal year 2016.
 