Could next-gen consoles focus mainly on the CPU?

Not without more substantial cooling and power delivery. Realistically, the 3x PPC750 scheme was dead on arrival because of its limited SIMD capability and only ~1.2 GHz clock speed. Nintendo should have just figured out a way to integrate the Broadway core either into the main system hub (which had pretty much everything aside from the Espresso CPU) or onto the CPU die, but still used more modern CPU cores to run actual Wii U games. There was a bewildering number of PPC cores out there to make use of.
Oh, no doubt; I've thought they should have incorporated Broadway as a secondary chip since the day the Wii U launched. I will say, what they managed to get out of such an old design is really impressive, though.
 
Are you referring to my post about the Wii U? I love how IBM said the Wii U had Watson/POWER7 technology, when really that was just their breakthrough in eDRAM density at the time.

With the XB1X tossing out the ESRAM, and Sony never having used anything like it this gen, specialized large on-chip memory pools are probably going to go the way of the dodo. Developers like the ease of big bandwidth + big memory, though I could see AMD gravitating toward larger L3 caches in their PC APUs to deal with bandwidth woes if 3D-stacked memory takes that long to arrive.

I would describe the PS4's bandwidth as adequate, not big, ha. It'd be better if it weren't shared with the CPU. ESRAM was a cost cut; eDRAM is better.

Can you imagine what kind of particles and shadows we could've had if the PS4 had eDRAM as well? And also good AF :p Seeing NieR's texture filtering on a base PS4 is just gross.

OR, 512-bit buses would be great. A man can dream.
 
Impressive yet damning. I got a Wii U late last year, and so far the only past-gen multiplatform game I've played on it was ACIV: Black Flag. Impressive that it runs, but the performance is, well... poop compared to the 360 and PS3, which often run 50% faster in some scenes (unlocked framerate!). And there are plenty of multiplat comparisons on YouTube.

The PPC750 situation, I'm sure, extended beyond backwards compatibility, because we're talking IBM here. All the other PPC cores I was referring to are Freescale, not IBM. But a die-shrunk PowerPC 970 would've been appropriate, assuming thermals and power usage scaled down with the process. If an off-die or on-GPU PPC750 could've been retained, a dual-core, or better yet quad-core, Freescale e600 would've been a much better choice than 3x PPC750s for a modern game console.

I think Ubisoft's support of the Wii U was a PR move more than anything. An unwillingness to play ball by other devs kept the Wii U from being a better seller than it could have been, but a lot of that was Nintendo's fault for building such a limited system.
 

Not sure they even needed to stick with PowerPC. After all, the PS2 was MIPS-based, and that worked as a side chip in the PS3.

Just about anything from the 2010+ era would've been better than the CPU the Wii U got. It did get a nice GPU for its power envelope, though, and plenty of RAM, so that was good. All in all, for a 35-watt system on 45/40nm it was nice, so long as a game wasn't CPU-heavy.
 
I think Ubisoft's support of the Wii U was a PR move more than anything. An unwillingness to play ball by other devs kept the Wii U from being a better seller than it could have been, but a lot of that was Nintendo's fault for building such a limited system.
ZombiU was the best Wii U title for me, and among the best of the 360/PS3/Wii U generation (I'd class them all as the same gen based on hardware capabilities). It was basically developed to the strengths of the hardware, and it had the only decent use of the GamePad in a third-party game, or really any game on that system. Can't fault Ubisoft for its Wii U support, tbh, with such a low userbase.
 
Yes, Liolio... only I don't see room for Zen... too big a CPU, and the power gets wasted in a unified RAM environment. The best we can get, IMHO, are improved Jaguars... if we got 16 of them @ 3 GHz, would you be unhappy? Me, no...
It is not too big. How big is the CPU complex in the XB1X or PS4 Pro? I would say not significantly bigger (in the same ballpark) than a CCX, which is only ~45mm². I've no time to do an estimate in Paint or something like that, but Zen clusters are not big, and per GHz and per watt I'm confident a 4C/8T CCX beats the 8 Jaguar cores in today's consoles. With respect to size, I suspect it is much closer than you think.
Zen cores can also be clocked higher without the TDP getting out of control (at 35-45 watts you get performance that KILLS the CPUs in the PS4/XB1, and that's for a 4C/8T).

EDIT
Counting transistors is not a proper metric; the density of the various parts of a chip varies significantly (the transistors in the XB1's on-chip SRAM, for instance, are packed pretty densely).
 
We'll see ;) ... MS, on its 7-billion-transistor One X APU, chose to spend the transistor budget on the bus & GPU... the results are fine. Even the decision not to adopt Rapid Packed Math was for the best (Rapid Packed Math, as we're discovering, costs silicon for questionable benefits)... Compare the Scorpio APU to the Ryzen 2400G, which is about 5 billion transistors... Has anybody seen Ryzen running with GDDR5 in a unified memory system with a GPU? The closest thing is this 2400G... with DDR4.
 
There was a bewildering number of PPC cores out there to make use of.
Not for embedded, low-power-ish use, though. The POWER7 cores were designed for very high power consumption and very high clocks. Each POWER7 die (sold in 4-, 6- and 8-core configurations) was a 567mm² monster with 1.2B transistors: up to 8 cores with 4 threads and 4 FP64 floating-point units each, designed to run at ~4GHz for a peak of a few hundred GFLOPS.
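
If you want to sanity-check that throughput claim, the usual peak-FLOPS arithmetic is simple enough; here's a minimal sketch, counting each FMA as two FLOPs (core/unit counts as above, and the flat 4 GHz clock is a round-number assumption):

Code:
# Peak FP64 throughput = cores x FP64 units per core x 2 FLOPs per FMA x clock (GHz)
def peak_gflops(cores, fp64_units_per_core, ghz, flops_per_fma=2):
    return cores * fp64_units_per_core * flops_per_fma * ghz

# Full 8-core POWER7 die at a round 4 GHz (figures from the post above)
print(peak_gflops(8, 4, 4.0))  # 256.0 GFLOPS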

I'm not saying it wouldn't be awesome to see one of these in a console, but Nintendo wasn't going to make consoles with a $1000 BoM.

They'd have been much better off keeping the damned ~750MHz PowerPC 750 for direct Wii compatibility, using it in Wii U games for handling the OS, and then letting developers use a quad-core Cortex-A9 at 1.8-2.0GHz with 2GB of GDDR5 on a 128-bit bus, plus the single-channel DDR3 for the low-priority RAM they had been implementing since the Wii, and calling it a day.
That and a Juniper or even an RV740 GPU would put it close enough to the 2013 consoles to get multiplatform titles.

But noooo....

The funniest thing is this could probably all fit inside a single ~180mm² chip made at IBM, since most of Latte's die area was the 32MB of eDRAM. They didn't even have to go with three ridiculously sized different chips on a substrate. The only additional cost here would be the 4x GDDR5 chips for the 128-bit width.
 
Not sure, but I think they needed the 32MB of eDRAM to sort of simulate the Wii's 24MB of 1T-SRAM.

Hollywood and its 3MB of eDRAM were cleverly combined into the Latte chip, though. Thing is, the GPU they chose was very good for a 20-watt chip: it's DX10-class and at least better than Xenos. 40nm was the best process available for a mass-produced box in 2012. They didn't want to make a hot-running, power-sapping box like the PS3 and 360 were. Remember, consoles didn't use to be these hulking PCs in a smaller box.

I would've liked to see more main memory bandwidth, though; a quad-channel DDR3 config like the Xbox One had would have been nice, since so many of Ninty's 60fps games had no AF. Hell, even Zelda doesn't, and that's 30fps.
 
Not sure, but I think they needed the 32MB of eDRAM to sort of simulate the Wii's 24MB of 1T-SRAM.

Naaaah. The 1T-SRAM had 2.6GB/s of bandwidth in the GameCube. Even if they did a 50% upclock for the Wii, like they did for everything else, we're looking at ~4GB/s tops.
And even the 3MB of eDRAM only had a theoretical maximum bandwidth of 20GB/s.

A single 128-bit GDDR5 bus using just four x32 memory chips would have given them enough bandwidth to cover it all.
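
Rough numbers as a sketch; the 4 Gbps effective per-pin rate is my assumption for early GDDR5, while the 1T-SRAM figure is the one above:

Code:
# Bus bandwidth (GB/s) = per-pin data rate (Gbps) x bus width (bits) / 8
def bus_bandwidth(gbps_per_pin, bus_width_bits):
    return gbps_per_pin * bus_width_bits / 8

print(bus_bandwidth(4.0, 128))  # 64.0 GB/s from a modest 128-bit GDDR5 bus
print(2.6 * 1.5)                # ~3.9 GB/s: GameCube 1T-SRAM with the 50% Wii upclock

Even at that conservative data rate, 64 GB/s dwarfs both the ~4 GB/s 1T-SRAM and the 20 GB/s eDRAM it would need to stand in for.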

One could say latency would be a big problem, but Dolphin runs pretty much all Gamecube and Wii games on low-end PCs quite easily (especially now with the multicore patch).


The GameCube had very clever hardware for its time: an up-to-date-ish OoO CPU; 1T-SRAM on internal 384/512-bit buses for higher bandwidth, with an external 64-bit bus for capacity; an ArtX GPU with TEV stages that apparently could do almost the same things as the first-gen programmable pixel shaders in the Xbox's NV2A; etc.
The Wii did not have clever hardware and was just a very low-effort way to sell a console built out of existing hardware (déjà vu?).
The Wii U, meanwhile, seems to have been specially designed to have pretty terrible performance-per-mm² across the various chips on that weird MCM.

I would've liked to see more main memory bandwidth, though; a quad-channel DDR3 config like the Xbox One had would have been nice.
DDR3 chips are 16-bit wide at maximum, so to get the same 256-bit width the console would always need 16 chips.
You're using the term "quad-channel" because standard PC motherboards use 64-bit modules, but each module usually carries a bunch of chips.
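
A quick sketch of that chip-count math, with the Wii U's real 64-bit DDR3-1600 setup thrown in for comparison:

Code:
# Commodity DDR3 chips top out at x16, so chips needed = bus width / 16
def ddr3_chips(bus_width_bits, chip_width_bits=16):
    return bus_width_bits // chip_width_bits

print(ddr3_chips(256))  # 16 chips for an XB1-style 256-bit bus
print(ddr3_chips(64))   # 4 chips: the Wii U's actual 64-bit bus
print(1.6 * 64 / 8)     # 12.8 GB/s: DDR3-1600 (1.6 GT/s) on that 64-bit bus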
 
Yes, latency would be the issue, not bandwidth. Nintendo wanted faithful native-hardware BC, so that kind of emulation was off the table. It's like how Sony doesn't enable Boost Mode by default on the PS4 Pro, just in case issues arise; a cautious approach to BC.

Ahh, now I see why the Wii U's DDR3 speed was limited. Then yes, I agree with your GDDR5 + DDR3 thoughts.
 
Not for embedded, low-power-ish use, though. The POWER7 cores were designed for very high power consumption and very high clocks. Each POWER7 die (sold in 4-, 6- and 8-core configurations) was a 567mm² monster with 1.2B transistors: up to 8 cores with 4 threads and 4 FP64 floating-point units each, designed to run at ~4GHz for a peak of a few hundred GFLOPS.

Oh God, I already knew any relevant server POWER core was out of the question. Those are server chips! A quad-core variant of the e600 from Freescale would've been great: basically 4x PPC 7448s. Hell, even just two of those cores in the 2.0 GHz region would've been a huge improvement over the 3x PPC750, since the e600, like the PPC7448, has true 4-wide AltiVec. A quad-core would've easily been competitive with Xenon in real workloads because of its short pipeline. I bet a PowerPC 970 with a 45nm die shrink could've been pulled off too. Sticking with PPC would've leveraged the PPC experience devs had built up on all the systems at that point (Wii, 360, PS3).

They'd have been much better off keeping the damned ~750MHz PowerPC 750 for direct Wii compatibility, using it in Wii U games for handling the OS, and then letting developers use a quad-core Cortex-A9 at 1.8-2.0GHz with 2GB of GDDR5 on a 128-bit bus, plus the single-channel DDR3 for the low-priority RAM they had been implementing since the Wii, and calling it a day.

A retained PPC750 could've been kept around as a secondary security processor for Wii U titles while handling BC for Wii and GameCube games; however, that would still leave the Wii/GC GPU to worry about retaining. Better yet, Nintendo could've shown some intelligence and leveraged the work of the Dolphin emulator team to build a comprehensive software emulator for previous systems' games. They should've offered to bring them in-house, paid them well, and in turn earned some good-boy points from the community instead of the usual demonization schemes they tend to enact. There would've been no need to retain any old hardware, and older games could have run at higher resolutions with AA and AF to look better than ever. Hell, it makes a case for using an AMD Llano APU instead of sticking with PPC; Nintendo would've been ahead of the curve as Sony and MS transitioned to x86.

That and a Juniper or even an RV740 GPU would put it close enough to the 2013 consoles to get multiplatform titles.

RV740 was exactly what we were all expecting when it was rumored the Wii U would have an R700-based GPU, in 2012 of all years. I guess the feature set relative to the transistor count and GFLOPS of R700 made sense to Nintendo, as DirectX capability was irrelevant, but only 352 GFLOPS of GPU compute was pathetic. You're right that more capability would've kept the system relevant, if not somewhat competitive graphically, but it would've also forced Sony and MS to produce more powerful systems. I've tried to run some newer games on a Radeon 4670 (the same basic config as the Wii U GPU) for poops 'n' giggles, but by 2012, game and driver support had moved beyond the R700s. Let's just say ACIV: Black Flag barely ran, even on the lowest settings.


The funniest thing is this could probably all fit inside a single ~180mm² chip made at IBM, since most of Latte's die area was the 32MB of eDRAM. They didn't even have to go with three ridiculously sized different chips on a substrate. The only additional cost here would be the 4x GDDR5 chips for the 128-bit width.

I blame Nintendo's unwillingness to break hardware backwards compatibility, and perhaps the vendors played a role in whether they were willing to combine their IP on one die. AFAIK, Microsoft owns the chip designs in the 360, hence why they could create an "APU" for the later-model 360s. It might not have been the same situation for the Wii U.

Idealized Wii U Specs:

Freescale e600 Quad-core @ 2.0 GHz
AMD RV740 GPU, ported to 28nm, 640:32:16 @ 750+ MHz (960+ GFLOPS)
4 GB GDDR5 on 128 bit bus

No retained old hardware :cool:, and at least for a year, Nintendo would've had the best versions of multiplatform games, running at 1080p compared to 720p on the 360 and PS3.
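
Back-of-the-envelope check on that spec, as a sketch (the 5.5 Gbps GDDR5 data rate is just my assumption for 2012-era chips):

Code:
# Shader peak: SPs x 2 FLOPs per cycle (multiply-add) x clock (GHz)
print(640 * 2 * 0.75)  # 960.0 GFLOPS for the 640:32:16 config @ 750 MHz
# Memory: per-pin rate (Gbps) x bus width (bits) / 8
print(5.5 * 128 / 8)   # 88.0 GB/s on the 128-bit GDDR5 bus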
 
RV740 was exactly what we were all expecting when it was rumored the Wii U would have an R700-based GPU, in 2012 of all years. I guess the feature set relative to the transistor count and GFLOPS of R700 made sense to Nintendo, as DirectX capability was irrelevant, but only 352 GFLOPS of GPU compute was pathetic. You're right that more capability would've kept the system relevant, if not somewhat competitive graphically, but it would've also forced Sony and MS to produce more powerful systems. I've tried to run some newer games on a Radeon 4670 (the same basic config as the Wii U GPU) for poops 'n' giggles, but by 2012, game and driver support had moved beyond the R700s. Let's just say ACIV: Black Flag barely ran, even on the lowest settings.
Try 176 GFLOPS. Yes, Nintendo lowballed even the lowest expectations, but I think there may have been a reason for that: it was probably where they saw handheld technology being in the next 4-5 years, and hence easy ports from Wii U -> Switch. It turns out the Switch superseded that, but it's roughly within the ballpark.
 
You guys act like Dolphin is some magic bullet. Dolphin 5.0 with multicore and everything wasn't a thing in 2012, let alone during the Wii U's R&D stage. Games were riddled with issues, and even my old Phenom II X4, which far outclasses anything Nintendo would've used, wasn't enough to emulate properly. No matter who was working on the emulation, it would have left a lot to be desired given the time period and cost limits.
 
Try 176 GFLOPS. Yes, Nintendo lowballed even the lowest expectations, but I think there may have been a reason for that: it was probably where they saw handheld technology being in the next 4-5 years, and hence easy ports from Wii U -> Switch. It turns out the Switch superseded that, but it's roughly within the ballpark.

The general consensus on the Wii U GPU is a 320:16:8 config @ 550 MHz = 352 GFLOPS. Testing in graphics-bound scenarios in many games has shown it has a bit of a leg up on the 360.

https://www.eurogamer.net/articles/df-hardware-wii-u-graphics-power-finally-revealed
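
For what it's worth, both this figure and the 176 GFLOPS claim above come out of the same peak formula; the only real disagreement is the SP count. A minimal sketch:

Code:
# Peak shader throughput: SPs x 2 FLOPs per cycle (multiply-add) x clock (MHz)
def shader_gflops(sps, mhz):
    return sps * 2 * mhz / 1000

print(shader_gflops(320, 550))  # 352.0 -- the figure in the DF article above
print(shader_gflops(160, 550))  # 176.0 -- the competing B3D-consensus config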
 
The general consensus on the Wii U GPU is a 320:16:8 config @ 550 MHz = 352 GFLOPS. Testing in graphics-bound scenarios in many games has shown it has a bit of a leg up on the 360.

https://www.eurogamer.net/articles/df-hardware-wii-u-graphics-power-finally-revealed

Nope. That's wrong. Completely.

The general consensus on B3D is 160:16:8 @ 550MHz.

Yep. Correct.

Based on?

Count the registers on the chip. Read the completely and utterly unambiguous statements in the developer docs.

Pisses me off that Wikipedia allows such shit to stain their pages.
 