Nintendo Switch Tech Speculation discussion

Playing devil's advocate a bit, but I don't agree with you that downclocks would erode the point of using the TX2, although this completely sidesteps the question of whether or not the extra price would (pretty much unknown to us). Switch is very heavily power limited in handheld mode (and decently so in docked mode). Just a straight decrease in power consumption in handheld mode would have been a pretty big deal; a ~3 hour battery life is pretty bad. On the flip side, the 128-bit memory interface (downclocked by about half in handheld mode to come closer to 64-bit power consumption) would have helped keep performance from getting totally hamstrung in docked mode.

I meant erode rather than eliminate: the lower you have to clock to fit the power envelope, the closer you get to TX1 performance - compared to the 7.5W TX2 performance touted earlier - and the more you're paying per unit of performance (however you measure that). After all, lower clocks save power but don't particularly affect the cost of the chip (unless you're after the higher-clocking bins or have very tight voltage tolerances).
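To put rough numbers on that: dynamic power scales roughly with f·V², and voltage can come down with clocks, so a wide-and-slow chip wins on perf/W but loses on perf/$. A toy sketch with entirely made-up numbers (not measured TX1/TX2 figures):

```python
# Toy DVFS model: dynamic power ~ f * V^2, performance ~ units * f.
# All numbers are illustrative guesses, not measured TX1/TX2 figures.

def rel_power(f_ghz, v_volts):
    """Relative dynamic power at a given clock and voltage."""
    return f_ghz * v_volts ** 2

# Hypothetical narrow chip (TX1-like) vs wide chip (TX2-like): the wide
# chip has 1.5x the units (and cost) but clocks lower at lower voltage.
chips = {
    "narrow": {"units": 1.0, "f": 1.00, "v": 1.00, "cost": 1.0},
    "wide":   {"units": 1.5, "f": 0.67, "v": 0.85, "cost": 1.5},
}

for name, c in chips.items():
    perf = c["units"] * c["f"]
    power = c["units"] * rel_power(c["f"], c["v"])
    print(f"{name}: perf={perf:.2f} power={power:.2f} "
          f"perf/W={perf / power:.2f} perf/$={perf / c['cost']:.2f}")
```

The wide chip matches throughput at roughly three quarters of the power, but every unit of performance costs 1.5x the silicon, and the further you have to downclock, the worse that ratio gets.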

As I understand it, there's also a point at the very power-focused end where going wider and slower becomes counterproductive, although I have no idea whether Switch would have been there with TX2. I just think it's easy to look at the TX2 7.5W figures and say "that should have been in Switch" when it may not have helped the mobile configuration much, and would have raised the BOM significantly.

A 16nm TX1, on the other hand, would have reduced power, increased frequency and given Nintendo more flexibility. Though the cost of custom 16nm chips can easily run into the hundreds of millions of dollars, from what I've read. That's quite a gamble.

TX1 would have benefited from more BW in docked mode of course, but rather than a double-width bus that might impact mobile-mode power draw, perhaps using faster memory that could go above 3200 when docked would have been a good idea. There are already upper-mid-range phones using above 3200, and Samsung have supposedly had 4266 chips since 2015...
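Peak LPDDR4 bandwidth is just bus width times transfer rate, so the options being kicked around work out like this (quick sketch; the configurations are illustrative, not confirmed Nintendo options):

```python
# Peak LPDDR4 bandwidth: (bus width in bytes) x (transfer rate).
def bandwidth_gbs(bus_bits, mt_per_s):
    return bus_bits / 8 * mt_per_s / 1000  # GB/s

configs = [
    ("64-bit @ 3200 MT/s (Switch-like)",       64, 3200),
    ("64-bit @ 4266 MT/s (faster chips)",      64, 4266),
    ("128-bit @ 3200 MT/s (TX2-like, docked)", 128, 3200),
    ("128-bit @ 1600 MT/s (halved, handheld)", 128, 1600),
]
for name, bits, rate in configs:
    print(f"{name}: {bandwidth_gbs(bits, rate):.1f} GB/s")
# 25.6, 34.1, 51.2 and 25.6 GB/s respectively
```

Faster 64-bit memory when docked would buy a ~33% boost without touching handheld power draw, whereas the 128-bit bus only matches 64-bit power by giving up its bandwidth advantage.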

The interesting question is exactly what Nintendo would have done with the two Denver cores, and whether developers would find such a wildly heterogeneous setup desirable. I guess it really depends on their efficiency. If the two Denvers tend to supply better performance than three A57s at less power, which is at least within the realm of possibility, then they'd probably be a win. Then the OS functions could stay on the A57 cluster, using as low a clock speed and as few powered cores as they can get away with.
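That kind of cluster partitioning is, on a Linux-style kernel at least, just CPU affinity. A minimal sketch, assuming a purely hypothetical core numbering (cores 0-3 as the A57 cluster, 4-5 as Denver; this is not how Nintendo's OS is actually known to work):

```python
# Sketch: partition OS work and game work across CPU clusters via
# affinity masks (Linux-only API). Core numbering is hypothetical.
import os

A57_CORES    = {0, 1, 2, 3}  # little cluster: OS and background tasks
DENVER_CORES = {4, 5}        # big cluster: the game's heavy threads

def pin_current_process(cores):
    """Restrict the calling process to the given set of cores."""
    os.sched_setaffinity(0, cores)  # pid 0 = current process

# An OS service would pin itself to the A57s:
#   pin_current_process(A57_CORES)
# ...while the game's render/sim threads take the Denvers:
#   pin_current_process(DENVER_CORES)
```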

In the embedded development kit the Denvers are disabled at 7.5W in preference for the A57s. I'm presuming (though I don't know) that perf/W is better for the A57s at the low end, but that the absolute performance of the Denvers is better when less power constrained.

Given that Nintendo have the same CPU configuration for both mobile and docked modes, would it even have been possible for Nintendo to have used the Denver cores? If so, why have Nvidia disabled them in the 7.5W configuration?

I really would like to see where TX2 ends up. It looks like the perfect chip for a 20W console. But... that's not currently a market niche.
 
How do you figure this? What are you comparing with?

It's hard to really estimate, because we don't have GPU-only power consumption numbers for any console that I'm aware of, or any methodology that tries to determine them based on varying workloads. But in terms of overall system perf/W, Wii U is surely behind both the original XB1 and PS4, and way behind XB1S and PS4 Pro. Or maybe you're defining efficiency as simply the least power consumption regardless of performance?

Sorry, I didn't mean power consumption vs. the 8th gen; I meant it was the best in capabilities of the 7th gen consoles. It was a DX10-class chip vs. the older Xenos GPU. Wii U had a halfway decent GPU; it was just let down by its CPU and, to a lesser extent, bandwidth (though the 360 and PS3 were bandwidth limited as well).
 

It was a 176 GFLOPS GPU; I wouldn't call that halfway decent. It was about the same class as the 360's, and it came out 7 years later. I don't know how people still say this. Wii U got plenty of ports, and hardly any showed an advantage in resolution; in fact you have more ports that run at a lower resolution than on 360. Wii U's bandwidth was pretty good for its capabilities with 32MB of eDRAM; it wasn't a problem.
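For context, those headline GFLOPS figures are just ALUs x clock x 2 ops (multiply-add) per cycle, using the commonly cited specs (Latte's ALU count is rumoured rather than confirmed):

```python
# GFLOPS = ALUs x clock (GHz) x 2 ops per ALU per cycle (MADD).
def gflops(alus, clock_ghz):
    return alus * clock_ghz * 2

print(f"Wii U 'Latte' (rumoured 160 ALUs @ 550 MHz): {gflops(160, 0.550):.0f}")  # 176
print(f"Xbox 360 Xenos (240 ALUs @ 500 MHz):         {gflops(240, 0.500):.0f}")  # 240
```

On raw throughput alone Xenos actually comes out ahead; any Latte advantage would have to come from its newer, more flexible architecture rather than flops.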
 
TX1 would have benefited from more BW in docked mode of course, but rather than a double-width bus that might impact mobile-mode power draw, perhaps using faster memory that could go above 3200 when docked would have been a good idea. There are already upper-mid-range phones using above 3200, and Samsung have supposedly had 4266 chips since 2015...
TX1 memory controller might not be capable of supporting the higher frequency, I imagine?
 
It was a 176 GFLOPS GPU; I wouldn't call that halfway decent. It was about the same class as the 360's, and it came out 7 years later. I don't know how people still say this. Wii U got plenty of ports, and hardly any showed an advantage in resolution; in fact you have more ports that run at a lower resolution than on 360. Wii U's bandwidth was pretty good for its capabilities with 32MB of eDRAM; it wasn't a problem.
It's a more modern part than the 360's Xenos.

The problem with eDRAM is that it takes more effort to use than a unified pool and is easy to underutilize.
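A quick back-of-envelope on the eDRAM point, assuming a simple colour + depth render target at 4 bytes per pixel each, and ignoring padding/tiling overhead:

```python
# Rough render-target footprint: width x height x bytes-per-pixel x
# MSAA samples, counted for both colour and depth buffers.
MB = 1024 * 1024

def rt_size_mb(w, h, bpp=4, msaa=1, buffers=2):
    return w * h * bpp * msaa * buffers / MB

print(f"720p, no AA:   {rt_size_mb(1280, 720):.1f} MB")          # ~7.0 MB
print(f"720p, 4x MSAA: {rt_size_mb(1280, 720, msaa=4):.1f} MB")  # ~28.1 MB
print(f"1080p, no AA:  {rt_size_mb(1920, 1080):.1f} MB")         # ~15.8 MB
```

So Wii U's 32MB holds a 720p target with headroom to spare, where the 360's 10MB forced tiling as soon as you added AA at 720p. But anything that doesn't fit spills to the slow main memory, which is where the "easy to underutilize" part bites.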

As for the lower frame rates, I'd blame the decade-old CPU. It's clearly a bottleneck when you look at Ubisoft's open world games on the platform; 4A Games called the CPU horrible :p Not that Nintendo needed a beefy CPU for their games, except Zelda and Xenoblade.

PS3's and 360's GPUs both had half the bus width of their PC counterparts, and the Wii U is comparable in bandwidth.

GC and Wii, for example, had enough bandwidth, and PS2 had an abundance.

Most of the major ports on Wii U were rushed. Criterion ported Need for Speed and got the PC textures working on it, with vsync at all times and better night lighting, and it runs a bit smoother.

When I say halfway decent, I mean for 2012. Remember, Sony and MS weren't ready for 28nm until the following year.

 

Really, people still use the rushed port excuse? We're talking about a whole generation here. Wii U had tons of ports, and most were inferior, and the ones that were better didn't really give me the impression the GPU was clearly more powerful. For example the Criterion port: a very slight frame rate advantage in crashes, the same sub-HD resolution just like 360/PS3, and revamped lighting at night, which was a design choice; they made everything darker. I'm sorry, but even in 2012, not being clearly superior to consoles released in 2005 is not halfway decent. Not even one game on Wii U had better AA and a higher resolution over the whole generation compared to 360. The difference when Wii U ports were better is much smaller than in many 360 vs PS3 ports.
 
It's worth remembering that ports are a dubious method for comparing performance across architectures generally. It just happens to be the only way accessible to non-developers.
But any performance-critical code obviously needs to take the underlying architecture into account. Both the XB360's and particularly the PS3's CPUs were SIMD monsters if you could fit the code to them. Such code couldn't have been much fun to get running on the WiiU. On the other hand, the WiiU had a more flexible GPU and a lovely big pool of eDRAM, accessible to CPU and GPU.
A game built from the ground up targeting the PS3 would be quite different from a game targeting the WiiU. In that respect, the current crop are much more similar, but there are still differences that are significant. And WiiU to Switch is actually a bigger step architecturally than, say, PS4 to Switch. Titles like ARMS or the new Xenoblade will probably show more of the Switch's abilities than the WiiU ports. Even a game like Fast RMX, which Digital Foundry reviewed and were very impressed by, is still anchored in code targeting the WiiU.
 

It's not perfect, but it does give an idea of how much more powerful the hardware is, especially when we're talking about a whole generation. Even though we see many ports where the CPU shows its advantage against the Wii U, we never really see a tangible advantage for the Wii U GPU in ports; no developer has added more AA or a higher resolution for sub-HD games. In fact we have a few games where Wii U targets a lower resolution. Even when games were lead on PS3, 360 ports were sometimes better or on par.
 
The WiiU was happy if it got ports at all. Ports that performed above the targets set for the lead platform were probably not really in the cards. It's hard to see the publishers footing the bill for that, unless it was low-hanging fruit.
Need for Speed, Bayonetta and so on did well. I don't really see how that affects the Switch either way. It has an architecture and memory space that should enable cut-down ports to be made, if the projected sales volume is there to motivate it. Some games like FIFA seem like a good fit for the portable demographic. The major publishers don't seem to be on board the train from the start though, which I think is a mistake. Maybe they think that iOS and Android can provide them with mobile platforms for their IP. But if they aren't supporting the Switch from the get-go, then they will find a less receptive audience when their products reach the platform later, and their products will thus be less desirable.

Judging by the kids and youth around me, TV as a concept is quickly being deprecated. My kids absolutely will not be tied to a broadcasting schedule, and even prefer watching films on tablets or phones somewhere cozy or in bed over sitting in front of a TV screen. The same trend is clear among adults as well. That's not to say that the TV-attached gaming console will go away tomorrow, but I really doubt that it's a growth market long term.
 
If that were true, sales of TVs should be dropping. I think what you're seeing is more an interest in privacy and reduction of shared viewing. Offer those same kids the chance to have a 50" TV in their bedroom and see if they'd rather watch on their iThing.
 
The WiiU was happy if it got ports at all. Ports that performed above the targets set for the lead platform were probably not really in the cards. It's hard to see the publishers footing the bill for that, unless it was low-hanging fruit.
Need for Speed, Bayonetta and so on did well.

I don't think it's about footing the bill; it's just that the GPU wasn't capable enough, and like I said before, even the good ports showed a minuscule advantage in GPU power. Then you have games like Tekken Tag, Sonic Racing, Skylanders, and Disney Infinity, which run at the lowest resolution or have their graphics downgraded to run on Wii U. The Wii U GPU being better than the 360's is not a fact, and I wish people would stop saying it like it's a fact when there is no real proof to back it.
 
TX1 memory controller might not be capable of supporting the higher frequency, I imagine?

Could be, I suppose.

They could always hire a 1337 overclocker to overvolt the memory controller and pump LN2 round from the dock. Problem solved!
 
If that were true, sales of TVs should be dropping. I think what you're seeing is more an interest in privacy and reduction of shared viewing. Offer those same kids the chance to have a 50" TV in their bedroom and see if they'd rather watch on their iThing.
Quite possible, in general, although they prefer mobile screens even for shared viewing. (!) Anecdotal evidence isn't worth much under any circumstances, but on the other hand it would be strange if one's personal experiences didn't shape beliefs in any way. And my personal experience of the people around me is that they simply place a large value on deciding for themselves how, when and where they consume entertainment. Even severely adult people of 80+.

That said, my perception is also that there is a generational component. If you have grown up around smartphones, you basically take some things for granted. Mobile devices don't offer mobility, stationary devices remove it.
Can't see this trend reversing going forward - what on earth would drive such a change in direction?
 
I know the battery life has been stated to be quite low, but having only updated the firmware and played 45 minutes of Zelda, the battery is at 60%. A little unfortunate, as I doubt I'll ever use it docked. Judging by the much longer battery life being reported for indie games, there's probably wide variation in the X1's power draw.

I haven't heard the fan yet and the air coming from the vent is quite lukewarm, so the cooling seems to be well and truly under control undocked.
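The arithmetic is consistent with that spread. The battery is 4310mAh at a nominal 3.7V, about 16Wh, and battery life is just capacity over draw (the per-load draws below are guesses, not measurements):

```python
# Battery life = capacity (Wh) / average draw (W).
capacity_wh = 4.310 * 3.7  # 4310 mAh at nominal 3.7 V ~= 15.9 Wh

for load, draw_w in [("light indie load", 4.0),
                     ("moderate 3D load", 5.5),
                     ("heavy load (Zelda-ish)", 8.0)]:
    print(f"{load} (~{draw_w} W): {capacity_wh / draw_w:.1f} h")
# ~4.0 h, ~2.9 h and ~2.0 h respectively
```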
 

I think battery life will probably get a little better for games actually made for the system. With Breath of the Wild, it's basically a port with a LOT of stuff going on. I'm curious as to what Xenoblade 2 or ARMS's battery life will be like. Then again, Nintendo could still be using old engines for those games too.
 
I don't think it's about footing the bill; it's just that the GPU wasn't capable enough, and like I said before, even the good ports showed a minuscule advantage in GPU power. Then you have games like Tekken Tag, Sonic Racing, Skylanders, and Disney Infinity, which run at the lowest resolution or have their graphics downgraded to run on Wii U. The Wii U GPU being better than the 360's is not a fact, and I wish people would stop saying it like it's a fact when there is no real proof to back it.

That was my point: "better" at what, exactly? Unless the WiiU was a superset in all respects, including all aspects of performance, there is no way to guarantee that ported code would run faster on the newer system. And that obviously wasn't the case. CPU and main memory bandwidth alone present hurdles necessitating recoding for ports, before we even start considering the GPU at all.

Again, this is unrelated to the Switch, apart from the caution against using ports for performance comparisons. Using Geekbench at the time to compare the PS3 CPU vs. contemporaries, for instance, you could see even normal subtests (not encryption or such) vary by more than a factor of five in relative performance. Performance comparisons between architectures are simply difficult. If all we have available for doing them is a really crappy tool on top of the inherent problems, then maybe we just shouldn't do them.
 
It's not perfect, but it does give an idea of how much more powerful the hardware is, especially when we're talking about a whole generation. Even though we see many ports where the CPU shows its advantage against the Wii U, we never really see a tangible advantage for the Wii U GPU in ports; no developer has added more AA or a higher resolution for sub-HD games. In fact we have a few games where Wii U targets a lower resolution. Even when games were lead on PS3, 360 ports were sometimes better or on par.

True, but the fact remains that Wii U was never the target platform for these multiplat titles, and developers spent years tailoring their game engines to take advantage of the 360 and PS3. I remember in the Secret Developer article at Eurogamer, the developer made mention of how it's common knowledge that the Cell processor was used for graphics processing on the PS3, but this was done with the 360 CPU as well. The SIMD capabilities of these consoles were maximized, and that's something that was never going to port to the paired-singles CPU the Wii U rocked. From what I can gather, whatever modest advantages the Wii U GPU had were mitigated by the CPU performance. More or less, Nintendo crafted hardware that was on par with the 360 and PS3, but had some fundamental design characteristics with advantages and disadvantages compared with the other consoles.

Some links:
Switch vs WiiU: http://www.eurogamer.net/articles/digitalfoundry-2017-fast-rmx-showcases-switches-power-over-wii-u
Switch vs PS4 vs Vita: http://www.eurogamer.net/articles/digitalfoundry-2017-dragon-quest-heroes-2-switch-vs-ps4-comparison
Switch power consumption: http://www.anandtech.com/show/11181/a-look-at-nintendo-switch-power-consumption
Xbox One S power consumption:

Switch (GPU performance) seems to be slightly ahead of last-gen consoles (Xbox 360, PS3, WiiU) when handheld. When docked it is roughly 2x last gen. Last-gen games were mostly 720p, meaning that the handheld image quality of Switch (720p screen) should also be slightly ahead. Docked IQ is the same, but rendered at 900p or 1080p. GPU performance doubles, but memory bandwidth only gets a minor boost when docked. This is a bit similar to PS4 -> PS4 Pro.

We need more cross-platform games to draw a final conclusion about Switch vs Xbox One. 1/3 seems to be a pretty good estimate for docked mode. Switch has 4 GB of memory. This is much more than Xbox 360 or PS3 (both had 512 MB). WiiU had 2 GB, but the OS used half of it (1 GB usable). Switch's Nvidia Maxwell GPU also has all the same features as current-gen consoles and DX12.1 PCs. Switch has a modern OoO CPU with cache prefetchers. Down-porting games to Switch should be significantly easier than current gen -> last gen.
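For reference, the 1/3 estimate falls straight out of FLOPS arithmetic, using the leaked (not officially confirmed) Switch clocks and the known Xbox One specs:

```python
# GFLOPS = shader cores x clock (GHz) x 2 ops (FMA) per cycle.
def gflops(cores, clock_ghz):
    return cores * clock_ghz * 2

switch_docked   = gflops(256, 0.7680)  # ~393 GFLOPS FP32
switch_handheld = gflops(256, 0.3072)  # ~157 GFLOPS FP32
xbox_one        = gflops(768, 0.853)   # ~1310 GFLOPS FP32

print(f"docked:   {switch_docked:.0f} GFLOPS = "
      f"{switch_docked / xbox_one:.2f}x Xbox One")
print(f"handheld: {switch_handheld:.0f} GFLOPS")
print(f"FP16 docked (2x rate): {2 * switch_docked:.0f} GFLOPS")
```

Maxwell in TX1 runs FP16 at twice the FP32 rate, which is where the 786 GFLOPS half-precision figure mentioned below comes from.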

Switch power consumption when docked (11W) is roughly a fifth of the Xbox One S (58W). Both chips are 20nm. Handheld power efficiency is harder to compare against home consoles, because handheld power consumption includes the screen. Handheld battery life beats tablets in gaming: battery life in Modern Combat 5 or Asphalt 8 is only 1.5h - 2h on high-end tablets. Rendering resolution differs, however, meaning that a direct efficiency comparison can't be made.

Doesn't Xbox One S use 16nm FinFET? I believe it does, and that's quite a bit more power efficient than 20nm. So even with the more efficient node, the Xbox One S pulls nearly 6x the amount of juice the Switch is pulling, at least when not charging the battery.

One third the power of the Xbox One seems reasonable, and half as powerful seems possible depending on just how useful the FP16 capabilities turn out to be. Docked offers 786 GFLOPS half precision, although real-world performance may hit the memory bandwidth ceiling prior to reaching maximum shader performance. Regardless, this is pretty darn good for a product in this form factor pulling 1/6th the power. I'm sure a lot of people are going to see this as a monumental chasm, but I'm not so sure. Look how much parity there is with multi-platform games on the Xbox One and PS4, and the PS4 has a 600 GFLOPS advantage over the Xbox One, similar to the 800 GFLOPS advantage the Xbox One has over the Switch docked.

Both Steep and Skyrim come out in the fall, and I wouldn't be shocked to see Call of Duty make an appearance. The chasm in performance will certainly make porting to Switch less than straightforward, but the business proposition is more likely to determine what support looks like. Even if Switch sells very well, if Activision releases COD on Switch and it doesn't sell a million copies, they will likely decline to continue such efforts on the platform. Same with Steep from Ubisoft; while I don't see the million-sold milestone being the benchmark for that particular title, if it struggles to sell even a few hundred thousand units, Ubisoft will likely streamline their offering down to titles like Just Dance.
 