Nintendo Switch Tech Speculation discussion

Microsoft and Google have also been chasing this concept for ages and neither of them has succeeded. Microsoft does seem to have an edge here, though (with their Windows 10 Continuum and x86-on-ARM efforts).

Although in this case, Nintendo has a distinct advantage: it is focused as a video game device.
Microsoft and Google have their own set of issues, and those revolve around software and input methods. Nintendo is in a better spot, though we have yet to see what happens when a game requires touch input and is played on a TV (I mean, it is awesome how Nintendo makes its own life so complicated...).

What I'm alluding to is TDP: only laptops have the TDP headroom to offer a decent home console experience. Laptops are bigger and heavier than handhelds, often with tricky cooling solutions, etc.
 
The Tegras are just too power hungry at reference clocks for mobile, but they make for great PR at Nvidia's events. I wonder if going the Vita route with reference ARM and PowerVR would have got them more perf/W.

Vita successor vs. Switch would have been something to see...if only Vita wasn't a market failure.

Tegra uses reference ARM the same as Vita did. Maybe you mean a better core like A72 or A73. But I remind you that the Vita route resulted in Cortex-A9 cores that were clocked at a scant 444MHz, when contemporary phones used Cortex-A9s clocked at > 1GHz.

I haven't yet seen evidence that PowerVR from a comparable generation has significantly better power efficiency than Maxwell in the X1 implementation. People focus heavily on power consumption at max clocks, but that doesn't tell the whole story; you really have to look at the entire perf/W vs. perf curve, and rarely does anyone do this.
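To illustrate what I mean by the curve, here's a minimal sketch with made-up DVFS points, modelling dynamic power as C·f·V² plus a fixed static term; none of these numbers correspond to a real chip, it's just to show the shape of the thing:

```python
# Back-of-envelope perf/W vs. perf curve from a hypothetical DVFS table.
# The (MHz, volts) pairs and constants are invented for illustration only,
# not operating points of any real Maxwell or PowerVR part.

dvfs_points = [(230, 0.70), (460, 0.80), (690, 0.90), (850, 1.00), (1000, 1.10)]

C = 0.004         # arbitrary "capacitance" constant, W / (MHz * V^2)
P_STATIC = 0.3    # arbitrary leakage/static power, W

for mhz, volts in dvfs_points:
    perf = mhz                                # perf assumed proportional to clock
    power = C * mhz * volts**2 + P_STATIC     # dynamic + static power, W
    print(f"{mhz:5d} MHz  {power:5.2f} W  perf/W = {perf / power:6.1f}")
```

Run it and the perf/W peak lands in the middle of the table, nowhere near the max clock, which is exactly why a single max-clock power figure tells you so little.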
 
I think they got a deal from Nvidia, plus a special pitch from Nvidia's awesome salespeople.
I think that the Tegra X1 falls a little short as a home console and is over-specced for a handheld, BUT it can be used for both, comes with great tools and software, Nvidia's reputation, etc.

On the other hand it is free money for Nvidia: the SoC is already available, they just have to place orders... no serious work (salespeople aside) involved.



No, it comes from a poor concept: no hardware the size of a handheld can double as both a mobile device and a stationary device. Laptops can, but laptops are bigger, and then there is the matter of cost.

Nvidia's public reputation and their B2B reputation are totally disparate. Remember their dealings with Sony and Microsoft.
 
GCN2 geometry throughput is pretty low compared to Nvidia Maxwell/Pascal; clocking it low shouldn't be a problem in this sense. Also, Xbox One has only 16 ROPs, and Maxwell's tiled rasterizer is pretty efficient.

The perf problems are most likely seen in pixel and especially in compute shaders. GCN is pretty good at compute, and async compute is widely used in console games to bring GCN occupancy higher than on PC.

http://www.realworldtech.com/tile-based-rasterization-nvidia-gpus/

This is a pretty good article/video on the subject.
 
Tegra uses reference ARM the same as Vita did. Maybe you mean a better core like A72 or A73. But I remind you that the Vita route resulted in Cortex-A9 cores that were clocked at a scant 444MHz, when contemporary phones used Cortex-A9s clocked at > 1GHz.
The Vita uses 4x 444MHz Cortex-A9, whereas contemporary flagship phones from late 2011 / early 2012 used 2x ~1GHz Cortex-A9. So by a phone-to-console comparison, on the CPU side the Vita was well ahead of what the Switch seems to be if it's 4x A57 @ 1GHz: the Vita had a higher core count, the Switch has a lower one.

If we go by GPU comparison, the Vita had a PowerVR SGX543MP4, practically the same as the top-end iPad of the time with an A5X, at an estimated 32 GFLOPs. The current top-end 9.7" iPad Pro AFAIK does close to 500 GFLOPs FP32 and 1 TFLOPs FP16.
And then the Vita had the exquisite 128MB of dedicated VRAM, with probably large bandwidth thanks to TSV.
Screen resolution and quality were also above most phones at the time, offering a 960*540 RGB OLED while Android flagships had 800*480 screens.
And the console was sold at $250/250€ without 3G, AFAIR at a profit.


Just to say the Vita compared much more favorably to 2011 top-end phones/tablets than the Switch compares to 2016 top-end phones/tablets (and the console isn't even releasing in 2016). And Sony didn't try to sell the Vita as a hybrid console.

Oh, and the Vita didn't burn little kids' hands. Neither does it need air vents and fans. Food for thought.





All this - again - assuming the Switch has 2 SM clocked at 300-760MHz, which is pure speculation at the moment.
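For reference, the raw throughput implied by that speculation is easy to work out: a Maxwell SM has 128 FP32 lanes doing one FMA (2 FLOPs) per clock, and TX1-class Maxwell can additionally pack two FP16 ops per lane. A quick sketch (the 2 SM count and the 307.2/768MHz clocks are the rumoured figures, not confirmed specs):

```python
# Rough FP32/FP16 throughput for the rumoured 2-SM Maxwell GPU in the Switch.
# The SM count and clocks are the leaked/speculated numbers, not confirmed.

SMS = 2
CORES_PER_SM = 128       # Maxwell SM width
FLOPS_PER_CORE = 2       # one FMA per clock = 2 FLOPs

for label, mhz in [("portable", 307.2), ("docked", 768.0)]:
    fp32 = SMS * CORES_PER_SM * FLOPS_PER_CORE * mhz / 1000   # GFLOPS
    fp16 = fp32 * 2        # TX1 Maxwell supports double-rate packed FP16
    print(f"{label:8s}: {fp32:6.1f} GFLOPS FP32, {fp16:6.1f} GFLOPS FP16")
```

That works out to roughly 157/393 GFLOPS FP32 (portable/docked), so portable mode would indeed sit well below the ~500 GFLOPS FP32 quoted above for the 9.7" iPad Pro.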



I haven't yet seen evidence that PowerVR from a comparable generation has significantly better power efficiency than Maxwell in the X1 implementation. People focus heavily on power consumption at max clocks, but that doesn't tell the whole story; you really have to look at the entire perf/W vs. perf curve, and rarely does anyone do this.

I believe Maxwell Tegra would probably be competitive with PowerVR 7XT, if there were SoCs with 7XT built on 20nm. There aren't any, though. So the best you can do is compare to 6XT, but no 6XT SoC is trying to reach TX1's absolute performance in 3D.
 
@ToTTenTranz
I 100% agree that I was surprised by the clock speeds, but we have seen this show before with Nintendo. What we have also seen is that we tend to look at specs at a very superficial level. Remember when the bandwidth to the main memory on the Wii U was figured out? People immediately assumed it must be starved for bandwidth, 12.8GB/s just can't be enough... Developers reported that there was no such issue with the memory; both Shin'en and Criterion said that memory performance was solid.

If we rewind just a couple of weeks, we have FromSoftware talking about having Dark Souls 3 running with good performance on Switch, and Bethesda calling the Switch the best demo they have ever seen and confirming that Skyrim will be on Switch. A week ago these developer comments painted performance on Switch in a favorable light; we get some clock speeds and all of a sudden Dark Souls 3 doesn't run on Switch? Someone should inform FromSoftware that Dark Souls doesn't run on Switch, despite what they see in their office. LOL

Seriously though, either this custom Tegra chip gets far better performance than one would expect from the numbers, or these modern engines scale down a lot more easily than people would expect. Right now we basically have people assuming that the specs mean no western third-party support. What happens if on January 13 during the reveal Call of Duty and Assassin's Creed (insert any western third-party game, really) are both demoed running just fine on the Switch? People are saying it won't happen, but what if it does? Could be interesting.
 
Oh, and the Vita didn't burn little kids' hands. Neither does it need air vents and fans. Food for thought.
4 hours gaming from 8Wh battery. 2 watts total.
Screen is around 1 watt (5 inch oled from 2011 tech).
So there's no way the main SoC and memory consumes much more than 1 watt while gaming.

It's almost as if Sony didn't try to cheat the laws of physics :confused:
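Spelled out, with the same 8Wh / 4h / ~1W-screen figures as above:

```python
# Back-of-envelope Vita power budget, using the figures quoted above:
# 8 Wh battery, ~4 h of gaming, ~1 W for the 2011-era 5" OLED.

battery_wh = 8.0
runtime_h = 4.0
screen_w = 1.0

total_w = battery_wh / runtime_h         # average system draw while gaming
soc_mem_w = total_w - screen_w           # what's left for SoC + memory + the rest

print(f"total: {total_w:.1f} W, SoC + memory budget: ~{soc_mem_w:.1f} W")
```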
 
The Vita uses 4x 444MHz Cortex-A9, whereas contemporary flagship phones from late 2011 / early 2012 used 2x ~1GHz Cortex-A9. So by a phone-to-console comparison, on the CPU side the Vita was well ahead of what the Switch seems to be if it's 4x A57 @ 1GHz: the Vita had a higher core count, the Switch has a lower one.

The Galaxy S3 was released in May 2012 with a quad-core 1.4GHz Cortex-A9. The Galaxy S2 was May 2011, with 2x 1.2GHz Cortex-A9. The latter is probably the closer comparison to the Vita since AFAIK it was made on the same process generation.

Power consumption scales super-linearly with clock speed (within some boundaries). Under comparable designs a 444MHz Cortex-A9 will consume far less than 37% of what the 1.2GHz part does, probably closer to 20% or lower. This would mean that the dual-core 1.2GHz Cortex-A9s should be budgeted to consume well over twice the power that the quad-core 444MHz Cortex-A9s consume.
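As a crude illustration, modelling dynamic power as f·V² with assumed (not measured) voltages for the two operating points:

```python
# Relative power of a 444 MHz Cortex-A9 vs. a 1.2 GHz one, modelling dynamic
# power as f * V^2. The voltages are assumptions for illustration, not
# measured values for any actual SoC.

f_low, v_low = 444e6, 0.90      # hypothetical Vita-style operating point
f_high, v_high = 1.2e9, 1.25    # hypothetical phone operating point

rel = (f_low * v_low**2) / (f_high * v_high**2)
print(f"one 444 MHz core: ~{rel:.0%} of one 1.2 GHz core")            # ~19%
print(f"2x 1.2 GHz vs 4x 444 MHz: ~{2 / (4 * rel):.1f}x the power")   # ~2.6x
```

With those placeholder voltages you land right around the 20% figure, and the dual 1.2GHz cluster ends up drawing roughly 2.6x what the quad 444MHz cluster does.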

There's no question in my mind that Vita was designed with less power budgeted for the CPU cores than contemporary phones, so that more would be afforded for the GPU.

I would also consider the 4x 1GHz Cortex-A57s in the Switch's X1 to consume maybe half of what other mobile devices with A57s on similar processes would. The extra cores in mobile devices are almost entirely there to extend the low end of the perf/W dynamic range; they're not as useful for a dedicated gaming device.

4 hours gaming from 8Wh battery. 2 watts total.
Screen is around 1 watt (5 inch oled from 2011 tech).
So there's no way the main SoC and memory consumes much more than 1 watt while gaming.

It's almost as if Sony didn't try to cheat the laws of physics :confused:

When you get down to it, power efficiency (either through process or uarch) has not improved nearly as much as peak performance has. Mobile device peak power consumption has gone way up, especially for the GPU part, and there's a lot of throttling. Everyone is chasing benchmark scores and burst performance.

This makes it a lot harder today to make a reasonable gaming handheld that compares well with other mobile platforms in max performance than it was in 2011.
 
4 hours gaming from 8Wh battery. 2 watts total.
Screen is around 1 watt (5 inch oled from 2011 tech).
So there's no way the main SoC and memory consumes much more than 1 watt while gaming.

It's almost as if Sony didn't try to cheat the laws of physics :confused:

My point was that the Vita got a downclocked CPU but GPU performance equivalent to top-end tablets at the time of release. The Switch, if it has only 2 SM at 300MHz, will be far away from an iPad Pro's A9X in 3D performance.
Please explain how that constitutes "trying to cheat the laws of physics".


Regardless and since you went that road, I don't know if you own a Vita but:
1 - You definitely don't get 4 hours from demanding games like Killzone Mercenary. I remember the console would get little more than 2 hours even with minimum brightness - and yes, it would get a bit hot.
2 - Screen consumption depends on brightness. My bet would be 1.5W is for maximum brightness and minimum may be 500mW, but we'd need sources.


As for the rest, I'm not questioning the clocks claimed by Eurogamer's latest article. I actually said many times I thought the ideal CPU for the Switch would be something like 8 Cortex A53 at 1.5GHz, and a quad Cortex A57 1GHz may not be that far away in both performance and power consumption.
I do question the specs stated by the twitter leak, which sound like a load of BS to me for many reasons.
 
@ToTTenTranz
I 100% agree that I was surprised by the clock speeds, but we have seen this show before with Nintendo. What we have also seen is that we tend to look at specs at a very superficial level. Remember when the bandwidth to the main memory on the Wii U was figured out? People immediately assumed it must be starved for bandwidth, 12.8GB/s just can't be enough... Developers reported that there was no such issue with the memory; both Shin'en and Criterion said that memory performance was solid.

If we rewind just a couple of weeks, we have FromSoftware talking about having Dark Souls 3 running with good performance on Switch, and Bethesda calling the Switch the best demo they have ever seen and confirming that Skyrim will be on Switch. A week ago these developer comments painted performance on Switch in a favorable light; we get some clock speeds and all of a sudden Dark Souls 3 doesn't run on Switch? Someone should inform FromSoftware that Dark Souls doesn't run on Switch, despite what they see in their office. LOL

Seriously though, either this custom Tegra chip gets far better performance than one would expect from the numbers, or these modern engines scale down a lot more easily than people would expect. Right now we basically have people assuming that the specs mean no western third-party support. What happens if on January 13 during the reveal Call of Duty and Assassin's Creed (insert any western third-party game, really) are both demoed running just fine on the Switch? People are saying it won't happen, but what if it does? Could be interesting.

People thought the Wii U was memory starved because they thought it had 350-500 GFLOPS; in reality it was 176, which was a good fit.
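In bytes-per-FLOP terms, counting only the 12.8GB/s main memory and ignoring the eDRAM entirely, so a very rough sketch:

```python
# Main-memory bandwidth available per FLOP for the Wii U GPU, at the actual
# ~176 GFLOPS vs. the 350-500 GFLOPS people expected. This ignores the on-die
# eDRAM, so it's only a rough indicator of how "starved" the GPU would be.

bandwidth_gbs = 12.8

for gflops in (176, 350, 500):
    print(f"{gflops:3d} GFLOPS -> {bandwidth_gbs / gflops:.3f} bytes/FLOP")
```

The same 12.8GB/s looks a lot less starved against 176 GFLOPS than against the 350-500 GFLOPS people were assuming.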

My point was that the Vita got a downclocked CPU but GPU performance equivalent to top-end tablets at the time of release. The Switch, if it has only 2 SM at 300MHz, will be far away from an iPad Pro's A9X in 3D performance.
Please explain how that constitutes "trying to cheat the laws of physics".


Regardless and since you went that road, I don't know if you own a Vita but:
1 - You definitely don't get 4 hours from demanding games like Killzone Mercenary. I remember the console would get little more than 2 hours even with minimum brightness - and yes, it would get a bit hot.
2 - Screen consumption depends on brightness. My bet would be 1.5W is for maximum brightness and minimum may be 500mW, but we'd need sources.


As for the rest, I'm not questioning the clocks claimed by Eurogamer's latest article. I actually said many times I thought the ideal CPU for the Switch would be something like 8 Cortex A53 at 1.5GHz, and a quad Cortex A57 1GHz may not be that far away in both performance and power consumption.
I do question the specs stated by the twitter leak, which sound like a load of BS to me for many reasons.

When was the last time leaked specs for a Nintendo console made sense? When it comes to power or saving pennies, Nintendo would rather save the pennies, as proven by the Wii and Wii U. They don't care about power, period.
 
My point was that the Vita got a downclocked CPU but GPU performance equivalent to top-end tablets at the time of release. The Switch, if it has only 2 SM at 300MHz, will be far away from an iPad Pro's A9X in 3D performance.
Please explain how that constitutes "trying to cheat the laws of physics".


Regardless and since you went that road, I don't know if you own a Vita but:
1 - You definitely don't get 4 hours from demanding games like Killzone Mercenary. I remember the console would get little more than 2 hours even with minimum brightness - and yes, it would get a bit hot.
2 - Screen consumption depends on brightness. My bet would be 1.5W is for maximum brightness and minimum may be 500mW, but we'd need sources.


As for the rest, I'm not questioning the clocks claimed by Eurogamer's latest article. I actually said many times I thought the ideal CPU for the Switch would be something like 8 Cortex A53 at 1.5GHz, and a quad Cortex A57 1GHz may not be that far away in both performance and power consumption.
I do question the specs stated by the twitter leak, which sound like a load of BS to me for many reasons.
Okay, I agree we don't have a lot of true confirmation beyond the clocks and the stock X1 probably being used in an early dev kit.

But there aren't a lot of possibilities to connect the dots. In the past, main SoC power consumption wasn't the limiting factor it is today. So with your imaginary specs (what are they? Are you saying it would be 4 SM? With 128-bit LPDDR4?), how many watts would the Switch consume when docked, and how many watts in portable mode, considering the only difference is the GPU running at 40% of its docked clock with the rest of the chip left at the same clocks?
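To make the question concrete, this is the sort of napkin model I mean; every wattage and voltage below is a placeholder, not a known Switch figure:

```python
# Hypothetical docked vs. portable power split, assuming only the GPU clock
# changes (768 -> 307.2 MHz) and everything else runs at the same clocks.
# All wattages and the voltage ratio are placeholders, not real Switch data.

def gpu_power(p_docked_w, f_ratio, v_ratio):
    """Scale GPU dynamic power by frequency and the square of voltage."""
    return p_docked_w * f_ratio * v_ratio**2

GPU_DOCKED_W = 6.0      # assumed GPU power at 768 MHz
REST_W = 4.0            # assumed CPU + memory + rest of the system, both modes
F_RATIO = 307.2 / 768   # = 0.4, the leaked clock ratio
V_RATIO = 0.8           # assumed voltage reduction at the lower clock

portable_gpu_w = gpu_power(GPU_DOCKED_W, F_RATIO, V_RATIO)
print(f"docked:   ~{GPU_DOCKED_W + REST_W:.1f} W")
print(f"portable: ~{portable_gpu_w + REST_W:.1f} W (GPU alone ~{portable_gpu_w:.1f} W)")
```

Plug in whatever imaginary specs you like; the point is that a bigger GPU balloons the docked number while the portable number still has to fit a handheld's battery and cooling.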
 
Nvidia's public reputation and their B2B reputation are totally disparate. Remember their dealings with Sony and Microsoft.
I see where you are going and I agree. I don't think that Nvidia are schemers, bad partners, etc. They actually hold up their side of the deal and have a reputation to defend and improve on (the CEO is not a guy to look back for too long).
What I'm saying is that they are also sharp negotiators. If they can sell something they already have, something that will make them money and spare them from dedicating manpower to a new project, they will fight the long fight to get there. It seems to me that for the Switch they managed to sell Nintendo the Tegra X1 (I may be wrong), and that is a great sale for them: no R&D, they will provide support, but most of the investment was already made. From there it is easy to order more wafers and collect the money.

It would be the same if (just an example) AMD managed to sell MSFT or Sony a Beema/Mullins SoC + an existing discrete GPU (say Cape Verde or Bonaire) for a good amount of money (versus developing something custom, locking up human resources, etc.).
 
allegedly leaked NVIDIA in discussion with Nintendo for NX
 
Okay, I agree we don't have a lot of true confirmation beyond the clocks and the stock X1 probably being used in an early dev kit.

But there aren't a lot of possibilities to connect the dots. In the past, main SoC power consumption wasn't the limiting factor it is today. So with your imaginary specs (what are they? Are you saying it would be 4 SM? With 128-bit LPDDR4?), how many watts would the Switch consume when docked, and how many watts in portable mode, considering the only difference is the GPU running at 40% of its docked clock with the rest of the chip left at the same clocks?

I don't know what he has in mind, but I agree with the majority of the points in the post you quoted. Here's the Pixel C: https://gfxbench.com/device.jsp?benchmark=gfx40&os=Android&api=gl&cpu-arch=ARM&hwtype=GPU&hwname=NVIDIA(R) Tegra(R) X1&did=28156617&D=Google Pixel C
...the GPU runs at a peak of 850MHz if memory serves well. In the Manhattan 3.1 long-term performance test it seems to lose up to 50% of its original performance, which would correspond to a sustained GPU frequency of around 425MHz instead. Ironically, when someone mentions in other cases how solution A or B throttles, it usually gets marked as insignificant in the grander scheme of things; however, under realistic gaming conditions, as in a gaming console, it's obviously a completely different chapter than running a device through a synthetic benchmark for a couple of seconds.

On a speculative basis, IMHO it means that if the Switch SoC is manufactured on 20SoC like Erista, the leaked frequencies should be quite close to the final specifications, if they aren't final already. In the less likely case that it is on one of the recent FinFET process variants, things could be quite a bit better.

For the record, the specifications aren't really disappointing to me, at least not from the usual NINTENDO perspective. They might not want to market it as a pure handheld, but compared to the 3DS the Switch is leaps and bounds ahead, starting in mobile mode. Other than that, for the usual "could have" or "might have been" alternatives, I challenge anyone here to name ONE of the usual suspects that could/would design such an ULP SoC and support the Switch with a low-level API, among other things, as well as NVIDIA can.
 
The Eurogamer article stated GPU clocks of 307.2/768MHz (undocked/docked) and a CPU clock of 1020MHz in either state.
It bothers me a bit that when I look at documentation regarding the Tegra X1 performance stages, link, none of these numbers appear in the documentation. Indeed, I can't find two graphics states that are a factor of 2.5 apart, so adjusting a reference clock somewhere won't produce these rumoured clocks.
 
The Eurogamer article stated GPU clocks of 307.2/768MHz (undocked/docked) and a CPU clock of 1020MHz in either state.
It bothers me a bit that when I look at documentation regarding the Tegra X1 performance stages, link, none of these numbers appear in the documentation. Indeed, I can't find two graphics states that are a factor of 2.5 apart, so adjusting a reference clock somewhere won't produce these rumoured clocks.
That page appears to be for the Tegra K1 (A15).
 