Nintendo Switch Technical discussion [SOC = Tegra X1]

Nintendo's been around the block. Nvidia isn't able to convince them they don't need a wider bus, eDRAM, more cores or whatever for a well-rounded, powerful gaming device. Seems to me Nintendo just wanted to maximize profit and do as little extra work as possible, and Nvidia fulfilled that need. I'm disappointed in the Switch's hardware as well, but it wasn't Nvidia's doing.

I'd argue there was some deceit from Nvidia with regard to the PS3's GPU (though it's also true the PS3 needed to be delayed, and unified vs. fixed shaders *may* not have seemed so clear-cut in early 2005), and they're scum for sending the original Xbox to an early grave (how cool would a reliable slim Xbox have been?), but whatever the Switch is, it's all on Nintendo.

I have been saying it's going to be a regular TX1 since Eurogamer posted the rumor six months ago. Look at the Wii and Wii U: Nintendo doesn't believe that good specs and third-party support will sell their consoles; they believe more in gimmicks. Nintendo could easily make a console with PS4-level power and sell it at $250-300 for a profit, they just don't believe it would work.
 
Uhm. I wanted to see the 16nm Parker chip in the Switch as much as anyone, but let's not go completely overboard with disappointment here. It would have cost significantly more, it would have made retail availability a question mark, and it wouldn't have meant all that much in terms of performance.

TSMC 16nmFF provides roughly a 20% increase in clocks or a 40% reduction in power over 20SoC (example source). Since the Switch is limited by power draw, you could have gotten 1/0.6 ≈ 1.67, i.e. roughly 65-70% higher performance out of the Parker GPU at the same power draw, in spite of the doubled GPU and bus widths. Realistically some of that headroom (all of it?) might have been spent on improving battery life instead of performance, or spent in part on higher CPU clocks.
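
To put rough numbers on that, here's a back-of-the-envelope sketch (just arithmetic in Python) that takes the 20% clock / 40% power figures above at face value and assumes a power-limited design can convert the saved power back into performance roughly 1:1:

[code]
# Back-of-the-envelope node-scaling estimate. Assumes the quoted TSMC
# 16nmFF vs 20SoC figures (~20% higher clocks at the same power, or
# ~40% lower power at the same clocks) and that a power-limited design
# converts the saved power back into performance roughly 1:1.

power_ratio_16ff = 0.60  # same clocks, ~40% less power than 20SoC

iso_power_speedup = 1 / power_ratio_16ff
print(f"Iso-power speedup: {iso_power_speedup:.2f}x "
      f"(~{(iso_power_speedup - 1) * 100:.0f}% higher performance)")
# -> Iso-power speedup: 1.67x (~67% higher performance)
[/code]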

This also implies that we may well not see a 16nmFF update to the Switch; the benefit over the current SoC is rather modest. A performance-improved rehash may not make sense until 7nm, and then only if sales volumes are good enough to justify the design costs. This depends a lot on nVidia's plans of course, since I can't imagine that Nintendo will commission fully custom chips on those nodes if they didn't want to do it on 16nm.
 
So they are basically doubling down on the Wii U, that's what this is. Or maybe a morph between that and the 3DS.
and it wouldn't have meant all that much in terms of performance.

Isn't mem b/w a huge limitation on the Switch and one of the selling points of Parker though? It offers more than double the b/w the TX1 does.
 
Uhm. I wanted to see the 16nm Parker chip in the Switch as much as anyone, but let's not go completely overboard with disappointment here...

Nintendo are in a tough spot, with the decline of traditional handheld gaming and being squeezed out of the traditional console market, so they needed to do something new. But the Switch is still a risk, no matter how much Nintendo need it.

If the TX1 helped them plan and develop the system they need, and crucially to manage the risks (and costs) they face in trying out the Switch, then that's a legitimate plus point for its use.

And the results it's delivering in handheld mode are pretty great.
 
Isn't mem b/w a huge limitation on the Switch and one of the selling points of Parker though? It offers more than double the b/w the TX1 does.

TX2 is double the BW at the same memory clocks. Switch BW at mobile clocks doesn't seem so bad, even taking into account that it's shared with the CPU. It's docked mode where things will probably get pretty tight, though working in a tile-friendly way should help significantly.

The trouble with a double width bus is that it'll need powering in mobile mode, where there would be much less benefit from it. Mobile phone chipsets don't seem to like going over 64-bit at any rate.
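
For a rough feel of the raw numbers, a quick sketch. The transfer rate below is an assumed LPDDR4-3200 figure for illustration, not a confirmed Switch spec; only the bus widths differ between the two cases:

[code]
# Peak-bandwidth comparison at equal memory clocks, assuming LPDDR4.
# Bus widths: 64-bit on TX1 (and, per this discussion, on Switch),
# 128-bit on TX2/Parker. The 3200 MT/s rate is illustrative only.

def peak_bw_gbs(bus_width_bits, transfer_rate_mtps):
    # bytes per transfer * MT/s gives MB/s; divide by 1000 for GB/s
    return (bus_width_bits / 8) * transfer_rate_mtps / 1000

rate = 3200  # MT/s (LPDDR4-3200) -- assumption for illustration
print(f" 64-bit @ {rate} MT/s: {peak_bw_gbs(64, rate):.1f} GB/s")   # ~25.6
print(f"128-bit @ {rate} MT/s: {peak_bw_gbs(128, rate):.1f} GB/s")  # ~51.2
[/code]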
 
The trouble with a double width bus is that it'll need powering in mobile mode, where there would be much less benefit from it...

So it would only really benefit docked mode? That's not a big deal in favor of TX2 if true, at least for the Switch.
 
So it would only really benefit docked mode? That's not a big deal in favor of TX2 if true, at least for the Switch.

If you were to improve performance by 65% with a possible move to TX2, as suggested by Entropy, then it'd probably be different.

TX2 doubles both bus width and the number of CUs, and so maintains a similar performance / BW ratio to TX1.
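
Put another way, a trivial sketch of the ratio argument, using the 2x scaling claimed above (those factors are that post's claim, not something I've verified):

[code]
# If GPU throughput and memory bandwidth scale by the same factor, the
# bandwidth available per unit of performance is unchanged, so the wider
# chip is no more bandwidth-starved than the narrow one. The 2.0 factors
# are the claim from the post above, not verified figures.

tx1_perf, tx1_bw = 1.0, 1.0        # normalised TX1 baseline
scale_perf, scale_bw = 2.0, 2.0    # claimed TX2 scaling vs TX1

tx2_perf, tx2_bw = tx1_perf * scale_perf, tx1_bw * scale_bw
print(tx1_bw / tx1_perf, tx2_bw / tx2_perf)  # -> 1.0 1.0 (same BW/perf ratio)
[/code]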
 
Going off the numbers provided by AnandTech regarding the TX2 it could easily have been in the Switch. They could clock it way down for maximum efficiency and still provide better performance.

I believe the TX1 was purely a decision based on cost, and nVIDIA cutting them a sweet deal for a chip that really didn't go into anything else besides cars.

The only thing custom about the TX1 used in the Nintendo Switch is clock speed.

Would be fun to see someone buy a Jetson TX1 and a TX2 and run some numbers for us.
 
...Nintendo doesn't believe that good specs and third-party support will sell their consoles; they believe more in gimmicks...
Gimmick is a nice buzzword to use against Nintendo, but it literally means anything designed to garner interest. By that definition, more powerful hardware is a gimmick too.

The Wii was really an excellent console, and the Wiimote gave us many great games that wouldn't have been possible with traditional controls. The Wii U is the one that didn't have its head on straight; it's not good, in all honesty. And I hate that the Wiimote is seemingly lost to history now. No, I don't want those little gyro sticks, I want my IR pointer :/

----

Anyways, my beef with the Switch hardware (power-wise anyway) isn't even its power target, it's that it's not balanced; Nintendo didn't maximize its potential. The Wii at least was a very balanced machine, and impressive considering it pulled less than 20 watts from the wall, a tenth of its competition at the time. The Switch is bandwidth-starved in docked mode and doesn't have all 4 cores available to games. They just didn't put in the effort.
The trouble with a double width bus is that it'll need powering in mobile mode, where there would be much less benefit from it...

Thing is, I have no interest in the Switch as a handheld; it'll stay docked. If they wanted to, they could have worked something out.
 
If the whole system is just a TX1 with no changes, then in theory the Switch OS could be ported onto another TX device. And if so, onto a TX2 to see what happens. I hope some hackers give that a go!
 
The Wii was really an excellent console...
The problem with Wii is that it could have been so much better with decent hardware; what was good about it wasn't dependent on being outdated tech and better tech wouldn't have cost lots. Switch could be better but at what cost? Some think very little.
 
The problem with Wii is that it could have been so much better with decent hardware; what was good about it wasn't dependent on being outdated tech and better tech wouldn't have cost lots. Switch could be better but at what cost? Some think very little.
I love the GC backwards compatibility though. And I love how reliable it is, partially due to its low, low power consumption. Lack of HD is my only gripe, the (good) games look great otherwise. But I've mitigated it with my choice of TV. With the Wii I can at least understand any choices they made.

If they had clocked Broadway at 1GHz, added a modern GPU around 3x the GC's, and just thrown in Flipper for legacy support, that probably would've been the best approach. Then add further reliability through die shrinks. But that's the most I'd want them to do; I wouldn't want it to have been a 200-watt monster destined to die an early death. Which is what you're probably alluding to, so they could have platform parity.

In fact, simply because the Wii was much weaker than the other consoles, it became the target platform for a number of exclusive games.
 
The problem with Wii is that it could have been so much better with decent hardware;

Same thing for the Switch imo. Nintendo was so close to delivering -somewhat- current-generation performance on a portable device. If it were possible to port Xbox One games to the Switch without too many compromises (other than resolution), that'd be something noteworthy in this industry, and it's something that people always talk about.
 
Same thing for the Switch imo. Nintendo was so close to delivering -somewhat- current-generation performance on a portable device. If it were possible to port Xbox One games to the Switch without too many compromises (other than resolution), that'd be something noteworthy in this industry, and it's something that people always talk about.
This is hypothetical, and my counter-argument all along has been: "Xbox One S draws 50 watts; how do you get that power down to something in a cost-effective handheld?" The arguments for using TX2 assume it's cost effective, but none of us know that. If Nintendo could have used TX2 and doubled the RAM BW at negligible extra cost, they should have. The question then is why they didn't: was it really a viable option, or did Nintendo cheap out, or what? What I would say is that nVidia did not pull the wool over Nintendo's eyes and trick them into a turkey. The choice was Nintendo's. The worst nVidia could have done is offer TX2 at a greatly inflated price.
 
Same thing for the Switch imo. Nintendo was so close to delivering -somewhat- current-generation performance on a portable device. If it were possible to port Xbox One games to the Switch without too many compromises (other than resolution), that'd be something noteworthy in this industry, and it's something that people always talk about.

With what SoC? The X1 is already the most powerful 3D mobile SoC for a product designed in 2015.
 
This is hypothetical, and my counter-argument all along has been: "Xbox One S draws 50 watts; how do you get that power down to something in a cost-effective handheld?"...

Agreed, we don't really know the specifics other than Nintendo chose the TX1 over TX2.
With what SoC? The X1 is already the most powerful 3D mobile SoC for a product designed in 2015.

Release the Switch later in 2017 and use the TX2. But like I said, we don't really know the specifics, other than that the TX2 offers better performance per watt than the TX1.
 
Which is what you're probably alluding to...
No. They could have remained small and quiet, just using decent contemporary tech such as the Radeon R300 series. DX9, SM2, decent antialiasing even at SD res.
I agree that would have been a good option. And now that you mention it, 480p plus some AA and good AF would've been smarter than wasting it all on 720p. The Wii's issue isn't really 480p, it's that there's zero AA...
 
Going off the numbers provided by AnandTech regarding the TX2 it could easily have been in the Switch. They could clock it way down for maximum efficiency and still provide better performance.

The numbers AnandTech got for the Switch un-docked were taken at the wall power supply, and are much higher than they would be if they were measuring draw from the battery while unplugged.

The TX2 in its 7.5W configuration is way above the TX1 in the Switch, even with the Denver cores completely unused.

The more significantly you have to downclock, the lower the performance per mm^2 and per $$ the chip represents.
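
As a toy example of what I mean, with completely made-up numbers (and assuming performance scales roughly linearly with clock):

[code]
# Toy illustration: a bigger chip that has to be heavily downclocked to
# fit the same power budget delivers less performance per mm^2 (and, to
# the extent cost tracks die area, per dollar) than a smaller chip
# running near its intended clocks. All figures below are hypothetical.

def perf_per_mm2(perf_at_full_clock, clock_scale, die_area_mm2):
    # assumes performance scales roughly linearly with clock
    return perf_at_full_clock * clock_scale / die_area_mm2

small_full_clock = perf_per_mm2(1.0, 1.0, 120)  # hypothetical smaller die, full clock
big_downclocked  = perf_per_mm2(1.3, 0.6, 150)  # hypothetical bigger die, heavily downclocked

print(f"small chip, full clock: {small_full_clock:.4f} perf/mm^2")
print(f"big chip, downclocked:  {big_downclocked:.4f} perf/mm^2")
# The downclocked big chip ends up worse per mm^2: you pay for silicon
# you aren't fully using.
[/code]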

Not to mention the TX2 embedded dev boards only just launched, nearly a year after basically final Switches were in Nintendo's hands.
 