Switch 2 Speculation

The marketing implies a footprint smaller than the Tegra X2's, I think, while being pin-compatible with the Jetson Nano. My impression was that it's a new SKU on 7nm.

so..... ¯\_(ツ)_/¯

Or I’m crazy :V

The Xavier NX module is smaller than the Tegra X2 module and is very compact, but the chip itself is the same Xavier 12nm chip with portions disabled and running at lower clock speeds.
 
What can we realistically expect from a Switch Pro/2 in 2020/21?

As the Xavier NX is way too big and no other available chip could make a big enough leap over the TX1, I can only see a custom chip on 7nm EUV with an 8-core CPU cluster (either 8 Cortex-A77 or a 4x A77 / 4x A55 configuration), 512 Ampere CUDA cores and 8GB of LPDDR4X (or GDDR5) on a 128-bit interface. This should be doable under 150 mm² and would enable at least 2.5-3x the performance of the original Switch.
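
As a rough sanity check on that "2.5-3x" figure, here's a back-of-the-envelope FP32 throughput comparison (the Switch's 768 MHz docked GPU clock is the widely reported value; the 1.0 GHz clock for the hypothetical 512-core part is purely an assumption for illustration):

```python
# Back-of-the-envelope FP32 throughput comparison (hypothetical successor figures).
# GFLOPS = shader cores * 2 ops/clock (FMA) * clock in GHz.

def gflops(cores, clock_ghz):
    return cores * 2 * clock_ghz

# Original Switch (Tegra X1, Maxwell): 256 CUDA cores at ~768 MHz docked.
switch_docked = gflops(256, 0.768)

# Hypothetical 512-core successor; the 1.0 GHz docked clock is an assumption, not a spec.
successor_docked = gflops(512, 1.0)

print(f"Switch docked:    ~{switch_docked:.0f} GFLOPS")
print(f"Successor docked: ~{successor_docked:.0f} GFLOPS "
      f"({successor_docked / switch_docked:.1f}x)")
```

That lands at roughly 2.6x on raw throughput alone, before any per-clock architectural gains, which is consistent with the 2.5-3x estimate.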
 
What can we realistically expect from a Switch Pro/2 in 2020/21?

As the Xavier NX is way too big and no other available chip could make a big enough leap over the TX1, I can only see a custom chip on 7nm EUV with an 8-core CPU cluster (either 8 Cortex-A77 or a 4x A77 / 4x A55 configuration), 512 Ampere CUDA cores and 8GB of LPDDR4X (or GDDR5) on a 128-bit interface. This should be doable under 150 mm² and would enable at least 2.5-3x the performance of the original Switch.

I don't think there is a chance it would be Ampere. Nvidia would charge too much for it.
 
I don't think there is a chance it would be Ampere. Nvidia would charge too much for it.

What do you mean by that? It doesn't really depend on which generation the GPU comes from, but on transistor count and die size. It could also use a shrunk Volta GPU of course, and for what it's worth, I don't really care. I just used Ampere because it will already be on 7nm and will have some architectural improvements, so it makes a lot of sense from my point of view.
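
To illustrate why die size dominates the manufacturing side of the cost (independent of which GPU generation the transistors implement), here's a rough dies-per-wafer sketch. The wafer price and defect density are placeholder assumptions, not known 7nm figures:

```python
import math

# Rough cost-per-die arithmetic: gross dies per 300mm wafer and a simple
# Poisson yield model.  Wafer price and defect density are illustrative
# assumptions, not actual foundry numbers.

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Standard approximation for gross dies on a round wafer."""
    r = wafer_diameter_mm / 2
    return (math.pi * r ** 2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(die_area_mm2, wafer_cost_usd, defect_density_per_cm2):
    yield_rate = math.exp(-(die_area_mm2 / 100) * defect_density_per_cm2)  # Poisson yield
    good_dies = dies_per_wafer(die_area_mm2) * yield_rate
    return wafer_cost_usd / good_dies

for area_mm2 in (100, 150, 300):  # small die, the 150 mm^2 proposal, a large die for contrast
    cost = cost_per_good_die(area_mm2, wafer_cost_usd=9000, defect_density_per_cm2=0.1)
    print(f"{area_mm2:>3} mm^2 die: ~${cost:.0f} per good die")
```

Cost per die grows faster than linearly with area (fewer dies per wafer, lower yield), which is why the die-size budget matters far more than whether the transistors are labelled Volta or Ampere.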
 
I don't think there is a chance it would be Ampere. Nvidia would charge too much for it.
Well, if Nvidia charges too much, Nintendo won't buy, and Nvidia gets no revenue.

At this point in time, with Nvidia out of mobile, and Switch a proven concept, it would make a lot of sense for Nintendo to move on to one of the mobile SoC suppliers.
 
What do you mean by that? It doesn't really depend on which generation the GPU comes from, but on transistor count and die size. It could also use a shrunk Volta GPU of course, and for what it's worth, I don't really care. I just used Ampere because it will already be on 7nm and will have some architectural improvements, so it makes a lot of sense from my point of view.

*facepalm*
Yes, it does depend on the generation: Nvidia doesn't charge only the production cost, but also for the 3 or 4 years of R&D it took to develop.
 
Well, if Nvidia charges too much, Nintendo won't buy, and Nvidia gets no revenue.

At this point in time, with Nvidia out of mobile, and Switch a proven concept, it would make a lot of sense for Nintendo to move on to one of the mobile SoC suppliers.

Isn't it obvious that their mobile efforts are now aimed at automotive and AI, a much more lucrative market than mobile gaming? They went with the Switch because they had a chip that was over a year old to sell, not a new one. It was old tech, probably sold for cheap.
 
*facepalm*
Yes, it does depend on the generation: Nvidia doesn't charge only the production cost, but also for the 3 or 4 years of R&D it took to develop.

They have a multi-year agreement with Nintendo and are paying for R&D for the custom chip anyway, regardless of which generation it is based on, I suppose.
 
They have a multi-year agreement with Nintendo and are paying for R&D for the custom chip anyway, regardless of which generation it is based on, I suppose.

Have you seen the agreement? You must have, to be so sure of that. The Switch does NOT have a custom chip at all; it is a standard off-the-shelf Tegra X1. What makes you believe Nvidia is going to develop a custom SoC just for Nintendo? They never did for anyone else, they develop their stuff and sell it. Even the PS3 RSX was a glorified G70 part.
 
Just throwing a current-gen Snapdragon in there would be a huge upgrade in performance.

The severely downclocked Tegra X1 was crappy even for the time the Switch was launched.

I don't see any reason for Nintendo to stick with Nvidia. But then again, picking the Tegra in the first place was a dumb move imo.
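
To put rough numbers on "severely downclocked", here's a quick sketch comparing the Switch's reported clocks against the Tegra X1's nominal ones (figures are the commonly reported values, rounded, so treat them as approximate):

```python
# How far the Switch runs the Tegra X1 below its nominal clocks.
# Figures are the commonly reported ones (approximate, rounded).
nominal = {"gpu_mhz": 1000, "cpu_mhz": 1900}   # Tegra X1 as shipped in e.g. Shield TV
switch = {"gpu_docked_mhz": 768, "gpu_portable_mhz": 307.2, "cpu_mhz": 1020}

print(f"GPU docked:   {switch['gpu_docked_mhz'] / nominal['gpu_mhz']:.0%} of nominal")
print(f"GPU portable: {switch['gpu_portable_mhz'] / nominal['gpu_mhz']:.0%} of nominal")
print(f"CPU:          {switch['cpu_mhz'] / nominal['cpu_mhz']:.0%} of nominal")
```

Portable mode ends up at roughly a third of the chip's nominal GPU clock, which is where most of the "crappy for its time" impression comes from.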
 
Just throwing a current-gen Snapdragon in there would be a huge upgrade in performance.

The severely downclocked Tegra X1 was crappy even for the time the Switch was launched.

I don't see any reason for Nintendo to stick with Nvidia. But then again, picking the Tegra in the first place was a dumb move imo.
Going Nvidia again will ensure compatibility, I would wager.
 
Going Nvidia again will ensure compatibility, I would wager.
Do they care about that tho? They switched it up last gen and were successful without any BC. If they want BC, they would have to not only stick with Nvidia but probably pay for something custom; it's not like any modern Nvidia SoC will be directly compatible with the X1.
 
The Tegra X1 was a fantastic choice for the Switch. It was one of the best, if not the best, in graphics performance during the development of the Nintendo Switch. For a product launching at $299, the Tegra X1 was hard to beat even in March 2017 when the Switch launched. Not to mention the software and development tools that Nvidia brought to the table resulted in a Nintendo console that was very easy to develop for. The partnership with Nvidia was exactly what Nintendo needed, and Nvidia needed a partner who could make all that R&D money spent on the X1 worthwhile. With over 50 million Switch units sold, they found that partner in Nintendo.

I believe the partnership will continue. Nvidia has continued to develop its GPU cores with power efficiency in mind. Pascal cores are more efficient than Maxwell, and Volta cores are more efficient than Pascal. So pairing as many Volta cores as possible with a cluster of ARM CPU cores doesn't seem like it would need a huge budget. The A72 and A73 cores both support the same instruction set as the A57 cores, so there shouldn't be any compatibility issues. Unlike the Tegra X1, this will take some R&D money from Nintendo, but when you're looking at a product that can sell upwards of 100 million units, its cost will be minimal when distributed across all the units sold. The partnership has gone well, so why throw that away?
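
On the instruction set point, ISA-level compatibility is something you can sanity-check mechanically. A minimal sketch of that kind of check, assuming an ARM Linux environment that exposes /proc/cpuinfo (the required feature list here is illustrative, not an official compatibility test):

```python
# Minimal sketch: check whether a target ARM SoC reports the baseline
# ARMv8-A features that A57-era (Tegra X1) binaries typically rely on.
# Assumes ARM Linux with /proc/cpuinfo; the REQUIRED set is illustrative.

REQUIRED = {"fp", "asimd", "aes", "sha1", "sha2", "crc32"}

def reported_features(cpuinfo_path="/proc/cpuinfo"):
    feats = set()
    with open(cpuinfo_path) as f:
        for line in f:
            if line.lower().startswith("features"):
                feats.update(line.split(":", 1)[1].split())
    return feats

if __name__ == "__main__":
    missing = REQUIRED - reported_features()
    print("Baseline ARMv8-A features present" if not missing
          else f"Missing features: {sorted(missing)}")
```

Of course, ISA parity only covers CPU-side code; the GPU architecture and driver stack are a separate question.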
 
Going Nvidia again will ensure compatibility, I would wager.
Not necessarily. If devs are hitting the hardware without a high level abstraction, a change in architecture, even from one iteration of an uarch to another, can break code. There may not be a better nVidia part that's fully BC with the TX1, and Nintendo may need a clean break. However, I think of all the companies, they are the only ones who can do this successfully. Their fanbase is loyal to IPs that other platforms can't offer, so they'll buy new libraries or rebuy their favourite games. As per the old "importance of BC" discussion, it's nice, but not essential if the new platform brings you new content that means you don't care to play the old content.
 
Not necessarily. If devs are hitting the hardware without a high level abstraction, a change in architecture, even from one iteration of an uarch to another, can break code.

Doesn't the new 16nm FinFET TX1 already work differently than the standard X1? Hackers aren't able to hack the newer X1s because of that, I believe. Or would this be totally different?

As for how important backwards compatibility is, I agree that it's nice but not a requirement. The PS4 isn't backwards compatible with the PS3, and that didn't slow it down any. Having a clean break is probably preferred by a lot of third-party developers. The early days of the Switch eShop were very lucrative for a lot of indies. Exposure was much better. If Switch 2 were to be a direct continuation, visibility of software would never be that great.
 
They never did for anyone else, they develop their stuff and sell it. Even the PS3 RSX was a glorified G70 part.
Doesn't the RSX include northbridge functionality?
Regardless, at that time the nForce efforts were still ongoing.

The Tegra X1 was a fantastic choice for the Switch. It was one of the best, if not the best, in graphics performance during the development of the Nintendo Switch.
Of course it wasn't. Do we really need to come back to this every other month?
The Snapdragon 820 that was ready in 2015 would have run circles around the Tegra X1 with a sub-500MHz GPU. During 2015 even Samsung's Exynos 8890 would have been available for sampling to partners, and it too would have provided much better performance at the same power consumption. Either company could have developed a radio-less and ISP-less solution with CPUs optimized for lower frequencies for Nintendo had they paid enough money for it, or either company would have gladly sold Nintendo their high-volume SoCs.

Do you really need to make these broad generalizations about the Tegra X1 being OMG the best possible SoC for a 2017 handheld release when it's very obviously and factually false? During all the ~8 years they were active in the mobile SoC market, nVidia never hit any power/performance jackpot. Their inability to compete is the reason they left that market.

What nvidia did provide to Nintendo (and the others could not) was a vertical stack for software development and optimization, an already existing development team specializing in porting PC games to ARM SoCs, and a close presence next to 3rd-party developers.
I.e. what nvidia could provide in the overall package was money savings in software development / optimization efforts, plus a sizeable discount on an objectively failed existing SoC.
nVidia did not provide a better hardware solution than others could for a handheld console. The benchmarks exist, the power consumption comparisons exist, and they were made by multiple outlets and experts. Let's stop pretending they don't.
 
nVidia's strength has always been software rather than hardware.
I remember all those broken features in nVidia hardware that made my life horrible when I was making AAA games...

As for Switch 2, I have no idea what Nintendo would do, the company is so atypical. The path of least resistance would be to continue with nVidia, but that might not be the best path for Nintendo.
(There's better power/performance hardware with ray tracing from a formerly british company... ;p)
 
nVidia's strength has always been software rather than hardware.
I remember all those broken features in nVidia hardware that made my life horrible when I was making AAA games...

As for Switch 2, I have no idea what Nintendo would do, the company is so atypical. The path of least resistance would be to continue with nVidia, but that might not be the best path for Nintendo.
(There's better power/performance hardware with ray tracing from a formerly british company... ;p)

nvidia's weakness is also software, not hardware. E.g. VRS will only come to RTX even though GTX Turing should also be able to do it, since it only has the tensor cores disabled.

Or does VRS use the tensor cores?
 