Nintendo Switch Technical discussion [SOC = Tegra X1]

Well, the hardware capability to run the same shaders is there, but if the resolution drops so hard that what you see in outdoor scenes (post-FXAA) is blobs of color, as if it were an impressionist painting, you're not really looking at the same game anymore:


They should have leaned into the low-detail problem and given the Switch port a cel-shaded look. It would have made it distinct from the other versions, while turning the lack of detail into an asset instead of a problem.
 
That's why I think Nintendo will be willing to invest money for a customized chip with the Switch successor. ....

Are you implying still with nVidia?

Because I guess the Switch is good for nVidia (they unloaded old, underclocked X1 SoCs...), but it could be different if they had to spend money on R&D, or sell a more recent SoC. Plus, even if we don't know all the details, I'm sure they're allocating some of their staff to supporting the Switch tools, devs, etc... If it was the first step for nVidia to get back into the console game, then yeah, I'm sure they'll be willing to continue, but if it was a "one-shot" kind of thing, I wonder...
 
Are you implying still with nVidia?

Because I guess the Switch is good for nVidia (they unloaded old, underclocked X1 SoCs...), but it could be different if they had to spend money on R&D, or sell a more recent SoC. Plus, even if we don't know all the details, I'm sure they're allocating some of their staff to supporting the Switch tools, devs, etc... If it was the first step for nVidia to get back into the console game, then yeah, I'm sure they'll be willing to continue, but if it was a "one-shot" kind of thing, I wonder...
The PR out of both companies made it sound like a long-term partnership. Jen-Hsun talked about a 20-year commitment. Then again, he says a lot of things.
As Tottentranz pointed out, one of the beautiful aspects of Nintendo's choice of technology is that they could conceivably change suppliers of the system SoC to, for instance, Qualcomm, or even HiSilicon or Samsung, since they don't compete in the same space. Developers could use the same game engines and APIs. If the relationship with Nvidia goes sour, they are far from locked in.
And they can piggyback on all the advances driven by mobile platforms.
From a technological point of view, I would be vastly more confident in Qualcomm being able to deliver an excellent 5nm gaming SoC than in Nvidia.
 
The APIs are done by nVidia on the Switch, no?

If they don't care about BC, sure, they can change SoC/tech, but one of the selling points of the X1 was the nVidia tools, if I remember some articles correctly. I wonder if other SoC suppliers can throw as much manpower at support and dev tools as nVidia can.


OT: I wish Nintendo would make two separate consoles again. One portable, and one for the home, more powerful, but still compatible: if you buy a game for one of them, it would run everywhere. It would allow them to keep a low-power SoC for mobile, but go higher with the X2, Xavier or whatever nVidia can do...
 
Going with Nvidia made a lot of sense at the time. The WiiU was flopping horribly; insecurity from that, and about the viability of the Switch concept in the face of mobile gaming, made going with Nvidia, who could supply the whole package (because they had done most of the work already), a safe move. (Nvidia had almost completely failed to sell the chip to anyone, though, so they even resorted to making their own devices, which went nowhere on the market.)
Minimum risk.
Now however the situation is different.
The Switch concept is more than validated and looks set to sell in the ballpark of 20 million units/year, depending on prices, hit software, public fickleness, et cetera. It means that Nintendo can walk up to any supplier and put ten digits' worth of $ on the table when shopping around for the next-gen solution, and the SoC supplier in turn can make deals for wafer starts in the six-digit range. This gives Nintendo options.

Nvidia hasn't had any mobile presence for years; they repurposed the Tegra line into some kind of automotive chips. The latest one they announced they would design, in March 2018 (Orin), was all about automotive needs, and it hasn't been heard of since.
Apart from the Switch, Nvidia is dead in mobile space. Utterly.
So while it may make sense for Nintendo to turn to Nvidia for a mid-life kicker for the Switch, letting them be the supplier for their next-generation device is a much more questionable proposition. Right at this moment, Qualcomm has a number of 5nm products in design. Nvidia? Anything? Why would a company that has no presence whatsoever in the mobile space, nor prospects of any in the future, be at the forefront of design and development work for mobile solutions? If I were Nintendo, I would bet on Vulkan, and on middleware and engines built on top of that, and go with one of the mobile suppliers. That Nvidia would maintain R&D efforts in the mobile space that are long-term competitive with the companies that yearly sell hundreds of millions of chips into that market isn't a bet I would be willing to make.
We'll see. At least Nintendo can negotiate from a position of some strength.
 
They should have leaned into the low-detail problem and given the Switch port a cel-shaded look. It would have made it distinct from the other versions, while turning the lack of detail into an asset instead of a problem.

Isn't there a PC mod that does that? I wonder if it could be ported to a hacked Switch.

I'd be curious to see the performance boost.
 
Nintendo has a history of not changing suppliers unless there are issues. SGI developed the graphics part of the N64, and much of the team that did it left to form ArtX, who developed the Gamecube's Flipper. ArtX got bought by ATi, and Flipper got repurposed into Hollywood for the Wii, along with the PowerPC CPU that powered the Gamecube. The WiiU was again PowerPC, but this time with AMD graphics, AMD having purchased ATi. It wasn't until the WiiU's failure that Nintendo changed suppliers and started from scratch. I would expect Nintendo to stick with nVidia because the Switch is successful, and nVidia already has faster SoCs available. They already had them when the Switch was released. So there's an upgrade path for them using existing, probably cheaper, technology. Nintendo likes that.
 
I think that interpretation is a bit contrived. Another way to tell the story...

Nintendo went with SGI for the N64 and, because they felt that worked well for them, with the ArtX team. For Wii, because it was a gamble, they just repurposed the GC design, so didn't really 'pick a team'. For Wii U, there weren't a great many options for a GPU vendor. AMD probably offered the only cost-effective solution.

Look at their handhelds for a more complete picture. The DS didn't involve any outside GPU solution. The 3DS went with DMP.

Now we have the Switch, using nVidia, fitting in with both Nintendo's console and handheld lines, and coming from yet another GPU vendor. Effectively, there's no pattern. Nintendo have sourced a solution that provides whatever power and cost target they are after, along with whatever compatibility they want to offer. If they want NSW compatibility, they will go nVidia. If they don't, they'll probably source a more cost-effective solution for their next devices, assuming nVidia can't provide one. Likewise if they change format and go glasses-only VR or something.

So, their next device, an NSW+, will likely be Tegra X2 as a compatible upgrade. Beyond that, it could be anything from anyone.
 
Just how outlandish would it be for Nintendo to just change to an AMD SoC for the Switch 2?

A 5nm Switch 2 with an 8-core Zen 3, a 20-CU Navi, and a single 12GB stack of HBM3 would probably consume very little power when clocked low enough (~2GHz and ~700MHz respectively) for its portable mode. It could run lower-fidelity versions of PS5/XBoxNext games with little faffing for devs, and would have a clear upgrade path to future nodes and architectures.

It seems like such an easy win, but how feasible would BC with the Switch be?
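Quick napkin math on the GPU figure above, for reference. This assumes the standard GCN/RDNA-style estimate of 64 FP32 lanes per CU and 2 ops (FMA) per lane per clock; the CU count and clock come from the post, and the rest is just a sketch:

```python
# Theoretical FP32 throughput of the hypothetical 20-CU Navi part.
# Assumes GCN/RDNA-style CUs: 64 FP32 lanes per CU, 2 ops (FMA) per lane per clock.
def fp32_tflops(cus, clock_ghz, lanes_per_cu=64, ops_per_clock=2):
    return cus * lanes_per_cu * ops_per_clock * clock_ghz / 1000

portable = fp32_tflops(20, 0.7)  # ~700 MHz portable clock from the post
print(f"portable: {portable:.2f} TF")  # → portable: 1.79 TF
```

So roughly 1.8 TF in portable mode by that estimate, which is where the rough "2TF portable" ballpark comes from.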
 
Just how outlandish would it be for Nintendo to just change to an AMD SoC for the Switch 2?

A 5nm Switch 2 with an 8 core Zen 3, a 20CU Navi, and single 12GB stack of HBM3 would probably consume very little power when clocked low enough (~2GHz and ~700MHz respectively) for its portable mode. It could run lower fidelity versions of PS5/XBoxNext games with little faffing for devs, and would have a clear upgrade path to future nodes and architectures.

It seems like such an easy win, but how feasible would BC with the Switch be?

If it's powerful enough, an emulator would probably be feasible, without needing any hardware crutches à la PS3 x PS2.

Currently there's Yuzu, which emulates the Switch, but it's still in very early development.
 
One should immediately doubt the assumptions about power consumption when suggesting a handheld can comfortably run ports of current-gen home consoles consuming >100 watts. If what you say were viable, that would suggest that every generation, a portable could have been made by shrinking the console hardware one node and clocking it low, which has never been even remotely possible. Even three node reductions isn't enough. A launch PS3 consumed ~180 W playing games, while a PS3 slim from 2013 drew ~70 W.
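To put illustrative numbers on that argument: assume (generously) that each full node shrink at constant clocks cuts power to ~0.6x. The 0.6 factor and the shrink counts are assumptions for the sketch; only the ~180 W starting point comes from the post:

```python
# Illustrative scaling: start from a launch PS3 at ~180 W under load and apply
# an assumed ~0.6x power reduction per full node shrink at constant clocks.
def power_after_shrinks(start_watts, node_shrinks, factor_per_node=0.6):
    return start_watts * factor_per_node ** node_shrinks

for n in range(1, 4):
    print(f"{n} shrink(s): ~{power_after_shrinks(180, n):.0f} W")
# → 1 shrink(s): ~108 W
# → 2 shrink(s): ~65 W
# → 3 shrink(s): ~39 W
```

Even after three shrinks you're at roughly 39 W, still an order of magnitude above a few-watt handheld budget, and broadly consistent with the observed 180 W → ~70 W for the later slim models.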
 
Just how outlandish would it be for Nintendo to just change to an AMD SoC for the Switch 2?

A 5nm Switch 2 with an 8-core Zen 3, a 20-CU Navi, and a single 12GB stack of HBM3 would probably consume very little power when clocked low enough (~2GHz and ~700MHz respectively) for its portable mode. It could run lower-fidelity versions of PS5/XBoxNext games with little faffing for devs, and would have a clear upgrade path to future nodes and architectures.

It seems like such an easy win, but how feasible would BC with the Switch be?


The latest agreement between AMD and Samsung to share the RDNA IP for mobile GPUs seems to dictate that they can't compete in the same markets.
That means Samsung probably can't make SoCs above 10W, and AMD can't make SoCs below Raven Ridge's current 15W minimum.

So AMD can do APUs for large tablet form. Anything below that and it's Samsung territory.
 
They should have leaned into the low detail problem and given the Switch port a cell shaded look. Would have made it distinct from other versions, while making the lack of detail an asset instead of a problem.
The Legend of Witcher 3: Breath of the Wild Hunt?
 
If it's powerful enough, an emulator would probably be feasible, without needing any hardware crutches à la PS3 x PS2.

Currently there's Yuzu, which emulates the Switch, but it's still in very early development.

What kind of hardware requirements does that emulator have?

Good to know it exists. Even if only as a proof of concept.

Would emulation require an XBoxOne-style approach to BC, or would there be no legal hurdles for Nintendo in emulating an Nvidia SoC on different hardware? I.e., are Microsoft's legal dealings entirely to do with the terms of a game initially being tied to a single platform, or is the alteration/emulation of code a factor too?

One should immediately doubt the assumptions about power consumption when suggesting a handheld can comfortably run ports of current-gen home consoles consuming >100 watts. If what you say were viable, that would suggest that every generation, a portable could have been made by shrinking the console hardware one node and clocking it low, which has never been even remotely possible. Even three node reductions isn't enough. A launch PS3 consumed ~180 W playing games, while a PS3 slim from 2013 drew ~70 W.

You're quite right, but the power consumption seen from some 14nm Raven Ridge APUs is quite promising. There are 15W APUs with higher clockspeeds than I suggested, albeit with something like 8 or 10 Vega CUs. So I think it's within the realm of possibility that a 5nm AMD Switch 2, in portable mode, could surpass the launch PS4 in terms of clockspeeds, raw TF, and bandwidth.

If so, we'd potentially be looking at a 2TF portable VS 10TF home consoles, with near identical architectures. I think that's a small enough difference to make ports viable.

Some games, maybe plenty, still won't be able to have enough stripped from them to make up the gulf between a 2TF system and its 10TF counterparts. But lowering the resolution from reconstructed 4K to reconstructed 1080p, dropping the draw distance, texture resolution, etc. would probably be enough for a good deal of games.

Maybe we'd all get lucky and 60FPS would become the new standard for most PS5 and XBoxNext games, so that halving the framerate would make it that much easier to port to the Switch 2 :D
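For what it's worth, the knobs mentioned here can be tallied up with simple arithmetic (all figures are the ones from this post; it's a crude flop-counting sketch, nothing more):

```python
# How much rendering budget do the suggested cuts buy back, versus the raw
# compute gap? Purely illustrative back-of-envelope accounting.
compute_gap = 10 / 2                           # 10 TF home console vs 2 TF portable
pixel_savings = (3840 * 2160) / (1920 * 1080)  # 4K -> 1080p render target
framerate_savings = 60 / 30                    # 60 fps -> 30 fps
budget_recovered = pixel_savings * framerate_savings
print(f"gap: {compute_gap:.0f}x, recovered: {budget_recovered:.0f}x")
# → gap: 5x, recovered: 8x
```

By this crude measure, the resolution drop plus the framerate halving alone recover more budget (8x) than the raw compute gap (5x), before even touching draw distance or textures. It ignores bandwidth, CPU, and memory differences, of course.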

The latest agreement between AMD and Samsung to share the RDNA IP for mobile GPUs seems to dictate that they can't compete in the same markets.
That means Samsung probably can't make SoCs above 10W, and AMD can't make SoCs below Raven Ridge's current 15W minimum.

So AMD can do APUs for large tablet form. Anything below that and it's Samsung territory.

15W Switch 2 here we come!
 
They should have leaned into the low-detail problem and given the Switch port a cel-shaded look. It would have made it distinct from the other versions, while turning the lack of detail into an asset instead of a problem.

The surefire way to make a Switch port would have been to turn The Witcher into a spin-off cel-shaded anime top-down turn-based RPG in the Mushroom Kingdom. But it seems CDPR loves their game more than they love money. In short, they dumb.
 
I am inclined to believe the Nvidia relationship will continue with the release of the Switch's successor. The Nvidia support provided for development tools has been significant in just how improved the development environment has been with the Switch compared to previous Nintendo platforms. Even if Nvidia no longer has mobile chips as part of its business model, providing chips for a platform that can sell upwards of 100 million units makes it more than worthwhile. What's to stop Nvidia from combining as many Volta cores as possible with some modern ARM cores, pulling around 6 watts portable and 10-15 docked? The CPU cores are already designed for mobile, and their Volta cores are more power-efficient than the Maxwell/Pascal cores. The Switch also sits in a bit of a different space: pulling 6 watts would probably be considered way too high for a phone processor, but for a system that has active cooling, not so much.
 
TBH, in the short term, they'll probably just pick the cheapest option. They can do something a little more fancy for the major successor along with more time for QA/compatibility testing.
 