Switch 2 Speculation

With the Switch being so successful, I doubt Nintendo and Nvidia would have any trouble doing a custom SoC next time. I'm not suggesting anything exotic, but perhaps they take the Xavier SoC and remove all the automotive stuff. I can't imagine that would require a huge R&D budget, and R&D costs become minimal when they're spread across a product that could sell 100 million units. The next Switch will likely still target a $300, or perhaps $349, launch price, so component selection will always depend on what they can use to hit that price and still make money.

I don't think either company will want to waste the money when an off-the-shelf chip will do them both fine.

If you're talking about removing the ISP (Image Signal Processor) or PVA (Programmable Vision Accelerator), I dunno, maybe, but is it worth stripping them out? I doubt they take up a lot of room or use a lot of power, and they could always be used to assist the system with AR, VR, or general tasks. They are programmable.


To be honest, there is no competitor for a dedicated portable console; Nintendo, as of now, doesn't have to worry about anyone else. So why dump millions, if not hundreds of millions, into making a custom Xavier, doing all the masking and QA, and then producing chips that will only be used in the new Switch?

At that point, if I were Nintendo, I would take that money, put it toward the price difference between Xavier and Orin, and eat the added expense for the first X million units.

Orin is Ampere based and has a 12-core ARM CPU. I bet that could sit right under an Xbox Series S.
 
Any rumors or leaked info on whether the Switch 2 will have a good quality screen? I mean color volume, contrast ratio, no weird artifacts*, etc.

The Switch and Nintendo's previous handhelds have had abysmal screens compared to competitors...

*Although to be fair, the artifacts on Nintendo screens are usually only visible to very few people. Case in point: the scanline/interlacing blemish on the Switch and the IPS 3DS. Very few people are able to see it.
 
Isn't mini LED more expensive than OLED at this point?

Very much so. AMOLEDs (just a kind of OLED), on the other hand, are cheap; they already fit into $300 phones. Heck, HDR OLEDs do, so technically the Switch 2 could have an HDR screen, OLED or LED, if Nintendo wanted, and they'd still grab a big profit. It sounds smart to me too: HDR is one of the few recent advances the average person can easily see in a side-by-side comparison, while ray tracing isn't. I think that's forgotten here too often; "does the average consumer care which of these two is better, side by side?" matters far more to 90% of customers than staring at an in-game window to see if a reflection is "right". Heck, forget 1080p: they could make it a 7.5" (about 19cm) 900p screen and it'd still sell like hotcakes, and people would coo over how good the screen looks. Save 1080p for docked mode.
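As a sanity check on that screen talk, pixel density is easy to compute: diagonal pixel count divided by diagonal inches. A quick sketch (the 7.5"/900p panel is hypothetical, per the post above; the 6.2"/720p figures are the launch Switch's):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution over diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Hypothetical 7.5" 16:9 panel at 900p (1600x900) vs the launch Switch's 6.2" 720p LCD
print(round(ppi(1600, 900, 7.5)))  # -> 245
print(round(ppi(1280, 720, 6.2)))  # -> 237
```

So a 7.5" 900p screen would actually be slightly denser than the original Switch panel, which supports the "people would coo over it" point.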

Beyond that, the only thing they might need for "downwards" compatibility with at least some PS5/XS titles is better CPU and storage speed than the PS4/One. eUFS allows for that, at roughly 2GB/s for UFS 3.0 and above and 1GB/s for 2.0 (both well established and shipping), and flash storage is again cheap enough that even cheap phones have large capacities. That does raise the question of whether Nintendo is finally ready to join the world of, like, a decade ago and properly support downloadable titles, but it is a decade ago. The CPU question is thornier: looking at the upcoming A78, even a quad-core configuration might pull only a third of what the PS5 and Series can do CPU-wise. That would be enough for *some* games, not that Nintendo might even go that high. Beyond that, if they keep their tiny-memory-footprint dashboard (512MB, right?), 8GB might again be enough for a decent number of devs, especially as they might expect this new console to sell at least as well as the PS5/XS.
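For a feel of what those storage numbers mean, here's a back-of-envelope sketch using the ~2GB/s and ~1GB/s sequential figures and the 8GB RAM guess from the post (real loads involve decompression and seeks, so treat these as lower bounds):

```python
def fill_time_s(ram_gb: float, throughput_gb_s: float) -> float:
    """Seconds to stream one full RAM's worth of data, ignoring all overhead."""
    return ram_gb / throughput_gb_s

print(fill_time_s(8, 2.0))  # UFS 3.x-class storage -> 4.0 s
print(fill_time_s(8, 1.0))  # UFS 2.x-class storage -> 8.0 s
```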
 
Isn't AMOLED pretty bad in terms of effective resolution? Like... to get 1080p visual quality, it needs to be 1440p or something? That's one of the advantages PSVR had over competitors, despite PSVR having a lower resolution.
 
Are you talking about RGB vs PenTile RGBG panels?
AFAIK, PenTile nowadays is only used on very high-density screens, like 6" smartphones at 1440p or higher.
A 7-8" OLED panel would probably be available in RGB for cheap.
Even the 9-year-old Vita had an RGB OLED.
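The PenTile point above comes down to subpixel counts: an RGB-stripe panel has 3 subpixels per pixel, while PenTile RGBG has 2. A rough sketch of why a PenTile panel needs a higher pixel count to match RGB sharpness:

```python
def subpixel_count(width: int, height: int, per_pixel: int) -> int:
    """Total subpixels for a panel, given subpixels per logical pixel."""
    return width * height * per_pixel

rgb_1080p = subpixel_count(1920, 1080, 3)      # RGB stripe
pentile_1080p = subpixel_count(1920, 1080, 2)  # PenTile RGBG
pentile_1440p = subpixel_count(2560, 1440, 2)  # PenTile RGBG at 1440p

print(round(pentile_1080p / rgb_1080p, 2))  # -> 0.67: a third fewer subpixels at the same resolution
print(round(pentile_1440p / rgb_1080p, 2))  # -> 1.19: 1440p PenTile finally edges past 1080p RGB
```

That lines up with the "1440p to look like 1080p" intuition a couple of posts up, at least as a first approximation.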
 

Isn't the off-the-shelf Xavier chip very large? If not, then yeah, probably just use an off-the-shelf chip; but if it is too big, I doubt we're talking big money to do little more than strip out the unnecessary hardware. Millions in R&D could easily be recouped in saved die space over time. Time will tell, but I'm still betting on a semi-custom SoC based on an existing Tegra chip, hopefully Xavier and not Tegra X2. The X2 would be a nice upgrade for a Switch Pro next year, but the true successor needs something at least as good as Xavier to get me excited about it.
 
Xavier would be 350mm² on 12nm, but there's no reason Nintendo couldn't target 7nm or 7nm+, which would reduce the chip's size and power consumption.

The main reason for Xavier would be DLSS + ray tracing. Developers could simply take the PC versions of their games and port them to the Switch 2. At 720p you'd have decent ray tracing performance, I'd think on par with the consoles doing 4K or maybe even 1440p, and you could use DLSS to upscale to native screen resolution.

I think Orin is too far out. It won't be available until 2022 at the earliest, and I doubt Nintendo would go with something that cutting edge.


I think Xavier is the sweet spot.

You have the 30W model with an 8-core 2.26GHz CPU and a 512:32:16 Volta at 850-1377MHz, giving you 875-1410 GFLOPS for FP32, or 1748-2820 for FP16. It's power hungry at 30W on 12nm, but I'd be interested to see what that would drop to on more advanced nodes, and obviously the lower end of the spectrum would use closer to 15 watts.

And you have the 15W model with up to 6 Carmel cores running at 1.4GHz and a 384:24:16 GPU configuration at 1100MHz, giving you 845/1690 GFLOPS for FP32/FP16.

I think this would be the sweet spot for Nintendo: a custom Carmel ARMv8.2-A CPU.


When you compare this to the Switch:

A quad-core Cortex-A57 plus a quad-core A53 at 1.02GHz, and a Maxwell GPU at 307MHz in a 256:16:16 configuration with 2 compute units. Apparently, though, the Switch can only use the quad-core A57 while the A53s sit unused.
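The peak-FLOPS figures being thrown around here all come from the same formula: CUDA cores × 2 FLOPs per clock (fused multiply-add) × clock speed, with FP16 doubling that on Volta via packed math. A quick check of the numbers in these posts:

```python
def fp32_gflops(cuda_cores: int, clock_mhz: float) -> float:
    """Peak FP32 GFLOPS: cores x 2 FLOPs/cycle (fused multiply-add) x clock."""
    return cuda_cores * 2 * clock_mhz / 1000

print(round(fp32_gflops(256, 307)))   # Switch handheld Maxwell -> 157
print(round(fp32_gflops(384, 1100)))  # 15W Xavier config -> 845
print(round(fp32_gflops(512, 1377)))  # 30W Xavier config at top clock -> 1410
```

By this count, even the 15W Xavier configuration is over five times the handheld Switch's GPU throughput on paper.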

So you add 2 cores (and they should be more powerful cores) and greatly improve the GPU.


It should blow past the Xbox One and PS4 in terms of what it can do, and like I said, with DLSS and ray tracing it should be able to run the same games the Xbox Series S will run, with maybe some downgrades. But I think it would be a much better showing than the Switch vs. the Xbox One/PS4.

Nintendo has had great success with a lot of indies and older games getting ported to the Switch, so I could see that continuing.


I'm not as into ARM CPUs as others, so I'm not sure how the Carmel cores compare to Jaguar and Ryzen. I think the Carmel cores should perform better than the 8 Jaguar cores, but they would not compare to the Ryzen cores, especially with the clock speed differences and, of course, the two additional cores on the Ryzen side.
 
While I agree with most of the analysis above, I don't really think it's worthwhile to port Carmel any further. Denver missed its second product in favor of off-the-shelf Arm Cortex-A IP (Tegra X1), since Nvidia didn't want to port Denver from 28nm to 20nm. Carmel has already been overtaken by newer IP on newer process nodes.

Nvidia has already committed to using Arm's "Hercules" (Cortex-A78) for whatever process node Orin is going to be on, and that means any possible custom Nvidia-Nintendo IC will probably reuse that work.

That being said, Nvidia never publicly announced an exact process node for Orin (speculation was initially split between TSMC and Samsung 7nm, then Samsung 8nm). I don't recall any shipping A78 chips on 7nm, only 5nm. So someone is going to have to do some work: either Arm and the foundries have already validated the A78 for older nodes, Nvidia has done the work themselves (either porting the A78 to an older node or Ampere to a newer node), or a foundry has done the work itself (w.r.t. the A78). Or another customer has, assuming they share their work with other foundry customers.

So despite all I said about Nvidia not porting their own custom work to newer nodes, they have done it before (Tegra X1, amusingly enough). Maxwell made its way to 20nm via the X1 and if rumors are to be believed, then the whole X1 also made its way to 16nm via a later revision.
 
Or they will just make the X1 go at Shield TV speeds, double the RAM, maybe add a better screen, and call it a day.

I love the speculation, but it's Nintendo we're talking about.
 
3DS scanlines were due to the 3D technology. On the Switch I only noticed it in Animal Crossing and thought it was a game problem.
 
Had a quick skip through the video; I think they say the CPU hits 100 degrees Celsius, and I can imagine the fan sounding like a jet plane.
Why would they want this when they could get better performance from a fanless Apple chip, with much better battery life to boot?
Sure, Apple isn't selling to others, but going Intel is just the wrong path for Nintendo to take.
 
That was not to say Nintendo should use Intel for their hypothetical Switch 2, just to give an idea of the kind of power/games it could produce in a portable form.
 

The parallax barriers are vertical; the scanlines are horizontal. The scanlines also only appear on the IPS 3DS, none on the TN 3DS.
 

I suspect the Switch 2 will sit in the same position as the Switch. It will be a bit more powerful than previous gen consoles, but current gen ports will be limited. Nintendo's proposition to consumers has been a winner and I think it can continue with the next Switch. All of Nintendo's exclusives on one piece of hardware, excellent Indie software support and tons of upgraded ports of previous gen games from the AAA publishers.
 
All this talk of various hardware options and considerations that could go into the Switch 2... makes me remember Iwata Asks on the Nintendo website.

Satoru Iwata, really what a unique person
 
Like I said, that's 2022 at the earliest for Orin, and Nintendo doesn't seem to care about cutting-edge tech. So I think even if the Switch 2 is 2022, Orin is off the table. Heck, in 2023 it may still be off the table for Nintendo.
 
I still agree with your assessment of Nintendo's recent hardware choices.

I suspect Orin silicon is already out there. The 2022 target is likely due to safety certifications for the realtime, deterministic portions of the SoC, along with degradation testing and validation (mostly automotive grade, likely ISO 26262 ASIL-D). In the automotive world, cars usually go through a year of real world mule/prototype testing before actually reaching dealers. I suspect the same for certain components, especially new and novel parts.

Either way, Nvidia's Jetson hardware roadmap has a new product in 2021 - "Nano Next" in the "starts at $99" bracket. Notably, the Xavier NX couldn't be cut down enough for that bracket (it's still $400, and fits in the "starts at $249" bracket), and there isn't any new silicon to really justify a new launch. The roadmap doesn't cover dev kits. All of that being said, it's possible that it's another cut-down X1 or even an X2-based product (since the X2 has a long-term industrial SKU, the X2 is probably accruing some harvestable silicon).

W.r.t. Xavier vs Orin, I'm still a bit more confident in N&N using an Orin offshoot than dragging the expensive AGX Xavier into things. By the time AGX Xavier is cut down enough, it would still have multiple problems: being based on a compute uarch that never saw a gaming release (unless the $3000 Titan V was a gaming card), being designed around 12FFN (when more performant and efficient 7nm and 8N designs are already available, and at least one future design is targeted right at Nintendo's power budget), and Nvidia's historical reticence about porting its Denver CPUs beyond the node they were initially designed on. Of course, they could just cut down Xavier and leave it on 12nm, but that still leaves the first two problems.

The new Orin SKUs at 5-15W and the Nano Next are why I think this is different than the Switch's Tegra X1 vs X2 debate. X2 was a larger chip that drew more power than the X1. X1 already had consumer releases with consumer OS (Android), whereas the X2 never did (afaik). While the X1 wasn't ideal, it was still better suited than an off-the-shelf X2.

All of that being said, I think I just talked myself into considering the X2 as a possible Switch 2 SoC. Not great.
 