Nintendo Switch Tech Speculation discussion

One thing people need to realize is that to upgrade the ENTIRE system, Nintendo simply needs to replace the tablet portion with an upgraded one every few years. So you yank the old tablet out of the dock, plop the new tablet in, put your microSD card into the new system, and you're good to go.

So to upgrade the entire system you have to buy a new console. Got it.
 
Got my Switch; it's tiny and very light. I've only had it an hour, but I'm enjoying Zelda so far. I can't help but think that Sony or MS could make a slightly bigger, slightly thicker version with an 8-inch screen, put an Xbox One or PS4-level SoC inside, and make a lot of money.
 
1 - Why did this console release so late if it's using such an old chip (>2 years in production)?
Nintendo failed the Wii U launch badly because they didn't have a good 1st-party lineup. I'd guess they wanted a good launch-window lineup this time (albeit some games, such as Splatoon, missed the launch by a bit). Zelda is very important for the Switch launch. A console is nothing without games. We also all know that Nintendo likes to use proven hardware to reduce risk and hardware cost.
2 - Why is it so expensive if this old SoC was supposedly sold so cheap (for being so old) and Nvidia offered a "vertical integration" pack?
Nintendo doesn't sell consoles at a loss, as Microsoft and Sony used to do. Also, handheld design and parts (other than the CPU/GPU) are more expensive than those of connected boxes. You could ask the same question: why are the Galaxy S and iPhone selling for $800? $300 isn't that bad. The PS Vita was priced similarly, and it didn't transform into a home console.
4 - Why not even bother to strip out so much useless silicon (the 4*A53 module + its L2 cache, the CCI400 "glue", a 600MPix/s imaging DSP for a console that has no cameras, a 4K HEVC decoder for a console that doesn't support more than 1080p VP9) in a device that will sell at least 5 million units?
New chip revisions are expensive. All of the things you list are tiny. Not much die space wasted. Tegra X1 never got popular in mobile phones or tablets. Maybe Nvidia had excess stock and Nintendo got a good deal.
 
New chip revisions are expensive. All of the things you list are tiny. Not much die space wasted. Tegra X1 never got popular in mobile phones or tablets. Maybe Nvidia had excess stock and Nintendo got a good deal.
In theory, they also might be able to accept chips that have defects in the areas that the Switch can't use.
 
If the 256 core GPU at 768MHz and 4*A57 at 1GHz are confirmed, the docked Switch doesn't even seem to match the current gen consoles in performance-per-watt.
The docked Switch has a 393 GFLOPs GPU. The Xbone S consumes 3x more power but has a 3.5x more powerful GPU (with several times more bandwidth). The PS4 Pro pulls 150W, i.e. 9.4x higher power consumption, for a 4.2 TFLOPs GPU, i.e. a 10.6x faster GPU.

If the Cortex A57 is considered similar to the Jaguar cores clock-for-clock (I see they have very similar Geekbench results), the same can be said about CPU performance.

One can argue the 2*FP16 throughput nullifies the Xbone S comparison, but the same isn't true for the PS4 Pro.
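
For reference, here's that back-of-the-envelope math as a quick Python sketch. The 16W docked wall draw is the measured figure being discussed here; the ~48W Xbone S and ~150W PS4 Pro gaming loads are my assumptions, not official figures:

Code:
# Rough GFLOPs-per-watt from wall power. 16 W is the reported docked draw;
# the Xbone S and PS4 Pro gaming loads are assumed, not official TDPs.
switch_fp32 = 256 * 2 * 0.768   # cores x FLOPs/clock x GHz = ~393 GFLOPs
switch_fp16 = 2 * switch_fp32   # TX1 double-rate FP16: ~786 GFLOPs

systems = {"Switch (docked)": (switch_fp32, 16),
           "Xbone S":         (1400, 48),
           "PS4 Pro":         (4200, 150)}
for name, (gflops, watts) in systems.items():
    print(f"{name}: {gflops / watts:.1f} GFLOPs/W")
# -> Switch ~24.6, Xbone S ~29.2, PS4 Pro ~28.0 GFLOPs/W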

I wouldn't draw such inferences about GPU perf/W from the whole-system wall power consumption numbers we've seen. I mean, was the Switch battery being charged while it was drawing 16W? That's a really huge factor which could account for several extra watts by itself. And just the whole arrangement of the power delivery might not really be optimized for getting peak efficiency out of the wall, or at least not in a way that scales down in watts from XB1 or PS4...
 
New chip revisions are expensive. All of the things you list are tiny. Not much die space wasted. Tegra X1 never got popular in mobile phones or tablets. Maybe Nvidia had excess stock and Nintendo got a good deal.

Yeah, all signs point to this. Pascal X1 derivatives are late and hot, so those were out (I'm inferring this from the fact that they're only being offered in embedded markets), and Nvidia had a whole lot of nothing otherwise for the TX1. Sure, there's a lot of essentially 'dead' silicon in the TX1's Switch implementation, but as others have noted it grants them greater manufacturing yields (although at this point the defect rate should be nailed down to a very low %) with minimal R&D cost and time-to-market for the part. The rest of the costs are for all of those radios in the controllers, the actually good IPS panel (about time, Nintendo), 'HD' rumble and the mandatory accessories (dock, Joy-Cons, etc.).
 
I'm wondering if Nvidia incorporated all the changes for the Switch into the A2 revision of the TX1, so that TX1 rev A2 and the Switch SoC are actually the same chip. That would benefit both Nvidia and Nintendo. Maybe they even removed the A53 cores in this revision and used the space for something else? I'm really curious. I'm hoping we'll get die shots soon.
 
Teardown on a production unit with high quality pictures:
https://www.sosav.fr/guides/consoles/nintendo/nintendo-salon/nintendo-switch/demontage-complet/



Nintendo failed the Wii U launch badly because they didn't have a good 1st-party lineup. I'd guess they wanted a good launch-window lineup this time (albeit some games, such as Splatoon, missed the launch by a bit). Zelda is very important for the Switch launch. A console is nothing without games. We also all know that Nintendo likes to use proven hardware to reduce risk and hardware cost.

That's the thing, though: they don't have a good launch lineup, and that has been the main criticism of the console in almost all reviews out there.
Zelda is excellent. It's a top-notch game, a system seller. No question there.
But then, apart from some overly expensive gameplay experiments (like Snipperclips and 1-2-Switch), almost all you get is ports of old indie games (that you could buy in $1 Humble Bundle packs for a Windows handheld), ports of old Wii U games and... erm... Bomberman.
Nintendo's own lineup of "strong games" is Zelda at launch, the Mario Kart 8 port in April, Fire Emblem somewhere in the fall, and Xenoblade Chronicles 2 and Mario Odyssey in the holiday season. That's 5 games.

You say Nintendo failed Wii U's launch badly because they didn't have a good 1st-party lineup and you're right. But the Wii U's general launch lineup was actually stronger than the Switch's:

- Nintendo Land (gameplay experiments similar to 1-2-Switch or Wii Sports, but it came bundled with the console)
- New Super Mario Bros. U
- ZombiU
- Wipeout 3
- Batman Arkham City
- Assassin's Creed III
- CoD Black Ops 2



What I think actually failed the Wii U was terrible marketing (people thought it was a peripheral for the Wii) and offering 7th-gen hardware while the world was already aching for the 8th-gen consoles. All things considered, the initial lineup for the Wii U was actually quite a bit stronger than the Switch's.
For each Call of Duty or Assassin's Creed the Wii U had (already available on the PS360, but still...), the Switch has some indie port that has been available for ages on PC, Android, iOS and the PS4/Bone digital stores.




Nintendo doesn't sell consoles at a loss, as Microsoft and Sony used to do. Also, handheld design and parts (other than the CPU/GPU) are more expensive than those of connected boxes. You could ask the same question: why are the Galaxy S and iPhone selling for $800? $300 isn't that bad. The PS Vita was priced similarly, and it didn't transform into a home console.
The Vita had state-of-the-art hardware for its time (unlike the Switch) and its initial price was $250. In practice, the only thing missing is a video output connector, because as the teardowns show, the EXT port is connected to the display controller.
The iPhones and Android flagships have huge margins over their BoM (depending on the storage amount, the iPhone's margins can reach 300% IIRC), but that's also because they need to compensate for billions invested in huge marketing budgets, state-of-the-art SoCs that carry baseband processors, premium materials like metal alloys that can only be CNC-machined and not molded, displays with unprecedented densities and brightness levels with factory calibration, expensive super-fast camera modules with OIS and phase-detection autofocus, etc.

Though I guess it's possible Nintendo is simply selling the console for as much as they can. Since they have practically no competition as a handheld console, the margins on the Switch could actually be very large.
They'd better be, for Nintendo's sake.


The problem here is that there's probably a Shield console coming out with the exact same specs and a (subjectively) stronger library with much cheaper games on day one. And it has an HDMI 2.0 connector, it can play 4K HDR Netflix content, and you can install Android apps.
Nvidia is putting up quite the competition against the Switch, which is odd as hell.



New chip revisions are expensive. All of the things you list are tiny. Not much die space wasted.

I have to strongly disagree with you here. The ISP, video encoders/decoders, LITTLE module(s) and glue logic take up a substantial chunk of the die area.
There aren't a lot of x-ray die shots with proper descriptions out there, but here is a good one from the Exynos 5410:

[annotated die shot of the Exynos 5410]


This is a 122mm^2 SoC at 28nm using Cortex A15 + A7 modules. I know the TX1 uses more complex ARMv8 CPU modules, but it's also on a denser 20nm process. The TX1's ISP is also far more powerful than the Exynos 5410's due to its automotive aspirations, and so is e.g. the video decoder (max 1080p 60FPS H264 in the 5410 vs. 4K 60FPS HDR HEVC in the TX1).
In the end, the proportion of die area that fixed-function hardware takes up in the various SoCs (without baseband processors) doesn't seem to differ all that much between models. Here's a shot of the old Tegra 2:

[annotated die shot of the Tegra 2]


In the Tegra 2, the ISP alone occupied about 10% of the whole chip. In the 5410, the LITTLE module + video encoder/decoder + ISP amount to 25% of the chip, and that's without even taking the CCI glue into account. The Switch has no cameras, so it has no use for an ISP at all. Even the audio codec module could be greatly simplified, since the TX1 supports 8-channel 24-bit 192kHz but the Switch can only output 6-channel 16-bit 48kHz. Plus, the Cortex A57 module could be optimized for density instead of performance, given its very low 1GHz clock.


In the end, I do think that by taking away all the unnecessary stuff from the TX1, plus simplifying the video/audio codec blocks, display output, etc., Nvidia could have produced a 20-30% smaller chip.
And for a chip that is definitely going into at least ~8 million devices, you can bet that makes a huge difference. Being able to produce 20% more chips per wafer would result in savings on the order of several tens of millions.
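
As a rough sanity check on the wafer math, here's a minimal sketch using the standard dies-per-wafer approximation. The ~120mm^2 die size and 300mm wafer are my assumptions (the real TX1 die size and Nintendo's wafer pricing aren't public), and yield is ignored:

Code:
import math

# Standard dies-per-wafer approximation: gross dies minus an edge-loss term.
def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    r = wafer_diameter_mm / 2
    return (math.pi * r**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

full    = dies_per_wafer(120.0)        # assumed TX1-sized die
trimmed = dies_per_wafer(120.0 * 0.8)  # hypothetical 20% smaller die
print(full, trimmed)                   # ~528 vs ~668 candidate dies
print(trimmed / full - 1)              # ~27% more chips per wafer

If anything, a 20% area cut buys slightly more than 20% extra chips per wafer, since smaller dies also waste less of the wafer edge.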



Tegra X1 never got popular in mobile phones or tablets. Maybe Nvidia had excess stock and Nintendo got a good deal.
Were Nvidia's sales predictions for the TX1 so bad for so long that they had millions of TX1 chips sitting in a warehouse?



I wouldn't draw such inferences about GPU perf/W from the whole-system wall power consumption numbers we've seen. I mean, was the Switch battery being charged while it was drawing 16W? That's a really huge factor which could account for several extra watts by itself. And just the whole arrangement of the power delivery might not really be optimized for getting peak efficiency out of the wall, or at least not in a way that scales down in watts from XB1 or PS4...
OTOH, lower-wattage PSUs tend to have better efficiency, since less power = less heat = less loss.
But you're right, exact inferences can't be drawn from wall measurements and theoretical specs alone. That was just an exercise.
My point was that the TX1 would have easily beaten the 2013 consoles in performance-per-watt. But against the 2016/2017 models, that 20nm process is inevitably showing its age.
 
Were Nvidia's sales predictions for the TX1 so bad for so long that they had millions of TX1 chips sitting in a warehouse?
Apparently not, since the 2017 Shield TV is using TX1 revision A2 chips and the 2015 version was using revision A1 chips. BTW, thanks for the link to the teardown!
 
A fully customised SoC could have delivered a smaller, possibly more efficient design, but then Nintendo would have had to pay for a fully custom design. On top of that, with the Wii U failing to gain any traction and the 3DS growing old, Nintendo needed a solution with a much faster time to market. So you can either ask for a fully customised SoC, with all of the expense and time that would cost, or take an off-the-shelf part that is good enough from a partner who can also bring real-time s/w expertise to the table.

In my fantasy universe Nintendo goes down the fully custom route, but Nintendo has shown time and time again that they are a fundamentally conservative company. When we combine that with Nvidia having a great deal of incentive to do a deal on the existing design, I can see why we got to where we are. Nvidia didn't have thousands of TX1s burning a hole in their warehouse, but the costs for the TX1 most certainly were burning a hole in the balance sheet and raising uncomfortable questions about why Nvidia kept pouring cash into a market where they were meeting minimal success.

Let's not pretend the Shield is a real console; an Android device with 6-year-old games that run kinda-sorta as well as they do on the X360/PS3 is competition for nothing. It has no exclusives of note, and Nvidia hasn't been running around with a chequebook to make that happen either.

Edit: first BotW test runs:
Smooth in handheld mode at 720p.
Occasional stutters when docked, with better AF filtering at 900p; they speculate the stutter may be bandwidth-related.
http://www.eurogamer.net/articles/d...e-legend-of-zelda-breath-of-the-wild-face-off
 
The little cores may be used when the console is asleep, and video encode and decode may have uses too (e.g. sharing, streaming, in-game FMV).

Nintendo have bought an off-the-shelf chip, outsourced the API, and I suspect have even had Nvidia handle the board design (Nintendo couldn't possibly do a better job). On some level, neither Nintendo nor Nvidia will care if there's an unused module on the chip if the economics work out. Which they apparently have ...
 
The little cores may be used when the console is asleep
I hope the console isn't doing absolutely anything other than keeping the volatile RAM alive while asleep. If it's turning on the WiFi radio and doing silent updates, then its standby battery life will hurt a lot.
If it does stuff while docked then it's okay, but in that case they wouldn't really need to use the A53 cores.

and video encode and decode may have uses too (e.g. sharing, streaming, in-game FMV).
According to the specs in the leaked documents, developers can only use H264 or VP9 videos up to 1080p (which is really stupid for a console using cartridges and small eMMC storage, BTW), so the video decoder block could be much simpler.
As for video encoding, I seriously doubt Nintendo will ever enable gameplay video recording on the Switch, given the small RAM pool. The share button takes a screenshot, and that's simply copying the framebuffer content at that instant to storage. There's no need for an ISP or video codec for that.



Nintendo have bought an off-the-shelf chip, outsourced the API, and I suspect have even had Nvidia handle the board design (Nintendo couldn't possibly do a better job). On some level, neither Nintendo nor Nvidia will care if there's an unused module on the chip if the economics work out. Which they apparently have ...
It certainly is an industry first...
 
The little cores may be used when the console is asleep, and video encode and decode may have uses too (e.g. sharing, streaming, in-game FMV).
The A53s? AFAIK they simply can't be used if the A57s are active. The X1 doesn't have big.LITTLE or companion-core features. It's a strange aspect of the chip. I've wondered if the functionality is there but unfinished or buggy.
 
[annotated die shot of the Tegra 2]


In the Tegra 2, the ISP alone occupied about 10% of the whole chip. In the 5410, the LITTLE module + video encoder/decoder + ISP amount to 25% of the chip, and that's without even taking the CCI glue into account. The Switch has no cameras, so it has no use for an ISP at all. Even the audio codec module could be greatly simplified, since the TX1 supports 8-channel 24-bit 192kHz but the Switch can only output 6-channel 16-bit 48kHz. Plus, the Cortex A57 module could be optimized for density instead of performance, given its very low 1GHz clock.
Nintendo said there will be gameplay streaming in a future update (similar to the PS4 Share button). They still need video encode and decode. Can't just remove them. Customized versions cost extra money and validation. Not worth it. The only useless pieces are the ISP and the little CPUs, and those take roughly 10% of die space in this diagram. Tegra 2 was on a huge 40 nm process, meaning that the encoder, decoder and ISP now take significantly less die space at 20 nm (assuming similar functionality). If they wanted a fully custom chip, they could have obviously used a simpler custom video encoder and a simpler custom video decoder, got rid of the little CPUs and done lots of other tweaks as well. But Nintendo felt that the current way was best for their business. We must assume that they had very good reasons for this.

Maybe they have already planned a die shrink for the future, and are going to remove the extra hardware (ISP + little cluster) at the same time. Parker is already at 16 nm FinFET.
 
According to the specs in the leaked documents, developers can only use H264 or VP9 videos up to 1080p (which is really stupid for a console using cartridges and small eMMC storage, BTW), so the video decoder block could be much simpler.
As for video encoding, I seriously doubt Nintendo will ever enable gameplay video recording on the Switch, given the small RAM pool. The share button takes a screenshot, and that's simply copying the framebuffer content at that instant to storage. There's no need for an ISP or video codec for that.
I STRONGLY doubt that you can just remove a certain codec from the encoder/decoder blocks without a MAJOR redesign of that block. Since HEVC is basically H264++ (which itself is MPEG-4++), they surely share a lot of hardware. Dropping support for 4K only means you can make some buffers smaller. Nothing much is gained here. So why do a redesign? That would be a waste of money and resources. If it ain't broken, don't fix it! I assume they dropped support for HEVC only to avoid paying licensing fees for each console. If I'm not mistaken, VP9 can achieve compression rates similar to HEVC's and is royalty-free.
 
I hope the console isn't doing absolutely anything other than keeping the volatile RAM alive while asleep. If it's turning on the WiFi radio and doing silent updates, then its standby battery life will hurt a lot.
If it does stuff while docked then it's okay, but in that case they wouldn't really need to use the A53 cores.

The A53s? AFAIK they simply can't be used if the A57s are active. The X1 doesn't have big.LITTLE or companion-core features. It's a strange aspect of the chip. I've wondered if the functionality is there but unfinished or buggy.

My thought is that you might want to download patches/updates on the go, and do so whenever you're near a friendly hotspot (assuming the user wants to). It would also be good for something like StreetPass, which would absolutely benefit from low-power cores.

I don't see the A53s as useless.

According to the specs in the leaked documents, developers can only use H264 or VP9 videos up to 1080p (which is really stupid for a console using cartridges and small eMMC storage, BTW), so the video decoder block could be much simpler.
As for video encoding, I seriously doubt Nintendo will ever enable gameplay video recording on the Switch, given the small RAM pool. The share button takes a screenshot, and that's simply copying the framebuffer content at that instant to storage. There's no need for an ISP or video codec for that.

768 MB is enough for a good chunk of encoded video, and if Nintendo have any sense they'll want streaming and/or uploading of clips to YouTube. Some customers will want this, and it's a great promotional tool. So, like the A53s, I don't see video encode as useless either.
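
To put a number on "a good chunk", a one-liner, assuming a typical 720p30 hardware-encode bitrate (Nintendo hasn't published any encoder settings):

Code:
buffer_mb    = 768                       # buffer size from the post above
bitrate_mbps = 6.0                       # assumed 720p30 H264 bitrate
seconds = buffer_mb * 8 / bitrate_mbps   # MB -> megabits, then / (Mbit/s)
print(f"~{seconds / 60:.0f} minutes")    # ~17 minutes of footage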
 
Maybe they have already planned a die shrink for the future, and are going to remove the extra hardware (ISP + little cluster) at the same time. Parker is already at 16 nm FinFET.

The little cluster would be ideal for StreetPass:

https://www.nintendo.co.uk/Nintendo...s-StreetPass-/What-is-StreetPass--827701.html

... and other background activities.

I wouldn't want my Switch to only silently update while docked, and I wouldn't want to have to power it on either - if I've switched it on it's because I want to game, not watch a patch download.

My phone gets on with all that stuff on its own, when I'm not using it. Even my tablet downloads large system updates without me having to manually switch it on or have it plugged in.
 
My thought is that you might want to download patches/updates on the go, and do so whenever you're near a friendly hotspot (assuming the user wants to). It would also be good for something like StreetPass, which would absolutely benefit from low-power cores.

I don't see the A53s as useless.
No, I mean to say that the A53s are just not available. In other X1 devices they are disabled at boot.
 