Nintendo Switch Tech Speculation discussion

Status
Not open for further replies.
Excuse me if I start to sound like a broken record, but some of you are completely ignoring the fact that times have changed dramatically since the Wii U was created, performance- and market-wise... Does anyone spot the difference here? Anyone? Do you guys really think that Nintendo was completely ignoring this?
What does the state of the mobile market have to do with Nintendo's choices for Wii U? Looking at Wii U as a console, it was facing a similar technological challenge to its contemporaries as NS will face from mobile, and Nintendo chose to go weak on power instead of bleeding edge. What was the thinking behind Wii U that said, "1 TF and 4 GBs RAM at 100 GB/s is too much; let's go with 16 GB/s DDR and <200 GF GPU," and why doesn't that same thinking apply to Switch?
 
Is it possible that Nintendo was able to score a really good deal on 20nm because it's not in demand? I remember someone speculating that perhaps there was a large volume of unused 20nm wafers, and Nintendo scored a fire sale on these? Perhaps a scenario where Nvidia expected to sell a lot of Tegra X1 chips, and the demand never came close to their estimates?

The standard X1 has four A57 and four A53 ARM cores, with the four A53 cores going mostly unused. I'm not sure how much die space these take up, but they are most certainly absent in the custom Tegra for Switch. Would this allow them to configure the processor so that it is smaller, resulting in more chips per wafer?
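The chips-per-wafer question can be sketched with the classic dies-per-wafer approximation. All die sizes below are rough ballpark guesses (the stock 20nm X1 is often cited at around 120 mm²); the area saved by dropping the A53 cluster is purely illustrative, not a confirmed figure for the custom chip:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic dies-per-wafer estimate: usable wafer area divided by die area,
    minus a correction for partial dies lost along the wafer edge."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

# Hypothetical numbers: ~120 mm^2 for the stock X1, ~115 mm^2 if the A53
# cluster were dropped (the savings figure is a guess for illustration).
print(dies_per_wafer(120))  # 528 candidate dies per 300mm wafer
print(dies_per_wafer(115))  # 552 -- a few percent more per wafer
```

Even a small area saving compounds across millions of units, though in practice yield, scribe lines and binning would dominate the real economics.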
 
What was the thinking behind Wii U that said...
The same thinking that said, "we don't need Nvidia."

You don't hire Nvidia just to keep doing more of what you have been doing.

And people who think Nvidia gave Nintendo some great deal should probably think again...
 
Is it possible that Nintendo was able to score a really good deal on 20nm because it's not in demand? I remember someone speculating that perhaps there was a large volume of unused 20nm wafers, and Nintendo scored a fire sale on these? Perhaps a scenario where Nvidia expected to sell a lot of Tegra X1 chips, and the demand never came close to their estimates?

I was curious if nV had some sort of wafer supply agreement at the time, and perhaps they're giving Nintendo a really good deal as a result, but I wouldn't know much about that (pure speculation, megatons of salt etc etc).
 
I was curious if nV had some sort of wafer supply agreement at the time, and perhaps they're giving Nintendo a really good deal as a result
I doubt it. I imagine the "good deal" part comes from inventory of X1 (for use in early dev kits) that NV was glad to be rid of and tech support / dev rel. Monetarily, I expect Nvidia to be getting paid.
 
What does the state of the mobile market have to do with Nintendo's choices for Wii U? Looking at Wii U as a console, it was facing a similar technological challenge to its contemporaries as NS will face from mobile, and Nintendo chose to go weak on power instead of bleeding edge. What was the thinking behind Wii U that said, "1 TF and 4 GBs RAM at 100 GB/s is too much; let's go with 16 GB/s DDR and <200 GF GPU," and why doesn't that same thinking apply to Switch?

*takes HUGE breath*

Maybe hubris from just coming off their most successful generation ever (both DS and Wii), along with the associated overconfidence behind the bizarre thinking that it needed to be super electricity-efficient and have almost as small a form factor as possible, while also having its hands tied stupidly and fairly severely by Wii BC with a pricey custom exotic MCM, not unlike the unnecessarily exotic and expensive Cell in the PS3, which also upped the price due to arrogance and overconfidence following a landmark knockout win the previous generation?

*EXHALES*
 
Eurogamer didn't confirm any specs for NS, just that the dev kits had X1 in them. "Nintendo Switch is being powered by a custom Nvidia mobile Tegra processor, with development kits using the X1 chip that's already in use for the Shield Android TV console and the Google Pixel C tablet." -Eurogamer 2 days after these rumored specs were put up, and Nintendo Switch was revealed. The specs are clearly just X1, which is what the dev kits had in them. Doesn't seem like Eurogamer is throwing any support behind an anon pastebin dump.

Really? This is from the Eurogamer article. They're pretty clear on their source saying final hardware will be X1.

"It is worth stressing with a firm emphasis that everything we have heard so far points to Tegra X1 as the SoC of choice for Nintendo NX, and Tegra X2 may simply be a derivative version of X1 with Denver CPU cores, designed for Nvidia's burgeoning automotive line - we literally know very little about it. However, perhaps another factor to consider is launch timing. NX launches in March 2017, almost two years after Shield Android TV with Tegra X1 launched in May 2015. The timing may suggest that Nintendo is waiting for mass production to become available on a more cutting edge part. If the older Tegra X1 is indeed the core component, availability there would not be a problem, suggesting a delay elsewhere in the pipeline. Alternatively, it may simply be the case that Nintendo is holding fire until a compelling array of launch software is ready."

They're pretty clear on their source saying final hardware will be X1.
 
What does the state of the mobile market have to do with Nintendo's choices for Wii U? Looking at Wii U as a console, it was facing a similar technological challenge to its contemporaries as NS will face from mobile, and Nintendo chose to go weak on power instead of bleeding edge. What was the thinking behind Wii U that said, "1 TF and 4 GBs RAM at 100 GB/s is too much; let's go with 16 GB/s DDR and <200 GF GPU," and why doesn't that same thinking apply to Switch?
Before Wii U the gaming hardware market was just Sony, Microsoft and Nintendo. There was not much other hardware out there that was powerful enough to allow gaming (aside from desktop computers). Now we have smart phones and tablets, some of them very powerful. And they get better every year. It's just a matter of time until they start to replace dedicated gaming hardware. But that's just the background.

With the Wii, Nintendo got lucky. The Wii had a unique selling point (USP) and that was motion control. Because of the novelty people did not care that it was less powerful than XB360 or PS3.

I think Nintendo thought it could repeat this with the Wii U. Their USP was the Wii U GamePad with its LCD touch screen. I think they believed they could ride the wave of success a bit longer and get away with a very modest increase in power, because it still had motion control plus the GamePad as a "killer feature". They were wrong. The novelty of motion control had worn off, and the GamePad was no replacement for it.

Additionally, PS4 and XB1 promised to be much more powerful and were not far off. They were much better balanced and easier to develop for, while the Wii U required special treatment. Publishers shifted their resources away from the Wii U towards the new consoles. With dated hardware and without publisher support, the Wii U was dead in the water. Nintendo had gambled and lost.
 
Is it possible that Nintendo was able to score a really good deal on 20nm because it's not in demand? I remember someone speculating that perhaps there was a large volume of unused 20nm wafers, and Nintendo scored a fire sale on these? Perhaps a scenario where Nvidia expected to sell a lot of Tegra X1 chips, and the demand never came close to their estimates?

The standard X1 has four A57 and four A53 ARM cores, with the four A53 cores going mostly unused. I'm not sure how much die space these take up, but they are most certainly absent in the custom Tegra for Switch. Would this allow them to configure the processor so that it is smaller, resulting in more chips per wafer?
Everyone is thinking about the chip price and not about what 20nm's power and thermal characteristics add to the cost of the whole product.

Think about this: we know that Nintendo Switch has a fan and is actively cooled. Part of the 20nm chip price would then include the active cooling solution: millions of fans, more expensive case designs to cool the product, and a thicker device, all adding to the cost.

This thinking only applies to 20nm, because the alternative to active cooling is a bigger device and reduced performance on the go vs docked. They could have opted for the 16nm process node, had passive cooling, and (possibly) docked performance on the go. Shrinking Maxwell? Nvidia did it for a handful of Shield TVs, so why wouldn't they opt for it again for millions of devices, or simply use Pascal? There is no reasoning that can make sense of that.

The fact that it has active cooling means that performance demanded it, and that a smaller process node was either unavailable or not competitively priced against the dollars active cooling adds to each unit.

With 16nm in clear abundance and priced nicely enough for even Chinese manufacturers to use it (or 14nm), 20nm becomes less than likely, to say the least.
 
"...and why doesn't that same thinking apply to Switch?"

It's not so much that it doesn't (believe me, I'd like to think I learned my lesson after being a longtime GAF lurker in the Wii U speculation threads); it's that people need to be careful not to swing so far the other way that it becomes a sort of trump-all point, especially when there is, at bare minimum, some fairly trustworthy evidence to the contrary.

Let's be clear here, though. I'm not saying we know for a fact it's going to be even competently powerful. It's more a matter of disagreeing with the rather intransigent notion that it almost CAN'T be powerful "cuz Nintendo". I'm not saying you're necessarily doing this (I don't really think you are), but still, it's definitely a point worth bringing up.

Does the question in all this then become: how on earth could a home console that plugs into your TV rightfully be considered a success if it can't even match the One on paper in a best-case scenario? Because of its nature as a hybrid! If NINTENDO (!) can successfully manage to do both in a grossly fattened tablet while still effectively allowing for fairly doable home console ports then, to me at least, that's a technical accomplishment worthy of praise and recognition. FWIW, I would have MUCH rather they just made a home console that shared the same architecture with their new handheld while still leveraging the advantages a unified library brings, but it is what it is.
 
A 6.2" tablet is not good for battery capacity, or for cooling capacity. I suspect the SoC will need to be <= 5W. So I'm thinking a 128 ALU device. I wonder if they might use a quad A53 CPU configuration.
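For reference, the peak-FLOPS arithmetic behind a "128 ALU device" guess is straightforward; the clock speeds below are hypothetical examples, not leaked figures:

```python
def peak_gflops(alu_count, clock_mhz):
    """Peak FP32 throughput, counting one fused multiply-add (2 ops) per ALU per cycle."""
    return alu_count * 2 * clock_mhz / 1000.0

# A 128-ALU part at a hypothetical 768MHz, vs a stock-X1-like 256 ALUs at 1000MHz:
print(peak_gflops(128, 768))   # 196.608 GFLOPS
print(peak_gflops(256, 1000))  # 512.0 GFLOPS
```

So even halving the ALU count while staying under a ~5W budget could still land close to the ~200 GFLOPS ballpark mentioned elsewhere in the thread.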
 
Everyone is thinking about the chip price and not about what 20nm's power and thermal characteristics add to the cost of the whole product.

Think about this: we know that Nintendo Switch has a fan and is actively cooled. Part of the 20nm chip price would then include the active cooling solution: millions of fans, more expensive case designs to cool the product, and a thicker device, all adding to the cost.

This thinking only applies to 20nm, because the alternative to active cooling is a bigger device and reduced performance on the go vs docked. They could have opted for the 16nm process node, had passive cooling, and (possibly) docked performance on the go. Shrinking Maxwell? Nvidia did it for a handful of Shield TVs, so why wouldn't they opt for it again for millions of devices, or simply use Pascal? There is no reasoning that can make sense of that.

The fact that it has active cooling means that performance demanded it, and that a smaller process node was either unavailable or not competitively priced against the dollars active cooling adds to each unit.

With 16nm in clear abundance and priced nicely enough for even Chinese manufacturers to use it (or 14nm), 20nm becomes less than likely, to say the least.
20nm downclocked is the cheapest and most likely solution.
 
If Nintendo pays for a shrink, are they trying to be sane?

Well ... it's a bit of a tenuous Link, but Nvidia claims NX is using the same architecture as the world's top-performing GeForce graphics cards. So ... Pascal.

Which is a 16 nm architecture. And they're going to be making that 16 nm LPDDR4 controller anyway ... and at least some of the X1's ARM cores are being shrunk to 16 nm regardless ... so something very much like an X1, but with Pascal.

But if Nintendo got a good deal on the X1 license and a good deal on a long term agreement for 20 nm then, yeah, that's what it'll be.
 
20nm downclocked is the cheapest and most likely solution.
You're completely ignoring the reality that the NS is actively cooled, meaning it has a fan, while we have seen the X1 passively cooled at 850MHz. Your solution of downclocking to avoid paying for active cooling doesn't apply, because Switch has active cooling. Like I said, think.
 
You're completely ignoring the reality that the NS is actively cooled, meaning it has a fan, while we have seen the X1 passively cooled at 850MHz. Your solution of downclocking to avoid paying for active cooling doesn't apply, because Switch has active cooling. Like I said, think.
What are you even on about? You argue for 16/14nm, saying it would remove the need for a fan, and then you argue that since there is a fan, it wouldn't be downclocked if it is 20nm. Can you not see that the two lines of thinking are contradictory?

No, it isn't.
How so? 20nm is cheaper and will save Nintendo money. It is the most likely solution.
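For what it's worth, the appeal of downclocking comes from dynamic CMOS power scaling roughly as C·V²·f: lowering the clock also lets you lower the voltage, so power falls faster than frequency does. A toy sketch with purely illustrative numbers (no real Switch figures are implied):

```python
def scaled_power(base_power_w, freq_ratio, voltage_ratio):
    """Dynamic power model: P ~ C * V^2 * f (leakage ignored for simplicity)."""
    return base_power_w * freq_ratio * voltage_ratio ** 2

# Assume a hypothetical 10W part at stock clocks; run it at 60% clock
# with a 15% voltage reduction:
print(round(scaled_power(10.0, 0.6, 0.85), 3))  # 4.335 W -- well under half
```

This is why "20nm downclocked" and "needs a fan anyway" are both plausible positions: the downclock shrinks the thermal problem but doesn't necessarily eliminate it at docked performance levels.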
 