Nintendo Switch Tech Speculation discussion

Status
Not open for further replies.
The Nintendo Switch's CPU is 10% to 30% faster than the Xbox One's 1.75GHz 8-core Jaguar (in line with the PS4 Pro's CPU). LCGeek gave us this info months ago, and correctly leaked both the Wii's and Wii U's CPUs. Zero reason to think they're wrong.

The GPU has been confirmed by Nate at Direct Feed Gaming as Pascal, and he, along with Emily Rogers and Laura Dale, has been 100% accurate on Switch so far. Besides, 20nm is dead; the chip would naturally move to 16nm FinFET anyway.

We also know that it is actively cooled with a fan. The X1 at 850MHz in the Pixel C is passively cooled; if this chip has moved to 16nm or smaller, there's no reason it would need a fan at just 1GHz.

Laura Dale also gave us the info that the device increases clocks when docked. My guess is that it runs passively cooled on the go and actively cooled when docked, probably shifting from 1GHz to 1.4GHz or 1.5GHz.

I'll further elaborate on minimum specs of around 500-600 GFLOPS on the go. Laura also leaked the maximum battery life as 3 hours. The Pixel C draws 8 watts during the Manhattan benchmark but has a larger and brighter screen; that device has the X1 GPU clocked at 850MHz in a 7mm-thick body, while the NS is 15mm thick with a vent.

If the NS draws ~7 watts and the battery is 1750mAh (the same capacity as the New 3DS XL's), then you'd end up with roughly 3 hours of battery life. The Vita draws ~7 watts with a 2150mAh battery and lasts around 4 hours. Considering it almost certainly moved to 16nm, even Maxwell would see a big performance bump.

Lastly, check out @Syferz's tweet: https://twitter.com/Syferz/status/790741152752250880?s=09
This is my tweet of a new Tegra being benchmarked; it looks like a 256-CUDA-core device at a ~1.5GHz clock. The NS might be 750 GFLOPS when docked, which should handle current-gen games at 720p with minimal loss of quality, especially when the game's engine can use FP16, which I believe can handle all the post-processing effects, depth of field and such.
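A quick sanity check on that figure: peak FLOPS for a GPU like this is usually estimated as cores x clock x 2 (one fused multiply-add per core per cycle). A minimal sketch, assuming the rumored 256 cores at ~1.5GHz (these are rumored numbers, not confirmed specs):

```python
# Back-of-the-envelope peak-throughput math for the rumored chip.
cuda_cores = 256
clock_ghz = 1.5
flops_per_core_per_cycle = 2  # an FMA counts as two floating-point ops

fp32_gflops = cuda_cores * clock_ghz * flops_per_core_per_cycle
print(f"FP32 peak: {fp32_gflops:.0f} GFLOPS")      # 768 GFLOPS
print(f"FP16 peak: {fp32_gflops * 2:.0f} GFLOPS")  # doubled if the GPU packs 2x FP16
```

Which lands right around the ~750 GFLOPS docked figure above, before accounting for any FP16 doubling.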
 

3 hours drawing 7W is 21Wh, which holds for a 1.75Ah battery only if it's ~12V. The New 3DS XL uses a standard ~3.7V single-cell battery for a total of ~6.5Wh, less than a third the capacity of the hypothetical Switch battery you're describing. On similar grounds there's no way the Vita draws 7W.
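The arithmetic here is just energy = voltage x charge and runtime = energy / draw. A short sketch using the thread's numbers, with the usual 3.7V single-cell lithium voltage as an assumption:

```python
# Sanity-checking the battery figures being debated:
#   energy (Wh)  = voltage (V) x charge (Ah)
#   runtime (h)  = energy (Wh) / draw (W)
def energy_wh(voltage_v, capacity_mah):
    return voltage_v * capacity_mah / 1000.0

def runtime_h(energy, draw_w):
    return energy / draw_w

cell = energy_wh(3.7, 1750)                  # the New 3DS XL-class battery
print(f"{cell:.1f} Wh")                      # ~6.5 Wh, far short of 21 Wh
print(f"{runtime_h(cell, 7):.2f} h at 7 W")  # under an hour, not 3 hours
print(f"{21 / 1.75:.1f} V needed for 21 Wh from 1.75 Ah")  # ~12 V
```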
 

The New 3DS XL draws almost 4 watts with a maximum battery life of 5 hours. I'm not in marketing, so I don't know where the numbers are coming from, but there you go.

Also, I do expect a larger battery and a higher voltage; this was the minimum battery size that should be expected. There are also two small batteries in the controllers, which I assume connect to the tablet to at least charge, and which could be used to power the device.
 

If the New 3DS XL drew 4W, it'd only last 1.6 hours on a 6.5Wh battery; that's simple arithmetic. Regardless of Nintendo's marketing, I think it pretty clearly does better than that under realistic usage scenarios.

You (or whoever you heard it from) may be confusing the rating on the back of the unit with actual typical power draw. That rating specifies the maximum the adapter has to supply; it's much higher than typical consumption because it's an absolute worst case with margins for short-term spikes. I'm pretty confident that if anyone measured the power consumption at the battery, it wouldn't average anything like 4W.
 

I wouldn't call Nate's report a confirmation of Pascal; I'd call it more speculation, especially when no other reliable source has backed him yet. I find it very strange that Nvidia wouldn't confirm Pascal in their blog post about the GPU. Why would Nintendo want to hide that? Pascal is the best you can get; why wouldn't you want to share that?
 

They gave an open answer. The problem with saying Pascal is that Volta is next year. As you said, why wouldn't you want to market the best graphics architecture in the industry? Because around the time of release, we'll be hearing about Volta. So it isn't the best you can get forever, but "world's top GeForce architecture" can be marketed for the foreseeable future. Also, you mean rumor, not speculation: he didn't say he thinks it is, he said he knows it is. In the end, the Maxwell X1 is a hybrid, and moving to 16nm would offer performance similar to what I posted.
 

I read his Twitter just now; he posted that he asked Nvidia about the architecture and got no comment from them, so he doesn't seem sure himself. His source is one contact he trusts. Why would his contact know this when nobody else has said anything similar? Rule of thumb is that there have to be multiple sources saying it's Pascal before we can call it confirmed, and Eurogamer says their sources point to it being Maxwell.
 
Like I said, even if it is Maxwell, it almost certainly moved to 16nm, as 20nm is actually more expensive now that it's a dead fab node.

Maxwell at 16nm shouldn't need active cooling at 1GHz when the Pixel C was passive at 850MHz on 20nm. Nate having a source saying Pascal makes it a rumor; if he's no longer confident in the source, then maybe it's wrong, but that doesn't change it into speculation.

Bethesda won't confirm that Skyrim is coming to the NS, yet they're running Google surveys asking what platform people are buying the Skyrim remaster on, with the NS as an option. It's PR.
 
I remember the days when an ARM processor was run off the heat produced by an x86 processor executing the same program, and ran faster than the x86 at that...
Just saying that x86 chips are still nowhere near ARM chips in power efficiency, though I don't know how much the CPU accounts for the X1's power draw.
If you compare desktop products with their mobile equivalents (performance-wise), even on a different process, you'll notice they use a fraction of the power, more than a die shrink alone would explain...
 

So you're saying 20nm Maxwell would cost them more to mass-produce than Parker? That doesn't make any sense, but I'm no expert.
 
This is a quote from the Eurogamer article on the NX specs; so far they haven't updated it with anything on Pascal. I just don't want people to be disappointed like they were with the Wii U's specs. There is a very small chance it could be Pascal.

"It is worth stressing with a firm emphasis that everything we have heard so far points to Tegra X1 as the SoC of choice for Nintendo NX, and Tegra X2 may simply be a derivative version of X1 with Denver CPU cores, designed for Nvidia's burgeoning automotive line - we literally know very little about it. However, perhaps another factor to consider is launch timing. NX launches in March 2017, almost two years after Shield Android TV with Tegra X1 launched in May 2015. The timing may suggest that Nintendo is waiting for mass production to become available on a more cutting edge part. If the older Tegra X1 is indeed the core component, availability there would not be a problem, suggesting a delay elsewhere in the pipeline. Alternatively, it may simply be the case that Nintendo is holding fire until a compelling array of launch software is ready."
 

The 14nm/16nm process is nearly 3 years old; Samsung used it in Galaxy S6 phones almost 2 years before the NS launches. The production pipelines for these nodes are huge, and you produce more chips per wafer than at 20nm, not to mention that the 20nm process has no future and is being replaced.

But the real nail in the coffin for a 20nm NS is that it's actively cooled. Why spend money on actively cooling a chip rather than shrinking it down to 14/16nm? Passive cooling is not only far cheaper per unit, you'd also avoid having a vent on top of the device, which will catch rain easily when used outside. This thing is much more delicate, and warranty claims are going to cost quite a bit of money, all of which could have been avoided at a smaller node. So no, being 20nm just to perform at the X1's spec makes zero sense from an engineering standpoint.
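As a rough illustration of the chips-per-wafer point, here's the standard first-order dies-per-wafer estimate; the die areas below are hypothetical illustrative numbers, not actual Tegra figures:

```python
import math

# First-order dies-per-wafer estimate: usable area divided by die area,
# minus an edge-loss term for partial dies around the wafer's rim.
def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# e.g. a hypothetical ~120 mm^2 die shrinking to ~80 mm^2 on a 300 mm wafer:
print(dies_per_wafer(300, 120))  # candidate dies before the shrink
print(dies_per_wafer(300, 80))   # noticeably more dies per wafer after
```

Yields and wafer pricing matter just as much as raw die count, so treat this as directional, not a cost model.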

[benchmark chart image: xzujMVK.png]


We see a new Tegra right at the top here, which seems to be 256 CUDA cores running at about 1.5GHz. This also appeared shortly after the NS is rumored to have gotten new dev kits (10.10.2016).
 
Posted by mistake and can't edit for some reason. Anyway, we'll know soon enough, and I'll happily eat crow if I'm wrong. If you look at the Wii and Wii U, they both make zero sense from an engineering standpoint. In my mind everything points to the X1, with Nintendo having made the deal in early 2016.

The Wii U's problem was the embedded RAM and backwards compatibility; this device has neither. The main problem with the embedded RAM was that NEC could only produce it at 40/45nm (forget which), which meant the entire GPU package had to be made at 45nm while vendors moved on to 32nm/28nm.

I'm not saying this thing can't be X1; it's still clearly a possibility. But as Nvidia officially said, it's custom, which means the rumored specs listed here are more likely just a dev kit or a fake, as they're just Emily Rogers' report and the X1 mashed together. If it is X1, I don't think 20nm is likely: Nvidia wants to make money, and the node will only get more expensive compared to 16nm, which is already competitive with or beating it. It seems completely backwards to use a node everyone has pretty much left.
 


Here is the problem. Given everything in Nintendo's history, which is more likely?

1. Nintendo uses a 2-year-old, already-released CPU/GPU (the Maxwell-based Tegra X1) that has an Android port and sufficient yields to pump out millions of units in a few months. The X1 is also basically unused by anyone in the industry (only the Nvidia Shield), so it will probably be super cheap to acquire.

2. Nintendo uses an unreleased, not-yet-taped-out system that no one has any experience with (Tegra X2) for a console supposed to release in 6-8 months. The X2 is based on Pascal but hasn't even started final manufacturing.


3. Maxwell is a safer and cheaper bet, and they could always switch to Pascal in the future if this thing is successful, and sell it as a revision.

4. The Eurogamer article really went into detail, and their sources are pretty damn sure it's X1 in the final hardware.

5. The Wii U's problem was everything: a 176 GFLOPS GPU, a horribly dated CPU, 1GB of RAM for gaming. That console made zero sense engineering-wise.

It's really, really, really hard to believe any rumor suggesting #2 is true. Yeah, sure, maybe Nvidia has had a secret project to build a Pascal-based X2 specifically for Nintendo and it's been kept a super tight secret, in an industry where every GPU tweak leaks months in advance.
 
This, to me, is why we know Nintendo won't be doing that. Not at all technical, but when do Nintendo put fastest-ever hardware in their devices? Do we really believe Nintendo's mobile device will top the chart above, eclipsing the iPad Pro and Pixel C as the world's most powerful portable device? :p
 

Because Nintendo? Why even be in a discussion if that's your position? Docked, the device can't really be less than 512 GFLOPS either way, since it's actively cooled and the X1 was used in the dev kits. That can already do 720p XB1 games given proper memory bandwidth, which is where Nintendo always spends money. If your position isn't technical, then I can't really debate you on it. You feel it's impossible; that's within your rights, but it's an opinion based on nothing but feelings, because Nvidia can certainly compete with Apple when it comes to GPU performance. I mean, where did this Tegra chip come from? A Pascal Tegra exists, the benchmark is pretty much proof of that, yet they pulled the FCC filings for their next Tegra device. So what happens to this chip? They just throw the design away, thousands of man-hours out the window? Pretty unlikely to appease Nintendo if all they're giving them is a cheap X1 chip, IMO.

About #5 real quick: that was my point about the 45nm. The eDRAM took up so much space that there wasn't much room on the die for the GPU. The die size was actually pretty good at ~156mm² (IIRC), but if you use that much space for eDRAM, you're limited in shader space. The Wii U was flawed in large part because AMD didn't care to make a more powerful system.

Whether it's X1 or Pascal, the end result doesn't change much: we're still looking at a device with a 10-30% faster CPU, 4GB+ of RAM, and a GPU that can compete with AMD's A10-7850K APU in DX11. If you look up what that chip is capable of, it's pretty clear the NS shouldn't have issues with porting current-gen games. No one is saying this thing won't be noticeably slower than the XB1, but considering its form factor, that's still an impressive feat.
 

Exactly.
About #5 real quick, that was my point with the 45nm, edram took up so much space and there wasn't a lot of room on the die for the GPU, the die size was actually pretty good at ~156mm^2 (iirc) but if you use so much space for edram, you are limited in shader space. Wii U was flawed in large part because AMD didn't care to make a more powerful system.


Sorry, I really LOL'd at this one. It was Nintendo that didn't care. They wanted a next-gen Wii, and that's what AMD gave them. They could have easily put in a much better GPU and 2GB of GDDR5, at the very least 356 GFLOPS.
 
Because Nintendo? Why even be in a discussion if this is your position?
It's not an entirely serious position, but it's still a valid one. Where there's clear precedent and little reason to think something has changed from that precedent (like a change of board members), there's a good chance the precedent will continue, no? Nintendo's corporate philosophy has been "lateral thinking with withered technology"; is there reason to think that has changed? Perhaps, noting their dwindling importance, they're trying something new.

In this discussion, trying to guess what the NS's hardware will be, precedent following Nintendo's corporate philosophy is very much a legitimate consideration.
If your position is not technical, then I can't really debate you on it. You feel it is impossible,
I didn't say impossible. It's definitely possible; I said it's implausible.
 