Nintendo Switch Tech Speculation discussion

These are all theories based on perhaps some very educated guesses, but AFAIK they have never been confirmed by any official or former official from either Nvidia or Sony.

Semiaccurate's very accurate article from May claims that neither Microsoft nor Sony would even engage in negotiations. If this part is true, then some kind of confrontation must have happened with Sony.

Perhaps Sony wasn't happy that Nvidia sold them a 2-year-old GPU architecture while sitting on the brand new G80 architecture with unified shaders, which came out at the exact same time as the PS3.
As a comparison, ATI in 2005 sold Microsoft its very first unified shader architecture, which landed in the X360 no less than 1.5 years before ATI's first graphics cards with unified shader GPUs arrived in 2007. And now we know that AMD provided Sony with features for the PS4 that were only seen later in Hawaii cards. And with the PS4 Pro they're providing features that won't be in AMD's own GPUs until next year, like 2×FP16 throughput.

So regarding Sony, maybe they ended up thinking that Nvidia did hide G80 from them. As a matter of fact, up until the GeForce 8800 GTX reveal, Jen-Hsun kept going on record saying that unified shaders weren't that much better for GPUs, strongly implying that G80 would be yet another architecture with separate pixel and vertex shaders.

I never read Semiaccurate, but every time I hear about it, I get the distinct impression that at some point in the past, Nvidia ran over Charlie Demerjian's dog. What I heard from people inside Nvidia was that at least Sony (and maybe MS, I don't recall clearly) did approach Nv about the new consoles and Nv was kind of an ass in how they said No. As in, went to the meeting, talked about random stuff for a while, then said "Anywaaay... Great to see you. Bye."

What I heard from someone involved in the PS3 GPU process was that the G80 came up in discussions, but it didn't look like it would be ready and solid far enough ahead of the PS3 launch for it to be a safe bet. Real shame. With a G80 the PS3 would have walked all over the 360. The SPUs would have been useful for a lot more things than just as a crutch for the 7800.
 
I don't believe X1 performance is the least we can expect. If they want to hit good battery life and thermals, it would be in their best interest to downclock, especially since the SoC might already be bandwidth limited. They can get PS360-era games running at 720p (with much better textures), which is probably where it would shine best. It's not like pushing current-gen graphics on it would be feasible either way.

Also, Pascal's energy efficiency comes largely from the die shrink; if they don't shrink the SoC, there is probably not much difference between Maxwell's and Pascal's perf/watt anyway. Knowing what the process tech is would be a better hint at what is actually going to be used.
 
It's possible this is a custom SoC using a Pascal GPU smaller than what will end up in any tablet/phone/etc., combining the power efficiency with lower cost due to less die space. The same number of units/etc. as their Maxwell-derived Tegras, but with Pascal's architectural and power improvements, might be quite attractive to Nintendo...
 
Of course they could. IBM was still making CPU cores back then, and the X360 had used a fused CPU+GPU design since 2010, built at 45nm in IBM foundries (the only thing missing for a full SoC was a very cheap southbridge). The Wii U in 2012 had an MCM with a GPU+northbridge+southbridge die and the CPU in it.
By 2013 IBM had been producing the 32nm Power7+ for quite a while. Why wouldn't IBM be able to produce a Kepler+Power7 SoC at 32nm?

You can argue that ordering both the CPU and GPU designs from the same company would probably be cheaper (though the cheapest-ever Wii U didn't even do that), but it's not like Sony chose AMD because they didn't have any other choice.

Even if both nVidia and IBM were willing and able to strike a deal with both of their flagship products on an IBM process (by no means an original effort in the same ballpark as the PS4/XB1 SoCs) I highly doubt Power7's power consumption and size actually made it a realistic alternative to something like Jaguar. Not for this application.

MCMs aren't SoCs; not sure how that even enters the comparison... and the Wii U was cheapest ever because it was years behind in performance, but the design was hardly efficient.
 
https://twitter.com/mochi_wsj/status/789116462044590080

Tweet suggesting some 3rd parties are porting PS4 titles to Switch... maybe this is more powerful than we think?
Maybe... Battery life in a portable device might be more important to some people than power, especially when the screen is small, so the Switch is a beast for a portable, to what extent we don't know yet; I don't think Nintendo has ever published the specs of its devices. How it fares compared to the PS4 and One is going to be interesting once someone pulls a Jack the Ripper on a console that seems to weigh 900 grams or so.
 
I don't believe X1 performance is the least we can expect. If they want to hit good battery life and thermals, it would be in their best interest to downclock, especially since the SoC might already be bandwidth limited. They can get PS360-era games running at 720p (with much better textures), which is probably where it would shine best. It's not like pushing current-gen graphics on it would be feasible either way.

Also, Pascal's energy efficiency comes largely from the die shrink; if they don't shrink the SoC, there is probably not much difference between Maxwell's and Pascal's perf/watt anyway. Knowing what the process tech is would be a better hint at what is actually going to be used.

Why? 720p is 921,600 pixels vs. 1080p's 2,073,600, which is 1,152,000 fewer pixels, so it should need less than half the pixel-pushing power of an Xbox One or PS4 to play their 1080p games at 720p. The Xbox One and PS4 are going to be over 3 years old. A high-end tablet SoC should be able to run those games even if some settings get tweaked down further, most likely due to RAM constraints.
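
To put rough numbers on that scaling argument, here's a minimal sketch (plain Python, pixel counts only; real scaling also depends on bandwidth, CPU load, and per-pixel shader cost):

```python
# Illustrative pixel-count comparison between 1080p and 720p.
res_1080p = 1920 * 1080   # 2,073,600 pixels
res_720p = 1280 * 720     #   921,600 pixels

print(f"1080p pixels: {res_1080p:,}")
print(f"720p pixels:  {res_720p:,}")
print(f"Difference:   {res_1080p - res_720p:,}")    # 1,152,000
print(f"720p / 1080p: {res_720p / res_1080p:.1%}")  # ~44.4%
```

So at equal settings, 720p is roughly 44% of the 1080p pixel load, which is where the "less than half" figure comes from.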
 
Just to add to the discussion: I have a source that is currently under NDA on the issue of Nintendo Switch specs.

That source wasn't under NDA 2 days ago.

Why this is the case roughly 5 months away from release is totally nuts if you ask me.
 
https://twitter.com/mochi_wsj/status/789116462044590080

Tweet suggesting some 3rd parties are porting PS4 titles to Switch... maybe this is more powerful than we think?

Many Japanese devs aren't exactly pushing the PS4 very hard. Lots are working on PS4/Vita games. Adding Switch and potentially dropping Vita support is a pretty obvious course of action. But that doesn't say anything about the specs. It could raise the technical quality of the PS4 games, though, by lifting the lowest common denominator significantly.
 
Even if both nVidia and IBM were willing and able to strike a deal with both of their flagship products on an IBM process (by no means an original effort in the same ballpark as the PS4/XB1 SoCs) I highly doubt Power7's power consumption and size actually made it a realistic alternative to something like Jaguar. Not for this application.

MCMs aren't SoCs; not sure how that even enters the comparison... and the Wii U was cheapest ever because it was years behind in performance, but the design was hardly efficient.

Again, I didn't say it would be better or cheaper, just that it would be possible. AMD won the designs because they provided the best possible solution in the eyes of their clients, not because there was no other way (as in, if AMD had gone bankrupt in 2010 we'd have no PS4 or Xbone). Credit should be given where credit is due, IMO.
I have no idea how much a power-optimized quad-core Power7+ at e.g. 2GHz would consume, but it could certainly fit a console if needed. In 2010 IBM was making a 45nm SoC with a 3.2GHz CPU consuming less than 120W.

I spoke of MCMs simply because they're a form of integration that has been used in consoles many times, IBM designs included.
 
Does anybody have an idea how the Switch's CPU compares to the Xbox One's CPU in the performance department? Could it be faster than the Xbox One's Jaguar microarchitecture?
 
We have no idea what Switch's CPU even is, so the answer is (read this last word in the voice of Lana Kane) NOPE.
 
Any info on whether Nvidia is also involved in the OS, UX, and other stuff like the network stack?

Nintendo was rumored to be working together with EA to bring EA's software and online expertise to the Wii. But that fell apart, and frankly, the Nintendo stuff we finally got is horrible.

If this time they fully work together with Nvidia, wow. Finally something modern will come out of Nintendo.
 
Does anybody have an idea how the Switch's CPU compares to the Xbox One's CPU in the performance department? Could it be faster than the Xbox One's Jaguar microarchitecture?
Well, the options are:
1) A57
2) A72
3) A73
4) Denver2
5) Some combination of the above

Can those be competitive with Jaguar? Certainly. The CPU is only clocked at 1.6 GHz in PS4, correct? It shouldn't be terribly difficult to offer competing CPU performance.

The biggest performance question is probably bandwidth. I personally expect at least Parker (Tegra) levels, so ~50 GB/s. If one wanted to be really optimistic, one could hope for a single stack of HBM2/3, but we probably won't see that due to cost.
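
On the CPU point above, a crude back-of-the-envelope sketch (Python). The numbers are assumptions for illustration: roughly 6 of the PS4's 8 Jaguar cores available to games at 1.6 GHz, a quad-core ARM part in the Switch, and broadly comparable per-clock performance, none of which is confirmed:

```python
# What clock would a hypothetical quad-core ARM CPU need to match the PS4's
# game-available Jaguar cores on aggregate throughput, assuming equal IPC?
# All figures are illustrative assumptions, not confirmed Switch specs.
jaguar_cores = 6          # assumed cores available to games (of 8 total)
jaguar_clock_ghz = 1.6    # PS4 Jaguar clock
arm_cores = 4             # assumed Switch CPU core count

required_clock_ghz = jaguar_cores * jaguar_clock_ghz / arm_cores
print(f"Required clock at equal IPC: {required_clock_ghz:.1f} GHz")  # 2.4 GHz
```

Obviously real IPC differs per core and per workload, so treat this as a feel for the magnitudes, not a prediction.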
 
I think the big question here is how much battery life Nintendo would want. The Shield Tablet K1, for example, is only good for maybe 2 hours if you game on it. If you set the GPU frame limit to 30 and lock the CPU down to two cores at reduced clocks, you can extend that a bit. The Switch looks smaller than the Shield Tablet, and that doesn't bode well for battery capacity.
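
For a feel of the numbers, battery life is roughly capacity divided by average draw. A minimal sketch, assuming a Shield Tablet-class battery of about 19.75 Wh (the Switch's actual capacity and power draw are unknown):

```python
# Crude battery-life estimate: hours ~= battery capacity (Wh) / average draw (W).
# Capacity and draws are illustrative assumptions, not confirmed Switch figures.
def battery_hours(capacity_wh: float, draw_w: float) -> float:
    return capacity_wh / draw_w

capacity_wh = 19.75                 # assumed Shield Tablet-class battery
for draw_w in (5.0, 7.5, 10.0):     # hypothetical average draw while gaming
    print(f"{draw_w:4.1f} W -> {battery_hours(capacity_wh, draw_w):.1f} h")
```

A ~10 W draw lines up with the ~2 hour figure above, which is why downclocking (or a bigger battery) matters so much for a handheld.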
 
Question answers itself. Unless I missed something and the source is both known and highly reputable....
The Tegra X1 has a 64-bit wide bus (in total). With LPDDR4 at 3200 MT/s, that yields 25.6 GB/s nominal. There are three ways the Nintendo SoC could improve on that. First is LPDDR4-4266. Second is if we are looking at a Parker derivative with a 128-bit bus. Third is if they go with LPDDR4X.
In total, the span is potentially quite large.
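
For concreteness, a quick sketch of the nominal peak figures for those options (peak bandwidth = bus width in bytes × transfer rate; the configurations are the speculated ones from above, nothing confirmed):

```python
# Nominal peak bandwidth = (bus width in bits / 8) * transfer rate in MT/s.
# The configurations below are speculation, not confirmed Switch specs.
def peak_bw_gbs(bus_bits: int, rate_mts: int) -> float:
    return bus_bits / 8 * rate_mts / 1000  # GB/s

configs = [
    ("Tegra X1-like, 64-bit LPDDR4-3200", 64, 3200),    # 25.6 GB/s
    ("64-bit LPDDR4(X)-4266", 64, 4266),                # ~34.1 GB/s
    ("Parker-like, 128-bit LPDDR4-3200", 128, 3200),    # 51.2 GB/s
    ("128-bit LPDDR4(X)-4266", 128, 4266),              # ~68.3 GB/s
]
for name, bits, rate in configs:
    print(f"{name}: {peak_bw_gbs(bits, rate):.1f} GB/s")
```

Note that LPDDR4X at the same data rate gives the same nominal bandwidth; its main win is lower I/O power, which matters for the handheld case.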
 
Question answers itself. Unless I missed something and the source is both known and highly reputable....
Dunno about the source. Off the back of the leaked devkit images, though, and the general realism of the spec, it seems fairly legit. Why would someone make up weak-sauce bogus specs? Typically the made-up crap is either mock-ups to drive hits and promote someone's design career, or crazy specs of super-powerful hardware or bonkers gimmicks to stir up the internet. Unless the specs here are legitimate but subject to change, they seem highly plausible, with a known hardware limit for the BW. If they are subject to change, then that's what the devkits have, but the final HW will use tech with more BW that's unavailable to the devkits.
 