Nintendo Switch Tech Speculation discussion

Skyrim is running on PS3, ish.

This is Nintendo mobile hardware. Whenever we think a spec could be some value X, like 1 GHz for the GPU...

Then it's more likely lower.
 
I'm really hoping it's Pascal because that would most probably mean it's being made on a FinFET process, though that guy does lose a bit of credibility when he claims the Tegra X1 is 8-10x slower than an Xbone. It's probably closer to 4x slower, at least in the form of the Shield TV.

Why, because you don't agree? There is more to performance than just specs. The guy has posted on Anandtech for a few years and has been very reliable and quite detailed in his responses. Just check his post history.
 
Why, because you don't agree?

Because the TX1 can't be 10x slower than Durango.
Maybe the TK1 could be, but the TX1 is not.

It could behave like it's 10x slower if the GPU were provided by some lesser-known and less competent IHV (like, say, DMP), but people have benchmarked the TX1 in the Shield TV directly against x86 solutions, and we know it's well beyond 1/10th of what the Xbone can do.
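
For rough context, here's the paper math, a back-of-envelope sketch using the public spec-sheet numbers for both chips (peak FLOPS only, nothing else):

```python
# Back-of-envelope peak FP32 throughput, from public spec sheets.
# Peak GFLOPS = shader lanes * 2 flops per clock (one FMA) * clock in GHz.

def peak_gflops(lanes: int, clock_ghz: float) -> float:
    return lanes * 2 * clock_ghz

tx1   = peak_gflops(256, 1.0)    # Tegra X1: 256 Maxwell CUDA cores @ ~1.0 GHz
xbone = peak_gflops(768, 0.853)  # Xbox One: 768 GCN lanes @ 853 MHz

print(f"Tegra X1: {tx1:.0f} GFLOPS")    # ~512
print(f"Xbox One: {xbone:.0f} GFLOPS")  # ~1310
print(f"ratio:    {xbone / tx1:.1f}x")  # ~2.6x on paper
```

The paper ratio is ~2.6x; memory bandwidth and sustained clocks under load plausibly stretch that toward 4x in real games, but nowhere near 8-10x.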
 
Don't underestimate the power of vertical integration.

Nvidia supplies both the hardware and the API, and is responsible for a lot of the software stack as well.

It would make no sense to make a custom part as unbalanced on bandwidth as this discussion suggests. Expect some "secret sauce", considering Pascal already acts as a tile-based renderer in certain respects (if I recall correctly).
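
For anyone unfamiliar with the tile-based idea: the win is that a tile's color/depth data stays in on-chip memory while it's being shaded, so DRAM traffic drops sharply. Below is a toy binning sketch just to illustrate the concept; Nvidia hasn't documented how Pascal's tiled caching actually works, so don't read this as its real implementation:

```python
# Toy sketch of the binning step at the heart of tile-based rendering:
# triangles are sorted into fixed-size screen tiles up front, then each
# tile is shaded in one pass while its framebuffer sits in on-chip memory,
# so DRAM sees roughly one write per pixel instead of one per overdraw.
# Purely conceptual; not how Pascal's (undocumented) tiled caching works.

TILE = 16  # tile edge in pixels, arbitrary for this sketch

def bin_triangles(triangles):
    """Map each triangle (three (x, y) vertices) to the tiles its bounding box overlaps."""
    bins = {}
    for tri in triangles:
        xs = [x for x, _ in tri]
        ys = [y for _, y in tri]
        for ty in range(int(min(ys)) // TILE, int(max(ys)) // TILE + 1):
            for tx in range(int(min(xs)) // TILE, int(max(xs)) // TILE + 1):
                bins.setdefault((tx, ty), []).append(tri)
    return bins

tris = [((0, 0), (30, 0), (0, 30)), ((40, 40), (47, 40), (40, 47))]
print({tile: len(ts) for tile, ts in sorted(bin_triangles(tris).items())})
# {(0, 0): 1, (0, 1): 1, (1, 0): 1, (1, 1): 1, (2, 2): 1}
```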
 
Don't underestimate the power of vertical integration.

Nvidia supplies both the hardware and the API, and is responsible for a lot of the software stack as well.

It would make no sense to make a custom part as unbalanced on bandwidth as this discussion suggests. Expect some "secret sauce", considering Pascal already acts as a tile-based renderer in certain respects (if I recall correctly).

It would make sense for Nintendo, since they don't seem to care much about raw power. If they can run Wii U games at 1080p, it's all good, I guess...
 
Speaking of vertical integration, and Nvidia's own statement about providing the full hardware+software package, I guess it's time to remember SemiAccurate's article from back in May.
Here are the main points of the article, according to a GAF user:

  • Though Nvidia downplayed console margins, their pride was hurt by the loss of the console contracts. All the talk about "focusing on Shield" was a cover for the fact that MS and Sony had soured on them and would not enter negotiations.
  • The Nvidia team was told to get a console win or "go home." Enter Nintendo, who apparently made off very well in this deal, to the point that SemiAccurate questions whether it's a "win" at all for Nvidia.
  • SA has heard that Nvidia is promising software, support, and the whole shebang at a very low cost. According to one source, Nvidia may even be taking a loss on this deal.
  • The article doesn't mention which generation of Tegra or which process node will be used, or when the handheld is scheduled for release.
  • There's no mention of the home console, but we can speculate what that might be and who might provide the chipset for it.

Nvidia taking a loss is to be taken with a grain of salt, of course, but the fact is that the less Nintendo spends on the deal, the more of the budget is left for the hardware itself, and the better the performance should be.
 
I have to believe Nvidia has invested a lot of R&D into the Tegra processor, and thus far it hasn't gained the traction they would have liked. I have never seen any sales figures for the Shield console or tablet, but I would assume they are pretty lackluster. Tegra is pretty much everything Nintendo wants in a processor. For Nvidia, even if this is a low-margin deal, it's basically for a product they had already invested in, so they had little to lose.
 
The main tablet part is pretty close to an nVidia Shield device, and the controllers could be manufactured separately (and are nothing very special for either Nintendo or nVidia). The Shield family has always been sold at what appear to be pretty thin margins, with no real game sales to make up the difference. I don't think Switch is going to be a raw deal for nVidia unless Nintendo is trying to make as much as they can per console.
 
Well, having nvidia on board for a console has failed at least twice in history. This is one reason why MS and Sony won't use nvidia technology in future consoles. Now nvidia has caught Nintendo, and I hope nvidia does it right this time.

It's really more complex than "failed". The first one was due to bad contracts; the tech was fine. Hell, the original Xbox had one of the best sound chips ever put in a console.

The second was a tech problem, yes, but mainly because they had to rush once the original Sony-made GPU didn't work out. If it had been chucked a year earlier, some of the issues with the GPU might not have been there (hell, maybe the roughly 8% clock speed reduction might not have happened; losing that much doesn't help matters at all). Perhaps the CPU wouldn't have had to work quite so hard to cover for the RSX if it had had more vertex shaders, etc. Things like that might have been changed with more development time.
 
PS3 was also 10 years ago; I wouldn't really pin current expectations to performance from that long ago, whether or not nVidia can really be blamed for much of it.

There's been kind of a dark shadow over them since the PS4/XB1 came out, since they didn't get a design win there and tried to play it off like it didn't matter. But they weren't really in the running, because they couldn't yet build an SoC with a competitive 64-bit CPU. So AMD had the right tech at the right time.

For Switch the situation is pretty much the opposite. Despite the way it's being positioned, this isn't a home console; it's a tablet-sized handheld, and a small tablet at that, which is an area where nVidia has a lot of experience. Switch could probably have had a much smaller, base-only dock with the same functionality, and it could have had wireless streaming to the dock/TV if they'd wanted. Providing the appearance of a traditional console was obviously important to Nintendo, but it's largely sleight of hand.

Point is, nVidia today is in a better position to provide a competent mobile partnership than any of the vendors Nintendo has previously worked with. They could perhaps have tried to cobble together their own SoC from an ARM CPU and IMG (or ARM) GPU IP, but the result would more likely have turned out like the 3DS's SoC than like Tegra: something with really old tech, if it didn't take them years longer to complete.
 
Point is, nVidia today is in a better position to provide a competent mobile partnership than any of the vendors Nintendo has previously worked with. They could perhaps have tried to cobble together their own SoC from an ARM CPU and IMG (or ARM) GPU IP, but the result would more likely have turned out like the 3DS's SoC than like Tegra: something with really old tech, if it didn't take them years longer to complete.
Or they could have chosen an off-the-shelf ARM part from a myriad of other vendors. It's not like a Snapdragon 830 would have been worse from a hardware point of view. Nvidia probably gave Nintendo a really good deal and the full software/API package too.
 
The second was a tech problem, yes, but mainly because they had to rush once the original Sony-made GPU didn't work out. If it had been chucked a year earlier, some of the issues with the GPU might not have been there (hell, maybe the roughly 8% clock speed reduction might not have happened; losing that much doesn't help matters at all). Perhaps the CPU wouldn't have had to work quite so hard to cover for the RSX if it had had more vertex shaders, etc. Things like that might have been changed with more development time.

These are all theories based on some perhaps very educated guesses, but AFAIK they have never been confirmed by any official (or former official) from either nvidia or Sony.

SemiAccurate's very accurate article from May claims that neither Microsoft nor Sony would even engage in negotiations. If that part is true, then some kind of confrontation must have happened with Sony.

Perhaps Sony wasn't happy that nvidia sold them a two-year-old GPU architecture while sitting on the brand-new G80 architecture with unified shaders, which came out at the exact same time as the PS3.
As a comparison, in 2005 ATI sold Microsoft their very first unified shader architecture, which landed in the X360 no less than 1.5 years before ATI's first unified-shader graphics cards in 2007. And now we know that AMD provided Sony with features for the PS4 that were only seen later in the Hawaii cards. And with PS4 Pro they're providing features that won't be in AMD's own GPUs until next year, like 2*FP16 throughput.

So regarding Sony, maybe they ended up thinking that nvidia had deliberately hidden G80 from them. As a matter of fact, up until the GeForce 8800 GTX reveal, Jen-Hsun kept going on record saying that unified shaders weren't that much better for GPUs, strongly implying that G80 would be yet another architecture with separate pixel and vertex shaders.
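
For reference, here's what the 2*FP16 point means in raw numbers, a quick sketch using PS4 Pro's public specs (36 CUs at 911 MHz); packed FP16 just doubles the per-lane rate, nothing more exotic than that:

```python
# What "2*FP16 throughput" means in FLOPS terms, using PS4 Pro's public
# specs: 36 CUs * 64 lanes * 2 flops per clock (one FMA) * 0.911 GHz.
# Packed FP16 doubles the per-lane rate.

def tflops(cus, lanes_per_cu, clock_ghz, fp16_packed=False):
    flops_per_lane = 2 * (2 if fp16_packed else 1)
    return cus * lanes_per_cu * flops_per_lane * clock_ghz / 1000

print(f"PS4 Pro FP32: {tflops(36, 64, 0.911):.1f} TFLOPS")                    # ~4.2
print(f"PS4 Pro FP16: {tflops(36, 64, 0.911, fp16_packed=True):.1f} TFLOPS")  # ~8.4
```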
 
There's been kind of a dark shadow over them since the PS4/XB1 came out, since they didn't get a design win there and tried to play it off like it didn't matter. But they weren't really in the running, because they couldn't yet build an SoC with a competitive 64-bit CPU.

Of course they could. IBM was still making CPU cores back then, and the X360 had used fused CPU+GPU designs since 2010, on 45nm at IBM foundries (the only thing missing for a full SoC was a very cheap southbridge). The Wii U in 2012 had an MCM with the GPU + northbridge + southbridge and the CPU on it.
By 2013, IBM had been producing the 32nm Power7+ for quite a while. Why wouldn't IBM have been able to produce a Kepler+Power7 SoC at 32nm?

You can argue that ordering both the CPU and GPU designs from the same company would probably be cheaper (though the cheapest-ever Wii U didn't even do that), but it's not like Sony chose AMD because they didn't have any other choice.
 
I think it's safe to assume Tegra X1 performance is pretty much the least we can expect, and Tegra Parker isn't out of the question. The Tegra X1 already had proper engine support, like Unreal Engine 4, so porting games should be much easier for developers than it was with the Wii U. It also seems like Nvidia was very hands-on with not only the SoC but also the development environment. This could be the best/easiest console to develop for that Nintendo has ever put out. So while performance may be relatively limited compared to the PS4/XB1, it's far easier to deal with compromising visual fidelity and resolution than to deal with ground-up low-level code issues. The Tegra Parker chip seems like it could easily have evolved from the partnership with Nintendo. I'm sure the Switch SoC is customized, but I find it hard to believe the chip is still Maxwell-based when Pascal is so much more power efficient, which is obviously important in a product like the Switch.
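
Back-of-envelope numbers for those two baselines, same peak-FLOPS sketch as earlier in the thread (Parker's ~1.5 GHz comes from Nvidia's automotive pitch, so treat it as an upper bound for a battery-powered handheld):

```python
# Peak FP32 throughput for the plausible Switch baselines, from public specs.

def peak_gflops(lanes: int, clock_ghz: float) -> float:
    return lanes * 2 * clock_ghz  # one FMA (2 flops) per lane per clock

print(f"Tegra X1 (Maxwell):    {peak_gflops(256, 1.0):.0f} GFLOPS")   # ~512
print(f"Tegra Parker (Pascal): {peak_gflops(256, 1.5):.0f} GFLOPS")   # ~768
print(f"PS4 (for scale):       {peak_gflops(1152, 0.8):.0f} GFLOPS")  # ~1843
```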

Indie developers will have little to no issue with the Switch in terms of performance. Heck, the Wii U gets indie support to this day; very rarely are indies really pushing the hardware. So for that community, the Switch is just another platform that can easily accept their games. Now, if DICE starts talking about Switch development, then perhaps there is more performance under the hood than we realize, but personally I think it's best for my own sanity to simply assume it's somewhere in the Tegra X1/Parker ballpark and leave the PS4/XB1 pipe dream alone.
 