Nintendo Switch Tech Speculation discussion

Discussion in 'Console Technology' started by ToTTenTranz, Oct 20, 2016.

Thread Status:
Not open for further replies.
  1. Globalisateur

    Globalisateur Globby
    Veteran Regular Subscriber

    Joined:
    Nov 6, 2013
    Messages:
    3,402
    Likes Received:
    2,112
    Location:
    France
    Skyrim is running on PS3, ish.

    This is Nintendo mobile hardware. Whenever we think a given spec could be X, say 1GHz for the GPU...

    Then it's more likely to be lower.
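
    For a rough sense of what the clock means (a back-of-the-envelope Python sketch, assuming a TX1-class GPU with 256 Maxwell CUDA cores, which is pure speculation at this point):

    Code:
    # Peak FP32 throughput vs. GPU clock for a hypothetical TX1-class part.
    # 256 CUDA cores, 2 FLOPs per core per cycle (fused multiply-add).
    CORES = 256
    FLOPS_PER_CYCLE = 2

    def gflops(clock_mhz):
        return CORES * FLOPS_PER_CYCLE * clock_mhz / 1000.0

    for clock in (1000, 768, 500, 300):
        print(f"{clock} MHz -> {gflops(clock):.0f} GFLOPS FP32")
    # 1000 MHz gives ~512 GFLOPS; every cut to the clock scales throughput down linearly.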
     
  2. itsmydamnation

    Veteran Regular

    Joined:
    Apr 29, 2007
    Messages:
    1,308
    Likes Received:
    407
    Location:
    Australia
    Why, because you don't agree? There is more to performance than just specs. The guy has posted on Anandtech for a few years and has been very reliable and quite detailed in his responses. Just check his post history.
     
  3. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,813
    Likes Received:
    5,381
    Because the TX1 can't be 10x slower than Durango. Maybe the TK1 could be, but the TX1 isn't.

    It could behave like it's 10x slower if the GPU were provided by some lesser-known, less competent IHV (like, say, DMP), but people have benchmarked the TX1 in the Shield TV directly against x86 solutions, and we know it's well beyond 1/10th of what the Xbone can do.
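
    Just to put rough numbers on that (peak FP32 rates only, in Python; this ignores bandwidth and architectural differences, so treat it as a sketch):

    Code:
    # Peak FP32 throughput: TX1 vs. Durango (Xbox One).
    # TX1: 256 Maxwell cores at ~1 GHz. Durango: 768 GCN lanes at 853 MHz.
    tx1_gflops     = 256 * 2 * 1.000   # ~512 GFLOPS
    durango_gflops = 768 * 2 * 0.853   # ~1310 GFLOPS

    print(f"TX1:     {tx1_gflops:.0f} GFLOPS")
    print(f"Durango: {durango_gflops:.0f} GFLOPS")
    print(f"Gap:     {durango_gflops / tx1_gflops:.1f}x")  # ~2.6x, nowhere near 10x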
     
    xpea and RootKit like this.
  4. Pressure

    Veteran Regular

    Joined:
    Mar 30, 2004
    Messages:
    1,408
    Likes Received:
    336
    Don't underestimate the power of vertical integration.

    Nvidia supplies both the hardware and the API, and is responsible for a lot of the software stack as well.

    It would make no sense to make a custom part as unbalanced on bandwidth as this discussion suggests. Expect some "secret sauce", considering Pascal already acts as a tile-based renderer in certain respects (if I recall correctly).
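
    To put an illustrative number on the bandwidth point (a sketch only; the overdraw factor is an assumption made up for the example):

    Code:
    # Rough framebuffer traffic at 1080p60 (RGBA8 color + 32-bit depth),
    # and how on-chip tiling can cut the external share of it.
    width, height, fps = 1920, 1080, 60
    bytes_per_pixel = 4 + 4        # color + depth
    overdraw = 3.0                 # assumed average shaded fragments per pixel

    immediate = width * height * fps * bytes_per_pixel * overdraw / 1e9
    # With tile-based binning, color/depth live in on-chip tile memory and
    # each pixel goes out to DRAM roughly once per frame.
    tiled = width * height * fps * 4 / 1e9

    print(f"immediate-mode framebuffer traffic: ~{immediate:.1f} GB/s")
    print(f"tiled (one final write per pixel):  ~{tiled:.1f} GB/s")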
     
    #64 Pressure, Oct 21, 2016
    Last edited: Oct 21, 2016
  5. Rootax

    Veteran Newcomer

    Joined:
    Jan 2, 2006
    Messages:
    1,436
    Likes Received:
    813
    Location:
    France
    It would make sense for Nintendo, since they don't seem to care about raw power. If they can run Wii U games at 1080p, it's all good, I guess...
     
    milk likes this.
  6. Pressure

    Veteran Regular

    Joined:
    Mar 30, 2004
    Messages:
    1,408
    Likes Received:
    336
    The customer is always right, I suppose.
     
  7. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,813
    Likes Received:
    5,381
    Speaking of vertical integration, and Nvidia's own statements about providing the full hardware + software package, I guess it's time to remember SemiAccurate's article from back in May.
    Here are the main points of the article, according to a GAF user:

    Nvidia taking a loss is to be taken with a grain of salt, of course, but the fact is that the less Nintendo is spending on it, the better the hardware performance should be.
     
  8. Goodtwin

    Veteran Newcomer Subscriber

    Joined:
    Dec 23, 2013
    Messages:
    1,144
    Likes Received:
    608
    I have to believe that Nvidia has invested a lot of R&D into the Tegra processor and thus far hasn't gained the traction they would have liked. I have never seen any sales figures for the Shield console or tablet, but I would assume they are pretty lackluster. The Tegra is pretty much everything Nintendo wants in a processor. For Nvidia, even if this is a low-margin deal, it's basically for a product they had already invested in, so they had little to lose.
     
  9. Exophase

    Veteran

    Joined:
    Mar 25, 2010
    Messages:
    2,406
    Likes Received:
    430
    Location:
    Cleveland, OH
    The main tablet part is pretty close to an nVidia Shield device, and the controllers could be manufactured separately (and are nothing very special for either Nintendo or nVidia). The Shield family has always been sold at what appear to be pretty thin margins, with no real game sales to make up the difference. I don't think Switch is going to be a raw deal for nVidia unless Nintendo is trying to make as much as they can per console.
     
  10. I.S.T.

    Veteran

    Joined:
    Feb 21, 2004
    Messages:
    3,174
    Likes Received:
    389
    It's really more complex than "failed". The first one came down to bad contracts; the tech was fine. Hell, the original Xbox had one of the best sound chips ever put in a console.

    The second was a tech problem, yes, but mainly because they had to rush once the original Sony-made GPU didn't work out. If they had started it a year earlier, some of the GPU's issues might not have been there (hell, maybe the roughly 8% clockspeed reduction might not have happened; losing that much doesn't help matters at all). Perhaps the CPU wouldn't have had to work quite so hard to cover for the RSX if it had more vertex shaders, etc. Things like that might have been changed with more development time.
     
    xpea, Goodtwin and tongue_of_colicab like this.
  11. Exophase

    Veteran

    Joined:
    Mar 25, 2010
    Messages:
    2,406
    Likes Received:
    430
    Location:
    Cleveland, OH
    PS3 was also 10 years ago; I wouldn't really pin current expectations on performance from so long ago, whether or not nVidia can really be blamed for much.

    There's been kind of a dark shadow over them since the PS4/XB1 came out, because they didn't get a design win there and tried to play it off like it didn't matter. But they weren't really in the running, because they couldn't yet build an SoC with a competitive 64-bit CPU. So AMD had the right tech at the right time.

    For Switch the situation is pretty much the opposite. Despite the way it's being positioned, this isn't a home console; it's a tablet-sized handheld (a small tablet at that), which is an area nVidia has a lot of experience in. Switch could probably have had a much smaller base-only dock with the same functionality, and it could have had wireless streaming to the dock/TV if they'd wanted. Providing the appearance of a traditional console was obviously important to Nintendo, but it's largely sleight of hand.

    Point is, nVidia today is in a better position to provide a competent mobile partnership than any of the vendors Nintendo has previously worked with. They could perhaps have tried to cobble together their own SoC from ARM CPU and IMG (or ARM) GPU IP, but the result would more likely have turned out like the 3DS's SoC than like Tegra: something with really old tech, since it would have taken them years longer to complete.
     
    Goodtwin likes this.
  12. Esrever

    Regular Newcomer

    Joined:
    Feb 6, 2013
    Messages:
    759
    Likes Received:
    523
    Or they could have chosen an off-the-shelf ARM part from a myriad of other vendors. It's not like a Snapdragon 830 would have been worse from a hardware point of view. Nvidia probably gave Nintendo a really good deal and a full software/API package too.
     
  13. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,813
    Likes Received:
    5,381
    These are all theories based on what are perhaps some very educated guesses, but AFAIK they have never been confirmed by any official or former official from either Nvidia or Sony.

    SemiAccurate's very accurate article from May claims that neither Microsoft nor Sony would even engage in negotiations. If this part is true, then some kind of confrontation must have happened with Sony.

    Perhaps Sony wasn't happy that Nvidia sold them a 2-year-old GPU architecture while sitting on the brand-new, unified-shader G80 architecture that came out at practically the same time as the PS3.
    As a comparison, ATI in 2005 sold Microsoft their very first unified shader architecture, which landed in the X360 no less than 1.5 years before ATI's first unified-shader graphics cards in 2007. And now we know that AMD provided Sony with features for the PS4 that were only seen later in Hawaii cards, and with the PS4 Pro they're providing features that won't be in AMD's own GPUs until next year, like double-rate FP16 (2xFP16) throughput.
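
    For scale, a quick sketch of what double-rate FP16 means in raw numbers, using PS4 Pro's published shader count and clock:

    Code:
    # Double-rate (packed) FP16: two half-precision ops per FP32 lane per cycle.
    # PS4 Pro GPU: 2304 shader lanes at 911 MHz.
    lanes, clock_ghz = 2304, 0.911

    fp32_tflops = lanes * 2 * clock_ghz / 1000    # ~4.2 TFLOPS
    fp16_tflops = fp32_tflops * 2                 # ~8.4 TFLOPS with 2xFP16

    print(f"FP32 peak: {fp32_tflops:.1f} TFLOPS")
    print(f"FP16 peak: {fp16_tflops:.1f} TFLOPS")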

    So regarding Sony, maybe they ended up thinking that Nvidia did hide G80 from them. As a matter of fact, right up until the Geforce 8800 GTX reveal, Jen-Hsun kept going on record saying that unified shaders weren't that much better for GPUs, strongly implying that G80 would be yet another architecture with separate pixel and vertex shaders.
     
    Pixel, upnorthsox, egoless and 2 others like this.
  14. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,813
    Likes Received:
    5,381

    Of course they could. IBM was still making CPU cores back then, and the X360 had used fused CPU+GPU designs since 2010, on 45nm at IBM's foundries (the only thing missing for a full SoC was a very cheap southbridge). The Wii U in 2012 had an MCM with the GPU + northbridge + southbridge and the CPU on it.
    In 2013, IBM had been producing the 32nm Power7+ for quite a while. Why wouldn't IBM have been able to produce a Kepler + Power7 SoC at 32nm?

    You can argue that ordering both the CPU and GPU designs from the same company would probably be cheaper (though the cheapest-ever Wii U didn't even do that), but it's not like Sony chose AMD because they had no other choice.
     
  15. damienw

    Regular

    Joined:
    Sep 29, 2008
    Messages:
    495
    Likes Received:
    49
    Location:
    Seattle
    I'm sure the broken scaler in the PS3 GPU didn't help the relationship.
     
  16. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,567
    Likes Received:
    652
    Location:
    WI, USA
    Microsoft did work with NV recently, for Tegra in Surface RT and Surface 2.
     
  17. mpg1

    Veteran Newcomer

    Joined:
    Mar 5, 2015
    Messages:
    1,847
    Likes Received:
    1,446
  18. Esrever

    Regular Newcomer

    Joined:
    Feb 6, 2013
    Messages:
    759
    Likes Received:
    523
    Or maybe they are porting Angry Birds.
     
    BRiT likes this.
  19. BRiT

    BRiT Verified (╯°□°)╯
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    14,813
    Likes Received:
    12,897
    Location:
    Cleveland
    You'd have to have blinders on to think so.
     
  20. Goodtwin

    Veteran Newcomer Subscriber

    Joined:
    Dec 23, 2013
    Messages:
    1,144
    Likes Received:
    608
    I think it's safe to assume Tegra X1 performance is pretty much the least we can expect, and Tegra Parker isn't out of the question. The Tegra X1 already had proper engine support, like Unreal Engine 4, so porting games should be much easier for developers than it was with the Wii U. It also seems like Nvidia was very hands-on with not only the SoC but also the development environment. This could be the best/easiest console to develop for that Nintendo has ever put out there. So while performance may be relatively limited compared to the PS4/X1, it's far easier to deal with compromising visual fidelity and resolution than it is to deal with ground-up low-level code issues. The Tegra Parker chip seems like it could easily have evolved from the partnership with Nintendo. I'm sure the Switch SoC is customized, but I find it hard to believe the chip is still Maxwell-based when Pascal is so much more power efficient, which is obviously important in a product like the Switch.

    Indie developers will have little to no issue with the Switch in terms of performance. Heck, the Wii U gets indie support to this day. Very rarely are indies really pushing the hardware. So for that community, the Switch is just another platform that can easily accept their games. Now if DICE starts talking about Switch development, then perhaps there is more performance under the hood than we realize, but personally I think it's best for my sanity to simply assume it's somewhere in the Tegra X1/Parker ballpark and leave the PS4/X1 pipe dream alone.
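
    On the "compromise resolution" point, the savings are easy to quantify, at least on the pixel-shading side (pixel counts only, so a rough sketch):

    Code:
    # How much per-frame pixel work common resolution drops save vs. 1080p.
    resolutions = {"1080p": (1920, 1080), "900p": (1600, 900), "720p": (1280, 720)}
    base = 1920 * 1080
    for name, (w, h) in resolutions.items():
        print(f"{name}: {w * h / 1e6:.2f} MPix, {base / (w * h):.2f}x cheaper than 1080p")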
     