"New" Nintendo Switch (OLED Model) [2021-07-06]

Discussion in 'Console Technology' started by Goodtwin, Jan 8, 2021.

  1. Lurkmass

    Regular

    Joined:
    Mar 3, 2020
    Messages:
    565
    Likes Received:
    711
    The funny thing is that if you look at the AMDGPU backend for LLVM, all GFX6/GFX7 (the previous consoles were these) and GFX10.x GPUs share a common instruction encoding, which explains why RDNA/RDNA2 are compatible with code generated for the previous consoles. The GFX8 and GFX9 GPUs are the oddballs that don't share a common instruction encoding with them ...

    Here's what the CUDA documentation would have to say in Switch's case ...

    If we look at the table, we can clearly see that the Tegra X1 in the Switch has a GPU generation/version number of SM_53, which basically means that only another SM_5x (Maxwell) GPU is guaranteed to share a common instruction set.

    Even if Nintendo wanted to upgrade a Switch model with, say, the Tegra X2 (SM_62), they can't, because SM_5x and SM_6x likely have incompatible instruction encodings. You can tell that Nvidia pride themselves on changing the ISA a lot.
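The SM versioning rule from the CUDA docs can be sketched as a toy check (the helper name here is my own invention; the rule itself, that a native cubin only runs on GPUs with the same major compute capability and an equal or higher minor version, matches Nvidia's compatibility documentation):

```python
# Toy model of CUDA native-binary compatibility: a cubin built for SM_xy
# only runs on SM_xz where the major version matches and z >= y.
# (Hypothetical helper; PTX, not modeled here, is the forward-compatible path.)

def cubin_compatible(built_for: str, running_on: str) -> bool:
    """e.g. built_for='sm_53', running_on='sm_62'."""
    b_major, b_minor = int(built_for[3]), int(built_for[4])
    r_major, r_minor = int(running_on[3]), int(running_on[4])
    return b_major == r_major and r_minor >= b_minor

# Tegra X1 (Switch) is sm_53; Tegra X2 is sm_62:
print(cubin_compatible("sm_53", "sm_53"))  # True  -- same Maxwell part
print(cubin_compatible("sm_53", "sm_62"))  # False -- Pascal won't run Maxwell SASS
```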

    AMD has binding commitments to make compatible CPUs/GPUs for the other console manufacturers. There's no known commitment for Nvidia to do the same for Nintendo ...
     
    #101 Lurkmass, Jul 7, 2021
    Last edited: Jul 7, 2021

  2. Fair enough, but couldn't one or two low-power modern ARM cores just do a JIT translation from one instruction set to the other?
     
  3. JasonLD

    Regular

    Joined:
    Apr 3, 2004
    Messages:
    463
    Likes Received:
    105
    From my understanding, Nvidia provided most of the development tools for the Nintendo Switch. I doubt Nintendo and Nvidia didn't have future compatibility in mind when they designed the Switch and the dev tools.
     
    orangpelupa, Picao84 and pharma like this.
  4. Lurkmass

    Regular

    Joined:
    Mar 3, 2020
    Messages:
    565
    Likes Received:
    711
    Are you talking about the case of emulation?

    Sure, the CPU can recompile the shader binaries, but that doesn't change the reality that there are instructions from the previous GPU architecture that don't map to the new GPU architecture, so the overhead of translation isn't trivial. The best possible outcome Nintendo could hope for in that situation is mirroring what Microsoft achieved with Xbox 360 software on their current Xbox consoles. A complete solution is intractable, and mismatched instruction sets aren't your only problem; fixed-function hardware design can change as well ...
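As a rough sketch of why that translation overhead piles up (all opcode names here are invented for illustration, not real Maxwell or newer SASS):

```python
# Toy illustration of binary shader translation: some old-ISA instructions
# map 1:1 onto the new ISA, but others have no direct equivalent and must be
# expanded into several instructions, bloating the translated program.

DIRECT_MAP = {"FMUL": "FMUL", "FADD": "FADD"}
EXPANSIONS = {
    # old fused saturating op with no new-ISA equivalent -> multi-op sequence
    "FMUL_SAT": ["FMUL", "FMIN 1.0", "FMAX 0.0"],
}

def translate(old_program: list) -> list:
    out = []
    for op in old_program:
        if op in DIRECT_MAP:
            out.append(DIRECT_MAP[op])      # cheap 1:1 case
        elif op in EXPANSIONS:
            out.extend(EXPANSIONS[op])      # expensive expansion case
        else:
            raise ValueError(f"no mapping for {op}")  # fall back to emulation
    return out

print(translate(["FMUL", "FMUL_SAT"]))  # a 2-op shader becomes 4 ops
```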

    They should've thought of that before making NVN for the Switch, because there are now Nintendo games released using Vulkan; they are dead worried about their work getting obsoleted, so they needed some insurance ...
     
  5. JasonLD

    Regular

    Joined:
    Apr 3, 2004
    Messages:
    463
    Likes Received:
    105
    Nintendo has been offering backward compatibility on their portable consoles ever since the Game Boy Color, with each successive generation offering backward compatibility with the previous generation's hardware.
    I am willing to bet that the ability to offer backward compatibility on future hardware was one of the priorities at Nintendo when they went to Nvidia for the Switch SoC. Stop overdramatizing as if it would be too difficult for Nintendo to offer backward compatibility because of GPU compiler differences; they are nowhere near as difficult as providing backward compatibility across different CPU ISAs.
     
    milk likes this.
  6. Lurkmass

    Regular

    Joined:
    Mar 3, 2020
    Messages:
    565
    Likes Received:
    711
    Providing binary compatibility for GPUs is just as hard, if not harder, than it is for CPUs. Modern emulation is largely starting to be gated by GPUs now, while the industry has luckily converged on AArch64 or x86_64 on the CPU side. Some Switch games are somehow 32-bit, which is going to be an issue down the line since ARM vendors are eventually going to deprecate 32-bit support ...

    And your initial statement isn't true at all, because the Switch doesn't have backwards compatibility with software from the previous generation, even though that was a much simpler system ...
     
    Lalaland likes this.
  7. JasonLD

    Regular

    Joined:
    Apr 3, 2004
    Messages:
    463
    Likes Received:
    105
    Since the X1 in the Switch only utilizes 4 A57 cores (1 reserved for the OS), they can easily offer them as little cores on the next Switch, and there won't be any problem on the CPU end. For GPUs, I am sure Nvidia has guidelines in their development toolkit to ensure compatibility with future architectures.

    It has to do with all the previous Nintendo consoles having dual-screen gameplay (both Wii U and 3DS). They are starting from a clean slate with the Switch.
     
    pharma likes this.
  8. Lurkmass

    Regular

    Joined:
    Mar 3, 2020
    Messages:
    565
    Likes Received:
    711
    They don't really seem to care either way. Nintendo wants to have access to low-level optimizations, which conflicts with Nvidia's determination to always design GPU architectures that are foreign to each other. There's no sign that their partnership is driven by anything else but their own impulses ...

    If this is their justification, then they obviously weren't even trying, since this could've been easily fixed by including a USB-C to HDMI adapter while allowing for active dual display. That's the solution for full compatibility, and partial compatibility would've been an option as well; it just goes to show that they aren't always primarily concerned with compatibility ...
     
  9. Kugai Calo

    Regular

    Joined:
    Mar 6, 2020
    Messages:
    309
    Likes Received:
    361
    Location:
    The Prairies
    It's not about the API, but the GPUs' instruction set architecture. Newer Nvidia μarchs' ISAs look drastically different from Maxwell's; to maintain backward compatibility they'll have to recompile all the shader code. If they can get all developers to do it (even for titles released way back) then it should be easy; otherwise they'll need a binary translation solution, an endeavour that Nintendo might not be bothered to (pay Nvidia to) take on.

    RDNA 2 GPUs are binary compatible with GCN, and the two ISAs also look very similar; think of it like x64 processors being able to run x86 code. But for Nvidia it's different.
     
  10. orangpelupa

    orangpelupa Elite Bug Hunter
    Legend

    Joined:
    Oct 14, 2008
    Messages:
    10,466
    Likes Received:
    3,188


    You need to click through to see the tweet chain:

    twitter.com/hexkyz/status/1348007153643130881?lang=en

    This info about the Switch OLED had been out for many months.

    So it seems it has more efficient memory (10nm) and a better HDMI-out chip (supports 4K upscaling).
     
  11. JasonLD

    Regular

    Joined:
    Apr 3, 2004
    Messages:
    463
    Likes Received:
    105
    Nintendo has at least kept backward compatibility going for two generations at a time (Game Boy and Game Boy Advance, DS and 3DS, and Wii and Wii U). I fully expect they will maintain full backward compatibility on the Switch's successor as well, especially given the success they had with the Switch.
     
    milk and pharma like this.
  12. JasonLD

    Regular

    Joined:
    Apr 3, 2004
    Messages:
    463
    Likes Received:
    105
    This doesn't sound like anything other than your own assumptions.
     
  13. Well these guys were spot on back in January.
    The HDMI out @4K is supported by the header, but the console's maximum output is still 1080p60.


    But a new "Switch 2" would probably bring in more ARM cores, which could do that work with trivial amounts of power consumption.
    Or, knowing Nintendo, I wouldn't be surprised if a new SoC simply carried over the two Maxwell SMs to ensure compatibility; for new games they could be used as a dedicated ML upscaler and/or sound processor. At 1GHz they'd have 1 TFLOPS FP16, which could be plenty for CUDA-driven DLSS.
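For what it's worth, the napkin math behind that figure (assuming the X1's 2 Maxwell SMs at 128 CUDA cores each, an FMA counting as 2 FLOPs, and Maxwell's double-rate FP16 path):

```python
# Back-of-the-envelope FP16 throughput for two Maxwell SMs at 1 GHz.
sms, cores_per_sm, flops_per_fma, fp16_rate, clock_hz = 2, 128, 2, 2, 1e9
fp16_flops = sms * cores_per_sm * flops_per_fma * fp16_rate * clock_hz
print(fp16_flops / 1e12)  # ~1.02 TFLOPS FP16
```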


    Assuming there even is a backwards-compatible Switch 2 with another Nvidia SoC in the works.
    If Nvidia's GPU architectures are indeed ISA-incompatible and Nintendo would need to emulate the GPU instructions, then maybe they're better off with a Samsung + RDNA2 chip (as in, a much cheaper deal than Nvidia).
     
    eastmen likes this.
  14. Lurkmass

    Regular

    Joined:
    Mar 3, 2020
    Messages:
    565
    Likes Received:
    711
    Whatever their decision is, the only complete solutions for compatibility are either including the old chip itself or keeping/extending the architecture ...

    One outcome would involve redundant die space being taken up, which would bloat the design of the system, and the other outcome miraculously requires Nvidia to follow a model they don't like ...

    All hope of retaining compatibility while moving to a different architecture is dead, since NVN is lower level than CUDA. If NVN supported ingesting PTX (forward-compatible ISA) binaries instead of SASS (native ISA) binaries, then that scenario would've been possible, but Nintendo refused this option ...
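A toy model of that PTX-versus-SASS trade-off (the loader and the title dicts are made up for illustration; the behaviour, PTX being JIT-compilable for any later architecture while SASS only loads on its own compute capability, matches how CUDA describes the two formats):

```python
# Shipping PTX (forward-compatible IR): the driver can JIT it for any future
# architecture. Shipping only SASS (native machine code for one compute
# capability): it loads nowhere else without translation or emulation.

def load_shader(binary: dict, device_arch: str) -> str:
    if binary["kind"] == "ptx":
        return f"JIT-compiled PTX for {device_arch}"   # works on any arch
    if binary["arch"] == device_arch:
        return "loaded native SASS directly"           # fast path, same arch only
    raise RuntimeError("SASS arch mismatch: needs binary translation/emulation")

ptx_title = {"kind": "ptx"}
nvn_title = {"kind": "sass", "arch": "sm_53"}

print(load_shader(ptx_title, "sm_87"))   # fine on a future chip
print(load_shader(nvn_title, "sm_53"))   # fine on the original Switch
# load_shader(nvn_title, "sm_87") would raise: no forward compatibility
```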
     
  15. JasonLD

    Regular

    Joined:
    Apr 3, 2004
    Messages:
    463
    Likes Received:
    105
    Nintendo has been doing exactly that with the Wii U, the Wii, and their previous-generation handheld consoles starting with the GBA. Whatever method Nintendo ends up using to retain Switch backward compatibility, it will definitely have it.
     
  16. The Wii U had the ArtX GPU inside?
     
  17. JasonLD

    Regular

    Joined:
    Apr 3, 2004
    Messages:
    463
    Likes Received:
    105
    Yes, the Wii U has the same GPU that was inside the Wii for compatibility mode.
    Well, more accurately, the Wii U's GPU has the Wii GPU integrated lol.
     
    #117 JasonLD, Jul 9, 2021
    Last edited: Jul 10, 2021
    Frenetic Pony likes this.
  18. orangpelupa

    orangpelupa Elite Bug Hunter
    Legend

    Joined:
    Oct 14, 2008
    Messages:
    10,466
    Likes Received:
    3,188
    There's even a meme where a Wii is two GameCubes taped together, and a Wii U is four GameCubes taped together.
     
    Well, if the meme was trying to mock the Wii's processing power, then it was actually too generous.
    The Wii is literally a GameCube with 50% higher clocks and one slow single-channel 64MB GDDR3 chip.
     
    milk likes this.
  20. Tagrineth

    Tagrineth murr
    Veteran

    Joined:
    Feb 14, 2002
    Messages:
    2,537
    Likes Received:
    25
    Location:
    Sunny (boring) Florida
    Yes and no; IIRC the Wii's GPU features TEV on all four pipelines, where the GameCube's only supported one TEV. Not a monumental difference, but it definitely does add *something* besides just faster clocks.
     
    milk likes this.