Nintendo Switch Tech Speculation discussion

Discussion in 'Console Technology' started by ToTTenTranz, Oct 20, 2016.

  1. sebbbi

    Veteran

    Joined:
    Nov 14, 2007
    Messages:
    2,924
    Likes Received:
    5,293
    Location:
    Helsinki, Finland
    256 CUDA cores = 512 flop/cycle (multiply-adds, fp32). At 1 GHz this is 0.5 TFLOP/s. Main memory is 4 GB and bandwidth is 25.6 GB/s.

    Xbox One GPU is 3x wider. 768 SIMD lanes = 1536 flop/cycle (multiply-adds, fp32). At 853 MHz this is 1.3 TFLOP/s. Main memory is 8 GB and bandwidth is 68 GB/s. Plus ESRAM of course.

    Xbox One raw performance and bandwidth are both roughly 2.5x. Even if we assume that Nvidia's GPU is more efficient, we are looking at roughly 2x difference.

    This comparison completely ignores the fast ESRAM memory on Xbox One. It will further increase Xbox One's bandwidth advantage. And Xbox One's main memory BW is already pretty low compared to the 176 GB/s of PS4. Double-rate 16-bit math will of course help the Nvidia GPU a bit, but it only increases ALU performance (in limited cases) and only helps in cases where you are not texture sampling bound (a 2:1 ALU:TEX rate makes you easily TEX bound) or memory bandwidth bound (at 25.6 GB/s you often are BW bound).

    If these specs are true, this console is not fast enough to run Xbox One 900p ports even at a reduced 720p resolution; quality also needs to be scaled down slightly. But these are rumoured specs, so hopefully the real specs are a bit higher. I would like to see a new Pascal-based GPU.
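
    As a rough sanity check, here is the same arithmetic as a small Python sketch (the Switch numbers are the rumoured ones, the Xbox One numbers the known ones):

    ```python
    # Back-of-the-envelope fp32 throughput comparison from the figures above.
    def gflops(simd_lanes, ghz):
        # A fused multiply-add counts as 2 flops per lane per cycle.
        return simd_lanes * 2 * ghz

    switch = gflops(256, 1.0)     # 512 GFLOP/s   (~0.5 TFLOP/s)
    xbone  = gflops(768, 0.853)   # ~1310 GFLOP/s (~1.3 TFLOP/s)

    print(f"ALU ratio: {xbone / switch:.2f}x")  # ~2.56x
    print(f"BW  ratio: {68.0 / 25.6:.2f}x")     # ~2.66x, ESRAM not counted
    ```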
     
    Michellstar, Heinrich4, DSoup and 6 others like this.
  2. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    42,878
    Likes Received:
    14,930
    Location:
    Under my bridge
    Are decent specs likely in a handheld? I'm doubting it. This'd have to be the most powerful handheld ever made, with the best silicon process and best battery tech to have any battery life, and that's completely unlike Nintendo. Most of the content shown is Wii U level stuff. I expect nVidia Shield class/PS360/Wii U level games, just on the go and at 1080p for TV.
     
    #42 Shifty Geezer, Oct 20, 2016
    Last edited: Oct 20, 2016
    Michellstar, swaaye and milk like this.
  3. Goodtwin

    Veteran Newcomer Subscriber

    Joined:
    Dec 23, 2013
    Messages:
    1,144
    Likes Received:
    608
    Anyone know how credible Zlatan is over at Anandtech? He seems to imply that memory bandwidth is the chip's biggest bottleneck relative to the Xbox One. He mentioned that overall the chip would be about 3x slower than the Xbox One, but also said the Tegra X1 is about 10x slower than the Xbox One. If he is right, that is a pretty big step up from the Tegra X1, and a very large step up from the Wii U. The video footage for Splatoon looked 1080p, with far less aliasing than on Wii U; same goes for Mario Kart. I think we are seeing a respectable upgrade over Wii U, but still well short of PS4/X1. It is a mobile unit after all; expecting X1/PS4 levels of performance from a mobile device was never realistic.
     
    RootKit likes this.
  4. I.S.T.

    Veteran

    Joined:
    Feb 21, 2004
    Messages:
    3,174
    Likes Received:
    389
    Any chance they'll run the thing at higher clocks when docked? I'm guessing no due to thermal constraints, but...
     
  5. Erinyes

    Regular

    Joined:
    Mar 25, 2010
    Messages:
    647
    Likes Received:
    94
    Regarding the bandwidth... wouldn't it be using a very similar dual-channel LPDDR4 setup to Parker's?
    The A57 is not as bad as you make it out to be. People are just hung up over Qualcomm's implementation of it in the S810.

    The A57 implementation in the E7420 was very respectable from both a performance and a power perspective.
    Yep... they'd do well to align with the direction Android is going in, IMHO. Two years down the line we'll see similar performance in phones.
     
  6. tongue_of_colicab

    Veteran

    Joined:
    Oct 7, 2004
    Messages:
    3,565
    Likes Received:
    750
    Location:
    Japan
    Wouldn't 4 GB of memory be too little for ports?

    There are already a bunch of phones with 6 GB of RAM, so it wouldn't be that strange to go for 6 or 8 GB in the Switch.
     
  7. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,814
    Likes Received:
    5,381
    I'm really hoping it's Pascal, because that would most probably mean it's being made on a FinFET process, though that guy does lose a bit of credibility when he claims the Tegra X1 is 8-10x slower than an Xbone. It's probably closer to 4x slower, at least in the form of the Shield TV.


    You're right, though the Cortex A72 is much better and it isn't that new either.
    Plus, I still think a LITTLE core would make a lot more sense for a console that requires continuous operation than one of the big cores, which are designed to sprint for short periods of time.
    For example, how much power would two Cortex A53 quad-core modules at 1.5GHz sip on FinFET?


    Skyrim is originally a PS360 game, so it could probably have run on the Wii U had there been demand for it.
    Although Skyrim is recognisable to many more people because it reached cultural-phenomenon status, I'd be much more reassured if they had shown Fallout 4 instead.

    In the end, you didn't see a single multiplatform port from the current generation in that video.

    If there's a fan (and there seems to be), they could technically turn the fan on only for docked operation, allowing for substantially higher clocks.

    Is Nintendo in the position of asking the devs for yet another performance target, on top of their already-a-bit-late-to-the-party console?
    I don't think so...


    We haven't seen any gaming-focused FinFET SoC. All the other FinFET SoC makers are filling much of the die with baseband processors, camera ISPs and huge CPU cores that need to run single-threaded JavaScript very fast.
    If we go by the TK1 and TX1, then imagining a 16FF+ SoC with a GPU that gets close to 1 TFLOP/s FP32 isn't that far-fetched. Of course, it would need 4 channels of the latest LPDDR4X (68 GB/s, I think?) to keep it from being bandwidth starved.

    It wouldn't be cheap (so it's probably not what's in the Switch), but I do think a tablet SoC that could nip at Durango's heels would be technically feasible nowadays.
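
    For what it's worth, here's a tiny Python sketch of where a figure like that 68 GB/s comes from, assuming four 32-bit LPDDR4X channels at 4266 MT/s (the channel layout and transfer rate are my assumptions, not part of the rumour):

    ```python
    # Peak DRAM bandwidth in GB/s = (bus width in bytes) * (data rate in GT/s).
    def peak_bw_gbs(bus_width_bits, mts):
        return (bus_width_bits / 8) * (mts / 1000)

    print(peak_bw_gbs(128, 4266))  # 4x 32-bit LPDDR4X @ 4266 MT/s -> ~68.3 GB/s
    print(peak_bw_gbs(64, 3200))   # 64-bit LPDDR4 @ 3200 MT/s -> 25.6 GB/s (rumoured Switch)
    ```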
     
    Cyan, Shifty Geezer and BRiT like this.
  8. mpg1

    Veteran Newcomer

    Joined:
    Mar 5, 2015
    Messages:
    1,847
    Likes Received:
    1,446
    "Or 30 fps at 3840×2160 pixels"

    So this thing will likely support 4K video output?
     
  9. Svensk Viking

    Regular

    Joined:
    Oct 11, 2009
    Messages:
    542
    Likes Received:
    112
    I'm no programmer and I guess I'm making a fool of myself now, but why not offer Vulkan and OpenGL on the Switch as well? I imagine Nintendo's own API can yield more performance, but if Vulkan/OpenGL were "good enough" for some games, I'd imagine Switch ports could get a piggyback ride alongside the PC versions.
     
  10. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,814
    Likes Received:
    5,381
    That's just the spec for HDMI 1.4.
    I doubt it'll ever output at 4K, since Netflix mandates HDMI 2.0 with HDCP 2.2 to run their 4K content.
     
    BRiT likes this.
  11. Pixel

    Veteran Regular

    Joined:
    Sep 16, 2013
    Messages:
    1,000
    Likes Received:
    467
    Memory bandwidth is incredibly low and will be the system's massive bottleneck considering the CUDA core count. Plus, with that small 2MB CPU cache, memory contention will be an additional real issue in games with a lot of CPU memory access. It compromises so much performance to achieve portability and form factor. I'm wondering if they could have thrown in some GDDR5 and downclocked it extremely aggressively to achieve that same bandwidth in "portable mode", running it at full speed while docked with a bigger chassis: passively cooled when mobile, with a fan in the docking station drawing air through some rear vent holes when docked.
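
    To put rough numbers on that idea, here is a quick Python sketch, assuming a 64-bit GDDR5 bus and a 7 Gbps stock pin rate (both assumptions, just typical GDDR5 values):

    ```python
    # How far would GDDR5 have to be downclocked to match the rumoured 25.6 GB/s?
    bus_bytes  = 64 / 8   # assumed 64-bit bus -> 8 bytes per transfer
    stock_gbps = 7.0      # assumed stock GDDR5 data rate per pin

    docked_bw    = bus_bytes * stock_gbps   # 56.0 GB/s at full speed
    portable_pin = 25.6 / bus_bytes         # 3.2 Gbps per pin in portable mode
    print(docked_bw, portable_pin, portable_pin / stock_gbps)  # 56.0 3.2 ~0.46
    ```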
     
    #51 Pixel, Oct 21, 2016
    Last edited: Oct 21, 2016
  12. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,567
    Likes Received:
    652
    Location:
    WI, USA
    What if it's a 64 ALU GPU? Pascal can be that small. ;) This is Nintendo we are talking about.
     
  13. mpg1

    Veteran Newcomer

    Joined:
    Mar 5, 2015
    Messages:
    1,847
    Likes Received:
    1,446
  14. Exophase

    Veteran

    Joined:
    Mar 25, 2010
    Messages:
    2,406
    Likes Received:
    430
    Location:
    Cleveland, OH
    I would say from what we know so far that nVidia's role exceeds the norm for a Nintendo hardware partner. With Wii U they licensed the GPU from AMD but put it in a rather esoteric chip made by a Japanese manufacturer, and tied to a separate IBM CPU chip on an MCM. This time around nVidia is doing the complete SoC and has put a lot of investment into the software stack, and is probably to thank for the level of third party support they're garnering.

    If they're deferring to nVidia this much it's possible that they're also allowing nVidia to be more aggressive than Nintendo has been in their technology choices.

    I don't expect some big boost in capability in the dock, though. Nintendo's revealed that the dock is for TV out and charging and is not "the system." I doubt they'd even bother including active cooling in it, and if they did, I doubt it'd make that huge of a difference.
     
    Goodtwin likes this.
  15. Allandor

    Regular Newcomer

    Joined:
    Oct 6, 2013
    Messages:
    366
    Likes Received:
    169
    Well, having Nvidia on board for a console has failed at least twice in history. That is one reason why MS and Sony won't use Nvidia technology in future consoles. Now Nvidia has caught Nintendo, and I hope Nvidia does it right this time.

    I really hope the Switch is capable of delivering an image upscaled to 4K. I really don't want extra input lag because a new TV's scaler kicks in. That's a must-have for a console coming out this year or later.
     
  16. DieH@rd

    Legend Veteran

    Joined:
    Sep 20, 2006
    Messages:
    6,201
    Likes Received:
    2,143
    Don't look too closely at the rendering quality shown in this video. There is a healthy chance that they've shown a lot of PR bullshots. In any case, if PS4/Xbone can deliver Skyrim remastered at 1080p60, the few-watt Switch will likely stick to 720p30.
     
    milk likes this.
  17. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    42,878
    Likes Received:
    14,930
    Location:
    Under my bridge
    At what battery life? Because that's the other key limiting factor beyond maximum power you can squeeze into a handheld. Shield TV can draw nearly 20 watts gaming.
     
  18. Rodéric

    Rodéric a.k.a. Ingenu
    Moderator Veteran

    Joined:
    Feb 6, 2002
    Messages:
    4,028
    Likes Received:
    886
    Location:
    Planet Earth.
    I'll say it, they should have gone PowerVR :p
     
    #58 Rodéric, Oct 21, 2016
    Last edited: Oct 21, 2016
    I.S.T., function, BRiT and 1 other person like this.
  19. sebbbi

    Veteran

    Joined:
    Nov 14, 2007
    Messages:
    2,924
    Likes Received:
    5,293
    Location:
    Helsinki, Finland
    If these specs are real, then iPad Pro (PowerVR) is clearly faster than the forthcoming Nintendo console. iPad Pro also has 51.2 GB/s memory bandwidth (2x 25.6 GB/s).
     
    Heinrich4 and BRiT like this.
  20. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,814
    Likes Received:
    5,381
    Those 20W were measured at the wall, and with a 2W USB3 drive on top of the box's own eMMC. So assuming 80% PSU efficiency, those 19.4W turn into 15.5W of actual consumption. Minus the external drive's 2W, that's 13.5W.
    Moreover, since it's always plugged in, the big A57 cores are constantly clocked at their maximum 2GHz.
    Swapping those big cores for A53s at e.g. 1.6GHz would save a handful of watts, so now we'd be at 10W? And then the FinFET transition should allow 2x the performance at the same consumption.
    That tablet is quite thick, so I think a 45Wh battery inside it wouldn't be an unreasonable fit. A SoC consuming between 8 and 10W, plus the screen (2-3W?), on a 45Wh battery would result in 3.5-5 hours of autonomy, which is actually the battery life the first Vita and the 3DS had.
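
    The same estimate as a quick Python sketch (the PSU efficiency, SoC draw and screen draw are the guesses above, not measurements):

    ```python
    # Shield TV wall draw -> estimated device draw -> hours on a 45 Wh pack.
    wall_w, psu_eff, usb_drive_w = 19.4, 0.80, 2.0  # efficiency is assumed

    device_w = wall_w * psu_eff - usb_drive_w
    print(f"device draw: {device_w:.1f} W")          # ~13.5 W

    battery_wh = 45.0
    for soc_w, screen_w in [(8.0, 2.0), (10.0, 3.0)]:
        print(f"{soc_w + screen_w:.0f} W -> {battery_wh / (soc_w + screen_w):.1f} h")
    # 10 W -> 4.5 h, 13 W -> 3.5 h
    ```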

    Again, this would be expensive.



    Nah, they definitely should've gone Qualcomm. The Adreno GPUs are kicking ass in terms of performance/area, at a very competitive performance/watt.
     
    Shifty Geezer likes this.