Nintendo Switch Tech Speculation discussion

Status
Not open for further replies.
If people are going to go to the trouble of creating comparisons, why don't they do a good job?! Is this loading from cart or download? Does the save game size/position affect load (the initial comparison where Switch won was in a different place to the Wii U - was the save file much smaller? Or the environment more data intensive?). A video that answers none of these questions...
 
I'm impressed that DF's John L. managed to get through that whole video without mentioning Wipeout a single time. I wouldn't have been able to do that :p Nevertheless, it's promising that the Switch is capable of stable 60 fps performance and should generally be able to at least beat the Wii U even in handheld mode. Kids really don't find graphics that important - my son has a great time playing SSX On Tour on the ... PS2!
 
What are cart load speeds like?

Pretty good. The longest load time I've had in Zelda was around 10 seconds but most fast travel or death/re-load is around 5 seconds. There's no actual loading in game.

Reflecting on my past few days of using the Switch to play Zelda, it's also stable. I've had no problems with it. I initiated a background download of the only demo on the eShop yesterday and it was a much nicer and quicker experience than initiating a download on PS4/PSN. Also, the screen capture feature is immediate, i.e. you press the capture button and an icon version of the screen pops up in the corner, then fades. On PS4/Pro the notification, mind-bogglingly, can take many, many seconds to appear, leaving you wondering if you pressed it or not. What's up with that, Sony!?!

If people are going to go to the trouble of creating comparisons, why don't they do a good job?! Is this loading from cart or download?

In the first sentence he says they are the physical copies.

Does the save game size/position affect load (the initial comparison where Switch won was in a different place to the Wii U - was the save file much smaller?

I have zero idea how big Zelda save files are, but I can tell you that Zelda is consuming 216 MB of the internal 32 GB of flash, and most of that will be the day 1 patch. Screenshots by default get saved to the external SD card.
 
How long do you guys think it will be before we start seeing replacement parts pop up? You can buy replacement screens, triggers, sliders, L/R buttons, etc. for the 3DS. Do you think you'll be able to get those for the Switch? I ask because I'm a paranoid guy; if my screen ever gets REALLY fucked somehow, I hope I can replace it. lol
 
The teardowns paint it as a very user serviceable device. I expect replacement parts to be available - there's already a demand for replacement screens for ones that got scratched. :p
 
So how much more powerful is this thing than a Wii U? I know it has new architecture that's better than the Wii U's, as well as the ability to support Vulkan. Do you guys think we're going to see some surprising games for it in the future? But I'm curious in terms of how much more powerful than Wii U, and how close to XB1 it is. I've heard some say that it's about 1/3 or 1/4 of an XB1, which would kind of be impressive considering how small the thing is.
 
So how much more powerful is this thing than a Wii U? I know it has new architecture that's better than the Wii U's, as well as the ability to support Vulkan. Do you guys think we're going to see some surprising games for it in the future? But I'm curious in terms of how much more powerful than Wii U, and how close to XB1 it is. I've heard some say that it's about 1/3 or 1/4 of an XB1, which would kind of be impressive considering how small the thing is.
Some links:
Switch vs WiiU: http://www.eurogamer.net/articles/digitalfoundry-2017-fast-rmx-showcases-switches-power-over-wii-u
Switch vs PS4 vs Vita: http://www.eurogamer.net/articles/digitalfoundry-2017-dragon-quest-heroes-2-switch-vs-ps4-comparison
Switch power consumption: http://www.anandtech.com/show/11181/a-look-at-nintendo-switch-power-consumption
Xbox One S power consumption:

Switch (GPU performance) seems to be slightly ahead of last gen consoles (Xbox 360, PS3, WiiU) when handheld. When docked it is roughly 2x last gen. Last gen games were mostly 720p, meaning that the handheld image quality of Switch (720p screen) should also be slightly ahead. Docked IQ is the same, but rendered at 900p or 1080p. GPU performance doubles, but memory bandwidth only gets a minor boost when docked. This is a bit similar to PS4 -> PS4 Pro.

We need more cross platform games to draw a final conclusion about Switch vs Xbox One. 1/3 seems to be a pretty good estimate for docked mode. Switch has 4 GB of memory. This is much more than Xbox 360 or PS3 (both had 512 MB). WiiU had 2 GB, but the OS used half of it (1 GB usable). Switch's Nvidia Maxwell GPU also has all the same features as current gen consoles and DX12.1 PCs. Switch has a modern OoO CPU with cache prefetchers. Down-porting games to Switch should be significantly easier than current gen -> last gen.

Switch power consumption when docked (11W) is roughly one fifth of the Xbox One S (58W). Both chips are 20nm. Handheld power efficiency is harder to compare against home consoles, because handheld power consumption includes the screen. Handheld battery life beats tablets in gaming: battery life in Modern Combat 5 or Asphalt 8 is only 1.5h - 2h on high end tablets. Rendering resolution differs, however, meaning that a direct efficiency comparison can't be made.
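Putting these figures together (a rough sketch: the ~1/3 performance ratio is the thread's estimate, not a measurement), the implied perf/W gap works out like this:

```python
# Rough perf-per-watt comparison using the thread's own figures.
# All numbers are approximations quoted in the discussion, not measurements.

switch_power_w = 11.0     # Switch docked, at the wall
x1s_power_w = 58.0        # Xbox One S under gaming load
relative_perf = 1 / 3     # Switch docked ~1/3 of Xbox One performance (estimate)

power_ratio = x1s_power_w / switch_power_w          # how much less power Switch draws
perf_per_watt_ratio = relative_perf * power_ratio   # implied perf/W advantage

print(f"Switch draws {power_ratio:.1f}x less power")          # ~5.3x
print(f"Implied perf/W advantage: {perf_per_watt_ratio:.1f}x")  # ~1.8x
```

So even on these crude numbers, the Switch's perf/W lead over the X1S is under 2x, which is part of why the process-node comparison below matters.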
 
Switch power consumption:
Anand's numbers look weird to me; they don't add up to the battery capacity. With these numbers, a console with a 16 Wh battery should last 16/7.1 = 2.25 hours at min display brightness and 1.8 hours at max, which clearly doesn't match the 3 hours 5 minutes (min brightness) and 2 hours 30 minutes (max brightness) measured by Eurogamer and other sites in Zelda. Am I missing something?
 
Anand's numbers look weird to me; they don't add up to the battery capacity. With these numbers, a console with a 16 Wh battery should last 16/7.1 = 2.25 hours at min display brightness and 1.8 hours at max, which clearly doesn't match the 3 hours 5 minutes (min brightness) and 2 hours 30 minutes (max brightness) measured by Eurogamer and other sites in Zelda. Am I missing something?

AT is measuring at the USB port, so it incorporates losses from the USB power regulation, which may be rather high when the input is 15V. There may also be minor loss due to the measurement device itself, although it's possible that it isn't factored into its measurement - at any rate I can't find a user manual/datasheet for it, so I don't think it can be determined either way.

I don't know how the Switch's power circuitry is designed but it'd make sense if the USB power is dropped down to a voltage vaguely around what the battery outputs, that is suitable for powering the main PMIC that the battery would power directly.

The Eurogamer numbers you gave would equate to about 5.2W at min brightness and 6.4W at max. That's about 73% and 72% of the AT numbers respectively, so the loss is pretty close to constant, which is a good sign (in practice power supply losses do vary a bit with load, so it's not totally linear). From looking around at a few datasheets this appears to be in the ballpark of what you'd expect from a switching DC/DC 15V to 3.3V regulator.
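This arithmetic is easy to reproduce. A minimal sketch, assuming the 16 Wh battery figure and the Eurogamer runtimes quoted above; AT's max-brightness reading is back-derived from the "1.8 hours" figure in the quoted post:

```python
# Cross-check AT's USB-port power readings against measured battery life
# in Zelda. Figures are the ones quoted in the thread; all approximate.

battery_wh = 16.0

# name: (measured runtime in hours, AT reading at the USB port in watts)
cases = {
    "min brightness": (3 + 5 / 60, 7.1),
    "max brightness": (2.5, 8.9),   # 8.9W back-derived from 16 Wh / 1.8 h
}

for name, (hours, at_watts) in cases.items():
    draw_w = battery_wh / hours      # average draw actually taken from the battery
    efficiency = draw_w / at_watts   # fraction of USB-port power reaching the system
    print(f"{name}: ~{draw_w:.1f}W from battery, ~{efficiency:.0%} of AT's reading")
```

The missing ~27-28% is the roughly constant gap attributed here mostly to the 15V-to-battery-voltage regulation loss.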
 
Some links:
Switch vs WiiU: http://www.eurogamer.net/articles/digitalfoundry-2017-fast-rmx-showcases-switches-power-over-wii-u
Switch vs PS4 vs Vita: http://www.eurogamer.net/articles/digitalfoundry-2017-dragon-quest-heroes-2-switch-vs-ps4-comparison
Switch power consumption: http://www.anandtech.com/show/11181/a-look-at-nintendo-switch-power-consumption
Xbox One S power consumption:

Switch (GPU performance) seems to be slightly ahead of last gen consoles (Xbox 360, PS3, WiiU) when handheld. When docked it is roughly 2x last gen. Last gen games were mostly 720p, meaning that the handheld image quality of Switch (720p screen) should also be slightly ahead. Docked IQ is the same, but rendered at 900p or 1080p. GPU performance doubles, but memory bandwidth only gets a minor boost when docked. This is a bit similar to PS4 -> PS4 Pro.

Wii U's GPU was the best (not in flops but in efficiency) and PS3's was the worst. In handheld mode the Switch is at least as good as the Wii U, and docked speed is 2.25x handheld mode, so Switch would at minimum be 2.25x Wii U's GPU with a newer feature set.

I agree that 1/3 Xbox one seems to be a decent rough estimate.
 
Switch power consumption when docked (11W) is roughly one fifth of the Xbox One S (58W). Both chips are 20nm. Handheld power efficiency is harder to compare against home consoles, because handheld power consumption includes the screen. Handheld battery life beats tablets in gaming: battery life in Modern Combat 5 or Asphalt 8 is only 1.5h - 2h on high end tablets. Rendering resolution differs, however, meaning that a direct efficiency comparison can't be made.

X1S / PS4 Slim & Pro are all TSMC 16 nm FF, and so on a more advanced process than the Switch, though for X1S and particularly PS4Pro they're pushing frequency towards the upper part of the efficient range.

Here's confirmation of the X1S being 16nm FF: http://www.eurogamer.net/articles/digitalfoundry-2016-inside-xbox-one-s-tech-interview

AT is measuring at the USB port, so it incorporates losses from the USB power regulation, which may be rather high when the input is 15V. There may also be minor loss due to the measurement device itself, although it's possible that it isn't factored into its measurement - at any rate I can't find a user manual/datasheet for it, so I don't think it can be determined either way.

I don't know how the Switch's power circuitry is designed but it'd make sense if the USB power is dropped down to a voltage vaguely around what the battery outputs, that is suitable for powering the main PMIC that the battery would power directly.

The Eurogamer numbers you gave would equate to about 5.2W at min brightness and 6.4W at max. That's about 73% and 72% of the AT numbers respectively, so the loss is pretty close to constant, which is a good sign (in practice power supply losses do vary a bit with load, so it's not totally linear). From looking around at a few datasheets this appears to be in the ballpark of what you'd expect from a switching DC/DC 15V to 3.3V regulator.

Earlier in this thread someone used AT's figures to say that TX2 would basically fit within the Switch's power envelope when configured for a 7.5W TDP.

It's interesting to see just how much power the switching regulator could be consuming. The battery is probably still on trickle charge even when full, so that may be a bit of extra juice too.

At 5.2W minimum brightness for the entire system, it's clear that anything like the TX2's 7.5W configuration would have been a no-go, even if TX2 had fit Nintendo's timescale (which it clearly didn't). Significant downclocks would erode the point of using the more expensive TX2, and for mobile mode would make the value of the 128-bit bus in terms of both $$ and power much lower, to the point where it may become a liability.

Will be interesting to see where TX2 turns up, and where it doesn't.
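The power-budget argument can be made concrete. A sketch using the figures from this post: ~5.2W is the whole-system min-brightness average (screen included), and 7.5W is the TX2 TDP configuration quoted above:

```python
# Why a 7.5W-TDP TX2 configuration wouldn't fit the handheld power budget.
# 5.2W is the whole-system average at min brightness, screen included,
# so the SoC's actual share of that budget is smaller still.

system_draw_w = 5.2   # entire handheld system at min brightness
tx2_tdp_w = 7.5       # TX2 configured for the 7.5W TDP quoted in the thread

deficit_w = tx2_tdp_w - system_draw_w
print(f"SoC TDP alone overshoots the whole-system budget by {deficit_w:.1f}W")
```

In other words, the SoC TDP alone exceeds what the entire handheld draws, before counting the screen, memory, or radios.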
 
Wii U's GPU was the best (not in flops but in efficiency) and PS3's was the worst.

How do you figure this? What are you comparing with?

It's hard to really estimate because we don't have GPU-only power consumption numbers for any console that I'm aware of, or any methodology that tries to determine this based on varying workloads. But in terms of overall system perf/W Wii U is surely behind both the original XB1 and PS4, and way behind XB1S and PS4Pro. Or maybe you're defining efficiency as simply the least power consumption regardless of performance?

Earlier in this thread someone used AT's figures to say that TX2 would basically fit within the Switch's power envelope when configured for a 7.5W TDP.

It's interesting to see just how much power the switching regulator could be consuming. The battery is probably still on trickle charge even when full, so that may be a bit of extra juice too.

At 5.2W minimum brightness for the entire system, it's clear that anything like the TX2's 7.5W configuration would have been a no-go, even if TX2 had fit Nintendo's timescale (which it clearly didn't). Significant downclocks would erode the point of using the more expensive TX2, and for mobile mode would make the value of the 128-bit bus in terms of both $$ and power much lower, to the point where it may become a liability.

Will be interesting to see where TX2 turns up, and where it doesn't.

Playing devil's advocate a bit, but I don't agree that downclocks would erode the point of using the TX2, although this completely sidesteps the question of whether or not the extra price would be worth it (which is pretty much unknown to us). Switch is very heavily power limited in handheld mode (and decently in docked mode). Just a straight decrease in power consumption in handheld mode would have been a pretty big deal; a ~3 hour battery life is pretty bad. On the flip side, the 128-bit memory interface (downclocked by about half in handheld mode to come closer to the 64-bit power consumption) would have helped keep performance from getting totally hamstrung in docked mode.

The interesting question is exactly what Nintendo would have done with the two Denver cores, and whether developers would find such a wildly heterogeneous setup desirable. I guess it really depends on their efficiency. If the two Denvers tend to supply better performance than three A57s at less power, which is at least within the realm of possibility, then they'd probably be a win. The OS functions could then stay on the A57 cluster, using as little clock speed and as few powered cores as it can get away with.
 
How do you figure this? What are you comparing with?

It's hard to really estimate because we don't have GPU-only power consumption numbers for any console that I'm aware of, or any methodology that tries to determine this based on varying workloads. But in terms of overall system perf/W Wii U is surely behind both the original XB1 and PS4, and way behind XB1S and PS4Pro. Or maybe you're defining efficiency as simply the least power consumption regardless of performance?

Perhaps phoenix meant in terms of theoretical flops -> real world performance? You could probably make a strong case for that, at least going by multiplatform games.
 