Nintendo Switch Tech Speculation discussion

Status
Not open for further replies.
Not to troll, but I honestly believe BotW is technically severely overrated; it's probably the worst-performing first-party Nintendo title on any platform since the N64. Sorry, but based on the graphics, the title should be locked at 60fps, or 30fps with AA or a 1080p framebuffer.
Switch probably won't get a Zelda game if they can delay it until "Switch 2" or "Switch U", but expect it to look like the 2011 Wii U demo at 1080p/60.
 
Based on the hardware it's running on, it is pretty much the pinnacle of what the Wii U can do. This is a very large open-world game; not many open-world games from the 360/PS3 generation can claim to be technically superior. GTA V is the only one I can think of. Even Digital Foundry gave Zelda a lot of respect in their E3 analysis.

 
This actually paints a pretty good picture of Maxwell's bandwidth-saving techniques. In portable mode, with only 22GB/s of memory bandwidth, Zelda BotW runs at a near-locked 30fps. The Wii U had a minimum of 35GB/s from the eDRAM, plus another 12.8GB/s from main memory. The fact that the Tegra X1 is able to run BotW at 720p with only 22GB/s of bandwidth is pretty impressive. The jump to 900p seems to overwhelm the 25.6GB/s, and the double-buffered vsync magnifies the issue.

I honestly doubt that the frame time is going much over 33ms, seeing as I have played long sessions where the frame rate stays solid for good chunks of time. The question is whether a triple-buffered vsync would be better or worse: it causes uneven frame pacing that could be perceived as even worse than a drop to 20fps. It's definitely "framey" when it dips, but it is consistent. When these drops occur, they don't seem to last very long; I would say it's rare for them to last more than a few seconds, with only a few areas that seem to have more sustained issues.
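To put a number on the vsync point: under double-buffered vsync on a 60Hz display, a frame that misses the 33.3ms deadline is held until the next vblank, so the displayed rate snaps straight from 30fps to 20fps with nothing in between. A minimal sketch (the frame times below are assumed for illustration, not measured):

```python
import math

def displayed_fps(frame_ms, refresh_hz=60):
    """Frame rate shown under double-buffered vsync: a finished frame
    is held until the next vblank, so fps = refresh / vblanks used."""
    vblank_ms = 1000 / refresh_hz
    intervals = math.ceil(frame_ms / vblank_ms)  # vblanks consumed per frame
    return refresh_hz / intervals

print(displayed_fps(33.0))  # 30.0 -- just makes the two-vblank budget
print(displayed_fps(34.0))  # 20.0 -- misses it slightly, drops a full tier
print(displayed_fps(51.0))  # 15.0 -- next tier down
```

Triple buffering would avoid that hard quantization, but, as noted above, at the cost of uneven frame pacing and extra latency.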

It does, but I don't think it was totally unexpected judging by what GM107, a 1TFLOP+ GPU, was capable of. I'm not impressed at all, and I wonder if computing power is more of a bottleneck in the Switch than memory bandwidth at 720p (900p is another story, of course, but by then compute power may already fall so far short of the mark that discussing memory bandwidth ends up being merely academic).
 
Not to troll, but I honestly believe BotW is technically severely overrated; it's probably the worst-performing first-party Nintendo title on any platform since the N64. Sorry, but based on the graphics, the title should be locked at 60fps, or 30fps with AA or a 1080p framebuffer.
Switch probably won't get a Zelda game if they can delay it until "Switch 2" or "Switch U", but expect it to look like the 2011 Wii U demo at 1080p/60.


I think BotW is a "beautiful" game in this 8th gen in the same sense that Twilight Princess was a beautiful game for the Wii during the 7th gen.
It's not a major technical achievement by any possible measure, but the assets are so carefully crafted that the end result is really good.

That said, I really don't see anything in BotW that makes me think it would be impossible to do on a PS3 or X360. The grass does look great, but given how simple everything else is, it isn't anything other games haven't done in the previous gen.



In the end, BotW looks like a great game unfortunately stuck on a mediocre platform compared to today's state of the art (both in docked mode and mobile, to be honest).
 
Anandtech on Switch's power consumption: http://www.anandtech.com/show/11181/a-look-at-nintendo-switch-power-consumption/2

Power consumption is 11W when docked; only when charging at the same time does it reach 16W. This is more in line with what could be expected vs. the Shield Android TV in terms of perf-per-watt.

It's also interesting that power is not much lower when undocked, and the culprit seems to be the screen, to a disproportionate degree, much more than I would have anticipated. There's almost a 2W difference between min and max brightness, so it's probably drawing at least another 2W even at min brightness.
 
The screen is a big issue; it's the number one battery killer in your phone, especially as you get into Note-sized screens. The super high resolutions don't help either.
 

I find it interesting how much the SoC is actually using at these clocks. At min brightness the Switch draws 7.1W, so we are talking at least a few watts for the SoC in undocked mode. Add 5W+ for the SoC when docked (according to that article).

It's quite a jump from what we have seen from Nintendo SoCs in the past. The original 3DS packed a 5Wh battery, with battery life of 3-5 hours. Take away the dual screens and wifi and the 3DS SoC was <1W. I think I estimated the Vita SoC at 1-2W max. It's no wonder they've gone with a fan for the Switch. I am curious as to how such low clocks in the Switch have led to the SoC using this much power. Are A57s really that bad?
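Putting the Anandtech figures into a rough budget makes the jump concrete. A sketch in Python (the measured total is from the article; the per-component splits are my guesses for illustration only):

```python
# Rough power budget from the measured 7.1 W undocked draw at min
# brightness. Screen and "misc" allocations are assumptions, not data.
total_undocked_w = 7.1   # measured by Anandtech, min brightness
screen_min_w     = 2.0   # assumed: panel + backlight floor
misc_w           = 1.0   # assumed: Wi-Fi, RAM, storage, regulators

soc_est_w = total_undocked_w - screen_min_w - misc_w
print(f"SoC estimate, undocked: ~{soc_est_w:.1f} W")

# For scale: the original 3DS ran 3-5 hours on a 5 Wh battery,
# i.e. roughly 1.0-1.7 W total system draw.
battery_wh = 5.0
print(f"3DS total draw range: {battery_wh/5:.1f}-{battery_wh/3:.1f} W")
```

Even with generous allocations to the screen and the rest of the system, the SoC alone lands at several watts undocked, several times the entire 3DS system budget.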
 
Yes, A57 cores were always known for being power hungry. They were a big step up in performance from the A15 cores, but it came with a price: high power draw.

 
Not to troll, but I honestly believe BotW is technically severely overrated; it's probably the worst-performing first-party Nintendo title on any platform since the N64. Sorry, but based on the graphics, the title should be locked at 60fps, or 30fps with AA or a 1080p framebuffer.
Switch probably won't get a Zelda game if they can delay it until "Switch 2" or "Switch U", but expect it to look like the 2011 Wii U demo at 1080p/60.

I think it's pretty impressive when you consider all of the physics at play. If there were no use of physics, I'd agree with you.
 
Yes, A57 cores were always known for being power hungry. They were a big step up in performance from the A15 cores, but it came with a price: high power draw.

Interesting that they went down this path then. I suppose it speaks to how good of a deal this must have been for Nintendo. It was worth the extra materials cost (fan, copper heatpipe, heatsink) as well as potentially needing a larger battery.
 

CPU and GPU performance doesn't come cheap WRT power. ARM has picked most of the low-hanging fruit WRT perf/watt and is starting to run into the same situation as Intel, where it's getting increasingly difficult to increase performance without also increasing power consumption.

They could have used a more power-efficient SoC, but then it'd also be less powerful. They could have used a more powerful SoC, but then it'd be a lot more power hungry. IMO, while I'm not a fan of Tegra, for what Nintendo wanted from the Switch, I think it was a good compromise between performance and power efficiency.

Regards,
SB
 
A15 was the big step up in power consumption and performance. They were so much faster than the A9 SoCs; night and day. They still feel pretty modern today, actually, whereas A9 hardware is rather annoying to use. But this was also the beginning of tablets that ran truly hot, and throttling became a problem. Just go read some reviews of Tegra 4.

I think the Switch is pretty nice. It was sloppy of Nintendo not to get Zelda working well at 1080p, though. It is entirely their fault for not tuning the game for that configuration.

I somewhat want to say I would've preferred Nvidia release a new Shield Tablet, but Android gaming is a sad joke, so maybe I don't actually care. ;)
 
Are A57s really that bad?

If we go by Qualcomm's implementations, yes, they are.

A possible exercise is to look at benchmark, temperature and throttling comparisons between phones with the Snapdragon 808 (2*A57 1.8GHz + 4*A53 1.44GHz, 20nm) and the Snapdragon 650 (2*A72 1.8GHz + 4*A53 1.4GHz, 28nm).

See for example CPU benchmark results for the Redmi Note 3 "Pro" and Xperia X / X Compact (with S650), and compare with the LG G4 and Nexus 5X (S808):
http://www.gsmarena.com/xiaomi_redmi_note_3_snapdragon-review-1477p5.php
http://www.gsmarena.com/sony_xperia_x-review-1441p5.php

And then consider that Snapdragon 650 devices get substantially better thermal performance and/or throttle significantly less than the 808 ones.

http://www.notebookcheck.net/Sony-Xperia-X-Smartphone-Review.170397.0.html
http://www.notebookcheck.net/Sony-Xperia-X-Compact-Smartphone-Review.181392.0.html
http://www.notebookcheck.net/LG-G4-Smartphone-Review.144672.0.html
http://www.notebookcheck.net/Google-Nexus-5X-Smartphone-Review.155885.0.html


All this, despite the 808 having a significant manufacturing process advantage over the 650.

But these aren't really surprising findings. ARM themselves acknowledge how much better the A72 cores are than the A57:

[Image: ARM slide comparing Cortex-A72 performance and power against the A57]

And even the A72 is "old" now. The A73 cores that have been in production SoCs for half a year show significant performance-per-watt gains over the A72:


[Image: chart of Cortex-A73 performance-per-watt gains over the A72]
 
Moving to A73 cores would have required Nvidia/Nintendo to pay to license the A73 cores and truly customize the X1 processor, increasing cost significantly. The Tegra X1 fit the requirements and was very cost-effective; Nintendo could have spent lots of $$$ on a custom processor that was only moderately more capable than a stock Tegra X1. The bottom line is that there isn't a mobile-oriented SoC that would have substantially changed the positioning of the Switch. It was never going to be an Xbox One in the form factor that it's in; look at how much power the Xbox One S draws, and it's using 16nm FinFET. Currently, there is no mobile SoC on the market in a $300-or-less device that trumps the X1 in performance. If there is, please link the device, but I haven't seen it.
 
Nintendo needed low BOM, decent margins and a comprehensive technology partner far more than they needed faster CPU cores.

And Nvidia didn't have another off the shelf chip for them at the right point in time anyway.

In all probability, Nintendo made the best choice.
 
A73 is a narrower (2-wide) out-of-order design than the 3-wide A72, I thought, hence the efficiency argument vs. A72.

I don't know what that translates to in real-world applications/games though...
 
CPU and GPU performance doesn't come cheap WRT power. ARM has picked most of the low-hanging fruit WRT perf/watt and is starting to run into the same situation as Intel, where it's getting increasingly difficult to increase performance without also increasing power consumption.

They could have used a more power-efficient SoC, but then it'd also be less powerful. They could have used a more powerful SoC, but then it'd be a lot more power hungry. IMO, while I'm not a fan of Tegra, for what Nintendo wanted from the Switch, I think it was a good compromise between performance and power efficiency.

Regards,
SB
Just to add to that: one thing I think most people have missed is that much of the progress in the mobile space over the last 8 years or so has come from min-maxing the hell out of chips. New process nodes have improved things quite a bit, of course, but dynamic power hasn't improved by as much as you'd think. The reason your iPad doesn't double as a hotplate is that Apple offsets the high power usage under load with amazingly low power usage at idle. And most workloads are bursty, allowing the SoC to spend most of its time in deep idle.

The Switch doesn't get that luxury. If it's on, it's almost always running at full-tilt. So Nintendo can't min-max their way out of this. The dynamic power of the chip under load is for all practical purposes its average power consumption.

I do wish Nintendo had gone with a 16FF chip, but oh well. Regardless, a passively cooled handheld on the level of the Vita/3DS would have to be a lot lower performance. Probably a Cortex-A17 and something akin to a PowerVR GT7200 or GT7400?
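The burst-vs-sustained point can be put in numbers. A quick sketch (the wattages and duty cycles below are illustrative assumptions, not measurements of any real device):

```python
def avg_power_w(load_w, idle_w, duty_cycle):
    """Average draw when the SoC is at full load for duty_cycle of the
    time and in deep idle for the rest."""
    return load_w * duty_cycle + idle_w * (1 - duty_cycle)

# Phone/tablet under bursty UI work: in deep idle most of the time.
phone = avg_power_w(load_w=4.0, idle_w=0.1, duty_cycle=0.1)

# Switch while gaming: effectively pegged at full load the whole time.
switch = avg_power_w(load_w=4.0, idle_w=0.1, duty_cycle=1.0)

print(f"bursty:    {phone:.2f} W")   # well under 1 W average
print(f"sustained: {switch:.2f} W")  # equals the load power outright
```

Same silicon, same load power: the bursty device averages a fraction of a watt, while the sustained one averages its full load power, which is exactly why idle optimizations can't help the Switch while gaming.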
 