Nintendo Switch Tech Speculation discussion

No one is grasping. At least to me, this is just a light conversation about what could or could not be.
After looking at the release games, I've already said what I think the SoC most likely is at this point: a TX1 at those clocks, and that's it.
It doesn't stop me from thinking and discussing what it might be if the Foxconn leaker is right. I just happen to enjoy that. :)

Right there with you, so my apologies if my post came off as aggressive :D

Microsoft hasn't told the world how they cooked up a hot-swappable PCIe transport in the Surface Book they've had in the market for almost a year, either.
The idea of external PCI-Express ports for graphics is rather old. ATI had XGP, which even went to market back in 2009.
And once you find a way to redirect the necessary number of USB-C pins to carry 4× PCIe 3.0 lanes, it might not be all that hard to do what Microsoft is already doing with their own port.
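For a sense of what such a link would need to carry, here's a quick back-of-the-envelope calculation; the 8 GT/s rate and 128b/130b encoding are standard PCIe 3.0 figures, and the rest is just arithmetic:

```python
# Rough bandwidth check for 4 PCIe 3.0 lanes routed over a repurposed
# USB-C connector (the hypothetical external-GPU link discussed above).
RAW_GT_PER_S = 8.0    # PCIe 3.0: 8 GT/s per lane
ENCODING = 128 / 130  # 128b/130b line encoding overhead
LANES = 4

gbit_per_lane = RAW_GT_PER_S * ENCODING    # ~7.88 Gbit/s of payload per lane
total_gbyte_s = gbit_per_lane * LANES / 8  # ~3.94 GB/s per direction

print(f"PCIe 3.0 x{LANES}: ~{total_gbyte_s:.2f} GB/s each way")
```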

Interesting point, but it does seem odd to invest so much engineering in this one narrow aspect. Surface has always been a 'halo' product for MS, its existence largely a challenge to existing PC OEMs to stop racing to the bottom and aim for the high-end, Apple-style consumer. They have eaten a lot of cost to produce a device that is exceptional on several levels; if Nintendo has invested heavily for this one super-niche edge case, I'd be pretty alarmed. With all that engineering investment, I'd have been happier to see them put it into the main SoC.
But it wouldn't be the dumbest thing Nintendo did with the Switch, like the tablet missing cameras or microphones, and requiring a smartphone with an app installed to do voice chat for the games.

Honestly, this is why I love this thread even more than the Durango/Orbis ones: we not only have seemingly bizarre h/w rumors but genuinely mad business and design choices too!
 
I'm struggling to see how the clock speeds from the Foxconn leak mean A72 or A73. Couldn't they still be A57s manufactured on 16nm to get the power consumption down? The Switch seems to be packing a pretty hefty battery, and Nintendo says it can be sucked dry in 2.5 hours. Is there an equation for figuring out what the power draw would have to be in order to drain the battery in 2.5 hours?
 
But it wouldn't be the dumbest thing Nintendo did with the Switch, like the tablet missing cameras or microphones, and requiring a smartphone with an app installed to do voice chat for the games.

That particular piece is like "Xbox One launch Don Mattrick" level stupid.

"Fortunately we have a product for people who aren't able to get some form of a smartphone, it's called Wii U."
 
A Cortex A72 module on 16FF at 1785MHz shows very similar power consumption to a 20nm Cortex A57 at 1GHz, as you can see in the charts I posted. It's that simple.

I get what you and Syferz are saying here; I just don't think this means much of anything. I also don't think the power consumption from those two graphs is even that close, but it's not like this is some massive coincidence...

If Switch is using A72 and/or 16nm this should have been frozen for a long time, even if the silicon wasn't ready and they had to use an X1 as a lower fidelity approximation. This scenario brings into serious question why there'd ever be a 1GHz CPU specification, or why dev kits would be clocked that way. If Switch really does launch with games running the CPUs at 1.785GHz I would consider it more likely that the Eurogamer clock leaks were just wrong.
 
Is there an equation for figuring out what the power draw would have to be in order to drain the battery in 2.5 hours?
Absolutely. The Switch battery is 4310 mAh at 3.7 volts, so that makes it a ~16Wh battery. (Wh = Ah × V)

So you simply divide 16Wh by the number of hours to get the power draw.
16Wh / 2.5h = 6.4W
16Wh / 6h = 2.7W

That would be the power draw of everything in the device. I was guessing earlier about 2W for everything other than the SoC.

Max 4.4W SoC in portable mode?

Caveat: it depends on how the battery's capacity is specified. The raw charge should never go below 10% (or even 20%) or above 90%, and the charging circuit prevents this from happening. Sometimes manufacturers take this into account in the specs, sometimes not. We only know what's printed on the battery; I don't know if there's a rule for this.
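To make that arithmetic explicit, and to fold in the usable-capacity caveat, here's a minimal sketch; the 80% usable fraction is purely an illustrative assumption, per the caveat above:

```python
# Minimal sketch of the battery math above. Capacity and voltage are the
# figures printed on the Switch battery; the usable fraction is an
# illustrative assumption (the real charge-circuit cutoffs are unknown).
CAPACITY_MAH = 4310
VOLTAGE = 3.7

energy_wh = CAPACITY_MAH / 1000 * VOLTAGE  # Wh = Ah * V -> ~15.9 Wh

def avg_draw_watts(hours, usable_fraction=1.0):
    """Average system power draw that drains the battery in `hours`."""
    return energy_wh * usable_fraction / hours

for hours in (2.5, 6):
    full = avg_draw_watts(hours)
    derated = avg_draw_watts(hours, usable_fraction=0.8)
    print(f"{hours}h: {full:.1f} W ({derated:.1f} W if only 80% is usable)")
```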
 
I think it's known to be an IPS panel?
That'd be pretty affordable. In a modern $99 smartphone, you can have a 5" 720p IPS panel (with everything else pretty low end).
 
I think it's known to be an IPS panel?
That'd be pretty affordable. In a modern $99 smartphone, you can have a 5" 720p IPS panel (with everything else pretty low end).

I am a bit disappointed that they are still going with LCD, since it was reported almost a year ago that AMOLED production costs are lower than LCD's.
 
Are you saying that a 1060 at 1.66GHz could be used in a system drawing only 8W? NotebookCheck says the 1060M draws 80W, or more recently 70W: http://www.notebookcheck.net/Mobile...060-Laptop-Benchmarks-and-Specs.169547.0.html What thin-and-light laptops have one?
Yeah, I do not see any dGPU-type HW being used for mobile gaming outside of laptops, and especially not in a handheld gaming product.
To emphasise your point, just look at the battery life of the Switch; it really should have been a node-shrunk Tegra, even if its performance was not much higher, as the power demand would be better. Anything to help that range of 2.5 to 6 hours.

Back to downclocking: I'm not suggesting this is going to happen (IMO we would not see a 1060M or even a 1050 in such designs), but it is interesting that at base clocks (1508MHz) the 1060 GPU draws 60W while gaming.
Even more amazing is the Tesla P4, which manages 5.5 FP32 TFLOPs at 75W. Shame the figure was never given for the 50W mode of operation.
Anyway, it is not really viable from a product-margin perspective to use a much more powerful GPU and then downclock it to seriously low levels unless there is no other option, especially in the context of a console.
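To put those figures side by side: FP32 throughput is cores × 2 FLOPs per cycle × clock, so combining the power numbers quoted above with the published CUDA core counts (1280 for the 1060's GP106, 2560 for the P4), a rough perf-per-watt sketch looks like this (approximate figures, not measurements):

```python
# Rough FP32 perf-per-watt comparison using the power figures quoted
# above and published CUDA core counts. Clocks and power are approximate.
def tflops(cores, clock_ghz):
    return cores * 2 * clock_ghz / 1000  # 2 FLOPs/core/cycle (FMA)

parts = {
    # name: (CUDA cores, clock in GHz, power in W)
    "GTX 1060 @ base":  (1280, 1.508, 60),  # ~60 W gaming, per the post
    "Tesla P4 @ boost": (2560, 1.063, 75),  # ~5.5 TFLOPs in 75 W
}

for name, (cores, clock, watts) in parts.items():
    tf = tflops(cores, clock)
    print(f"{name}: {tf:.2f} TFLOPs, {tf * 1000 / watts:.0f} GFLOPs/W")
```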

Makes one wonder, though, whether Nvidia/Nintendo will look to use Xavier at some point toward the end of next year from a console perspective, in a similar way to the upgrade we see with Scorpio; I appreciate it will also come down to how flexible the Xavier architecture is compared to Tegra.
Cheers
 
Absolutely. The Switch battery is 4310 mAh at 3.7 volts, so that makes it a ~16Wh battery. (Wh = Ah × V)

So you simply divide 16Wh by the number of hours to get the power draw.
16Wh / 2.5h = 6.4W
16Wh / 6h = 2.7W

That would be the power draw of everything in the device. I was guessing earlier about 2W for everything other than the SoC.

Max 4.4W SoC in portable mode?

Caveat: it depends on how the battery's capacity is specified. The raw charge should never go below 10% (or even 20%) or above 90%, and the charging circuit prevents this from happening. Sometimes manufacturers take this into account in the specs, sometimes not. We only know what's printed on the battery; I don't know if there's a rule for this.

Can we really extrapolate much from the idea that the SoC can consume north of 4 watts in portable mode? The A57 cores would only be consuming about 1.87 watts at 1GHz, and the Maxwell cores would be well south of a watt at such low clocks.
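For what it's worth, here is that budget written out using the thread's own guesses; every number below is an estimate from earlier posts, and the 0.8 W GPU figure is just a placeholder for "well south of a watt":

```python
# Portable-mode power budget sketch using the thread's own estimates.
# Every figure here is a guess from earlier posts, not a measurement.
SYSTEM_DRAW_W = 6.4   # 16 Wh / 2.5 h, from the battery math above
NON_SOC_W     = 2.0   # screen, WiFi, audio, etc. (earlier guess)
CPU_W         = 1.87  # 4x A57 @ 1 GHz (figure quoted above)
GPU_W         = 0.8   # placeholder for "well south of a watt"

soc_budget = SYSTEM_DRAW_W - NON_SOC_W          # ~4.4 W
unaccounted = soc_budget - CPU_W - GPU_W        # ~1.7 W
print(f"SoC budget: {soc_budget:.1f} W")
print(f"Left for memory, I/O, losses: {unaccounted:.2f} W")
```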
 
So does anyone else feel the REAL gimmick is going to be that every few years they'll release a better version of the Switch tablet by itself for like $200, so you can just upgrade your system?
 
Apparently, according to AnandTech, ASMedia already sells a controller that enables alternate mode for USB-C, so it's not just Alpine Ridge that offers that function.

It's A57 20nm vs. A72 16FF, but it's just like @Syferz wrote.

[Charts: CPU power consumption, Cortex A57 @ 20nm vs. Cortex A72 @ 16FF]
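As a sanity check on how a 1.785× clock increase could land at similar power: dynamic power scales roughly as P ∝ C·V²·f, so it takes both the node's capacitance/voltage reductions and a leaner core to absorb a clock jump that large. All scaling factors below are made-up round numbers for intuition, not published 20nm/16FF or A57/A72 characteristics:

```python
# Illustrative dynamic-power scaling: P_dyn ~ C * V^2 * f.
# All scaling factors are made-up round numbers for intuition only.
def rel_power(c_scale, v_scale, f_scale, uarch_scale=1.0):
    return c_scale * v_scale**2 * f_scale * uarch_scale

# Hypothetical node gains: ~15% less switched capacitance and ~10%
# lower voltage, with the clock raised from 1.0 GHz to 1.785 GHz:
node_only = rel_power(0.85, 0.90, 1.785)  # ~1.23x -> node alone isn't enough
# ...so matching power also needs the A72 core itself to be leaner than
# A57 per clock (here a made-up ~20% reduction):
with_uarch = rel_power(0.85, 0.90, 1.785, uarch_scale=0.80)  # ~0.98x
print(f"node only: {node_only:.2f}x, node + uarch: {with_uarch:.2f}x")
```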

I'm waiting for AnandTech to review the Huawei Mate 9 with Cortex A73 cores. Those should be even better.

The devkit could have a downclocked GP106 with 16 ROPs disabled and a corresponding 128bit memory bus.
Again, it's a devkit so it could be a placeholder chip for a dedicated GPU coming down the line.

Regardless, if this were to fit the architecture described in the patents, the external GPU would come inside a new, advanced dock that would have its own cooling system.
I hope no one is thinking that the Switch has a hidden GTX 1060 inside it. That would be utterly stupid and impossible.

Why would the console need to install or download anything if the "HD assets" were already in the cartridge?
If it's a cartridge, bandwidth should already be very good. There's no need to install anything. The same happens on the Vita.

It doesn't make much sense.
Most hardware-related decisions taken by Nintendo haven't made any sense either, though. And the fact remains that the Foxconn leaker is definitely legitimate and an external GPU would match Nintendo's patents about the supplementary compute device.
It might be very real in a prototype stage but it'll never come out, for example. Or it may come out only next year, or within 2 years.

Eurogamer is the only source for Eurogamer's clocks.
If there really is a 2nd GPU in the docking station, then why would LoZ: Breath of the Wild only be running at 900p, with no extra bells and whistles as far as GPU-specific enhancements go?
 
Yeah, I do not see any dGPU-type HW being used for mobile gaming outside of laptops, and especially not in a handheld gaming product.
To emphasise your point, just look at the battery life of the Switch; it really should have been a node-shrunk Tegra, even if its performance was not much higher, as the power demand would be better. Anything to help that range of 2.5 to 6 hours.
If there really is a 2nd GPU in the docking station, then why would LoZ: Breath of the Wild only be running at 900p, with no extra bells and whistles as far as GPU-specific enhancements go?

There's no dGPU in the Switch or the announced dock.
The Foxconn leaker, who got the battery capacities right (plus shoulder button names, weights, I/Os in the dock, etc.), just happened to mention there was a "more advanced devkit" besides the production models being tested. This "advanced devkit" had a second chip besides the main SoC with the same measurements as a GP106.

The only hypothesis being put on the table here is Nintendo releasing a "Super Dock" later down the line with a dGPU included. This would match both the claims from the Foxconn leaker and all the patents that Nintendo released about the "Supplementary Compute Device".
 
If there really is a 2nd GPU in the docking station, then why would LoZ: Breath of the Wild only be running at 900p, with no extra bells and whistles as far as GPU-specific enhancements go?
It'll be enabled in a later firmware update. Remember how MS played it with XBOne? They cunningly put in extra GPU power, let games use the basic specs, and then unleashed the full force a year after release to...um...er....can't remember, but I know on Good Authority that MS used this technique to great advantage and I'm sure Nintendo have carefully watched the rest of the industry and learnt how to pull this off. So docked LOZ next year after the patch is gonna look amaze-balls.
 
There's no dGPU in the Switch or the announced dock.
The Foxconn leaker, who got the battery capacities right (plus shoulder button names, weights, I/Os in the dock, etc.), just happened to mention there was a "more advanced devkit" besides the production models being tested. This "advanced devkit" had a second chip besides the main SoC with the same measurements as a GP106.

The only hypothesis being put on the table here is Nintendo releasing a "Super Dock" later down the line with a dGPU included. This would match both the claims from the Foxconn leaker and all the patents that Nintendo released about the "Supplementary Compute Device".
Yeah, my point/context was that it's never happening for a mobile device outside of laptops, for the reasons I mentioned, and to emphasise a previous poster's point in response to someone mulling over whether it may be possible, rather than what you raised, which was pretty clear IMO.
Cheers
 
It'll be enabled in a later firmware update. Remember how MS played it with XBOne? They cunningly put in extra GPU power, let games use the basic specs, and then unleashed the full force a year after release to...um...er....can't remember, but I know on Good Authority that MS used this technique to great advantage and I'm sure Nintendo have carefully watched the rest of the industry and learnt how to pull this off. So docked LOZ next year after the patch is gonna look amaze-balls.

lol, it's getting crazy in here. Having a dock with a second GPU is the stupidest/silliest thing I've heard; it sounds almost like fan fiction. What's the point when all games have to run in handheld mode?
 