Nintendo Switch Tech Speculation discussion

Status
Not open for further replies.
Why would a 10bit panel and local dimming (or just using OLED) consume substantially more battery life?

The OLED Vita could have its panel driver adapted to use HDR and battery life would probably be unaffected.
 
Why would a 10bit panel and local dimming (or just using OLED) consume substantially more battery life?

The OLED Vita could have its panel driver adapted to use HDR and battery life would probably be unaffected.

Probably depends on what kind of HDR?

if it's HDR Premium Ultra Super Duper (dunno the marketing term) used by Samsung, it needs 1000 nits or something, right?
if it's the usual HDR with local dimming, OLED shouldn't use any more power than SDR.
 
Probably depends on what kind of HDR?

if it's HDR Premium Ultra Super Duper (dunno the marketing term) used by Samsung, it needs 1000 nits or something, right?
if it's the usual HDR with local dimming, OLED shouldn't use any more power than SDR.
Actually, the HDR Premium standard only requires 1000 nits for LCDs. If it's an OLED panel, the requirement goes down to ~500 nits, and the Galaxy S7's OLED panel already goes beyond that.
Of course, that would require the panel luminance to be set to its maximum value, which would drain the battery accordingly.
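As a rough illustration of that last point (a toy model with made-up coefficients, not measured data from any panel), OLED power tends to scale with both peak luminance and how much of the screen is lit:

```python
# Toy model: OLED panel power rising roughly linearly with luminance
# and average picture level (APL). The k and base coefficients are
# invented for illustration only.
def panel_power_w(nits, apl, k=0.004, base=0.3):
    """Estimated panel power in watts for a given peak nits and APL."""
    return base + k * nits * apl

sdr = panel_power_w(nits=300, apl=0.4)
hdr_peak = panel_power_w(nits=500, apl=0.4)
print(f"SDR-ish: {sdr:.2f} W, HDR-peak-ish: {hdr_peak:.2f} W")
```

Whatever the real coefficients are, pushing the panel to its HDR peak draws measurably more than typical SDR brightness.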


But again: I don't think HDR would/should be a priority in a non-premium handheld screen right now, much less with a screen that matches the HDR Premium certification.
 
Is HDR really that important? I mean, making dark parts appear even darker? As if today's games aren't already a total pain in the ass to play in a normally lit room. Let's make it even harder... On a handheld, with lots of different light sources, you just want it to be brighter.
 
Also, the wider color gamut and deeper bit depth are probably much more useful for a handheld screen than HDR itself: smoother gradients and great-looking colors in various lighting conditions (still, I think a good polarizer is more important, like on the Nokia 808).
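As a quick back-of-the-envelope on the bit-depth point (simple arithmetic, not from the thread): the number of distinct levels per channel determines how fine the gradient steps get, and hence how visible banding is.

```python
# Levels per color channel and step size across a normalized 0-1 ramp
# for 8-bit vs 10-bit panels; more levels means finer steps and less
# visible banding in smooth gradients.
for bits in (8, 10):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} levels, step = {1 / (levels - 1):.5f}")
```

Going from 8 to 10 bits quadruples the levels per channel, which is where the smoother gradients come from.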
 
People who have seen HDR screens say it's really nice and worth the upgrade (unlike 3D, for example).

To me HDR = less battery life or higher price point (and the console is already too expensive IMO), I don't think you can get it for free.
 
Is HDR really that important?
Speaking from the anecdotal experience of someone who just bought a mid/low-end HDR TV and has played Uncharted 4 and FFXV on it, the answer is an easy YES. Running those Samsung and LG demos on my TV makes it look more like a window than a screen.
I'm not sure how much it would impact a smaller screen as far as brightness/contrast goes (probably not much, to be honest), but the wider color gamut would always be very welcome.


To me HDR = less battery life or higher price point (and the console is already too expensive IMO), I don't think you can get it for free.
Not if you're using an OLED panel.
Then again, an OLED panel will probably be more expensive than an IPS LCD one. Though for a small 6" panel the difference could be marginal.
Of course, this is all worthless if you're running a SoC that can only render graphics similar to 10-year-old hardware.
 
Also, the wider color gamut and deeper bit depth are probably much more useful for a handheld screen than HDR itself: smoother gradients and great-looking colors in various lighting conditions (still, I think a good polarizer is more important, like on the Nokia 808).

Indeed, HDR would be the first time we can accurately see F̶e̶r̶r̶a̶r̶i̶ Mario red. :yes:
 
Apparently NeoGAF is trying to spin rumors of a "more powerful dock with a dedicated GPU inside" that's supposed to be "GTX 1060 level" specs.

Hype train has derailed, rolled into a field, and is spewing diesel into the rivers, but they're still fully on the throttle.

I just can't look away from a train wreck. Do you have a link?
 
Apparently NeoGAF is trying to spin rumors of a "more powerful dock with a dedicated GPU inside" that's supposed to be "GTX 1060 level" specs.

Hype train has derailed, rolled into a field, and is spewing diesel into the rivers, but they're still fully on the throttle.
Well, that would have made sense if it had been part of the original handheld/docked design, but as you say, it's a train wreck now. I can't see it happening, because developers would need to work with 3 different performance modes (undocked, docked, docked+GPU), and that is a lot of developer resource overhead and complication.

Also against this is the real train wreck of their online service, OMFG...
1 SNES ROM a month, which is then taken away and swapped for another...
Considering the catalogue of games they have, this could have been a great service, with aspects of what Nvidia has with Grid. But nope, Nintendo execs mulled over the best way to show how not to do a service, and TBH that sums up what we are seeing with the whole approach to the Switch: the narrative (now pushing the gimmick 1-2-Switch more), the various hardware and software prices, and now the service.
On the plus side, it makes Sony's initial online service look amazing. A plus for Sony and Microsoft, that is :)
It does not bother me much, but they have now removed BBC iPlayer from the Wii U, another service missing from the Switch that some may have expected.

Also funny how Jim Sterling has shown that Nintendo of North America and Nintendo of Japan will deadlock YouTube content and compete for ownership.
https://twitter.com/JimSterling/status/821053856452931585
Cheers
 
Last edited:
Apparently NeoGAF is trying to spin rumors of a "more powerful dock with a dedicated GPU inside" that's supposed to be "GTX 1060 level" specs.

Hype train has derailed, rolled into a field, and is spewing diesel into the rivers, but they're still fully on the throttle.
It's based on an old rumor from back in November, and you'll see below why it's being given attention.
I remember mentioning this leak, but pretty much put it aside when the Eurogamer clocks article came up. As did many others.


Brace yourselves for some weirdness regarding a ~2 month-old reddit leak from a guy claiming to work at Foxconn:
https://www.reddit.com/r/NintendoSw...r_someone_who_producing_switch_at_foxconn_is/


Dock: He completely hit spot-on on everything that has been revealed about the dock:
* There's no advanced technology in the dock; it seems pretty cheap and light, feels really plastic, no extra power, it's just an output
* 1x USB 3.0, 1x HDMI, and 2x USB 2.0 on the side of the dock
* There's no fan in the dock, but a hole on the back lets air be drawn in and then pushed out through the console's top vent
1x USB3 + 2x USB2 + 1x HDMI perfectly matches the official dock specs, as does the hole in the back.


Console/tablet: He hit spot-on these attributes of the tablet:
(...) Confirmed it's USB-C charging
(...) Saw orange and blue controller
(...) Can be charged while playing
(...) Power adapter is external
(...) Battery 4310mA, 3.7 not changable
(...) was pretty surprise it's 300g console only (excluding joycon etc), used digital scale to weigh
Nintendo's official weight for the tablet is 297g, and the battery spec specifically says 4310mAh.
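For reference, simple arithmetic on that leaked figure (assuming the stated 3.7V nominal voltage) puts the pack at about 16Wh:

```python
# Convert the leaked 4310 mAh at 3.7 V nominal into watt-hours.
mah, volts = 4310, 3.7
wh = mah / 1000 * volts
print(f"{wh:.2f} Wh")  # -> 15.95 Wh
```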


JoyCons: Spot on on everything he said about the Joycons:
* There are 2 shoulder buttons on each Joy-Con, called SL and SR
* It's very complex inside; apart from the motherboard in the console and the screen, the Joy-Con is the most valuable part of the whole system
* The battery inside the Joy-Con is about 5cm x 2cm x 0.5cm
* Very light, about 50g
* (22 Nov update) Battery 525mAh
The shoulder buttons weren't visible until the January 12 reveal, and neither were their names, SL and SR.
The 50g weight and 525mAh battery per Joy-Con are also a perfect match to Nintendo's own official specs.



Up until now I count some 12 points that were successfully predicted, 4 of them with uncanny precision: the tablet/console weight and the battery capacities.
There's no way the guy would say the Switch has a 4310mAh battery and hit all 4 of those exact digits by luck, and the same goes for the Joy-Con batteries, so this person definitely had access to the production line.
However, it's possible that the guy knew the facts above and completely made up what comes next.



Here's what else he said about the production hardware and what measured frequencies appeared in the QA torture tests:

- ~100mm^2 SoC (the TX1 seems to be 11.4 x 10.6mm = ~121mm^2, though the per-side measurement could have some error if he used a ruler, so it could still be a TX1).
- CPU running at 1785MHz and says ARMv8
- GPU running at 921MHz
- Signal output to screen in torture test machine was 1080p
- Memory running at 1600MHz (probably LPDDR4 3200MT/s)
- Internal heatpipe
- Active fan
- There's a SKU with 4G connection

None of this has been officially confirmed or denied.
These might have just been the clocks for the torture test in docked mode, meaning the actual production clocks could be lower.
But testing the CPU at 1785MHz just to clock it at 1GHz in the final version seems like a bit of overkill, even for a torture test, IMO.
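Two quick sanity checks on those figures (back-of-the-envelope only; the 64-bit bus width is an assumption carried over from the TX1, not something stated in the leak):

```python
# 1) Die area implied by the TX1's published ~11.4 x 10.6 mm dimensions.
die_mm2 = 11.4 * 10.6
print(f"TX1 die: ~{die_mm2:.0f} mm^2")  # -> ~121 mm^2

# 2) Peak bandwidth if "1600MHz" means LPDDR4 at 3200 MT/s on a
#    64-bit bus (TX1-like; bus width assumed, not in the leak).
transfers_per_s = 3200e6
bus_bytes = 64 // 8
gb_per_s = transfers_per_s * bus_bytes / 1e9
print(f"Peak bandwidth: {gb_per_s:.1f} GB/s")  # -> 25.6 GB/s
```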



Here's what he admittedly said was being speculated by him and his coworkers (so something they either pulled out of their asses or heard from someone else):
- Cortex A73 cores
- GPU is Pascal
- Made by TSMC
- 4 GB RAM



Now here's what else he said about the "Advanced version devkit":

- Only 2000 units made and they were devkits
- Being devkits, they were heavy, with an embedded screen, some unidentified I/Os, no internal battery, Ethernet, HDMI, 2x WiFi antennas and an internal PSU. Looks like a description of this.
- No dock for this version
- Plugs into TV through the integrated HDMI (duh)
- 8 unidentified storage chips (maybe an embedded PCIe/mSATA SSD)
- Two RAM packages for 8GB total RAM (thinking LPDDR4 again?)
- There's also a much larger chip at 12*18mm so ~216mm^2
- But apparently the 100mm^2 chip is also there, hence the SoC + discrete GPU theories.
- The only ~200mm^2 discrete GPU from Nvidia at the moment is the GP106 / GTX 1060, hence that card being mentioned in the new rumors.
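The die-size reasoning in that last point can be sketched as a nearest-match lookup (the die areas below are approximate public figures, listed here as assumptions):

```python
# Which contemporary NVIDIA Pascal die is closest to the leaked
# ~12 x 18 mm chip? Areas in mm^2 are approximate public figures.
dies_mm2 = {"GP107": 132, "GP106": 200, "GP104": 314}
leak_mm2 = 12 * 18  # ~216 mm^2
closest = min(dies_mm2, key=lambda g: abs(dies_mm2[g] - leak_mm2))
print(closest)  # -> GP106
```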



Could simply be a devkit that includes an extra discrete GPU just to try out some Gameworks code, which would then be downgraded to fit the SoC's GPU...
Could be a "Switch Advance" coming out later this year or even the next.
Could be a prototype for an "Advanced Dock" that really has a discrete GPU, which would follow many of those patents.
Could be a development branch for the Switch that Nintendo simply decided not to follow.
Could be the fruit of imagination of the guy who also told a whole bunch of things right.
 
Last edited by a moderator:
I just can't look away from a train wreck. Do you have a link?

I do, but I can't post it until I have ten posts myself. Check ToTTenTranz's comment above, he'll get you down the rabbit hole.

Well, that would have made sense if it had been part of the original handheld/docked design, but as you say, it's a train wreck now. I can't see it happening, because developers would need to work with 3 different performance modes (undocked, docked, docked+GPU), and that is a lot of developer resource overhead and complication.

On the one hand, you can grossly oversimplify it by saying "well, just increase the details/resolution/other GPU-limited aspects", but on the other, look at some examples of the PS4 Pro's "upgrades" for how that could turn out: some devs won't bother at all, some will do a good job, and others could deliver a (subjectively) worse experience (trashing the framerate through excessive pixel-pushing).

It's based on an old rumor from back in November, and you'll see below why it's being given attention.
I remember mentioning this leak, but pretty much put it aside when the Eurogamer clocks article came up. As did many others.

(not sure if I can link in quotes or not, so removed)

I don't doubt the validity of the information; it carries a lot of weight for the numbers being accurate, especially the battery figures being so precise. Consider those dubs- uh, digits - absolutely checked. But remember that the Durango devkit that was auctioned off back in 2012 was allegedly packing an Intel/NVIDIA setup (which would match the PCs that Microsoft used to demo software at various trade shows), all this despite the launch XB1 being AMD-based on both the CPU and GPU sides. Even in that Reddit post there's an edit claiming the ~2000 units produced were devkits, and that Nintendo came in to check on them.

(Unrelated: Thanks to the mod team here for cleaning up my crap formatting and accidental double-posting until I get the ability to self-edit. That does come later, right?)
 
I don't doubt the validity of the information; it carries a lot of weight for the numbers being accurate, especially the battery figures being so precise. Consider those dubs- uh, digits - absolutely checked. But remember that the Durango devkit that was auctioned off back in 2012 was allegedly packing an Intel/NVIDIA setup (which would match the PCs that Microsoft used to demo software at various trade shows), all this despite the launch XB1 being AMD-based on both the CPU and GPU sides. Even in that Reddit post there's an edit claiming the ~2000 units produced were devkits, and that Nintendo came in to check on them.

I did write "just regular devkit for Switch" as a possibility, didn't I?
Regardless, a Durango devkit being auctioned in 2012 would probably mean it was an outdated devkit that wasn't being used anymore. By 2013, Microsoft had better have distributed devkits carrying GCN graphics cards at least.
This guy was talking about devkits being produced in November for a console that would release 5 months later. It's not the same thing.
 
On the one hand, you can grossly oversimplify it by saying "well, just increase the details/resolution/other GPU-limited aspects", but on the other, look at some examples of the PS4 Pro's "upgrades" for how that could turn out: some devs won't bother at all, some will do a good job, and others could deliver a (subjectively) worse experience (trashing the framerate through excessive pixel-pushing).
I was one of those thinking they would do something à la the Pascal Drive PX2, with the handheld as the Tegra SoC unit while the dock would sell separately, at a higher price, with a small GPU.
But TBH, once you bring in 3 notably different performance envelopes, I am not sure it is feasible; notice how careful both Sony and Microsoft are, and that is with just 2 performance states rather than the 3 we would see with the Switch.

They could position it as a separate product, a 'dedicated home console dock' version that the Switch plugs into, but which importantly offers games built specifically for this setup *shrug*.
Also, this solution would have too much performance to bring across to the handheld, now that we know its spec when undocked. In reality the Switch is a handheld that can be connected to a TV screen, so this would need to be a distinct product that can play Switch games but also dedicated console-oriented ones that cannot be undocked.
I am not convinced now, TBH; it depends on what feedback and commitments they get from developers.
Cheers
 
I did write "just regular devkit for Switch" as a possibility, didn't I?

For sure, I'm agreeing with you as to this being the most likely explanation behind the units.

Regardless, a Durango devkit being auctioned in 2012 would probably mean it was an outdated devkit that wasn't being used anymore. By 2013, Microsoft had better have distributed devkits carrying GCN graphics cards at least.
This guy was talking about devkits being produced in November for a console that would release 5 months later. It's not the same thing.

Given that they were using Intel/NVIDIA PCs at E3 2013 for their demo units, and were still using PCs at E3 2016 (although it's not confirmed whether those were the same hardware), I wouldn't necessarily say that it's impossible. Regardless, a devkit having beefier specs in order to run debugging - or render bullshots - isn't unusual at all.
 
- ~100mm^2 SoC (the TX1 seems to be 11.4 x 10.6mm = ~121mm^2, though the per-side measurement could have some error if he used a ruler, so it could still be a TX1).
- CPU running at 1785MHz and says ARMv8
- GPU running at 921MHz
- Signal output to screen in torture test machine was 1080p
- Memory running at 1600MHz (probably LPDDR4 3200MT/s)
- Internal heatpipe
- Active fan
- There's a SKU with 4G connection

None of this has been officially confirmed or denied.
These might have just been the clocks for the torture test in docked mode, meaning the actual production clocks could be lower.
But testing the CPU at 1785MHz just to clock it at 1GHz in the final version seems like a bit of overkill, even for a torture test, IMO.

I think CPU power drain [edit: on the battery] will have been a significant factor in the final CPU speed, whereas docked GPU clocks will likely come down to heat / noise / case temps / reliability (Nintendo reliability is generally excellent, even over many years of heavy use).

If true, I would take these clocks as an indicator that even with a heatpipe cooler, the NX never stood a chance of maintaining 2GHz / 1GHz clocks under sustained, heavy load. This testing - coming as it apparently did before final clocks were set - could be showing Nintendo working out just what kind of clocks they could get away with and what compromises they would have to make.

I could imagine a testing process being like this:
- work to develop the most brutal torture test you can
- test rigorously to learn where the throttling, heat, noise and reliability limits are with respect to clock speed
- lower clocks further, to give some headroom
- lower CPU clocks even further if necessary for acceptable battery life
- lower GPU clocks for acceptable battery life in mobile mode
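That headroom idea can be put in numbers by comparing the leaked torture-test clocks against the final clocks from Eurogamer's report (1020MHz CPU, 768MHz docked GPU; those figures come from that article, not from this leak):

```python
# Ratio of the leaked torture-test clocks to Eurogamer's reported
# final clocks -- a rough measure of the headroom Nintendo kept.
torture = {"cpu_mhz": 1785, "gpu_mhz": 921}
final = {"cpu_mhz": 1020, "gpu_mhz": 768}  # docked GPU clock
for part in torture:
    ratio = torture[part] / final[part]
    print(f"{part}: {ratio:.2f}x headroom")
```

The CPU margin (1.75x) being much larger than the GPU margin (~1.2x) would fit the idea that battery drain, not thermals, set the final CPU clock.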

TLDR: Even if a particular NX was seen under testing, with these clocks, at some particular point in time with some particular test setup, it does not mean that these clocks could ever have been suitable clocks for production units (not directed at ToTTenTranz, just a general point).
 
Last edited:
I was one thinking they would do ala Pascal Drive PX2 and have the handheld as the Tegra SoC unit while the docking unit would sell separately and more expensive with a small GPU.

This is what the Supplementary Compute boxes in those registered patents would suggest. The Pascal Drive PX2 even uses GP106 cards, IIRC.



They could position it as a separate product, a 'dedicated home console dock' version that the Switch plugs into, but which importantly offers games built specifically for this setup *shrug*.
Also, this solution would have too much performance to bring across to the handheld, now that we know its spec when undocked. In reality the Switch is a handheld that can be connected to a TV screen, so this would need to be a distinct product that can play Switch games but also dedicated console-oriented ones that cannot be undocked.
I am not convinced now, TBH; it depends on what feedback and commitments they get from developers.

Yup, 3 performance targets for a single "console" seems like a death wish as far as early adopters go.
What's more, the GPU dock would have to be very expensive ($200? $300? more?) while the Switch already has a ridiculously high price from the start.

Nintendo could keep that as a backup plan: if no one is interested in the original Switch, they'd then have something that could run all the multiplatform titles. Though at that point we're definitely not talking about games on the go anymore, and those games would have to say "Super Dock required" or something.
When someone tears down the Switch, if they find a Thunderbolt controller connected to the USB-C port, we'll know whether this is, or ever was, a possibility.
 