Nintendo Switch Tech Speculation discussion

Apparently NeoGAF is trying to spin rumors of a "more powerful dock with a dedicated GPU inside" that's supposed to have "GTX 1060 level" specs.

Hype train has derailed, rolled into a field, and is spewing diesel into the rivers, but they're still fully on the throttle.

That thread was glorious. I once doubted Nintendo fans could give me another MisterXmedia; now I cannot wait for years of dark silicon and bright red MS Paint diagrams. The mods shut it down after a few hours of insanity, and took to adding mocking parentheses to another thread about the magic external GPU dock (for the record, they added "(a hardware fan fiction thread)") before also closing that.

As to the truth of the rumors, the details lead me to suspect he either actually worked with some pre-release kit or knew someone who did (I mean, Foxconn plants are basically modest cities), but the clock speed stuff and second GPU seem like nonsense. We may never know, but I thought Nintendo subcontracted production while keeping design in house. That kind of soak and performance testing seems more in the realm of the design validation an ODM, rather than an OEM, would do for you. Of course I don't know this and my own experience in the area is rather modest: would a contract manufacturer do this kind of soak and speed testing, or would they typically only perform end-of-line QA testing?

When they tear down the Switch, if they find a Thunderbolt controller in there connected to the USB-C port, we'll know if this is/ever was a possibility.

If they find an Intel Thunderbolt controller in there I'm pretty sure that's one of the 4 signs of the apocalypse
 
Apparently NeoGAF is trying to spin rumors of a "more powerful dock with a dedicated GPU inside" that's supposed to have "GTX 1060 level" specs.

Hype train has derailed, rolled into a field, and is spewing diesel into the rivers, but they're still fully on the throttle.
I'm sure most of NeoGAF still believes the Wii U has 352 GFLOPS of performance. The place is riddled with fanboy denial.
 
I suppose the battery would be the biggest limitation.

Allowing the battery to charge while playing docked will quickly hit its thermal limits, because the battery must be kept below 45°C while charging.
http://batteryuniversity.com/learn/article/charging_at_high_and_low_temperatures

I guess it makes sense to require the dock for full power rather than simply requiring a power adapter. It lets the screen shut down (which would otherwise transfer heat to the battery), the unit is always oriented vertically, the intakes stay unobstructed, the heat source and exhaust sit at the top, the intake provides laminar flow over the battery, etc. It should be very robust in all ambient conditions.
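
To make the thermal constraint concrete, here's a toy sketch in Python (my own illustration, not anything from Nintendo's firmware) of the kind of charge gating a battery controller applies. The 45°C ceiling is the figure from the Battery University link above; the resume threshold is an assumption:

```python
# Toy model of temperature-gated charging. The 45 C ceiling is the common
# Li-ion charging limit cited above; the 42 C resume threshold is an assumed
# hysteresis value so the charger doesn't rapidly toggle on and off.

CHARGE_TEMP_LIMIT_C = 45.0  # pause charging at or above this temperature
RESUME_TEMP_C = 42.0        # assumed: only resume once the pack has cooled

def next_charge_state(currently_charging: bool, pack_temp_c: float) -> bool:
    """Decide whether the charger should be enabled for the next tick."""
    if pack_temp_c >= CHARGE_TEMP_LIMIT_C:
        return False                         # too hot: stop charging
    if not currently_charging:
        return pack_temp_c <= RESUME_TEMP_C  # hysteresis on the way back down
    return True

print(next_charge_state(True, 46.0))   # False: pack too hot while gaming
print(next_charge_state(False, 43.5))  # False: still cooling down
print(next_charge_state(False, 41.0))  # True: safe to resume charging
```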
 
As to the truth of the rumors, the details lead me to suspect he either actually worked with some pre-release kit or knew someone who did (I mean, Foxconn plants are basically modest cities), but the clock speed stuff and second GPU seem like nonsense.

Clock speed stuff could be true. There would have been a point at which Nintendo were working out final clocks for the two performance profiles based on all kinds of parameters, and doing lots and lots of testing. Just because you can hit a certain frequency under some conditions doesn't mean you can use those frequencies in a final product that has to meet all considerations under all circumstances.

Interestingly, some of the leaks last year were talking about the development silicon being able to hit 2 GHz / 1 GHz, but there was no indication of what the actual NX platform was going to use. People thought this odd at the time, but in retrospect it may be because opportunistic max clocks weren't representative of the final device's limitations. And working all that out would take time and testing.

It would be interesting to know what Nintendo's target specs were and how they changed in the run-up to finalisation. 1 GHz / 768 MHz might actually be a touch higher than the baseline Nintendo were telling developers about early last year, giving rise to the rumours of a "power bump" late last year.

Edit: a 1 GHz quad-core A57 tied to a 1060 GPU would be a super balanced console hur hur hur.
 
If there is a tablet out there with an external GPU... I could believe that, but not a 1060. Perhaps whatever a 1030 becomes (are those out?), or more likely a mobile GPU: something that doesn't use much power, like 20 W max including RAM. They could sell this as the 4K dock. I think something at that power level would be able to run Switch games at 4K 60 fps.
 
I suppose the battery would be the biggest limitation.

Allowing the battery to charge while playing docked will quickly hit its thermal limits, because the battery must be kept below 45°C while charging.
http://batteryuniversity.com/learn/article/charging_at_high_and_low_temperatures

I guess it makes sense to require the dock for full power rather than simply requiring a power adapter. It lets the screen shut down (which would otherwise transfer heat to the battery), the unit is always oriented vertically, the intakes stay unobstructed, the heat source and exhaust sit at the top, the intake provides laminar flow over the battery, etc. It should be very robust in all ambient conditions.

Just don't charge the battery while you're actually playing, and make sure the device can take its power straight from the mains? Oh wait, you were basically saying that already.
 
If there is a tablet out there with an external GPU...

Any tablet with full USB-C 3.1 and Thunderbolt (e.g. Aspire Switch Alpha, Asus Transformer 3 Pro, Dell XPS 12) can connect to any discrete GPU as long as you put it in a Thunderbolt graphics box.


I could believe that, but not a 1060. Perhaps whatever a 1030 becomes (are those out?), or more likely a mobile GPU: something that doesn't use much power, like 20 W max including RAM. They could sell this as the 4K dock. I think something at that power level would be able to run Switch games at 4K 60 fps.

The mention of a GTX 1060 only came about because the Foxconn guy said there's a second 200 mm² chip besides the 100 mm² SoC, and that's the size of a GP106.
Coincidentally, the Drive PX2 uses a dual block of one Parker driving a downclocked GP106.

A smaller GP107 already exists, but it's a 130 mm² chip.

Of course, this 200 mm² chip in the "devkit" could simply be some kind of southbridge, or even an FPGA acting as such made on a very old process, hence the large size.
Or it could be made up.
 
Any tablet with full USB-C 3.1 and Thunderbolt (e.g. Aspire Switch Alpha, Asus Transformer 3 Pro, Dell XPS 12) can connect to any discrete GPU as long as you put it in a Thunderbolt graphics box.

I was sarky before, but I'd like to expand on something here. There is less chance of the Switch having Thunderbolt than of it having a second GPU. Intel does not license Thunderbolt to anybody except Apple, and they loathe Nvidia in particular (after NV tried to build an x86 CPU without a license). An easy way to tell is that the Switch dock is "only" $90; if it had a TB3 bridge chip with a USB controller (and it would have to have an independent USB controller), it would probably start at $150.
 
I was sarky before, but I'd like to expand on something here. There is less chance of the Switch having Thunderbolt than of it having a second GPU. Intel does not license Thunderbolt to anybody except Apple, and they loathe Nvidia in particular (after NV tried to build an x86 CPU without a license). An easy way to tell is that the Switch dock is "only" $90; if it had a TB3 bridge chip with a USB controller (and it would have to have an independent USB controller), it would probably start at $150.

I agree that a dGPU dock would be ridiculously expensive on top of what already seems to be a very expensive base console.
But can Intel simply refuse to sell Alpine Ridge to a company that's willing to pay? There's a chance that would be illegal...
 
So we aren't going to know what the chip looks like until a teardown is done after launch, but what do you guys think? Stock Tegra X1 or custom? I'm not sure how much of an investment it would require to scrap any unneeded hardware, such as the A53 cores. Would this require some pretty intensive work for Nvidia to restructure and remove components? Hopefully it doesn't take too long to get a die photo, or even a microscope photo like the one Marcan did of the Wii U.
 
I agree that a dGPU dock would be ridiculously expensive on top of what already seems to be a very expensive base console.
But can Intel simply refuse to sell Alpine Ridge to a company that's willing to pay? There's a chance that would be illegal...

There's no reason why they have to. Truth is, they would have revoked everybody's x86 license if they could have (it took years of lawsuits before they and AMD came to a truce), and in this case it's a high-speed peripheral bus with competitors on the market, dominant competitors even (USB 3.1 at 10 Gbps), so there's no monopoly issue. They really want TB to take off, but they're trying to make water roll uphill: the only peripheral standards that make it out of niche-device ghettos are cheap and almost or completely royalty free. I agree it's self defeating, but this is the company that tried to make IA64 a thing for almost a decade after its time had come and gone.

On a technical note, they haven't released it to any ARM vendor except Apple, and they may not even have the engineering support for ARM if they wanted to enable another ARM vendor, on top of a completely new OS stack to write and support as well.
 
Any tablet with full USB-C 3.1 and thunderbolt (e.g. Aspire Switch Alpha, Asus Transformer 3 Pro, Dell XPS 12) can connect to any discrete GPU as long you put it in a Thunderbolt graphics box.




The mention of a GTX 1060 only came about because the Foxconn guy said there's a second 200 mm² chip besides the 100 mm² SoC, and that's the size of a GP106.
Coincidentally, the Drive PX2 uses a dual block of one Parker driving a downclocked GP106.

A smaller GP107 already exists, but it's a 130 mm² chip.

Of course, this 200 mm² chip in the "devkit" could simply be some kind of southbridge, or even an FPGA acting as such made on a very old process, hence the large size.
Or it could be made up.

It's actually much more interesting than all of that: the Foxconn leak predates the Eurogamer clocks. The X1's base frequency step turned out to be 76.8 MHz, and both sets of clocks are multiples of it. Eurogamer's portable clock is 4x and docked is 10x, while the Foxconn clocks are 5x and 12x. Eurogamer's A57 quad core @ 1 GHz consumes 1.83 W, while an A72 quad core @ 1.7 GHz consumes 1.9 W... I went and found some power consumption figures for Pascal per 128 CUDA cores and used TSMC's 20nm-to-16nm power saving of 55% to figure out what the X1's GPU draws at Eurogamer's clocks and at Foxconn's clocks. The thing is, both SoCs consume ~2.23 W at portable clocks.
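
For what it's worth, the multiplier arithmetic is easy to check. A minimal Python sketch, using only the numbers quoted in this post (the 76.8 MHz step and the CPU cluster figures are the post's claims, not confirmed specs):

```python
# Clock profiles as multiples of the claimed 76.8 MHz base step.
BASE_MHZ = 76.8

profiles = {
    "Eurogamer portable": 4,   # 307.2 MHz
    "Eurogamer docked": 10,    # 768.0 MHz
    "Foxconn portable": 5,     # 384.0 MHz
    "Foxconn docked": 12,      # 921.6 MHz
}
for name, mult in profiles.items():
    print(f"{name}: {mult} x {BASE_MHZ} = {mult * BASE_MHZ:.1f} MHz")

# CPU cluster figures as quoted above (unverified): an A57 quad at 1 GHz
# and an A72 quad at 1.7 GHz land within a rounding error of each other.
print(round(1.90 - 1.83, 2))  # ~0.07 W difference
```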

With the guy getting the battery capacity of 4310 mAh right, along with the 300 g weight, the neon Joy-Con colors, and the naming of the SL and SR buttons on the Joy-Cons, while also being confirmed to have posted from inside Foxconn, all back in November, it's obvious that this isn't a rumor but a genuine leak. Those clocks were stress tested with Unity's fish demo for 8 days straight "without dropping a frame" at the same power consumption; Occam's razor suggests that coming to any other conclusion would be foolish.

As for the SCD, with all the things he reported I wouldn't dismiss it: not only is it the right size for a 1060, it's also the right shape (~12 mm x ~18 mm). Lastly, this new SoC in the Switch can actually draw between 7 and 8 watts for the entire system, which could be used as a portable with a terrible battery life of ~2 hours. That would allow Nintendo to target the Switch at 720p games with the 472 GFLOPS, and allow the GTX 1060's 4.3 TFLOPS to push 4K, as that is a 9x increase, i.e. the same FLOPS-per-pixel ratio.
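
The 9x figure is straight pixel arithmetic; a quick sketch using the GFLOPS numbers quoted above:

```python
# 4K has exactly 9x the pixels of 720p, so keeping the FLOPS-per-pixel
# ratio constant means scaling GPU throughput by the same factor.
pixels_720p = 1280 * 720   # 921,600
pixels_4k = 3840 * 2160    # 8,294,400
ratio = pixels_4k / pixels_720p
print(ratio)               # 9.0

portable_gflops = 472      # quoted portable figure
print(portable_gflops * ratio)  # 4248 GFLOPS, i.e. GTX 1060 territory
```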

It all magically fits into place like this: the Foxconn guy didn't know Eurogamer's specs, yet they lead to the same power consumption? It would be pretty amazing if he just made up these numbers. And the problem with stress testing the X1 at those clocks is that the CPU alone would draw 7 watts by itself and could never be used in a portable; with Nintendo not changing CPU speeds between modes, this seems very unlikely.
 
Did this supposed 1060 have a frikkin great big heatsink and cooler on it, and was it arranged with a 256-bit bus and 8 GDDR5 memory chips around it?

Because without those this supposed 1060 isn't a 1060.

And is there really a demand for running NX games at 5K+?
 
Did this supposed 1060 have a frikkin great big heatsink and cooler on it, and was it arranged with a 256-bit bus and 8 GDDR5 memory chips around it?

Because without those this supposed 1060 isn't a 1060.

And is there really a demand for running NX games at 5K+?
The 1060 is 192-bit and 6 GB. I don't think running Switch games at 4K will require a 1060, tbh. Probably only need a 1050.
 
The 1060 is 192-bit and 6 GB. I don't think running Switch games at 4K will require a 1060, tbh. Probably only need a 1050.

Oops! My bad.

Still, as you say, a 1060 is overkill for 4K NX games. And it would still require 6 GDDR5 chips and a sizeable heatsink...
 
Oops! My bad.

Still, as you say, a 1060 is overkill for 4K NX games. And it would still require 6 GDDR5 chips and a sizeable heatsink...
472 GFLOPS x 9 = 4248 GFLOPS, or a 1060 clocked at 1.66 GHz (a small downclock). You can find these cards in thin and light laptops today; I'm not sure what's unreasonable about one being in a dock.
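
As a sanity check on that downclock figure (a sketch, using the usual rule of thumb of 2 FP32 FLOPs per CUDA core per cycle and the GTX 1060's 1280 cores):

```python
# Deriving the clock a GTX 1060 would need to hit 9x the portable figure.
cuda_cores = 1280        # GTX 1060 (GP106)
target_gflops = 472 * 9  # 4248 GFLOPS

# FP32 throughput ~ 2 FLOPs (one fused multiply-add) per core per cycle.
required_clock_ghz = target_gflops / (2 * cuda_cores)
print(f"required clock: {required_clock_ghz:.2f} GHz")  # ~1.66 GHz
```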

As long as the dock has its own VRAM it should work fine, and with the A72 at that clock it should be able to handle 4K gaming; it could also turn off the GPU in the Switch and overclock.

Lastly, it's only a 1060 because the leak is talking about a chip with the identical size and shape, and 2000 of these were made, have screens on them, and no place for a battery. It seems to me they could only be high-powered devkits, and being made in such a quantity would mean they are already being shipped to developers.
 
Eurogamer's A57 quad core @ 1 GHz consumes 1.83 W, while an A72 quad core @ 1.7 GHz consumes 1.9 W...

Where do these numbers come from? The A72 improves power consumption per clock over the A57, but not by such a huge degree...

I went and found some power consumption figures for Pascal per 128 CUDA cores and used TSMC's 20nm-to-16nm power saving of 55% to figure out what the X1's GPU draws at Eurogamer's clocks and at Foxconn's clocks. The thing is, both SoCs consume ~2.23 W at portable clocks.

Can you show more of the working behind this?

With the guy getting the battery capacity of 4310 mAh right, along with the 300 g weight, the neon Joy-Con colors, and the naming of the SL and SR buttons on the Joy-Cons, while also being confirmed to have posted from inside Foxconn, all back in November, it's obvious that this isn't a rumor but a genuine leak. Those clocks were stress tested with Unity's fish demo for 8 days straight "without dropping a frame" at the same power consumption; Occam's razor suggests that coming to any other conclusion would be foolish.

I don't disagree that there's good reason to believe the clock speeds in that leak were legitimate, but it's also not at all unusual to perform stress tests at clock speeds higher than you plan to ship at, in order to account for the effects of aging and a wider temperature range, and to accelerate early-death defects. Anything that reduces the RMA rate even a little is worthwhile.

This puts the max GPU clock while docked around expected values. I don't doubt that they could run the CPUs a lot higher than 1 GHz while docked, but I can see a lot of sense in not allowing developers to run the CPU at different clocks when docked, which could lead to a fundamentally different-feeling game experience between the two modes, or at the very least greatly increase the amount of QA work that needs to be done. It's not worth it, particularly not when Nintendo is selling the undocked mode as the primary experience.

So if this is based off the X1, they have to cut the Pixel C's peak power consumption in half while presumably maintaining about the same screen brightness (albeit for a smaller, lower-resolution screen) and other auxiliary features like WiFi. So the actual power consumption of the SoC may need to be well under half what it is in the Pixel C. Going from 1.9 GHz on the CPU and 850(?) MHz on the GPU (assuming the Pixel C can even sustain all that under load, which I doubt) to 1 GHz and ~300 MHz on the GPU seems about right, while ~1.8 GHz on the CPU and 900+ MHz on the GPU (can't remember the exact numbers) seems plausible for the X1 but way too power-hungry to be used undocked.
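
As a rough illustration of that power argument, a naive frequency-only scaling of the clocks discussed above (this understates the real savings, since dynamic power also scales with voltage squared, and the Pixel C figures are the ones from this post, not measurements):

```python
# Naive frequency-only scaling from Pixel C clocks to the rumoured
# undocked clocks; real savings would be larger once voltage drops too
# (dynamic power ~ f * V^2).
pixel_c_cpu_ghz, pixel_c_gpu_mhz = 1.9, 850    # figures discussed above
switch_cpu_ghz, switch_gpu_mhz = 1.0, 307.2    # rumoured undocked clocks

print(f"CPU clock ratio: {switch_cpu_ghz / pixel_c_cpu_ghz:.2f}")  # ~0.53
print(f"GPU clock ratio: {switch_gpu_mhz / pixel_c_gpu_mhz:.2f}")  # ~0.36
```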

I know they said it'll have Cortex-A73 (not A72), but this is a pure guess on their part. I have to wonder: if this will have A72s or A73s, why does the Tegra X2 still only use the Cortex-A57? If they could get an unequivocally better CPU core out in a Nintendo product first, why wouldn't it be used in the X2?

As for the SCD, with all the things he reported I wouldn't dismiss it: not only is it the right size for a 1060, it's also the right shape (~12 mm x ~18 mm). Lastly, this new SoC in the Switch can actually draw between 7 and 8 watts for the entire system, which could be used as a portable with a terrible battery life of ~2 hours. That would allow Nintendo to target the Switch at 720p games with the 472 GFLOPS, and allow the GTX 1060's 4.3 TFLOPS to push 4K, as that is a 9x increase, i.e. the same FLOPS-per-pixel ratio.

It all magically fits into place like this: the Foxconn guy didn't know Eurogamer's specs, yet they lead to the same power consumption? It would be pretty amazing if he just made up these numbers. And the problem with stress testing the X1 at those clocks is that the CPU alone would draw 7 watts by itself and could never be used in a portable; with Nintendo not changing CPU speeds between modes, this seems very unlikely.

472 GFLOPS x 9 = 4248 GFLOPS, or a 1060 clocked at 1.66 GHz (a small downclock). You can find these cards in thin and light laptops today; I'm not sure what's unreasonable about one being in a dock.

As long as the dock has its own VRAM it should work fine, and with the A72 at that clock it should be able to handle 4K gaming; it could also turn off the GPU in the Switch and overclock.

Lastly, it's only a 1060 because the leak is talking about a chip with the identical size and shape, and 2000 of these were made, have screens on them, and no place for a battery. It seems to me they could only be high-powered devkits, and being made in such a quantity would mean they are already being shipped to developers.

Are you saying that a 1060 at 1.66 GHz could be used in a system drawing only 8 W? NotebookCheck says the mobile 1060 draws 80 W, or more recently 70 W: http://www.notebookcheck.net/Mobile...060-Laptop-Benchmarks-and-Specs.169547.0.html What thin and light laptops have one?
 
472 GFLOPS x 9 = 4248 GFLOPS, or a 1060 clocked at 1.66 GHz (a small downclock). You can find these cards in thin and light laptops today; I'm not sure what's unreasonable about one being in a dock.

As long as the dock has its own VRAM it should work fine, and with the A72 at that clock it should be able to handle 4K gaming; it could also turn off the GPU in the Switch and overclock.

Lastly, it's only a 1060 because the leak is talking about a chip with the identical size and shape, and 2000 of these were made, have screens on them, and no place for a battery. It seems to me they could only be high-powered devkits, and being made in such a quantity would mean they are already being shipped to developers.

If there were a 1060 (or something like it) in the devkit that the leaker inspected internally, then there would be a suitably respectable heatsink-and-fan combination clearly visible, along with an array of 6 GDDR5 chips (also easily visible) in there. I am unaware of these having been identified in the system. (And laptop GPUs most certainly will throttle under stress, as will many desktop cards, come to think of it.)

I am also unaware of the NX having A72 cores inside it. Regardless, if the NX CPU clock doesn't increase in docked mode, I see no reason for Nintendo to increase it in a dGPU docked mode, given that doing so would break the goal of having identical game logic and CPU performance across modes.

On top of this, NX games don't ship with "HD assets" on their little carts, and those would be required to make the most of a 1060-like GPU... meaning you would also need an additional HDD in the dGPU dock and downloads of improved assets over broadband. This would in turn potentially cause variations in loading and streaming performance that would have to be tested around. This is assuming, of course, that you're not just running stock "sub-XB1/PS4" assets at 4K... because that would be a waste.

Additionally, a 1060 is overkill for running NX games at 4K, and Nintendo has shown no interest in beating out the high-end consoles from MS and Sony. And a dGPU-docked NX still wouldn't be able to run AAA games due to its CPU, even if it had a PS4 Pro-beating GPU, so who's going to make games to take advantage of this large, expensive, souped-up dGPU dock?

I can't rule out that such a thing exists; all I can say is that there's nothing to really indicate it does, and that such a risky, expensive, and userbase-fragmenting move would be stupid.
 