Nintendo Switch Tech Speculation discussion

If you take a look at a teardown of the Nvidia Shield Console, it doesn't look like there is even a heat sink on the Tegra X1. When you look at the specs for the Shield Console, I'm not finding where Nvidia actually lists clock speeds. It's easy enough to find specs for the Tegra X1, but those clock speeds are not listed on Nvidia's website for the Shield Console. I'm starting to think that perhaps the console doesn't run peak clocks for games. We know the Google Pixel C will in fact throttle significantly after about 30 minutes going full throttle, and that's with the GPU clocking 850MHz and the CPU cores at 1.9GHz. I can't find any adequate benchmarks or tests to back this up, but with how small the Shield Console is, and the reports saying fan noise is minimal, I just can't fathom why Nintendo would need to reduce clocks so much more than the Shield Console when they will have similar cooling fans, unless of course the Shield Console doesn't run max clock speeds playing games.

I'm surprised we aren't talking more about the Unreal 4 profiles becoming public. Baseline settings for docked performance are set to medium compared to high on the current consoles, and portable goes to low settings with a reduction to 66% rendering resolution. This is very telling. If a third party uses Unreal 4 for a PS4/X1 game, it's not going to be too hard to port to Switch.
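Rough pixel-count math on that 66% figure below, assuming it's applied per-axis like UE4's r.ScreenPercentage, and assuming the rumoured 720p portable screen and 1080p docked output (both still unconfirmed):

[CODE]
# Back-of-envelope: what a 66% screen percentage means in pixel terms,
# assuming it scales each axis (as UE4's r.ScreenPercentage does) and
# assuming rumoured 1280x720 portable / 1920x1080 docked targets.
def pixels(width: int, height: int, screen_pct: float = 100.0) -> int:
    scale = screen_pct / 100.0
    return int(width * scale) * int(height * scale)

portable_native = pixels(1280, 720)        # 921,600
portable_scaled = pixels(1280, 720, 66.0)  # ~400,000
docked_native = pixels(1920, 1080)         # 2,073,600

print(portable_scaled / portable_native)   # ~0.44 of native 720p
print(portable_scaled / docked_native)     # ~0.19 of a 1080p docked frame
[/CODE]

So portable would be pushing a bit over two fifths of the native 720p pixels (about a fifth of a 1080p docked frame) before the drop to low settings is even counted.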
 
If you take a look at a teardown of the Nvidia Shield Console, it doesn't look like there is even a heat sink on the Tegra X1. When you look at the specs for the Shield Console, I'm not finding where Nvidia actually lists clock speeds. It's easy enough to find specs for the Tegra X1, but those clock speeds are not listed on Nvidia's website for the Shield Console. I'm starting to think that perhaps the console doesn't run peak clocks for games. We know the Google Pixel C will in fact throttle significantly after about 30 minutes going full throttle, and that's with the GPU clocking 850MHz and the CPU cores at 1.9GHz. I can't find any adequate benchmarks or tests to back this up, but with how small the Shield Console is, and the reports saying fan noise is minimal, I just can't fathom why Nintendo would need to reduce clocks so much more than the Shield Console when they will have similar cooling fans.

There could be a number of factors. Perhaps yield, perhaps wishing to be passive only in handheld mode (temperature permitting), perhaps battery life and battery cost. And of course throttling not being an acceptable behaviour for the hardware.

This thing has to work in 30+ degree C ambient air temp, for hours, potentially in direct sunlight, while clasped in sweaty hands, while running 2nd, third, fourth generation software that's optimised skilfully to get the most out of that particular piece of hardware (and generally more processing done and more data transferred (including off chip) means more heat). And it has to be reliable under a heavy gaming load, daily, for 5+ years.

Nintendo may also not be using heatpipes or micro vapour chambers in the cooling. If they only have a small finned piece of aluminium (similar to GC, Wii, WiiU except smaller) it's not going to move large amounts of heat.

The clock speeds of the CPU may also indicate how much the chip has to fear from highly threaded, highly optimised CPU code - something a typical Android game probably won't be using, even at this stage.

I think I've said this before (I'm a boring, repetitive man) but the 360S was designed to handle a power virus on the CPU and GPU simultaneously without overheating or throttling. If Nintendo want to achieve the same kind of rock solid performance profiles in a machine that can be held in the hand, in the 30+ degree temps of southern Japan and Europe, potentially while outside in direct sunlight, they're going to need an awful lot of headroom.

I'd love it if someone could write a power virus for a Pixel C and then test it in stupid hot summer temperatures. I suspect we'd see it throttling. A lot.
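For what it's worth, a real power virus is hand-tuned SIMD/FMA code written to maximise switching activity, but you don't need anything that sophisticated just to probe for throttling. Even a crude all-core burner along these lines (Python purely for illustration, you'd need something like Termux on an Android device, and a proper test would hammer the GPU too) will show whether a device can hold its clocks for half an hour:

[CODE]
# Crude all-core CPU burner -- NOT a real power virus (those use hand-tuned
# SIMD/FMA to maximise switching activity), just enough to pin every core
# while you log clocks/temperatures and watch for throttling.
import multiprocessing as mp
import time

def burn(seconds: float) -> None:
    """Spin on floating-point math until the deadline passes."""
    deadline = time.time() + seconds
    x = 1.0001
    while time.time() < deadline:
        for _ in range(100_000):
            x = x * 1.0000001 + 1e-9  # keep the FPU busy

if __name__ == "__main__":
    duration = 30 * 60  # 30 minutes, roughly where the Pixel C reportedly starts throttling
    procs = [mp.Process(target=burn, args=(duration,)) for _ in range(mp.cpu_count())]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
[/CODE]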
 
@function
As it pertains to the clock speeds and the Shield Console, do you think it's running max clocks while running games like Half Life 2 and Doom BFG? The Shield Console is so tiny, and I see no heat sink over the X1 processor. It should theoretically need to stand up to hours and hours of gaming, and from what I've heard it can't or doesn't throttle.

[QUOTE]Tegra X1's Maxwell was the final iteration of the architecture and does have technological aspects that are found in Pascal: specifically, double-rate FP16 support. We're also told that Switch has bespoke customizations that may involve pulling in other Pascal optimizations.[/QUOTE]

Of course the clock speeds and the fact that the chip is Maxwell instead of Pascal captured all the headlines, but this is a little nugget of info that deserves discussion. This isn't them guessing, but based on things they heard from sources. So the question is, what optimizations could be adopted?
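To put rough numbers on what double-rate FP16 is worth, here's a back-of-envelope sketch assuming the TX1-like 2 SM / 256 CUDA core layout and the clock figures being thrown around, none of which is confirmed (and no real workload is purely FP16):

[CODE]
# Peak-throughput back-of-envelope for a TX1-style GPU (2 SMs x 128 cores),
# counting an FMA as 2 FLOPs/core/clock; double-rate FP16 doubles that.
CORES = 2 * 128

def gflops(clock_mhz: float, fp16: bool = False) -> float:
    flops_per_clock = CORES * 2 * (2 if fp16 else 1)
    return flops_per_clock * clock_mhz / 1000.0  # MHz -> GFLOPS

for clock in (1000, 768, 300):  # devkit leak / rumoured docked / rumoured portable
    print(f"{clock} MHz: {gflops(clock):.0f} GFLOPS FP32, {gflops(clock, fp16=True):.0f} GFLOPS FP16")
[/CODE]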
 
If you take a look at a teardown of the Nvidia Shield Console, it doesn't look like there is even a heat sink on the Tegra X1. When you look at the specs for the Shield Console, I'm not finding where Nvidia actually lists clock speeds. It's easy enough to find specs for the Tegra X1, but those clock speeds are not listed on Nvidia's website for the Shield Console. I'm starting to think that perhaps the console doesn't run peak clocks for games. We know the Google Pixel C will in fact throttle significantly after about 30 minutes going full throttle, and that's with the GPU clocking 850MHz and the CPU cores at 1.9GHz. I can't find any adequate benchmarks or tests to back this up, but with how small the Shield Console is, and the reports saying fan noise is minimal, I just can't fathom why Nintendo would need to reduce clocks so much more than the Shield Console when they will have similar cooling fans, unless of course the Shield Console doesn't run max clock speeds playing games.

I'm surprised we aren't talking more about the Unreal 4 profiles becoming public. Baseline settings for docked performance are set to medium compared to high on the current consoles, and portable goes to low settings with a reduction to 66% rendering resolution. This is very telling. If a third party uses Unreal 4 for a PS4/X1 game, it's not going to be too hard to port to Switch.

I guess my question would be how hard it would be to port bigger games to the thing. There are a lot of indie games on UE4 that could probably be brought over, but what about more demanding games like Kingdom Hearts 3 or something like Gears of War 4?

As for the clocks of the thing, someone over on NeoGAF made a good point about something I never considered. Since the Switch is designed to go outside, this thing has to be able to deal with being in very high temperatures. As someone who lives in Louisiana, it gets HOT over here. This has to be able to contend with multiple climates, so that might be why they clocked it down.

I do hope if it proves to be okay in those kinds of climates that they clock it back up a bit eventually.
 
@function
As it pertains to the clock speeds and the Shield Console, do you think it's running max clocks while running games like Half Life 2 and Doom BFG? The Shield Console is so tiny, and I see no heat sink over the X1 processor. It should theoretically need to stand up to hours and hours of gaming, and from what I've heard it can't or doesn't throttle.

I'm really out on a limb speculating here, but it could be that the Shield is indeed running max clocks on HL2 and D3:BFG. These games are old, and probably aren't loading the CPU remotely fully, as they were either single threaded or only coarsely multithreaded (HL2 might have been updated to use multithreaded particles, iirc). So there's probably a lot of unused CPU, and fragment shaders and geometry were relatively simple in those games.

If you run HL2 on your PC now with vsync on, you'll see that neither your CPU nor your GPU gets remotely close to having to even unbutton the top button on its cardigan.

You'd need games written specifically to target the Shield tablet to know what it could sustain under adverse situations.

It's entirely possible Nintendo could have made NX clock higher, if they only wanted to run HL2 or were prepared to say that performance levels were not reliable and that game performance should be expected to vary greatly depending on, for example, if you lived close to the tropics, or if you had air conditioning, or if you sat in the window in sunlight. But that's not traditionally how consoles have worked.

Of course the clock speeds and the fact that the chip is Maxwell instead of Pascal captured all the headlines, but this is a little nugget of info that deserves discussion. This isn't them guessing, but based on things they heard from sources. So the question is, what optimizations could be adopted?

Couldn't help you there I'm afraid. I'm guessing tweaks to things like ROP tiling, power and turbo logic, and removing bottlenecks that prevented higher clock speeds (critical paths, I think they call them)...???
 
As for the clocks of the thing, someone over on NeoGAF made a good point about something I never considered. Since the Switch is designed to go outside, this thing has to be able to deal with being in very high temperatures. As someone who lives in Louisiana, it gets HOT over here. This has to be able to contend with multiple climates, so that might be why they clocked it down.

Yep, been a few of us here making this point too!
 
This thing has to work in 30+ degree C ambient air temp, for hours, potentially in direct sunlight, while clasped in sweaty hands, while running 2nd, third, fourth generation software that's optimised skilfully to get the most out of that particular piece of hardware (and generally more processing done and more data transferred (including off chip) means more heat).
So ideally, Nintendo will throttle less but mandate a 'sloppy code' policy where devs just use quick, easy, inefficient code. That's better for the devs (less optimisation effort) and better for the PR (higher clock speed figures) with the same thermal profile...
 
What is it you are disagreeing with?
I'm not going to repost all the points I wrote in the last 5 pages or so. Feel free to quote them if you wish to discuss them.

"the venturebeat story actually corroborate an earlier leak on twitter which was essentially a tegra X1, exactly as we said back in July. We haven't been adding anything to that because the sources that we have are all saying Tegra X1 or a customized variant of it essentially."
-- Richard Leadbetter
Twitter source says 1) it refers to dev kits, 2) the GPU runs at 1GHz and 3) CPU runs at 2GHz.


So they are confident it's an X1 and they are confident about the clock.
They are confident the devs told them the dev kits used a TX1 with the GPU clocked at 1GHz and they are confident the final console's SoC's GPU is clocked between 300 and 768MHz.
Seems like two different things to me.
 
Twitter source says 1) it refers to dev kits, 2) the GPU runs at 1GHz and 3) CPU runs at 2GHz.

They are confident the devs told them the dev kits used a TX1 with the GPU clocked at 1GHz and they are confident the final console's SoC's GPU is clocked between 300 and 768MHz.
Seems like two different things to me.
Twitter specs are peak specs for the silicon. Dev kit specs are the specs devs are to target in final product. So yeah, two different things but not contradictory. Both confirm the processor. The only thing uncorroborated so far is the final clocks. Is there any reason to doubt Leadbetter's sources? To think the devkit specs are wrong or somehow miscommunicated?
 
"the venturebeat story actually corroborate an earlier leak on twitter which was essentially a tegra X1, exactly as we said back in July. We haven't been adding anything to that because the sources that we have are all saying Tegra X1 or a customized variant of it essentially."
-- Richard Leadbetter

Source: www.youtube.com/watch?v=PzS4LbH5nmA

So they are confident it's an X1 and they are confident about the clock. What kind of customization could be made if devs are essentially telling them it's an X1?
Errr...
"the venturebeat story actually corroborate an earlier leak on twitter which was essentially a tegra X1, exactly as we said back in July. We haven't been adding anything to that because the sources that we have are all saying Tegra X1 or a customized variant of it essentially."

See what a bit of alternative bolding does to that quote? They are not "confident it's an X1", they are giving themselves an awful lot of leeway.
Something which is essentially a customized variant of an X1 could be pretty much any ARM SoC from nVidia. We can all see that there are aspects of the rumours that don't add up. It would be nice if in three weeks nVidia was free to go into more depth regarding the Switch SoC. All due respect to Chipworks, but I prefer it when they keep the horse's mouth honest over when they are the only source of info.
 
Multiplatform developers will naturally be driven to multiplatform-capable APIs.
Of course, all of it will be worth nothing if the console is so slow that developers would have to waste a huge amount of time stripping away IQ elements from what was their original vision for their game (ultimately not developing for it).

As for the clocks of the thing, someone over on NeoGAF made a good point about something I never considered. Since the Switch is designed to go outside, this thing has to be able to deal with being in very high temperatures. As someone who lives in Louisiana, it gets HOT over here. This has to be able to contend with multiple climates, so that might be why they clocked it down.

For mobile mode yes, but there is practically zero reason why the Switch in docked mode would have to use more conservative clocks than the Shield Tegra TV.
In fact, there would be reason to believe the Switch's GPU has more power/heat headroom than the Tegra TV, because the latter has its CPU cores running at a constant 1.9GHz whereas the Switch runs them at a little over half of that.

Again, assuming the Switch is using a TX1 with the same 2SM GPU arrangement.
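The back-of-envelope reasoning would be dynamic power scaling, roughly P ≈ C·V²·f: drop the CPU clock and the voltage needed to sustain it, and you free a disproportionate slice of the shared power/heat budget for the GPU. The voltages below are made-up round numbers, purely to show the shape of it:

[CODE]
# Illustrative dynamic-power comparison, P_dyn ~ C * V^2 * f.
# The capacitance term cancels in the ratio; voltages are invented, not measured.
def relative_power(freq_ghz: float, volts: float) -> float:
    return volts ** 2 * freq_ghz

shield_tv_cpu = relative_power(1.9, 1.0)  # constant 1.9GHz, assumed ~1.0V
switch_cpu = relative_power(1.0, 0.8)     # ~half the clock, assumed lower voltage

print(switch_cpu / shield_tv_cpu)  # ~0.34 -- roughly a third of the CPU's dynamic power
[/CODE]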


Twitter specs are peak specs for the silicon. Dev kit specs are the specs devs are to target in final product.
Why would there even be a thing such as "peak specs for the silicon" on a dev kit? Whom would that ever concern?
Twitter specs are for devkits, and when that got out, Eurogamer said over and over again that devs were saying those specs were "uncannily similar" to Nintendo's own developer briefings.
All of a sudden the CPU runs at half the clock speeds and the GPU only does 30-75% of the clocks, and they keep saying the initial specs were uncannily similar?
 
So ideally, Nintendo will throttle less but mandate a 'sloppy code' policy where devs just use quick, easy, inefficient code. That's better for the devs (less optimisation effort) and better for the PR (higher clock speed figures) with the same thermal profile...

Well if you were most interested in selling chipsets that might make you look good ... :D
 
Multiplatform developers will naturally be driven to multiplatform-capable APIs.
Of course, all of it will be worth nothing if the console is so slow that developers would have to waste a huge amount of time stripping away IQ elements from what was their original vision for their game (ultimately not developing for it).



For mobile mode yes, but there is practically zero reason why the Switch in docked mode would have to use more conservative clocks than the Shield Tegra TV.
In fact, there would be reason to believe the Switch's GPU has more power/heat headroom than the Tegra TV, because the latter has its CPU cores running at a constant 1.9GHz whereas the Switch runs them at a little over half of that.

Again, assuming the Switch is using a TX1 with the same 2SM GPU arrangement.



Why would there even be a thing such as "peak specs for the silicon" on a dev kit? Whom would that ever concern?
Twitter specs are for devkits, and when that got out, Eurogamer said over and over again that devs were saying those specs were "uncannily similar" to Nintendo's own developer briefings.
All of a sudden the CPU runs at half the clock speeds and the GPU only does 30-75% of the clocks, and they keep saying the initial specs were uncannily similar?
At the end of the day, we still don't really know.
A Tegra X1 without silicon modification is undoubtedly cheaper than designing a new SoC, at least for modest volumes. If however Nintendo has gone the extra mile and commissioned custom silicon, we just can't really know what to expect until it is either disclosed or hardware is available for chip analysis.

Personally, I find it hard to reconcile a Tegra X1 at 1GHz/300MHz with a need for active cooling in mobile mode, seeing as older products with the chip used passive cooling at much higher clocks, while both space constrained and without any airflow at all, even when explicitly tested for thermal throttling. Using the available space in the Switch for better passive heat dissipation would both remove an on-board fan as a point of mechanical failure and provide means for optimized cooling when docked. (I can definitely see them clocking very low for battery life reasons, though.) It's odd.
 
It is all very odd, but regardless of whether Nvidia is allowed to really detail the SoC in full on January 12th, the software will speak volumes. It's not like we will only be seeing some BS game trailers; Nintendo is putting the Switch in the hands of gamers and journalists on January 13. We are going to know immediately just how the Switch stacks up against the X1/PS4.

It really is odd though, seeing as how we have a product in the Nvidia Shield Console that uses the Tegra X1 processor, and we are under the understanding that it runs at 1GHz for the GPU and around 2GHz for the CPU. If this were a large product with a big fat heat sink, a large case, and a high CFM fan, then it wouldn't be fair to expect the X1 to clock nearly as high in the Switch, but that's not the situation. The Shield Console is very small - larger than the Switch, but not significantly bigger. There is no heat sink that I can see from the teardown, and the fan is a small turbine fan.
 
"custom Tegra" could just mean the X1 chip with different clocks aka lower clocks

Also everybody keeps debating "why would they need the fan?" It's pretty clear to me that the fan is for docked mode...not portable.
 
"custom Tegra" could just mean the X1 chip with different clocks aka lower clocks

Also everybody keeps debating "why would they need the fan?" It's pretty clear to me that the fan is for docked mode...not portable.

I mostly agree, but the patent does make it sound like the fan runs even in mobile mode. That could be inaccurate to actual production operation, and it doesn't seem likely that a Tegra X1 clocked at a meager 300MHz would need fan cooling.
 
Why would there even be a thing such as "peak specs for the silicon" on a dev kit? To whom would that ever concern?
Twitter specs are for devkits,
That very same tweet says, "2GHz Maximum CPU, 1 GHz Maximum GPU". That's the maximum the chip can do - the specs nVidia provided Nintendo, no doubt, because that's what the chip can do if you provide enough cooling. The fact they say maximum shows they're not the operating clocks - contrast that with Sony and MS leaks where we're given clockspeeds as target clockspeeds.

On the flip side, where do you think DF's numbers are coming from? If Switch is clocked that much faster, where the heck is 300 MHz coming from?
 