Nintendo Switch Tech Speculation discussion

Thanks! That completely contradicts their own announcement video.

This leak is from early July, and according to a GAF post from a Eurogamer author, they had been sitting on those clock values since late that month.

A lot might have happened in the meantime, like realizing how much storage space it would take from the console's paltry 32GB if they were to allow 8 different users on it.

EDIT: just realized this was from the Kotaku interview.
 
Pixel C has fully clocked Tegra X1:
http://www.anandtech.com/show/9972/the-google-pixel-c-review/2

It clearly loses to the iPhone 6s in all CPU benchmarks (even at 1.91 GHz), but wins in GPU benchmarks. If the Tegra X1's GPU had been downclocked to 300 MHz, it would have easily lost to the iPhone 6s in all GPU benchmarks as well. Apple has a pretty good SoC; both the CPU and GPU are top notch.
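For rough context on what a 300 MHz downclock would mean, here's a minimal sketch of how peak FP32 throughput scales with GPU clock, assuming the Tegra X1's 256 Maxwell CUDA cores; the clock values are illustrative, not confirmed Switch clocks:

```python
# Rough FP32 throughput scaling with GPU clock for a Tegra X1
# (256 Maxwell CUDA cores, 2 FLOPs per core per cycle via FMA).
# Clock values below are illustrative, not confirmed Switch clocks.
CORES = 256
FLOPS_PER_CORE_PER_CYCLE = 2  # one fused multiply-add per cycle

def gflops(clock_mhz: float) -> float:
    return CORES * FLOPS_PER_CORE_PER_CYCLE * clock_mhz / 1000.0

for clock in (1000, 768, 300):
    print(f"{clock} MHz -> ~{gflops(clock):.0f} GFLOPS FP32")
# 1000 MHz -> ~512, 768 MHz -> ~393, 300 MHz -> ~154
```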

The "leak" had the "leaker" claiming it would destroy iPhone 8.. so much for any credibility
 
What's so sexy about Cortex A57 cores running at 1780MHz and the GPU at 910MHz in a Tegra X1 while docked? Those values are below what you find in the two-year-old Shield TV.
They're hotter than what's been seen from reliable sources.
1 - Where did this "rare" thing come from? How are 2000 devkits for the GPU-dock version somehow rare?
2 - A dock that has a GPU matches Nintendo's own SCP patents.
2000/2,000,000 = 0.1% of the production volume anticipated for the first month; that seems pretty uncommon to me.
 
The clock speeds from Foxconn could be legit if Nintendo did opt for a die shrink to 16nm; even though the power consumption would go down significantly, the size of the chip would stay about the same (lining up with the Foxconn chip size leak). The shrink could be the entirety of the "custom" Tegra. The Switch has a pretty stout battery, and we know it can be sucked dry in 2.5 hours. I believe this puts the power draw somewhere around 6.4 watts to drain the battery in that amount of time. Four A57 cores clocked at 1780MHz on 20nm pull about 6 watts. If 16nm FinFET delivers a 40% reduction in power consumption, this brings them down to 3.6 watts. Eurogamer is the only source referencing the specific clock speeds, and even they admit that the sourced info is from many months ago. Rumors suggested that in October dev kits went out with improved performance. I'm not saying the Foxconn leaks are more believable than the Eurogamer leak, but there were enough specifics listed that they shouldn't be outright dismissed.
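A minimal sketch of that back-of-the-envelope math, assuming a ~16 Wh battery (which is roughly what a 2.5-hour drain at ~6.4 W implies) and taking the ~6 W A57 figure and the ~40% FinFET reduction at face value:

```python
# Back-of-the-envelope power math from the paragraph above.
# Assumptions: ~16 Wh battery (implied by 2.5 h at ~6.4 W),
# ~6 W for 4x A57 @ 1.78 GHz on 20nm, ~40% reduction on 16nm FinFET.
battery_wh = 16.0
drain_hours = 2.5
system_draw_w = battery_wh / drain_hours      # ~6.4 W total system draw
a57_20nm_w = 6.0                              # quoted figure for 4x A57 @ 1.78 GHz
a57_16nm_w = a57_20nm_w * (1 - 0.40)          # ~3.6 W after the claimed shrink

print(f"system draw: ~{system_draw_w:.1f} W")
print(f"4x A57 @ 1.78 GHz: ~{a57_20nm_w:.1f} W (20nm) -> ~{a57_16nm_w:.1f} W (16nm)")
```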

I'm looking forward to the hackers figuring out exactly what the clocks are next month so we can put that part of the discussion to rest.
 
I would definitely want all that data for extreme situations in the later prototype stages.
For final production hardware, which is clearly what this Foxconn employee was dealing with, it makes no sense IMO.
What a factory's QA team does during the full production stage is take random samples from separate batches and test them under normal operating conditions, give or take a small power/heat/force/pressure/etc. headroom. This applies to almost everything AFAIK.

They definitely don't do 80% overclocks on the CPU that consume 350% of the normal operation power. I'm pretty sure Intel isn't testing their Kaby Lake i7 CPUs at 6GHz while pulling 300W from the socket. Nor is Samsung testing their Galaxy S7 motherboards with the Snapdragon 820 clocked at 3.6GHz. Sony isn't testing their PS4 Pro with the Jaguars clocked at 3.8GHz either.
It's such a big difference that it's simply an unrealistic setting to test on final production hardware. At that point, gathering that kind of data is simply a waste of time.
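As a worked illustration of why the gap is that large: dynamic power scales roughly with f·V², and voltage has to rise with frequency, so an ~80% overclock ending up around ~350% of normal power is plausible. The voltage ratio below is hypothetical, just to show the shape of the curve:

```python
# Rough illustration of why an ~80% overclock can mean ~350% power.
# Dynamic power scales roughly as P ~ f * V^2, and voltage must rise with clock.
# The +40% voltage figure here is hypothetical.
def relative_power(freq_ratio: float, volt_ratio: float) -> float:
    return freq_ratio * volt_ratio ** 2

stock = relative_power(1.0, 1.0)
overclocked = relative_power(1.8, 1.4)   # +80% clock, +40% voltage (assumed)
print(f"power vs. stock: ~{overclocked / stock:.1f}x")   # ~3.5x, i.e. ~350%
```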

If someone claimed that the original 3DS was tested at 480MHz, would that sound outrageous? What if someone said the PS Vita was tested at 800MHz? I don't think that sounds too crazy, because unlike those other scenarios it's actually technically feasible, and they probably wouldn't need any kind of exotic cooling or power conditioning to achieve it.

I feel like a broken record here, but it was pretty standard for old Nintendo handhelds to be capable of running at clock speeds far in excess of what they normally ran at.

Gameboy could be overclocked to 2x speed.
Gameboy Color could be overclocked to 3x speed.
Gameboy Advance could be overclocked to 1.9x speed.
Nintendo DS could be overclocked to 1.8x speed.

These overclocked the entire system, not just the CPU and/or GPU, and eventually that approach of replacing the system oscillator would stop being feasible... but I have a pretty good feeling that if there were a way to overclock the CPU on the 3DS, there'd be a lot of headroom there. And probably a good amount on the n3DS, if not as much.

So releasing handhelds that are clocked way lower than they can handle is nothing new (and there are good reasons for this), but why would Nintendo perform tests at these clock speeds? Because they can, and because it could better fulfill certain test objectives. Maybe they have tests where the number of cycles is more representative than run time, in which case higher clocks make the testing take less time (see the sketch below). Maybe these test settings can cause marginal parts to fail after minutes instead of weeks or months. If you're not a Switch test engineer, you can't really say for sure that this sort of testing doesn't make sense.
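A trivial sketch of the cycles-vs-run-time point, with made-up numbers: if a test target is a fixed cycle count, a higher clock finishes it proportionally sooner:

```python
# If a test is defined by a fixed cycle count rather than wall-clock time,
# a higher clock finishes it proportionally sooner. All numbers are made up.
TARGET_CYCLES = 1e13

for clock_ghz in (1.0, 1.78):
    seconds = TARGET_CYCLES / (clock_ghz * 1e9)
    print(f"{clock_ghz} GHz: ~{seconds / 3600:.1f} h to hit {TARGET_CYCLES:.0e} cycles")
# 1.0 GHz -> ~2.8 h, 1.78 GHz -> ~1.6 h
```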
 
Nvidia is going to pull an Nvidia; looking back at the 2001 Xbox and the PS3, neither MS nor Sony ever wants to have anything to do with Nvidia again. Nintendo will probably learn the hard way as well...
 
Nvidia is going to pull an Nvidia; looking back at the 2001 Xbox and the PS3, neither MS nor Sony ever wants to have anything to do with Nvidia again. Nintendo will probably learn the hard way as well...
Really? Based on developer comments, the Switch has been a dream to develop for. If the Tegra X1 is clocked low in the Switch, that's Nintendo, not Nvidia. Nintendo can be ultra conservative. I think Nvidia is the best thing going for third-party development on the Switch. The API and tools seem to be excellent, and that's all Nvidia.

 
Best to just expect the worst-case scenario, that being the Digital Foundry clocks. That way we might be surprised. Knowing how conservative Nintendo is, it's more likely than not that those are the clocks, honestly.
 
The challenges with Nvidia have historically had little to do with their tech and a lot to do with their business practices. I personally doubt they would burn bridges with Nintendo the way they have with other vendors (OG Xbox NV2A license fees, 'bumpgate', RSX issues), as they really need some wins for their mobile tech, but they have more form than most.
 
The point is that neither MS nor Sony is a mortal enemy of nVidia, and what exactly is 'pulling an nVidia' anyway? They supplied a good part for the OXB, but MS screwed up on the details of the contract. They supplied their leading GPU model for the PS3 in a Sony rush job that only looked bad next to the G80 because the PS3 was delayed. So what is the pattern of behaviour that constitutes 'an nVidia', and how is this going to impact the Switch?
 
The point is that neither MS nor Sony is a mortal enemy of nVidia, and what exactly is 'pulling an nVidia' anyway? They supplied a good part for the OXB, but MS screwed up on the details of the contract. They supplied their leading GPU model for the PS3 in a Sony rush job that only looked bad next to the G80 because the PS3 was delayed. So what is the pattern of behaviour that constitutes 'an nVidia', and how is this going to impact the Switch?
They lied about fixed shaders being superior to unified; that was their PR at the time. And I find it hard to believe Nvidia would've given them a better price than ATI, who had superior tech at that time. Yes, most of that is on Sony for being crazy with the PS3 architecture, but still.

The GameCube GPU was superior at polygons, though yeah, the Xbox GPU was also very good. The problem is Nvidia didn't want to work with MS and cut a deal; even if die shrinks weren't in the contract, I'd be salty as well if I were MS. The Xbox was a very good piece of hardware; too bad we never got a slim version with better reliability.

But with the Switch, Nvidia's given them a great deal.
 
They lied about fixed shaders being superior to unified; that was their PR at the time. And I find it hard to believe Nvidia would've given them a better price than ATI, who had superior tech at that time. Yes, most of that is on Sony for being crazy with the PS3 architecture, but still.
All of that is Sony being crazy, unless you have evidence of the negotiations that shows Sony were hoodwinked. They would have talked with the available parties, and for all we know they wanted AMD's unified shader tech, but by the time they contacted them (after failed experiments with their own GPU ideas), MS had already signed an exclusivity deal.

Unless someone has evidence that nVidia engaged in dodgy dealings, nothing can be concluded from the hardware options. The only known quantity is that they had a limiting licensing deal on OXB, which was MS's responsibility to sort out when reading the fine print - this is business and nVidia is in it for the money like everyone else. The fact MS continues to use nVidia parts in MS products shows there's no sour relationship there, because business is largely impersonal and it'd be financially damaging to select partners based on past contracts.
 
All of that is Sony being crazy, unless you have evidence of the negotiations that shows Sony were hoodwinked. They would have talked with the available parties, and for all we know they wanted AMD's unified shader tech, but by the time they contacted them (after failed experiments with their own GPU ideas), MS had already signed an exclusivity deal.

Unless someone has evidence that nVidia engaged in dodgy dealings, nothing can be concluded from the hardware options. The only known quantity is that they had a limiting licensing deal on OXB, which was MS's responsibility to sort out when reading the fine print - this is business and nVidia is in it for the money like everyone else. The fact MS continues to use nVidia parts in MS products shows there's no sour relationship there, because business is largely impersonal and it'd be financially damaging to select partners based on past contracts.

Good points, and I don't have any evidence; in fact, I couldn't even find anything concrete with Nvidia claiming fixed shaders were superior. But it's not like Nvidia, or any company for that matter, is going to say their tech is inferior when they're trying to land a deal. It still makes more sense that Nvidia misled Sony than that ATI signed an exclusivity deal.
 