What's so sexy about Cortex A57 cores working at 1780MHz and GPU at 910MHz in a Tegra X1 while docked?
It's stripped down? ( ͡° ͜ʖ ͡°)
Any reference to that somewhere? I'm going from the official Nintendo release video, which specifically states only one account per Switch. If there are separate profiles within the account that can be controlled separately for adults and children, then great, but I'm not aware of anything like that being announced.
Thanks! That completely contradicts their own announcement video.
iPhone 6 will win against a 300 MHz Tegra X1?
The Pixel C has a fully clocked Tegra X1:
http://www.anandtech.com/show/9972/the-google-pixel-c-review/2
It clearly loses to the iPhone 6s in all CPU benchmarks (even at 1.91 GHz), but wins in GPU benchmarks. If the Tegra X1 GPU had been downclocked to 300 MHz, it would have lost easily to the iPhone 6s in all GPU benchmarks as well. Apple has a pretty good SoC; both the CPU and GPU are top notch.
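To make the 300 MHz scenario concrete: if GPU benchmark throughput scaled roughly linearly with core clock (ignoring memory bandwidth and thermal limits), a fully clocked X1 result would shrink to under a third. A minimal sketch; the clock values and the sample score below are illustrative assumptions, not measurements:

```python
# Back-of-the-envelope: how a full-clock Tegra X1 GPU result would scale if the
# GPU were downclocked to ~300 MHz. Assumes performance scales linearly with
# clock, ignoring memory bandwidth and thermal limits.

FULL_CLOCK_MHZ = 1000   # approximate Tegra X1 GPU clock in the Pixel C (assumption)
DOWNCLOCKED_MHZ = 300   # hypothetical portable clock from the discussion

def scaled_score(full_clock_score: float) -> float:
    """Estimate a downclocked benchmark score from a full-clock result."""
    return full_clock_score * (DOWNCLOCKED_MHZ / FULL_CLOCK_MHZ)

# Placeholder score, NOT a real measurement:
pixel_c_fps = 40.0
print(f"Estimated result at 300 MHz: {scaled_score(pixel_c_fps):.1f} fps")
# -> 12.0 fps under these assumptions, i.e. less than a third of the
#    full-clock result, so the GPU wins over the iPhone 6s would flip to losses.
```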
The "leak" had the "leaker" claiming it would destroy the iPhone 8... so much for any credibility.
Apple is very tight-lipped about their future products. Only an Apple insider would know iPhone 8 performance details, so I would suspect the credibility of any claims versus future iPhones. Why would some Apple insider leak this info and break their NDA?
What's so sexy about Cortex A57 cores working at 1780MHz and GPU at 910MHz in a Tegra X1 while docked? Those values are below what you find in the 2-year-old Shield TV.
They're hotter than what's been seen from reliable sources.
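For scale, here's a quick ratio of the leaked docked clocks against the retail Shield TV's Tegra X1; the Shield TV figures (roughly 2.0 GHz CPU, 1.0 GHz GPU) are commonly cited numbers used here as assumptions, not official spec-sheet values:

```python
# Ratio of the leaked docked clocks to the retail Shield TV's Tegra X1 clocks.
# The Shield TV values are commonly cited figures, used here as assumptions.

leak = {"cpu_mhz": 1780, "gpu_mhz": 910}         # clocks from the Foxconn leak
shield_tv = {"cpu_mhz": 2000, "gpu_mhz": 1000}   # assumed ~2.0 GHz A57, ~1.0 GHz GPU

for part in ("cpu_mhz", "gpu_mhz"):
    ratio = leak[part] / shield_tv[part]
    print(f"{part}: {leak[part]} MHz is {ratio:.0%} of the Shield TV clock")
# cpu_mhz: 1780 MHz is 89% of the Shield TV clock
# gpu_mhz: 910 MHz is 91% of the Shield TV clock
```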
1 - Where did this "rare" thing come up from? How are 2000 devkits for the GPU dock version somehow rare?
2 - A dock that has a GPU matches Nintendo's own SCP patents.
2000/2,000,000 = 0.1% of the production volume anticipated for the first month, which seems pretty uncommon to me.
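As a quick check of that arithmetic (the two-million figure is the anticipated first-month production volume cited above):

```python
# Share of the anticipated first-month production volume that 2000 devkits represent.
devkits = 2_000
first_month_volume = 2_000_000   # anticipated first-month production volume cited above

share = devkits / first_month_volume
print(f"{share:.2%} of first-month volume")   # 0.10% -- one unit in a thousand
```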
I would definitely want all that data for extreme situations in the later prototype stages.
For final production hardware, which is clearly what this Foxconn employee was dealing with, it makes no sense IMO.
What a factory's QA team does during the full production stage is take random samples from separate batches and test them under normal operating conditions, give or take a small power/heat/force/pressure/etc. headroom. This applies to almost everything AFAIK.
They definitely don't do 80% overclocks on the CPU that consume 350% of the normal operation power. I'm pretty sure Intel isn't testing their Kaby Lake i7 CPUs at 6GHz while pulling 300W from the socket. Nor is Samsung testing their Galaxy S7 motherboards with the Snapdragon 820 clocked at 3.6GHz. Sony isn't testing their PS4 Pro with the Jaguars clocked at 3.8GHz either.
It's such a big difference that it's simply an unrealistic setting to test on final production hardware. At that point, gathering that kind of data is simply a waste of time.
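For a sense of why an "80% overclock at 350% power" is so far outside normal validation: dynamic power scales roughly with frequency times voltage squared, so a large frequency increase that also requires more voltage compounds quickly. A rough sketch under that standard approximation; the 40% voltage bump is an illustrative assumption, not a measured value:

```python
# Rough dynamic-power scaling: P is proportional to f * V^2, so the capacitance
# term cancels when taking a ratio. The 40% voltage bump is an illustrative
# assumption for what an ~80% frequency increase might require, not a measurement.

def power_ratio(freq_scale: float, voltage_scale: float) -> float:
    """Relative dynamic power for a given frequency and voltage scaling."""
    return freq_scale * voltage_scale ** 2

overclock = power_ratio(freq_scale=1.8, voltage_scale=1.4)
print(f"~{overclock:.0%} of stock power")   # ~353%, in line with the 350% figure above
```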
Nvidia is going to pull an Nvidia; looking back at the Xbox in 2001, or the PS3, neither MS nor Sony ever wants to have anything to do with Nvidia again. Nintendo will probably learn the hard way as well.
Really? Based on developer comments, Switch has been a dream to develop for. If the Tegra X1 is clocked low in Switch, that's Nintendo, not Nvidia; Nintendo can be ultra conservative. I think Nvidia is the best thing going for third-party development on Switch. The API and tools seem to be excellent, and that's all Nvidia.
I keep posting this - Microsoft has used NVidia in various Surface devices. They just launched that Surface Book with the custom-configured GM206 last fall.
Surface is a high-margin product.
The point is that neither MS nor Sony is a mortal enemy of nVidia, and what exactly is 'pulling an nVidia' anyway? They supplied a good part for the OXB but MS screwed up on the details of the contract. They supplied their leading GPU model for the PS3 in a Sony rush-job that only looked bad next to the G80 because the PS3 was delayed. So what is the pattern of behaviour that constitutes 'an nVidia', and how is this going to impact Switch?
They lied about fixed shaders being superior to unified; that was their PR at the time. And I find it hard to believe Nvidia would've given them a better price than ATI, who had superior tech at that time. Yes, most of that is on Sony for being crazy with the PS3 architecture, but still.
All of that is Sony being crazy, unless you have evidence of the negotiations that shows Sony was hoodwinked. They would have discussed things with the available parties, and for all we know they wanted AMD's unified shader tech, but by the time they contacted them (after failed experiments with their own GPU ideas), MS had already signed an exclusivity deal.
Unless someone has evidence that nVidia engaged in dodgy dealings, nothing can be concluded from the hardware options. The only known quantity is that they had a limiting licensing deal on OXB, which was MS's responsibility to sort out when reading the fine print - this is business and nVidia is in it for the money like everyone else. The fact MS continues to use nVidia parts in MS products shows there's no sour relationship there, because business is largely impersonal and it'd be financially damaging to select partners based on past contracts.