It is the most likely solution.
No, it isn't.
No, it isn't.
Wii and Wii U both made Nintendo more money per unit than they would have with more expensive parts. Do you even math?
You completely ignore the reality that the NS is actively cooled, meaning with a fan, and we have seen the X1 passively cooled at 850 MHz. Your solution of downclocking to avoid paying for active cooling doesn't apply, because the Switch has active cooling. Like I said, think.
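For context, the downclock-to-go-passive idea being argued here rests on the usual first-order CMOS dynamic-power relation, P ≈ C·V²·f: a lower clock (and the lower voltage it permits) means less heat to dissipate. A minimal sketch of that relation, with purely illustrative clock and voltage numbers rather than measured Tegra X1 figures:

```python
# Rough sketch of the downclock argument, using the standard simplified
# CMOS dynamic-power model P ~ C * V^2 * f. All numbers here are
# illustrative assumptions, not measured Tegra X1 figures.

def relative_power(freq_ghz, volts, base_freq_ghz=1.0, base_volts=1.0):
    """Dynamic power relative to a baseline clock/voltage operating point."""
    return (volts / base_volts) ** 2 * (freq_ghz / base_freq_ghz)

# Hypothetical operating points: full clock vs. a ~850 MHz downclock
# that also allows a small voltage drop.
full_speed = relative_power(1.0, 1.0)     # baseline = 1.0
downclocked = relative_power(0.85, 0.95)  # ~0.77 of baseline power

print(f"Downclocked GPU draws roughly {downclocked / full_speed:.0%} "
      "of the baseline dynamic power in this toy model.")
```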
The same thinking that said, "we don't need Nvidia."
You don't hire Nvidia just to keep doing more of what you have been doing.
Do we know that Nintendo didn't have the same thinking when starting on NX, came up with an esoteric notion, and had to pull a last-minute switch because, like the PS3's failed GPU concept, they couldn't pull it off and needed a "last-minute" solution? Certainly that's why Nvidia got the PS3 contract.
Think about this: we know that the Nintendo Switch has a fan and is actively cooled, so part of the price of a 20nm chip would then include the active cooling solution: millions of fans, more expensive case designs to cool the product, and a thicker device, all adding to the cost.
This makes no sense in relation to your other argument that N. are using 16 nm. If they are using 16 nm, why do they need the fan? If they are using 20 nm, why not use the fan for the same purpose, just with less performance?
They could have opted for the 16nm process node and had passive cooling and (possibly) docked performance on the go.
Why can't they use passive cooling on the go and active cooling when docked with the X1?
The fact that it has active cooling means that performance demanded it and that a smaller process node was unavailable...
You may want to listen to yourself here. You've just cited the fan as evidence that the part isn't 16nm because it needs to be actively cooled.
Do you even math?
Well, that is what my degree is in...
This makes no sense in relation to your other argument that N. are using 16 nm. If they are using 16 nm, why do they need the fan?
If they are using 20 nm, why not use the fan for the same purpose, just with less performance?
You may want to listen to yourself here. You've just cited the fan as evidence that the part isn't 16nm because it needs to be actively cooled.
It makes all the sense in the world. His point is that they need the fan because it's faster than X1. X1 level performance on 14/16nm wouldn't need a fan. And keep reading down below.
Because using 14/16nm would arguably be more economical than 20nm plus a fan plus a thicker "tablet" (this is his point, not mine, though I'm inclined to agree; see the rough numbers sketched below). It would also lead to a better overall design for the device.
And no, of course not. He is definitely saying that it is 16nm, because if it were 20nm or any other node, a better node would still have been available, namely 16nm.
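To put the economics argument in concrete terms, here is a back-of-envelope sketch. Every figure in it (die cost, fan cost, chassis delta) is a made-up placeholder chosen only to show the shape of the comparison, not an actual BOM number:

```python
# Back-of-envelope comparison of the two options argued above.
# Every figure below is a hypothetical placeholder, not a real BOM number.

HYPOTHETICAL_COSTS = {
    "20nm": {"soc": 25.0, "fan": 3.0, "thicker_chassis_delta": 2.0},
    "16nm": {"soc": 30.0, "fan": 0.0, "thicker_chassis_delta": 0.0},
}

def per_unit_cost(node: str) -> float:
    """Sum the hypothetical per-unit costs for a given process-node option."""
    return sum(HYPOTHETICAL_COSTS[node].values())

for node in HYPOTHETICAL_COSTS:
    print(f"{node}: ${per_unit_cost(node):.2f} per unit (toy numbers)")

# Whether 16nm actually comes out ahead depends entirely on the real SoC
# price gap between the nodes, which nobody in this thread knows.
```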
Well, can't we stick with at least X1 and at best X2, and wait, politely nodding at each other from time to time?
Hey, dreamers are the ones pushing the world forward!
Don't bring us down.
Dreamers are the ones who take the credit for pushing the world forward. Engineers are the ones who actually do the pushing.
We could, if it wasn't for the blind faithful dreamers. They don't want to act that way in reality.
I guess I missed the whole greater-than-Parker discussion; where was that exactly?
What does the state of the mobile market have to do with Nintendo's choices for the Wii U? Looking at the Wii U as a console, it faced from its contemporaries a technological challenge similar to the one the NS will face from mobile, and Nintendo chose to go weak on power instead of bleeding edge. What was the thinking behind the Wii U that said, "1 TF and 4 GB of RAM at 100 GB/s is too much; let's go with 16 GB/s DDR and a <200 GF GPU," and why doesn't that same thinking apply to Switch?
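As a quick sanity check on those two spec points: the ratio of memory bandwidth to compute is roughly similar in both configurations, and the gap is almost entirely in absolute throughput. A small sketch using the post's own round numbers, not measured Wii U specs:

```python
# Bandwidth-to-compute ratio for the two configurations named above,
# using the post's own round numbers (not measured Wii U specs).

configs = {
    "hypothetical 'bleeding edge' box": {"gflops": 1000.0, "bandwidth_gbs": 100.0},
    "Wii U-style box": {"gflops": 200.0, "bandwidth_gbs": 16.0},
}

for name, c in configs.items():
    ratio = c["bandwidth_gbs"] / c["gflops"]  # bytes of DRAM traffic per FLOP
    print(f"{name}: {ratio:.2f} bytes/FLOP")
```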
Actually, it's 28nm; that's why it's custom. Nintendo paid Nvidia to make a custom 28nm Tegra so they could have the production capacity to sell 20 million units in the first year.
I guess I missed the whole greater-than-Parker discussion; where was that exactly?
Any time anyone has ever said the portable device will be Xbox One level of performance without placing qualifiers on it, such as lower resolution.
I can't agree with that. Pre-reveal, maybe, but post-reveal the 720p screen is a given; nobody needs to restate that every time they mention performance. And I don't recall seeing any straight-up XB1 performance quoted. CPU side, yeah... with FP16, yeah... but I don't remember anyone claiming 1.3 TF for FP32.
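For reference on where those numbers come from: peak throughput is just shader count × 2 ops per FMA × clock, and the Tegra X1 can double its FP32 rate with packed FP16. A quick sketch; the shader counts and the Xbox One clock are public figures, while the ~1 GHz clock assumed for the X1 is a placeholder, not a confirmed Switch spec:

```python
# Peak-FLOPS arithmetic behind the FP16/FP32 comparison above.
# Shader counts and the Xbox One clock are public figures; the ~1 GHz
# clock assumed for the Tegra X1 is a placeholder, not a confirmed
# Switch spec.

def peak_tflops(shaders: int, clock_ghz: float, ops_per_clock: int = 2) -> float:
    """Peak TFLOPS = shaders * ops per clock (2 for an FMA) * clock in GHz / 1000."""
    return shaders * ops_per_clock * clock_ghz / 1000.0

xb1_fp32 = peak_tflops(768, 0.853)  # Xbox One GPU: ~1.31 TFLOPS FP32
x1_fp32 = peak_tflops(256, 1.0)     # Tegra X1 at ~1 GHz: ~0.51 TFLOPS FP32
x1_fp16 = 2 * x1_fp32               # packed FP16 runs at twice the FP32 rate

print(f"Xbox One FP32: {xb1_fp32:.2f} TF")
print(f"Tegra X1 FP32: {x1_fp32:.2f} TF, FP16: {x1_fp16:.2f} TF")
```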