Nintendo Switch Tech Speculation discussion

Status
Not open for further replies.
You completely ignore the reality that the NS is actively cooled, meaning it has a fan, and we have seen the X1 passively cooled at 850 MHz. Your solution of downclocking to avoid paying for active cooling doesn't apply, because the Switch has active cooling. Like I said, think.
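For what it's worth, the downclock-vs-fan trade-off being argued here can be sketched with the standard dynamic-power relation P ≈ C·V²·f. The capacitance and voltage figures below are illustrative assumptions, not known Tegra X1 numbers; the point is only that lowering frequency (and with it voltage) cuts power superlinearly:

```python
# Back-of-envelope dynamic switching power: P ~ C_eff * V^2 * f.
# All constants are illustrative assumptions, not real Tegra X1 data.

def dynamic_power(c_eff, voltage, freq_hz):
    """Switching power in watts for an effective capacitance c_eff (farads)."""
    return c_eff * voltage**2 * freq_hz

C_EFF = 1.2e-9  # assumed effective switched capacitance, farads

full = dynamic_power(C_EFF, 1.0, 1.0e9)     # ~1 GHz at an assumed 1.0 V
passive = dynamic_power(C_EFF, 0.9, 850e6)  # ~850 MHz at a lowered 0.9 V

print(f"full clock: {full:.2f} W")
print(f"850 MHz:    {passive:.2f} W ({passive / full:.0%} of full)")
```

With these made-up numbers a 15% clock drop plus a voltage drop already shaves off roughly a third of the dynamic power, which is the mechanism behind the "downclock instead of fan" argument.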

Technology licensing costs and chip purchase costs could easily be greater than the cost of a tiny fan and air venting.

Edit: this is assuming that they're even going the licensing route, and not simply buying complete chips with 'x' characteristics for 'y' dollars per pop.
 
The same thinking that said, "we don't need Nvidia."

You don't hire Nvidia just to keep doing more of what you have been doing.
Do we know that Nintendo didn't have the same thinking when starting on NX, came up with an esoteric notion, and had to pull a last-minute switch because, like the PS3's failed GPU concept, they couldn't pull it off and needed a last-minute solution? Certainly that's why Nvidia got the PS3 contract.

Think about this: we know that the Nintendo Switch has a fan and is actively cooled. Part of the 20nm chip price would then include the active cooling solution: millions of fans, more expensive case designs to cool the product, and a thicker device, all adding to the cost.
This makes no sense in relation to your other argument that N. are using 16 nm. If they are using 16 nm, why do they need the fan? If they are using 20 nm, why not use the fan for the same purpose, just with less performance?

They could have opted for the 16nm process node, had passive cooling, and had dock performance on the go (possibly).
Why can't they use passive cooling on the go and active cooling when docked with X1?

The fact that it has active cooling means that performance demanded it and that a smaller process node was unavailable...
You may want to listen to yourself here. ;) You've just cited the fan as evidence that the part isn't 16nm because it needs to be actively cooled.

Basically, presence or absence of fan tells us nothing. We've no idea if it's a low power part pushed or a high power part pushed.
 
This makes no sense in relation to your other argument that N. are using 16 nm. If they are using 16 nm, why do they need the fan?

It makes all the sense in the world. His point is that they need the fan because it's faster than X1. X1 level performance on 14/16nm wouldn't need a fan. And keep reading down below.

If they are using 20 nm, why not use the fan for the same purpose, just with less performance?

Because using 14/16nm would arguably be more economical than 20nm + fan + thicker "tablet" (this is his point not mine. I'm inclined to agree tho). It would also lead to a better overall design for the device.

You may want to listen to yourself here. ;) You've just cited the fan as evidence that the part isn't 16nm because it needs to be actively cooled.

And no, of course not. He is definitely saying that it is 16nm, because if it was 20nm or any other node, a better node would be available, this being 16nm.
 
Actually it's 28nm; that's why it's custom. Nintendo paid Nvidia to make a custom 28nm Tegra so they can have the production room to sell 20 million units in the first year.
 
It makes all the sense in the world. His point is that they need the fan because it's faster than X1. X1 level performance on 14/16nm wouldn't need a fan. And keep reading down below.

Because using 14/16nm would arguably be more economical than 20nm + fan + thicker "tablet" (this is his point not mine. I'm inclined to agree tho). It would also lead to a better overall design for the device.

And no, of course not. He is definitely saying that it is 16nm, because if it was 20nm or any other node, a better node would be available, this being 16nm.

Thank you. I'm not sure how much clearer I could get: if it was X1 performance, 16nm is probably cheaper than 20nm plus active cooling, a larger form factor, etc. Meaning they are probably overshooting the X1 (a two-year-old chip that was shrunk from the 28nm Maxwell design for a handful of Shield TV devices, and that was likely cheaper at the time than 20nm). No one is expecting this to match or exceed current-gen consoles, but the path of least resistance is likely greater than X1. This chip could even be X1-level on the go, since that is what the dev kits were, and Pascal Tegra reduces power consumption by 60% over Maxwell Tegra. So less than 5 watts?
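The "less than 5 watts?" figure in the post above is just the 60% claim applied to an assumed baseline. As a quick sanity check (the 10 W Maxwell figure is an assumption for illustration; actual Tegra X1 SoC power varies with workload and clocks):

```python
# Rough arithmetic behind "Pascal cuts Maxwell Tegra power by 60%".
# The 10 W baseline is an assumed Maxwell Tegra X1 figure, not a published spec.

maxwell_watts = 10.0   # assumed X1 SoC power under load
reduction = 0.60       # claimed Pascal-over-Maxwell saving

pascal_watts = maxwell_watts * (1 - reduction)
print(f"estimated Pascal-class power at X1 performance: {pascal_watts:.1f} W")
# 10 W * 0.4 = 4.0 W, which is where the "less than 5 watts?" guess comes from
```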
 
Why would it be X1 performance? Probably less than X1 when mobile and X1 max when docked, if they are using an X1 as a dev kit.
 
Well, can't we stick with at least X1, at best X2, and wait, politely nodding at each other from time to time?

We could, if it wasn't for the blind faithful dreamers. They don't want to act that way in reality.
 
Hey, dreamers are the ones pushing the world forward!

Don't bring us down :(
 
Yo Shifty, this must've gotten lost in the shuffle. I messed up the reply system or something. Still relevant, though.

What does the state of the mobile market have to do with Nintendo's choices for Wii U? Looking at Wii U as a console, it was facing a technological challenge from its contemporaries similar to the one NS will face from mobile, and Nintendo chose to go weak on power instead of bleeding edge. What was the thinking behind Wii U that said, "1 TF and 4 GB of RAM at 100 GB/s is too much; let's go with 16 GB/s DDR and a <200 GF GPU," and why doesn't that same thinking apply to Switch?

It's not so much that it doesn't (believe me, I'd like to think I learned my lesson after being a longtime GAF lurker in the Wii U speculation threads); it's that people need to be careful they don't go so far the other way that it becomes a sort of trump-all point, especially when there's at bare minimum some fairly good trustworthiable (is that a word?) evidence to the contrary.

Let's be clear here, though. I'm not saying we know for a fact it's going to be even competently powerful. It's more a matter of disagreeing with the rather intransigent notion that it almost CAN'T be powerful "cuz Nintendo". Not saying you're necessarily doing this (I don't really think you are), but still, at worst it's a point worth bringing up.

Does the question in all this then become: how on earth could a home console that plugs into your TV rightfully be considered a success if it can't even match the One on paper in a best-case scenario? Because of its nature as a hybrid! If NINTENDO (!) can successfully manage to do both in a grossly fattened tablet whilst still effectively allowing for fairly doable home console ports, then, to me at least, that's a technical accomplishment in and of itself and meets the bare minimum of what should've rightly been expected of their next system. FWIW, I would have MUCH rather they just made a home console that shared the same architecture with their new handheld whilst still leveraging the advantages that a unified library brings, but it is what it is.
 
Actually it's 28nm; that's why it's custom. Nintendo paid Nvidia to make a custom 28nm Tegra so they can have the production room to sell 20 million units in the first year.

It wouldn't be surprising. TSMC has a newer 28HPC+ process that's even better than 28HPM in terms of performance and power consumption; it could be close enough to 20nm while being way cheaper.
 
Any time anyone has ever said the portable device will be Xbox One level of performance without placing qualifiers on it such as lower resolution.
I can't agree with that. Pre-reveal, maybe, but post-reveal the 720p screen is a given. Nobody needs to restate that every time they mention performance. And I don't recall seeing any straight-up XB1 performance quoted. CPU side, yeah... with FP16, yeah... but I don't remember anyone claiming 1.3 TF for FP32.
 