Nintendo Switch Tech Speculation discussion

Status
Not open for further replies.
Wii U CPU is slow and lacks good SIMD; it's basically an overclocked GameCube CPU x 3. Devs were somewhat happy only because it is out-of-order.
Devs were somewhat happy because they compared the Wii U CPU to the other last-gen console CPUs, which had no cache prefetch (you had to manually prefetch even linear arrays, otherwise 600+ cycle stalls), no store forwarding (40+ cycle stalls when writing to a memory location and then reading it... e.g. every function call), no direct path between int<->float<->vector register files (making it even harder to avoid store-forwarding stalls), and extremely long SIMD pipelines (loops had to be unrolled heavily to fill the pipelines). I am not even going to talk about branches... In-order execution is unable to hide any of these stalls. It's all hard manual work.
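
To make the prefetching point concrete, here is a minimal sketch (not actual console code; the 64-element lookahead distance is an illustrative guess) of the kind of manual software prefetch an in-order CPU without a hardware prefetcher forced on you, even for a plain linear walk:

```c
#include <stddef.h>

/* Illustrative sketch: sum a linear array while manually pulling cache
   lines in ahead of use. On an in-order CPU with no hardware prefetcher,
   omitting the prefetch means a long stall on every cache-line miss. */
static float sum_with_prefetch(const float *data, size_t n)
{
    float sum = 0.0f;
    for (size_t i = 0; i < n; ++i) {
#if defined(__GNUC__)
        /* Request the line ~64 elements ahead before we need it
           (args: address, 0 = for read, 0 = low temporal locality). */
        if (i + 64 < n)
            __builtin_prefetch(&data[i + 64], 0, 0);
#endif
        sum += data[i];
    }
    return sum;
}
```

On an out-of-order core the hardware hides most of this latency by itself, which is exactly the point being made above.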

But the Wii U didn't fare well in SIMD-heavy code, as it only had 2-wide paired math vs 4-wide SIMD (with single-cycle multiply-add) on Xbox One and the SPUs on Cell. Modern ARM chips have 4-wide SIMD (NEON). NEON is actually a pretty good instruction set. Better than SSE3 for sure (most PC games are still limited to SSE3).
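
For illustration (function name made up), here is a multiply-add loop written with a fixed inner width of 4 so a compiler can map each group of lanes to one 128-bit NEON (or SSE) multiply-add; on a 2-wide paired-singles machine the same four lanes take twice as many vector instructions:

```c
#include <stddef.h>

/* Hypothetical example of the 4-wide multiply-add discussed above.
   Each inner group of 4 lanes corresponds to one 128-bit NEON/SSE
   multiply-add; 2-wide paired singles needs two ops for the same work. */
void madd4(float *dst, const float *a, const float *b,
           const float *c, size_t n)
{
    for (size_t i = 0; i + 4 <= n; i += 4)
        for (size_t j = 0; j < 4; ++j)   /* one 4-lane vector op */
            dst[i + j] = a[i + j] * b[i + j] + c[i + j];
}
```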
 
Yeah, but you know in a lot of games 7 cores are not used at 100% at the same time; they're often bottlenecked by the slowest thread... A quad with better single-thread performance could have been nice?
 
Yeah, but you know in a lot of games 7 cores are not used at 100% at the same time; they're often bottlenecked by the slowest thread... A quad with better single-thread performance could have been nice?
Yes, a quad with 2x clocks (or a quad with 40% higher IPC & a 40% clock boost) is always at least as fast as an 8-core, and usually faster.

But twice as many simple CPU cores at a lower clock = much better perf/watt. More small cores are preferable for games/engines that are well optimized for the architecture and scale to a large number of cores. All work is split into small independent tasks -> there are no hard-coded threads -> no single thread can be the bottleneck. Modern PC rendering APIs such as DirectX 12 and Vulkan now also allow submitting GPU work simultaneously from any number of CPU cores, eliminating the biggest traditional single-thread bottleneck (rendering). Of course it takes time for engines to evolve. Many engines are still designed around the DirectX 9/10/11 threading model, where a single render thread is often the bottleneck.
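
The task model described above can be sketched in a few lines (sizes and names are illustrative, not from any real engine): work is cut into many small independent tasks and every worker thread just claims the next one, so no hard-coded thread can become the bottleneck and the same workload runs on any core count:

```c
#include <pthread.h>
#include <stdatomic.h>

/* Illustrative task system: NTASKS small independent chunks of work,
   claimed one at a time from a shared atomic cursor by any worker. */
enum { NTASKS = 64, CHUNK = 1024 };

static int g_data[NTASKS * CHUNK];
static _Atomic int g_next_task;   /* shared cursor into the task list */
static _Atomic long g_total;      /* combined result of all tasks */

static void *worker(void *arg)
{
    (void)arg;
    for (;;) {
        int t = atomic_fetch_add(&g_next_task, 1);  /* claim a task */
        if (t >= NTASKS)
            return NULL;                            /* queue drained */
        long partial = 0;
        for (int i = 0; i < CHUNK; ++i)             /* do the task */
            partial += g_data[t * CHUNK + i];
        atomic_fetch_add(&g_total, partial);
    }
}

/* Run the same task list on any number of worker threads (up to 8);
   the result is identical regardless of the thread count. */
static long parallel_sum(int nthreads)
{
    pthread_t th[8];
    atomic_store(&g_next_task, 0);
    atomic_store(&g_total, 0);
    for (int i = 0; i < nthreads; ++i)
        pthread_create(&th[i], NULL, worker, NULL);
    for (int i = 0; i < nthreads; ++i)
        pthread_join(th[i], NULL);
    return atomic_load(&g_total);
}
```

Because no task is pinned to a particular thread, adding more small cores just drains the queue faster.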
 
I assume Tegra X1, but it could be X2; I don't really expect it, but it could be a pleasant surprise. The clock speeds sound legit, really. Why are people complaining about having a really good handheld console from Nintendo which doubles as a home console?
I like the idea, and the hardware isn't laughable, just not as good as some crazy enthusiasts' wet dreams...
(And I assume those people dislike the XB1 and PS4, because they are really crap compared to high-end PC components.)
 
I assume Tegra X1, but it could be X2; I don't really expect it, but it could be a pleasant surprise. The clock speeds sound legit, really. Why are people complaining about having a really good handheld console from Nintendo which doubles as a home console?
I like the idea, and the hardware isn't laughable, just not as good as some crazy enthusiasts' wet dreams...
(And I assume those people dislike the XB1 and PS4, because they are really crap compared to high-end PC components.)

It's pretty easy to see why people are complaining:
For a home console those specs are not really acceptable in 2017. They are good for a portable, though. This is just Nintendo's way of exiting the home console race and focusing on handhelds while trying to keep its home console fans happy.
 
One of the interesting things is that it is being listed after NVIDIA products like the TK1 and TX1. Is NVIDIA doing the submission, or Nintendo, since the dates are very close? It looks like NVIDIA is working a lot more closely with Nintendo on Switch than I thought.
 
To me it looks like nVidia is working as closely as possible so it can launch its own console, like Sony did to Nintendo once already, and like MS did to SEGA on the Dreamcast.
 
Nvidia is preparing the presentation of the 2017 version of its Android TV box.
The last version was the Tegra X1 with 2GHz for the CPU and 1GHz for the GPU, so the next version will be "ouch" for Switch's CPU and "please no" for Switch's GPU.
 
I assume Tegra X1, but it could be X2; I don't really expect it, but it could be a pleasant surprise. The clock speeds sound legit, really. Why are people complaining about having a really good handheld console from Nintendo which doubles as a home console?
I think because of the future viability of the platform and Nintendo as a company. Nintendo's market is always shrinking. They needed (to some of our minds) a platform that'd grow their market share by appealing to a wider audience. A console that could pander to the COD/FIFA crowd as well as the Nintendo Portable crowd could have done that, whereas a new portable is only going to appeal to the (ever shrinking) Nintendo portable audience in all likelihood.

I suppose the good news here really is, if these specs are true, Nintendo is probably a step closer to releasing their games on other platforms. They'll be developing skills using nVidia tools and GPUs that'll be portable to other consoles, instead of their own esoteric hardware. Switch comes out at the end of 2017; a couple of years after that of lousy sales and no software because 3rd parties avoid it*; maybe the first PS/XB games from Nintendo in 2020?

* Any chance of Nintendo having an open platform so indies can target this without jumping through hoops? Probably not, seeing as it's Nintendo; likely not even a PSN/XBL-type indie library like the Vita offers.
 
I assume Tegra X1, but it could be X2; I don't really expect it, but it could be a pleasant surprise. The clock speeds sound legit, really. Why are people complaining about having a really good handheld console from Nintendo which doubles as a home console?
I like the idea, and the hardware isn't laughable, just not as good as some crazy enthusiasts' wet dreams...
(And I assume those people dislike the XB1 and PS4, because they are really crap compared to high-end PC components.)
If these specs are true, Switch is roughly equivalent to last-gen consoles. Pretty good specs for a handheld gaming device, but not impressive compared to home consoles. Scorpio will be out next Christmas (~4x faster than Xbox One), meaning that Switch would already be 2 gens behind competing home consoles next Christmas. As a handheld, Switch however has zero competition. Huge jump over the Nintendo 3DS and PS Vita.
 
As a pure mobile device that will be gaming at sustained loads all the time, it really isn't a bad hardware specification. Compromises were made to achieve more than 5 minutes of mobile gaming at consistent performance. Despite the compromises made for power-management reasons, it's still going to beat the living snot out of the Nintendo DS.

Things only look bad if you expect it to compete with hardware that has 15x the power draw, corded to wall outlets.

As a cheap low power mobile device that replaces the 3DS, it isn't a bad hardware specification. It's when the "hybrid" term is thrown around with the console being thick and having air vents and a fan that is active during mobile mode that the specs stop making any sense.


@ninelven has made some bold statements. I'm personally looking forward to the 20W chip that's 3-4 times faster than the Xbox One S (in terms of both CPU and GPU) that he says is going to be in the hands of consumers next year.
I'm personally looking forward to you linking this direct quote of @ninelven saying he's expecting the Switch to be 3-4x faster than the Xbone.
Otherwise I'll just assume it's not true and you're just trying to use hyperbole and being unnecessarily aggressive to somehow put your points of view above the others', again.


I think you've embarrassed yourself enough in this thread; everything you said in this thread was wrong.
You've shown yourself to behave and write like a troll, which is why people have mostly stopped responding to your posts.




But twice as many simple CPU cores at a lower clock = much better perf/watt. More small cores are preferable for games/engines that are well optimized for the architecture and scale to a large number of cores. All work is split into small independent tasks -> there are no hard-coded threads -> no single thread can be the bottleneck. Modern PC rendering APIs such as DirectX 12 and Vulkan now also allow submitting GPU work simultaneously from any number of CPU cores, eliminating the biggest traditional single-thread bottleneck (rendering).

This is why I was hoping for 8+ Cortex A53 cores at 1.2-1.5GHz instead of 4 big cores. Even more so if these big cores are clocked this low.
One other thing I also hoped for was the console getting 4 big cores that are exclusively accessible to the developer, while having e.g. 2 LITTLE cores at ~800MHz taking care of the OS, peripherals, communication and such. Developers wouldn't have to worry about what % of a core they could use to pass QA.
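
On Linux-based systems that split can be expressed with thread affinity; here is a sketch under assumed core numbering (the function name and the choice of cores 0-3 as the "game" cores are hypothetical, and big/LITTLE numbering varies by SoC):

```c
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>

/* Illustrative sketch of the reservation scheme suggested above:
   pin a game worker thread to big cores 0-3 and leave the remaining
   cores to the OS, peripherals and communication, so developers never
   have to budget a percentage of a shared core. Returns 0 on success. */
static int pin_to_game_cores(pthread_t thread)
{
    cpu_set_t set;
    CPU_ZERO(&set);
    for (int core = 0; core < 4; ++core)  /* cores 0-3 = "game" cores */
        CPU_SET(core, &set);
    return pthread_setaffinity_np(thread, sizeof set, &set);
}
```

A console OS would enforce this partition in the kernel rather than asking each game to do it, but the effect is the same: game threads never contend with system work.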


To me it looks like nVidia is working as close as possible to be able to launch its own console, like Sony did to Nintendo once already, and like MS did to SEGA on the Dreamcast.
They tried that and it didn't go so well in the past...
Plus, if these specs are true, nvidia isn't exactly showing much technical prowess, because they're showing off performance levels in mobile mode that are lower than 2016's FinFET SoCs from MediaTek and HiSilicon (not even going into Samsung, Qualcomm or Apple territory, because those would be either too expensive or simply unavailable).





BTW, nvidia is reportedly launching the Shield TV 2 with Parker at CES, so before the 12th January reveal.



I wonder what message they'll be sending if the Shield TV 2 is ~2x more powerful than the Switch in docked mode.
 
As a handheld, Switch however has zero competition.

My next smartphone will probably be a Xiaomi Mi 5, cheaper than the Switch and in the same range.
I don't have benchmarks at the moment, but the SD821 must be faster than 1/3 of a Maxwell.
 
An off-topic (not clocks-related :)) question: is it known whether the console can be connected to a TV without the dock?

I mean, if its selling point is that it's portable, it would be great if you could take it travelling and then connect it to whatever TV you have available at your destination, even if it only stretched the 720p image to the native res of the screen. Also, can you even charge the thing without the dock?
 
Nvidia is preparing the presentation of the 2017 version of its Android TV box.
The last version was the Tegra X1 with 2GHz for the CPU and 1GHz for the GPU, so the next version will be "ouch" for Switch's CPU and "please no" for Switch's GPU.

But we know Parker is 2 Denver cores and 4 A57 cores. Even if the 16nm node helps with perf/watt, it makes no sense for Nvidia to use the A57 on 16nm FinFET when the A72 is much smaller and already old by now. It must be because automotive validation lags behind the mobile scene. So I'm not sure the next Shield TV is an upgrade; it could very well be a cost-effective version for 99 dollars. I'm not sure what Nvidia could do to make the next Shield TV more desirable; it already has 4K and HDR support, and it chews through all games easily.
 
Do you go by the name Thraktor on neogaf, by any chance? I'm sorry you find these specs so hard to accept and are really grasping at straws. I still remember the same thing happening to you when the Wii U was shown to be 176 GFLOPS; you wouldn't accept it.

The Shield 2 is not in competition with the Switch; the Shield 1 destroys the 3DS, and that didn't stop the 3DS from destroying it in sales.
 
But we know Parker is 2 Denver cores and 4 A57 cores. Even if the 16nm node helps with perf/watt, it makes no sense for Nvidia to use the A57 on 16nm FinFET when the A72 is much smaller and already old by now. It must be because automotive validation lags behind the mobile scene. So I'm not sure the next Shield TV is an upgrade; it could very well be a cost-effective version for 99 dollars. I'm not sure what Nvidia could do to make the next Shield TV more desirable; it already has 4K and HDR support, and it chews through all games easily.

GFXBench results that have appeared with Parker point to >50% better performance than the TX1 Shield TV. The GPU seems to be clocking 50% higher, and there's 2x the memory bandwidth in there.
The current Shield TV seems to be as barebones as possible, if you look at teardowns. I'm not sure how nvidia could make it any cheaper to produce without changing the SoC, much less if they're trying to halve the cost.
Plus, the market is already being flooded by ~$50 Android PCs with HDMI 2.0 and HEVC 10-bit acceleration, capable of Netflix 4K HDR output.
 
GFXBench results that have appeared with Parker point to >50% better performance than the TX1 Shield TV. The GPU seems to be clocking 50% higher, and there's 2x the memory bandwidth in there.
The current Shield TV seems to be as barebones as possible, if you look at teardowns. I'm not sure how nvidia could make it any cheaper to produce without changing the SoC.

I'm not saying Parker is not an improvement; we already know its specs. I'm saying it was an odd decision to include 4 A57 cores if they were planning on using it for the Shield; the A72 is much smaller and could have saved them some space.
 
I'm not saying Parker is not an improvement; we already know its specs. I'm saying it was an odd decision to include 4 A57 cores if they were planning on using it for the Shield; the A72 is much smaller and could have saved them some space.

Because automotive was the priority and the Shield TV 2 is the afterthought.
Who knows, they may even disable the A57 module for the Shield, as automotive Parker may be using those cores only to handle the external dGPUs.
 