Nintendo Switch Tech Speculation discussion

Looks like IGN's saying the same thing:

"At the time of this writing, Nintendo has not officially confirmed the exact specifications of the Nvidia Tegra-based chipset that powers the Switch. That said, it’s fairly clear that the Switch is almost as far behind the power curve of its competitors as the original Wii was when it first came out. For example, Breath of the Wild, which was developed simultaneously on the Wii U, seldom quite makes it all the way to 30 frames per second in TV mode, and it even dips far south of that when lots of particles or physics objects are on screen at once. That it suffers from these performance issues despite a lack of anti-aliasing does not bode well for the system’s long-term capabilities – or its prospects for landing ports of big-budget AAA games."
 
Need a quality & performance comparison between Wii U & Switch to see how it goes.
If a game skips frames you should skip it ;p
[Why aren't games required to run at 60Hz?! They look less good, sure, but they're so much more responsive.]
 
Looks like IGN's saying the same thing:

"At the time of this writing, Nintendo has not officially confirmed the exact specifications of the Nvidia Tegra-based chipset that powers the Switch. That said, it’s fairly clear that the Switch is almost as far behind the power curve of its competitors as the original Wii was when it first came out. For example, Breath of the Wild, which was developed simultaneously on the Wii U, seldom quite makes it all the way to 30 frames per second in TV mode, and it even dips far south of that when lots of particles or physics objects are on screen at once. That it suffers from these performance issues despite a lack of anti-aliasing does not bode well for the system’s long-term capabilities – or its prospects for landing ports of big-budget AAA games."

I'm going to have to call bullshit on that. While it doesn't have the same horsepower as the other modern systems, it still is a modern system that can actually run modern engines. The Wii was literally an overclocked Gamecube using ancient architecture. It couldn't even run Unreal 3 at the time. The Switch runs both the full version of Unreal 4 and Unity, which alone is a huge boon to it. It can also use Vulkan.

While weak in comparison to the other systems, I don't think it's anywhere near as bad as Wii vs PS360 was.
 
I'm going to have to call bullshit on that. While it doesn't have the same horsepower as the other modern systems, it still is a modern system that can actually run modern engines. The Wii was literally an overclocked Gamecube using ancient architecture. It couldn't even run Unreal 3 at the time. The Switch runs both the full version of Unreal 4 and Unity, which alone is a huge boon to it. It can also use Vulkan.

While weak in comparison to the other systems, I don't think it's anywhere near as bad as Wii vs PS360 was.
Well, yes and no.

The Wii could do almost everything. It really had the newest features on board. But it was far behind the Xbox 360/PS3 when it comes to execution units, RAM, bandwidth, etc.
Just like the Wii U: modern GPU, but too small, and the rest of the system was far, far away from being called performant.

The Switch is a great handheld, but too big for a handheld (Joy-Cons + tablet ... too big for a normal pocket). If you want to be online (well, once something online-worthy is out) you also need a smartphone to share its internet connection, and that smartphone is already in your pocket.
For a home console it is underpowered and overpriced.
When they bring out a "local" Switch (like I mentioned before) that only costs $150-200, then it might actually work out. But there are so many "Ouya"-like consoles on the market ... well, at least Nintendo has the software ... someday.
 
I'm going to have to call bullshit on that. While it doesn't have the same horsepower as the other modern systems, it still is a modern system that can actually run modern engines. The Wii was literally an overclocked Gamecube using ancient architecture. It couldn't even run Unreal 3 at the time. The Switch runs both the full version of Unreal 4 and Unity, which alone is a huge boon to it. It can also use Vulkan.

While weak in comparison to the other systems, I don't think it's anywhere near as bad as Wii vs PS360 was.
Not to mention the gap isn't as wide even just in terms of raw horsepower and memory amount, partly because the PS4 wasn't state of the art at release like the 360 was.
 
The "but It can run unreal engine 4" argument is moot. Yeah, it can. But it does not matter if a GAME using this engine can't run on the Switch because it's too slow compared to xbone, ps4, and the devs don't have time to downgrade the visuals, the memory usage, etc. Now, it's good for exclusives game which can use it yeah, but there won't be a lot third party exclusive AAA games anyway. 0 I would say...
 
Well, yes and no.

Wii really had the newest features on board. But it was far behind the Xbox 360/PS3 when it comes to execution units, RAM, bandwidth, etc.
Except it didn't. It wasn't even on par with the original Xbox in terms of its feature set. Gamecube/Wii is like DX7+, Ps3 and 360 are DX9.
 
Except it didn't. It wasn't even on par with the original Xbox in terms of its feature set. Gamecube/Wii is like DX7+, Ps3 and 360 are DX9.
More like D3D8 vs D3D9; there was no shader support in D3D7, and shaders in D3D8 were register combiners in hardware (i.e. configurable rather than programmable).
 
More like D3D8 vs D3D9; there was no shader support in D3D7, and shaders in D3D8 were register combiners in hardware (i.e. configurable rather than programmable).
Wii/GC had its own thing for shaders, something called the TEV. Even though it could do most of the things Xbox could, it wasn't done in the same manner.

I was pretty sure Xbox had "programmable" shaders? Something about it was more modern than Wii, anyway.
 
Nintendo Switch is 16W (docked home console mode). Xbox One Slim is 50W. Both when playing demanding AAA games. With 3x+ lower power consumption, it's not possible that the docked Switch matches current gen consoles in pure performance.

If the 256 core GPU at 768MHz and 4*A57 at 1GHz are confirmed, the docked Switch doesn't even seem to match the current gen consoles in performance-per-watt.
The docked Switch has a 393 GFLOPs GPU; the Xbone S consumes 3x more power but has a 3.5x more powerful GPU (with several times more bandwidth). The PS4 Pro pulls 150W, i.e. 9.4x higher power consumption, for a 4.2 TFLOPs GPU, i.e. a 10.6x faster GPU.

If the Cortex A57 is considered similar to the Jaguar cores clock-for-clock (I see they have very similar Geekbench results), the same can be said about CPU performance.

One can argue the 2*FP16 throughput nullifies the Xbone S comparison, but the same isn't true for the PS4 Pro.
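
If anyone wants to sanity-check those ratios, here's a quick back-of-the-envelope script. The core count, clocks and wattages are the unconfirmed numbers from this thread, not official specs:

```python
# Back-of-the-envelope check of the ratios above. The 256 cores @ 768 MHz,
# 2 FP32 FLOPs/core/clock and all wattages are this thread's unconfirmed
# numbers, not official specs.

def gflops(cores, mhz, flops_per_clock=2):
    """Peak FP32 throughput in GFLOPs."""
    return cores * mhz * flops_per_clock / 1000

switch  = {"gflops": gflops(256, 768), "watts": 16}   # ~393 GFLOPs docked
xbone_s = {"gflops": 1400, "watts": 50}               # 50W figure is disputed below
ps4_pro = {"gflops": 4200, "watts": 150}

for name, c in (("Xbone S", xbone_s), ("PS4 Pro", ps4_pro)):
    perf_per_watt = (c["gflops"] / c["watts"]) / (switch["gflops"] / switch["watts"])
    print(f"{name}: {c['watts'] / switch['watts']:.1f}x power, "
          f"{c['gflops'] / switch['gflops']:.1f}x GFLOPs, "
          f"{perf_per_watt:.2f}x perf/W")
```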


Switch obviously is the best gaming portable around.
Yes, but its high price and seemingly old hardware seem to leave too big of an opening for Windows devices in a similar form factor to creep in with significantly higher performance (and the ability to run multiplatform titles).

In the end, my 4 main questions regarding the Switch are:

1 - Why did this console release so late if it's using such an old chip (>2 years in production)?
2 - Why is it so expensive if this old SoC was supposedly sold so cheaply (for being so old) and Nvidia offered a "vertical integration" package?
3 - Why are there so few games at release time if the console released so late, especially considering Nintendo hasn't released any meaningful number of games during the past 3-4 years?
4 - Why not even bother to strip out so much useless silicon (the 4*A53 module + its L2 cache, the CCI400 "glue" interconnect, a 600MPix/s imaging DSP for a console that has no cameras, a 4K HEVC decoder for a console that doesn't support more than 1080p VP9) in a device that will sell at least 5 million units?
 
Maybe it's related to that "get a win or get out" line I saw in the Tegra thread regarding Nvidia's Tegra division not exactly getting many wins. Maybe Nintendo got quite the deal. And of course Nintendo is interested in making lots of money if possible. They've never cared about their tech being bleeding edge. They are selling you that exclusive Nintendo gaming experience and people usually seem to eat it up. :) Though that doesn't seem to be going as well for them as it once did.

I'm actually surprised it's as powerful as it is. It's actually pretty modern for them, and rather powerful. They've compromised battery life by making it as powerful as it is instead of doing something like Moar 3DS + HDMI. I imagine that's because Switch is now their all-in-one converged product.
 
With those power consumption numbers Switch also has significantly lower performance-per-watt than the Shield Android TV and an idle consumption at least 2x as high, which is very weird IMO.
 
With those power consumption numbers Switch also has significantly lower performance-per-watt than the Shield Android TV and an idle consumption at least 2x as high, which is very weird IMO.

If I remember correctly, they clocked it down so the system wouldn't throttle itself like the Shield does. I read something on NeoGAF about it.
 
1 - Why did this console release so late if it's using such an old chip (>2 years in production)?
2 - Why is it so expensive if this old SoC was supposedly sold so cheaply (for being so old) and Nvidia offered a "vertical integration" package?
3 - Why are there so few games at release time if the console released so late, especially considering Nintendo hasn't released any meaningful number of games during the past 3-4 years?
4 - Why not even bother to strip out so much useless silicon (the 4*A53 module + its L2 cache, the CCI400 "glue" interconnect, a 600MPix/s imaging DSP for a console that has no cameras, a 4K HEVC decoder for a console that doesn't support more than 1080p VP9) in a device that will sell at least 5 million units?

You can ask the same about the Wii and Wii U. Nintendo hasn't cared about good tech in their home consoles for three generations now. Let's get one thing clear: Nintendo has exited the home console market with the Switch. They don't care about third-party ports. This is the 3DS successor, with an afterthought of how to keep their small home console user base.
 
Wii/GC had its own thing for shaders, something called the TEV. Even though it could do most of the things Xbox could, it wasn't done in the same manner.

I was pretty sure Xbox had "programmable" shaders? Something about it was more modern than Wii, anyway.

While a lot of material does talk about programmable pixel shaders in DX8 (like this and this), if you go to the Nvidia developer page on the subject you get this:

Nvidia said:
Third-Generation GPUs

The third generation of GPUs (2001) includes NVIDIA's GeForce3 and GeForce4 Ti, Microsoft's Xbox, and ATI's Radeon 8500. This generation provides vertex programmability rather than merely offering more configurability. Instead of supporting the conventional transformation and lighting modes specified by OpenGL and DirectX 7, these GPUs let the application specify a sequence of instructions for processing vertices. Considerably more pixel-level configurability is available, but these modes are not powerful enough to be considered truly programmable. Because these GPUs support vertex programmability but lack true pixel programmability, this generation is transitional. DirectX 8 and the multivendor ARB_vertex_program OpenGL extension expose vertex-level programmability to applications. DirectX 8 pixel shaders and various vendor-specific OpenGL extensions expose this generation's fragment-level configurability.

So programmable vertex shaders, but technically still only configurable pixel shaders, even if configurability had been considerably improved beyond DX7.

It's worth noting that in Xbox...
Yes, they weren't as powerful as in the next PC generation, but Xbox1 pixel shaders were still awesome for their time :) The shader language on Xbox1 used to configure these was much more powerful than in DX8 (ps1.1), as you had access to separate .rgb/.a pairing (i.e. 16 instructions per shader instead of 8) as well as Nvidia-specific complex instructions such as the wonderful xmma & xfc (fog combiner). So you could do much more interesting & complex shading on Xbox1 than you could on the same hardware on PC DX8 / ps1.1.

(I recommend taking a look at the whole post, you'll see a genuine Original Xbox pixel shader!)

You could do a lot more on Xbox than you could on PC via DX8. And I think this bodes well for Switch, as Nvidia have provided an API that will be designed to get the most out of their X1 platform. Switch should be able to make far better use of the X1 than any game could using Android. And this will include ground up, mega budget Nintendo AAA stuff.

I would assume the Nvidia / Nintendo development package will include an API and developer guidance to maximise the benefit of the tiled rasterizer, for one thing ...

Probably means more load on the battery / cooler at lower clocks too ... :D
 
With those power consumption numbers Switch also has significantly lower performance-per-watt than the Shield Android TV and an idle consumption at least 2x as high, which is very weird IMO.

Which is why having basically the same peak power consumption and 2x the idle power consumption is weird.

Games are likely to be better at loading the chip on Switch than on an Android device, and when games hit temperature or power limits on Android they can just throttle. So having a similar peak power consumption isn't surprising: Switch can draw near the power limit at lower clocks, while Shield will duck under it by throttling.

The chip package even has the same decoupling caps after all, and the thermal limits for the chip (and the various parts of the chip) are likely to be the same. So ... I actually think that kinda makes sense.

As for idle, well, Nintendo have possibly set a static performance profile, meaning the chip won't drop clocks when it's idling - it'll likely keep them stable to avoid performance variations. The console way of doing things vs the phone way, so to speak.

That's my interpretation of what's going on anyway.
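
To make the contrast concrete, here's a toy sketch of the two policies. Every number in it is invented for illustration; the actual governor logic on either device isn't public:

```python
# Toy model of the two clocking policies described above. All numbers are
# invented for illustration; neither Nvidia's nor Nintendo's governor is public.

def android_dvfs_clock_mhz(load, temp_c):
    """Phone-style DVFS: scale the GPU clock with load, throttle on heat."""
    clock = 153 + (768 - 153) * load     # ramp from an idle floor to max clock
    if temp_c > 70:                      # hypothetical thermal limit
        clock *= 0.8                     # throttle by 20%
    return clock

def console_static_clock_mhz(load, temp_c):
    """Console-style profile: one fixed clock regardless of load or heat,
    chosen low enough that the thermal limit is never reached."""
    return 768.0

for load in (0.0, 0.5, 1.0):
    print(f"load {load:.0%}: DVFS {android_dvfs_clock_mhz(load, 60):.0f} MHz, "
          f"static {console_static_clock_mhz(load, 60):.0f} MHz")
```

Note how the static profile keeps the clock up at idle, which would also explain the higher idle draw.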
 
One can argue the 2*FP16 throughput nullifies the Xbone S comparison, but the same isn't true for the PS4 Pro.
One can argue that you are not even comparing with real power consumption numbers for the Xbone S.
There are Eurogamer tests for power consumption: http://www.eurogamer.net/articles/digitalfoundry-2016-microsoft-xbox-one-s-review
The Xbox One S's actual power consumption is 5 times higher; the FLOPs ratio is 3.57x for FP32 and 1.79x for FP16, texturing speed is 3.57x, and fillrate/blending are 1.19x.
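
For reference, here's roughly how those ratios fall out of the unit counts and clocks being assumed around this thread. Treat them as assumptions, not confirmed figures:

```python
# Where those ratios come from, assuming the spec numbers floating around
# this thread: XB1S = 768 SPs / 48 TMUs / 16 ROPs @ 914 MHz, docked Switch
# = 256 cores / 16 TMUs / 16 ROPs @ 768 MHz with double-rate FP16.

xb1s   = {"sp": 768, "tmu": 48, "rop": 16, "mhz": 914}
switch = {"sp": 256, "tmu": 16, "rop": 16, "mhz": 768}

fp32  = (xb1s["sp"] * xb1s["mhz"]) / (switch["sp"] * switch["mhz"])    # 3.57x
fp16  = fp32 / 2                                                       # 1.79x (2*FP16 on Maxwell)
texel = (xb1s["tmu"] * xb1s["mhz"]) / (switch["tmu"] * switch["mhz"])  # 3.57x
pixel = (xb1s["rop"] * xb1s["mhz"]) / (switch["rop"] * switch["mhz"])  # 1.19x
power = 80 / 16                                                        # ~5x, per the Eurogamer figure cited above

print(f"FP32 {fp32:.2f}x, FP16 {fp16:.2f}x, texturing {texel:.2f}x, "
      f"fillrate {pixel:.2f}x, power {power:.1f}x")
```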
Geez, AMD fanboys are everywhere these days
 
One can argue that you are not even comparing with real power consumption numbers for the Xbone S.
There are Eurogamer tests for power consumption: http://www.eurogamer.net/articles/digitalfoundry-2016-microsoft-xbox-one-s-review
The Xbox One S's actual power consumption is 5 times higher; the FLOPs ratio is 3.57x for FP32 and 1.79x for FP16, texturing speed is 3.57x, and fillrate/blending are 1.19x.
Geez, AMD fanboys are everywhere these days

I quoted sebbbi's post directly, which claimed 50W; he probably got his numbers from a quick Google search or this video.
Or maybe he just knows his stuff as a developer, unlike an internet forum troll working as an Nvidia shill for lack of anything better to do with his life.
Nvidia trollboys are everywhere these days. I thought you guys were contained in the architecture/graphics forums, but it turns out you'll leave the horde to infect the console forums as soon as there's a console with something Nvidia in it.


I was just going to ignore and report this troll post, but as it turns out it's even liked by a moderator, so I guess this kind of behavior is not only accepted but encouraged as well. So fuck it.
Or maybe there's just picking and choosing over who gets to insult whom; we'll see if my post gets moderated or not.


FWIW, it has nothing to do with AMD vs. Nvidia, but obviously with the TX1 being a planar 20nm SoC while the 2016 consoles have 16FF SoCs. Process technology, not Nvidia vs. AMD architecture.
 
It would help if there were a better means to multi-rate posts, like marking the sections that contribute additional information to the thread as Positive [Liked] while rating other sections as Neutral [Disagree].

Initially I didn't take the last comment as negative, but now that I re-read it, it doesn't belong in the civilized Console sections. @OlegSH please keep it civil here.
 