Nintendo Switch Tech Speculation discussion

Good question.
For professional reasons, I have had a long-term interest in benchmarking, and often no numbers are better than bad numbers.
Eyeballing the specs and estimating between thumb and forefinger, done by someone who has been around for a while, will give a ballpark figure that sidesteps the corner cases and all the unknown potholes and wrinkles of a given architecture that can skew a specific benchmark result, but tend to even out over a larger project. Estimation also sidesteps the can of worms that is the compiler (or, for a game console, the entire software stack).
Of course, lacking both technical specs, and in my case, intimate knowledge of the hardware demands of real world graphics software, I'm not in a good position to estimate anything personally as far as the Switch goes, but I guess a large number of developers could do a decent enough job of it.

Where such an analysis would get really interesting is of course the specific wrinkles of a design. How would you play to its strengths? What common techniques would you try to avoid, and use alternative methods to achieve similar results?

Using multiplatform title performance is the only option available to us console armchair experts (among whom I count myself). But quantifying the visual output into relative performance numbers for the underlying architectures? Impossible beyond the very broadest of strokes.

Of course it's not perfect, but it gives a ballpark estimate for sure. If the hardware is more powerful, most multiplatform games should run better, or else what's the point of the more powerful hardware if most games run worse? Of course you have your bad ports, or ports that don't take advantage of the hardware for whatever reason, but most games should run better, or at the very least not be inferior. The Wii U, for example: everybody was confused as to why ports ran worse, because they thought it had around 350-500 GFLOPS, but just based on multiplatform games I knew that didn't make sense; there were simply too many inferior ports. Then the real specs came out and most ports being inferior made sense. Back to the point: we are not dealing with never-before-seen, vastly different architectures anymore, like the PS2 and PS3, where developers had to take years to learn how to use the hardware properly. Those days are done.
 
Of course, if you compared raw peak throughput, Switch might not be that impressive. But nobody got even close to maximum peak performance on last-gen consoles, even if (= when) they spent half of the project's programming effort optimizing the code. Modern CPUs & GPUs allow developers to focus on critical parts instead of having to optimize the whole code base.
Speaking in hypotheticals, if one were to port a PS4 game to the Switch, a naive way would be to reduce the frame rate target from 60fps to 30fps or the resolution from 1080p to 720p (or variable), remove features that simulate optical/cinema artefacts (CA, film grain, DOF, motion blur), or simply reduce complexity, such as the amount of ambient debris/grass/bushes and so on.
But that is just cutting down.
More interesting would be what could be achieved by replacement: changing to less resource-intensive methods for AA, filtering, shadows, AO, and so on. What effects are available that yield visually close, if not quite as good, results, but still provide an overall preferable trade-off to, say, reducing frame rate? If we see down-ported multiplatform titles, I wouldn't be surprised to see differences in how these issues are handled. From tinkering with settings on PC games, it would appear that "almost as good" visual results can sometimes be achieved at much lower computational cost. Console titles are probably more tightly balanced, but there has to be some room for replacement strategies there as well, doesn't there?
A well performed combination of cutting back and replacement strategies might be capable of producing pretty decent results for the casual eye.
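To make the cut-down/replacement distinction concrete, here is a toy sketch; every setting, name, and number is entirely hypothetical and not taken from any actual port:

```cpp
// Hypothetical quality presets illustrating the two strategies discussed above.
struct RenderSettings {
    int  verticalResolution;  // output height in pixels
    bool temporalAA;          // full TAA when true, cheaper post-process AA when false
    bool fullResAO;           // ambient occlusion at full resolution vs. half-res
    int  shadowMapSize;       // shadow map resolution per cascade
    int  grassDensityPercent; // amount of ambient debris/grass/bushes
};

// "Cutting down": the same techniques as the PS4 version, just smaller numbers.
constexpr RenderSettings cutDown  { 720, true,  true,  1024, 50 };

// "Replacement": keep more resolution and density, but swap the expensive
// techniques for cheaper look-alikes that are "almost as good".
constexpr RenderSettings replaced { 900, false, false, 2048, 80 };
```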
 
Switch has modern hardware, a modern OS (which incorporates FreeBSD code), and a modern API (Vulkan is supported), so yes, it's a giant step forward for Nintendo. Welcome to the 21st century! (It was about time :p)
 
Dynamic resolution for Zelda: Breath of the Wild. From 648p to 720p in portable mode, 810p to 900p docked.

http://www.eurogamer.net/articles/d...h-of-the-wild-uses-dynamic-resolution-scaling

Never really noticed it, so that's good. It seems like they didn't put a lot of effort into the port, though I guess considering the size of the game it's kind of understandable.

I wonder if Nintendo will still be using their old engines on Switch? For instance, I wonder if ARMS, Splatoon 2, and Xenoblade 2 are all going to use the old Wii U engines? I'd like to see what a native engine could do on the system, but I don't know how Nintendo does things in those regards.
 
Never really noticed it, so that's good. It seems like they didn't put a lot of effort into the port, though I guess considering the size of the game it's kind of understandable.

I wonder if Nintendo will still be using their old engines on Switch? For instance, I wonder if ARMS, Splatoon 2, and Xenoblade 2 are all going to use the old Wii U engines? I'd like to see what a native engine could do on the system, but I don't know how Nintendo does things in those regards.

Don't expect much, because the game has to work in portable mode, and the Switch in portable mode is more powerful than the Wii U, but not by much.
 
I wonder if the firmware bug that causes a minor performance drain when using dynamic resolution (see the DF article about Fast RMX) could be a factor in Zelda's performance drops, then...?

With double-buffered vsync on a 60 Hz display, a frame can only be presented on a vsync boundary, so you only need to miss the 1/30 second (30 Hz) deadline by a fraction and the frame time drops to 1/20 second (20 Hz).
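A minimal sketch of that arithmetic (my own illustration, assuming strict double-buffered vsync on a 60 Hz panel):

```cpp
#include <cmath>
#include <cstdio>

// With double-buffered vsync a finished frame still waits for the next vsync,
// so the displayed frame time is the render time rounded up to a whole
// multiple of the refresh interval (16.7 ms on a 60 Hz panel).
double displayedFrameTimeMs(double renderTimeMs, double refreshMs = 1000.0 / 60.0) {
    return std::ceil(renderTimeMs / refreshMs) * refreshMs;
}

int main() {
    const double renderTimes[] = { 33.0, 34.0 }; // hitting vs. barely missing 30 fps
    for (double renderMs : renderTimes) {
        double shownMs = displayedFrameTimeMs(renderMs);
        std::printf("%.1f ms rendered -> %.1f ms shown (%.0f fps)\n",
                    renderMs, shownMs, 1000.0 / shownMs);
    }
}
```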
That's the first thing that came into my mind as well when reading the DF article. Zelda is dropping frames too often for it to be intentional. Nintendo quality is better than that. Hopefully the firmware bug gets fixed soon.
 
That's the first thing that came into my mind as well when reading the DF article. Zelda is dropping frames too often for it to be intentional. Nintendo quality is better than that. Hopefully the firmware bug gets fixed soon.

I think some of the weird "freezes" can be caused by this bug, but not every one of them. I really believe they ported a heavily Wii U-optimised engine to a not-so-powerful and totally different architecture in a short time, and they could not optimise it very well. It's that simple for me... Delaying the game on the Switch was not an option.
 
That's the first thing that came into my mind as well when reading the DF article. Zelda is dropping frames too often for it to be intentional. Nintendo quality is better than that. Hopefully the firmware bug gets fixed soon.

It will be interesting to see what happens, but I am not personally expecting a night-and-day difference. This was Nintendo's most technically ambitious game in their history, and I think some framerate stutter here and there was considered acceptable. I played portably for a few hours last night, and the game runs very smoothly. I see it hitch here and there, but nothing like docked mode, where you can be in an area that runs at 20fps for extended periods. I just don't want to get my hopes up that this will suddenly vanish after the firmware update.

How problematic would it be to implement triple-buffered vsync? I'm confused as to how pretty much every title on Wii U implemented triple-buffered vsync, and suddenly we have BotW with a double buffer. Isn't the only downside extra memory usage?
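Roughly, yes: the main cost of the third buffer is one more full-resolution colour target in memory (plus, depending on how presentation is handled, possibly a frame of extra latency). A back-of-the-envelope sketch of the memory side, assuming a 32-bit colour format; the resolutions are just examples:

```cpp
#include <cstdio>

// Memory for one additional back buffer at 4 bytes per pixel.
int main() {
    struct Mode { const char* name; int width, height; } modes[] = {
        { "720p (portable)", 1280,  720 },
        { "900p (docked)",   1600,  900 },
        { "1080p",           1920, 1080 },
    };
    for (const Mode& m : modes) {
        double megabytes = m.width * m.height * 4 / (1024.0 * 1024.0);
        std::printf("%-16s extra buffer: %.1f MB\n", m.name, megabytes);
    }
}
```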
 
Switch has a modern OoO CPU. Last-gen CPU code was horrible. Lots of loop unrolling to avoid in-order bottlenecks (no register renaming). Lots of inlined functions, because there was no store-forwarding hardware -> a 40+ cycle stall for reading function arguments from the stack. No direct path between the int/float/vector register files (going through memory = LHS stall). Variable shift was microcoded (very slow). Integer multiply was very slow. There was no data-prefetching hardware; the developer had to manually insert prefetch instructions (even for linear array iteration), and without a prefetch an L2 cache miss meant a ~600 cycle stall (with no OoO to hide any of it). Code was filled with hacks to avoid all these CPU bottlenecks.
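As a small illustration of the manual-prefetch point (my own sketch, not code from any shipped title): on those in-order cores even a trivial linear pass had to be sprinkled with prefetch hints so the next cache lines were already in flight, something an out-of-order core with hardware prefetchers handles for you. Here the GCC/Clang __builtin_prefetch intrinsic stands in for the PowerPC dcbt instruction:

```cpp
#include <cstddef>

// Linear pass with explicit software prefetching a fixed distance ahead.
// The lookahead distance is a tuning knob; 64 floats = four 64-byte cache lines.
float sumWithPrefetch(const float* data, std::size_t count) {
    const std::size_t lookahead = 64;
    float sum = 0.0f;
    for (std::size_t i = 0; i < count; ++i) {
        if (i + lookahead < count)
            __builtin_prefetch(&data[i + lookahead]); // hint: pull the line into cache early
        sum += data[i];
    }
    return sum;
}
```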
I think you're talking about the PS3 and X360 only, because the Wii U had 3 OoO PowerPC 750 cores, though they were clocked rather low at 1.24GHz and carried an ancient instruction set.


Speaking in hypotheticals, if one were to port a PS4 game to the Switch, a naive way would be to reduce the frame rate target from 60fps to 30fps or the resolution from 1080p to 720p (or variable), remove features that simulate optical/cinema artefacts (CA, film grain, DOF, motion blur), or simply reduce complexity, such as the amount of ambient debris/grass/bushes and so on.
But that is just cutting down.

You're assuming the geometry could be exactly the same, but it most probably can't, and you're assuming these multiplatform games run at 60 FPS in the standard PS4 version, whereas I think the great majority of them run at 30 FPS.
The Maxwell architecture's geometry performance is much higher than GCN's, but here it's just 2 "PolyMorph" geometry engines that need to run at 300MHz. So while the rest can probably be LOD'ed down, geometry cannot, and asset developers may need to go back to the drawing board for this.

And then there's CPU performance. While the A57's performance/clock may be similar to Jaguar's (according to Geekbench results at least), game devs have access to at least 6 cores at 1.6GHz in the PS4Bone and only 3 cores at 1GHz in the Switch.
That is over 3x the total number of instructions per second. And while graphics may be scaled down somewhat more easily, bridging such a big difference in CPU performance may be even harder.
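For reference, the back-of-the-envelope arithmetic behind that figure, taking the quoted core counts and clocks at face value and ignoring any per-core IPC differences:

$$\frac{6 \text{ cores} \times 1.6\,\text{GHz}}{3 \text{ cores} \times 1.0\,\text{GHz}} = \frac{9.6}{3.0} \approx 3.2$$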
 
I think you're talking about the PS3 and X360 only, because the Wii U had 3 OoO PowerPC 750 cores, though they were clocked rather low at 1.24GHz and carried an ancient instruction set.
The Wii U had no SIMD (only paired singles), no SMT, and a very low clock rate. The Xbox 360 CPU did a 4-wide multiply+add per core per cycle at 3.2 GHz (over a 5x peak flop advantage).
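Checking that claim with the usual peak-FLOP arithmetic (a 4-wide multiply+add counts as 8 flops per cycle; 2-wide paired singles with multiply+add give 4 flops per cycle), per core:

$$\frac{8 \times 3.2\,\text{GHz}}{4 \times 1.24\,\text{GHz}} = \frac{25.6\ \text{GFLOPS}}{4.96\ \text{GFLOPS}} \approx 5.2$$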

The PPC 750's OoO machinery wasn't that advanced either. You can't compare it to modern Intel/AMD/ARM cores. It was a significantly older design than the PS3 and Xbox 360 CPUs. I haven't programmed the Wii U myself, so I don't know all the low-level details. OoO doesn't necessarily mean that the CPU had store forwarding or data cache prefetchers. Integer multiply and variable shift were most likely also slow (those have nothing to do with OoO).

Update: Found confirmation that the 750CL added an L2 cache prefetcher. This is the CPU used in the original Wii. The Wii U CPU was basically a die shrink of it (plus obviously more cores, a much higher clock, and larger caches).
 
Mario Kart 8? Looked awesome and ran at a (lot) more stable 60 fps.

No AA, no AF, very basic tracks... It looks good because of the "art", but that's it for me. No, I agree it's not fair because of the resolution differences, and the TV it was played on. And the NGC was not blown away by the competition power-wise.
 
No AA, no AF, very basic tracks... It looks good because of the "art", but that's it for me. No, I agree it's not fair because of the resolution differences, and the TV it was played on. And the NGC was not blown away by the competition power-wise.

I never played that game that much, but the tracks I saw didn't seem that simple to me.
Also, I don't remember clearly, but I'm pretty sure Rogue Leader didn't use any AF, and only minimal AA.
 
Too bad we never saw a game 5 times better (visually) than Rogue Leader :(
Not sure what you mean. If you mean the amount of detail on screen, games pushed waaay more than 5x Rogue Leader. Or maybe you mean it just in terms of looks.

The latter I can understand; it's just diminishing returns. The jump from N64 to GameCube was much, much more palpable than any leap that came after. Hell, a PS2 game can still look beautiful while a PS4 game can look tripe. Games had better direction back then, and now we have things like procedural generation that make games look more sterile, all these nasty post-processing effects, etc.
 
Yeah, what I meant was more of a "wow" effect. I was blown away by Rogue Leader, like nearly everybody at the time. Since Nintendo let it go (power-wise), we didn't have the chance to experience that again on a "multiple GameCubes taped together" type of console :eek: (Wii, Wii U). Well, I didn't anyway, not on a Nintendo console.
 
Mario Kart 8? Looked awesome and ran at a (lot) more stable 60 fps.
Yeah, I'd say MK8 is the Wii U's poster boy. That and Bayonetta 2. Neither game needed much CPU power, so the Wii U could shine. Neither had AA or AF, though. Bandwidth is definitely the reason console games still don't have max AF.
 
You're assuming the geometry could be exactly the same, but it most probably can't, and you're assuming these multiplatform games run at 60 FPS in the standard PS4 version, whereas I think the great majority of them run at 30 FPS.
The Maxwell architecture's geometry performance is much higher than GCN's, but here it's just 2 "PolyMorph" geometry engines that need to run at 300MHz. So while the rest can probably be LOD'ed down, geometry cannot, and asset developers may need to go back to the drawing board for this.

And then there's CPU performance. While the A57's performance/clock may be similar to Jaguar's (according to Geekbench results at least), game devs have access to at least 6 cores at 1.6GHz in the PS4Bone and only 3 cores at 1GHz in the Switch.
That is over 3x the total number of instructions per second. And while graphics may be scaled down somewhat more easily, bridging such a big difference in CPU performance may be even harder.
Nope, I made no such assumption, you quoted my "Speaking in hypotheticals..." yourself!
Of course the CPU is a factor; it's just a bit trickier to make assumptions about, but then again we aren't seeing a factor-of-five weaker performance either. sebbbi laid it out quite clearly: compared to the Wii U, the Switch appears to be a much easier target for down-porting XB1/PS4 titles, from both a CPU and a GPU perspective.
We are adults here; we know that when it comes to the large publishers, the decisions will likely be made on economic or company-strategy grounds rather than technical ones. I just tried to tease a few more techniques out of sebbbi while he was on tap, so to speak, but any answers would by necessity be rather situational.
I actually got a Switch to test the waters, enjoy Zelda, and pass it on if it doesn't make sense in our lives. Overall it poses an interesting question about the balance between the utility and enjoyment provided by portability versus the compromise in graphical fidelity. By now the trade-off seems clear to me, and I'll submit that it makes sense to try it for oneself before making strong statements on the issue.
 