Nintendo Switch Tech Speculation discussion

Status
Not open for further replies.
True, but the fact remains that Wii U was never the target platform for these multiplatform titles, and developers spent years tailoring their game engines to take advantage of the 360 and PS3. I remember in the Secret Developer article at Eurogamer, the developer mentioned how it's common knowledge that the Cell processor was used for graphics processing on the PS3, but this was done with the 360 CPU as well. The SIMD capabilities of these consoles were maximized, and that's something that was never going to port to the paired-singles CPU the Wii U rocked. From what I can gather, whatever modest advantages the Wii U GPU had were offset by the CPU performance. More or less, Nintendo crafted hardware that was on par with the 360 and PS3, but had some fundamental design characteristics with advantages and disadvantages relative to the other consoles.



Doesn't Xbox One S use 16nm FinFET? I believe it does, and that is quite a bit more power efficient than 20nm. So even on the more efficient node, the Xbox One S pulls nearly 6x the amount of juice the Switch does, at least when not charging the battery.
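A back-of-envelope check on that 6x claim. Both wattage figures below are rough assumptions roughly in line with published teardown measurements, not official specs:

```python
# Rough load-power comparison; both figures are assumptions, not official specs.
XBOX_ONE_S_GAMING_W = 65.0   # assumed Xbox One S draw under gaming load
SWITCH_DOCKED_W = 11.0       # assumed docked Switch draw under gaming load

ratio = XBOX_ONE_S_GAMING_W / SWITCH_DOCKED_W
print(f"Xbox One S draws ~{ratio:.1f}x the power of a docked Switch")
```

With those assumed figures the ratio lands right around 6x, matching the claim.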

One third the power of the Xbox One seems reasonable, and half as powerful seems possible depending on just how useful the FP16 capabilities turn out to be. Docked offers 786 GFLOPS at half precision, although real-world performance may hit the memory bandwidth ceiling before reaching maximum shader throughput. Regardless, this is pretty darn good for a product in this form factor pulling 1/6th the power. I'm sure a lot of people are going to see this as a monumental chasm, but I'm not so sure. Look how much parity there is with multiplatform games on the Xbox One and PS4, and the PS4 has a 600 GFLOPS advantage over the Xbox One, similar to the 800 GFLOPS advantage the Xbox One has over the Switch docked.
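The 786 GFLOPS figure follows from the widely reported docked configuration (256 Maxwell shader cores at 768 MHz, with Tegra X1's double-rate FP16); a quick sanity check:

```python
# FLOPS arithmetic behind the figure above, assuming the commonly reported
# docked Switch GPU configuration (not officially confirmed by Nintendo).
cores = 256
clock_ghz = 0.768
flops_per_core_per_clock = 2          # one fused multiply-add = 2 FLOPs

fp32_gflops = cores * clock_ghz * flops_per_core_per_clock   # ~393 GFLOPS
fp16_gflops = fp32_gflops * 2                                # ~786 GFLOPS (double-rate FP16)
print(f"FP32: {fp32_gflops:.0f} GFLOPS, FP16: {fp16_gflops:.0f} GFLOPS")
```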

Both Steep and Skyrim come out in the fall, and I wouldn't be shocked to see Call of Duty make an appearance. The chasm in performance will certainly make porting to Switch less than straightforward, but the business proposition is more likely to determine what support looks like. Even if Switch sells very well, if Activision releases COD on Switch and it doesn't sell a million copies, they will likely decline to continue such efforts on the platform. Same with Steep from Ubisoft: I don't see the million-sold milestone being the benchmark for that particular title, but if it struggles to sell even a few hundred thousand units, Ubisoft will likely streamline their offering down to titles like Just Dance.

It'll really depend on how many Switches are out there at that point. I mean, right now I believe there are three million worldwide. So hopefully Activision has realistic expectations.

As for Steep... Is that game even good? Lol, I kind of wish they'd brought something else a bit more appealing. I'm concerned about Skyrim. They say it's a Switch edition, which might be a crooked way of not putting the DLC in the game. Good way to sabotage the sales of that game.
 
Really, people still use the rushed-port excuse? We're talking about a whole generation here. Wii U had tons of ports, and most were inferior, and the ones that were better didn't really give me the impression the GPU was clearly more powerful. For example the Criterion port: a very slight frame rate advantage during crashes, the same sub-HD resolution as 360/PS3, and revamped lighting at night, which was a design choice; they made everything darker. I'm sorry, but even in 2012, not being clearly superior to consoles released in 2005 is not halfway decent. Not one game on Wii U had better AA and higher resolution over the whole generation compared to 360. The difference when Wii U ports were better is much smaller than in many 360 vs PS3 ports.

You don't think it was halfway decent, cool. We're talking semantics. I say halfway decent because you could still create a great-looking game at HD resolution, and it was at least more modern than the other consoles, and used far less power in a year when 28nm wasn't feasible. You can call it crap, shit, or whatever you want to call it.

I don't think the console was a good design; I'm just talking about the GPU alone, compared to the 7th generation. Games aren't going to show a tangible advantage unless they were tailor-made for the console, which had a much different architecture. Was the GPU so much better that games were going to show an advantage by default? No. Technically it has less theoretical power than Xenos: 176 GFLOPS vs. 204 (it's 240 on paper, but something about the architecture makes it 204 in practice). But it's a DX10 part with shader model 4.0, compared to the DX9, shader model 3 Xenos, and per flop it's going to be more efficient.
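For reference, here's the arithmetic behind those theoretical figures. The Latte (Wii U GPU) configuration of 160 shaders at 550 MHz is the commonly assumed one, not officially confirmed, and the exact derivation of the lower 204 figure for Xenos isn't public:

```python
# Xenos (360): 48 unified ALUs, each co-issuing a vec4 + scalar MADD per clock at 500 MHz.
xenos_gflops = 48 * (4 + 1) * 2 * 0.5     # 48 ALUs * 5 lanes * 2 FLOPs (MADD) * 0.5 GHz

# Latte (Wii U): assumed 160 shaders at 550 MHz, one MADD per lane per clock.
latte_gflops = 160 * 2 * 0.55

print(round(xenos_gflops), round(latte_gflops))   # 240 176
```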

"We're a GPU-heavy game," and "Wii U has a powerful GPU with more oomph than the rivals - and is more modern in architecture and shader support, which may come in handy later on." - http://www.eurogamer.net/articles/2012-08-30-how-powerful-is-the-wii-u-really

http://nintendotoday.com/developer-shinen-says-wii-u-gpu-ahead-of-current-gen/

I don't think it's about footing the bill; the GPU just wasn't capable enough, and like I said before, even the good ports showed a minuscule advantage in GPU power. Then you have games like Tekken Tag, Sonic Racing, Skylanders, and Disney Infinity, which run at the lowest resolution or have their graphics downgraded to run on Wii U. The Wii U GPU being better than the 360's is not a fact, and I wish people would stop saying it like it's a fact when there is no real proof to back it.

Sonic is an interesting example, because it runs at 1152x544 on 360 and 1024x576 on Wii U (technically fewer pixels), but the full 720p on PS3. We know the PS3 isn't that much more capable than the 360 and if anything should have a lower resolution, so maybe the developers didn't play nice with the eDRAM on Wii U and 360? It's hard to say. I don't remember seeing anything on Skylanders or DI.
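The raw pixel counts back this up (framebuffer pixels only, AA and effects aside):

```python
# Pixel counts for the three versions of the Sonic racing example.
resolutions = {
    "Xbox 360 (1152x544)": 1152 * 544,
    "Wii U (1024x576)":    1024 * 576,
    "PS3 (1280x720)":      1280 * 720,
}
for name, pixels in sorted(resolutions.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {pixels:,} pixels")
```

The Wii U version really does push the fewest pixels of the three, with PS3 rendering over 50% more than either.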

I use the NFS example because Criterion was a very technically competent studio. You left out the higher-resolution textures, and it wasn't sub-HD since there was no upscaling; there were very small black borders at the top and bottom of the screen, and the game was 1280x704. Man, that game was pretty fun on Wii U, one of the few games that had cool GamePad features.
 
i'm just talking about the gpu, alone when compared to the 7th generation.
The dispute here is whether that's a reasonable way to characterize "half-decent for 2012." It could trade blows with the seventh-gen consoles, yes, but it wasn't very powerful given the tech available. How shortly before the launch of the PS4 would the WiiU have had to come out for the line of reasoning to break down? Six months? Six weeks? Six days? If WiiU launched on November 14, 2013, would you still argue that it was good for the time because its competitor's previous products hadn't yet been superseded?
 
Switch seems to be an easy system to hack ...

[Image: Nintendo Switch WebKit hack news]
 
The dispute here is whether that's a reasonable way to characterize "half-decent for 2012." It could trade blows with the seventh-gen consoles, yes, but it wasn't very powerful given the tech available. How shortly before the launch of the PS4 would the WiiU have had to come out for the line of reasoning to break down? Six months? Six weeks? Six days? If WiiU launched on November 14, 2013, would you still argue that it was good for the time because its competitor's previous products hadn't yet been superseded?

It's simply my subjective opinion. My original point was that the Wii U had the best GPU among previous consoles, to try and gauge how much better the Switch's GPU is, nothing more.

The fact is Sony and MS weren't ready to launch in 2012, even though the tech was available. They also had *much* better technology available than what they used in 2013, while we're at it. Someone could just as easily call those consoles crap for their time and not be wrong, because it's subjective.
 
I know the battery life has been stated to be quite low, but having only updated the firmware and played 45 minutes of Zelda, the battery is at 60%. A little unfortunate, as I doubt I'll ever use it docked. Judging by the much longer battery life being reported while playing indie games, there is probably wide variation in the X1's power draw.
I got almost 3 hours on Zelda. Like all batteries, the reported battery percentage is obviously not linear, so it dropped from 50% to 5% much slower than it dropped from 100% to 50%.
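For what it's worth, the two reports above aren't as far apart as they look once you account for the nonlinear gauge. A naive linear extrapolation from the first report gives a lower bound:

```python
# Naive battery-life extrapolations from the two Zelda reports.
# Battery gauges are nonlinear, so these are rough bounds, not measurements.

# Report 1: 45 minutes of play left the gauge at 60%.
minutes_played = 45
fraction_used = 1.0 - 0.60
est_total_1 = minutes_played / fraction_used      # ~112 minutes if the gauge were linear

# Report 2: almost 3 hours of Zelda in total.
est_total_2 = 3 * 60

print(f"linear extrapolation: ~{est_total_1:.0f} min; reported total: ~{est_total_2} min")
```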
 
It's simply my subjective opinion. My original point was that the Wii U had the best GPU among previous consoles, to try and gauge how much better the Switch's GPU is, nothing more.

The fact is Sony and MS weren't ready to launch in 2012, even though the tech was available. They also had *much* better technology available than what they used in 2013, while we're at it. Someone could just as easily call those consoles crap for their time and not be wrong, because it's subjective.
What we think of them is subjective, yes.

What's less subjective is that WiiU was considerably less powerful relative to computing devices of 2012 than the PS4 and XB1 were relative to computing devices of 2013. In that sense, WiiU is behind the industry-trend curve, which is what people are saying.
 
It'll really depend on how many Switches are out there at that point. I mean, right now I believe there are three million worldwide. So hopefully Activision has realistic expectations.

As for Steep... Is that game even good? Lol, I kind of wish they'd brought something else a bit more appealing. I'm concerned about Skyrim. They say it's a Switch edition, which might be a crooked way of not putting the DLC in the game. Good way to sabotage the sales of that game.

What did Xbox One sell at launch, 4-5 million units by the end of 2013? All the major AAA games that released that Christmas sold pretty damn well for a userbase that size. It's not like COD Ghosts and Assassin's Creed 4 had trouble selling a million units to that smaller userbase. By comparison, those games struggled to sell even 300k on a larger Wii U userbase. So publishers are going to assume a lower attach rate for their games on a Nintendo platform compared to Sony and Microsoft. This is where we can circle back to the performance deficit on the Switch and the extra effort to port these games. If return on investment looks bleak, that's when publishers jump ship. If Switch were basically a PS4 clone and ports were copy-and-paste, perhaps they would slap it on a disc and sell a few hundred thousand copies, but when a dedicated team is needed to handle the porting, then it's all about dollars and cents.
 
What we think of them is subjective, yes.

What's less subjective is that WiiU was considerably less powerful relative to computing devices of 2012 than the PS4 and XB1 were relative to computing devices of 2013. In that sense, WiiU is behind the industry-trend curve, which is what people are saying.
But that doesn't contradict anything I've said.
 
True, but the fact remains that Wii U was never the target platform for these multiplatform titles, and developers spent years tailoring their game engines to take advantage of the 360 and PS3. I remember in the Secret Developer article at Eurogamer, the developer mentioned how it's common knowledge that the Cell processor was used for graphics processing on the PS3, but this was done with the 360 CPU as well. The SIMD capabilities of these consoles were maximized, and that's something that was never going to port to the paired-singles CPU the Wii U rocked. From what I can gather, whatever modest advantages the Wii U GPU had were offset by the CPU performance. More or less, Nintendo crafted hardware that was on par with the 360 and PS3, but had some fundamental design characteristics with advantages and disadvantages relative to the other consoles.

For the most part, though, the ports really do match up with the specs; even ports that developers talked up and made seem like there would be a nice advantage were inferior. I don't think we need to make excuses; the Wii U is just really poor hardware.

That was my point: "better" at what exactly? Unless the WiiU was a superset in all respects including all aspects of performance, there is no way to guarantee that ported code would run faster on the newer system. And that obviously wasn't the case. CPU and main memory bandwidth alone present hurdles necessitating recoding for ports before we even start considering the GPU at all.

Again, this is unrelated to the Switch, apart from the caution against using ports for performance comparisons. Using Geekbench at the time to compare the PS3 CPU vs. its contemporaries, for instance, you could see even normal subtests (not encryption or the like) vary by more than a factor of five in relative performance. Performance comparisons between architectures are simply difficult. If all we have available for doing them is a really crappy tool, on top of the inherent problems, then maybe we just shouldn't do it.

Then what would you use to compare hardware? Because exclusives are far worse for comparing hardware. We can look at examples like Killzone or Uncharted 2, things most people were saying couldn't be done on 360, yet that was later proven false by Halo 4, Gears of War 3, and Horizon. We just don't know what Naughty Dog could have made possible on 360, or Nintendo for that matter. I think if the hardware is really a step up, ports by competent developers will eventually prove that.
 
For the most part, though, the ports really do match up with the specs; even ports that developers talked up and made seem like there would be a nice advantage were inferior. I don't think we need to make excuses; the Wii U is just really poor hardware.



Then what would you use to compare hardware? Because exclusives are far worse for comparing hardware. We can look at examples like Killzone or Uncharted 2, things most people were saying couldn't be done on 360, yet that was later proven false by Halo 4, Gears of War 3, and Horizon. We just don't know what Naughty Dog could have made possible on 360, or Nintendo for that matter. I think if the hardware is really a step up, ports by competent developers will eventually prove that.

It's not excuses; these are facts. The Wii U had a very different CPU architecture from the 360/PS3, and didn't benefit from years of developers working on the hardware. Look at the evolution of Call of Duty on PS3: it went from a very much inferior game on the PS3 and eventually got to near parity with the 360 by Ghosts. If Wii U had come out in 2010 and sold well, you would have seen an evolution of continued improvements made to the Wii U builds year over year. There was no way developers were going to spend big money customizing their game engines to maximize the potential of a poor-selling console, especially when the PS4 and X1 weren't far away. Wii U actually had some good ports; Mass Effect 3 is very good, and Need for Speed: Most Wanted was excellent. Not bad for hardware that on paper looked drastically inferior to the 360/PS3. You can say garbage hardware all you want, but the business side of things was at least as much to blame for the poor-quality ports.

Of course we are going to use multiplatform games to compare hardware, but that doesn't always paint the whole picture. A crappy-performing multiplat has at times been shown to be simply a shit port. You can imagine if a developer made no use of the Switch's ability to use FP16 shaders. We know from posts sebbi has made that more than half the shaders can benefit from FP16 and work just fine. If a developer simply ports the game to Switch and it's crap, but didn't make use of these unique abilities, it's an example of a crappy port, and is not a good indication of the system's potential. I disagree on exclusives being irrelevant. They are oftentimes the games that squeeze the most out of a console's capabilities; how is that irrelevant? Multiplat developers aren't going to tailor their games to the extent that one piece of hardware benefits greatly while creating a nightmare on the other.
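To make the FP16 point concrete, here's a small sketch of why half precision is good enough for much of shading math but not for everything. It round-trips values through IEEE half precision using Python's stdlib; the specific values are just illustrative:

```python
import struct

def to_fp16(x: float) -> float:
    """Round-trip a value through IEEE 754 half precision (binary16)."""
    return struct.unpack('e', struct.pack('e', x))[0]

# Normalized color/lighting values survive: the error is well below
# one 8-bit color step (1/255), so the output is visually identical.
print(abs(to_fp16(0.7216) - 0.7216) < 1 / 255)   # True

# Large values don't: near 4096 the gap between representable halves
# is 4.0, so fractional world-space coordinates are lost entirely.
print(to_fp16(4096.5))                            # 4096.0
```

This is why shaders working on normalized color and lighting terms are good FP16 candidates, while position math generally stays FP32.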

Switch is in a unique position. No one is questioning that it's significantly less capable than the Xbox One and PS4. The question is just how much effort it takes to down-port to Switch, and what the sacrifices look like. As long as those hurdles are not extreme, I would bet the Western publishers will show up at E3 with more ports heading to Switch than you might expect. It won't be a situation where it's not possible; it will be about it being profitable. Wii got multiple COD games because they sold rather well. Albeit they ran at 30fps, but there is no reason the developer couldn't drop to 30fps on Switch as well. Drop the resolution to 900p and 30fps docked, and that's a tremendous amount of resources freed up immediately. A COD port looks favorable if they are willing to make those compromises.
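A rough way to quantify that last point is raw pixel throughput. This assumes the lead platform targets 1080p60 (shader cost doesn't scale perfectly linearly with pixel count, so treat it as a ceiling on the savings):

```python
# Pixels per second = width * height * frames per second.
full_target = 1920 * 1080 * 60     # 1080p60 on the lead platform (assumed)
reduced_target = 1600 * 900 * 30   # the 900p/30fps docked target suggested above

savings = 1 - reduced_target / full_target
print(f"900p30 pushes {savings:.0%} fewer pixels per second than 1080p60")
```

Dropping to 900p30 cuts the pixel rate by roughly two thirds, which is the "tremendous amount of resources freed" in numbers.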
 
It's not excuses; these are facts. The Wii U had a very different CPU architecture from the 360/PS3, and didn't benefit from years of developers working on the hardware. Look at the evolution of Call of Duty on PS3: it went from a very much inferior game on the PS3 and eventually got to near parity with the 360 by Ghosts. If Wii U had come out in 2010 and sold well, you would have seen an evolution of continued improvements made to the Wii U builds year over year. There was no way developers were going to spend big money customizing their game engines to maximize the potential of a poor-selling console, especially when the PS4 and X1 weren't far away. Wii U actually had some good ports; Mass Effect 3 is very good, and Need for Speed: Most Wanted was excellent. Not bad for hardware that on paper looked drastically inferior to the 360/PS3. You can say garbage hardware all you want, but the business side of things was at least as much to blame for the poor-quality ports.

PS3 and 360 were drastically different; it's a much bigger difference in architecture than Wii U to 360. You might notice most Wii U ports graphically match the 360 version while the PS3 version is lower resolution or has weaker AA. The Wii U GPU is pretty close to the 360's in terms of architecture; its RAM is where it has a huge advantage, and the CPU is where it has a disadvantage, at least in terms of ease of developing for it. So in reality we don't really know whether it was the business side of things or just a really inferior CPU, with the vast majority of ports being inferior, especially those which depended heavily on the CPU. People stating that it was more about business is just speculation, especially when you look at the specs on paper.
 
Wii U could be measured in Gamecubes. :p

I do hope multi-plat games do decently well on Switch. I wouldn't mind seeing some more last gen collections come to it. Dark Siders collection, Mass Effect collection, Tomb Raider collection, etc with visuals upgraded a bit would be cool... Especially on the go! GTAV would be great too.
 
Wii U could be measured in Gamecubes. :p

I do hope multi-plat games do decently well on Switch. I wouldn't mind seeing some more last gen collections come to it. Dark Siders collection, Mass Effect collection, Tomb Raider collection, etc with visuals upgraded a bit would be cool... Especially on the go! GTAV would be great too.
Yes, those are all great. Add Fallout 3 / New Vegas and Skyrim and we have a great portable machine imo.
 
Wii U could be measured in Gamecubes. :p

I do hope multi-plat games do decently well on Switch. I wouldn't mind seeing some more last gen collections come to it. Dark Siders collection, Mass Effect collection, Tomb Raider collection, etc with visuals upgraded a bit would be cool... Especially on the go! GTAV would be great too.

Assuming all cores are equal, Espresso is like 5 GameCubes :p
 
Then what would you use to compare hardware? Because exclusives are far worse for comparing hardware. We can look at examples like Killzone or Uncharted 2, things most people were saying couldn't be done on 360, yet that was later proven false by Halo 4, Gears of War 3, and Horizon. We just don't know what Naughty Dog could have made possible on 360, or Nintendo for that matter. I think if the hardware is really a step up, ports by competent developers will eventually prove that.
Good question.
For professional reasons, I have had a long term interest in benchmarking, and often no numbers are better than bad numbers.
Eyeballing the specs and estimating between thumb and forefinger, if done by someone who has been around for a while, will give a ballpark estimate that circumvents corner cases and all the unknown potholes and wrinkles of any given architecture that may mess up a specific benchmark result, but will tend to even out a bit over a larger project. Estimation also sidesteps the compiler (or, for a game console, the entire software stack) can of worms.
Of course, lacking both technical specs, and in my case, intimate knowledge of the hardware demands of real world graphics software, I'm not in a good position to estimate anything personally as far as the Switch goes, but I guess a large number of developers could do a decent enough job of it.

Where such an analysis would get really interesting is of course the specific wrinkles of a design. How would you play to its strengths? What common techniques would you try to avoid, and use alternative methods to achieve similar results?

Using multiplatform title performance is the only option available to us console armchair experts (among whom I count myself). But quantifying the visual output into relative performance numbers for the underlying architectures? Impossible beyond the very broadest of strokes.
 
Good question.
For professional reasons, I have had a long term interest in benchmarking, and often no numbers are better than bad numbers.
Eyeballing the specs and estimating between thumb and forefinger, if done by someone who has been around for a while, will give a ballpark estimate that circumvents corner cases and all the unknown potholes and wrinkles of any given architecture that may mess up a specific benchmark result, but will tend to even out a bit over a larger project. Estimation also sidesteps the compiler (or, for a game console, the entire software stack) can of worms.
Of course, lacking both technical specs, and in my case, intimate knowledge of the hardware demands of real world graphics software, I'm not in a good position to estimate anything personally as far as the Switch goes, but I guess a large number of developers could do a decent enough job of it.

Where such an analysis would get really interesting is of course the specific wrinkles of a design. How would you play to its strengths? What common techniques would you try to avoid, and use alternative methods to achieve similar results?

Using multiplatform title performance is the only option available to us console armchair experts (among whom I count myself). But quantifying the visual output into relative performance numbers for the underlying architectures? Impossible beyond the very broadest of strokes.
My educated guess is that Switch is much easier to develop for than last-gen consoles (Xbox 360, PS3, Wii U).

Reasons:
1. Switch has a modern OoO CPU. Last-gen CPU code was horrible: lots of loop unrolling to avoid in-order bottlenecks (no register renaming); lots of inlined functions, because with no store-forwarding hardware there was a 40+ cycle stall for reading function arguments from the stack; no direct path between int/float/vector register files (going through memory = LHS stall); variable shift was microcoded (very slow); integer multiply was very slow. There was no data-prefetching hardware, so developers had to manually insert prefetch instructions (even for linear array iteration), and an L2 cache miss without prefetch meant a ~600 cycle stall (with no OoO to hide any of it). Code was filled with hacks to avoid all these CPU bottlenecks.
2. Switch has a modern GPU and a unified memory architecture. Last gen had either eDRAM or split memory (PS3). You always had to fight to fit data into fast GPU memory (256 MB on PS3 was the largest pool, but that was all of it). Switch's 4 GB of unified memory is going to be a life saver. In addition, Maxwell has delta color compression and a tiled rasterizer to automatically reduce memory bandwidth usage (last-gen consoles had no such fancy hardware). Maxwell also has compute shaders, allowing more flexibility and efficiency in rendering techniques.

Of course, if you compared raw peak throughput, Switch might not be that impressive. But nobody got even close to maximum peak performance on last-gen consoles, even if (= when) they spent half of the project's programming effort optimizing the code. A modern CPU & GPU allows developers to focus on critical parts instead of having to optimize the whole code base.
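sebbi's ~600-cycle figure makes it easy to see why cache misses dominated on those in-order cores: nothing hides the stall, so even a modest miss rate swamps everything else. A back-of-envelope sketch, where the miss rate and base CPI are illustrative assumptions, not measurements:

```python
# Effective cycles-per-instruction on an in-order core where nothing hides
# memory stalls. Only the 600-cycle penalty is from the post above; the
# miss rate and base CPI are made-up round numbers for illustration.
stall_cycles = 600          # quoted L2-miss penalty on last-gen console CPUs
miss_per_1000_instr = 5     # assumed: 5 cache misses per 1000 instructions
base_cpi = 1.0              # assumed: 1 cycle per instruction otherwise

effective_cpi = base_cpi + (miss_per_1000_instr / 1000) * stall_cycles
print(f"effective CPI: {effective_cpi:.1f}")   # ~4x slowdown from misses alone
```

Even at half a percent of instructions missing, the core spends three quarters of its time stalled, which is why the hand-inserted prefetches were worth the effort.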
 