Switched Universe: Nintendo Switch design wishes *spin-off*

They could have put in even more power if they had not made those gimmick control schemes; a real portable with a dock and a wireless controller would have been so much better.
They used the best hardware that was available at the time. The Tegra X2 was not available in the required quantity. The PS4 used year-old chipsets as well and was not the best that could've been used. Perhaps if the Switch had launched in the fall the X2 could've been used, but the bottom line is these consoles need to be profitable.

Not to mention that if the hardware itself were smaller, they could only fit even less powerful chips inside, or would at least have worse battery life and more heat.

Just because you have no interest in new control schemes doesn't mean there's no use for them.
 
Crazy. Interesting stuff Bethesda is trying to pull off / cash in on.

I have a feeling there is a good chance Nintendo struck a deal with Bethesda to get this support. Bethesda doesn't exactly have a history of supporting Nintendo, and on top of that the hardware presents a challenge that seems like something Bethesda would have declined to tackle in the past. For Bethesda, these are pieces of software being ported late, which doesn't bring the same challenges a simultaneous multi-platform release would. Finalized and optimized code already exists for both Skyrim and Doom, and there won't be too many bugs and resource-sucking gremlins left in the pipeline. That work has already been done, and now it's about tweaking settings, scaling down assets, and testing to see what resolution and framerate are sustainable.

They could have put in even more power if they had not made those gimmick control schemes; a real portable with a dock and a wireless controller would have been so much better.

Possibly, but within the form-factor limitations the Switch operates under, there really aren't too many processors available that match or exceed the Tegra X1 for graphics processing. Even for the mobile processors that do outclass the Tegra X1, is the gap enough to be a meaningful difference? Even a 2x performance boost doesn't really make a night-and-day difference. Heck, the Switch gets a 2.5x boost going from portable to docked, and the biggest difference is primarily found in an increase in resolution and not much else.
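To put that 2.5x in perspective, here's a quick back-of-the-envelope sketch; the core count and clocks are the widely reported figures for the TX1 in the Switch, not official Nintendo numbers:

```python
# Rough FP32 throughput for the Switch's Maxwell GPU, using the
# widely reported (unofficial) clock figures.
CUDA_CORES = 256           # Tegra X1's Maxwell GPU
PORTABLE_MHZ = 307.2       # reported handheld GPU clock
DOCKED_MHZ = 768.0         # reported docked GPU clock

def gflops(cores, mhz):
    # 2 FP32 ops per core per cycle (fused multiply-add)
    return cores * 2 * mhz / 1000.0

print(f"portable: {gflops(CUDA_CORES, PORTABLE_MHZ):.0f} GFLOPS")  # ~157
print(f"docked:   {gflops(CUDA_CORES, DOCKED_MHZ):.0f} GFLOPS")    # ~393
print(f"ratio:    {DOCKED_MHZ / PORTABLE_MHZ:.1f}x")               # 2.5x
```

Notably, 1080p has 2.25x the pixels of 720p, so a 2.5x boost is almost exactly the resolution jump plus a little headroom, which is consistent with the docked mode mostly buying resolution.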

Portable co-op is a big deal to a lot of people. I have played co-op Mario Kart 8 and Ultra Street Fighter 2 on my lunch breaks many times now. Even solo tabletop play is pretty appealing at times. I think it's safe to say that all products have features you would gladly exchange for something else. For some people, not having system voice chat is a big deal; others couldn't care less. Free services like Discord are available to them, and a lot of people prefer not to chat with raging randoms anyway. Then there are those who just love to hate on "insert product", and we all know, haters gonna hate. LOL
 
They used the best hardware that was available at the time.
Because every portable game console that existed before the Switch had to use years-old existing SoCs, and no console maker ever dared to order custom SoCs or even make their own?


Tegra X2 was not available in the required quantity.
Regardless of how little it matters whether Parker (or any other SoC originally built for other kinds of devices) was a good fit for the Switch, I'm pretty sure you don't know this.
Because if you did, you wouldn't be able to talk about it.


I have a feeling there is a good chance Nintendo struck a deal with Bethesda to get this support. Bethesda doesn't exactly have a history of supporting Nintendo, and on top of that the hardware presents a challenge that seems like something Bethesda would have declined to tackle in the past.
On the contrary, id Software has made notable titles for smartphone hardware that were great technical achievements for their time.

There's the idTech 4 title Doom Resurrection, which ran on the iPhone 3GS with a Samsung SoC that had a PowerVR SGX535 (rated at 1.6 GFLOPS).

And one year later they made Rage HD, which ran on the same hardware and used idTech 5.


idTech has shown it can scale downwards really well.
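Rough numbers on that span, taking the SGX535 figure quoted above and the commonly cited estimate for the Xbox 360's Xenos (a console Rage also shipped on):

```python
# Approximate GPU-throughput span idTech 5 content has covered.
SGX535_GFLOPS = 1.6   # iPhone 3GS GPU, figure quoted above
XENOS_GFLOPS = 240.0  # Xbox 360 GPU, commonly cited estimate

print(f"{XENOS_GFLOPS / SGX535_GFLOPS:.0f}x")  # ~150x between targets
```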
 
Because every portable game console that existed before the Switch had to use years-old existing SoCs, and no console maker ever dared to order custom SoCs or even make their own?

Because every console since the PS3 has used years-old existing chips. ;)
Regardless of how little it matters whether Parker (or any other SoC originally built for other kinds of devices) was a good fit for the Switch, I'm pretty sure you don't know this.
Because if you did, you wouldn't be able to talk about it.

It's an educated guess based on the fact that, once again, since the PS3 no console has used state-of-the-art chipsets, and I'm sure chip yields had at least something to do with all of those releases.
 
Even the PS3 didn't use state-of-the-art chipsets; its GPU was already old when it released.
 
Yeah, that's what I meant; the PS3 was the first to start the trend. I think even the Cell was ready in 2005.

Why start there? The original Xbox, hell, even the NES used off-the-shelf hardware.

I'm not sure this bolsters your argument about the Switch being the best option, though. A better option would have been semi-custom silicon on the latest node, like what Sony and Microsoft accomplished with AMD, instead of the downclocked off-the-shelf TX1 Nvidia sold to Nintendo.
 
*best option on Nintendo's budget and timeline.

The best option without a budget is some quantum-phased device that derives zero-point energy from a parallel universe, was designed in null time, brought back in a DeLorean, and stored in some crummy cave for hundreds of years until 2017.
 
Why start there? The original Xbox, hell, even the NES used off-the-shelf hardware.

I'm not sure this bolsters your argument about the Switch being the best option, though. A better option would have been semi-custom silicon on the latest node, like what Sony and Microsoft accomplished with AMD, instead of the downclocked off-the-shelf TX1 Nvidia sold to Nintendo.
It's not about being off the shelf or not, but the Xbox was pretty state of the art; it just needed more bandwidth, really. Its GPU was not an off-the-shelf part.

A 16nm Switch could've achieved higher clocks, but the silicon would've been the same. I guess the one major thing that could've been changed would be the addition of a 128-bit memory bus, but I couldn't say whether it was within reason to do on the 20nm process or not. It certainly would've been very beneficial, though. The thing that gets me is that CPU core locked to the OS; a secondary chip for the OS would've been great. But that's not an issue with the TX1 itself.
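For what it's worth, the bandwidth math on that wish is simple; the transfer rate below is the commonly reported spec for the Switch's LPDDR4, so treat it as an assumption:

```python
# Peak theoretical bandwidth = transfer rate x bus width in bytes.
LPDDR4_MTS = 3200      # reported 1600 MHz LPDDR4 (double data rate)
ACTUAL_BUS_BITS = 64   # TX1's memory interface as shipped
WISHED_BUS_BITS = 128  # the hypothetical wider bus discussed above

def bandwidth_gbs(mts, bus_bits):
    return mts * (bus_bits // 8) / 1000.0

print(bandwidth_gbs(LPDDR4_MTS, ACTUAL_BUS_BITS))  # 25.6 GB/s
print(bandwidth_gbs(LPDDR4_MTS, WISHED_BUS_BITS))  # 51.2 GB/s
```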

It's worth noting that Sony and MS had 3 years to shrink their hardware, while there was only a year and a half between the TX1's launch and the Switch's launch.
 
Because every console since the ps3 has used years old existing chips. ;)
- The PS Vita launched with the most powerful gaming SoC ever built until the iPad 3 around 1.5 years later, carrying a top-performing mobile GPU together with the highest-end ARM cores at the time and a never-before-seen WideIO + LPDDR memory arrangement.
- The PS4 launched with the most powerful gaming APU ever built until the PS4 Pro released, carrying a brand new GCN2 GPU (the first GCN2 GPU to ship on PC, Bonaire, released 2 months before the console).
- The XBOne launched with the second most powerful gaming APU ever built until the PS4 Pro released, again with a brand new GCN2 GPU.
- The PS4 Pro launched with the most powerful gaming APU ever built until XBoneX's release, carrying a GPU architecture that has features from Vega which only released ~8 months later.
- The XBoneX will launch with the most powerful gaming APU ever built.

In fact, the only consoles that managed to use "old tech" since the PS3's launch were the 3DS with ARM11 cores and an OpenGL ES 1.1 GPU, the Wii U with Power750 cores and a pre-GCN GPU with a DX10-level featureset, and the Switch with an off-the-shelf, 2-year-old, tired and boring, extra-downclocked SoC.
 
... and the Switch with an off-the-shelf, 2-year-old, tired and boring, extra-downclocked SoC.
That argument was going well until this editorial narrative. ;) I don't think the chipset is tired and boring, because it can be targeted explicitly as a console chip. Until the Switch, Tegra K1 and X1 got Android multiplatform code by and large; the install base didn't justify writing Shield-optimised code. The Switch provides a large enough platform that maxing out the hardware is an economically viable option. Thus, regardless of what raw power exists in other Android devices, the Switch is the platform that'll use its potential to best advantage. We should, hopefully, see something akin to console vs. PC in terms of efficiency and utilisation.

Only Apple is in a position to outperform the Switch in high-end devices, thanks to a single architecture and universal Metal support, and only if those devices are targeted effectively, which is still questionable given the intention to sell the same product on older devices. Again, like PC, the bleeding edge will likely go mostly unused on Apple for the next couple of years, limiting software to 2-year-old designs.

So, definitely not tired and boring. It can't be tired, as Tegra is hardly used anywhere, and it won't be boring, because devs will explore the architecture and find great ways to do stuff with highly optimal code, as long as the Switch continues selling well.
 
That argument was going well until this editorial narrative.
That "tired and boring" part was obviously my own personal opinion, and I exaggerated to make it obvious :)
Nintendo could have achieved so much, yet they did so little. They launched a console perfectly timed for a 16FF SoC, for much more power-efficient Cortex A72 cores, and for low-hanging-fruit optimizations like using 1 or 2 low-clocked LITTLE cores for a developer-transparent OS (where OoO isn't that important) while freeing power to make more big cores available to developers, a significantly wider GPU, 128-bit-wide LPDDR4...
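The "OS on LITTLE cores" idea is just CPU partitioning by affinity. Here's a minimal illustrative sketch using Linux affinity calls from Python; the core numbering and the notion that the Switch's OS works this way are my assumptions, not anything Nintendo has documented:

```python
import os

# Hypothetical big.LITTLE layout: big cores 0-3 for the game,
# LITTLE cores 4-5 reserved for OS/background work. The IDs are
# made up for illustration; the Switch's real scheme is undocumented.
BIG_CORES = {0, 1, 2, 3}
LITTLE_CORES = {4, 5}

def pin_current_process(cores):
    # Restrict the calling process (pid 0 = self) to the given cores.
    os.sched_setaffinity(0, cores)

# An OS-side service would pin itself to the LITTLE cluster,
# leaving every big core free for the game's threads.
pin_current_process(LITTLE_CORES)
```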

Instead, they simply bought a stock of a 2-year-old SoC that was originally intended for tablets + automotive + set-top boxes but was never very competitive in the power-efficiency area, disabled a good part of the chip (A53 cluster, I/O, video-encoding blocks, etc.), and then downclocked it until it fit the Switch's power target (which, again, is not what the TX1 was intended for during development).

And then they further saved money on the stupidest things, like cameras for AR games and a microphone. A friggin' microphone! People who are playing multiplayer games with party chat need to drain their smartphone battery and use a buggy app!


I know mine is an old and tired argument, but I can't just see posts saying "they used the best they could" and let it pass. The Tegra X1 is not the best they had off the shelf, and it's not the best they could do. It's so far away from the best they could do that one can only wonder why Microsoft or Sony haven't announced something to counter it. My guess is they're waiting for the fabrication processes to evolve to the point where they can make mobile versions of the 2013 consoles.
 
As far as I'm concerned, it is entirely foolish to complain about what is in the Switch now, and useless to speculate about what could have been. This is Nintendo being discussed, so it should be a given they won't make the best decision when it comes to performance. They couldn't care less. This is three generations in a row they've come out with underwhelming hardware, yet here we are.

It's Nintendo. It should be a given the system won't be a beast.
 
My guess is they're waiting for the fabrication processes to evolve to the point where they can make mobile versions of the 2013 consoles.

I'd add to that: waiting for AMD to improve their GPU power efficiency dramatically as well, if they intend to use AMD again, that is. Polaris and Vega improved a bit, but they are still behind Nvidia in performance per Watt. Even if they are now in Maxwell's league, they would not want to make just a "me too", especially after the Switch changed the rules of the game by selling itself as a hybrid console. I doubt a pure mobile console would have a lot of appeal now vs. the Switch without being massively overpowered against it.
 
- The PS4 launched with the most powerful gaming APU ever built until the PS4 Pro released, carrying a brand new GCN2 GPU (the first GCN2 GPU to ship on PC, Bonaire, released 2 months before the console).
- The XBOne launched with the second most powerful gaming APU ever built until the PS4 Pro released, again with a brand new GCN2 GPU.
- The PS4 Pro launched with the most powerful gaming APU ever built until XBoneX's release, carrying a GPU architecture that has features from Vega which only released ~8 months later.
- The XBoneX will launch with the most powerful gaming APU ever built.

Jaguar was a year old, and it only gets older with each 8th-gen refresh. The PS4's GPU was Pitcairn, a first-gen GCN part. There's nothing state of the art about any of these chips you listed, and the Xbox One using Bonaire doesn't net it any advantages over the PS4. And that's just talking about the architecture. There were 3 GPU tiers above the PS4: the fully featured 7870, 7950, and 7970, the last of which launched almost a full year before the PS4. So yes, none of these consoles were using the best technology available.
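The tier gap is easy to quantify; the shader counts and clocks below are the commonly listed desktop specs, with the PS4 assumed at the usual 18 CUs x 64 lanes at 800 MHz:

```python
# FP32 throughput = shaders x 2 ops/cycle x clock (MHz).
def tflops(shaders, mhz):
    return shaders * 2 * mhz / 1e6

print(tflops(1152, 800))   # PS4 (18 CUs): ~1.84 TFLOPS
print(tflops(1280, 1000))  # HD 7870:      ~2.56 TFLOPS
print(tflops(1792, 800))   # HD 7950:      ~2.87 TFLOPS
print(tflops(2048, 925))   # HD 7970:      ~3.79 TFLOPS
```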

And I don't know why you keep repeating "most powerful console chip" as if you're a PR spokesman; that's not what we're discussing. We're discussing whether a specific console chip was the best available or not.

Admittedly, I don't know much about the Vita, but it was a very nice piece of hardware. I kind of doubt it was the best of the best hardware available, not that it really mattered, since anything stronger would've been running on a bogged-down OS like Android or Apple's.
 
I doubt a pure mobile console would have a lot of appeal now vs. the Switch without being massively overpowered against it.
There won't be a pure mobile console ever again, IMO. It's so easy to put external video support in there that any new portable console will be able to run as a TV console just by supporting an external controller.
 
and the Switch with an off-the-shelf, 2-year-old, tired and boring, extra-downclocked SoC.

Don't forget the Vita was massively downclocked as well. Instead of the advertised 2 GHz, the CPU only ran at 333 MHz, and the GPU at only 111 MHz. With Wi-Fi disabled it could go to 444 MHz on the CPU and 222 MHz on the GPU.
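Taking those numbers at face value, the shipped clocks are a small fraction of what was advertised (a trivial sketch using only the figures quoted above):

```python
# Vita clocks as quoted above (not independently verified here).
ADVERTISED_CPU_MHZ = 2000
CPU_WIFI_ON_MHZ = 333
CPU_WIFI_OFF_MHZ = 444

print(CPU_WIFI_ON_MHZ / ADVERTISED_CPU_MHZ)   # ~0.17 of advertised
print(CPU_WIFI_OFF_MHZ / ADVERTISED_CPU_MHZ)  # ~0.22 of advertised
```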
 