The next Nintendo hardware? [2021-09]

The Steam Deck is unlikely to put any sort of dent into the Switch's sales. It's catering to a different market, mainly consumers who are already invested in a library of PC games on Steam. Perhaps there will be greater excitement for the Steam Deck from the segment of Switch owners who have enjoyed gaming on their Switch, are also into PC gaming, and are now interested in playing their Steam library portably. Switch 2 will again lean on its first-party content and the appeal that the hybrid setup offers consumers. The Deck is no more of a threat to Switch 2 than PC gaming is to PS5/X; they coexist very comfortably.

The reviews for the Switch OLED have been very positive. Outside of those who are still upset that it isn't a Pro model, everyone else seems to be very impressed with just how much better games look thanks to the OLED screen. The fact that even the Steam Deck is going with an 800p screen leads me to believe Nintendo can stick with 720p for the Switch 2, potentially offering a premium OLED model at $399 with more internal storage and a $299 model with an LCD screen and practically non-existent internal storage.

The discovery of a patent for what I can only assume is a proprietary version of DLSS for the next Switch leads me to believe Nintendo will lean on it heavily to improve image quality in docked mode. I doubt internal resolutions will go beyond 1080p, with 1440p being the absolute maximum, but with proper DLSS support the image quality will look far superior to plain 1080p even if it falls short of native 4K. Nintendo knows that hours played docked vs portable on Switch are pretty much split down the middle, so if lots of people have been OK with playing 720p games this generation, they will no doubt be fine with sub-native 4K on the Switch 2.
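To put rough numbers on that, here's a quick back-of-the-envelope sketch (my own illustration, nothing Nintendo or Nvidia have confirmed) of how many pixels a DLSS-style upscaler would have to reconstruct at those internal resolutions:

```cpp
#include <cstdio>

int main() {
    struct Res { const char* name; int w, h; };
    const Res target = {"4K", 3840, 2160};
    const Res internals[] = {
        {"720p", 1280, 720}, {"1080p", 1920, 1080}, {"1440p", 2560, 1440}};

    for (const Res& r : internals) {
        // Ratio of output pixels to internally rendered pixels.
        double factor = double(target.w) * target.h / (double(r.w) * r.h);
        std::printf("%6s -> %s: upscaler reconstructs %.2fx the rendered pixels\n",
                    r.name, target.name, factor);
    }
    return 0;
}
```

That works out to 9x for 720p, 4x for 1080p and 2.25x for 1440p. For reference, 4x is roughly what DLSS "Performance" mode does on PC, and 9x is its "Ultra Performance" setting, so none of those ratios are outlandish.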

Nintendo Switch 2 doesn't need to mix things up too much; the formula is solid enough that it warrants a more straightforward successor. Will it sell as well as the original Switch? Probably not, but just like the GBA didn't sell as well as the Game Boy, it will still be very successful while Nintendo figures out its next big idea to turn gaming on its head again.
 
Btw, the Switch OLED has been hacked with the same hardware hack as the Mariko Switch. Dunno how people managed to do it though, as I assume Nintendo updated the protection/checks on the Switch OLED.
 

Correct me if I am wrong, but aren't Pascal and Maxwell the same architecture? I thought I remembered reading that Maxwell became Pascal when they moved from 20nm to 16nm. I know it wouldn't be popular around here, but I am not convinced that Nintendo would have a problem with sticking with the same architecture. The last place Pascal was still being used was a laptop discrete graphics card, the MX350. GPU performance on that card seems like it would be adequate for what Nintendo would want to accomplish with the Switch 2. This would be a custom SoC of course, using ARM CPU cores and presumably tensor cores for DLSS. Shrink things down to a 7nm process and the performance and power consumption all fit into the form factor of the Switch. Nintendo has historically used custom processors; the Tegra X1 was the exception, not the rule. So while it may seem impractical to use an outdated architecture, if Nvidia can't solve the backwards compatibility issue with a better solution, it would be a very Nintendo thing to do to stick with it. It's still more modern than the PS4/X1, and just like with the original Switch, I suspect many of the third-party ports will be from the prior generation and not the current one.
 
If the new hardware is powerful enough, couldn't it just emulate Maxwell? I mean, we see AMD's SoCs emulating the Switch pretty well. Nvidia, who makes the Switch hardware, should be able to make it work at least as well.

That seems reasonable to me, but I'm not educated enough on the subject to know just how likely that would be. I do remember AMD talking about the GPU in the Wii U, and how if it wasn't for their knowledge of the Wii GPU, it wouldn't have been possible to make that GPU backwards compatible with the Wii (even GameCube if you use homebrew). Nvidia is very good on the software and development tools side of things, so perhaps they can provide a layer of software that takes care of the shader compiling issues without having to completely emulate the Tegra X1.
 
If the new hardware is powerful enough, couldn't it just emulate Maxwell? I mean, we see AMD's SoCs emulating the Switch pretty well. Nvidia, who makes the Switch hardware, should be able to make it work at least as well.

Counting on better hardware and more memory, you can just recompile the shaders and cache them.
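As a rough sketch of that approach (purely illustrative C++ with made-up names, not anything from Nvidia's actual toolchain), the idea is just a lookup table keyed on the old shader binary: pay the recompile cost on first sight, reuse the result on every later load.

```cpp
#include <cstdint>
#include <functional>
#include <string>
#include <unordered_map>
#include <vector>

using ShaderBinary = std::vector<uint8_t>;

// Placeholder for the real work: translating a Maxwell-era shader binary
// into whatever the new GPU expects. Here it just copies the input.
static ShaderBinary recompile_for_new_gpu(const ShaderBinary& maxwell_shader) {
    return maxwell_shader;
}

// Stable cache key derived from the shader bytes.
static uint64_t hash_binary(const ShaderBinary& blob) {
    return std::hash<std::string>{}(std::string(blob.begin(), blob.end()));
}

class ShaderCache {
public:
    const ShaderBinary& get(const ShaderBinary& maxwell_shader) {
        const uint64_t key = hash_binary(maxwell_shader);
        auto it = cache_.find(key);
        if (it == cache_.end()) {
            // Cache miss: recompile once, then every later lookup is cheap.
            it = cache_.emplace(key, recompile_for_new_gpu(maxwell_shader)).first;
        }
        return it->second;
    }
private:
    std::unordered_map<uint64_t, ShaderBinary> cache_;
};
```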
 
People were saying the same thing about PS5/Series and Back Compat across different AMD architectures and different instruction sets. Has he learned nothing?
I don't know about PS5, but Xbox One games all run in a virtual machine. So being binary compatible isn't as much of an issue there. You just have to make sure that virtual machine works on your new hardware.
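A toy way to picture why the VM layer helps (my own sketch, not how the Xbox hypervisor is actually structured): the game only ever talks to a stable virtual interface, so new hardware only needs a new backend behind it, not a binary-compatible copy of the old silicon.

```cpp
#include <cstdio>

// The contract the virtualised titles were written against. It never changes.
class VirtualConsole {
public:
    virtual ~VirtualConsole() = default;
    virtual void present_frame() = 0;
};

// One backend per hardware generation; the game binary never sees these.
class XboxOneBackend : public VirtualConsole {
public:
    void present_frame() override { std::puts("presenting via Xbox One GPU path"); }
};

class SeriesXBackend : public VirtualConsole {
public:
    void present_frame() override { std::puts("presenting via Series X GPU path"); }
};

void run_game(VirtualConsole& console) {
    // The "game" only knows the virtual interface.
    console.present_frame();
}

int main() {
    SeriesXBackend newer_hardware;
    run_game(newer_hardware);  // same game code, new backend underneath
    return 0;
}
```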
 
I don't know about PS5, but Xbox One games all run in a virtual machine. So being binary compatible isn't as much of an issue there. You just have to make sure that virtual machine works on your new hardware.
Last gen and current gen PlayStations and Xboxes weren't a massive challenge because the new hardware was a vastly more modern iteration of the old. Some CPU and GPU instructions had changed or disappeared, but those are easy to handle.

Virtual machines don't negate the need for binary compatibility, but they definitely help, because any layer of abstraction is an opportunity to deploy technologies which are now almost standard. But even when you do have a binary translation challenge that can't simply be managed with clever software, like running 360 games on Xbox One, you can get creative.

There was a really interesting article in 2017 by Digital Foundry, where they spoke with Microsoft about how they pull apart select 360 games, translate the binaries, make some other tweaks, then effectively re-compile and package them for the new machine. Obviously not technically "backwards compatible" because it's not running the original game, but does it matter? Aside from the effort involved.
 
BC 360 games run in a virtualised 360 environment, specially configured and packaged with the binary, with some code altered for modern convenience. Stuff like adding the higher-quality FMV for FFXIII. Most of the game code is untouched and stuff like res boosts, AF, and higher framerates are all settings changed in the VM I think. But the games don't know they aren't running on a 360, and they are even network compatible for multiplayer and cloud saves. You can play Crimson Skies networked between Xbox, 360, One and Series consoles.
 
Most of the game code is untouched and stuff like res boosts, AF, and higher framerates are all settings changed in the VM I think.

The game code is recompiled (what Microsoft weirdly call transcompilation). None of it is the same. There is no way binaries targeting the tri-core 3.2GHz PowerPC-based Xenon CPU will run on the 1.75GHz Jaguar cores. Microsoft explain the whole process in the Digital Foundry article.
 
People were saying the same thing about PS5/Series and Back Compat across different AMD architectures and different instruction sets. Has he learned nothing?

The difference being that AMD has now turned over a new leaf. Ever since GCN, AMD has started maintaining binary compatibility for their GPUs just like they do for their x86 CPUs. Nvidia has yet to show that they're willing to do the same. Unless Nvidia drops their future plans for introducing incompatible architectures or assigns a dedicated hardware design team to specifically extend the Maxwell architecture, their partners' options for retaining binary compatibility remain limited ...
 
The game code is recompiled (what Microsoft weirdly call transcompilation). None of it is the same.

I think what see colon is stressing is that they do not change the source code, nor does it change the game logic or overall API usage.

For those not following along and reading DF, this process takes the existing, already-compiled game binaries, translates them into x86 binary instructions better suited to the new CPU, and wraps that in a virtual X360 environment.
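For anyone curious what that looks like in the abstract, here's a toy C++ sketch of the translate-once-then-run structure. It handles exactly one real PowerPC instruction (addi) and emits C++ lambdas instead of actual x86 machine code, so it only shows the shape of the process, not Microsoft's actual tooling.

```cpp
#include <cstdint>
#include <cstdio>
#include <functional>
#include <vector>

struct GuestRegs { int32_t gpr[32] = {}; };
using HostOp = std::function<void(GuestRegs&)>;

// Translate one 32-bit PowerPC instruction word into a host operation.
// addi rD, rA, SIMM -> primary opcode 14 (top 6 bits of the word).
static HostOp translate(uint32_t insn) {
    const uint32_t opcd = insn >> 26;
    if (opcd == 14) {
        const int rd = (insn >> 21) & 0x1F;
        const int ra = (insn >> 16) & 0x1F;
        const int16_t simm = static_cast<int16_t>(insn & 0xFFFF);
        return [=](GuestRegs& r) {
            // PowerPC quirk: rA == 0 means the literal value 0, not gpr[0].
            r.gpr[rd] = (ra == 0 ? 0 : r.gpr[ra]) + simm;
        };
    }
    return [](GuestRegs&) { /* unhandled opcode: real tools cover them all */ };
}

int main() {
    // Guest code: addi r3, r0, 5  then  addi r4, r3, 7
    const std::vector<uint32_t> guest_code = {0x38600005, 0x38830007};

    std::vector<HostOp> host_code;                // the "transcompiled" program
    for (uint32_t insn : guest_code) host_code.push_back(translate(insn));

    GuestRegs regs;
    for (const HostOp& op : host_code) op(regs);  // run only the translated code
    std::printf("r3=%d r4=%d\n", regs.gpr[3], regs.gpr[4]);  // prints r3=5 r4=12
    return 0;
}
```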
 
I think what see colon is stressing is that they do not change the source code, nor does it change the game logic or overall API usage.
Sure, all coders will understand that "transcompiling" a binary does not change source code, which in most cases wouldn't even be accessible to Microsoft. In the context of backwards compatibility this is important. Xbox One/Series is oft cited as a platform that has multiple generations of backwards compatibility, but the truth is it has a single generation of backwards compatibility: Xbox One. Every older 'compatible' game, both 360 and OG Xbox, was subject to work by Microsoft's teams. Modern Xboxes cannot run native code/games from the original Xbox (which was also 80x86) or the Xbox 360 (which was PowerPC). OG Xbox compatibility was limited for this reason, and OG/360 compatibility on modern Xboxes is limited for the same reason.

What I'm saying is, let's not overstate the problem-solving abilities of virtual machines. They aren't magic. This is not a bash against the Xbox; it's stating technical realities.
 
Yes, Xbox has one generation of easy(ier) native backwards compatibility, while the other generations require additional software tools along with VMs. VMs by themselves won't solve everything, but they can make things easier than if they weren't used.

I disagree on the scope of older compatible games; they're limited by legal restrictions more so than technical ones.
 
I disagree on the scope of older compatible games; they're limited by legal restrictions more so than technical ones.
Yup, the gradual increase in the use of licensed content, which began during the CD era, definitely came back to bite publishers in the arse. I'm still waiting to see how the GTA Trilogy Definitive Edition soundtracks compare with the original games.
 