Nintendo Switch Tech Speculation discussion

Well there are obviously plenty of people ok with not playing the "proper" versions since they have been buying them year after year on the 360/PS3.
These versions sell because people are stuck on old hardware. Early adopters won't buy a highly inferior version on new hardware.
 
What better compression? How much better compression do these techniques provide?
  • Video is easiest: Use HEVC (H.265) or VP9 instead of H.264 and save about 50% at the same quality. Pascal has HEVC decode support; not sure about VP9.
  • Texture data: Difficult topic. If you can get away with lossy compression and need to use JPEG, use a non-standard encoder like Mozjpeg (https://github.com/mozilla/mozjpeg). It saves about 15-20% at the same quality in my experience. If you don't need JPEG, use a better format like BPG (http://bellard.org/bpg/). BPG is based on HEVC, so it may already be supported in hardware on Pascal; at least it should not be hard to provide hardware support. It allegedly saves up to 50% at the same quality. If you need lossless compression and must stick to an existing format, you can use the ZopfliPNG encoder (https://github.com/google/zopfli) for PNGs, which saves up to 15%. If you can use something fancier, use WebP or FLIF (http://flif.info/) for additional savings. Not sure how hard hardware support for those formats would be. Maybe there are even better compression algorithms specifically for texture data. I'm sure nVidia has some engineers sitting around who are paid to come up with solutions for that.
  • World data: Without changing the data: use something better than zlib compression (e.g. LZMA2). Compression rates can vary greatly between different algorithms; see http://maximumcompression.com/ for a glance at what is possible. When you can reorganize the data you can probably save even more. In the FP16 thread sebbi said that world data is usually 32-bit integers. One idea would be to reorganize the world into an octree structure and store coordinates relative to the node they are in. If the octree subdivision is chosen properly, this could save 1 byte of precision per value, maybe a few bits more. That might save maybe (I'm guessing here) 15% on the raw world data (see the sketch after this list). It could also make patching take less space: instead of replacing the whole world data, you just need the data for the nodes that changed. Game engines may already do something like this. I guess the possibilities here are endless.
  • Audio: Just lower the bitrate. MP3 encoders, for instance, have become very good, so for a game 128 kbps (16-bit, 44.1 kHz) should suffice. While you may say that games already do this, I faintly remember one game (Titanfall?) that used uncompressed audio on PC because of latency issues. With flash or cartridge memory and hardware support that should not be a problem.
I think the problem is solvable.
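To make the octree idea concrete, here is a minimal Python sketch of storing coordinates relative to a node origin. The node size, data layout, and sample points are all made up for illustration; it shows only the mechanism, not a real engine format:

```python
import struct

NODE_SIZE = 65536  # hypothetical node extent: offsets inside it fit in 16 bits

def pack_relative(points):
    # Store one full-precision 32-bit origin per node, then 16-bit offsets
    # for every vertex relative to that origin.
    origin = tuple(min(p[i] for p in points) for i in range(3))
    blob = struct.pack("<3i", *origin)
    for x, y, z in points:
        blob += struct.pack("<3H", x - origin[0], y - origin[1], z - origin[2])
    return blob

# Made-up sample: 1000 vertices clustered inside one octree node.
points = [(100000 + i, 200000 + 2 * i, 300000 + 3 * i) for i in range(1000)]
raw = len(points) * 3 * 4  # 12 bytes per point as plain int32
packed = len(pack_relative(points))
print(f"raw: {raw} B, relative: {packed} B ({100 * (1 - packed / raw):.0f}% saved)")
```

On this toy data the saving approaches the theoretical 50% of going from 32-bit to 16-bit values; real world data mixed with other payloads would land much lower, more in line with the 15% guess above.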
 
The Wii, Wii U, and 3DS all support SD/microSD, so Nintendo being Nintendo would probably continue to support it.
You're right. I actually totally forgot about that. It's been almost 10 years since I bought a card for my Wii and I didn't even know my 3DS XL(*) had one. Sorry!

(*) I bought the (New) 3DS XL only to finally play through Majora's Mask some 15 years later. Full disclosure: still haven't done it. :runaway:
 
These versions sell because people are stuck on old hardware. Early adopters won't buy a highly inferior version on new hardware.
This is a market discussion, rather than hardware. Isn't a version you can play in bed, on the train, in the back of the car and so on actually the superior version from an entertainment and usability standpoint? And wouldn't the publishers be happy to sell to that demographic as well?
Also consider just who would be early adopters of the Switch. Wouldn't they to a large proportion be people who find the portability to be valuable or at least intriguing?

(By the way, people aren't "stuck" on older hardware. They actually stay there by choice, because the older device serves them better, incomprehensible as that may be for technology enthusiasts. There is an initial peak of sales to the "must have the latest gadget" group, but that volume is only a tiny fraction of the total, and sales have traditionally grown over the first three/four years. And it is not because it takes the average buyer three years to save up $299....)
 
  • Video is easiest: Use HEVC (H.265) or VP9 instead of H.264 and save about 50% at the same quality. Pascal has HEVC decode support; not sure about VP9.
  • Texture data: Difficult topic. If you can get away with lossy compression and need to use JPEG, use a non-standard encoder like Mozjpeg (https://github.com/mozilla/mozjpeg). It saves about 15-20% at the same quality in my experience. If you don't need JPEG, use a better format like BPG (http://bellard.org/bpg/). BPG is based on HEVC, so it may already be supported in hardware on Pascal; at least it should not be hard to provide hardware support. It allegedly saves up to 50% at the same quality. If you need lossless compression and must stick to an existing format, you can use the ZopfliPNG encoder (https://github.com/google/zopfli) for PNGs, which saves up to 15%. If you can use something fancier, use WebP or FLIF (http://flif.info/) for additional savings. Not sure how hard hardware support for those formats would be. Maybe there are even better compression algorithms specifically for texture data. I'm sure nVidia has some engineers sitting around who are paid to come up with solutions for that.
  • World data: Without changing the data: use something better than zlib compression (e.g. LZMA2). Compression rates can vary greatly between different algorithms; see http://maximumcompression.com/ for a glance at what is possible. When you can reorganize the data you can probably save even more. In the FP16 thread sebbi said that world data is usually 32-bit integers. One idea would be to reorganize the world into an octree structure and store coordinates relative to the node they are in. If the octree subdivision is chosen properly, this could save 1 byte of precision per value, maybe a few bits more. That might save maybe (I'm guessing here) 15% on the raw world data. It could also make patching take less space: instead of replacing the whole world data, you just need the data for the nodes that changed. Game engines may already do something like this. I guess the possibilities here are endless.
  • Audio: Just lower the bitrate. MP3 encoders, for instance, have become very good, so for a game 128 kbps (16-bit, 44.1 kHz) should suffice. While you may say that games already do this, I faintly remember one game (Titanfall?) that used uncompressed audio on PC because of latency issues. With flash or cartridge memory and hardware support that should not be a problem.
I think the problem is solvable.
1. Games do not use much video nowadays anyway.
2. The PS1 and PS2 had dedicated JPEG decoders. Did any game use them for textures? No.
Current texture formats are supported in hardware. You would need to decode JPEG to raw (or to a supported texture format) in memory to use it, which needs more memory and CPU.
3. Decompression is slow and resource-intensive.
4. Patents. MP3 and AAC need licensing by the developer. Nintendo has used ADPCM for a long time.
Sony and MS made their own lossy formats with hardware support.
I assume Nintendo would still use ADPCM, as I do not think they made their own format or that Nvidia implemented one.
Nintendo already uses extremely low-quality music in Smash Bros. for 3DS.
 
There is still progress in compression algorithms today, so I wouldn't exclude better compression as an option. It would be nice to have hardware compression support (general-purpose; obviously the GPU already supports compressed texture data) to save time too, but I wouldn't count on it.
 
1. Games do not use much video nowadays anyway.
2. The PS1 and PS2 had dedicated JPEG decoders. Did any game use them for textures? No.
Current texture formats are supported in hardware. You would need to decode JPEG to raw (or to a supported texture format) in memory to use it, which needs more memory and CPU.
3. Decompression is slow and resource-intensive.
4. Patents. MP3 and AAC need licensing by the developer. Nintendo has used ADPCM for a long time.
Sony and MS made their own lossy formats with hardware support.
I assume Nintendo would still use ADPCM, as I do not think they made their own format or that Nvidia implemented one.
Nintendo already uses extremely low-quality music in Smash Bros. for 3DS.
1) Maybe. So they won't be an issue. Some still do (Quantum Break anyone?). It's good to have a solution.
2) Textures were used differently back then. Most of the time one texture => one file or one memory chunk. Now we have mega textures, i.e. one image that contains all or most of the textures. Much better suited for compression and much bigger gains to be had.
3) It's not when it's done in hardware. Google, for instance, "HEVC skylake kaby lake power" and see how much less power (edit: and CPU) is needed when doing things directly in hardware.
4) Use Ogg Vorbis instead. ADPCM is awful.
 
Well there are obviously plenty of people ok with not playing the "proper" versions since they have been buying them year after year on the 360/PS3. Many millions of copies have been sold on those older consoles post-PS4/Xbone launch.

So how will you be playing your non-skanky version during your lunch break? During your flight heading out for vacation? You're ignoring the Switch's obvious selling point and advantage.

I believe that many of these third-party titles, especially sports titles, have a better chance of doing well on Switch thanks to it being mobile. There are lots of multi-device consumers out there, but does anyone ever buy Madden or FIFA for both the Xbox One and the PS4? Nope, of course not. With the Switch, having a second version isn't out of the question.
I'm not massively disagreeing with you here, but AFAIK sales of the older versions dive pretty badly. E.g. FIFA 17 opening week: 53% on PS4, 40% on XB1. So only 7% of sales across last gen and boxed PC. Sales of FIFA 16 heavily favoured next gen too. Given that non-elite versions of their games don't sell so well, what's the incentive to invest in porting a non-elite version? And given Nintendo's track record for its past two home consoles, how many devs are really going to get behind it with their best efforts, rather than just looking for a few easy wins to see what happens?

So there's still plenty of reason to doubt even cheapo sports games IMO. The portable co-op experience won't translate to complex games like FIFA, and would be better suited to something like Sensible Soccer.

Isn't a version you can play in bed, on the train, in the back of the car and so on actually the superior version from an entertainment and usability standpoint? And wouldn't the publishers be happy to sell to that demographic as well?
Wouldn't that argument be true of all games? So you'd see better software sales on handhelds? How are Vita's software sales doing? So no, playing FIFA lying in bed isn't as good as playing it on the big screen, except maybe for couch co-op. Any more than watching a blockbuster on your iPhone is better than watching it at the cinema. ;)

(By the way, people aren't "stuck" on older hardware. They actually stay there by choice...)
Stuck just means they choose not to spend $300 they need (want) to spend on other things. If the consoles were $100, everyone would upgrade. So 'stuck' is a reasonable short-hand IMO. Just like we're all 'stuck' on standard TVs and not buying HDR TVs.
 
Wouldn't that argument be true of all games? So you'd see better software sales on handhelds? How are Vita's software sales doing? So no, playing FIFA lying in bed isn't as good as playing it on the big screen, except maybe for couch co-op. Any more than watching a blockbuster on your iPhone is better than watching it at the cinema. ;)
But then Switch offers the big-screen experience too, only probably at somewhat reduced fidelity compared to the purely stationary consoles. Overall it offers more, apart from possibly some subsurface-scattered nasal pores, and it will be attractive to a partly different demographic from the PS4/Xbone, therefore growing the total audience for the multiplatform titles.

I think that will have some appeal for publishers.
 
There is still progress in compression algorithms today, so I wouldn't exclude better compression as an option. It would be nice to have hardware compression support (general-purpose; obviously the GPU already supports compressed texture data) to save time too, but I wouldn't count on it.
Do you genuinely believe that better compression will get 50 GB games down to 9 GBs? And more so, that console games won't use these compression systems as well to fit more in?

1) Maybe. So they won't be an issue. Some still do (Quantum Break anyone?). It's good to have a solution.
You're talking about getting 50 GB games onto a 10 GB footprint or something - enough that the lack of a BR drive and HDD won't hurt Switch. If the current game file sizes aren't due to video, then there's nothing to be gained from using alternative video compression.
2) Textures were used differently back then. Most of the time one texture => one file or one memory chunk. Now we have mega textures, i.e. one image that contains all or most of the textures. Much better suited for compression and much bigger gains to be had.
Hardly anyone is using megatexturing. And megatexturing creates huge files. Everyone else is using compressed texture maps handled in hardware. There's very little saving to be gained in texture compression from new techniques.
4) Use Ogg Vorbis instead. ADPCM is awful.
All games use compressed audio by and large. Again, virtually no savings to be had.
World data: Without changing the data: use something better than zlib compression (e.g. LZMA2). Compression rates can vary greatly between different algorithms; see http://maximumcompression.com/ for a glance at what is possible.
From your link, compression rates don't vary greatly. By and large you're looking at 80% compression for much of the top end of that table. Speed is also an issue, and the reason devs use a particular compression scheme is the balance between compression ratio and performance. If they're only getting 50% compression, it's for performance reasons.
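For a feel of that balance, here's a quick sketch using Python's stdlib zlib and lzma; the input is synthetic filler, so the exact ratios and timings are illustrative only and won't match real game assets:

```python
import lzma
import time
import zlib

# Synthetic stand-in for asset data: repetitive enough to compress,
# varied enough not to vanish entirely. Not representative of real games.
data = b"".join(bytes(range(i % 256)) for i in range(4000))

for name, compress in [
    ("zlib level 1", lambda d: zlib.compress(d, 1)),
    ("zlib level 9", lambda d: zlib.compress(d, 9)),
    ("lzma preset 9", lambda d: lzma.compress(d, preset=9)),
]:
    t0 = time.perf_counter()
    out = compress(data)
    ms = (time.perf_counter() - t0) * 1000
    print(f"{name}: {len(out)}/{len(data)} bytes "
          f"({100 * len(out) / len(data):.1f}%), {ms:.1f} ms")
```

The stronger scheme wins on ratio but costs more time, which is exactly the trade-off that drives the choice.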
When you can reorganize the data you can probably save even more. In the FP16 thread sebbi said that world data is usually 32-bit integers. One idea would be to reorganize the world into an octree structure and store coordinates relative to the node they are in. If the octree subdivision is chosen properly, this could save 1 byte of precision per value, maybe a few bits more. That might save maybe (I'm guessing here) 15% on the raw world data.
Not sure what you mean by world data. If it's stuff you're writing, like a Skyrim world state, you need to just write data to a database and can't afford to compress it. Compression is only really good for static data to be loaded. But let's give it your 15% gain and have a look at a hypothetical game.

50GB game. Save 2 GBs compressing 4 GBs of h.264 to h.265. Save 2 GBs by zipping up the world data. Audio is already compressed. Textures are already compressed. Meshes are already compressed. You're still above 40 GBs*. The only way to get smaller is to have less - there's no special compression scheme to double efficiency. Changing to octree data structures and megatextures and adding all that extra effort doesn't save a great deal. If Switch is to get something like TR:RotTR, it'll need something like 20 GBs same as XB1, at least 15 GBs, unless aspects are reduced/removed to get it smaller (at additional cost to the developer).

We actually had this discussion years ago about download titles, I think started when MS forced a 40MB download limit on 360 games. The only area where you can really save lots of storage is dynamic asset generation, such as creating textures on the fly in shaders. Everything else requires storage, which is why we've progressed from MB HDDs to GBs to TBs, why 16 GB PCs aren't utterly extravagant when working with large, data-rich files, and why MS quickly revised their file limit policy!

* These are obviously make-believe numbers based on broad gut-feeling approximations (a rough tally is sketched below). A breakdown of a PC game's data files would be a useful reference. Feel free to come up with your own speculations based on your own best-case scenarios to see if you can get a radically smaller file. If there are games using uncompressed audio and large FMVs, obviously savings can be had!
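For anyone who wants to play with the numbers, here's the same back-of-envelope tally as a Python sketch; every figure is a gut-feel assumption carried over from the post above, not a measurement:

```python
# Hypothetical 50 GB game, split into guessed asset categories (GB).
assets = {"video (h.264)": 4, "world data": 12, "audio": 8,
          "textures": 18, "meshes": 8}

# Guessed fractional savings: re-encode video as h.265, repack world data.
# Audio, textures and meshes are assumed already compressed (0% saved).
savings = {"video (h.264)": 0.50, "world data": 0.15}

before = sum(assets.values())
after = sum(gb * (1 - savings.get(name, 0.0)) for name, gb in assets.items())
print(f"before: {before} GB, after: {after:.1f} GB")  # ~46 GB: nowhere near 9 GB
```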
 
I'm not massively disagreeing with you here, but AFAIK sales of the older versions dive pretty badly. E.g. FIFA 17 opening week: 53% on PS4, 40% on XB1. So only 7% of sales across last gen and boxed PC. Sales of FIFA 16 heavily favoured next gen too. Given that non-elite versions of their games don't sell so well, what's the incentive to invest in porting a non-elite version? And given Nintendo's track record for its past two home consoles, how many devs are really going to get behind it with their best efforts, rather than just looking for a few easy wins to see what happens?

So there's still plenty of reason to doubt even cheapo sports games IMO. The portable co-op experience won't translate to complex games like FIFA, and would be better suited to something like Sensible Soccer.

Wouldn't that argument be true of all games? So you'd see better software sales on handhelds? How are Vita's software sales doing? So no, playing FIFA lying in bed isn't as good as playing it on the big screen, except maybe for couch co-op. Any more than watching a blockbuster on your iPhone is better than watching it at the cinema. ;)

Stuck just means they choose not to spend $300 they need (want) to spend on other things. If the consoles were $100, everyone would upgrade. So 'stuck' is a reasonable short-hand IMO. Just like we're all 'stuck' on standard TVs and not buying HDR TVs.

I'm not disagreeing with the idea that these games will ultimately sell better on the PS4/Xbone, but I stand by the idea that there could be a market large enough on Switch that publishers will ultimately support the platform with ports. Like you said, sales of the 360/PS3 versions of these games declined pretty significantly after the release of the PS4/Xbone, but that didn't stop publishers from releasing them for the past 3 years. My point with the older versions on the 360/PS3 is that the Switch's hardware performance will not stop publishers from bringing sports games to Switch. I can pretty much guarantee EA and 2K Sports will be testing the waters next year, and continued support will depend on how well the Switch is selling and how well their games sell on Switch next year.

Being able to play on the go is a big deal to some people, especially kids. They aren't going to be lugging their PS4 over to grandma's for Thanksgiving, so the portability of the Switch is a big deal for many. Yes, Vita is a bust, but it didn't have the kind of first-party support driving consumer interest like the far more successful 3DS.

Getting back on topic, it really comes down to the fact that Switch does come with limitations compared to the PS4/Xbone because it's a mobile platform. There are certainly AAA games where the argument can be made that the limitations make ports unlikely, but let's not use a broad paintbrush and apply this to all third-party games, because it's simply not true. There are plenty of games, like Skyrim for example, for which a quality port to Switch isn't a problem for developers. Targeting 720p, Skyrim on Switch can likely include most of the improvements seen in the Special Edition version for PS4/Xbone.
 
I think you can look at "repacks" of pirated PC games.

40-50 GB games usually go down to 5-15 GB.

I attached a few screenshots of the pirated repack releases.
 

Attachments

[Five screenshots of pirated repack release listings]
But the packed data still needs to be unpacked somewhere to be usable, preferably on an HDD. Or you'll need a warning on the game: "This game can't be downloaded on a standard Switch, please install a bigger SD card before buying this game. Please understand".

That is, once the powerful Switch CPU has finished unpacking those... :rolleyes:
 
But the packed data still needs to be unpacked somewhere to be usable, preferably on an HDD. Or you'll need a warning on the game: "This game can't be downloaded on a standard Switch, please install a bigger SD card before buying this game. Please understand".

That is, once the powerful Switch CPU has finished unpacking those... :rolleyes:

I'm not familiar with how this "repacking" works. Is it something that can be unpacked on the fly so that it can be put directly into RAM?
 
Repacking is usually compressing (with very high and slow compression) the already-compressed game data.

So you need to "install" it first before the game can run.

But ages ago, repacking could also mean: extract the game data, lossy-compress it, and then lossless-compress it again with even higher compression, using whatever the game engine can still read.

For example, Call of Duty games were using ZIP archives renamed to IWD or something. By default, they were compressed with very low compression.

When you extract the data and then compress it again with ZIP on a very high setting, then rename it back, the game still works just fine but the total game size is smaller.


For Switch, Nintendo can mandate the use of high compression by default, or force the developers to select the best compression themselves by limiting the game cart size.

Of course this will affect loading duration.
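For what it's worth, here's a minimal Python sketch of that renamed-ZIP repack, assuming the archive really is a plain ZIP underneath; the file names are hypothetical, and compresslevel requires Python 3.7+:

```python
import zipfile

def repack(src, dst):
    # Rewrite every entry with maximum deflate compression. Names and
    # contents stay identical, so an engine reading the archive by name
    # shouldn't notice; only the stored size changes.
    with zipfile.ZipFile(src) as zin, zipfile.ZipFile(
            dst, "w", compression=zipfile.ZIP_DEFLATED, compresslevel=9) as zout:
        for info in zin.infolist():
            zout.writestr(info.filename, zin.read(info.filename))

# Hypothetical usage: repack("main.iwd", "main_repacked.iwd")
```

This only claws back whatever headroom the original packer left; it does nothing against data that is already well compressed.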
 
What about non-level-based games? That's most of them nowadays.
Stop with a loading screen every 10 seconds? Like a YouTube video buffering every 10 seconds.

Yea, I would assume that open-world games would not play nice with this. For some games like Call of Duty, on the other hand, it should work OK. Obviously this depends on just how quickly the hardware can unpack these files. If it takes two minutes to load a multiplayer map in COD, that is pretty much a game breaker. I suppose something to consider with cartridge capacity is that the rumored 16GB limit could easily increase post-launch. I'm sure with every passing year the cost per GB of storage comes down significantly. So let's assume each 16 GB cartridge costs Nintendo $5 to manufacture; a year later that same $5 gets you 32GB, and the following year it gets you 64GB. So I do expect the file size limit to increase pretty significantly over time.
 
There's a rumor around saying it's limited to 128GB of storage expansion, which doesn't make any sense technically (it would have to be SDXC to support 128GB, so it would naturally support up to the 2TB addressing of SDXC).

It could be some very artificial limitation. Maybe the number of free blocks had too many digits and didn't fit the menu layout. Maybe a nintendo block will be some odd capacity like 13.1MB, leading to a limit of 9,999 nintendo blocks, and ultimately a 128GB storage limit.

I'm only half-joking. :runaway:
 
They just did not test huge cards.
There could be a lot of bugs in the filesystem code that lead to undesirable behaviour.
For example, hardcoding the number of inodes on the media: when you run out of those, you cannot create new files even if there is space left.
 
But ages ago, repacking could also mean: extract the game data, lossy-compress it, and then lossless-compress it again with even higher compression, using whatever the game engine can still read.
Yeah, lossy compression is an option. You can also reduce all texture resolutions by four, for example. At 720p that might be an option. It's not the same as new compression methods though. ;)

For example, Call of Duty games were using ZIP archives renamed to IWD or something. By default, they were compressed with very low compression.

When you extract the data and then compress it again with ZIP on a very high setting, then rename it back, the game still works just fine but the total game size is smaller.
If that's happening, there's likely a reason for it. Using lower compression with no impact on the game would mean slower loading and potentially worse seek times for DVD games, so why do it? Possibly on consoles the ZIP decompression overhead was enough that they needed to use a lower compression ratio?

For Switch, Nintendo can mandate the use of high compression by default, or force the developers to select the best compression themselves by limiting the game cart size.
If it's not a special hardware feature, it's not something Switch can leverage to get larger games into smaller space over the other consoles. Again, there's nothing to be gained from larger files. It means longer downloads, more BW and server costs to the console companies/distributors, slower loading (unless decompression is notably slower), and worse user experience. It would be irrational for devs to opt for a lower compression ratio than possible. Maybe a couple of games have been released where devs forgot to change a setting, but I find it hard to believe that they wholesale are wasting resources on extraneous data. Large data use beyond what's necessary is more from multiple language packs and ridiculous uncompressed audio, which is something Nintendo can manage for sure. But that's not a hardware topic. ;) For hardware, I see zero opportunity for supplementing limited storage with fancy compression methods.
 