Next-gen console versus PC comparison *spawn

a patent predicated on a drive, controller and filesystem (software stack) wouldn't necessarily preclude it applying to the PC.
But I doubt Sony cares about anybody using their implementation on PC
If Sony were granted a patent on an SSD controller or disk block compression and tried to enforce it on the PC market, it would be challenged immediately. These are clearly prior inventions, even if the relevant patent terms have lapsed.


There is one original part in the Sony patent, and it's about the implementation of the software stack (but not the filesystem, which is only mentioned in passing) in relation to hardware-based decompression - specifically, the SSD controller is not claimed to support any compression.

This means it's not actually block (sector/cluster) compression, but rather an OS layer (or SDK toolset) for compressing the entire game data file during installation (or during offline packaging), so that CPU-intensive algorithms with larger dictionaries can be used to improve the compression ratio - then the ARM processors in the NVMe controller would decompress the entire file stream, and not just one block, when reading back data. If true, this could be a novel approach.
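To make the idea concrete - and this is only a sketch of the general technique, not Sony's actual toolchain - an offline packaging step could run a slow, large-window compressor over whole asset files, leaving only cheap decompression for the drive side. Here zstd is used purely as a stand-in for such a CPU-intensive, big-dictionary algorithm:

```c
#include <stdlib.h>
#include <zstd.h>

/* Sketch only: compress one whole asset blob offline with a high level and a
 * large match window - slow to compress, cheap to decompress. zstd stands in
 * for whatever big-dictionary codec a packaging step would actually use. */
static void *pack_blob(const void *src, size_t src_size, size_t *out_size)
{
    ZSTD_CCtx *cctx = ZSTD_createCCtx();
    ZSTD_CCtx_setParameter(cctx, ZSTD_c_compressionLevel, 19);
    ZSTD_CCtx_setParameter(cctx, ZSTD_c_windowLog, 27);  /* 128 MB window */

    size_t bound = ZSTD_compressBound(src_size);
    void *dst = malloc(bound);
    size_t written = ZSTD_compress2(cctx, dst, bound, src, src_size);

    ZSTD_freeCCtx(cctx);
    if (ZSTD_isError(written)) { free(dst); return NULL; }
    *out_size = written;
    return dst;
}
```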

That said, Stac Electronics did offer both software drivers and hardware cards for their Stacker disk compression - these guys won a patent infringement lawsuit against Microsoft in 1994 over Microsoft's DoubleSpace disk compression. Their hardware implementation was compatible with disks compressed in software (up to a certain version of the algorithm) and vice versa.

So again, it feels like a collection of prior art, though I have no desire to examine each and every claim in relevant patents.

most SSDs (a single drive/controller package) employ a native data structure that bears little relation to the OS filesystem, managed by the controller to encrypt, map bad blocks, and handle garbage collection, wear levelling and crypto-shredding, but custom file systems are supported on almost all desktop OSs
This is related. NVMe 1.x still uses 48-bit LBAs (sector numbers) to address the data, but sector sizes can be arbitrarily large, and there is a list of multiple supported sector sizes, ranked by performance - so if the OS could match the real physical write block size (8-16-32 KB) and/or erase page size (512 KB - 1 MB - 2 MB) when formatting the file system and issuing disk IO commands, the need for internal garbage collection and wear levelling could be reduced, and read/write speeds could be increased considerably.
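For reference, this is roughly what that per-namespace list looks like: each LBA Format descriptor in the NVMe 1.x Identify Namespace data carries the sector size (as a power of two), the per-sector metadata size and a 2-bit relative performance rank. A minimal decoding sketch (field names are mine, layout per the public spec):

```c
#include <stdint.h>
#include <stdio.h>

/* Sketch of one LBA Format descriptor from the NVMe 1.x Identify Namespace
 * data (layout per the public spec; names are mine): per-sector metadata
 * size, log2 of the sector size, and a 2-bit relative performance rank. */
struct nvme_lbaf {
    uint16_t ms;       /* metadata bytes per LBA                 */
    uint8_t  lbads;    /* LBA data size, as a power of two       */
    uint8_t  rp;       /* 0 = best performance ... 3 = degraded  */
};

static void print_lbaf(const struct nvme_lbaf *f, unsigned index)
{
    printf("LBA format %u: %u-byte sectors, %u metadata bytes, perf rank %u\n",
           index, 1u << f->lbads, f->ms, f->rp & 0x3u);
}
```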

Textures use GPU format compression, and geometry doesn't need it and has its own forms of compression, like displacement mapping, usable on the GPU. I think they want to compress some of the files to keep the game size on the SSD as low as possible.
Textures and geometry are your entire game data - if you can't efficiently compress them, there won't be any size reduction.
 
If Sony were granted a patent on an SSD controller or disk block compression and tried to enforce it on the PC market, it would be challenged immediately. These are clearly prior inventions, even if the relevant patent terms have lapsed.


There is one original part in the Sony patent, and it's about the implementation of the software stack (but not the filesystem, which is only mentioned in passing) in relation to hardware-based decompression - specifically, the SSD controller is not claimed to support any compression.

This means it's not actually block (sector/cluster) compression, but rather an OS layer (or SDK toolset) for compressing the entire game data file during installation (or during offline packaging), so that CPU-intensive algorithms with larger dictionaries can be used to improve the compression ratio - then the ARM processors in the NVMe controller would decompress the entire file stream, and not just one block, when reading back data. If true, this could be a novel approach.

That said, Stac Electronics did offer both software drivers and hardware cards for their Stacker disk compression - these guys won a patent infringement lawsuit against Microsoft in 1994 over Microsoft's DoubleSpace disk compression. Their hardware implementation was compatible with disks compressed in software (up to a certain version of the algorithm) and vice versa.

So again, it feels like a collection of prior art, though I have no desire to examine each and every claim in relevant patents.


This is related. NVMe 1.x still uses 48-bit LBAs (sector numbers) to address the data, but sector sizes can be arbitrarily large, and there is a list of multiple supported sector sizes, ranked by performance - so if the OS could match the real physical write block size (8-16-32 KB) and/or erase page size (512 KB - 1 MB - 2 MB) when formatting the file system and issuing disk IO commands, the need for internal garbage collection and wear levelling could be reduced, and read/write speeds could be increased considerably.

Textures and geometry are your entire game data - if you can't efficiently compress them, there won't be any size reduction.

There is other data besides geometry that takes up space: animation and sound, for example. Maybe they want to compress the parts they can compress. But IMO this is more for saving space on the SSD.

https://www.dropbox.com/s/cngcqlvb8...marvels_spider-man_preliminaryexport.pdf?dl=0

Animation and sound are not as big as geometry or textures, but they are not a negligible size either. That said, this is probably their solution for compression: they will compress all the data and decompress it when reading it back and transferring it to memory.
 
If Sony were granted a patent on an SSD controller or disk block compression and tried to enforce it on the PC market, it would be challenged immediately. These are clearly prior inventions, even if the relevant patent terms have lapsed.

There are many patents relating to drive controllers already; patents are predicated on a new method, not the broad implementation. If Sony are doing something new or unique - and I am not claiming they are - then they can certainly patent that. That is the point of patents.
 
Animation and sound are not as big as geometry or textures, but they are not a negligible size either
Of course it's going to depend on the game, but animation should be bigger than geometry, especially nowadays where characters have far more sorts of animations than the old days of idle/walk/run/hurt/die etc.
 
Not from what I've seen. It's £445 to file if proceeding all the way to grant, and then increasingly expensive to maintain, up to £610 in the 20th year, totalling £4640 to maintain it. Every patent not granted is a long-term loss of four grand. The PO makes no money from rejecting patents, hence my doubt (in the face of clear evidence to the contrary, with patents granted for tech that clearly isn't patentable) that they do any work to ensure patents are actually valid, and instead just grant all of them.
I imagine a big chunk ends up with the patent lawyers so that the filing is prim and proper, which may take an unspecified amount of back & forth.
 
That's full hardware virtualisation - i.e. each container has to include a hosted OS and drivers/runtime, similar to the statically compiled game executable on the Xbox One, where the hypervisor hosts two separate virtual machine partitions for the Windows 10 OS and the game code.

Given the variety of hardware that has to be supported, that hypervisor becomes the full Windows OS, probably with Hyper-V isolation that runs a lightweight Windows OS for each VM, and the container-hosted OS becomes the full Windows OS (like Nano Server in Windows Docker containers), which kind of defeats the whole point.

Does this hint at where they are heading?


Tomm McClain
 
Yes, that's using the built-in compact.exe command and the new NTFS 'CompactOS' file compression that's been available since Windows 10. It uses a modified LZ77 algorithm with Huffman coding and a much larger dictionary. It's also available to applications through the Windows Compression API (which additionally includes ZIP file archives).

The command to compress all files in the current directory would be compact /c /EXE[:algorithm], where algorithm is one of XPRESS4K | XPRESS8K (default) | XPRESS16K | LZX.

Compression only persists during read-only access - any write will automatically decompress the file.
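A minimal sketch of the application-side route mentioned above, using the documented Windows Compression API with its XPRESS+Huffman codec (same LZ77/Huffman family; error handling trimmed for brevity):

```c
#include <windows.h>
#include <compressapi.h>
#include <stdlib.h>
#pragma comment(lib, "cabinet.lib")

/* Sketch: compress a memory buffer with the Windows Compression API using the
 * XPRESS+Huffman codec. Error handling is trimmed to keep it short. */
static void *compress_buffer(const void *src, SIZE_T src_size, SIZE_T *out_size)
{
    COMPRESSOR_HANDLE comp;
    if (!CreateCompressor(COMPRESS_ALGORITHM_XPRESS_HUFF, NULL, &comp))
        return NULL;

    SIZE_T needed = 0;
    Compress(comp, src, src_size, NULL, 0, &needed);   /* query required size */

    void *dst = malloc(needed);
    BOOL ok = Compress(comp, src, src_size, dst, needed, out_size);

    CloseCompressor(comp);
    if (!ok) { free(dst); return NULL; }
    return dst;
}
```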


I've tested this on a temp NTFS partition formatted with 2MB and 4KB clusters, and various game data files (from the World of Tanks Encore RT demo) were compressed by 15-30%. But what's really good is that, while the original NTFS compression only works on 4KB clusters and brutally chops the compressed file into pieces, resulting in heavy fragmentation of the compressed file even on an empty disk, the new algorithm works with arbitrary cluster sizes and writes the compressed data to a contiguous block of new clusters - so there is no fragmentation!

And even if the volume is fragmented, applications can still use the file defragmentation API to analyze file allocations and try to reallocate the clusters into a contiguous block; this API has been available since Windows Vista.
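That analysis step boils down to reading a file's extent list. A sketch using the FSCTL_GET_RETRIEVAL_POINTERS control code (single-buffer case only; a real tool would loop on ERROR_MORE_DATA with a larger buffer):

```c
#include <windows.h>
#include <winioctl.h>
#include <stdio.h>

/* Sketch: print a file's extent map (virtual cluster -> logical cluster runs),
 * which is enough to see whether it was allocated contiguously. */
static void print_extents(const wchar_t *path)
{
    HANDLE h = CreateFileW(path, FILE_READ_ATTRIBUTES,
                           FILE_SHARE_READ | FILE_SHARE_WRITE, NULL,
                           OPEN_EXISTING, 0, NULL);
    if (h == INVALID_HANDLE_VALUE)
        return;

    union {
        RETRIEVAL_POINTERS_BUFFER rp;
        BYTE raw[4096];
    } buf;
    DWORD bytes = 0;
    STARTING_VCN_INPUT_BUFFER in;
    in.StartingVcn.QuadPart = 0;

    if (DeviceIoControl(h, FSCTL_GET_RETRIEVAL_POINTERS, &in, sizeof(in),
                        &buf, sizeof(buf), &bytes, NULL))
    {
        LARGE_INTEGER vcn = buf.rp.StartingVcn;
        for (DWORD i = 0; i < buf.rp.ExtentCount; i++) {
            wprintf(L"extent %lu: VCN %lld at LCN %lld\n",
                    (unsigned long)i, vcn.QuadPart, buf.rp.Extents[i].Lcn.QuadPart);
            vcn = buf.rp.Extents[i].NextVcn;
        }
    }
    CloseHandle(h);
}
```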

Unfortunately there's no difference in read throughput with 2MB clusters. Since the ATTO benchmark saturates at about 4.5 GB/s read and 3.5 GB/s write throughput in disk cache mode on my system, I suppose it's limited by the file IO subsystem in the OS.


So I guess only the first part of the "file access / allocation / defragmentation" equation is missing right now - which is OS support for large sectors (8-16-32-64 KB) as implemented in NVMe 1.x, and large buffers in block IO to take advantage of the giant clusters (512 KB - 1 MB - 2 MB) implemented in the latest versions of exFAT and NTFS.
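The kind of large-block IO this would enable can already be approximated from user mode with unbuffered, aligned reads - a sketch, assuming 2 MB chunks and ignoring tail handling:

```c
#include <windows.h>
#include <stdio.h>

#define CHUNK (2u * 1024u * 1024u)   /* 2 MB per request */

/* Sketch: stream a file with unbuffered, sector-aligned 2 MB reads - the kind
 * of large block IO that contiguous allocation and big clusters make cheap.
 * Tail handling and error checks are omitted for brevity. */
static void stream_file(const wchar_t *path)
{
    HANDLE h = CreateFileW(path, GENERIC_READ, FILE_SHARE_READ, NULL,
                           OPEN_EXISTING,
                           FILE_FLAG_NO_BUFFERING | FILE_FLAG_SEQUENTIAL_SCAN,
                           NULL);
    if (h == INVALID_HANDLE_VALUE)
        return;

    /* VirtualAlloc returns page-aligned memory, which satisfies the sector
     * alignment rule for unbuffered IO. */
    void *buf = VirtualAlloc(NULL, CHUNK, MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE);
    DWORD got = 0;
    unsigned long long total = 0;

    while (ReadFile(h, buf, CHUNK, &got, NULL) && got > 0)
        total += got;

    wprintf(L"read %llu bytes in %u-byte chunks\n", total, (unsigned)CHUNK);
    VirtualFree(buf, 0, MEM_RELEASE);
    CloseHandle(h);
}
```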


lots of it might come from duplicated files for optimized streaming.
I.e. duplicates for "hard disk streaming" - but since disk access time is almost instant on SSDs, duplicated resources are not really needed anymore.

this is more for saving space on the SSD
I'd suppose developers can either use their own highly-tuned data compression algorithm or just rely on the simpler one provided in the OS.

But the most important part is the simple file system with contiguous block allocation and large data buffers in the file IO stack.

Animation and sound are not as big as geometry or textures, but they are not a negligible size either
but animation should be bigger than geometry, especially nowadays
Animations are typically implemented with skeletal or skeletal-muscle models, where the model geometry is manipulated through 'bones' and 'muscles' - for human characters, that should be much smaller than the geometry and textures, even at low-res LODs.
 
patents are predicated on a new method not the broad implementation. If Sony are doing something new or unique - and I have not and nor am I claiming this -then they can certainly patent that
I've pointed out above which exact claim of the Sony patent may be unique - that's software file compression in the OS and hardware decompression in the NVMe controller.

If Sony just got a patent for a regular NVMe controller with LBA-to-memory translation and hardware data compression, they would have a hard time enforcing this patent on the PC, since all of this has already been implemented, decades ago.

Does this hint at where they are heading?
Nope - that's about Windows 10 'Secured Core', secure key storage technology to suppress physical attacks on firmware, which was first implemented in the Xbox One and PS4.

https://www.platformsecuritysummit.com/2019/speaker/chen/
 
I've pointed out above which exact claim of the Sony patent may be unique - that's software file compression in the OS and hardware decompression in the NVMe controller.

If Sony just got a patent for a regular NVMe controller with LBA-to-memory translation and hardware data compression, they would have a hard time enforcing this patent on the PC, since all of this has already been implemented, decades ago.

A patent is the sum of its parts and is valid if it is an overall new method for achieving something. Apple has a ton of patents for multi-touch gestures decades after touch screens were a thing, some with patented, but different, implementations of achieving the same thing. You can't take a broad-stroke approach with patents, otherwise every fundamental idea would result in one patent and everybody would be locked into cross-licensing for almost every broad implementation.

You can throw a single change, large or minor, into an existing implementation and it can be different enough to warrant a new patent, because it is a new method. Look at CPUs, HDDs, engines. Lots of unique implementations delivering broadly the same goal. :yep2:

 
Animations are typically implemented with skeletal or skeletal-muscle models, where the model geometry is manipulated through 'bones' and 'muscles' - for human characters, that should be much smaller than the geometry and textures, even at low-res LODs
I've been aware of this since Half-Life 1 :mrgreen: though Quake 3 for some reason still stored the whole animation data; Doom 3 changed over to skeletons though
the thing is, it's just math
how many bones in a typical model? a few hundred?
how many frames in a typical animation? 30-200
how many animations for a single model? 50-200

let's say (conservative estimate, games today are probably a lot higher)
100 x 100 x 100 = a million just for a single character!

I can see in my game it's not the geometry that is taking up the storage space, it's the animations
 
I've pointed out above which exact claim of the Sony patent may be unique - that's software file compression in the OS and hardware decompression in the NVMe controller.

If Sony just got a patent for a regular NVMe controller with LBA-to-memory translation and hardware data compression, they would have a hard time enforcing this patent on the PC, since all of this has already been implemented, decades ago.

Nope - that's about Windows 10 'Secured Core', secure key storage technology to suppress physical attacks on firmware, which was first implemented in the Xbox One and PS4.

https://www.platformsecuritysummit.com/2019/speaker/chen/
This video explains more about the customization of the Xbox SoC than anything else I've ever read or seen on the Xbox. All the decisions, the always-online discussion at the beginning - it all came down to trying to secure the console. Nuts. They seem to have solved it, so I guess they don't need to solve this for next gen. Though it may explain why the Xbox installs so slowly compared to the PS4.
 
This video explains more about the customization of the Xbox SoC than anything else I've ever read or seen on the Xbox. All the decisions, the always-online discussion at the beginning - it all came down to trying to secure the console. Nuts. They seem to have solved it, so I guess they don't need to solve this for next gen. Though it may explain why the Xbox installs so slowly compared to the PS4.

Being unpopular doesn't hurt with piracy either. Also, every console being a devkit throws out the "homebrew" arguments.
 
A patent is the sum of its parts and is valid if it is an overall new method for achieving something. You can't take a broad-stroke approach with patents, otherwise every fundamental idea would result in one patent and everybody would be locked into cross-licensing for almost every broad implementation.
I'm not really interested in examining Sony's application to determine whether their method of converting LBA numbers to a memory address in a flash memory partition is new in relation to existing patents. I seriously doubt it, though.
And even if the patent is granted, the claimed 'software stack' wouldn't really apply to Windows PCs and concurrent multi-process file access.

let's say (conservative estimate, games today are probably a lot higher)
100 x 100 x 100 = a million just for a single character!
I don't really think there are hundreds of joints, or that each keyframe contains all the joints. There are maybe dozens of joints in the entire model, and only a few of them are used in each keyframe - so body part animations could be combined to form complex moves, with 'bone' movements restricted by a physical model. And facial animations can use morph target meshes each made with several thousand triangles (vertices).

But even if your 1M figure is valid, each joint in an animation frame includes an ID number, XYZ coordinates for offsets, and a frame time - when using 32-bit numbers, that's 20 bytes per joint, or 20 MB in total for each model. Whereas a single 4Kx4K BC6H compressed texture is about 16 MB, and a single 2M-triangle mesh is about 8 MB.
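The arithmetic behind those figures, written out (all counts are the rough assumptions from this exchange, not measurements from a real engine):

```c
#include <stdint.h>
#include <stdio.h>

/* Back-of-the-envelope sketch of the sizes discussed above; all counts are the
 * rough assumptions from this thread, not measurements from a real engine. */
struct joint_key {          /* one joint in one keyframe */
    uint32_t joint_id;
    float    offset[3];     /* XYZ */
    float    time;
};                          /* 4 + 12 + 4 = 20 bytes */

int main(void)
{
    const unsigned long long keys = 100ull * 100 * 100;  /* bones x frames x clips */

    printf("animation keys:   %llu MB\n", keys * sizeof(struct joint_key) / 1000000);
    printf("4Kx4K BC6H:       %llu MB\n", 4096ull * 4096 * 1 / 1000000); /* 1 byte/texel */
    printf("2M-triangle mesh: ~8 MB (author's estimate)\n");
    return 0;
}
```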

it may explain why the Xbox installs so slowly compared to the PS4.
The game executable and data are always encrypted on the media, and they are not decrypted when installed on the hard disk, or even when loaded into RAM.
 
The game executable and data are always encrypted on the media, and they are not decrypted when installed on the hard disk, or even when loaded into RAM.
lol shoot, I even watched that part of the presentation.
 
I'm not really interested in examining Sony's application to determine whether their method of converting LBA numbers to a memory address in a flash memory partition is new in relation to existing patents. I seriously doubt it, though. And even if the patent is granted, the claimed 'software stack' wouldn't really apply to Windows PCs and concurrent multi-process file access.

You keep asserting this and it's not true. Anybody can add a completely new file system to Windows, and you're not bound by the existing kernel I/O frameworks - you can deploy your own. The patent would a) prevent any custom approach, but b) would also be a patent barrier to Microsoft deploying the same technology directly in Windows. Apple's APFS does much to aid solid-state-to-RAM transfers; modern Macs have a custom SSD, a custom controller and a dedicated solid-state I/O framework, and it would be astonishing if Microsoft were not working on similar improvements to Windows.

Patents are intended as a protection for the future.
 
I don't really think there are hundreds of joints, or that each keyframe contains all the joints. There are maybe dozens of joints in the entire model, and only a few of them are used in each keyframe - so body part animations could be combined to form complex moves, with 'bone' movements restricted by a physical model. And facial animations can use morph target meshes each made with several thousand triangles (vertices).
even my indie game has about 100 bones per model
here's something about Uncharted 4
there are almost 1,200 bones (or moving parts) in his face. In Uncharted 3, his face consisted of around 250 bones.
Uncharted 4 had 1,200 story sequences, 34,000 animations, and 14.5 hours of animation time
A last-gen game had 250 bones just in his face - I assume they would only use this in cutscenes and simplify it in gameplay, but from what I read Uncharted 3 had 250 bones in Drake's model during gameplay. This is a last-gen game, sure near the pinnacle of last gen, and Uncharted 4 has like 60 different animations just to pick up stuff!
Sure, every game ain't a Naughty Dog game, but mate, I think you are underestimating how much space this stuff takes up. I can see it in my builds.

eg https://docs.unity3d.com/Manual/ReducingFilesize.html
[Image: FileSizeOptimization.png]

I have no idea what this program is (but it's typical of what I see) - you can see the animations are far larger than the meshes
 
even my indie game has about 100 bones per model
here's something about Uncharted 4


A last-gen game had 250 bones just in his face - I assume they would only use this in cutscenes and simplify it in gameplay, but from what I read Uncharted 3 had 250 bones in Drake's model during gameplay. This is a last-gen game, sure near the pinnacle of last gen, and Uncharted 4 has like 60 different animations just to pick up stuff!
Sure, every game ain't a Naughty Dog game, but mate, I think you are underestimating how much space this stuff takes up. I can see it in my builds.

eg https://docs.unity3d.com/Manual/ReducingFilesize.html
[Image: FileSizeOptimization.png]

I have no idea what this program is (but it's typical of what I see) - you can see the animations are far larger than the meshes

In the Spider-Man presentation I linked, the cutscene animation is multiple GB. This is not a negligible part of the game size at all, and the same goes for the audio.
 
You keep asserting this and it's not true. Anybody can add a completely new file system
It is completely unrealistic to expect that console developers will use their limited resources to design a custom file system, implement robust Windows drivers for this filesystem with a suite of disk tools for formatting and error checking/recovery, and then modify their game installers to repartition the user's disk and dedicate a separate partition to this custom file system - all of this just to improve game loading speeds in their Windows ports.

The patent would a) prevent any custom approach but b) would also be patent barrier to Microsoft deploying the same technology directly in Windows.
Even if this patent is granted, it wouldn't invalidate the design of the Installable File System interface, which has been available since at least OS/2 1.2 and DOS 4.0.
You cannot claim something as your invention when it's been implemented in a product that shipped 30 years ago - that's prior art, and it's not patentable (or enforceable in court).

Moreover, Sony's patent application does not really describe that custom read-only filesystem in any detail - so this part is not enforceable either, because patents are granted for specific designs, not just general ideas.

even my indie game has about 100 bones per model ... you can see animations are far larger than the meshes
Whatever suits you best, but IMHO facial animations in Unity are typically implemented with morph targets (aka blend shapes) based on motion capture, not thousands of facial bones. Also, full-body motion capture is extremely realistic for skeletal models with only a few bones.
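For completeness, morph-target evaluation itself is just a weighted sum of per-target vertex deltas over a base mesh - a minimal sketch, not tied to Unity's actual implementation:

```c
#include <stddef.h>

/* Minimal morph-target (blend shape) sketch: the output vertex is the base
 * mesh position plus a weighted sum of per-target deltas. Not tied to any
 * particular engine's implementation. */
typedef struct { float x, y, z; } vec3;

static void blend_shapes(const vec3 *base,
                         const vec3 *const *target_deltas,  /* [targets][verts] */
                         const float *weights, size_t num_targets,
                         vec3 *out, size_t num_verts)
{
    for (size_t v = 0; v < num_verts; v++) {
        vec3 p = base[v];
        for (size_t t = 0; t < num_targets; t++) {
            p.x += weights[t] * target_deltas[t][v].x;
            p.y += weights[t] * target_deltas[t][v].y;
            p.z += weights[t] * target_deltas[t][v].z;
        }
        out[v] = p;
    }
}
```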

In the Spider-Man presentation I linked, the cutscene animation is multiple GB
That's a small percentage of the total size of all assets, which is around 350 GB.
 
It is completely unrealistic to expect that console developers will use their limited resources to design a custom file system, implement robust Windows drivers for this filesystem with a suite of disk tools for formatting and error checking/recovery, and then modify their game installers to repartition the user's disk and dedicate a separate partition to this custom file system - all of this just to improve game loading speeds in their Windows ports.

I assume you mean PC game developers here? No, I wouldn't expect them to do this either; very few games have any specific hardware optimisations outside of compiler/CPU and some graphics effects. I'm not sure what this has to do with patent protection.
 