Next Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

Discussion in 'Console Technology' started by Proelite, Mar 16, 2020.

  1. dobwal

    Legend Veteran

    Joined:
    Oct 26, 2005
    Messages:
    5,432
    Likes Received:
    1,492
    I think decompression/compression on SSDs may be encumbered by patents. Seagate has Durawrite technology, but it came from the acquisition of a company that had bought out Sandforce, the original owner of the tech, which used to provide SSD controllers. Outside of Seagate there don't seem to be any new SSDs with compression tech. All the older SSDs with compression seem to be Sandforce-controller based.
     
    PSman1700 likes this.
  2. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,829
    Likes Received:
    1,142
    Location:
    Guess...
    How about having a hardware decompression block on the GPU that only handles the data that doesn't need to be processed by the CPU, e.g. textures? According to Microsoft via Digital Foundry, that makes up the large majority of game streaming data, so it would have a big positive impact on both IO bandwidth and disk footprint. It might also offload the CPU enough to make it feasible to decompress the rest of the disk data in software.

    In terms of standards, it could work something like DirectX. DirectStorage is the API that potentially any system can take advantage of, but like DirectX, if you want full hardware compliance you need the decompression block on the GPU that supports whatever standard Microsoft decides.
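    As a purely illustrative sketch of that split (none of these names are the real DirectStorage API, just made-up stand-ins): texture payloads go to a GPU decompression block when one is present, and everything else falls back to software on the CPU.

```python
# Illustrative only: hypothetical names, not the actual DirectStorage API.
# Texture payloads use a (hypothetical) GPU decompression block when present;
# everything else, or any GPU without the block, falls back to the CPU.
import zlib
from dataclasses import dataclass
from typing import Optional

@dataclass
class Request:
    payload: bytes      # losslessly compressed data, as it sits on the SSD
    is_texture: bool    # only textures target the GPU block in this sketch

class GpuDecompressionBlock:
    """Stand-in for a fixed-function decompressor on the GPU."""
    def decompress(self, data: bytes) -> bytes:
        return zlib.decompress(data)    # real hardware would do this in silicon

def load(request: Request, gpu_block: Optional[GpuDecompressionBlock]) -> bytes:
    if request.is_texture and gpu_block is not None:
        # "Full hardware compliance" path: texture data never touches the CPU.
        return gpu_block.decompress(request.payload)
    # Software fallback: fine for the minority of data the CPU needs anyway.
    return zlib.decompress(request.payload)

# Same API call, different path depending on what hardware is present.
tex = Request(zlib.compress(b"BC-compressed texel blocks" * 100), is_texture=True)
print(len(load(tex, GpuDecompressionBlock())))   # hardware-style path
print(len(load(tex, None)))                      # software fallback
```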
     
    rokkerkory and PSman1700 like this.
  3. Frenetic Pony

    Regular Newcomer

    Joined:
    Nov 12, 2011
    Messages:
    496
    Likes Received:
    216
    There are too many texture compression formats for an ASIC to reasonably handle: x-zips, Crunch, blah blah blah. Hell, if you really wanted to use virtualized texturing, the hardware decompression isn't even that useful, as you'd want to go through Crunch as well, a much better image-only compressor for getting install size down.

    I can't find any mention of an on-SSD decompression block being patented despite searching. In part I suspect that's because it's not needed for most applications; honestly, the way the new consoles use it is a bit weird and fairly specific. But I could easily have missed one.
     
  4. dobwal

    Legend Veteran

    Joined:
    Oct 26, 2005
    Messages:
    5,432
    Likes Received:
    1,492
    The texture compression format doesn't seem to be the relevant issue. BCn is supported by both PC GPU vendors. The solution seems to be a two-stage compression process: lossy texture block compression followed by lossless compression. The lossless stage roughly halves bandwidth consumption from SSD to VRAM. Textures are stored in VRAM in the lossy format and are decompressed on the fly on the GPU.

    Check out Durawrite from Sandforce. Compression reduces write amplification on SSDs and improves performance, as less capacity is being used on the SSD.
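    To make the two stages concrete, a toy sketch. zlib stands in for the lossless codec (the consoles use their own codecs, e.g. Kraken and BCPack), and the repeated byte pattern stands in for BC1 block data, so the compression ratio it prints is illustrative only:

```python
# Toy sketch of the two-stage pipeline: lossy BCn first, lossless on top.
# zlib stands in for the lossless codec and the repeated pattern stands in
# for BC1 block data, so the printed ratio is illustrative rather than the
# roughly 2:1 figure discussed above.
import zlib

BC1_BLOCK_BYTES = 8   # one 4x4 texel block after the lossy stage

def bc1_size(width: int, height: int) -> int:
    """What actually sits in VRAM: lossy, sampler-readable BC1 data."""
    return (width // 4) * (height // 4) * BC1_BLOCK_BYTES

# Stage 1 (lossy) output for a fake 1024x1024 texture.
vram_copy = bytes(range(BC1_BLOCK_BYTES)) * (bc1_size(1024, 1024) // BC1_BLOCK_BYTES)

# Stage 2 (lossless): what is stored on the SSD and sent across the bus...
disk_copy = zlib.compress(vram_copy, level=9)

# ...and undone on the way into VRAM, so the GPU only ever sees BC1 blocks.
assert zlib.decompress(disk_copy) == vram_copy

print(f"in VRAM: {len(vram_copy)} bytes, on disk / over the bus: {len(disk_copy)} bytes")
```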
     
    #2684 dobwal, May 28, 2020
    Last edited: May 28, 2020
  5. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    12,738
    Likes Received:
    8,127
    Location:
    London, UK
    Microsoft establish software (OS/API) standards for PC-compatible hardware running Windows; it's not their role to mandate what hardware everybody's personal computer should comprise - especially when they have a vested interest in pushing their own standard (BCPack). If they mandate BCPack as a standard, somebody profits from it: either Microsoft from licensing, somebody building chips that motherboard manufacturers now have to incorporate, or Intel, AMD and other chipset manufacturers being forced to license Microsoft's tech.

    Not happening. :nope:
     
    egoless, pharma and Ronaldo8 like this.
  6. Ronaldo8

    Newcomer

    Joined:
    May 18, 2020
    Messages:
    233
    Likes Received:
    232
    This is absolutely true. Windows will fit the hardware and not the reverse. That's how this OS has survived through 30 years' worth of hardware changes. Thus, the idea of some type of XVA-compatible hardware does not seem to make sense. How is MS supposed to elicit compliance in the first place?
     
  7. function

    function None functional
    Legend Veteran

    Joined:
    Mar 27, 2003
    Messages:
    5,340
    Likes Received:
    2,797
    Location:
    Wrong thread
    It's not exactly the same thing, but S3TC and 3Dc ended up becoming a standard part of PC (and console) GPU hardware.

    There's certainly precedent for vendor-specific hardware decompression support being implemented across all hardware vendors' products when there's a benefit.

    If BCPack is useful and cheap to implement, and MS add it to DX with a CPU fallback, and they make sure there are no barriers or risks for other vendors using it, it's at least within the realms of possibility. Or for something else like it.

    Best place to have it on PC would probably be on the GPU anyway, so you maximise savings at all stages before the GPU including the PCIe bus.
     
  8. PSman1700

    Veteran Newcomer

    Joined:
    Mar 22, 2019
    Messages:
    2,686
    Likes Received:
    896
    I think there's no reason for concern (yet). New NV and AMD hardware isn't out yet, and neither is DirectStorage on PC or 7GB/s SSDs. Also, we don't have all the details on how much drive streaming is required for next gen, but going by the UE5 demo it doesn't seem to be that much. It supposedly ran on a current-day laptop SSD, and at a higher framerate.
     
  9. Inuhanyou

    Veteran Regular

    Joined:
    Dec 23, 2012
    Messages:
    1,114
    Likes Received:
    286
    Location:
    New Jersey, USA
    We don't know anything about that beyond rumors and speculation. Better not to rely on it.
     
  10. PSman1700

    Veteran Newcomer

    Joined:
    Mar 22, 2019
    Messages:
    2,686
    Likes Received:
    896
    It’s the same the other way around.
     
  11. Lucid_Dreamer

    Veteran

    Joined:
    Mar 28, 2008
    Messages:
    1,210
    Likes Received:
    3


    Jump to around the 30 minute mark. It seems this approach, from GG, runs into a ton of flushes.
     
  12. ThePissartist

    Veteran Regular

    Joined:
    Jul 15, 2013
    Messages:
    1,589
    Likes Received:
    537
    You're like a broken record. Do we have to go around this for another few pages?
     
    disco_ likes this.
  13. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,577
    Likes Received:
    16,028
    Location:
    Under my bridge
    To be clear, you mean the graphics card, decompressing from RAM (or storage) when data is copied to the VRAM?
     
    PSman1700 and function like this.
  14. function

    function None functional
    Legend Veteran

    Joined:
    Mar 27, 2003
    Messages:
    5,340
    Likes Received:
    2,797
    Location:
    Wrong thread
    That's my thought, yeah. I expect it'd be too slow / expensive to do it while accessing texels for rendering, and you'd have to muck around with ... all kinds of stuff in the GPU (over my head) and waste silicon in every unit that had to work with this new format (every shader engine? Every texture sampler? Every memory controller?).

    ... But if, when the data comes across the PCIe bus, you decompress it on the fly to its regular compression format (such as DXTC) and then stick it in VRAM ready for use, that'd seem ideal IMO. That way the texture samplers don't need any changes, and things carry on working as they do now with whatever formats they use now.

    If you have a new texture compression format, I think it makes most sense to support it on the GPU. That way more general-purpose processors like your flash controller, your CPU etc. don't need to bother, much as they don't care about DXTC. They just see reduced traffic. And any texture pool in main RAM can effectively store (e.g.) twice as much, reducing the chance you're left waiting for data from the SSD.

    Whether it's BCPack or some other open industry standard, I think we're probably due for something that reduces the cost of getting textures to the GPU, and that makes preparing them for use as low cost (latency, silicon area and resources) as possible. A small hardware block on the GPU might seem extreme, but it's an awful lot better (and probably faster and less power hungry) than using up to several CPU cores on the job!
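    To put a rough shape on the "pool holds twice as much" point, a little sketch (zlib stands in for whatever the lossless codec would actually be, and the 2:1 figure is an assumption rather than a measurement):

```python
# Sketch of a main-RAM texture pool that keeps the lossless-compressed form
# and only expands to the GPU-native format (e.g. DXTC/BCn) on upload to VRAM.
# zlib stands in for the real codec; any ratio you get here is an assumption.
import zlib

class TexturePool:
    def __init__(self, budget_bytes: int):
        self.budget = budget_bytes
        self.used = 0
        self.entries = {}          # name -> lossless-compressed BC data

    def add(self, name: str, bc_data: bytes) -> bool:
        packed = zlib.compress(bc_data)
        if self.used + len(packed) > self.budget:
            return False           # pool full: would have to evict or hit the SSD
        self.entries[name] = packed
        self.used += len(packed)
        return True

    def upload_to_vram(self, name: str) -> bytes:
        # The only place the lossless layer is undone; samplers still see BCn.
        return zlib.decompress(self.entries[name])

# With ~2:1 lossless compression the same RAM budget holds ~twice the textures.
pool = TexturePool(budget_bytes=1 << 20)            # 1 MiB pool for the example
bc_texture = bytes(range(64)) * 4096                # 256 KiB of stand-in BC blocks
print(pool.add("rock_albedo_bc1", bc_texture))      # True: stored compressed
print(pool.used, "bytes of the pool used for a 262144-byte texture")
```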
     
    milk, pjbliverpool and PSman1700 like this.
  15. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    12,738
    Likes Received:
    8,127
    Location:
    London, UK
    Yup, lots of industry standards began as proprietary APIs, but the difference is they evolved naturally and were discussed and adopted by consensus by the industry forum responsible - and Microsoft are a vital part of all of these forums.

    Microsoft forcing their standard on everybody else, even if it's free, limits innovation and open competition in this field.
    Doing so effectively prevents competing standards from gaining traction. I don't think anybody would argue this is good.

    What about data in main RAM used by the CPU? Having this functionality on the GPU creates a problem. I personally think the logical location for a decompressor is the modern equivalent of the southbridge, because all data pulled from I/O goes through that bus. This is the point of a dedicated I/O controller. :yes:
     
    #2695 DSoup, May 28, 2020
    Last edited: May 28, 2020
    egoless, milk, function and 1 other person like this.
  16. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,947
    Likes Received:
    2,690
    "This is what we support in Windows and here's what you need to implement support it. How you do this is up to you. You are free to implement your own solution, but you're on your own if you do so (unless it gets popular then we might adopt it)." Seems pretty much their standard approach to me.
     
    function, DSoup, AzBat and 1 other person like this.
  17. Jay

    Jay
    Veteran Regular

    Joined:
    Aug 3, 2013
    Messages:
    2,705
    Likes Received:
    1,811
    Provide the API, with a reference implementation.

    If someone wants to use a different compression format they're free to. They could even use the CPU or GPU to do the decompression.
    In the end the reference would become the de facto standard though, as the games would be shipping in that format.
    But it would leave things open if a better compression method came along.
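    Something like this shape, roughly (made-up names, with zlib standing in as the reference codec purely for illustration):

```python
# Made-up sketch of "API + reference implementation": the runtime ships one
# reference codec, but anything implementing the same interface can be
# registered instead, whether it decompresses on the CPU or dispatches to the GPU.
import zlib
from typing import Dict, Protocol

class Decompressor(Protocol):
    def decompress(self, data: bytes) -> bytes: ...

class ReferenceZlib:
    """Reference implementation - effectively the de facto shipping format."""
    def decompress(self, data: bytes) -> bytes:
        return zlib.decompress(data)

CODECS: Dict[str, Decompressor] = {"ref": ReferenceZlib()}

def register(name: str, codec: Decompressor) -> None:
    """Vendors/engines can slot in a better codec without changing callers."""
    CODECS[name] = codec

def read_asset(blob: bytes, codec_name: str = "ref") -> bytes:
    return CODECS[codec_name].decompress(blob)

print(read_asset(zlib.compress(b"asset data")))    # goes through the reference codec
```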
     
    milk and pjbliverpool like this.
  18. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,947
    Likes Received:
    2,690
    It's fine on console, though? Because those platforms are all about forcing things on users in exchange for consistency and ease of use and all the other benefits that result from ceding control to a single entity.

    The open nature of the PC platform means no one, not even MS, gets to force a standard. Whatever they come up with will be in cooperation and consultation with their hardware partners, at least.
     
    AzBat and PSman1700 like this.
  19. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,577
    Likes Received:
    16,028
    Location:
    Under my bridge
    I think you've actually made that argument yourself in past posts - what's the use of open, competing protocols if they can't be adopted because they aren't universal standards? MS (or whoever) stepping up and saying, "do it this way," at least gets something usable working, as opposed to sitting around waiting for committee after committee to finally settle on something. MS mandating ZLib and BCPack in hardware, and that being adopted by everyone, may be a worse solution than some ideal other compressor, but it will be far better than the reality of conflicting, underused, competing proprietary solutions.
     
    PSman1700 and mrcorbo like this.
  20. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,829
    Likes Received:
    1,142
    Location:
    Guess...
    Isn't this exactly what they're already doing for GPU hardware standards via Direct3D?

    Sure, those standards are no doubt set with a level of input from NV, AMD and Intel, but there's no reason why the same couldn't apply here.

    EDIT: re-reading what you'd said, you may have misunderstood my meaning. I'm not suggesting this is something MS would mandate, but rather that they set it as a hardware standard for full DirectStorage compliance, just as they do with any other D3D hardware feature.

    The sell would be something like: "With Microsoft DirectStorage, any modern SSD and graphics card can take advantage of software improvements that help to speed up communication between your SSD and GPU. However, if you have a fully DirectStorage-compliant GPU, you can also take advantage of hardware-accelerated IO performance as well as smaller game install sizes."

    Microsoft can then back this up by shipping all UWP games with the option to either install the game with BCPack-compressed textures (how it would download), or decompress them at install time for a larger install footprint and slower IO. I guess using software decompression or more loading screens would also be options if they were more appropriate for the specific game.

    Other game developers would be free to follow suit or not, but provided there was no or minimal licensing cost to them for using the compression, it seems like an easy win.
     
    #2700 pjbliverpool, May 28, 2020
    Last edited: May 28, 2020
    PSman1700 likes this.