Support for Machine Learning (ML) on PS5 and Series X?

Discussion in 'Console Technology' started by Shifty Geezer, Mar 18, 2020.

  1. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    12,794
    Likes Received:
    8,190
    Location:
    London, UK
    You're talking about a specifically-designed gameplay mechanic, which can be easily programmed; I am talking about an unintended consequence of machine learning should you have self-adapting algorithms applied to NPC behaviour. This is also a risk with machine learning algorithms that adapt to data/stimuli, often without understanding the data.

    I don't know if Forza's avatars are machine learning, but this is a good example of an unintended consequence of adapting to stimuli that the algorithm's programmers cannot foresee at the time of writing. And every single example of machine learning cocking up is the same.
     
  2. arhra

    Newcomer

    Joined:
    May 16, 2003
    Messages:
    184
    Likes Received:
    104
    So, I was combing through MS patents again, and this one popped up, which seems relevant to this thread:

     
  3. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,870
    Likes Received:
    10,965
    Location:
    The North
    Some really creative methods they have going on here.

    I’m not sure if this is just a patent or if they actually have something working. Would love to see this work.

    This, coupled with the texture up-resolution at runtime, is starting to make it feel like the creativity in this field is absolutely worth the silicon investment in a larger chip that includes some tensor cores.
     
    pharma, BRiT and PSman1700 like this.
  4. Mskx

    Newcomer

    Joined:
    Apr 20, 2019
    Messages:
    181
    Likes Received:
    188
    This has come up on here a couple of times but... Remember back in February when James Gwertzman from PlayFab (owned by Microsoft) said this?
    From here: https://venturebeat.com/2020/02/03/...t-generation-of-games-and-game-development/2/

    Kinda seems relevant to that patent.
     
    #144 Mskx, Jul 2, 2020
    Last edited: Jul 2, 2020
  5. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    14,212
    Likes Received:
    5,651
    Trading compute power for bandwidth. May be an interesting choice. Wonder how much it costs to upscale a 2048x2048 texture to a 4096x4096. I'm assuming you'd only ever upscale to your mip0 (4k or 8k) textures and any mip1, mip2 etc would be streamed off the disk. I can't see upscaling more than 1 mip level.
     
    zupallinere likes this.
  6. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,870
    Likes Received:
    10,965
    Location:
    The North
    Fairly massive savings I think.

    4,194,304 px (2048x2048) upscaled to 16,777,216 px (4096x4096),
    so we're starting at 4x bandwidth savings if we're looking at transfer rate.

    So if you pull in a tile at 2048x2048, say mip1, but then you get close enough to that tile that it needs to adjust to mip0, you don't need to fetch mip0 from disk; you just compute mip0, save that result and use it. That skips the I/O bandwidth requirement entirely and can be done via async compute.
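
    Back-of-envelope on the saving (a minimal sketch in Python; the BC7-at-roughly-1-byte-per-texel figure and the tidy numbers are my assumptions, not anything from the patent):

    # Stream mip0 from SSD vs. synthesise it on-GPU from the resident mip1.
    # Assumes block compression (BC7-style) at ~1 byte per texel.
    BYTES_PER_TEXEL = 1

    def mip_bytes(side):
        return side * side * BYTES_PER_TEXEL

    mip1 = mip_bytes(2048)   # ~4 MiB, already resident
    mip0 = mip_bytes(4096)   # ~16 MiB, what we'd otherwise pull off the SSD

    print(f"I/O skipped per texture: {mip0 / 2**20:.0f} MiB")
    print(f"texels to synthesise:    {4096 * 4096:,}")              # 16,777,216
    print(f"saving vs. streaming mip0 as well: {mip0 / mip1:.0f}x")  # 4x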
     
    #146 iroboto, Jul 2, 2020
    Last edited: Jul 2, 2020
    Ronaldo8 and function like this.
  7. Jay

    Jay
    Veteran Regular

    Joined:
    Aug 3, 2013
    Messages:
    2,737
    Likes Received:
    1,844
    There was a lot of mud thrown at Nvidia's tensor cores, but they could end up being a huge benefit that just wasn't visible at the start.
    Not just talking DLSS 2.0, but DX12 ML (DirectML) makes use of them also.

    It's just a shame that we don't really know how much performance is required for all this ML stuff, or how much real-world benefit the reduced precision on XSX gives.
    Even dedicating 1 TF could end up being a decent net gain overall.
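
    Rough packed-math arithmetic for what dedicating ~1 TF of FP32 could translate to (the 2x/4x/8x FP16/INT8/INT4 rate-ups are the publicly quoted XSX ratios; treat them as assumptions in this sketch):

    # What a 1 TFLOPS FP32 slice of the GPU budget could mean for ML work,
    # assuming the roughly 2x/4x/8x FP16/INT8/INT4 rate-ups over FP32 that
    # have been quoted for XSX (assumption, not a measured figure).
    fp32_budget_tflops = 1.0
    rate_vs_fp32 = {"FP32": 1, "FP16": 2, "INT8": 4, "INT4": 8}

    for precision, ratio in rate_vs_fp32.items():
        print(f"{precision}: ~{fp32_budget_tflops * ratio:.0f} T(FL)OPS available for ML")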
     
  8. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,870
    Likes Received:
    10,965
    Location:
    The North
    I'm glad to see it evolve. Those RTX cards could really see a massive heyday as its usage continues forward.
     
  9. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    14,212
    Likes Received:
    5,651
    But what's the cost on the GPU side? That's what I'm curious about.
     
  10. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,870
    Likes Received:
    10,965
    Location:
    The North
    Depends on the type of model, and its depth and size. The more you attempt to do and the higher the accuracy, the larger the network, and thus the longer the processing time.

    I suspect it can be fairly intensive, and someone will need to look at the maximum number of textures that need to be computed per frame. If it's 1 tile/texture, perhaps there is more than enough compute and bandwidth. If it's 100 or 1000 tiles changing over from mip1 to mip0, suddenly power is going to matter a lot more.
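
    To put a number on it, here's an illustrative cost model (the network size, the 256x256 tile, and the INT8 rate below are all assumptions made up for the estimate, not taken from any patent or spec):

    # Illustrative per-tile cost of a small per-pixel upscaling CNN.
    layers, channels = 6, 16
    flops_per_output_px = layers * 2 * 9 * channels * channels   # 3x3 convs, 2 FLOPs per MAC

    tile_out_px = 512 * 512              # one 256x256 tile upscaled 2x
    flops_per_tile = flops_per_output_px * tile_out_px

    int8_tops = 49e12                    # ballpark quoted XSX INT8 rate (assumption)
    ms_per_tile = flops_per_tile / int8_tops * 1e3

    for tiles in (1, 100, 1000):
        print(f"{tiles:>5} tiles/frame -> ~{tiles * ms_per_tile:.1f} ms of INT8 work")
    # 1 tile is trivially cheap; hundreds start eating into a 16.6 ms frame.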
     
  11. milk

    milk Like Verified
    Veteran Regular

    Joined:
    Jun 6, 2012
    Messages:
    3,449
    Likes Received:
    3,331
    I'd like to see a god game where you have to control your own ML-AI-driven Peter Molyneux and keep him from confusing a media interview with an internal brainstorming meeting.
     
    Rootax likes this.
  12. function

    function None functional
    Legend Veteran

    Joined:
    Mar 27, 2003
    Messages:
    5,367
    Likes Received:
    2,876
    Location:
    Wrong thread
    Using sampler feedback to predict upcoming changes, and using SFS to seamlessly blend in any changes which missed the current frame due to being "over budget", might be a good way to distribute the workload over time. Same way it can distribute loads from the SSD over time.

    It'd be pretty cool if SF / SFS could be used for both paging texture tiles from the SSD over time and ML upscaling over time.

    In fact ... I wonder if you could use sampler feedback to indirectly gauge when to load higher-LOD models in. For example, if you're about to need a higher-LOD texture on a model, you might also know you'll need a higher-LOD model too. And if you're running a dynamic resolution system, and/or allow for different target resolutions (e.g. PC or XSX vs Lockhart), maybe such a tied-together LOD system could automatically adapt based on whatever the resolution happened to be at any given time. Tune it once for an optimal pixel size vs detail level, and just let it do its thing wherever it's doing it.
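
    Something like this is how I picture that tied-together loop (schematic only; none of these names are the actual Sampler Feedback API, and the per-frame budget and per-tile cost are invented numbers):

    # Schematic of a feedback-driven, budgeted upscale queue (not the real D3D12 API).
    from collections import deque

    UPSCALE_BUDGET_MS = 1.0                # assumed per-frame ML budget

    def ml_upscale(tex, tile, mip):
        pass                               # placeholder for the async-compute inference dispatch

    pending = deque()                      # (texture, tile, wanted_mip) misses carried across frames

    def process_feedback(requests, resident, est_ms_per_tile=0.15):
        # Queue every tile the feedback map says was sampled finer than what's resident.
        for tex, tile, wanted_mip in requests:
            if resident.get((tex, tile), 99) > wanted_mip:   # bigger number = coarser mip
                if (tex, tile, wanted_mip) not in pending:
                    pending.append((tex, tile, wanted_mip))

        # Spend at most the budget this frame; whatever is left over gets the
        # SFS-style fallback (blend from the coarser resident mip) and retries next frame.
        spent = 0.0
        while pending and spent + est_ms_per_tile <= UPSCALE_BUDGET_MS:
            tex, tile, mip = pending.popleft()
            ml_upscale(tex, tile, mip)
            resident[(tex, tile)] = mip
            spent += est_ms_per_tile

        # Model LOD could key off the same signal: a request for mip0 of a material
        # is a decent hint that the mesh wearing it deserves its higher-LOD model too.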
     
    Jay, PSman1700, Ronaldo8 and 3 others like this.
  13. Ronaldo8

    Newcomer

    Joined:
    May 18, 2020
    Messages:
    233
    Likes Received:
    232
    Luckily, the XSX has lots of CUs to throw at it.
     
    PSman1700 likes this.
  14. Ronaldo8

    Newcomer

    Joined:
    May 18, 2020
    Messages:
    233
    Likes Received:
    232
    I've said it before and I'll say it again: invest in tensor and RT cores.
     
  15. mpg1

    Veteran Newcomer

    Joined:
    Mar 5, 2015
    Messages:
    1,988
    Likes Received:
    1,575
    How much of the install size would be shaved off if a game ships with low-res textures instead of high-res textures?
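
    Rough arithmetic, if you only drop mip0 and regenerate it (purely illustrative; it assumes square textures with full mip chains and that texture data dominates the install, which is game-dependent):

    # How much texture data disappears if the top mip ships omitted and is synthesised?
    def chain_bytes(mip0_bytes, mips=12):
        return sum(mip0_bytes / (4 ** i) for i in range(mips))

    mip0 = 16 * 2**20                      # e.g. one 4096x4096 BC7 texture ~16 MiB
    full = chain_bytes(mip0)
    shipped = full - mip0                  # ship everything except the top mip

    print(f"full chain: {full / 2**20:.1f} MiB, shipped: {shipped / 2**20:.1f} MiB")
    print(f"texture data saved: {(1 - shipped / full) * 100:.0f}%")   # ~75%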
     
  16. Jay

    Jay
    Veteran Regular

    Joined:
    Aug 3, 2013
    Messages:
    2,737
    Likes Received:
    1,844
    Was just thinking that on the XO family they could ML textures back to a GPU format during download. So smaller, quicker downloads; install size stays the same.

    On Xbox Series hardware, the install could remain as JPEG and the ML done in real time, so it would then have a much reduced install size as well, saving the precious SSD space.
    They could even do the highest-res ones during download on Lockhart, if there's latency or a TF deficiency doing it in real time.

    It'll probably only happen for the odd 1P game for a long time though.
    This all sounds good; here's hoping.
     
    #156 Jay, Jul 3, 2020
    Last edited: Jul 3, 2020
  17. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,870
    Likes Received:
    10,965
    Location:
    The North
    There's no ML that will convert JPEG to BCn; I read that patent wrong... rather, I didn't read past the description.
     
  18. Jay

    Jay
    Veteran Regular

    Joined:
    Aug 3, 2013
    Messages:
    2,737
    Likes Received:
    1,844
    I can't remember the details either, to be fair.
    But the important bit is going from a non-GPU texture format to a supported one.
    I'm talking about the download package, install size, runtime and when the inferencing could be done.
     
  19. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,870
    Likes Received:
    10,965
    Location:
    The North
    hmm indeed.

    this is why I hate patent diving. There are so many, and so many related ones.
    This patent was linked for instance:
    http://www.freepatentsonline.com/y2019/0304138.html

    and I thought it was the same as this patent here:
    https://patentscope.wipo.int/search....wapp2nA?docId=US253950223&tab=PCTDESCRIPTION

    and I started writing about the second one, in reference to the first one. Looked like a bumbling idiot. So yea, I get where you're going. I'm not going to read either, this stuff is mentally exhausting.
     
    Jay likes this.
  20. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,577
    Likes Received:
    16,029
    Location:
    Under my bridge
    Imagine trying to be a judge and decide what patents are and aren't infringed! If it were me, I'd disqualify all patents for being written in gobbledegook! If it's not obvious what you're patenting, you aren't really patenting anything.

    Patent culture has completely corrupted the idea into opportunistic obfuscation.

    Maybe ML can be purposed to translate patent speak into real language?
     