Playstation 5 [PS5] [Release Holiday 2020]

Discussion in 'Console Technology' started by BRiT, Mar 17, 2020.

  1. Inuhanyou

    Veteran Regular

    Joined:
    Dec 23, 2012
    Messages:
    1,073
    Likes Received:
    249
    Location:
    New Jersey, USA
    Yeah, because Vita games were entirely different SKUs. What Nintendo did with the Switch was something only they, with their catalog and unique place in the industry, could have done. They deserve a ton of credit for the foresight and preparation it must have required of their teams, third parties, and their development workflow in general. Even though they had stopped aiming for power parity with the other machines long before that, it must have been quite an adjustment period.

    Going back to our PS5 cooling discussion: I think the hype around the cooling solution isn't just because the Pro's was a cheap solution, but more importantly because of how Cerny has hyped it up, and especially because of the extreme clocks they are promising will stay near constant most of the time. Even the Xbox Series X, which is clocked much lower in relative terms, had to fundamentally rethink console design to work around the cooling and heat-dissipation issues it faced, even with more leeway from a higher-CU chip.
     
  2. BRiT

    BRiT Verified (╯°□°)╯
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    15,397
    Likes Received:
    13,845
    Location:
    Cleveland
  3. MrFox

    MrFox Deludedly Fantastic
    Legend Veteran

    Joined:
    Jan 7, 2012
    Messages:
    6,418
    Likes Received:
    5,820
    The only possibility I can see for a PS5 portable is a separate platform with an independent catalog, but with the same OS and as binary-compatible with the PS5 as possible to make porting extremely easy, and with no mandate tying the platforms together. Anything above 10W is DOA or niche. A PS4 portable isn't possible at 7nm.

    2c/4t Zen 2
    9 CU RDNA 2 (10 with 1 disabled)
    8GB LPDDR5 (2 chips, 128-bit @ 5500)
    256GB SSD
    Power cap at 10W

    Calling it the PS5P with the same logo: copy-paste the P with the same kerning, after spending millions on typographic research.
     
    #1883 MrFox, Apr 26, 2020
    Last edited: Apr 26, 2020
  4. damienw

    Regular

    Joined:
    Sep 29, 2008
    Messages:
    501
    Likes Received:
    49
    Location:
    Seattle
    I think you mean 'nanobots'.
     
  5. anexanhume

    Veteran Regular

    Joined:
    Dec 5, 2011
    Messages:
    1,920
    Likes Received:
    1,262
  6. Sav

    Sav

    Joined:
    Apr 8, 2020
    Messages:
    2
    Likes Received:
    0
    Word is Official PlayStation Magazine UK staff have already seen and played stuff.
     
  7. Karamazov

    Veteran Regular

    Joined:
    Sep 20, 2005
    Messages:
    2,975
    Likes Received:
    2,426
    Location:
    France
    Word is I'm tired of waiting; it's taking way too long.
     
    Cuthalu, McHuj, AzBat and 1 other person like this.
  8. Sav

    Sav

    Joined:
    Apr 8, 2020
    Messages:
    2
    Likes Received:
    0
    The virus has pushed everything back.
     
  9. ThePissartist

    Veteran Regular

    Joined:
    Jul 15, 2013
    Messages:
    1,554
    Likes Received:
    506
    As far as I understand, from a traditional perspective, when using an HDD a game like Wipeout would essentially keep the entire track in memory at any one time, including all textures (or potentially load them in/out from the HDD) and trackside detail.

    HDD: applied to a PS5, that'd be ~16GB (give or take a couple of GB, depending on the OS) of track and associated data. There may be instances where textures are loaded in/out of RAM at a rate proportional to the speed of the HDD and the movement of the vehicle.

    SSD: a simple implementation could have 10GB assigned to static, unchanging track data, with 6GB assigned as variable-usage RAM. This 6GB could then have significant amounts of trackside detail loaded in/out depending on the distance from the viewpoint/vehicle. The 5.5GB/s of the SSD could then be used to change the trackside detail for every second of gameplay (likely flushed in/out at the periphery of the player's view). In a best-case scenario this *could* take a track from having 16GB of data assigned to it to having something like 370GB assigned to it (10GB + 6GB×60, for a 60-second lap). It seems a little preposterous, but it's certainly a possible use of the SSD. The bottleneck would then become storage space and download speeds.

    Can anyone more technically minded explain to me why this wouldn't be the case?

    Maybe we'll see more technology for procedurally generated textures in order to avoid taking up too much SSD space.
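    The arithmetic in the post above can be sanity-checked in a few lines. This is just a back-of-envelope sketch using the figures quoted in the post (the pool sizes and lap length are the post's assumptions, not measured values):

    ```python
    # Back-of-envelope check of the streaming budget described above.
    # All figures are assumptions taken from the post, not measurements.

    STATIC_POOL_GB = 10.0    # static track data resident in RAM
    VARIABLE_POOL_GB = 6.0   # pool flushed/refilled as the vehicle moves
    SSD_BW_GBPS = 5.5        # quoted raw PS5 SSD bandwidth
    LAP_SECONDS = 60         # assumed lap length

    # If the variable pool were fully replaced once per second, the
    # unique data touchable over one lap would be:
    unique_data_gb = STATIC_POOL_GB + VARIABLE_POOL_GB * LAP_SECONDS
    print(unique_data_gb)  # 370.0, matching the post's estimate

    # Sanity check: refilling 6 GB every second needs 6 GB/s, slightly
    # above the 5.5 GB/s raw figure, so a full refresh each second is
    # just out of reach without compression.
    refresh_bw_needed_gbps = VARIABLE_POOL_GB / 1.0
    print(refresh_bw_needed_gbps > SSD_BW_GBPS)  # True
    ```

    Interestingly, the numbers show the 6GB-per-second refresh would slightly exceed the raw 5.5GB/s figure, so compression (or a refresh rate a bit under once per second) would be needed to hit the full 370GB.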
     
  10. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,338
    Likes Received:
    9,957
    Location:
    The North
    It might be easier to say that both setups would likely be streaming textures as they needed them, but with the traditional platter they needed to hold a larger portion of the track in memory in case something happened and it would suddenly have nothing to render.

    Because it needs to hold a larger portion of the track in memory, and the capacity of memory is the same (assume 16GB), each texture must be smaller. So if we assume the platter speed is 1/50th the speed of the SSD solution, it may need to hold 50x more track data at any time than the SSD, just to ensure the game can run. But you're still bound by the finite capacity of the VRAM.

    4K textures are about 8MB per texture; 8K textures are 32MB.
    The slow HDD platter could only afford 160KB textures, as it needs to hold textures for a bigger part of the track, because streaming them in is much slower and less immediate than with the SSD. The SSD solution can hold significantly less track data and support much larger 8MB textures in memory.
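    The trade-off above reduces to simple division: with equal RAM, a drive 1/50th the speed must prefetch roughly 50x more of the track, leaving each texture roughly 1/50th the budget. A quick illustrative check (the 8MB and 50x figures are the post's assumptions):

    ```python
    # Texture-budget trade-off: equal RAM, drive 50x slower, so each
    # texture gets ~1/50th the size. Figures are illustrative only.

    TEX_4K_MB = 8.0   # ~8 MB per 4K texture, as quoted above
    SPEED_RATIO = 50  # SSD assumed ~50x faster than the HDD platter

    hdd_texture_budget_mb = TEX_4K_MB / SPEED_RATIO
    print(hdd_texture_budget_mb * 1024)  # ~160 KB, matching the post
    ```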
     
    PSman1700 likes this.
  11. ThePissartist

    Veteran Regular

    Joined:
    Jul 15, 2013
    Messages:
    1,554
    Likes Received:
    506
    Sounds like you're agreeing, but considering only textures? I would have thought you'd be able to include any data; sounds, textures, polygons, etc.

    The detail of the track would then be proportional to the speed of the streaming device (assuming the GPU can render that much detail).
     
  12. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,338
    Likes Received:
    9,957
    Location:
    The North
    Yeah, I tried to use the words 'track loaded' to sort of encapsulate everything.
    But there are some things, like audio, that you need to keep around based on radius even if they're not visible.
    As for polygons and models, I don't think the data is very large for those, relatively speaking, though I could be wrong.

    We had a developer allude to memory management being the most important thing for extracting performance from the console. SSDs will help reduce the amount of memory developers need to reserve, because they can rely on calling data in on shorter notice. That lets them push capacity to the edge, but there are going to be upper bounds on how much you can hold on screen and how much can be happening, because you'll eventually run out of VRAM capacity.
     
    BRiT and PSman1700 like this.
  13. PSman1700

    Veteran Newcomer

    Joined:
    Mar 22, 2019
    Messages:
    2,489
    Likes Received:
    771
    Wipeout seems a perfect candidate for streaming off the SSD, but doesn't that wear the SSD out fast? Imagine transferring at 5GB/s almost constantly, with someone playing hours-long sessions day after day.
     
  14. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,338
    Likes Received:
    9,957
    Location:
    The North
    Those are peak values. Just like TF, don't expect to hold peak values for more than a blip.
     
  15. manux

    Veteran Regular

    Joined:
    Sep 7, 2002
    Messages:
    2,059
    Likes Received:
    922
    Location:
    Earth
    SSDs wear on writes. Reads should be of no concern when thinking about SSD durability.
     
  16. MrFox

    MrFox Deludedly Fantastic
    Legend Veteran

    Joined:
    Jan 7, 2012
    Messages:
    6,418
    Likes Received:
    5,820
    With something like a 16K block size and the queues kept at a healthy depth, it can definitely maintain that peak as long as it needs to. But yeah, it won't really do this during gameplay; it will go up and down based on what happens in the game. I can certainly see one of the Spider-Man speedrun challenges keeping it constantly near the max.

    TF is limited by memory access, cache misses, branching, whether the algorithm is vectorizable, how many of the instructions are MACs, etc. There are a million factors for ALU occupancy which I don't really understand, but storage is quite straightforward.
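    The block size matters because sustaining peak bandwidth with small blocks requires a very high request rate, which is why queue depth comes up. A quick check of the rate implied by the post's 16K figure (the numbers are the post's assumptions, and real drives lose some of this to overhead):

    ```python
    # Request rate needed to sustain peak bandwidth at a 16 KB block
    # size. Figures are assumptions from the discussion above.

    PEAK_GBPS = 5.5   # quoted raw bandwidth
    BLOCK_KB = 16     # block size mentioned in the post

    requests_per_second = PEAK_GBPS * 1024 * 1024 / BLOCK_KB
    print(int(requests_per_second))  # ~360k requests/s
    ```

    A rate in the hundreds of thousands of requests per second is only reachable with many requests in flight at once, which is exactly why the hardware I/O queues are emphasized.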
     
    #1896 MrFox, Apr 29, 2020
    Last edited: Apr 29, 2020
    disco_ likes this.
  17. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,338
    Likes Received:
    9,957
    Location:
    The North
    Yeah, with game directories being mapped as memory, it should mainly be reads. The only time you're going to write during gameplay is to page things out of VRAM to make space.
     
    PSman1700 likes this.
  18. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,338
    Likes Received:
    9,957
    Location:
    The North
    Game code is unlikely to hit it. Variation from one frame to the next is minimal; once it's loaded into memory, you're not going to toss it if you have to use it for the next frame.
     
  19. ThePissartist

    Veteran Regular

    Joined:
    Jul 15, 2013
    Messages:
    1,554
    Likes Received:
    506
    We could apply the same rules to any game: you'd have core assets that always remain in a pool of ~10GB of RAM (character models and textures, physics, the rendering engine, etc.), and then essentially everything else is somewhat variable within a set radius of the player.

    It means detail would drastically increase.

    Scenarios with constant flushing of data would make the best use of the SSD's speed. Devs would then presumably need to consider which assets are static vs. changeable and what the radius should be.

    Is that right? I would have thought you'd be able to maintain those speeds, but I have no technical knowledge to really comment further.
     
  20. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    14,155
    Likes Received:
    5,439
    A game like Wipeout is perfect for disk streaming, because you can only go two directions on the track, so predicting what you need to load is incredibly easy. If you buffer data for one or two seconds ahead of you on the track, you can probably take good advantage of the disk I/O. Racing games should look MUCH nicer this gen. Hopefully the track detail will catch up to the cars; the cars honestly already look very good.
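    The look-ahead idea above can be sketched in a few lines: on a fixed track the only unknown is the player's position, so the streamer just requests any segment within a fixed time window ahead that isn't already resident. This is a toy illustration; the function name, segment length, and window are hypothetical:

    ```python
    # Toy sketch of look-ahead streaming on a fixed track: prefetch
    # segments within a fixed time window ahead of the player.
    # Segment length and window size are made-up illustrative values.

    SEGMENT_SECONDS = 0.5    # track chopped into half-second segments
    LOOKAHEAD_SECONDS = 2.0  # buffer this far ahead of the vehicle

    def segments_to_prefetch(position_s, loaded):
        """Return segment indices that should be resident but aren't."""
        first = int(position_s // SEGMENT_SECONDS)
        last = int((position_s + LOOKAHEAD_SECONDS) // SEGMENT_SECONDS)
        return [i for i in range(first, last + 1) if i not in loaded]

    # At t=10s with nothing loaded yet, segments 20..24 are needed.
    print(segments_to_prefetch(10.0, set()))  # [20, 21, 22, 23, 24]
    ```

    Because the request set is tiny and perfectly predictable, the drive's queues can stay full well in advance, which is what makes a track-based game such a friendly case for streaming.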
     
    ThePissartist likes this.

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.