Next Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

Discussion in 'Console Technology' started by Proelite, Mar 16, 2020.

  1. shiznit

    Regular Newcomer

    Joined:
    Nov 27, 2007
    Messages:
    338
    Likes Received:
    88
    Location:
    Oblast of Columbia
I'm happy they are trying, and if you're right we can all benefit. I'm just not convinced the same cooling solution that was supposedly running into trouble with fixed 2.0/3.0 clocks can now do 98% of 2.23/3.5 "almost all of the time." Unless the occasional hard drop below 2.0/3.0 buys you a lot of time at high clocks. That hasn't been my experience overclocking PC hardware.
     
    PSman1700 likes this.
  2. Pixel

    Veteran Regular

    Joined:
    Sep 16, 2013
    Messages:
    1,002
    Likes Received:
    472
Could Sony be employing something similar to MS's Hovis Method, where due to the silicon lottery each chip has its power profile customized individually? In theory, one person's PS5 could then offer slightly different performance than another's that requires slightly higher voltage to hit the same clock frequency. And going forward, as yields and processes advance, this gives Sony an opening where future PS5 chip revisions might be able to sustain that 2.23GHz in most, and eventually every, scenario?
Getting into conspiracy-theory territory: Sony could bin the most capable chips for the PS5s they send to pixel-counting reviewers like Digital Foundry?
     
  3. BRiT

    BRiT Verified (╯°□°)╯
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    15,905
    Likes Received:
    14,820
    Location:
    Cleveland
That would be absolutely horrible for early adopters, especially if a devkit offered better performance than retail units: the devs would think they're offering a great experience while a large subset of their consumers suffer.
     
    PSman1700 and Pixel like this.
  4. Proelite

    Veteran Regular Subscriber

    Joined:
    Jul 3, 2006
    Messages:
    1,458
    Likes Received:
    817
    Location:
    Redmond
    @MrFox's post got me thinking.

Xsx's PSU is 315 watts, which suggests Xsx's TDP is around 235W. This almost guarantees that it's the highest-TDP console in history, easily 35W more than the launch PS3s.

If you look at the TDPs that the highest-end GPUs draw, they're usually 300-350W.

    For a MAX console someday, we'll see a 400-500mm2 SOC, 300-350 watt TDP, 400-500 watt PSU. One can dream.
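Back-of-the-envelope, the PSU-to-TDP step above can be sketched like this. The ~25% headroom factor is my assumption (PSUs are typically rated above sustained system draw to cover transients, USB provisioning, and efficiency margin), not a published figure:

```python
# Rough sanity check of the PSU -> TDP estimate above.
# Assumption (not from the post): console PSUs are rated roughly
# 25-35% above sustained system draw.

def estimated_tdp(psu_watts: float, headroom: float = 0.25) -> float:
    """Estimate sustained system draw from the PSU rating."""
    return psu_watts * (1.0 - headroom)

print(round(estimated_tdp(315)))  # -> 236, close to the ~235W figure above
```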
     
  5. Pixel

    Veteran Regular

    Joined:
    Sep 16, 2013
    Messages:
    1,002
    Likes Received:
    472
Good point. Maybe with the increased industry use of VRS and dynamic resolution scaling, the difference might not show up in framerate. One person's PS5 running a game at 1800p while your neighbor's runs at 1700p might not be noticeable or problematic to anyone other than the very hardcore or Digital Foundry.
     
  6. psorcerer

    Regular

    Joined:
    Aug 9, 2004
    Messages:
    732
    Likes Received:
    134
    No, I'm suggesting that >4K assets are still needed even in 4K or lower final resolution.

    Yep. It uses texture space shading. I think it was said in every SFS description or manual.
     
    #1366 psorcerer, Mar 30, 2020
    Last edited by a moderator: Mar 30, 2020
  7. MrFox

    MrFox Deludedly Fantastic
    Legend Veteran

    Joined:
    Jan 7, 2012
    Messages:
    6,487
    Likes Received:
    5,991
I believe it to be a mistake from DF. The lower-current 5V rail is almost always shared, so it cannot be added to the total wattage; it supplies USB and other 5V devices. USB power needs to be subtracted from the total as a fixed provisioning.

Edit: just watched it again, it's two 12V rails actually. I thought it was 300W@12V and 5A@5V, but no, so I don't know. There's no way to figure it out without testing it. They split it in two but without any 5V. It's a weird split, why not a single rail? Maybe conducted-emission testing failed and this isolates them?

The fat PS3 was higher in peak consumption; it's still the king of max operating watts and had the correct 480W PSU for it. There are many early tests of some games reaching 230W, which would rise as the generation advanced. It's the 40GB/80GB models that were 200W.

    I know because I have a 60GB ps3 from launch. :runaway:
     
    #1367 MrFox, Mar 30, 2020
    Last edited: Mar 30, 2020
    Silenti, goonergaz, chris1515 and 2 others like this.
  8. see colon

    see colon All Ham & No Potatos
    Veteran

    Joined:
    Oct 22, 2003
    Messages:
    1,626
    Likes Received:
    488
The GPU is more bound by bandwidth and the CPU more by latency. I'm sure there might be some edge cases where the Series X's lower minimum bandwidth could be detrimental, but given its higher potential bandwidth, I think it will be on par with the PS5 in the worst case.
     
  9. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,800
    Likes Received:
    10,815
    Location:
    The North
    why?
Why not just front-load the >4K assets: downsample them offline using the best possible downsampling techniques to 4K, and ship those 4K assets for the game to use.

Why leave ultra-large textures to eat up room and waste compute and bandwidth downsampling in real time?
     
  10. psorcerer

    Regular

    Joined:
    Aug 9, 2004
    Messages:
    732
    Likes Received:
    134
Aliasing. Imagine that the angle between your 4K texture's surface and the camera is 10 degrees.
sin(10°) is about 0.17, which means one texel is now worth 0.17 of an on-screen pixel. You need roughly 5x the resolution to stay at ~4K density on screen.
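The texel-density argument above, worked through numerically (a quick sketch of the geometry, not tied to any particular engine):

```python
import math

# At angle theta between the surface and the view direction, one texel
# projects to roughly sin(theta) of its face-on screen footprint, so you
# need ~1/sin(theta) times the texel density to keep ~1 texel per pixel.

theta_deg = 10
coverage = math.sin(math.radians(theta_deg))  # on-screen pixels per texel
needed_scale = 1.0 / coverage                 # texel-density multiplier

print(f"coverage={coverage:.2f}, scale needed={needed_scale:.1f}x")
# -> coverage=0.17, scale needed=5.8x
```

So "5x" is actually slightly conservative; the exact factor at 10 degrees is about 5.8x.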
     
    Scott_Arm, chris1515 and Mitchings like this.
  11. blakjedi

    Veteran

    Joined:
    Nov 20, 2004
    Messages:
    2,985
    Likes Received:
    88
    Location:
    20001
Ok, my impression was rather that the SmartShift solution is effectively a seesaw: you could hit 3.5GHz on the CPU or 2.23GHz on the GPU, but not both simultaneously.

I don't think the ratio of power transferred from one to the other is linear, or some simple step function, so there's no way to calculate what a frequency change in one means for the other.

But I also didn't get the impression that there was any circumstance where both peaks could be met or sustained at the same time.
     
    PSman1700 likes this.
  12. rokkerkory

    Regular Newcomer

    Joined:
    Sep 3, 2013
    Messages:
    364
    Likes Received:
    103
With the XSX CPU having 76MB of SRAM... what do we expect for PS5?
     
  13. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,365
    Likes Received:
    3,955
    Location:
    Well within 3d
    Sony specifically stated that the PS5 chips would behave identically to an idealized SOC. Every chip as part of its validation testing would be profiled to get per-chip values for the properties that go into how they rate silicon quality. How the units react to various levels of activity at all the points in the voltage/clock curve would be studied and used to calculate power consumption.
    In order to keep things consistent, Sony would settle on a fixed set of costs and force all chips manufactured to act as if they had the same silicon quality. Chips with better performance would behave as if they were average. Chips that could not meet that average would be discarded.

Opting to vary PS5 performance based on the week of manufacture would damage its acceptance with developers and perhaps even customers. Developers couldn't trust whether their benchmarking was accurate and might add safety margins that negated anything Sony gained from the clocking, and customers would be incentivized to hold off buying the console for as long as possible, worrying that early revisions were inferior.
    If a later PS5 chip could hold max clocks all the time, it would be programmed to act like a chip from week one.

As far as the Hovis method goes, which from what little I've seen involves customizing board or package components to match the electrical variation of the chip:
If, per Sony's early discussion of wanting the PS5 to be a fast, high-volume transition from the PS4, tweaking every board and package is a hindrance and a cost-adder versus the more niche Xbox One X. While I do not know the exact details of Sony's method, given that it should be forcing its silicon to meet a constant target, I would be curious whether altering the mixture of electrical components per console would conflict with firmware trying to get a consistent result.
     
    blakjedi, Arwin, disco_ and 6 others like this.
  14. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,800
    Likes Received:
    10,815
    Location:
    The North
    If you can't draw a triangle into that space, how would you texture that?
    Seems like a pointless case to justify 5x the resolution of a texture.
     
  15. psorcerer

    Regular

    Joined:
    Aug 9, 2004
    Messages:
    732
    Likes Received:
    134
Err... okay. You draw a triangle; the triangle is 4K x 4K x 4K pixels in object space.
With our 10-degree camera it's now 4K x 0.7K x 0.7K, but the u/v coordinates are still the same.
So the effect is the same.

Obviously you can and will sustain these.
I'm not sure why it's so hard to understand, though.
     
    #1375 psorcerer, Mar 30, 2020
    Last edited by a moderator: Mar 30, 2020
  16. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,800
    Likes Received:
    10,815
    Location:
    The North
are you referring to a 4096x4096 texture as a 4K texture and 8192x8192 as an 8K texture?
Because I'm not referring to that. Though in retrospect I should have.

I was thinking that if you stood as close as you could to a texture with the camera and that texture still maintained 1:1 texel-to-pixel density at native resolution with no stretching, then there would be no need to go higher. I don't actually know what texture size that requires, though.

Mind you, as textures get larger it is murderous on bandwidth with aniso.

You're suggesting PS5 run 8K and 16K textures respectively?
8192x8192 and 16384x16384?
     
    #1376 iroboto, Mar 30, 2020
    Last edited: Mar 30, 2020
  17. Rockster

    Regular

    Joined:
    Nov 5, 2003
    Messages:
    973
    Likes Received:
    129
    Location:
    On my rock
    Mesh shading should help improve saturation across a wider chip also.
     
  18. Rockster

    Regular

    Joined:
    Nov 5, 2003
    Messages:
    973
    Likes Received:
    129
    Location:
    On my rock
They probably won't actually calibrate each console in any way. The clock speed will be set based on the measured activity level of the CPU and GPU, which in turn is based on the power draw of a "model" unit at those activity levels. Power draw and heat will vary between consoles based on environmental factors and chip variations, but their clock speed and performance will be identical given the same compute load.
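A hypothetical sketch of that "model unit" idea: the console maps measured activity counters to a clock through a power model calibrated once on a reference chip, so every retail unit picks the same clock for the same workload. All names and numbers here are illustrative assumptions, not Sony's actual tables:

```python
POWER_BUDGET_W = 200.0  # assumed shared budget, not a real PS5 figure

def model_power(activity: float, clock_ghz: float, k: float) -> float:
    # Toy model: power scales with activity and roughly with clock^3
    # (frequency times voltage squared along a V/f curve).
    return k * activity * clock_ghz ** 3

def pick_gpu_clock(activity: float, budget: float = POWER_BUDGET_W,
                   k: float = 20.0, max_cghz: int = 223) -> float:
    """Highest clock (in 10 MHz steps) whose MODELED power fits the budget."""
    for cghz in range(max_cghz, 0, -1):
        clock = cghz / 100
        if model_power(activity, clock, k) <= budget:
            return clock
    return 0.0

# Same activity input -> same clock on every console, regardless of how
# hot or leaky the individual chip actually is.
print(pick_gpu_clock(0.5))  # -> 2.23 (light load stays at max clock)
print(pick_gpu_clock(1.0))  # -> 2.15 (heavy load downclocks deterministically)
```

The key property is that the chip's real temperature and leakage never enter the decision, which is what makes the behavior identical across units.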
     
    #1378 Rockster, Mar 30, 2020
    Last edited: Mar 30, 2020
    PSman1700 and TheAlSpark like this.
  19. Rockster

    Regular

    Joined:
    Nov 5, 2003
    Messages:
    973
    Likes Received:
    129
    Location:
    On my rock
Sony's statements would include profiling data for all games across its platform, all 4,000+. Out of those 4,000+, there may be relatively few "Doom Eternal"-type games from a utilization perspective, though we don't have those statistics. The "rarity of downclocking" statement could simply reflect the far greater number of indie and low-budget games released on the platform, versus actual usage or playtime in the games likely to be impacted by downclocking: the more optimized, higher-load AAA titles.
     
  20. Rockster

    Regular

    Joined:
    Nov 5, 2003
    Messages:
    973
    Likes Received:
    129
    Location:
    On my rock
Unlikely. At 10% CPU and 10% GPU utilization, I would expect both to run at 3.5 and 2.23 respectively. At some utilization level, each would start to downclock. If the CPU is at lower utilization and the GPU is reaching its threshold, SmartShift can kick in and provide additional power to the GPU so it avoids downclocking for longer. I question Alex's notion of developers actually having to choose a speed setting. It seems much more straightforward to simply let the system adjust dynamically in real time and optimize performance in a given scene as normal. The only drawback is mostly on the CPU side, where certain game systems may expect a fixed clock. Perhaps that's the element they allow the developer to define.
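A hedged sketch of that SmartShift-style behavior: a fixed total budget is split between CPU and GPU, and unused headroom on one side is shifted to the other before either is forced to downclock. The budget and base shares below are illustrative assumptions, not measured PS5 values:

```python
TOTAL_BUDGET_W = 200.0           # assumed shared CPU+GPU budget
CPU_BASE_W, GPU_BASE_W = 60.0, 140.0  # assumed default split

def allocate(cpu_demand_w: float, gpu_demand_w: float):
    """Return (cpu_w, gpu_w) granted under a shared power budget."""
    cpu = min(cpu_demand_w, CPU_BASE_W)
    gpu = min(gpu_demand_w, GPU_BASE_W)
    # Shift leftover headroom to whichever side is still power-starved.
    spare = TOTAL_BUDGET_W - cpu - gpu
    gpu += min(spare, max(0.0, gpu_demand_w - gpu))
    spare = TOTAL_BUDGET_W - cpu - gpu
    cpu += min(spare, max(0.0, cpu_demand_w - cpu))
    return cpu, gpu

# Light CPU load lets the GPU exceed its base share without downclocking:
print(allocate(30.0, 160.0))  # -> (30.0, 160.0)
# Both maxed out: each is capped near its base share (the "seesaw" case):
print(allocate(80.0, 160.0))  # -> (60.0, 140.0)
```

This matches the post above: SmartShift doesn't create power, it only moves headroom, so both peaks can't be sustained simultaneously when both sides demand more than the total budget.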
     
    Picao84 likes this.