Next Generation Hardware Speculation with a Technical Spin [pre E3 2019]

Discussion in 'Console Technology' started by TheAlSpark, Dec 31, 2018.

Thread Status:
Not open for further replies.
  1. see colon

    see colon All Ham & No Potatos
    Veteran

    Joined:
    Oct 22, 2003
    Messages:
    1,768
    Likes Received:
    746
We don't know this because we don't know what MS's intention for Lockhart is, and we don't know its specs. I'm an X owner, and a Pro owner, and I might be a Lockhart customer. The point is that Xbox One owners, the ones that own S and OG models, might not be upgrading to a system that's essentially a One X either. Why would they, especially if the used market or even the new market would have Xs available for less? There has to be a compelling feature to push people to upgrade, and traditionally that has been exclusive new software and shinier graphics. It's a lot harder to convince gamers that they need a faster CPU if your screenshots look the same.
     
    GrimThorne, HBRU and iroboto like this.
  2. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    12,365
    Likes Received:
    14,121
    Location:
    The North
    I second this.

I'd rather go a step further and introduce a concept that I don't think has been raised since the discussions of 2 consoles began.
Let's revisit the idea: what if MS axes Lockhart/Anaconda and goes with a single SKU?

They've not committed to anything official yet, so up until the point where they actually announce 2 separate Xbox devices, it could very well be 1 device and we wouldn't know any better.

This is why, after much thought, people trying to leak or guess MS's specs seems to me like a lesson in complete futility.
     
    #2342 iroboto, May 23, 2019
    Last edited: May 23, 2019
    milk likes this.
  3. vjPiedPiper

    Newcomer

    Joined:
    Nov 23, 2005
    Messages:
    106
    Likes Received:
    61
    Location:
    Melbourne Aus.
To me this looks like a pretty reasonable guess.
Especially if MS introduce the high end first, since early adopters will always pay more and it gets the title of "most powerful next gen console",
then release the Lockhart version 6+ months later....
     
    milk and see colon like this.
  4. DieH@rd

    Legend Veteran

    Joined:
    Sep 20, 2006
    Messages:
    6,270
    Likes Received:
    2,246
IMO, the only believable stuff came from Jason Schreier of Kotaku, who wrote that Sony aims to beat Stadia specs.
     
    HBRU likes this.
  5. see colon

    see colon All Ham & No Potatos
    Veteran

    Joined:
    Oct 22, 2003
    Messages:
    1,768
    Likes Received:
    746
The only way I see it working is if Anaconda launches first. High price, high specs, high performance. Then Lockhart launches later as the affordable midrange, the way PC graphics cards launch. Unless Anaconda launches 2 years after Lockhart and is the premium X-style system, but I see little reason for them to develop both at the same time. Part of X's winning recipe was that Microsoft looked at the software available, analyzed what was needed to run those games at 4K, and tuned the hardware to accomplish that goal. They would lose that insight if they design the hardware too early.
     
  6. Jay

    Jay
    Veteran Regular

    Joined:
    Aug 3, 2013
    Messages:
    3,245
    Likes Received:
    2,495
I've only seen rumours that are worth discussing for the sake of discussion and breaking down; that doesn't mean I think they were true.

You'd have to point me to this rumour you're thinking of and explain why it's credible, e.g. a known insider with multiple confirming sources.

I have no idea what TF they will be, maybe I'm lowballing it; it's more a guide to how they relate to each other.
Although I think 10TF isn't bad, not when games are coded to take advantage of the newer functionality that will be included: RPM, mesh shading, VRS, super quick asset streaming, ID buffer 2.0, Zen, etc.

This wasn't a rumor, it was him putting out his own opinion, which he clarified.
And even then it just says "aims"; since it was known what Stadia was, the only reason someone would say "aim" is if it's close. Otherwise it would be easy to say it's more powerful.
     
    #2346 Jay, May 23, 2019
    Last edited: May 23, 2019
    ToTTenTranz likes this.
  7. Jay

    Jay
    Veteran Regular

    Joined:
    Aug 3, 2013
    Messages:
    3,245
    Likes Received:
    2,495
That's not a reason to question Lockhart and the 2-tier approach.
That's the same question someone would ask about whether it's worth buying a PS5. Why not just buy a PS4 Pro?
Any new generation needs reasons to upgrade, and most of the time that will be the games and how different they are.
Although there are also QOL things to take into account, in the case of next gen: loading times, etc.

How will games look much better compared to 1X on PS5 or Anaconda?
The point is they won't, massively, until games are coded for them, at which point they will look better on Lockhart also.

Lockhart is more than a 1X with a better/upclocked CPU; it's a next gen console, with the pros and cons that go with having to sell that to people.
     
  8. Jay

    Jay
    Veteran Regular

    Joined:
    Aug 3, 2013
    Messages:
    3,245
    Likes Received:
    2,495
    Think I should have added a $450 discless Anaconda.
    That would make things pretty interesting.
I'm guessing there's a growing number of people who would now rather save $50 than have an optical drive.
     
    HBRU likes this.
  9. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,517
    Likes Received:
    4,572
    Location:
    Well within 3d
    Late followup on a few items:
    This might be one way AMD could leverage GCN's architecture to satisfy some of the objections Sony's audio engineer had to using the GPU for audio purposes, back in 2013.
    The diagram in the patent can be compared to the original GCN architecture diagram, where significant elements are in the same position and shape in both. What's stripped out of the proposed compute unit is most of the concurrent threads, SIMDs, LDS, and export bus.
    What's left is a scalar unit that runs a persistent task scheduling program that reads messages over a new queue and matches the queue commands to a table of task programs, then starts them executing.
    There's seemingly only one SIMD with a modified dual-issue ALU structure and a tiered register file. While there's no LDS, there's a different sort of sharing within the SIMD with a register access mechanism that allows for loading registers across "lanes" very easily, and a significant crossbar that can rearrange outputs in a programmed way. Some elements of the crossbar may be similar to the LDS, which automatically managed accesses between banks and handled conflicts.
    The vector register file is not allocated like standard GCN. Besides the different tiers, the execution model sets aside a range of global registers, and per-task allocations that are created and released in a manner similar to standard shaders.

    Once up and running, this CU would run a shader that essentially runs forever waiting to take host-generated messages directly, or read from a monitored address range. Rather than write to a standard GPU queue, have the command processor read it, engage a dispatch processor or the shader launch path, negotiate for a CU, go through the initialization process, set up parameters, wait for the CU to spin up, the host might be able to ping this custom unit with a series of writes or an interrupt. In the absence of CU reservation and real-time queues, GPUs can take tens of milliseconds, which Sony's audio group found unacceptable. Even with those measures, there's still a lot of the listed process that still has to happen to launch a shader on a reserved CU.
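The dispatch model described above can be sketched in software. This is a minimal illustration, not AMD's actual design: the scalar unit's persistent program is modeled as a loop that drains a message queue and matches command IDs against a table of task programs. All names and command IDs here are invented.

```python
# Toy model of a persistent task scheduler: the host "pings" the unit with
# small queue writes instead of going through the full command-processor
# dispatch path, and a resident loop matches commands to task programs.
from collections import deque

def make_scheduler(task_table):
    """task_table maps a command ID to a callable task program."""
    queue = deque()          # stands in for the new hardware message queue
    results = []

    def enqueue(cmd_id, payload):
        # A host write: just a command word and a payload, no shader launch.
        queue.append((cmd_id, payload))

    def run_until_idle():
        # The real scalar program never exits; here we stop when drained.
        while queue:
            cmd_id, payload = queue.popleft()
            task = task_table.get(cmd_id)
            if task is not None:
                results.append(task(payload))
        return results

    return enqueue, run_until_idle

# Usage: two toy "audio" task programs keyed by command ID.
enqueue, run_until_idle = make_scheduler({
    0x01: lambda x: x * 2,   # e.g. a gain stage
    0x02: lambda x: -x,      # e.g. a phase invert
})
enqueue(0x01, 10)
enqueue(0x02, 3)
```

The point of the pattern is that task launch cost is a queue write plus a table lookup, rather than the negotiate/initialize/spin-up sequence a normal shader dispatch requires.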

    Other objections were the generally large minimum concurrency requirements, where a CU's multiple SIMD architecture required at least 4 (or at least 8 realistically) wavefronts before it could be reasonably expected to get good hardware utilization, and Sony's HSA presentation indicated a hoped-for flexible audio pipeline that wouldn't need the batching of hundreds of tasks. This stripped-down CU would remove the extra SIMDs Sony wouldn't want to batch for, although it's not clear if there's still 64-element wavefront granularity or if that could somehow be reduced. Just doing that and reducing the amount of concurrent wavefronts could save a decent amount of area, perhaps a third. More area could be saved if the texture and load/store units could be reduced in size for a workload that didn't need as many graphics functions. Some space savings could be lost with the more complex register file and ALU arrangement.

The queue method also provides a different and more direct way to get many low-latency tasks programmed into a software pipeline in a way that isn't as insulated as an API, while the task programs can still abstract away the particulars of the CU.

    This could be appealing for one or more custom Sony audio units, or more so than the existing GPU TrueAudio setup.

    As for whether this could be relevant to Navi or a console, I did see one reference to shared VGPRs being added as a resource description for HSA kernels in changes added for GFX10.
    https://github.com/llvm-mirror/llvm...3380939#diff-ad4812397731e1d4ff6992207b4d38fa
Although similar wording in a single reference among many thousands of lines of code is slim evidence.


    One item to note is that the path between the unified last-level cache and the CUs supports at least several times the bandwidth of the memory bus, and the command processor, control logic, and export paths can all move values or signals amounting to hundreds to thousands of bytes per cycle to and from the CUs. Separating the CUs from their support infrastructure exposes all the on-die communications that had been considered internal to the die.
     
  10. chris1515

    Legend Regular

    Joined:
    Jul 24, 2005
    Messages:
    5,555
    Likes Received:
    5,303
    Location:
    Barcelona Spain
    #2350 chris1515, May 23, 2019
    Last edited: May 23, 2019
    Shortbread, megre, milk and 4 others like this.
  11. see colon

    see colon All Ham & No Potatos
    Veteran

    Joined:
    Oct 22, 2003
    Messages:
    1,768
    Likes Received:
    746
Except in the case of PS4 owners upgrading to the hypothetical 10TF PS5, you have a marketing opportunity that goes something like "2.5x faster than Pro" and some whizbang screenshots showing a PS4 game running at twice the resolution, or with higher quality settings. The hypothetical Lockhart I was responding to was 5.4TF (less than One X), 12GB of RAM (same as One X), and a 750GB SSD (less storage than One X, but faster), and maybe no disc drive. This would put its graphics power in the neighborhood of One X, constrained by the same amount of memory, and enhanced only by a much faster CPU. It's much harder to market a "next generation" box that is the same or worse in paper specs. Furthermore, I know TF aren't all the same, but we are talking about related hardware here. AMD GPUs haven't really shown a performance increase per flop over the last few generations. If a game runs even 10% slower on Lockhart than X, that's a non-starter for many people. If it runs the same, it's a non-starter for many people. It has to be better across the board.

Also, if the next generation is going to be real 4K, I think 5.4TF is too little, regardless of any upgrades in CPU. The 6TF GPU in One X is the best part of the system, and it's not really enough for 4K most of the time. And if you want to increase graphics fidelity, you need more memory and more GPU power, not the same or less. If Sony launches with 10TF, Microsoft will get destroyed if they launch a console with half the GPU power, unless it's half the price.
     
  12. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    12,365
    Likes Received:
    14,121
    Location:
    The North
All the more reason why, for MS, the debate should really be around Lockhart and not Anaconda. The base model determines the baseline performance and feature set and is meant to sell the majority of SKUs, yet discussion of Lockhart is surprisingly quiet; people only seem to care about Anaconda.

A grave misplacement of focus on our part, considering which SKU holds all the dice.
     
    Metal_Spirit likes this.
  13. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    22,057
    Likes Received:
    8,265
    Location:
    ಠ_ಠ
    Pft... on the internet. :p
     
    Picao84, Jay and iroboto like this.
  14. Jay

    Jay
    Veteran Regular

    Joined:
    Aug 3, 2013
    Messages:
    3,245
    Likes Received:
    2,495
So you're saying people are buying into the promise of better looking games until they come.
    How is that different than buying Lockhart knowing you will get better games?
    If someone wants to buy a previous generation console knowing that they will miss out on next gen games that's fine.
    Games aren't going to look much better, not at the start compared to 1X on any console.

Games won't run slower on Lockhart, which is why I said a 15% raw performance improvement; if it's less than 15% then it will be clocked higher to at least match. The graphics power in BC mode is at minimum the same as 1X, though not when running next gen games. TF is only a single part of what makes up a console's power (even for the GPU, which may facilitate RTRT and other graphical improvements like mesh shading, etc.), but everything else you glide over contributes to the system. CPU, SSD storage: these make a big impact on what is possible and on the experience.
    Normal consumers will not know or care how many TF Lockhart has.
The fact is Lockhart will be able to play games that the 1X can't, and play current games as well, with a better overall experience.

I don't see next gen as being about "real 4K", whatever that may mean to people.
But what you will get is better graphics, regardless of whether you buy a Lockhart or an Anaconda, once games are coded for them compared to 1X.
I also never said Lockhart is aiming for 4K.

I've got a 1X and I'm considering a Lockhart, so I'm sure people with base Xbox Ones will too, plus the list I gave of people I feel would be interested.

Mums and dads and people in general seem to be able to buy TVs, phones, fridges, headphones, you name it, all of which have multiple options, yet for a console with 2 they lose the power to comprehend?
    • Entry option
    • Premium option
    Either with or without an optical drive.

Consoles must be one of the very few electrical items where people will have trouble understanding entry and premium?

Models get replaced all the time in consumer goods, and consoles less often than 90% of them.

I'm not sure you're not overthinking it, to be honest.
     
    MBTP likes this.
  15. Jay

    Jay
    Veteran Regular

    Joined:
    Aug 3, 2013
    Messages:
    3,245
    Likes Received:
    2,495
In terms of marketing, it's always around the premium product; then they say "from $X".

The people following all this until release are generally interested in the premium product: which has the higher flops, PS5 or Anaconda?
Here we know that there's a bit more to it than that.
     
    #2355 Jay, May 23, 2019
    Last edited: May 23, 2019
  16. DieH@rd

    Legend Veteran

    Joined:
    Sep 20, 2006
    Messages:
    6,270
    Likes Received:
    2,246
    Interesting patents for offloading storage management to custom secondary hardware. With increased speeds, CPU utilization for managing file transfers can become noticeable.

    If they went with this, that would indeed deserve the moniker of a "custom SSD".
     
    mahtel and MBTP like this.
  17. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,576
    Likes Received:
    16,033
    Location:
    Under my bridge
    "Additional CPU." That one really could be Cell. Should be a few mm² at 7 nm, wouldn't need to be programmed by anyone other than Sony, and could do PS3 BC.

My thinking here is how to get a Cell in there for PS3 BC. Though it is tiny, it still needs to be justified. Just as PS1's hardware did useful stuff for PS2, PS3's CPU could be beneficial for PS5 as a file processor. Considering the ARM in PS4 is too crap to enable background downloading, a tiny Cell should handle all that gubbins much better. And the value to the system, "plays all your old games", would be substantial in marketing terms.
     
  18. flutter

    Newcomer

    Joined:
    Apr 25, 2019
    Messages:
    17
    Likes Received:
    19
Probably just an ARM processor, like in the PS4, although much, much better.

Do they even make Cell chips anymore? How is the TDP?

I was thinking that it would be a multipurpose processor that could also handle PSVR's asynchronous reprojection right in the box, as well as assist background functions, but I'm not sure if something like that can do everything.
     
    milk likes this.
  19. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    11,696
    Likes Received:
    6,579
    From that patent:

There it is. Custom hardware for faster decryption and decompression, as we've been saying would be needed if significantly lower loading times were ever to be achieved.
This is something that could take gaming PCs several years to catch up on. Maybe it can be done in software if people have 16+ cores and massive amounts of RAM to use as a storage scratchpad, but that would also require devs to write a massively parallel method for decompression/decryption.
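The software route above hinges on the data being split into independently compressed chunks, so that many cores can decompress without serializing on one stream. A minimal sketch, with chunk framing and sizes invented for illustration:

```python
# Chunked compression: each chunk is a self-contained zlib stream, so
# decompression of different chunks has no shared state and can run on
# separate workers.
import zlib
from concurrent.futures import ThreadPoolExecutor

def compress_chunked(data, chunk_size=64 * 1024):
    """Compress `data` as a list of independent zlib streams."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    return [zlib.compress(c) for c in chunks]

def decompress_parallel(compressed_chunks, workers=8):
    # Workers never contend: one independent stream per task.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return b"".join(pool.map(zlib.decompress, compressed_chunks))

payload = b"audio+geometry " * 10_000
restored = decompress_parallel(compress_chunked(payload))
```

The trade-off is a small compression-ratio loss (each chunk resets the dictionary) in exchange for near-linear scaling across cores, which is roughly what a many-core software fallback to dedicated decompression hardware would look like.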

AFAIK Cell was co-developed with IBM to be produced on their fabs. Can TSMC even produce a chip with an embedded Cell without entering into IP infringement?
And if so, could they do it without an incredible amount of man-months/years dedicated to significant re-engineering?
Wouldn't an 8-core 3.2GHz Zen 2 be able to emulate the Cell? With two 256-bit FMA units, each Zen 2 core has four times the theoretical floating point throughput of an SPE at ISO clocks (102.4 vs 25.6 GFLOPs at 3.2GHz), and this is obviously with much better utilization due to modern schedulers and much larger caches. Would Sony even need a Cell to emulate the PS3 at this point?
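The peak numbers in that comparison can be sanity-checked with simple arithmetic. A minimal sketch, counting an FMA as two ops per lane; these are theoretical single-precision peaks, not measured rates:

```python
# Back-of-envelope peak GFLOPs: clock * SIMD lanes * 2 ops per FMA * FMA units.
def peak_gflops(clock_ghz, simd_floats, fma_units):
    return clock_ghz * simd_floats * 2 * fma_units

spe  = peak_gflops(3.2, 4, 1)   # Cell SPE: one 128-bit (4-float) FMA pipe
zen2 = peak_gflops(3.2, 8, 2)   # Zen 2 core: two 256-bit (8-float) FMA units
```

A single 256-bit FMA unit already doubles an SPE's peak at the same clock; with both units a Zen 2 core is at 4x per core, before accounting for the utilization advantages of out-of-order scheduling and larger caches.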

    Using modern ARM cores would enable them to use them in standby mode with very low power utilization, whereas applying power-saving features to a Cell would again require massive engineering efforts.
    AMD is used to embedding ARM cores into their CPUs/APUs, but they haven't done anything remotely similar with Cell.

    And this is all assuming Sony would see a substantial market benefit in enabling PS3 BC into the PS5. What demand is there to play PS3 games that weren't already ported to the PS4?
     
  20. MrFox

    MrFox Deludedly Fantastic
    Legend Veteran

    Joined:
    Jan 7, 2012
    Messages:
    6,488
    Likes Received:
    5,996
As much as I'd love perfect compatibility for my PS3 games, the demand for it probably can't justify the engineering effort of shrinking the Cell down to 7nm. It's hundreds of millions invested for a few percent of gamers who would buy the PS5 anyway. The majority of must-have titles on PS3 have a remaster on PS4, which is profitable for Sony, and the public is generally receptive. It just sucks for the more obscure titles which will never get remastered, but again, those have no demand, and the few gamers who really care about old PS3 games still have their PS3.

With that said, they need a plan for PS Now and PS3 games: how would they scale access to this catalog of old games? Maybe the demand is so low that their current deployment is enough. Or server-side x86 emulation is sufficient despite being power hungry.
     

