What is a hardware compatible CPU? *spawn

Discussion in 'Console Technology' started by phoenix_chipset, Nov 15, 2017.

  1. phoenix_chipset

    Regular Newcomer

    Joined:
    Aug 26, 2016
    Messages:
    546
    Likes Received:
    246
    It is a new cpu, but it could also be considered the same cpu with added enhancements. Not very important semantics unless you're dying to be right.

    Good on you for proving yourself wrong. You should now consider that some developers who are trying to eke out every bit of juice they can from the PS4 (for example) with a first-party budget are using highly optimized code as well, and not with a "my opinion is far more likely than yours until you show me different" arrogance.

    Well, ok. I did. But it's highly probable for first-party games, especially later in the consoles' lives. But you're also assuming, and more so. Because if we can't know, my statement, "To be absolutely certain hardware is 100% compatible with older software without additional software workarounds, you have to have the same architectural design," is absolutely correct. The people designing the PlayStation 5 couldn't possibly know how every single developer wrote their PS4 code, just as we can't. You following me here? Do you have the ability to change your stance when confronted, or are you just going to keep power tripping?

    If not you might as well just ban me.
     
  2. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    14,212
    Likes Received:
    5,651
    What does "to the metal" CPU code even look like? They use LLVM/Clang, x86. If they kept the same OS/APIs, I have a hard time imagining why a newer x86 CPU would be a big issue.
     
  3. MrFox

    MrFox Deludedly Fantastic
    Legend Veteran

    Joined:
    Jan 7, 2012
    Messages:
    6,488
    Likes Received:
    5,995
    You're pulling off a Leeroy Jenkins here.
     
    function likes this.
  4. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    14,212
    Likes Received:
    5,651
  5. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    12,793
    Likes Received:
    8,180
    Location:
    London, UK
    Most commonly, this is a reference to writing code using assembly language rather than a compiler.
     
    iroboto likes this.
  6. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    14,212
    Likes Received:
    5,651
    Yah, but it's x86 assembly ... not CPU specific like microcode.
     
  7. MrFox

    MrFox Deludedly Fantastic
    Legend Veteran

    Joined:
    Jan 7, 2012
    Messages:
    6,488
    Likes Received:
    5,995
    I guess it changes nothing from a compatibility standpoint, since the opcodes used are based on the compiler target. Say, using AVX opcodes in assembly or enabling the AVX flag in the compiler build gives the same end result.
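    A toy sketch of that point (Python, purely illustrative; the opcode and feature names are made-up stand-ins for real x86/AVX encodings): from the CPU's perspective a binary is just a stream of opcodes, and it makes no difference whether an AVX instruction got there from hand-written assembly or from the compiler's AVX flag. The CPU either supports the opcode or faults.

```python
# Toy model (NOT real x86): a "binary" is a list of (opcode, required_feature).
def run(binary, cpu_features):
    """Execute a toy binary on a CPU described only by its feature set."""
    for opcode, needed in binary:
        if needed is not None and needed not in cpu_features:
            raise RuntimeError(f"illegal instruction: {opcode} needs {needed}")
    return "ok"

hand_asm  = [("vaddps", "avx"), ("mov", None)]   # programmer wrote AVX by hand
from_flag = [("mov", None), ("vmulps", "avx")]   # compiler emitted AVX via a build flag

jaguar_like = {"sse2", "sse4.2", "avx"}          # AVX-capable CPU (Jaguar supports AVX)
ancient     = {"sse2"}                           # hypothetical AVX-less CPU

print(run(hand_asm, jaguar_like))                # runs fine either way
print(run(from_flag, jaguar_like))
```

Either binary faults identically on the AVX-less CPU: the provenance of the opcode (assembly vs compiler flag) is invisible to the hardware.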
     
  8. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    10,864
    Likes Received:
    10,951
    Location:
    The North
    The only thing I can think of is perhaps race conditions. But I haven't heard of that type of thing happening in a CPU.
     
  9. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,577
    Likes Received:
    16,029
    Location:
    Under my bridge
    It's not about being right but understanding what everyone's talking about. If one person using the word "CPU" means something different by it than another person using the same word, the discussion will always be at odds. That's why clarification is important.

    I wrote, "To my mind, devs are no longer hitting the CPUs that hard, but I may be wrong on that." That's a clear acknowledgement of an assumption. Not even an assumption, but a theory. You stated your POV as factual and that the alternative is ridiculous. You assumed you were right; I presented a belief that I recognised could be wrong.

    Yes and no. Technically, yes, to be 100% certain. But at the same time, the CPUs are designed to be perfectly compatible with old code. That's the very basis of a persistent, compatible architecture. Where you're not right is saying that we can be 100% certain that a second CPU is absolutely necessary to play the back catalogue. We don't know if devs are or are not accessing the CPU at such a low level as to make CPU compatibility an issue. For all we know, Sony have mandated against the use of inline intrinsics etc. for just that purpose. We don't know if the same code running on Zen would even fall over as, despite being a different implementation of x64 from Jaguar, AMD may have done a thorough job of implementing compatibility. In truth, to be absolutely certain your hardware is 100% compatible with older hardware, you'd have to include the older hardware, because any change introduces the possibility of a fault. Even PS2 had hardware incompatibilities within the same product due to different hardware implementations. So it seems reasonable to me to consider '100% BC' to be less than 100% identical as long as there's a very high probability of compatibility because the hardware is designed to be compatible.

    I've half a mind to. You've asserted yourself rudely, failed to adjust to conversations (not giving well defined answers), turned a polite discussion into a fight, and continue to be argumentative, incapable of saying, "sorry," for being so crass with your original attitude that started all this - even after accusing me of starting it, and then getting a lengthy response pointing out the progression of events which you've now just fobbed off by claiming I'm on a power trip.

    I was happily discussing the next-gen consoles. I was interested in discussing whether the CPU could be a problem even if the same x64 architecture, open to the possibility of you being someone knowledgeable and able to provide informed insight, only for you to make unjustified assertions and then ridicule those with different ideas as they tried to discuss the point. You have three options now:

    1) Start engaging in a decent discussion, including clarifying points and definitions where there's uncertainty so everyone's talking about the same thing, using technical reference where available, and acknowledging when a viewpoint is just a theory or feeling instead of asserting it as if fact or where one's knowledge isn't that strong.

    2) Walk away in a huff because Beyond3D is a crap forum run by Nazi tyrants.

    3) Have another volley of pointing out how wrong and stupid and overbearing and up-himself I am prior to a ban.
     
  10. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,577
    Likes Received:
    16,029
    Location:
    Under my bridge
    Getting back to the question, what is a hardware compatible CPU? I've always taken it to be a CPU in the same family made to be compatible, executing the same source code. Conceptually, there could be faults in implementation, but otherwise you can rely on a later version of a processor to run the legacy code without issue, certainly for an iteration or two of the family. The fastest-changing CPU space at the moment is probably ARM, with a crazy number of CPUs from different IHVs all running the same code. We do get compatibility issues even with hardware abstraction on Android, but then how much is CPU incompatibility and how much is the rest of the system?
     
  11. kalelovil

    Regular

    Joined:
    Sep 8, 2011
    Messages:
    558
    Likes Received:
    95
    http://www.agner.org/optimize/blog/read.php?i=838
     
    function and Shifty Geezer like this.
  12. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,947
    Likes Received:
    2,690
    My first thought is: any CPU that is capable of running the same code unmodified. And, generally, it's enough that it's a CPU that shares the same instruction set(s). So, yeah, pretty much what you and most others seem to think.
     
  13. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    14,212
    Likes Received:
    5,651
    I would say that in black box terms, hardware compatibility between CPUs would mean they can read the same machine code and transform the same input data into the same result. There may be performance differences because of how the black box is implemented, but it ultimately works.
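    That black-box definition can be sketched with a toy ISA (Python, purely illustrative): two "CPUs" with completely different internals read the same machine code and must transform the same input into the same result, which is exactly the compatibility contract, with internals free to differ for performance.

```python
# Same "machine code" fed to two different implementations of a toy stack ISA.
PROG = [("push", 2), ("push", 3), ("add", None), ("push", 4), ("mul", None)]

def cpu_a(prog):
    # implementation A: straightforward if/elif decode
    stack = []
    for op, arg in prog:
        if op == "push":
            stack.append(arg)
        elif op == "add":
            stack.append(stack.pop() + stack.pop())
        elif op == "mul":
            stack.append(stack.pop() * stack.pop())
    return stack[-1]

def cpu_b(prog):
    # implementation B: dispatch-table decode (the "microarchitecture" differs)
    stack = []
    ops = {"push": lambda a: stack.append(a),
           "add":  lambda _: stack.append(stack.pop() + stack.pop()),
           "mul":  lambda _: stack.append(stack.pop() * stack.pop())}
    for op, arg in prog:
        ops[op](arg)
    return stack[-1]

print(cpu_a(PROG), cpu_b(PROG))  # both 20: (2 + 3) * 4
```

As long as both boxes produce 20 for this program (and every other program), they are hardware compatible in the black-box sense, however differently they get there inside.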
     
    mrcorbo likes this.
  14. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,577
    Likes Received:
    16,029
    Location:
    Under my bridge
    Yeah, but in the console space there is the concern that low-level access will require stuff like the same timings. AFAIK this hasn't been a real issue for a couple of generations, but that's mostly a gut feeling. I was shocked that GPU differences could still have such an impact and that you couldn't just replace the GPU in PS4 with a more recent design, with Sony instead doubling up the same hardware. Would be nice to know for sure what options and limits MS and Sony have for the CPU, though I'm happy to operate under the assumption that any x64/Zen will work just fine. And that's even if Sony care for BC! :runaway:
     
  15. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,947
    Likes Received:
    2,690
    I suspect that the move to massively multi-threaded engines and out-of-order execution in CPUs have made strict timing of code execution less of a possibility.
     
  16. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    14,212
    Likes Received:
    5,651
    But GPUs don't standardize their ISA like CPUs do with x86 and AVX, or at least that's how I'd understand it. GPUs standardize on an API, and the underlying ISA changes significantly between families of GPUs. So if you optimize code for a specific GPU ISA, you're out of luck running the same code on a new GPU if the ISA changes. So Intel and AMD CPUs understand the same machine code, but Nvidia and AMD GPUs do not, and AMD/Nvidia GPUs do not even understand the same machine code across families of GPUs.

    Someone can correct me, as I'm no expert.
     
    iroboto, Shifty Geezer and mrcorbo like this.
  17. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,577
    Likes Received:
    16,029
    Location:
    Under my bridge
    That's right. I just didn't appreciate how alien one GPU in a 'family' could be from its predecessor, but of course it's pretty obvious when you think how they need constant driver changes to hang it all together in the PC space. We were talking about GCN 2 and GCN 3, and the implication was similar to Pentium 3 and Pentium 4: that GCN 3 can run GCN 2 stuff just fine. Not at all! There isn't such a thing as a clear GCN 2 demarcation, with similar architectural parts having differences enough to make them incompatible. That possibly speaks volumes about how much effort CPU makers put in to add improvements without screwing around with compatibility.
     
  18. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    12,793
    Likes Received:
    8,180
    Location:
    London, UK
    Microcode in x86 is used sparingly. You can't write any semi-complex programme using only microcode; ergo, the closest an OS or application developer can get "to the metal" is assembly language. Microcode is very rarely compatible across processor generations either; Intel write bespoke microcode for their CPUs.

    It's smoke and mirrors. If you cast your mind back across 80x86 over the decades, the core architecture, which began as what we now term conventional CISC, has veered into RISC and VLIW and now remains an abominable mishmash of all three. There is absolutely nothing standard about the Instruction Set Architecture, only the Instruction Set, and even then new x86 instructions are introduced frequently while old instructions are deprecated and handled by processor firmware or just dropped completely.

    The fact that you as an application or OS developer don't need to worry too much about this is testament to Intel and AMD's designs. Remember XOP? Of course you don't; it was introduced in Bulldozer and removed from Zen. There are plentiful examples of this. :yep2:

    CPUs really aren't more standard than GPUs; it's simply that the abstraction between application and hardware is handled at OS/API level for GPUs and by the processor itself for CPUs.
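    The way software survives that instruction churn is worth sketching: libraries and compilers probe the CPU's features at run time and dispatch to whichever implementation the hardware actually supports (GCC calls this function multiversioning). A toy Python sketch of the pattern, with illustrative feature names rather than real CPUID bits:

```python
# Runtime dispatch sketch: pick an implementation based on detected CPU features.
# Feature names here are illustrative; real code would query CPUID.

def dot_scalar(a, b):
    # baseline version: runs on anything
    return sum(x * y for x, y in zip(a, b))

def dot_avx(a, b):
    # stand-in for a vectorised version; same contract, different "opcodes"
    return sum(x * y for x, y in zip(a, b))

def select_dot(cpu_features):
    """Probe once at startup, then always call the chosen implementation."""
    return dot_avx if "avx" in cpu_features else dot_scalar

dot = select_dot({"sse2", "avx"})
print(dot([1, 2, 3], [4, 5, 6]))  # 32 either way: the contract, not the ISA, is fixed
```

If a later CPU drops a feature (as Zen dropped XOP), the probe simply stops selecting that path and the program keeps working, just on the slower fallback.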
     
    #38 DSoup, Nov 18, 2017
    Last edited: Nov 18, 2017
    BRiT likes this.
  19. Pixel

    Veteran Regular

    Joined:
    Sep 16, 2013
    Messages:
    1,002
    Likes Received:
    472
    So developers using lower-level APIs like GNM aren't going to throw roadblocks into backwards compatibility if they move to new CPUs (Ryzen, Ryzen+, etc.) or GPU architectures that might appear in next-gen consoles (Navi's successor?)?
    http://www.eurogamer.net/articles/digitalfoundry-how-the-crew-was-ported-to-playstation-4

     
  20. BRiT

    BRiT Verified (╯°□°)╯
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    16,006
    Likes Received:
    14,990
    Location:
    Cleveland
    It all depends on whether the newer SoCs ever remove instructions or change their actual effect. On the CPU side I don't think AMD removed instructions when creating new CPUs, but some instruction timings could have taken longer to execute. On the GPU side Sony was very careful to use a Frankenstein GPU approach on the 4Pro, using the PS4 GPU and only bolting on very specific new instructions (double-rate FP16 and gradients) while retaining all the old.

    If you look through AMD GPU tech papers for GCN, just glance through the changes section. Any time they indicate instructions have been removed, that could lead to extensive trouble in a console setup like Sony's, where there isn't as much of a layer of abstraction between the games and the hardware.
     
    Pixel likes this.