Forbes: AMD Created Navi for Sony PS5, Vega Suffered [2018-06] *spawn*

Discussion in 'Graphics and Semiconductor Industry' started by BRiT, Jun 12, 2018.

  1. Geez, the level of salt in this thread, starting with whoever wrote that title, is way off the charts.

    Ease up people. No one here ever remotely suggested Sony's engineering is better than the others'.
    There's no need to feel threatened by any of the rumors/news/comments, nor to kneejerk react in accordance. No one is saying "the PS5 is gonna CRUSH XBOXTwo because Navi magic saucezz!!111oneone."

    No, Sony is not designing Navi. But Sony is in a position to be more demanding about the details of their PlayStation SoC than Microsoft, because they have IC design teams of their own, who even to this day are designing custom SoCs and image processors.
    And this does not mean the PS5 has an intrinsic advantage versus xboxtwo.
    In fact, the last time Sony got deeply involved in processing hardware (Cell) Microsoft actually got away with a much better deal out of it (Xenon).
    Sony demanding XYZ features for Navi doesn't mean Microsoft won't get access to it, if they find it to be worth the money/transistors/die-area, and the same could even work in reverse.


    Dude.. Jason Evangelho was a Technical Marketing Specialist for RTG in 2016-2017. Is it that hard to believe he has more than a couple of people in his contacts list with insider knowledge?

    His article (which mostly reports on RTG being short on staff while working on Vega because Lisa Su redirected their engineers to work on Navi for PS5) falls perfectly in line with RTG's slow cadence of GPU releases, Raja leaving RTG on a lower note and even tidbits that @digitalwanderer mentioned here in the forum about the whole saga. I have very little reason to believe the guy pulled those things out of his ass.


    Which ones? Care to source?
    All I can find are news about Microsoft trying to hire A.I. hardware engineers as recently as June of last year, which was succeeded by reports of Microsoft trying to buy A.I. hardware from Huawei a couple of months later.
    What GPU did Microsoft work on?


    Claiming Sony shouldn't use their GPU talent for their benefit because the PS2 wasn't the most powerful console of the 6th generation, or Cell didn't work out great for videogames is a huge strawman.
    Cell was an ambitious (therefore very risky) project that tried to fuse CPU and GPU designs and it mostly failed, sure.
    But the PS2's hardware was anything but a failure.

    The Graphics Synthesizer (53 million transistors at 150MHz) was originally developed for the 250nm process, at a time when SDR RAM was very slow (the best they could do was 3.2GB/s), so it needed eDRAM, taking up die area and transistors, to reach its performance target. So yeah, it was roughly the equivalent of Xenos' backend+eDRAM daughter die, and everything else had to be done on the CPU's vector units.
    The NV2A (57 million transistors at 250MHz) was a 150nm chip, so it could clock significantly higher, and it came out after DDR became available, which gave it a significant bandwidth advantage, forgoing the need for eDRAM. This is akin to the PS4 GDDR5 vs. XBOne DDR3+eSRAM debacle. Because of that, it could spend transistors on all the rest (pixel and vertex shaders, T&L unit) while keeping costs under control.

    And despite all that, the Emotion Engine + Graphics Synthesizer combo proved to be excellent at scaling down in cost and power (much more so than the Xbox, whose CPU and GPU were made by different vendors), and the cost advantage allowed it to sell more than 150M units.
    The latest Slimline PS2 with the unified EE+GS chip was an awesome piece of hardware for its time, IMHO.
    There's nothing in the PS2 that Sony should be ashamed of.
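    The bandwidth figures in the post come straight from bus width times transfer rate. A quick back-of-the-envelope sketch, using the commonly cited (approximate) specs for both machines:

```python
# Peak memory bandwidth = bus width (bytes) x transfers per second.
# The specs below are the commonly cited, approximate figures.

def peak_bandwidth_gbs(bus_bits: int, transfers_per_sec: float) -> float:
    """Peak bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return (bus_bits / 8) * transfers_per_sec / 1e9

# PS2 main memory: dual-channel 16-bit RDRAM at 800 MT/s effective.
print(f"PS2 RDRAM: {peak_bandwidth_gbs(2 * 16, 800e6):.1f} GB/s")  # 3.2 GB/s

# Xbox unified memory: 128-bit DDR at 200 MHz (400 MT/s).
print(f"Xbox DDR : {peak_bandwidth_gbs(128, 400e6):.1f} GB/s")     # 6.4 GB/s
```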


    What's there to differ? What I see in there is a couple of projects where IHVs did hardware design while Microsoft worked on the software implementation.
     
  2. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Pixel and vipa899 like this.
  3. vipa899

    Regular

    Its success didn't have much to do with its hardware; it was the worst-performing console and the hardest to code for.

    And still this fantasy that AMD designs their GPUs for Sony, lol.
     
  4. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Hololens :?:

    Anyways, I suppose it depends on how stringent about semantics you want to be. They did ask for a number of customizations in Durango and Scorpio (command processor/DX12-related + Jaguar bits + backward-compatibility things), but they have admitted to designing SHAPE. On a side note, it'll be interesting to see where that goes next time, since AMD has seemingly shifted away from dedicated audio-acceleration HW in their desktop cards to reserving compute units since Polaris.

    I'd be curious if there's enough work on the audio side of things that it'd be worthwhile to allocate the finite GPU resources on console and try to fit it into the compute queues of more modern engines or if they might need a lot more work (a.k.a. no time for developers to bother with).
     
    Deleted member 13524 and BRiT like this.
  5. Where exactly is this assertion?

    The only assertion that is being repeated is Navi getting input from Sony and Lisa Su redirecting RTG engineering efforts to respond to that input.

    It was the earliest hardware, at a time when GPUs were going through giant leaps every single year (1999: S3TC/DXTC and bump mapping; 2000: T&L and multitexturing; 2001: pixel & vertex shaders).
    Everything was hard to code for back then, and if the PS2 had really been that hard, it would have fallen like the Saturn did.


    Yes, god forbid that AMD would believe the world's number one seller of gaming GPUs (Sony, with >90 million PS4s sold) is important enough to design a GPU around their recommendations.
    lol
     
    #25 Deleted member 13524, Jan 29, 2019
    Last edited by a moderator: Jan 29, 2019
    MBTP likes this.
  6. milk

    milk Like Verified
    Veteran

    I can understand the adverse reaction to people who extrapolated from that Forbes comment that Sony is pretty much doing most of the designing for Navi, which is a simplistic and stupid extrapolation to make. But to then say Sony's know-how has absolutely no value for AMD is equally simplistic and stupid.
     
  7. And it's the complete opposite of what the Forbes article actually says:

    Created for, not created by.
    And if Sony were designing Navi, then AMD wouldn't need to send 2/3rds of their engineers to work on it.
     
  8. milk

    milk Like Verified
    Veteran

    What I meant is that some people took that Forbes comment and ran with it, assuming much more out of it than it deserved.
     
    MBTP, vipa899 and BRiT like this.
  9. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Which is kind of shaky, TBH. What if the input that specifically caused the reallocation of engineering resources was just, "Here is our launch timetable and we need you to hit these development milestones within this timeframe."? If the project was otherwise in danger of slipping, that alone could account for a re-allocation of engineering resources, no?

    FWIW, I think Sony's feedback and input will have clearly influenced the design of Navi. I just think the impact of that influence on the base architecture is generally overestimated by Sony enthusiasts. Where they will have a large role is in the development of the specific implementation of Navi that will be part of the PS5's SoC.
     
    MBTP, vipa899, BRiT and 2 others like this.
  10. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    So probably RTG will have their own roadmap of things that they are working on or planning on implementing - the timetable/schedule just depends on that allocation. If a client wants something sooner - bam - they can let the Semi Custom folks know, then they can communicate that back to RTG.

    If they want something totally custom - the client would ask for it (see various MS/Sony doohickies), but if it's otherwise unpatentable by the client, for example, it's probably something already on the agenda at AMD (or just a simple configuration of existing IP blocks), and again, it's just about timetable/priorities/resources. We can give credit where credit is due in the literature because that's less disputable as to who came up with the idea and specific implementation.

    There's little point in making leaps and conclusions otherwise as to who is responsible for a tech. We can leave the gold star stickers in the teacher's drawer back in Kindergarten. There are more interesting tech things to discuss than who did what here, and it's disappointing to get so hung up on these lines of discussions here.
     
    #30 TheAlSpark, Jan 29, 2019
    Last edited: Jan 29, 2019
    Silent_Buddha and mrcorbo like this.
  11. DmitryKo

    Regular

    In the context of Sony's supposed superior knowledge of dedicated 3D graphics hardware, I must say it again: they have none, since their previous in-house implementations of the 3D graphics pipeline were based on 3rd-party processors with SIMD vector extensions.

    You may call it flexible, forward-thinking, etc., but I'd think they really did not have a clear vision of 3D graphics hardware going forward (or relied too much on Japanese arcade developers still rooted in the 8-bit era). All those hair and facial-expression demos looked good at E3, but they never found their way into production games except for pre-computed cut-scenes. And emulation even made those PS2 games look better thanks to advanced texture filtering, which is only possible with dedicated hardware.
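    As an aside, the "advanced texture filtering" in question boils down to per-sample blends like bilinear interpolation, the kind of work dedicated texture units do in fixed function. A toy sketch (the function name and the tiny texture are made up for illustration; real hardware also handles wrap modes, mipmaps, anisotropy, etc.):

```python
# Minimal bilinear texture filtering: blend the four nearest texels
# using the fractional position of the sample point.

def bilinear_sample(texture, u: float, v: float) -> float:
    """Sample a 2D list-of-lists 'texture' at normalized (u, v) in [0, 1]."""
    h, w = len(texture), len(texture[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    # Horizontal blends on the two rows, then a vertical blend between them.
    top = texture[y0][x0] * (1 - fx) + texture[y0][x1] * fx
    bot = texture[y1][x0] * (1 - fx) + texture[y1][x1] * fx
    return top * (1 - fy) + bot * fy

tex = [[0.0, 1.0],
       [1.0, 0.0]]
print(bilinear_sample(tex, 0.5, 0.5))  # 0.5, the average of all four texels
```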
     
    vipa899 likes this.
  12. DmitryKo

    Regular

    Nope, it was the exact opposite. Microsoft first designed a new, incompatible API and then drew up very specific reference hardware requirements for that API, which was essentially a form of hardware-assisted sprite rendering optimized for transforming and compositing multiple compressed 2D images into a tiled framebuffer, all in an effort to reduce the memory bandwidth requirements and computational complexity of the traditional 3D graphics pipeline.

    And then came IHVs like 3Dfx and NVidia who basically brute-forced the complexity and bandwidth with dedicated triangle setup hardware and fast dedicated memory, building their hardware around traditional OpenGL rasterisation pipeline - so Microsoft had to abandon their concepts of Talisman, retained mode and execute buffers, and implement DrawPrimitive in Direct3D 5 and multi-texture in Direct3D 6 to make full use of the new graphics cards. And then they had to implement hardware T&L in Direct3D 7, so they never even bothered to actually work on Fahrenheit Low Level...
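    For the curious, Talisman's core idea, compositing independently rendered 2D image layers instead of re-rasterizing the whole scene every frame, can be sketched with the standard premultiplied-alpha "over" operator. This is a hypothetical toy version (real Talisman also warped and compressed each layer before compositing):

```python
# Back-to-front compositing of 2D image layers with the Porter-Duff
# "over" operator, the operation Talisman's compositor was built around.
# Single-pixel toy example; a real compositor does this per pixel, per tile.

def over(src_rgba, dst_rgb):
    """Composite a premultiplied-alpha src pixel over an opaque dst pixel."""
    r, g, b, a = src_rgba
    dr, dg, db = dst_rgb
    return (r + dr * (1 - a), g + dg * (1 - a), b + db * (1 - a))

framebuffer = (0.0, 0.0, 0.0)   # start from a black background
layers = [
    (0.2, 0.0, 0.0, 0.5),       # farthest layer, half transparent
    (0.0, 0.5, 0.0, 0.5),       # nearer layer
]
for layer in layers:            # composite back-to-front
    framebuffer = over(layer, framebuffer)

print(framebuffer)
```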
     
    #32 DmitryKo, Jan 29, 2019
    Last edited: Jan 29, 2019
  13. vipa899

    Regular

    Desperation perhaps, I don't know. AMD designing a chip for Sony; the fantasy train has to stop somewhere.

    Not sure if you're just being ironic there, but the margins are much smaller in the console market. The PS4 APU, with 90 million units, is not the number one, I'm afraid; it's probably a mobile GPU that's number one, not a PlayStation with 90 million over 7+ years (of the same old 2012 tech).

    Maybe it should have; we would have seen better-looking games over the 6-year lifetime, and devs would have had an easier time.
    The PS2 was, in a way, a super PSX.
     
  14. Ike Turner

    Veteran

    Pete, Anarchist4000, egoless and 3 others like this.
  15. MrFox

    MrFox Deludedly Fantastic
    Legend

    I was promised a gold star sticker for good behaviour.
     
    TheAlSpark and vipa899 like this.
  16. vipa899

    Regular

    I don't get why this is so important to the Sony camp. If AMD designs the GPU, it will be better than if Sony does it. I want a high-end AMD Arcturus in my PC by the time the next gen arrives, and I sure hope there's nothing Sony in it, given their track record.

    It's just a console, primarily for playing games, a toy; it doesn't matter that AMD designs the APU.
     
  17. Putas

    Regular

    A killer app cannot target hardcore gamers. So it was just another game running fast enough. Plus, Nvidia and Rendition were unlikely to use miniGL for Quake.

    Which device are you referring to regarding the hardware performance of Talisman?
     
  18. Silent_Buddha

    Legend

    Which...

    And since there's no actual attributable quote, there's no way to know if the "source" was just generalizing all console efforts into a pool, with the article writer then assuming it meant Sony/PS.

    Considering no other industry source has corroborated what Forbes has said...

    It's far more likely that the shift in resources was a response to ALL contracted console manufacturers wanting custom parts, and not just Sony.

    Especially if you consider that at the time this all supposedly happened, Microsoft was by far AMD's largest customer for custom console graphics. They were a proven partner with a proven track record and had just basically matched Sony console for console.

    In other words, past history said that Microsoft was a better bet for profitability for AMD than Sony.

    Either way, had ANY other reputable tech sites corroborated what Forbes wrote with their own sources, then it'd at least have a tiny bit of credibility that Sony was commanding that amount of RTG resources at the expense of other AMD partners.

    But there isn't. So people are left grasping at straws... no wait... grasping at one straw to justify Sony being the prime beneficiary of that much of RTG's engineering resources. When it's far more plausible that that amount of engineering resources was in fact reallocated to all custom console requests (Sony, MS, and whoever else wants custom SoCs).

    No-one disputes that Sony have asked for and gotten customizations that are specific to their implementation. But so has Microsoft and presumably any other semi-custom partners that they have.

    Regards,
    SB
     
    #38 Silent_Buddha, Jan 29, 2019
    Last edited: Jan 31, 2019
    vipa899 likes this.
  19. vipa899

    Regular



    Just like DF says, don't believe all the BS. The fantasy train has to stop; people might really start to believe in things and be massively disappointed.
     
  20. 3dilettante

    Legend Alpha

    Sony's investment into the Cell initiative represented a high-water mark for its microelectronics division, and its failure devastated Sony's manufacturing and design ambitions and damaged Sony overall.
    A whole leading-edge fab was built and then had to be sold because of the unrealized demand and Sony gave up on competitive logic processes, though I believe Sony bought it back more recently to manufacture camera image sensors.
    In the period prior to that, there were job losses and a lack of projects at that level of logic complexity and manufacturing node.

    If we are to believe Sony had some of them employed on the PS3's graphics capabilities prior to resorting to Nvidia, that would leave nearly 15 years of Sony's graphics architecture resources not being heavily utilized--given the very long PS3 generation and the predominantly third-party PS4+Pro.
    I wouldn't expect Sony to keep most of them circling around for their chance to tweak the margins of AMD's architecture in 2019/2020.
    I suppose we're aware of AMD's collaborating with Mark Cerny, or at least giving him enough things to fiddle with to make him happy. It might not need that much of a resource investment, and if a good chunk is from Cerny it needs less of whatever resources Sony had over a decade prior.

    What is "re-taping"?
    Tape-out is a specific step in finalizing a chip and sending it to the manufacturer, with all design elements and layout final.
    A re-spin is sending a round of wafers through manufacturing, possibly with adjustments in manufacturing process or minor mask revisions to correct for bugs or faults.

    What you're describing seems more like a re-design or the development of a revision of the microarchitecture. That's a more expensive proposition, and it is also not entirely unexpected in the sense that AMD has multiple revisions of its architectures across different physical chips. Within its own products, a differently-sized chip made at a different time tended to have slightly different point revisions of IP blocks, and the semi-custom designs similarly had their own variations.

    Is the implication that Sony would want AMD to sell its specific silicon as well? Otherwise, there's going to be a different implementation for AMD's products regardless.
     
    MBTP, mrcorbo, Silent_Buddha and 2 others like this.
  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.