MEME: Sony engineering better than everyone else *cleanup*

Discussion in 'Graphics and Semiconductor Industry' started by w0lfram, Jan 28, 2019.

  1. w0lfram

    Newcomer

    Joined:
    Aug 7, 2017
    Messages:
    140
    Likes Received:
    27
    If Navi is built for gaming, then perhaps its focus is on FP32 and actual geometry crunching, not a hand-me-down from the business sector. I am sure AMD's work with SONY gave them plenty of "re-taping" ideas to play with (better game compute ratios).
     
  2. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,036
    Likes Received:
    1,732
    Location:
    Finland
    Seriously, wtf is up with all this "with Sony" nonsense? One damn rumor and people are still going around talking like Sony is designing the damn thing.
    Sony is no different from Microsoft or any other semi-custom partner. Sure, they share ideas which may or may not affect the architecture in the development phase, but they deal with the semi-custom department, and the semi-custom department gets its hands on the architectures when they're ready to be implemented in those semi-custom designs.
     
  3. Rootax

    Veteran Newcomer

    Joined:
    Jan 2, 2006
    Messages:
    1,103
    Likes Received:
    533
    Location:
    France
    One beneficial aspect, for AMD, of working with Sony and MS on consoles is, I guess, having more data to see what devs want, how they want to do stuff, what the bottlenecks of their chips are, etc. Don't get me wrong, they already have that internally and on PC, but being on console puts that on another level.
    And if Sony and MS "pick" or "ask" for Y and Z instead of X and W, in a sense, it can help AMD focus on the "useful" stuff, for gaming...
     
    milk likes this.
  4. Geeforcer

    Geeforcer Harmlessly Evil
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,297
    Likes Received:
    464
    Because nothing makes you an authority on contemporary GPU design quite like having the worst graphics subsystem among comparable consoles 20+ years ago; then failing at GPU design so hard 15 years ago that you had to rush out and buy a practically off-the-shelf part, lacking the time to even commission a true semi-custom one; and then not bothering to even try ever since.
     
    Pixel, Kej, egoless and 5 others like this.
  5. milk

    Veteran Regular

    Joined:
    Jun 6, 2012
    Messages:
    2,828
    Likes Received:
    2,335
    Sony might not have a great track record in actually developing good graphics silicon, but the experience they have in AAA low-level graphics and game code is unmatched in the PC space, and arguably also on Xbox. Don't write off SONY's ability to provide AMD useful input.
     
    #5 milk, Jan 28, 2019
    Last edited: Jan 29, 2019
    Shifty Geezer, Clukos and pharma like this.
  6. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,669
    Likes Received:
    4,328
    Hum... I'd say Sony is very different from Microsoft.
    With the PS2 and PSP they developed the whole thing in-house, using MIPS IP cores and their own GPUs/vector FPUs. In the PS3 they co-developed Cell with Toshiba and IBM, which was supposed to power the whole thing.

    Microsoft, by contrast, never developed processing hardware in-house, and AFAIK all Nintendo graphics solutions have been developed by Silicon Graphics / then-ArtX / then-ATI / then-AMD (and now Nvidia). I think the only in-house graphics solution from Nintendo is the one in the DS, but that's a very simple GPU.
    Unless those guys at Sony have different jobs now, the resident talent for graphics processing architectures at Sony exists, while we can't really say the same about Nintendo or Microsoft.


    Plus, the "rumor" of Sony messing with Navi at AMD came from a Forbes contributor with, AFAIR, a very decent track record.
     
    w0lfram likes this.
  7. DmitryKo

    Regular

    Joined:
    Feb 26, 2002
    Messages:
    611
    Likes Received:
    416
    Location:
    55°38′33″ N, 37°28′37″ E
    I beg to differ.

    Microsoft Talisman was a hardware project combining a tile-based rendering API with a reference hardware specification; vendors like Samsung/3DO, Cirrus Logic, Trident and Philips/NXP (TriMedia) already had working VLIW DSP implementations by 1997, but then the 3Dfx Voodoo Graphics PCI arrived with a mini-GL driver for GLQuake, and the whole Talisman effort was dead on arrival.

    Then in 1998 they partnered with SGI on the Fahrenheit Scene Graph (OpenGL++) and Fahrenheit Low Level specifications, with the latter poised to succeed Direct3D 5; but then the Nvidia GeForce 256 arrived with hardware transform and lighting in August 1999, and the FLL effort had to be abandoned in favor of the refactored Direct3D 7.

    So that probably taught Microsoft a lesson or two - and Sony arguably had to endure similar lessons with PlayStations 2 and 3.
     
    #7 DmitryKo, Jan 28, 2019
    Last edited: Jan 28, 2019
    xz321zx, Pixel, Kej and 3 others like this.
  8. Putas

    Regular Newcomer

    Joined:
    Nov 7, 2004
    Messages:
    379
    Likes Received:
    54
    How could miniGL kill Talisman?
     
  9. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,036
    Likes Received:
    1,732
    Location:
    Finland
    And we all know how well that turned out; see Geeforcer's post above.
    Except that Microsoft has developed hardware in-house, including graphics and AI chips.
    Again, see Geeforcer's post.
    Care to point to his previous track record on leaks? Just because someone writes for Forbes doesn't mean everything he writes is true.
     
  10. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    15,756
    Likes Received:
    4,686
    Wait, what? Versus MS, who has been listening to game developers for over two decades now and coordinating that with discussions of what is possible with hardware manufacturers? That was the whole reason DirectX existed: to bring game developers closer to hardware developers, to inform game developers of what is possibly coming in hardware over the next X years, and to give hardware developers a much better grasp of what game developers want, coordinating with them to offer universal hardware capabilities that make things easier for game developers. Prior to that, game development on PC was a nightmare of hardware incompatibilities, with software developers unable to rely on having X hardware feature available for their game depending on what hardware an end user had.

    Versus console manufacturers, who have traditionally just told game developers: this is what you get, either make a game for it or don't. Sure, it made game development a bit easier in that you had stable platforms with set hardware capabilities, but it also meant that game developers had almost no say in where hardware went. And game development was only easier WRT having a stable platform. Platform documentation could vary greatly in quality, and the SDKs could often be a nightmare to work with (no input from game devs kind of does that). Etc.

    It wasn't until PS4 that Sony actually started to listen to what game developers wanted, and they only started to do that because of how the X360/PS3 generation turned out.

    OTOH, MS's consoles have always been praised for how developer friendly they were and how easy it was to code games on them. Hence Sony finally relenting and listening to game developers when the X360 basically matched PS3 sales.

    And a good thing that happened as well. The PS4 would likely be a much worse machine if that hadn't happened and Sony had continued to do their own thing, not listening to game developers. Ironically, it's also Microsoft not listening to game developers as much as they had in the past that led to the mistakes of the XBO.

    Nintendo? I'm still not sure if Nintendo is listening to game developers or not. :p

    Regards,
    SB
     
    #10 Silent_Buddha, Jan 28, 2019
    Last edited: Jan 28, 2019
  11. DmitryKo

    Regular

    Joined:
    Feb 26, 2002
    Messages:
    611
    Likes Received:
    416
    Location:
    55°38′33″ N, 37°28′37″ E
    The miniGL driver was a killer app for the Voodoo Graphics PCI because it could run hardware-accelerated GLQuake at ~30 fps, making every hardcore gamer want a 3D accelerator card from 3Dfx, Nvidia, or Rendition.

    On the other hand, Talisman was a radical departure from existing triangle-based rasterizer pipelines in Direct3D, Glide, and miniGL/OpenGL, and its hardware performance was far behind Voodoo Graphics.
     
    #11 DmitryKo, Jan 28, 2019
    Last edited: Jan 29, 2019
    egoless likes this.
  12. vipa899

    Regular Newcomer

    Joined:
    Mar 31, 2017
    Messages:
    922
    Likes Received:
    354
    Location:
    Sweden
    Sony might suck at engineering, but the PS2 was just perfect in the sense that it sold north of 155 million units, has the best library of all consoles, and had unique graphics at the time. The PS2 seems like somewhat rushed hardware, designed by someone who didn't have too much knowledge of what developers wanted, but it turned out to be a great product anyway. It seems to have been a one-time success after all, as the PS3 didn't really go like the PS2.
     
  13. DmitryKo

    Regular

    Joined:
    Feb 26, 2002
    Messages:
    611
    Likes Received:
    416
    Location:
    55°38′33″ N, 37°28′37″ E
    Yet the PS2 had no dedicated graphics hardware besides a custom VPU processor... in the year 2000, when the PC had a hardware triangle setup/rasterizer and transform & lighting (and soon programmable pixel/vertex shaders).
     
  14. vipa899

    Regular Newcomer

    Joined:
    Mar 31, 2017
    Messages:
    922
    Likes Received:
    354
    Location:
    Sweden
    @DmitryKo

    Yes, it was very basic but fast. I know devs didn't like the hardware, but I think it was able to display graphics unique compared to the competition (ZOE2 as an example).
    Offtopic, but do you happen to play BF4 on PC?
     
  15. w0lfram

    Newcomer

    Joined:
    Aug 7, 2017
    Messages:
    140
    Likes Received:
    27
    I mentioned SONY because they worked with AMD engineers on their custom chip, to make it better suited for their library of games. SONY is not at all paying AMD for GPU resources better suited for datacenters... they are paying AMD for a GAMING chip.

    And with that insight, AMD could have designed NAVI with a uArch better suited for gaming.

    This same concept (of learning from SONY) also applies to AMD working alongside Microsoft, and incorporating what they learned into NAVI.
     
  16. vipa899

    Regular Newcomer

    Joined:
    Mar 31, 2017
    Messages:
    922
    Likes Received:
    354
    Location:
    Sweden
    What's your source?
    Nvidia has the best gaming chip right now; they didn't need Sony for it to be successful.
     
  17. w0lfram

    Newcomer

    Joined:
    Aug 7, 2017
    Messages:
    140
    Likes Received:
    27
    Forbes: https://www.forbes.com/sites/jasone...nys-playstation-5-vega-suffered/#29b62e24fda8

    What is the best gaming chip..? Who doesn't need SONY? Perhaps you misread my post.
    I was referencing my opening remarks (the thread spawn), not the post above mine. And the "re-taping" of NAVI is actually a good thing, because it allows AMD to sneak in a few more "adjustments" they have since nailed down (i.e. things they had wished they could've got into Navi, and now can).

    Dr. Su even mentioned this and seemed actually glad that "something popped up" and led them back to "re-taping" Navi, because "it allows them to not make mistakes, like they did with Polaris" (*paraphrasing).
     
    Pixel likes this.
  18. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,036
    Likes Received:
    1,732
    Location:
    Finland
    She did what, where? The only thing Su has said about Navi was in the CES keynote, and it was only a tidbit saying it's one of their future products.
     
    vipa899 likes this.
  19. Rootax

    Veteran Newcomer

    Joined:
    Jan 2, 2006
    Messages:
    1,103
    Likes Received:
    533
    Location:
    France
    Yeah, well, in some ways the VUs were a lot more flexible than anything existing in the PC space at the time. A bitch to dev for, because they were very unique and had other bottlenecks, but pretty "smart" in the end imo...
     
  20. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,669
    Likes Received:
    4,328
    Geez, the level of salt in this thread, starting with whoever wrote that title, is way off the charts.

    Ease up, people. No one here ever remotely suggested Sony's engineering is better than the others'.
    There's no need to feel threatened by any of the rumors/news/comments, nor to knee-jerk react in accordance. No one is saying "the PS5 is gonna CRUSH XBOXTwo because Navi magic saucezz!!111oneone."

    No, Sony is not designing Navi. But Sony is in a position to be more demanding about the details of their Playstation SoC than Microsoft, because they have IC design teams of their own, who even to this day are designing custom SoCs and image processors.
    And this does not mean the PS5 has an intrinsic advantage versus the XboxTwo.
    In fact, the last time Sony got deeply involved in processing hardware (Cell), Microsoft actually got away with a much better deal out of it (Xenon).
    Sony demanding XYZ features for Navi doesn't mean Microsoft won't get access to them, if they find them to be worth the money/transistors/die area, and the same could even work in reverse.


    Dude... Jason Evangelho was a Technical Marketing Specialist for RTG in 2016-2017. Is it that hard to believe he has more than a couple of people in his contacts list with insider knowledge?

    His article (which mostly reports RTG being short on staff while working on Vega because Lisa Su redirected their engineers to work on Navi for the PS5) falls perfectly in line with RTG's slow cadence of GPU releases, Raja leaving RTG on a low note, and even tidbits that @digitalwanderer mentioned here on the forum about the whole saga. I have very little reason to believe the guy pulled those things out of his ass.


    Which ones? Care to source?
    All I can find is news about Microsoft trying to hire AI hardware engineers as recently as June of last year, followed a couple of months later by reports of Microsoft trying to buy AI hardware from Huawei.
    What GPU did Microsoft work on?


    Claiming Sony shouldn't use their GPU talent for their benefit because the PS2 wasn't the most powerful console of the 6th generation, or because Cell didn't work out great for videogames, is a huge strawman.
    Cell was an ambitious (therefore very risky) project that tried to fuse CPU and GPU designs, and it mostly failed, sure.
    But the PS2's hardware was anything but a failure.

    The Graphics Synthesizer (53 million transistors, 150 MHz) was originally developed for the 250nm process, at a time when SDR RAM was very slow (the best they could do was 3.2 GB/s), so it needed eDRAM, taking up die area and transistors, to reach its performance target. So yeah, it was the equivalent of Xenos's backend+eDRAM daughter die, and everything else had to be done on the CPU's vector units.
    The NV2A (57 million transistors, 250 MHz) was a 150nm chip so it could clock significantly higher, and it came out after DDR became available, which allowed a significant bandwidth advantage forgoing the need for eDRAM. This is akin to the PS4 GDDR5 vs. XBOne DDR3+eSRAM debacle. Because of that, it could spend transistors on all the rest (pixel and vertex shaders, T&L unit) while keeping costs under control.
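    For rough context on those bandwidth figures: peak memory bandwidth is just bus width times transfer rate. A minimal sketch (purely illustrative, using commonly cited configurations: two 16-bit RDRAM channels at 800 MT/s for the PS2's main memory, and a 128-bit DDR bus at 200 MHz, i.e. 400 MT/s, for the original Xbox's unified memory):

    ```python
    def peak_bandwidth_gbs(bus_bits: int, mega_transfers: int) -> float:
        """Peak bandwidth in GB/s: (bus width in bytes) x (transfers per second)."""
        return bus_bits / 8 * mega_transfers * 1e6 / 1e9

    # PS2 main memory: 2 x 16-bit RDRAM channels at 800 MT/s
    ps2 = peak_bandwidth_gbs(32, 800)    # ~3.2 GB/s
    # Xbox unified memory: 128-bit DDR at 200 MHz (400 MT/s)
    xbox = peak_bandwidth_gbs(128, 400)  # ~6.4 GB/s
    print(f"PS2: {ps2} GB/s, Xbox: {xbox} GB/s")
    ```

    Which lines up with the ~3.2 GB/s quoted above, and shows how DDR roughly doubled what a comparable-width bus could deliver.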

    And despite all that, the Emotion Engine + Graphics Synthesizer combo proved excellent at scaling down in cost and power (much more so than the Xbox, whose CPU and GPU were made by different vendors), and the cost advantage allowed it to sell more than 150M units.
    The latest slimline PS2, with the unified EE+GS chip, was an awesome piece of hardware for its time, IMHO.
    There's nothing in the PS2 that Sony should be ashamed of.


    What's there to differ with? What I see there is a couple of projects where IHVs did the hardware design while Microsoft worked on the software implementation.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.