Forbes: AMD Created Navi for Sony PS5, Vega Suffered [2018-06] *spawn*

Geez, the level of salt in this thread, starting with whoever wrote that title, is way off the charts.

Ease up people. No one here ever remotely suggested Sony's engineering is better than the others'.
There's no need to feel threatened by any of the rumors/news/comments, nor to react in kneejerk fashion to them. No one is saying "the PS5 is gonna CRUSH XBOXTwo because Navi magic saucezz!!111oneone."

No, Sony is not designing Navi. But Sony is in a position to be more demanding about the details of their Playstation SoC than Microsoft, because they have IC design teams of their own, who even to this day are designing custom SoCs and image processors.
And this does not mean the PS5 has an intrinsic advantage versus xboxtwo.
In fact, the last time Sony got deeply involved in processing hardware (Cell) Microsoft actually got away with a much better deal out of it (Xenon).
Sony demanding XYZ features for Navi doesn't mean Microsoft won't get access to it, if they find it to be worth the money/transistors/die-area, and the same could even work in reverse.


Just because someone writes for Forbes doesn't mean everything he writes is true.
Dude.. Jason Evangelho was a Technical Marketing Specialist for RTG in 2016-2017. Is it that hard to believe he has more than a couple of people in his contacts list with insider knowledge?

His article (which mostly reports on RTG being short on staff while working on Vega because Lisa Su redirected their engineers to work on Navi for PS5) falls perfectly in line with RTG's slow cadence of GPU releases, Raja leaving RTG on a lower note and even tidbits that @digitalwanderer mentioned here in the forum about the whole saga. I have very little reason to believe the guy pulled those things out of his ass.


Except that Microsoft has developed hardware in-house, including graphics and AI chips
Which ones? Care to source?
All I can find are news about Microsoft trying to hire A.I. hardware engineers as recently as June of last year, which was succeeded by reports of Microsoft trying to buy A.I. hardware from Huawei a couple of months later.
What GPU did Microsoft work on?


And we all know how well that turned out, see Geeforcer's post above.
(...)
Again, see Geeforcer's post.
Claiming Sony shouldn't use their GPU talent for their benefit because the PS2 wasn't the most powerful console of the 6th generation, or because Cell didn't work out great for videogames, is a huge strawman.
Cell was an ambitious (therefore very risky) project that tried to fuse CPU and GPU designs and it mostly failed, sure.
But the PS2's hardware was anything but a failure.

The Graphics Synthesizer (53 million transistors, 150 MHz) was originally developed for the 250nm process in a time when SDR RAM was very slow (the best they could do was 3.2GB/s), so it needed eDRAM, taking up die area and transistors, to reach its performance target. So yeah, it was the equivalent of Xenos' backend+eDRAM daughter die, and everything else had to be done on the CPU's vector units.
The NV2A (57 million transistors, 250 MHz) was a 150nm chip so it could clock significantly higher, and it came out after DDR became available, which allowed a significant bandwidth advantage while forgoing the need for eDRAM. This is akin to the PS4 GDDR5 vs. XBOne DDR3+eSRAM debacle. Because of that, it could spend transistors on all the rest (pixel and vertex shaders, T&L unit) while keeping control of the costs.
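To put rough numbers on the bandwidth point, here's a quick back-of-envelope sketch (minimal Python, using the commonly cited bus widths and clocks; treat the exact figures as my own assumptions rather than spec-sheet values):

```python
# Back-of-envelope peak-bandwidth comparison. Bus widths and clocks are the
# commonly cited figures for these machines, so treat them as approximations.

def bandwidth_gb_s(bus_bits: int, clock_mhz: float, transfers_per_clock: int = 1) -> float:
    """Peak bandwidth in GB/s for a bus of the given width and clock."""
    return bus_bits / 8 * clock_mhz * 1e6 * transfers_per_clock / 1e9

# PS2 main memory: 2x 16-bit RDRAM channels at 400 MHz, double data rate -> ~3.2 GB/s
ps2_rdram = bandwidth_gb_s(bus_bits=32, clock_mhz=400, transfers_per_clock=2)

# Graphics Synthesizer eDRAM: very wide (2560-bit) internal bus at 150 MHz -> ~48 GB/s
gs_edram = bandwidth_gb_s(bus_bits=2560, clock_mhz=150)

# Xbox unified memory: 128-bit DDR at 200 MHz, double data rate -> ~6.4 GB/s
xbox_ddr = bandwidth_gb_s(bus_bits=128, clock_mhz=200, transfers_per_clock=2)

print(f"PS2 RDRAM : {ps2_rdram:.1f} GB/s")
print(f"GS eDRAM  : {gs_edram:.1f} GB/s")
print(f"Xbox DDR  : {xbox_ddr:.1f} GB/s")
```

That order-of-magnitude gap between external memory and on-die eDRAM is the whole reason the GS spent so much of its budget on embedded memory.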

And despite all that, the Emotion Engine + Graphics Synthesizer combo proved to be excellent at scaling down in cost and power (much more so than the Xbox, which had its CPU and GPU made by different vendors), and the cost advantage allowed it to sell more than 150M units.
The latest Slimline PS2 with the unified EE+GS chip was an awesome piece of hardware for its time, IMHO.
There's nothing in the PS2 that Sony should be ashamed of.


I beg to differ.

Microsoft Talisman was a hardware project combining a tile-based rendering API with a reference hardware specification; vendors like Samsung/3DO, Cirrus Logic, Trident and Philips/NXP (TriMedia) already had working VLIW DSP implementations by 1997, but then the 3Dfx Voodoo Graphics PCI arrived with a miniGL driver for GLQuake, and the whole Talisman effort was dead on arrival.

Then in 1998 they partnered with SGI on the Fahrenheit Scene Graph (OpenGL++) and Fahrenheit Low Level specifications, with the latter poised to succeed Direct3D 5, but then the Nvidia GeForce 256 arrived with hardware transform and lighting in August 1999, and the FLL effort had to be abandoned in favor of a refactored Direct3D 7.

What's there to differ? What I see in there is a couple of projects where IHVs did hardware design while Microsoft worked on the software implementation.
 
Which ones? Care to source?
All I can find are news about Microsoft trying to hire A.I. hardware engineers as recently as June of last year, which was succeeded by reports of Microsoft trying to buy A.I. hardware from Huawei a couple of months later.
What GPU did Microsoft work on?
HoloLens :?:

Anyways, I suppose it depends on how strict about semantics you want to be. They did ask for a number of customizations in Durango and Scorpio (command processor/DX12-related bits + Jaguar bits + backward-compatibility things), but they have admitted to designing SHAPE. On a side note, it'll be interesting to see where that goes next time, since AMD has seemingly shifted away from dedicated audio-acceleration HW in their desktop cards to reserving compute units since Polaris.

I'd be curious whether there's enough work on the audio side of things that it'd be worthwhile to allocate the finite GPU resources on a console and try to fit it into the compute queues of more modern engines, or whether that would take a lot more work (a.k.a. no time for developers to bother with).
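For a sense of scale, here's a tiny back-of-envelope sketch (Python); every workload number in it is a made-up assumption for illustration, and a real engine would use FFT-based convolution rather than the naive time-domain cost estimated here:

```python
# Rough sketch of how much GPU compute a game-audio workload might need,
# to gauge whether reserving CUs for audio is worthwhile. Voice count,
# filter length, etc. are made-up assumptions, not figures from any console.

SAMPLE_RATE = 48_000   # samples per second per voice
VOICES = 256           # simultaneous sound sources (assumption)
CONV_TAPS = 2_048      # convolution reverb filter length (assumption)
OPS_PER_TAP = 2        # one multiply + one add

# Naive time-domain convolution cost across all voices, in FLOPS
audio_flops = SAMPLE_RATE * VOICES * CONV_TAPS * OPS_PER_TAP

# One GCN-style compute unit: 64 lanes * 2 FLOP/cycle (FMA) * ~1 GHz
cu_flops = 64 * 2 * 1.0e9

print(f"Audio workload : {audio_flops / 1e9:.1f} GFLOPS")
print(f"One CU (~1GHz) : {cu_flops / 1e9:.1f} GFLOPS")
print(f"CUs needed     : {audio_flops / cu_flops:.2f}")
```

Even with this deliberately heavy naive estimate the workload comes out to well under one CU, which is roughly why the "reserve a CU or two" approach can look attractive compared with dedicated audio blocks.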
 
The salt is overflowing from the repeated assertion that Navi couldn't happen without Sony
Where exactly is this assertion?

The only assertion that is being repeated is Navi getting input from Sony and Lisa Su redirecting RTG engineering efforts to respond to that input.

Its success didn't have much to do with its hardware; it was the lowest-performing and the hardest to code for.
It was the earliest hardware, in a time when GPUs were going through giant leaps every single year (1999 - S3TC/DXTC and bump mapping; 2000 - T&L and multitexturing; 2001 - Pixel&Vertex Shaders).
Everything was hard to code for at that time, and if the PS2 was that hard, it would have fallen like the Saturn did.


And still this fantasy that AMD designs their GPU for Sony, lol.
Yes, god forbid that AMD would believe the world's number one seller of gaming GPUs (Sony, with >90 million PS4s sold) is important enough to design a GPU focused on their recommendations.
lol
 
I can understand the adverse reaction to people who extrapolated from that Forbes comment that Sony is pretty much doing most of the designing for Navi, which is a simplistic and stupid extrapolation to make. But to then say Sony's know-how has absolutely no value for AMD is equally simplistic and stupid.
 
I can understand the adverse reaction to people who extrapolated from that Forbes comment that Sony is pretty much doing most of the designing for Navi, which is a simplistic and stupid extrapolation to make.
And it's the complete opposite of what the Forbes article actually says:

According to my sources, Navi isn't just inside the Sony PS5; it was created for Sony.
(...)
But the collaboration came at the expense of Radeon RX Vega and other in-development projects. Allegedly, Koduri saw up to a massive 2/3 of his engineering team devoted exclusively to Navi against his wishes, which resulted in a final RX Vega product Koduri was displeased with as resources and engineering hours were much lower than anticipated.

Created for, not created by.
And if Sony was designing Navi then AMD wouldn't need to send 2/3rds of their engineers to work on that.
 
Where exactly is this assertion?

The only assertion that is being repeated is Navi getting input from Sony and Lisa Su redirecting RTG engineering efforts to respond to that input.

Which is kind of shaky, TBH. What if the input that specifically caused the reallocation of engineering resources was just, "Here is our launch timetable and we need you to hit these development milestones within this timeframe."? If the project was otherwise in danger of slipping, that alone could account for a re-allocation of engineering resources, no?

FWIW, I think Sony's feedback and input will have clearly influenced the design of Navi. I just think the impact of that influence on the base architecture is generally overestimated by Sony enthusiasts. Where they will have a large role is in the development of the specific implementation of Navi that will be part of the PS5's SoC.
 
So probably RTG will have their own roadmap of things that they are working on or planning on implementing - the timetable/schedule just depends on that allocation. If a client wants something sooner - bam - they can let the Semi Custom folks know, then they can communicate that back to RTG.

If they want something totally custom - the client would ask for it (see various MS/Sony doohickies), but if it's otherwise unpatentable by the client, for example, it's probably something already on the agenda at AMD (or just a simple configuration of existing IP blocks), and again, it's just about timetable/priorities/resources. We can give credit where credit is due in the literature because that's less disputable as to who came up with the idea and specific implementation.

There's little point in making leaps and conclusions otherwise as to who is responsible for a tech. We can leave the gold star stickers in the teacher's drawer back in Kindergarten. There are more interesting tech things to discuss than who did what here, and it's disappointing to get so hung up on these lines of discussions here.
 
In the context of Sony's supposed superior knowledge of dedicated 3D graphics hardware, I must say it again: they have none, since their previous in-house implementations of the 3D graphics pipeline were based on 3rd-party processors with SIMD vector extensions.

You may call it flexible, forward-thinking etc., but I'd think they really did not have a clear vision of 3D graphics hardware going forward (or relied too much on Japanese arcade developers still rooted in the 8-bit era). All these hair and facial expression demos looked good at E3 but they never found a way into production games except for pre-computed cut-scenes. And emulation even made these PS2 games look better due to advanced texture filtering, which is only possible with dedicated hardware.
 
What's there to differ? What I see in there is a couple of projects where IHVs did hardware design while Microsoft worked on the software implementation.
Nope, it was the exact opposite. Microsoft first designed a new incompatible API and then drew up very specific reference hardware requirements for that API, which was essentially a form of hardware-assisted sprite rendering optimized for transforming and compositing multiple compressed 2D images into a tiled framebuffer, all in an effort to reduce the memory bandwidth requirements and computational complexity of the traditional 3D graphics pipeline.

And then came IHVs like 3Dfx and NVidia who basically brute-forced the complexity and bandwidth with dedicated triangle-setup hardware and fast dedicated memory, building their hardware around the traditional OpenGL rasterisation pipeline - so Microsoft had to abandon their concepts of Talisman, retained mode, and execute buffers, and implement DrawPrimitive in Direct3D 5 and multi-texture in Direct3D 6 to make full use of the new graphics cards. And then they had to implement hardware T&L in Direct3D 7, so they never even bothered to actually work on Fahrenheit Low Level...
 
I just think the impact of that influence on the base architecture is generally overestimated by Sony enthusiasts.

Desperation perhaps, I don't know. AMD designing a chip for Sony... the fantasy train has to stop somewhere.

Yes, god forbid that AMD would believe the world's number one seller of gaming GPUs (Sony, with >90 million PS4s sold) is important enough to design a GPU focused on their recommendations.

Not sure if you're just being ironic there, but the margins are much smaller in the console market, and a PS4 APU at 90 million isn't the number one, I'm afraid; it's probably a mobile GPU that's number one, not a PlayStation with 90 million over 7+ years (of the same old 2012 tech).

Everything was hard to code for at that time, and if the PS2 was that hard, it would have fallen like the Saturn did.

Maybe it should have; we would have seen better-looking games over its six-year lifetime and devs would have had an easier time.
The PS2 was, in a way, a super PSX.
 
There's little point in making leaps and conclusions otherwise as to who is responsible for a tech. We can leave the gold star stickers in the teacher's drawer back in Kindergarten. There are more interesting tech things to discuss than who did what here, and it's disappointing to get so hung up on these lines of discussions here.
I was promised a gold star sticker for good behaviour.
 
What I meant is that some people took that Forbes comment and ran with it, assuming much more out of it than it was deserved.

I don't get why this is so important to the Sony camp. If AMD designs the GPU it will be better than if Sony does it. I want a high-end AMD Arcturus in my PC by the time next gen arrives; I surely hope there's nothing Sony in it, seeing their track record.

It's just a console, primarily for playing games, a toy; it doesn't matter that AMD designs the APU.
 
The miniGL driver was a killer app for Voodoo Graphics PCI because it could run hardware-accelerated GLQuake at ~30 fps - making every hardcore gamer want a 3D accelerator card from 3Dfx, NVidia, or Rendition.

On the other hand, Talisman was a radical departure from existing triangle-based rasterizer pipelines in Direct3D, Glide, and miniGL/OpenGL, and its hardware performance was far behind Voodoo Graphics.

A killer app cannot target only hardcore gamers. So it was just another game running fast enough. Plus, Nvidia and Rendition were unlikely to use miniGL for Quake.

Which device are you referring to when talking about the hardware performance of Talisman?
 
And it's the complete opposite of what the Forbes article actually says:



Created for, not created by.
And if Sony was designing Navi then AMD wouldn't need to send 2/3rds of their engineers to work on that.

Which...

According to my sources...

And since there's no actual attributable quote, there's no way to know if the "source" was just generalizing all console efforts into a pool and the article writer then assumed it meant Sony/PS.

Considering no other industry source has corroborated what Forbes has said...

It's far more likely that the shift in resources was a response to ALL contracted console manufacturers wanting custom parts, not just Sony.

Especially if you consider that at the time this all supposedly happened, Microsoft was by far AMD's largest customer for custom console graphics. They were a proven partner with a proven track record and had just basically matched Sony console for console.

In other words, past history said that Microsoft was a better bet for profitability for AMD than Sony.

Either way, had ANY other reputable tech sites corroborated what Forbes wrote with their own sources, the claim that Sony was commanding that amount of RTG resources at the expense of other AMD partners would at least have a tiny bit of credibility.

But there isn't. So people are left grasping at straws... no wait... grasping at one straw to justify Sony being the prime beneficiary of that much of RTG's engineering resources, when it's far more plausible that that amount of engineering resources was in fact reallocated to all custom console requests (Sony, MS, and whoever else wants custom SoCs).

No one disputes that Sony has asked for and gotten customizations that are specific to their implementation. But so has Microsoft, and presumably so have any other semi-custom partners they have.

Regards,
SB
 

Just like DF says, don't believe all the BS. The fantasy train has to stop; people might really start to believe in things and be massively disappointed.
 
Hum... I'd say Sony is very different from Microsoft.
With the PS2 and PSP they developed the whole thing in-house using MIPS IP cores and their own GPUs/Vector-FPUs. In the PS3 they co-developed Cell with Toshiba and IBM, which was supposed to power the whole thing.
Sony's investment into the Cell initiative represented a high-water mark for its microelectronics division, and its failure devastated Sony's manufacturing and design ambitions and damaged Sony overall.
A whole leading-edge fab was built and then had to be sold because of the unrealized demand and Sony gave up on competitive logic processes, though I believe Sony bought it back more recently to manufacture camera image sensors.
In the period prior to that, there were job losses and a lack of projects at that level of logic complexity and manufacturing node.

Unless those guys at Sony have different jobs now, the resident talent for graphics processing architectures at Sony exists, while we can't really say the same about Nintendo or Microsoft.
If we are to believe Sony had some of them employed on the PS3's graphics capabilities prior to resorting to Nvidia, that would leave nearly 15 years of Sony's graphics architecture resources not being heavily utilized, given the very long PS3 generation and the predominantly third-party PS4+Pro.
I wouldn't expect Sony to keep most of them circling around for their chance to tweak the margins of AMD's architecture in 2019/2020.
I suppose we're aware of AMD's collaborating with Mark Cerny, or at least giving him enough things to fiddle with to make him happy. It might not need that much of a resource investment, and if a good chunk is from Cerny it needs less of whatever resources Sony had over a decade prior.

I was referencing and referring to my opening remarks (thread spawn) & not the post that was above mine. And that the "re-taping" of NAVI is actually a good thing, because it allows AMD to sneak in a few more "adjustments" they have since nailed down (i.e. things they had wished they could've gotten into Navi, and now can, etc.)
What is "re-taping"?
Tape-out is a specific step in finalizing a chip and sending it to the manufacturer, with all design elements and layout final.
A re-spin is sending a round of wafers through manufacturing, possibly with adjustments in manufacturing process or minor mask revisions to correct for bugs or faults.

What you're describing seems more like a re-design or the development of a revision of the microarchitecture. That's a more expensive proposition, and it is also not entirely unexpected in the sense that AMD has multiple revisions of its architectures across different physical chips. Within its own products, a differently-sized chip made at a different time tended to have slightly different point revisions of IP blocks, and the semi-custom designs similarly had their own variations.

Is the implication that Sony would want AMD to sell its specific silicon as well? Otherwise, there's going to be a different implementation for AMD's products regardless.
 