> XSX doesn't have a low-level access API for RT. It's a "better" way of accessing the hardware resources that isn't possible with high-level access.

The series consoles' DX12_X does have lower-level access to RT compared to PC DX12.
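For context, the sketch below (untested, and assuming an already-created ID3D12Device) shows how coarse the PC-side view actually is: D3D12 exposes RT capability only as an abstract tier, and everything beneath that interface belongs to the driver.

```cpp
#include <d3d12.h>

// Minimal sketch: on PC, RT support is visible only as a feature tier.
bool SupportsDxr(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5))))
        return false;
    // Tier 1.0 = DispatchRays; Tier 1.1 adds inline ray queries.
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```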
> That's absolutely false; the PS5's GE occupies more space on the chip than the AMD GPU's and surely has more features.

Source? Cuz that would be very new to me.
Primitive Shader hardware exists in everything from the Radeon RX Vega to the latest RDNA 3-based GPUs. When viewed from DirectX 12, a Radeon GPU's Primitive Shader is designed to work as a Mesh Shader.

Since the PS5's GPU is an AMD RDNA-based GPU, it is equipped with a Primitive Shader, which can be used natively (from the PS5 SDK). As a result, some PS5-exclusive titles make effective use of the Primitive Shader.

My sense is that the Primitive Shader is used as-is in PS5 first-party titles, so I suspect its usage cases already outnumber the Mesh Shader's. By comparison, the Mesh Shader has become an industry standard, yet there seem to be few examples of it being actively used in recent games.
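To make that DirectX 12 mapping concrete, here's a minimal sketch (untested, assuming an already-created ID3D12Device) of how a PC title detects Mesh Shader support. Note the API never mentions the Primitive Shader hardware that, on Radeon GPUs, services this tier:

```cpp
#include <d3d12.h>

// Sketch: query the Mesh Shader tier on PC D3D12. On Radeon GPUs the
// driver maps this feature onto the Primitive Shader hardware; the
// PS5 SDK instead exposes that hardware natively.
bool SupportsMeshShaders(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS7, &opts7, sizeof(opts7))))
        return false;
    return opts7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1;
}
```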
> Sony should just ditch AMD and go with other vendors: Intel, Imgtech, Nvidia... Everyone is better at ray tracing than AMD...

And they probably would if they could get similar terms from the others as they have with AMD. Also, BC needs to be taken care of, CPU interaction might be easier with AMD, and support from AMD is probably easier with an AMD SoC.
> That's absolutely false; the PS5's GE occupies more space on the chip than the AMD GPU's and surely has more features. But no one has ever said the PS5 is a unique, god-like machine because of it. To be fair, I've heard more such bullshit around the XSX hardware than the PS5, even from the same MS slides, but I find this kind of discussion from the last few posts useless.

I think you're confusing that with the larger area allocated to the ROP units on the PS5 compared to all RDNA2 GPUs and the XSX GPU, owing to its doubled stencil/depth ROPs. In any case, Cerny never claimed to have invented the geometry engine inside the PS5, and there are no patents from him about it.
> Yeah, they should go for one of the other x86 APU vendors that have a high-end GPU. You know, those ones.

They could ditch x86, too... Apple is emulating x86 just fine...
They don't need an APU. Cheap interconnects allow the combination of Ryzen and IMG IP.
> Lol dude, come on. You're talking about fundamentally changing the architecture for a midgen refresh to get potentially better RT. This is ridiculous. We're over 2 years in and Sony still can barely meet demand. What is the possible impetus for Sony to tear down this relationship with AMD, which has made them wildly successful? It makes no sense.

He has no idea of the technical challenges...
It would not be the first time a console maker has replaced its product within four to five years. Microsoft did it with the Xbox 360. Sony and Microsoft can't be so blind as to not read the market.
> If you're going to risk backwards compatibility/developer familiarity, you've really, really got to bring something truly extraordinary to the table. …

Even Nvidia doesn't care about binary compatibility across their own hardware designs, and that's the main reason their console partner has held out on a successor for so long: they can't guarantee they'll get a design from Nvidia that's compatible with Maxwell binaries, so their partner will have to be content with whatever suboptimal software solution they come up with...
MS 'replaced' their 360 with... an x86 APU, so I'm not sure why you would use that as an example of them moving away from x86. I mean, that RISC design was melting their motherboards.
And what does 'reading the market' mean? Aside from moving to x86 being the best decision Sony has made in the past decade in terms of actual market success, if you're looking at truly performant ARM-based designs you're really looking at one company right now, and that's Apple. Those chips are massive and have no bearing on what is actually economical for a console vendor.
Every design is hitting the process-node wall. There is no indication AMD is suffering from this particularly worse than the others; in fact, when you look at the wattage the 3D V-Cache chips demand for the performance they deliver, they seem to be doing quite well. Designs such as 3D V-Cache that can actually be managed per CCX make even more sense on a single platform, where developers know exactly how to maximize them and how to avoid their pitfalls.
If you're going to risk backwards compatibility/developer familiarity, you've really, really got to bring something truly extraordinary to the table. Even Nvidia's improvements in RT have been fairly predictable: yes, they have big gains, but also big, expensive GPUs. I don't see anything to indicate Nvidia has something that would deliver such an exponential performance uplift at a comparable wattage as to warrant Sony signing separate license agreements with now two potential vendors, risking backwards compatibility, and requiring a substantial reworking of their SDK, all for... better ray tracing? Are console players really demanding that? It's nuts.
I'd also be interested in seeing a source for this, and to know what advantage Xbox has over PC in this area.
The good news is that Microsoft allows low-level access to the RT acceleration hardware.
"[Series X] goes even further than the PC standard in offering more power and flexibility to developers," reveals Goossen. "In grand console tradition, we also support direct to the metal programming including support for offline BVH construction and optimisation. With these building blocks, we expect ray tracing to be an area of incredible visuals and great innovation by developers over the course of the console's lifetime."