Do you think there will be a mid gen refresh console from Sony and Microsoft?

That's absolutely false; the PS5's Geometry Engine occupies more space in the silicon than it does in AMD's own GPUs, and it surely has a bigger feature set.
Source? Cuz that would be very new to me.

And XSX factually does have features that PS5 doesn't have. Support for mesh shaders, hardware accelerated variable rate shading, and sampler feedback are all specific DX12 hardware features that PS5 straight up doesn't have. Nothing about this is BS, though of course we can always argue the importance and impact of these things.
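For what it's worth, on the PC side those three are just capability bits you can query through D3D12. A minimal sketch of the checks (assuming a valid ID3D12Device; error handling omitted):

```cpp
// Sketch only: queries the DX12 feature tiers mentioned above on whatever device you pass in.
#include <d3d12.h>

void ReportFeatures(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6 = {};   // holds the VRS tier
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};   // holds mesh shader and sampler feedback tiers

    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &opts6, sizeof(opts6));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &opts7, sizeof(opts7));

    bool hwVrsTier2      = opts6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2;
    bool meshShaders     = opts7.MeshShaderTier          >= D3D12_MESH_SHADER_TIER_1;
    bool samplerFeedback = opts7.SamplerFeedbackTier     >= D3D12_SAMPLER_FEEDBACK_TIER_0_9;
    // RDNA 2 desktop parts (and the Series consoles) report all three; the argument
    // above is that PS5's GPU lacks the equivalent hardware paths.
}
```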
 
That's absolutely false; the PS5's Geometry Engine occupies more space in the silicon than it does in AMD's own GPUs, and it surely has a bigger feature set.

According to AMD, the Geometry Engine in everything from RX Vega to RDNA 3 uses their Primitive Shader. On PS5 you can access this directly through the SDK; on the Series consoles it's used to implement mesh shaders (though hardly anyone is using mesh shaders yet).

It's from Google Translate, but here you go:


The Primitive Shader exists as hardware in everything from the Radeon RX Vega to the latest RDNA 3-based GPUs. When viewed from DirectX 12, the Radeon GPUs' Primitive Shader is set up to work as a Mesh Shader.

Since the PS5 GPU is an AMD RDNA-based GPU, it has a Primitive Shader that can be used natively (through the PS5 SDK). As a result, some PS5-exclusive titles make effective use of the Primitive Shader.
In my view, since the Primitive Shader is used as-is in PS5 first-party titles, I think its usage cases already exceed those of the Mesh Shader. By comparison, the Mesh Shader has become the industry standard, yet there seem to be few examples of it being actively used in recent games.

They're talking about PS5 using their Primitive Shader hardware. Maybe Sony requested some PS5-specific additions, I dunno, but at its core it's an AMD technology shared amongst all their GPUs since pre-RDNA. It's not something that Cerny floated down from heaven with, surrounded by angels, and gifted to AMD.

I can't see a PS5 Pro (if it even exists) diverging massively from AMD's ongoing roadmap in terms of RT or any other core GPU technology.
 
Is there a difference between patent filings and what actually appears? e.g. the photon mapping patent, was that Cerny or Sony? What's the history of Cerny patents manifesting in PS hardware?
 
For anyone interested, or anyone who doesn't know what is so special about the Cerny patent, here's a short summary from Sony patents method for “significant improvement of ray tracing speed” | Ars Technica

"System and method for accelerated ray tracing with asynchronous operation and ray transformation."

"Sony engineer Mark Cerny lays out a method that could significantly speed up the ray-tracing process by offloading certain calculations from the GPU to specially designed ray-tracing unit (RTU) hardware...

In Cerny's method, the RTU hardware is specially designed to efficiently traverse so-called acceleration structures in a 3D environment, going through a stack of bounding volumes to identify points where a virtual light ray intersects with an object. Those intersections are then sent to a shader program running on the GPU, which determines whether the object is opaque (a "hit" for the ray-tracing algorithm) or transparent (i.e., the intersection can be ignored).

In the case of a hit, the GPU can then send that information back to the RTU, which can "shorten the ray, as there is no point testing past the location of the intersection of the ray with [the opaque object]." That saves processing time that would be wasted calculating further "hits" for objects that are occluded by a closer object.

Crucially, these RTU functions "can be asynchronous with respect to the shader program." That means the GPU can perform other functions as it waits for the RTU to send back any intersections it finds between light rays and in-game objects.

Handling these basic functions in the RTU (which "may include hardware circuitry" specially designed for this traversal) "may result in a significant improvement of ray tracing speed," the patent says, "as the shader program is only performing hit testing. It is not performing acceleration structure traversal or managing the corresponding stack."
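To put the mechanism into something concrete, here's a tiny toy model I threw together (my own construction, nothing from the patent or an SDK) of that split: the "RTU" part only does cheap bounding-box tests along the ray, the "shader" part decides opaque vs transparent, and an opaque hit shortens the ray so anything behind it is skipped. In the real design the traversal would run asynchronously alongside other GPU work; here it's just a loop.

```cpp
// Toy 1D model of the RTU/shader split described above (illustrative only).
#include <cstdio>
#include <vector>

struct Obj { float boxNear, boxFar; float surfaceT; bool opaque; };

int main()
{
    std::vector<Obj> scene = {
        { 2.0f, 4.0f, 3.0f, false },   // transparent: the intersection gets ignored
        { 5.0f, 9.0f, 6.0f, true  },   // opaque: ray is shortened to t = 6
        { 7.0f, 8.0f, 7.5f, true  },   // occluded: its box now lies past the shortened ray
    };

    float tMax = 100.0f;               // ray runs from t = 0, effectively unbounded at first

    for (const Obj& o : scene)
    {
        if (o.boxNear > tMax) continue;     // "RTU": cheap box test against the current ray length

        // candidate handed to the "shader", which performs the actual hit test
        if (o.opaque && o.surfaceT < tMax)
        {
            tMax = o.surfaceT;              // feedback to the "RTU": shorten the ray
            std::printf("opaque hit at t = %.1f, ray shortened\n", tMax);
        }
    }
}
```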




So it looks like something between what's currently available in RDNA 2 and RTX cores; one comment explains this in more detail:

"...
Note that this Sony hardware only does part of the intersection calculation, and doesn't even get to the point where it can determine what material was hit.

Calculating if and where a ray hits detailed geometry is rather time-consuming, especially if you need to consider a lot of geometry, so the problem is simplified by first checking if the ray hits a simple bounding volume (such as a box) that just encompasses the actual object.

If the ray misses the box you're guaranteed it also misses the object. Only if it hits the box do you need to do the more complicated intersection inside.
You can continue with this idea and put groups of boxes into bigger boxes so if you miss the big box you can ignore all the boxes inside. Do this a couple of times and you've built a hierarchy of bounding volumes (a BVH).

The Sony hardware only calculates the boxes-in-boxes part, so it can only give you a list of objects the ray might potentially intersect. Nvidia's RTX hardware does that and also calculates ray-triangle intersection."
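To make the boxes-in-boxes part concrete, here's a small sketch (again my own toy code, nothing vendor-specific): miss an outer box and the whole subtree inside it is skipped, and only leaf hits survive as candidates for the expensive ray-triangle test, which is the step the comment says RTX also handles in hardware.

```cpp
// Toy BVH traversal: 1D intervals along the ray stand in for real AABBs.
#include <cstdio>
#include <vector>

struct Node
{
    float lo, hi;                 // bounding interval along the ray
    std::vector<Node> children;   // empty => leaf wrapping a single "object"
    int objectId;
};

// Box tests only, collecting candidate objects (the part this patent puts in hardware).
void Traverse(const Node& n, float rayStart, float rayEnd, std::vector<int>& candidates)
{
    if (n.lo > rayEnd || n.hi < rayStart) return;   // miss the box: skip the whole subtree
    if (n.children.empty()) { candidates.push_back(n.objectId); return; }
    for (const Node& c : n.children) Traverse(c, rayStart, rayEnd, candidates);
}

int main()
{
    // Two leaf boxes inside one parent box; object 2 sits beyond the ray segment [0, 10].
    Node root{ 0.0f, 20.0f, { Node{ 0.0f, 5.0f, {}, 1 },
                              Node{ 12.0f, 20.0f, {}, 2 } }, -1 };

    std::vector<int> candidates;
    Traverse(root, 0.0f, 10.0f, candidates);

    // Per the comment above, this hardware stops here; RTX would also run the
    // precise ray-triangle tests on the survivors in hardware.
    for (int id : candidates)
        std::printf("candidate object %d -> run the precise intersection test\n", id);
}
```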


It still sounds like a great performance uplift for RDNA 2 if AMD does not have anything similar or better for RDNA 4. The biggest issue with RT on RDNA 2 cores, as explained in the article, is:


"That AMD pipeline already uses ray accelerators to help calculate intersections. The pipeline, however, leaves the costly acceleration structure traversal (i.e., moving through a series of "bounding volumes" in 3D space) to the GPU shader."
 
Sony should just ditch AMD and go with other vendors - Intel, Imgtech, nVidia... Everyone is better at ray tracing than AMD...

They could do a PS3 revival with nVidia's NVLink-C2C interconnect. C2C provides 900 GB/s of interconnect bandwidth, which is twice the memory bandwidth of the PS5, so there isn't even a need for dedicated memory for the CPU cores.
 
Sony should just ditch AMD and go with other vendors - Intel, Imgtech, nVidia... Everyone is better at ray tracing than AMD...
And they probably would, if they could get terms from the others similar to what they have with AMD. BC also needs to be taken care of, CPU interaction might be easier with AMD, and support from AMD is probably easier with an all-AMD SoC.
 
That's absolutely false; the PS5's Geometry Engine occupies more space in the silicon than it does in AMD's own GPUs, and it surely has a bigger feature set. But no one has ever said the PS5 is some unique, god-like machine because of it. To be fair, I've heard more of that kind of nonsense around XSX hardware than PS5, even from MS's own slides, but I find this sort of discussion in the last few posts pointless.
I think you're confusing this with the bigger space allocated to the ROP units on PS5 compared to other RDNA 2 GPUs and the XSX, due to having twice the stencil/depth ROPs. Anyway, Cerny never claimed to have invented the geometry engine inside the PS5, and there are no patents from him about it.
 
They could ditch x86, too... Apple is emulating x86 just fine...

They don't need an APU. Cheap interconnects allow combining Ryzen and IMG IP.

Lol dude, come on. You're talking about fundamentally changing the architecture for a midgen refresh to get potentially better RT. This is ridiculous.

We're over 2 years in and Sony still can barely meet demand. What is the possible impetus for Sony to tear down this relationship with AMD which has made them wildly successful? It makes no sense.
 
It would not be the first time that a console maker is replacing their product within four to five years. Microsoft did it with the Xbox 360. Sony and Microsoft can't be so blind as to not read the market.
 
Lol dude, come on. You're talking about fundamentally changing the architecture for a midgen refresh to get potentially better RT. This is ridiculous.

We're over 2 years in and Sony still can barely meet demand. What is the possible impetus for Sony to tear down this relationship with AMD which has made them wildly successful? It makes no sense.
He has no idea of the technical challenges ...

BTW Apple's x86 emulation has several limitations like not supporting kernel extensions or AVX instructions. The inevitable future of x86 extensions is AVX-512 and there's virtually no ARM vendor out there that has a response ...
 
It would not be the first time that a console maker is replacing their product within four to five years. Microsoft did it with the Xbox 360. Sony and Microsoft can't be so blind as to not read the market.

MS 'replaced' their 360 with...an X86 APU, so not sure why you would use that as an example for them moving away from X86, I mean that RISC design was melting their motherboards.

What does 'reading the market' mean as well? Aside from moving to X86 being the best decision Sony has made in the past decade in terms of actual market success, if you're looking at truly performant ARM-based designs, you're really looking at one company right now, and that's Apple. Those chips are massive, and have no bearing on what is actually economical for a console vendor.

Every design is hitting the process node wall. There is no indication AMD is suffering from this particularly worse than others, in fact when you look at the wattage the 3D Vcache chips are demanding with the performance they're delivering, they seem to be doing quite well - and designs such as 3D cache that can actually be managed per CCX make even more sense on a singular platform where developers know exactly how to maximize it and how to avoid its pitfalls.

If you're going to risk backwards compatibility/developer familiarity, you've really, really got to bring something truly extraordinary to the table. Even Nvidia's improvements in RT have been fairly predictable; yes, they have big gains, but also big, expensive GPUs. I don't see anything to indicate Nvidia has something that would be such an exponential performance uplift at a comparable wattage to warrant Sony wanting to sign separate license agreements with now 2 potential vendors, risk backwards compatibility, require a substantial reworking of their SDK, and all for...better ray tracing? Are console players really demanding that? It's nuts.
 
If you're going to risk backwards compatibility/developer familiarity, you've really, really got to bring something truly extraordinary to the table. Even Nvidia's improvements in RT have been fairly predictable; yes, they have big gains, but also big, expensive GPUs. I don't see anything to indicate Nvidia has something that would be such an exponential performance uplift at a comparable wattage to warrant Sony wanting to sign separate license agreements with now 2 potential vendors, risk backwards compatibility, require a substantial reworking of their SDK, and all for...better ray tracing? Are console players really demanding that? It's nuts.
Even Nvidia doesn't care about binary compatibility across their own hardware designs, and that's the main reason their console partner has been holding out on a successor for so long: they can't guarantee they'll get a design from Nvidia that's compatible with Maxwell binaries, so their partner will have to be content with whatever suboptimal software solution they come up with ...

AMD designed GFX10 with backwards compatibility with GFX7 from the outset, which is exactly why the current generation consoles are backwards compatible without much fuss ...

Nvidia's console customer is far more likely to ditch them for another vendor than vice versa with AMD's console customers, since the former are the ones who refuse to offer an easy path to backwards compatibility on their partner's successor, and as a result we may not even see a new platform from them next year ...
 
MS 'replaced' their 360 with...an X86 APU, so not sure why you would use that as an example for them moving away from X86, I mean that RISC design was melting their motherboards.

What does 'reading the market' mean as well? Aside from moving to X86 being the best decision Sony has made in the past decade in terms of actual market success, if you're looking at truly performant ARM-based designs, you're really looking at one company right now, and that's Apple. Those chips are massive, and have no bearing on what is actually economical for a console vendor.

Every design is hitting the process node wall. There is no indication AMD is suffering from this particularly worse than others, in fact when you look at the wattage the 3D Vcache chips are demanding with the performance they're delivering, they seem to be doing quite well - and designs such as 3D cache that can actually be managed per CCX make even more sense on a singular platform where developers know exactly how to maximize it and how to avoid its pitfalls.

If you're going to risk backwards compatibility/developer familiarity, you've really, really got to bring something truly extraordinary to the table. Even Nvidia's improvements in RT have been fairly predictable; yes, they have big gains, but also big, expensive GPUs. I don't see anything to indicate Nvidia has something that would be such an exponential performance uplift at a comparable wattage to warrant Sony wanting to sign separate license agreements with now 2 potential vendors, risk backwards compatibility, require a substantial reworking of their SDK, and all for...better ray tracing? Are console players really demanding that? It's nuts.

I think he was confusing the X360 with the original Xbox. I base that on the starting statement.

It would not be the first time that a console maker is replacing their product within four to five years.

That would describe the OG Xbox. Of course, that had such a short lifespan because it wasn't a profitable product and MS knew they would have to compete with the PS3 which they thought would be out around when the X360 launched.

So his rebuttal to why Sony would ditch the partnership with AMD when it's bringing them money hand over fist lacks teeth.

Might as well also throw in that currently Sony relies more on hardware than software for backwards compatibility. i.e., while it's theoretically possible they could make some form of x86 translation layer, it's an order of magnitude more difficult to make one that is not only reasonably compliant, but also bug-free and performant. Hypothetically moving away from AMD mid-gen means a divergent console that is unlikely to be able to run any current, much less past, PlayStation games. Even just switching GPU providers would prove to be an incredibly large undertaking, as unlike Microsoft, they don't abstract much, if anything, via APIs.

Regards,
SB
 
I'd also be interested in seeing a source for this, and to know what advantage Xbox has over PC in this area.

There's this comment from that Goossen chappy, back in March 2020 in an interview with Leadbetter:


The good news is that Microsoft allows low-level access to the RT acceleration hardware.

"[Series X] goes even further than the PC standard in offering more power and flexibility to developers," reveals Goossen. "In grand console tradition, we also support direct to the metal programming including support for offline BVH construction and optimisation. With these building blocks, we expect ray tracing to be an area of incredible visuals and great innovation by developers over the course of the console's lifetime."

So at the very least there's offline BVH construction and optimisation, which sounds like it could save a chunk of power at runtime. Assuming anyone were to actually use that, of course, and not just chuck the PC implementation over because "damn it, it'll do, I haven't seen my children in 3 months!" I'm sure I've read that PS5 allows developers to do the same ... perhaps it's more likely to get used there?

And that's probably another reason to keep any potential mid gen console pretty close to core RDNA 2 functionality. I reckon you'd be wanting your optimised offline BVH structures to be just as well suited for both systems, as at that point they're basically an asset you've spent time generating.
 
With mid gen consoles last gen, we went from 1.3-1.8 TFLOPs base consoles with 8 GB of DRAM to 4.2-6.0 TFLOPs with 8-12 GB of DRAM. All for $400-$500.

Can Sony or MS put out a mid gen console with 28-46 TFLOPs of performance, rocking up to 24 GB of DRAM, for $500 in 2024?
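(For reference, that range looks like last gen's uplift applied forward: PS4 to PS4 Pro was 1.84 to 4.2 TFLOPs, roughly 2.3x, and Xbox One to One X was 1.31 to 6.0 TFLOPs, roughly 4.6x, so a 2.3-4.6x uplift on top of today's ~10.3-12.2 TFLOPs machines lands roughly in that ballpark.)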

I highly doubt it. Inflation has been so bad that if MS and Sony put out a console today for $500 it probably would be spec’d similarly to what we already have now. An extra year would make little to no difference.

Mid gen consoles for this gen don't make a lot of economic sense, especially given certain realities. Inflated prices for PC gaming hardware don't make for an attractive incentive for console gamers to migrate to PC, nor do the less-than-flattering reports of all the stuttering displayed by PC titles at launch. The main motivating factors for releasing mid gen consoles last gen aren't so motivating today.
 