> We have a whole forum dedicated to this

Changes absolutely nothing about what I said. Drawing some strict line between non- and mesh-shading-supported GPUs was clearly not a correct way to view things given these new results.
> Changes absolutely nothing about what I said. Drawing some strict line between non- and mesh-shading-supported GPUs was clearly not a correct way to view things given these new results.

We have a literal forum of people that have come together to discuss the various differences here. Even Alex writes about it here:
> We have a literal forum of people that have come together to discuss the various differences here. Even Alex writes about it here:

Take it up with him in his recent video then. Obviously he had views before that don't hold up now that he's actually talked to professionals about it.
> Are you suggesting AW2 is taking advantage of RDNA1 primitive shader hardware somehow? Using which api?

No, I am not claiming that personally. That is what Alex said when he talked to an actual developer from Remedy. You are free to take it up with him, but I still find it absolutely bizarre why some of you are so resistant to this and are taking up any argument tactic you think might work to deny it. Actually, that's a lie. I know exactly why some of you are doing it...
I don't know what Alex said or what resistance you're talking about. I didn't understand your last few posts, hence the question.
> Take it up with him in his recent video then. Obviously he had views before that don't hold up now that he's actually talked to professionals about it.

I think what _all_ of us are trying to say is that what Alex says in this video does not invalidate what we have written, but validates it.
Not that you guys are the gospel about any such subject anyways.
Why are you arguing this? I don't get it. Alex is literally not supporting what you're saying anymore. This isn't complicated, and I don't understand why you're being so stubborn about accepting that there's obviously some middle ground here that y'all hadn't considered before.
> Timestamped, and though I don't think anyone is arguing here, we are just trying to figure out how the RX 5700 is still "playable" despite not having DX Mesh Shader support.

My only guess would be that AMD does something driver-side, since devs have no access to NGG on Vega or RDNA 1.
> My only guess would be that AMD does something driver-side, since devs have no access to NGG on Vega or RDNA 1.

That is a real possibility. I could swear I read something about AMD doing something in the driver somewhere, but I have not been able to find it.
> It's a primitive shader.

The amplification shader step is optional.

When people refer to mesh shaders on DirectX, they are talking about both the amplification and mesh shader stages.

On PS5, they only have a primitive shader, which aligns with the mesh shader; they are missing an amplification shader, which means developers have to do that work themselves.

It's not a bad thing, and we have seen examples in the past: UE5's Nanite uses compute shaders and mesh/primitive shaders. But it is on the developer to do, which is why I don't think it's often used.
> What does an Amplification Shader do?
>
> While the Mesh Shader is a fairly flexible tool, it does not allow for all tessellation scenarios and is not always the most efficient way to implement per-instance culling. For this we have the Amplification Shader. What it does is simple: dispatch threadgroups of Mesh Shaders. Each Mesh Shader has access to the data from the parent Amplification Shader and does not return anything. The Amplification Shader is optional, and also has access to groupshared memory, making it a powerful tool to allow the Mesh Shader to replace any current pipeline scenario.
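To make that description concrete, here's a minimal HLSL sketch of an amplification shader doing per-meshlet culling and then dispatching mesh shader threadgroups, in the style of Microsoft's D3D12 meshlet-culling sample. The `Meshlet` layout, `IsVisible` test, and resource bindings are placeholders invented for illustration, not anything from the thread; it also assumes the 32-thread group maps to a single wave so the wave intrinsics can do the compaction.

```hlsl
// Assumed/illustrative resources -- not from the thread.
struct Meshlet { float4 BoundingSphere; }; // xyz = center, w = radius

StructuredBuffer<Meshlet> Meshlets : register(t0);
cbuffer Constants : register(b0) { uint MeshletCount; };

// Payload handed from the amplification shader to every mesh shader group.
struct Payload { uint MeshletIndices[32]; };
groupshared Payload s_Payload;

bool IsVisible(Meshlet m)
{
    // Placeholder test; a real shader would check the bounding sphere
    // against the view frustum and/or a normal cone.
    return m.BoundingSphere.w > 0.0f;
}

[numthreads(32, 1, 1)]
void ASMain(uint dtid : SV_DispatchThreadID)
{
    bool visible = false;
    if (dtid < MeshletCount)
        visible = IsVisible(Meshlets[dtid]);

    // Compact surviving meshlet indices into the payload
    // (assumes the group is one wave).
    if (visible)
    {
        uint index = WavePrefixCountBits(visible);
        s_Payload.MeshletIndices[index] = dtid;
    }

    // Launch one mesh shader threadgroup per visible meshlet.
    uint visibleCount = WaveActiveCountBits(visible);
    DispatchMesh(visibleCount, 1, 1, s_Payload);
}
```

This is exactly the "dispatch threadgroups of Mesh Shaders" role the quoted text describes: culled meshlets never reach the mesh shader at all, and each launched mesh shader group reads its meshlet index out of the payload.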
> My only guess would be that AMD does something driver-side, since devs have no access to NGG on Vega or RDNA 1.

Couldn't it just be that the fallback vertex shader path runs better on AMD hardware? A lot of competing AMD parts from that era had higher TFLOPS and better async compute. I seem to remember them being more sought after for mining and other compute-heavy tasks.
> Couldn't it just be that the fallback vertex shader path runs better on AMD hardware? A lot of competing AMD parts from that era had higher TFLOPS and better async compute. I seem to remember them being more sought after for mining and other compute-heavy tasks.

Certainly possible. Though with the prior pipeline, Pascal was still much better than even RDNA 1 when it came to geometry.
Btw I noticed Alex sounds very different on recent videos. Rich sounds different too in this last video. Is DF doing something funky with audio processing or are my ears starting to give out?
They're both ill. They mention it at the start of the video.
Also note that the main character doesn't cast a shadow from this light source, leading me to believe it's not a real light source in the RT scene, but indeed the contribution is baked. (Edit: just skimmed the DF tech article as well and they mention the same thing, so seems to be confirmed.)
That would explain it. Speedy recovery to both.