Nvidia are so good, we love them so much! Nvidiaaaaaaaaaaaa

Obviously RT vs RT, AMD isn't even close. And that's RDNA2 PC GPUs.
Or there's the idea that a performance delta doesn't have to be constantly posted when it's known and doesn't advance the discussion. The only reason for those here to constantly do so is some childish fanaticism with a company, or a financial benefit in doing so.

I know, it hurts, but hey, it's just computer technology in the end.
If you're referring to TPU's list then there's a bunch of made-up SKUs over there. AFAIK there's no laptop Radeon Pro W, nor a Pro 5300. There is one desktop Radeon Pro W5500 with Navi 14, but then again there's also a W5700 with a Navi 10 without mixed-precision dot product.

Navi 14 has 13 SKUs associated with it, with 7 being professional cards.
What would motivate AMD more? Including capabilities in low-end consumer cards specifically for gaming use that its mid-range and high-end cards lack? Or adding the capability to low-end professional cards, where deep learning is more readily utilized, with the side effect that some low-end consumer cards have the hardware because it's cheaper to repurpose than to redesign?
You can twist the "burden of proof" fallacy to use it both ways (which some here are doing). I can't ask anyone to "prove the feature isn't there", nor can you ask me to prove "the feature isn't absent".

There is no proof that PS5 has them either. The burden of proof shouldn't require any of us to prove a negative.
The PS5 definitely doesn't use DX12 Ultimate's VRS, nor DX12's Mesh Shaders, and Infinity Cache isn't there from looking at the X-rays from Fritz.

Should we assume, or have assumed (without any proof), that the PS5 had VRS, Mesh Shaders and/or Infinity Cache because Sony made no declaration that the PS5 doesn't sport those features?
I'm not readily assuming anything, just providing an opinion. I've been claiming no certainties whatsoever, other than that we can't be certain of a number of things being taken for granted by some.

AMD's RDNA white paper describes int4 and int8 support as being part of variants of the RDNA CU. That implies there are RDNA CUs with no mixed-precision dot product functionality. There is enough hardware variation between RDNA-based processors that you can't readily assume all the hardware is present but simply turned off or broken.
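For anyone wondering what that mixed-precision dot product actually does: it's a packed multiply-accumulate, i.e. several int8 (or int4) products summed into a 32-bit accumulator in one instruction on hardware that supports it. A rough scalar sketch of the int8 case (purely illustrative C++, not GPU code; the function name is made up):

```cpp
#include <cstdint>

// Illustrative only: the work that a single int8 dot-product instruction
// performs on RDNA variants that support it. Four int8 pairs are multiplied
// and summed into a 32-bit accumulator, which is why it's called
// "mixed precision": int8 inputs, int32 result.
int32_t dot4_i8_accumulate(const int8_t a[4], const int8_t b[4], int32_t acc)
{
    for (int i = 0; i < 4; ++i)
        acc += static_cast<int32_t>(a[i]) * static_cast<int32_t>(b[i]);
    return acc;
}
```

That's the primitive inference workloads lean on, which is why it matters more for the deep-learning angle discussed above than for games.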
There's a very good reason why I didn't dare post that "moot point" quote from Scott Herkelman in the PC GPU subs. Half the thread about RDNA2 GPUs consists of the same 5 users repeating the "but RT performance" and "what about DLSS" rhetoric to derail it.

Or there's the idea that a performance delta doesn't have to be constantly posted when it's known and doesn't advance the discussion. The only reason for those here to constantly do so is some childish fanaticism with a company, or a financial benefit in doing so.
Personally, I don't see why there's even a big argument over whether or not platform X, Y, or Z features a HW implementation of VRS ...
The PS5 definitely doesn't use DX12 Ultimate's VRS, nor DX12's Mesh Shaders, and Infinity Cache isn't there from looking at the X-rays from Fritz.
From Matt Hargett's comments, it seems Sony didn't see much value in DX12U's VRS, but it remains to be seen whether they're just not making use of the functionality in their SDKs, or whether there's really not the same hardware capability as in all the other RDNA2 GPUs. Maybe it is absent and they're doing it through software, or maybe they're implementing foveated rendering through hardware in a different way.
A hardware implementation for VRS will arguably see the most gains when the developer doesn't want to roll their own software solution; as in, they either don't bother at all or they use what's provided.
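For reference, this is what the "use what's provided" path looks like where it's exposed on PC/Xbox: D3D12's VRS lets the developer just set a coarser shading rate and let the hardware handle it. A minimal sketch (Tier 1 style, per-draw; assumes a command list from a device that reports VRS support, and the function name here is mine):

```cpp
#include <windows.h>
#include <d3d12.h>

// Sketch of D3D12 hardware VRS usage (Tier 1 style: one rate per draw).
// Assumes `cmdList` comes from a device whose D3D12_FEATURE_DATA_D3D12_OPTIONS6
// reports VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_1.
void DrawAtQuarterShadingRate(ID3D12GraphicsCommandList5* cmdList)
{
    // Shade once per 2x2 pixel block for the following draws. PASSTHROUGH
    // combiners mean no per-primitive or screen-space rate image modifies
    // the base rate.
    const D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH
    };
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, combiners);

    // ... record the draws that can tolerate coarser shading ...

    // Back to full rate for everything else.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
}
```

The "roll their own" alternative alluded to above is doing the same thing manually, e.g. rendering some passes at lower resolution or varying sample counts in the shader, which is more work and why many developers simply won't bother without a hardware path.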
Hum?

We know geometry can be done on compute, but no one wants to do it and most would rather go the route of Primitive/Mesh Shaders to perform that function, for instance.
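To make the Primitive/Mesh Shaders route concrete on the D3D12 side (PS5 exposes primitive shaders through its own API, so this is just the DirectX flavour): the fixed-function vertex front end is replaced by compute-style threadgroups that emit triangles, and the host code shrinks to a feature check plus a DispatchMesh call. A sketch, with PSO and shader authoring omitted:

```cpp
#include <windows.h>
#include <d3d12.h>

// Is the mesh shader pipeline available on this device?
bool SupportsMeshShaders(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                           &opts7, sizeof(opts7))))
        return false;
    return opts7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1;
}

// With a mesh-shader pipeline state bound, geometry is launched as
// threadgroups that read meshlet data and emit triangles themselves,
// instead of going through IASetVertexBuffers / DrawIndexedInstanced.
void DrawMeshlets(ID3D12GraphicsCommandList6* cmdList, UINT meshletGroupCount)
{
    cmdList->DispatchMesh(meshletGroupCount, 1, 1);
}
```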
Brian Karis of Epic said:

The vast majority of triangles are software rasterised using hyper-optimised compute shaders specifically designed for the advantages we can exploit.
As a result, we've been able to leave hardware rasterisers in the dust at this specific task. Software rasterisation is a core component of Nanite that allows it to achieve what it does. We can't beat hardware rasterisers in all cases though so we'll use hardware when we've determined it's the faster path. On PlayStation 5 we use primitive shaders for that path which is considerably faster than using the old pipeline we had before with vertex shaders.
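To illustrate what "software rasterised using hyper-optimised compute shaders" boils down to at its core (plain scalar C++ for clarity, not Nanite's actual code): for triangles approaching pixel size you can walk the bounding box and test coverage with edge functions yourself, and a compute shader doing that per tiny triangle is where the wins over the fixed-function rasteriser come from.

```cpp
#include <algorithm>
#include <cstdint>

// Minimal edge-function rasteriser, scalar and single-threaded for clarity.
// In a Nanite-style renderer the same idea runs inside a compute shader,
// one thread (or small group) per tiny triangle, writing to a visibility
// or depth buffer instead of a colour framebuffer.
struct Vec2 { float x, y; };

static float EdgeFunction(const Vec2& a, const Vec2& b, const Vec2& p)
{
    // Signed area test: which side of edge ab the point p lies on.
    return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
}

void RasterizeTriangle(const Vec2& v0, const Vec2& v1, const Vec2& v2,
                       uint32_t colour, uint32_t* framebuffer,
                       int width, int height)
{
    // Clamp the triangle's bounding box to the render target.
    const int minX = std::max(0,          (int)std::min({v0.x, v1.x, v2.x}));
    const int maxX = std::min(width - 1,  (int)std::max({v0.x, v1.x, v2.x}));
    const int minY = std::max(0,          (int)std::min({v0.y, v1.y, v2.y}));
    const int maxY = std::min(height - 1, (int)std::max({v0.y, v1.y, v2.y}));

    for (int y = minY; y <= maxY; ++y)
        for (int x = minX; x <= maxX; ++x)
        {
            const Vec2 p{ x + 0.5f, y + 0.5f };
            // Inside when all three edge functions are non-negative
            // (counter-clockwise winding assumed).
            if (EdgeFunction(v0, v1, p) >= 0.0f &&
                EdgeFunction(v1, v2, p) >= 0.0f &&
                EdgeFunction(v2, v0, p) >= 0.0f)
                framebuffer[y * width + x] = colour;
        }
}
```

The per-pixel loop is cheap when the bounding box is only a few pixels, which is exactly the regime where fixed-function hardware, tuned for larger triangles, loses its advantage.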
But even then, Epic are using the primitive shaders path for the bigger polygons because that is still more efficient than using their compute shaders method.

Yeah, that's sort of my point. They are the only ones that we know of so far. Not many companies are (largely) bypassing the entire front end just yet.
Yeah, I don't think they can use it everywhere IIRC. I think Nanite is only for static geometry. I think object geometry is likely handled with Primitive/Mesh Shaders.

But even then, Epic are using the primitive shaders path for the bigger polygons because that is still more efficient than using their compute shaders method.
Yeah, I don't think they can use it everywhere IIRC. I think Nanite is only for static geometry. I think object geometry is likely handled with Primitive/Mesh Shaders.
The whole point is that just because software variants exist doesn't mean developers will go for them. It's costly for studios to invest so much in an engine R&D team and build a game simultaneously. Only a handful of companies have this.
Misinterpreted by you then.

From Matt Hargett's comments, it seems Sony didn't see much value in DX12U's VRS,
Sure. I don't disagree. I guess what I meant to write is that historically, studios that are shipping games while advancing features tend to do it slower than teams that basically just keep working on engine development 24/7.

I wouldn't doubt they're trying for all geometry on Nanite. Media Molecule can do it, skinned meshes too. And hey, look at that RT performance: you've got RT shadows and ambient occlusion on a PS4 in realtime. Not to mention basically unlimited detail already; hey, look, more detailed models than Horizon Forbidden West, done by one person! The cost is just plain worth it if the studio has the technical chops to pull it off; the entire point is that the normal front end is slow in comparison. And it is, why are you even drawing giant polygons if the idea is subpixel detail?
This "slow and gradual" mindset makes no sense to me. Replacing triangles works, and it works really, really well. The triangle pipeline has grown into a leviathan of a headache, how many representations do you need simultaneously? BVH, LODs, physics mesh, capsules for GPU animation tricks. How much of an art pipeline do you need to go through, how long would devs sit there tackling that? High res mesh, low res mesh, UV map, bake lods, bake normals, and now on top of normal material painting you get to do a ton of other parameter painting as well!
Or you can look at Dreams and it just... kinda works. Average users can use it; dedicated amateurs can make an entire Sonic game by themselves without any training. The time and cost savings potential is absolutely enormous. What's the argument for the traditional pipeline, that it's "familiar"? Don't take the "risk"? It seems like nonsense. Which is part of the reason why I see even more contraction among high-end game engines in the future. Yes, smaller studio with a hundred people still trying to make a high-end game, you can make your own game engine. Or you can work through the pain of using UE5, or whatever it is Embark Studios is making, etc., and probably save time and money and end up with a better-looking game.
The fact that it requires VRS is maybe why they haven't released it for PS5 as well.

FidelityFX was just announced for XSX/S and AMD has a dedicated page for it with the features it supports atm: https://gpuopen.com/xbox/#fidelityfx
It is being added to the Xbox GDK.
It's FidelityFX VRS, which is just a module of the FidelityFX toolkit.

I'm a little out of the loop.
Wasn't VRS and all already available on Series before FidelityFX?
Yes, there's VRS implemented since launch-date games for the Series consoles.

Wasn't VRS and all already available on Series before FidelityFX?