What? You choose which samples get rendered at lower resolutions. Clearly it's an effect that can vary based on implementation.
Personally I see value in VRS, but it probably won't be truly leveraged until later in the generation, once games get more ambitious in scope and need to squeeze out as much performance as possible. Ideally you apply a technique like VRS to parts of the frame the player isn't going to scrutinize, where it has no major impact on visual quality or fidelity.
Like any technique, if it's used skillfully it will absolutely make a difference. It doesn't even matter if the performance gain it gives is "less" than other techniques; they all add up.
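To make the "choose what gets shaded at lower resolution" part concrete, here's a minimal sketch of the simplest form of VRS in D3D12 (Tier 1, per-draw). The function name and numbers are made up for illustration; this isn't how Dirt 5 or Gears 5 actually wire it up.

```cpp
// Minimal D3D12 Tier 1 VRS sketch: one low-importance draw shaded at 2x2.
// 'cmdList' is assumed to be an already-recorded ID3D12GraphicsCommandList5*.
#include <d3d12.h>

void DrawSmokeCoarse(ID3D12GraphicsCommandList5* cmdList)
{
    // Produce one shading result per 2x2 pixel block for this draw only,
    // e.g. for low-frequency content like smoke or dust, never for faces.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);

    cmdList->DrawInstanced(/*VertexCountPerInstance*/ 6,
                           /*InstanceCount*/ 1024,
                           /*StartVertexLocation*/ 0,
                           /*StartInstanceLocation*/ 0);

    // Back to full rate so the following draws (characters, UI) are untouched.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
}
```

The pixel shader still runs everywhere; it just runs once per block instead of once per pixel for that draw, which is where the savings (and the potential blur) come from.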
Essentially this. There was something bugged with it which I "confirmed" caused artifacting in the game. That bug was fixed in the subsequent 2.0 patch, and now the game looks great. It looks like the Series X uses a lower-precision implementation of VRS, which still causes some nasty artifacting. It also runs at a lower base resolution than what I was running on my PC, so that doesn't help things either.
In the beginning I wasn't sure what was happening with the VRS artifacting in Dirt 5... I wasn't even really sure if it was a VRS issue... until I played Gears 5 Hivebusters with the new patch that added VRS and other graphics options to the game. When I started that game up and turned on VRS... I had similar issues to the ones Dirt 5's VRS implementation had.
You can easily see it on the faces.
But then... I tried something. I turned off async compute... and it fixed the problem. So it's something about running VRS and async compute simultaneously that causes the issue (at least on my machine/drivers, which are fully updated btw).
So perhaps there's currently an issue with Nvidia when VRS and async compute are used together? Maybe Codemasters realized this and disabled async compute on Nvidia GPUs? Maybe not... I dunno. It's hard to tell because Dirt 5 doesn't have options to enable/disable the two settings separately like Gears 5 does.
It's worth noting, though, that Gears Tactics works perfectly fine with both VRS and async compute enabled... so I'm thinking it's just a quirk with Gears 5 and the previous build of Dirt 5.
Lol that's exactly how not to use VRS xD. Things like faces, or I'd argue anywhere on the player character model, are probably best left at full rate, because the coarser shading will be more noticeable there, especially over time. Applying it to things that aren't of visual significance always seems like the better (and, I'd argue, the intended) use.
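For what it's worth, that "keep it off the faces" idea is basically what Tier 2's screen-space shading-rate image is for: the engine fills a small per-tile texture with rates, so it can force full rate on tiles covered by a character/face mask and coarsen everything else. A rough sketch of what filling those tiles could look like (the importance mask and helper are made up for the example, not taken from either game):

```cpp
// Rough Tier 2 VRS sketch: pick per-tile shading rates from an importance mask
// (e.g. tiles covered by faces / the player character stay at full rate).
// tilesW/tilesH come from the screen size divided by the tile size reported in
// D3D12_FEATURE_DATA_D3D12_OPTIONS6::ShadingRateImageTileSize.
#include <d3d12.h>
#include <cstdint>
#include <vector>

std::vector<uint8_t> BuildShadingRateTiles(const std::vector<bool>& tileIsImportant,
                                           uint32_t tilesW, uint32_t tilesH)
{
    std::vector<uint8_t> rates(tilesW * tilesH);
    for (uint32_t i = 0; i < tilesW * tilesH; ++i)
    {
        // Full 1x1 shading where it matters, 2x2 everywhere else.
        rates[i] = tileIsImportant[i] ? D3D12_SHADING_RATE_1X1
                                      : D3D12_SHADING_RATE_2X2;
    }
    // These bytes would then be uploaded into a DXGI_FORMAT_R8_UINT texture and
    // bound with ID3D12GraphicsCommandList5::RSSetShadingRateImage(...).
    return rates;
}
```

Real engines derive that mask from things like depth, motion vectors, or luminance contrast rather than a hand-made character mask, but the principle is the same.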
The reason VRS and async compute aren't playing nicely together in these games, on Nvidia cards no less, sounds like a driver incompatibility issue. As for what tier of VRS Series X is using, how many tiers of VRS are there exactly? MS mentioned the Series systems supporting Tier 2, but is there a Tier 3 already? If they're supporting the full suite of RDNA 2 features and there's a Tier 3 that MS somehow isn't using, well, that'd be a bit of a facepalm imo.
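On the tier question: as far as the D3D12 API goes there are currently only two tiers. Tier 1 gives you per-draw rates; Tier 2 adds per-primitive rates, the screen-space shading-rate image, and the combiners. Tier 2 is what MS has said the Series consoles support, and there's no Tier 3 in the API today. It's literally just an enum you query:

```cpp
// Query VRS support in D3D12. The tier enum only has three values today:
// NOT_SUPPORTED, TIER_1 and TIER_2. 'device' is an existing ID3D12Device*.
#include <d3d12.h>

D3D12_VARIABLE_SHADING_RATE_TIER QueryVrsTier(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options6 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                              &options6, sizeof(options6))))
    {
        // Tier 1: per-draw shading rate (RSSetShadingRate).
        // Tier 2: adds per-primitive rates, the shading-rate image,
        //         and the combiners that blend those sources.
        return options6.VariableShadingRateTier;
    }
    return D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED;
}
```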
So at least regarding Series X performance with VRS in Hivebusters, I'm leaning more toward it being a slight lack of optimization time, and possibly something in the engine causing a bit of incompatibility that needs some tuning. Gears 5 had some odd issues running on Series X, as Digital Foundry pointed out a while back, things the team was aware of and ended up fixing with an update, though I don't remember exactly what they did to fix them. It could be something similar here, requiring a bit more optimization of VRS in the Gears 5 engine and, obviously, the DiRT 5 engine as well, which would probably need even more work given its current state.
There is no good implementation of VRS. It's not a reconstruction tech like DLSS, CBR, or TI, where the positives should easily outweigh the negatives. It's a method to gain a few FPS while losing image sharpness. It could explain why they added a sharpening filter on XSX (not on PS5) at 120Hz in that patch, in order to improve image clarity.
Gonna have to disagree on this one. By that notion foveated rendering would fall in a similar ballpark, but IMO it doesn't. Any technique needs careful usage to leverage its benefits; some just require more skillful handling than others. Part of that also comes down to the development time available.
Also, IIRC the sharpening filter was mentioned in reference to Series S, not Series X. I could be wrong, but when MS rolled out the Series S reveal, and in a couple of Jason Ronald's subsequent interviews, they specified an image-sharpening filter for that hardware; I haven't seen any mention of something similar for Series X.