So we all agree that games giving additional scalable graphics settings and quality options that it's up to the user whether or not to activate is a good thing? Even if those settings perform better on one IHV than the other?
Surely everybody agrees on this.
I see two questions arising here:
1. Is there a pattern of disappointing RT support in AMD-sponsored titles?
If so, new questions follow from that:
Is AMD HW too weak with HW RT, so they tone the feature down to keep fps up?
Is AMD's software R&D weaker than NV's? Do game developers rely on this at all (considering it's the developers' task, not the HW vendor's)? Can we expect strong software R&D from every IHV just because NV has it?
On the HW side, AMD is quite behind at the moment, but they can easily support any future API changes. So the HW might age well, which is what's needed for a console generation?
On the SW side I don't request anything from them, other than API extensions to expose flexibility. Super resolution is fine for those who stick their nose to a 4K screen. At a comfortable distance this whole thing feels bogus to me.
2. Does RT add a new dimension to the 'lazy console port' debate?
Can it be scaled up to high end PC easily?
In the case of RE8 I'd say not really, because reflections don't matter much for a dull and dark horror setting like this, and mixed baked/dynamic GI is pointless anyway IMO.
I expect similar situations in many games which implement only a small number of RT effects. But hey - this is still mostly cross gen time. Not that impressive yet but it's something.