The defensive position that you're taking up seems to indicate otherwise ...
And the aggressive anti-RT position you're showing indicates otherwise too. Funny how these arguments only surface when AMD is beaten in RT time and again, which again suggests these fake arguments stem from a defensive position over AMD's weak spot in RT.
We'll see soon enough either way with the next batch of games coming up regardless ...
Yep, just stick around this thread, and enjoy the RT ride.
How do you reconcile that with the fact that virtually no developer implements NVIDIA's physics library anymore?
PhysX was a precursor to GPU particles, just like TXAA was a precursor to TAA.
What do you think is more important in winning a technical standard? The brand, or the technical specification/implementation itself?
NVIDIA forced display makers to make quality displays. The point of the G-Sync module was never to make things exclusive to NVIDIA; it was to force displays to meet a baseline of quality design that could deliver an optimal variable refresh rate experience. FreeSync encouraged the spread of trash displays, and later had to adopt several tiers to distinguish the good ones from the bad ones. With G-Sync you knew your display was good. That's why display makers rushed to get the G-Sync badge: having it meant the display was good. So you see, you are contradicting yourself again. What won here is the standard that promoted quality, not the standard that made a mess of quality and was rushed out to steal headlines with no regard for it.
If you care about proper implementations and specifications, then you know that's exactly what ended up winning with G-Sync.
which means the likes of Apple or others like ARM and Qualcomm have bigger voices at Unity
Who cares? Unity has RT, and that's what matters. Also, you are wrong: ARM GPUs are now RT-capable. Only Apple is left behind.
These two useless demos show that developers now have real capability in a real game engine to make content incompatible with HW RT!
Still two useless demos. The real demo is the Matrix demo, you know ... the one where you walk, fly, and drive around like an actual game. The one that actually supports HW-RT in spectacular fashion. When Epic wanted to make a next-gen demo, they made that demo and used HW-RT to pack the punch. And that's what the rest of the industry is doing.
How are we supposed to trust that developers won't use Nanite to render terrain or foliage? Epic Games is literally undermining the future of HW RT itself by opening the door to the creation of incompatible content ...
The UE5.1 RTX branch fully supports Nanite and foliage with very good performance. UE5 is a constantly changing landscape ... you are naive if you think Epic will risk losing NVIDIA's long-standing support. More RT features and enhancements will come, as Epic stated. So stick around and watch them do it.