And the offensive, anti-RT position you are taking is indicative of this too. Funny how these arguments only show up when AMD is beaten time and again in RT, which again indicates that these fake arguments stem purely from a defensive posture around AMD's weak spot in RT.
Yep, just stick around this thread, and enjoy the RT ride.
Either way only time will tell ...
PhysX was a precursor for GPU particles. Just like TXAA was a precursor for TAA.
That must be a very polite way of describing it as a dead end ...
Mantle was a precursor to more explicit APIs, but would you award it a similar amount of credit by your own logic ?
NVIDIA forced display makers to make quality displays. The point of the G-Sync module was never about making stuff exclusive to NVIDIA; it was to force displays to meet some sort of quality bar to deliver an optimal variable refresh rate experience. FreeSync encouraged the spread of trash displays, and later had to adopt several tiers to distinguish the good ones from the bad ones. With G-Sync you knew your display was good. That's why display makers rushed to get the G-Sync badge: having it meant the display was good. So you see, you are contradicting yourself again. What won here is the standard that promoted quality, not the standard that made a mess out of quality and was rushed out to steal headlines with no regard for quality.
If you care about proper implementations and specifications then you know it's the one that ended up winning with G-Sync.
@Bold And look where that got Nvidia ? Everyone else is implementing alternative technology that isn't even tied to the original brand! G-Sync became both redundant and obsolete overnight once Nvidia decided to embrace the industry-backed alternative that didn't originate from them. No one but Nvidia "implements" the G-Sync "standard" in the industry ...
Can you make any real guarantees that the original G-sync displays will still somehow work on Nvidia HW in the next 5 years ? How about in 10 years ?
Who cares? Unity has RT, that's what matters. Also you are wrong, ARM GPUs are now RT capable. Only Apple is left behind.
Well consumers and technical outlets certainly care because using a port of an iOS game isn't compelling content since the design limitations are still carried over from that platform. Adding ray tracing to a port of a mobile game doesn't change the fact that people are still playing a low budget production value game ...
It's not just Apple that has yet to divulge any details; Qualcomm doesn't have any public plans for HW RT so far either. Together, Apple and Qualcomm dominate the high-end mobile device market, and Unity Technologies, with their mobile-centric engine, can't ignore their voices, since the demographics they appeal to (America, high-income European countries) are more likely to pay for lucrative services than the average ARM graphics user ...
Still two useless demos. The real demo is the Matrix demo, you know .. the one where you walk, fly and drive around like an actual game. The one that actually supports HW-RT in a spectacular way. When Epic wanted to make a next-gen demo, they made this one, and they used HW-RT to pack the punch. And that's what the rest of the industry is doing.
How were the other demos any less "real" when they had interactivity as well ?
The UE5.1 RTX branch fully supports Nanite and foliage with very good performance. UE5 is a constantly changing landscape .. you are naive if you think Epic will risk losing NVIDIA's long-standing support. More RT features and enhancements will come, as Epic stated. So stick around, and watch them do it.
Can the latest RTX branch actually run those "two useless demos" with HW RT yet ?
Notice how you mentioned that Nvidia's "support" comes in the form of their own proprietary fork (the RTX branch), as opposed to actually contributing back to the upstream/master branch ? Epic Games doesn't really care about the RTX branch, since it's mostly full of Nvidia's libraries for RT effects, which Epic Games themselves don't maintain at all. The RTX branch could burn in hell and Epic Games still wouldn't care ...
Do you seriously think that developers will actually prefer the RTX branch over Epic Games' own upstream/master branch, where it's far more likely that Epic Games will provide superior technical support for their own product ?