However, the idea of playing an FPS with a controller, or being locked into stale console hardware, middling IQ and no opportunity to configure games to my liking, is so much worse.
While I agree with all of the above, I think there's an increasing number of mitigating factors for each point:
- You can use KBM on the consoles. We just need third-party FPS developers to increase the number of games that officially support it (Overwatch, Elder Scrolls Online and Call of Duty already support it).
- Console hardware became less stale with the launch of the mid-gens. We're now getting a new console every 3-4 years. Is your PC upgrade cadence much faster than that?
- Middling IQ was the main point of the discussion. It seems the ROI on IQ is ever decreasing. I wonder whether, once the consoles start using whatever FSR turns out to be, the IQ difference will become moot.
- Some of the conventionally moddable games like Skyrim are already moddable on the consoles too.
It looks like both consoles have been targeting an increasing chunk of the PC crowd. From integrated audio+video streaming support, modding, KBM, mid-gen releases to even native 1440p monitor output, it seems like both platforms are slowly converging into a single "gamer" market.
Just as I thought. Even in AMD-sponsored and console-optimized raytracing titles, Nvidia pulls far ahead of AMD.
Can you link to independent, publicly available benchmarks for this AMD-sponsored raytracing title you speak of?
If those assumptions come from the RE8 recommended specs, are we also supposed to believe the RX 6800 offers the exact same 4K-45FPS performance as the RX 6700 XT, despite benchmarks showing 24-32% higher 4K+RT performance on the former? Or that the 3070 is only 33% faster in 4K RT than the 2070, when other benchmarks point to a 65-70% difference?
And where does that post, or AMD, claim their RT performance would be better than NVIDIA's?
It (I) did not, as clarified in a later post. Nor did AMD, who very recently stated quite clearly that they know their RT performance isn't matching Ampere's and aren't trying to hide it (nor the fact that they think rasterization performance is more important and that their RT performance is sufficient).
My problem here is yet another derailment of this thread into a series of "Nvidia has the best raytracing hardware" statements, made in response to a claim they themselves fabricated. It's a speech that apparently needs to find its way into this thread every handful of pages.
I don't get what's actually being claimed here. Is this just more "Nvidia are gods" crap?
I'm surprised DLSS hasn't been put into the mix yet.