Is a 2060 at 1080p (or 1440p with DLSS) accessible enough? Or should we only look at 4K ultra to decide what’s playable?
https://www.pcgamesn.com/control/nv...formance-benchmarks#nn-graph-sheet-1440p-dlss
I finished Control. You think Control is a game-changer in visual immersion?!
Can you compare Control with maxed-out RT to, e.g., Demon's Souls (2020)?
We're still missing what it takes for RT to truly take off.
Here's the situation with real-time raytracing. There are three important factors:
- Raytracing being used at a level that is universally perceived as substantially better than a rasterization trick (and without sacrificing everything else, like you see in e.g. LEGO or RTX Minecraft)
- Getting good enough performance with raytracing
- Using affordable hardware
In 2021, you pick two. You can't pick three. I hope you don't think Control is where you get all three.
Given that we're in Q4 2021 and the next generation of Nvidia and AMD graphics cards is expected to cost an arm and a leg, picking all three won't happen in 2022 either, and I doubt 2023 will be much better.
Now I’m really curious to know which graphics tech you think has been more transformative in a similar timeframe (3 yrs).
RT is up there with 3D acceleration and unified shaders.
Texture filtering, pixel shaders and then unified shaders (which eventually gave way to compute shaders). Next one for me is definitely virtualized geometry for "unlimited" geometry detail, without a shred of doubt.
Raytracing is super cool, but it's just not feasible to use across a broad range of hardware in a meaningful way, so I think RT's "transformative 3 years" won't start until 2023.
They can't hold that opinion forever. As long as the limits of graphical prowess keep being pushed, more power is required.
Of course they can't and they won't!
All those videos I shared from the past 6 months present opinions based on what they have now and what is expected over the next two years. They all say that in some form or another.
All companies are moving back towards dedicated accelerators, provided people want to see better graphics, and people will eventually have to come to terms with that over time; the better accelerators will deliver the best performance in games.
All companies? Intel and Nvidia are, for their dGPUs.
But Microsoft and Sony, whose consoles move the majority of AAA game sales at launch, decided to stick to an architecture that repurposes the TMUs for RT acceleration and doesn't use dedicated tensor cores.
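For context on what "repurposing the TMUs" means in practice: as I understand it, RDNA2's ray accelerators handle the per-ray box and triangle intersection tests in the texture unit path, while BVH traversal runs in shader code (unlike Nvidia's RT cores, which do traversal in fixed function too). Here's a minimal CPU-side sketch of those two intersection tests, purely for illustration; the names and structure are mine, not any vendor's API:

```cpp
// Sketch of the two per-ray tests that RT hardware accelerates:
// ray/box (at every internal BVH node) and ray/triangle (at the leaves).
// Illustrative only; not how any driver or hardware actually exposes this.
#include <cstdio>
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
static float dot(Vec3 a, Vec3 b)  { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Ray/AABB "slab" test: runs for every internal BVH node visited.
bool rayIntersectsBox(Vec3 origin, Vec3 invDir, Vec3 boxMin, Vec3 boxMax) {
    float t1 = (boxMin.x - origin.x) * invDir.x, t2 = (boxMax.x - origin.x) * invDir.x;
    float tmin = fminf(t1, t2), tmax = fmaxf(t1, t2);
    t1 = (boxMin.y - origin.y) * invDir.y; t2 = (boxMax.y - origin.y) * invDir.y;
    tmin = fmaxf(tmin, fminf(t1, t2)); tmax = fminf(tmax, fmaxf(t1, t2));
    t1 = (boxMin.z - origin.z) * invDir.z; t2 = (boxMax.z - origin.z) * invDir.z;
    tmin = fmaxf(tmin, fminf(t1, t2)); tmax = fminf(tmax, fmaxf(t1, t2));
    return tmax >= fmaxf(tmin, 0.0f);
}

// Moller-Trumbore ray/triangle test: runs at BVH leaves.
bool rayIntersectsTriangle(Vec3 origin, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2, float* tOut) {
    const float EPS = 1e-7f;
    Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    Vec3 p = cross(dir, e2);
    float det = dot(e1, p);
    if (fabsf(det) < EPS) return false;   // ray is parallel to the triangle
    float inv = 1.0f / det;
    Vec3 s = sub(origin, v0);
    float u = dot(s, p) * inv;
    if (u < 0.0f || u > 1.0f) return false;
    Vec3 q = cross(s, e1);
    float v = dot(dir, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return false;
    float t = dot(e2, q) * inv;
    if (t <= EPS) return false;           // hit is behind the ray origin
    *tOut = t;
    return true;
}

int main() {
    Vec3 origin{0, 0, -5}, dir{0, 0, 1};
    Vec3 invDir{1.0f / dir.x, 1.0f / dir.y, 1.0f / dir.z}; // infinities are fine for the slab test
    bool boxHit = rayIntersectsBox(origin, invDir, {-1, -1, -1}, {1, 1, 1});
    float t = 0.0f;
    bool triHit = rayIntersectsTriangle(origin, dir, {-1, -1, 0}, {1, -1, 0}, {0, 1, 0}, &t);
    printf("box hit: %d, triangle hit: %d at t=%.1f\n", boxHit, triHit, t);
    return 0;
}
```

A GPU does millions of these tests per frame, which is exactly why doing them in shader ALUs versus dedicated units matters so much for the performance point above.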
The first RT implementation in a smartphone is apparently coming in the form of the Exynos 2200, which uses RDNA2 just like the consoles. The first RT implementation in a handheld console is arguably the Steam Deck, with another RDNA2 GPU.
Is Apple expected to use exclusively dedicated RT units in their future iGPUs? Is there a chance Nintendo will actually pay for the footprint of an SoC with dedicated RT units for their next handheld?