One question comes to mind: why does raytracing reduce performance in the first place? Is there a second bottleneck that is independent of conventional rendering (or of the hardware responsible for it)? If so, that part of the hardware could be upgraded independently in future generations.
If the RT/Tensor cores make up, say, 30% of the chip area, couldn't that 30% be scaled up disproportionately with future die shrinks to minimize the fps loss as quickly as possible? If so, the outlook for raytracing in future generations would be much better in terms of resolution and frame rate.
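A rough back-of-the-envelope sketch of that idea (the frame timings are made-up assumptions, not measured numbers, and the model is deliberately simple: RT throughput scales linearly with the die area spent on RT cores, and the two passes run one after the other):

# All numbers below are hypothetical assumptions for illustration.
raster_ms = 10.0   # assumed raster/shading time per frame (ms)
rt_ms = 15.0       # assumed raytracing time per frame on today's RT cores (ms)

for rt_area_scale in (1, 2, 4, 8):   # shrink budget spent mostly on RT cores
    frame_ms = raster_ms + rt_ms / rt_area_scale   # serial model, no overlap
    print(rt_area_scale, round(1000 / frame_ms), "fps")

This prints roughly 40, 57, 73 and 84 fps: the gains flatten as soon as the raster part dominates, so even aggressive scaling of the RT area only helps up to a point.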
Other points:
1) Naive first implementations. There is still some potential for parallelization and optimization (as seen in a Digital Foundry video)
2) Developers have no experience with real-time raytracing yet
3) Many rays also need shading, so the work does not stay on the RT cores alone
If a developer has only a short time and just wants to make the feature look good, it will almost automatically become a massive performance bottleneck.
With pure rasterization, everything lands on the shaders, and a GPU with double the throughput gets roughly twice as fast. With a hybrid approach, the balance has to be right: if the rasterization part finishes too quickly, it has to wait for the raytracer; if the raytracer finishes too quickly, it has to wait for rasterization.
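A toy model of that balance problem (the timings are assumptions; the model assumes the raster pass and the ray pass overlap fully on their separate hardware units):

def frame_ms(raster_ms, rt_ms):
    # With full overlap, the frame takes as long as the slower pass;
    # the faster unit idles for the difference.
    return max(raster_ms, rt_ms)

print(frame_ms(10.0, 15.0))  # 15.0 ms: the shaders idle 5 ms waiting on the rays
print(frame_ms(10.0, 7.5))   # 10.0 ms: doubling RT throughput helped, but any
                             # further RT speedup is wasted; now the rays wait

This is why doubling only one side of a hybrid pipeline does not double the frame rate: the speedup only counts until the other side becomes the bottleneck.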
If raytracing is bolted onto a game and all the lighting is replaced after the fact, it is probably not trivial to get that balance right.
The question is whether the fps drop goes to zero as raytracing performance increases toward infinity, or whether the conventional hardware is also significantly loaded by it. If the former is the case, better raytracing performance would be easier to guarantee in future generations, because one cannot scale up all units at once.
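A sketch of the two cases (all numbers are made-up assumptions; s stands for the hypothetical RT speedup factor):

# Case A: rays only load the RT cores.
# Case B: rays also add shading work on the conventional shader units.
raster_ms, rt_ms, extra_shading_ms = 10.0, 15.0, 4.0  # assumed timings
for s in (1, 10, 1_000_000):                          # s growing toward infinity
    case_a = raster_ms + rt_ms / s
    case_b = raster_ms + extra_shading_ms + rt_ms / s
    print(s, round(case_a, 2), round(case_b, 2))

Case A converges to the pure-raster 10 ms, i.e. the drop goes to zero. Case B floors at 14 ms, because the shading cost of the rays stays on the conventional units no matter how fast the RT cores get, and that floor can only be lifted by scaling the shaders too.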
I meant in comparison to the previous-gen prices. You can find it starting at $500, the same price as the 1080 when it launched, but faster and with more features.
Turing also addresses some of the performance issues of Nvidia GPUs. I would not buy a Pascal GPU anymore.