Next gen lighting technologies - voxelised, traced, and everything else *spawn*

Is 30% silicon or whatever it is for raytraced reflections in BFV worth it versus the higher resolutions and framerates obtainable without?
First, it may not even be 10%; we have yet to determine that exactly. And if it were truly just 10%, it would be worth every cost.

But let's, for argument's sake, assume it is indeed 30%. How much performance do you think 30% more traditional silicon will give you? Because it won't really be 30% more performance at all. A Titan V has about 17% more cores, 45% more ROPs and 12% more texture units, yet despite all of that it's still slower than a 2080 Ti. So 30% more silicon area might actually land you in diminishing-returns territory: you could maybe get 10% more performance, or 15% at best. Which is massively underwhelming. You are better off spending that area on image-quality features and non-conventional performance boosters.
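To put rough numbers on that diminishing-returns point, here's a toy Python model. The scaling exponent `alpha` is purely an assumption for illustration, not a measured figure:

```python
# Toy model: performance scales sublinearly with execution units,
# perf ~ units**alpha with alpha < 1, because bandwidth, scheduling
# and utilisation don't scale with raw unit count. alpha is assumed.
def perf_gain(extra_units: float, alpha: float) -> float:
    """Relative performance gain from adding `extra_units` more hardware."""
    return (1.0 + extra_units) ** alpha - 1.0

for alpha in (0.4, 0.5, 0.6):
    print(f"alpha={alpha}: 30% more silicon -> "
          f"{perf_gain(0.30, alpha):.1%} more performance")
# With these exponents, 30% more area buys only roughly 11-17% more speed.
```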
 
As they should be. Show me a game that has this much dynamic, true and accurate reflection across dozens of objects like this.
I mean pricey in terms of dollars gamers need to spend to unlock.

Also, NVIDIA has proven they can push their own technology even when it's proprietary and exclusive; that's what they did with TXAA, HairWorks, PhysX and other effects. GameWorks features thrive to this day. This time DXR is a standard, and it's already seeing a good adoption rate before it has even lifted off. They can support it until it flourishes.
Well, DXR in DirectX isn't going to amount to much if the hardware can't run it quickly enough.

Dynamic destruction is hard using baked GI because it looks horrendous when it breaks the lighting, it needs dynamic GI, and we now have it courtesy of RT.
That's true, and as a result I'd say RT should be used primarily for lighting. I have no problem with that either, but it just means representing the tech a little less enthusiastically. Which of course isn't what selling a product is about. ;)

Pick one for now on the first RTX gen, with gen 4 or 5 we could be talking about all of them combined and with increased rays per pixel too.
Well sure, but we can't realistically have a discussion about tech five generations from now, especially in a thread discussing measuring performance ;). Five generations from now should be in the realm of lithographic nightmares. I suppose for now we can be sure of a node shrink for the RTX 2070/2080, which may double the power, but beyond that it's uncertain when and where improvements in processing power will happen.
 
I mean pricey in terms of dollars gamers need to spend to unlock.
Only the 2080 Ti is pricey at the moment. A 2070 is very reasonably priced. And the 2080 is coming down in price now that the 1080 Ti is disappearing. And even if it didn't, you are only paying about $100 more for extra visual flair and potential features. That's not pricey. With the advent of 7nm, prices will become even more stable as die sizes shrink.

Well, DXR in DirectX isn't going to amount to much if the hardware can't run it quickly enough.
I agree, but "quickly enough" is relative. Developers and most gamers consider 60 fps enough; consoles are content with 30 fps. These are not hard goals to meet.
That's true, and as a result I'd say RT should be used primarily for lighting
And why not shadows too? It's useful there as well and avoids many of the flaws of older techniques. Besides, we have seen the limits of what rasterization can do in that arena, with techniques like VXAO and HFTS being so expensive for what they offer.

Well sure, but we can't realistically have a discussion about tech five generations from now, especially in a thread discussing measuring performance
We can if the discussion is geared toward an unrealistic scenario of doing an unprecedented number of visual features all at once! It's like demanding real-life graphics right now just because RTRT is finally possible on current hardware!

but it just means representing the tech a little less enthusiastically.
Enthusiasm is warranted because we can finally push the boundaries of visual flair in games, especially in the PC space, where visuals have stagnated for far too long. When was the last time we saw such wide and far-reaching dynamic reflections in a game? Was there ever a game with reflections so pervasive?
 


Not at all that bad; I mean, this GPU is doing 2560x1440 Ultra settings with RT/DXR on Ultra at an average of 77 fps. I thought the performance impact would be much greater. Not bad for a game not even fully optimized for RT, with early drivers etc.
I'm guessing this is highly scene-dependent, since TPU for example got rather different results: https://www.techpowerup.com/reviews/Performance_Analysis/Battlefield_V_RTX_DXR_Raytracing/4.html
[attached benchmark chart: 1440.png]


Guru3D is also along the lines of TPU: https://www.guru3d.com/articles_pages/battlefield_v_pc_performance_benchmarks,7.html


PC Games Hardware is painting an even worse picture: http://www.pcgameshardware.de/Battl...5-Day-1-Patch-Direct-X-12-Nvidia-RTX-1269296/
 
PCGH is using the Tirailleur stage, a very demanding stage indeed, but one that has a known bug that greatly reduces fps.
DXR Performance degraded in maps which feature a lot of foliage
This particularly affects War Stories "Liberte" and "Tirailleur"
Status: Currently investigating
https://forums.battlefield.com/en-us/discussion/161023/battlefield-vs-known-issues-list

Multiplayer maps are free of this issue.
 
PCGH is using the Tirailleur stage, a very demanding stage indeed, but one that has a known bug that greatly reduces fps.
Multiplayer maps are free of this issue.

I guess it's always possible to find bad performance figures if one wants, but BF is about multiplayer. 64-player Conquest maps have always been the heaviest in BF games.


The performance (RTX 2080Ti with 1440p/60fps) is better than I expected it would be and this is just the beginning of raytracing. In addition, reflections seem to be one of the most extensive effects while AO and GI are faster.

Exactly my thoughts. There were many anti-RT/RTX voices, but it didn't turn out as bad as some thought it would. A developer like Insomniac could probably do something nice with a GPU like that.
 
I want your disposable income; $550 to $600 is "reasonably priced"?

What is reasonable anyway, for a 2070? For me personally, Nvidia's GPUs are way too expensive. I thought my GF3 Ti 500 was expensive in 2001... it doesn't seem expensive now. The Ti 4200 was extremely well priced for its performance. Prices need to come down...
 
One question arises for me: why does raytracing reduce performance in the first place? Is there a second bottleneck that is independent of the conventional rendering (or of the hardware responsible for it)? If that is the case, then one could scale up that area of the hardware independently in the next generations.

If the RT/Tensor cores make up (for example) 30% of the chip area, then one could scale up that 30% much more aggressively with future shrinks to minimize the fps loss as quickly as possible. If so, it would be a much better outlook for raytracing and future generations in terms of resolution and frame rate.

Other points:
1) Naive implementations: there is still some potential for parallelization and optimization (as seen in a Digital Foundry video).
2) No experience with real-time raytracing among developers.
3) Many rays also need shading.
If a developer has only a short time and wants to make a feature look good, it will automatically become a massive performance bottleneck.

With rasterizers, everything that lands on the shaders gets roughly twice as fast on a GPU with double the throughput. With a hybrid approach the balance has to be right: if the rasterization part finishes too quickly, it has to wait for the raytracer; if the raytracer finishes too quickly, it has to wait for the rasterization.
If one bolts raytracing onto a game and replaces all the lighting afterwards, it is probably not trivial to just get it done.

The question is whether the drop, with theoretically infinitely increased raytracing performance, goes to zero, or whether the conventional hardware is significantly tied up by it. If the former is the case, better performance with raytracing would be easier to guarantee in future generations, because one can't scale up all the units.
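That "infinite raytracing performance" question can be sketched with a simple serial frame-time model. The numbers below are hypothetical, and the model ignores any contention between the raster and RT parts:

```python
# Amdahl-style sketch: a hybrid frame = raster work + traced work.
# Speeding up only the RT hardware pushes the frame time toward the
# raster-only baseline, but never past it. All numbers are hypothetical.
def frame_time_ms(raster_ms: float, rt_ms: float, rt_speedup: float) -> float:
    return raster_ms + rt_ms / rt_speedup

raster_ms, rt_ms = 10.0, 15.0   # e.g. 100 fps raster-only, 40 fps hybrid
for speedup in (1, 2, 4, 1_000_000):
    fps = 1000.0 / frame_time_ms(raster_ms, rt_ms, speedup)
    print(f"RT units {speedup:>7}x faster -> {fps:6.1f} fps")
# The fps loss only shrinks toward zero if the raster side isn't also
# tied up feeding the raytracer (BVH updates, denoising, etc.).
```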

I meant in comparison to previous-gen prices. You can find it starting at $500; that's the same price as the 1080 when it launched, but faster and with more features.

Turing also removes some of the performance issues of Nvidia GPUs. I would not buy a Pascal GPU anymore.
 
Well sure, but we can't realistically have a discussion about tech five generations from now, especially in a thread discussing measuring performance ;). Five generations from now should be in the realm of lithographic nightmares. I suppose for now we can be sure of a node shrink for the RTX 2070/2080, which may double the power, but beyond that it's uncertain when and where improvements in processing power will happen.
If I can borrow one of your earlier talking points “we can’t expect developers to invest all this effort for graphics for a niche population”. I mean we also have to consider development time to get AO, GI, Shadows all working into their pipeline as well. Ignoring performance there is still development time required, on generation 1 equipment and algorithms. I’m not saying you’re wrong, but perhaps 2 years from now we could see some games with more than 1 feature.

Enthusiasm is warranted because we can finally push the boundaries of visual flair in games, especially in the PC space, where visuals have stagnated for far too long. When was the last time we saw such wide and far-reaching dynamic reflections in a game? Was there ever a game with reflections so pervasive?
My agenda here is simple: HRT is a pretty solid equalizer across development studios. To @Shifty Geezer's points earlier, rasterization will always improve and get better. Every single console generation, the tail half of the games always looks better than the front half, so we know these statements are true. But the developer teams with more resources, more time, and fewer platforms to work with have more time to make these specific approximations work for their games. It's a lot of work, and that's how we're seeing these mid-range consoles push out so much graphical horsepower with certain studios. But that's the thing: with HRT, there is an equalizer, in that developers can rely on hardware to brute-force it and be able to compete with and exceed those teams that spend over a decade building their titles.

I do find it absurd that people praise Sony's 1st-party titles but scoff at PC ultra settings, but that gap is going to widen by a country mile when compared to HRT titles.
If there's one thing I know about PC gamers, it's that they love paying $1200 to play games at 1080p, Low.
1080p at Ultra settings.
DXR at low.
 
The performance (RTX 2080Ti with 1440p/60fps) is better than I expected it would be and this is just the beginning of raytracing. In addition, reflections seem to be one of the most extensive effects while AO and GI are faster.

I assume you mean 'expensive' instead of 'extensive'.
If you think raytraced reflection is more expensive than raytraced AO or GI, you must be joking.
In general specular reflection can be done with 1 ray while correct AO or GI need many.
(correct diffuse reflection is also very expensive)
It gets even more expensive, of course, when the reflected scenery itself does not use baked AO or GI (as in BFV) and this reflected AO/GI is also done with raytracing.
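The per-pixel ray budgets behind that argument look roughly like this. The counts are illustrative orders of magnitude, not taken from any particular engine:

```python
# Illustrative per-pixel ray budgets: mirror-like reflection needs a
# single ray toward the reflected direction, while brute-force AO/GI
# needs many hemisphere samples to converge. All counts are assumptions.
rays_per_pixel = {
    "specular reflection":   1,    # one bounce along the mirror direction
    "ambient occlusion":     64,   # hemisphere visibility samples
    "diffuse GI (1 bounce)": 256,  # cosine-weighted hemisphere samples
}
pixels = 2560 * 1440
for effect, rpp in rays_per_pixel.items():
    print(f"{effect:22s}: {pixels * rpp / 1e6:8.1f} Mrays/frame")
```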
 
If I can borrow one of your earlier talking points “we can’t expect developers to invest all this effort for graphics for a niche population”. I mean we also have to consider development time to get AO, GI, Shadows all working into their pipeline as well. Ignoring performance there is still development time required, on generation 1 equipment and algorithms. I’m not saying you’re wrong, but perhaps 2 years from now we could see some games with more than 1 feature.
I'm not sure there's any solution to combining features beyond more power. Each aspect requires tracing different rays with zero crossover.

My agenda here is simple; HRT is a pretty solid equalizer across development studios...
I agree in principle. My concern is just that realtime raytracing isn't quite a reality. All the promise is actually, possibly, a significant way off. The offering here is a significant advance visually, but also at a cost that may well put it out of most people's reach. RTX isn't going to feature in the midrange until a 2070 can be shrunk significantly. Then you get games with some RT aspects like lighting, which'll be technically much better, but it still won't look real. And beyond that, is lithography going to shrink enough to ever get 5x the performance of RTX into a mainstream GPU? Well, that's a very theoretical question.

In the console discussion, one of the pro points for RT hardware is what else it could accelerate in game. But it would appear it'd have its hands full tracing the graphics so couldn't be used for AI or audio or whatever even if you wanted to. I guess that's the area where software optimisation would come in, because that's something devs could work with to get more work out of each ray.

I assume you mean 'expensive' instead of 'extensive'.
If you think raytraced reflection is more expensive than raytraced AO or GI, you must be joking.
In general specular reflection can be done with 1 ray while correct AO or GI need many.
(correct diffuse reflection is also very expensive)
It gets even more expensive, of course, when the reflected scenery itself does not use baked AO or GI (as in BFV) and this reflected AO/GI is also done with raytracing.
It depends what the limiting factor is. Reflections need the reflected surface to be evaluated, so you have to run some level of surface shader to get colour and lighting for the ray. AO and GI just need an object illuminance. Each reflected ray costs more, but you need less of them. Also, very low sampling is suitable for AO and GI thanks to very effective denoising.
 