Next gen lighting technologies - voxelised, traced, and everything else *spawn*

I meant in comparison to the previous gen's prices. You can find it starting at $500, the same price as the 1080 when it launched, but faster and with more features.
That's not how it's supposed to work, though. Previously it would have been the 2080 priced around that $500 / 1080 mark.
The GTX 1070 launched cheaper than the GTX 980 was priced at the time, and it beat the GTX 980 Ti by the same margin the 2070 now beats the 1080. Granted, the GTX 1080's launch price was notably higher, even over the 980 Ti, but still: the 1070 beat the 980 Ti while being cheaper than the 980, which is a far cry from what we have now.
 
I'm not sure there's any solution to combining features beyond more power. Each aspect requires tracing different rays with zero crossover.

I agree in principle. My concern is just that realtime raytracing isn't quite a reality yet - all the promise may actually be a significant way off. The offering here is a significant advance visually, but at a cost that may well put it out of most people's reach. RTX isn't going to feature in the midrange until a 2070 can be shrunk significantly. Then you get games with some RT aspects like lighting, which'll be technically much better, but it still won't look real. And beyond that, is lithography going to shrink enough to ever get 5x the performance of RTX into a mainstream GPU? Well, that's a very theoretical question.

In the console discussion, one of the pro points for RT hardware is what else it could accelerate in-game. But it would appear it'd have its hands full tracing the graphics, so it couldn't be used for AI or audio or whatever even if you wanted to. I guess that's where software optimisation would come in, because that's something devs could work with to get more work out of each ray.

It depends what the limiting factor is. Reflections need the reflected surface to be evaluated, so you have to run some level of surface shader to get colour and lighting for the ray. AO and GI just need an object's illuminance. Each reflected ray costs more, but you need fewer of them. Also, very low sampling is suitable for AO and GI thanks to very effective denoising.
Pretty sure GI needs the surface to be shaded in some way, as it needs the amount of light leaving the surface (radiance?).
For diffuse GI, though, the shading should be reasonably easy to cache and thus quite fast to use (store the result of shading to vertices or a texture), as sketched below.
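
A minimal sketch of that caching idea, assuming a lightmap-style layout; the names (GICache, shadeDiffuseHit) are illustrative, not from any shipping engine:

Code:
// Hypothetical: cache diffuse GI shading per lightmap texel so repeated
// rays that land on the same surface patch reuse one shading result.
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

struct RGB { float r = 0, g = 0, b = 0; };

// Stand-in for the actual (expensive) diffuse shading of a hit point.
RGB shadeDiffuseHit(float u, float v);

struct GICache {
    int width = 0, height = 0;
    std::vector<RGB>     radiance; // cached outgoing diffuse radiance
    std::vector<uint8_t> valid;    // 0 = not shaded yet this frame

    void resize(int w, int h) {
        width = w; height = h;
        radiance.assign(std::size_t(w) * h, RGB{});
        valid.assign(std::size_t(w) * h, 0);
    }
    void invalidate() { std::fill(valid.begin(), valid.end(), 0); }

    // GI rays hitting lightmap coords (u, v) in [0,1] fetch from the cache.
    RGB sample(float u, float v) {
        const std::size_t i =
            std::size_t(v * (height - 1)) * width + std::size_t(u * (width - 1));
        if (!valid[i]) {              // first ray to land here pays full cost
            radiance[i] = shadeDiffuseHit(u, v);
            valid[i] = 1;
        }
        return radiance[i];           // later rays are a single cache read
    }
};

Because diffuse response is view-independent, the cached value stays valid from any ray direction - which is exactly why this works for GI but not for sharp reflections.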
 
For GI you only need a very basic colour and intensity. You don't need things like specular, texture detail, bumps, etc. For example, snooker: GI light bounces from the table need only evaluate the green and intensity, whereas reflections of that table would need to evaluate the texture including, if going very detailed, the bumps and chalk marks.
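
A hedged illustration of the difference in work per hit (the names here are mine, not any engine's):

Code:
struct RGB { float r, g, b; };

struct HitPoint {
    RGB   albedo;         // base colour - the table's green is enough
    float incidentLight;  // light arriving at the hit, however obtained
};

// GI bounce: flat colour times intensity. No normal maps, no specular,
// no texture detail - the snooker table is just "green, this bright".
RGB shadeForGI(const HitPoint& hit) {
    return { hit.albedo.r * hit.incidentLight,
             hit.albedo.g * hit.incidentLight,
             hit.albedo.b * hit.incidentLight };
}

// Reflection ray: the reflected surface has to be shaded "for real" -
// textures, bumps, specular, down to the chalk marks if you want them.
RGB shadeForReflection(const HitPoint& hit); // runs a full material shader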
 
I'm not sure there's any solution to combining features beyond more power. Each aspect requires tracing different rays with zero crossover.
I mean to say that developers can develop all of the features; whether the power exists to run them all together is something they leave to the user to decide.

But the features still need to be developed for each engine, and I don't think there has been a significant amount of time to develop all the RT features individually in a title just yet.

The 2070 is 12nm. There is headroom at 7nm. The real gains will come from improving RT acceleration over the coming years.
 
Nvidia probably expected that they could use 10 nm chips for Turing.
https://www.techpowerup.com/247657/...s-in-2019-turing-originally-intended-for-10nm

I assume you mean 'expensive' instead of 'extensive'.
If you think raytraced reflection is more expensive than raytraced AO or GI, you must be joking.
In general, specular reflection can be done with one ray, while correct AO or GI needs many.
(Correct diffuse reflection is also very expensive.)
It gets even more expensive, of course, when the reflected scenery itself does not use baked AO or GI, as in BFV, and this reflected AO/GI is also done with raytracing.
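
Put as equations (standard rendering-equation notation, nothing specific to RTX): a mirror BRDF is a delta, so the integral collapses to a single direction, while diffuse GI/AO is a hemisphere integral that has to be Monte Carlo sampled:

\[ L_{\text{mirror}}(\omega_o) = L_i(\mathrm{reflect}(\omega_o, n)) \qquad \text{(one ray)} \]
\[ L_{\text{diffuse}} = \int_{\Omega} L_i(\omega)\,\frac{\rho}{\pi}\,\cos\theta\;\mathrm{d}\omega \;\approx\; \frac{1}{N}\sum_{k=1}^{N} \frac{L_i(\omega_k)\,\frac{\rho}{\pi}\,\cos\theta_k}{p(\omega_k)} \qquad \text{(N rays)} \]

The noise of that N-sample estimate is what the denoisers eat, which is why very low sample counts are tolerable for AO and GI.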

Exactly.

In the Star Wars Unreal Engine presentation they used four Volta GPUs. Three of them calculated the reflections alone; the other one calculated the rest, with raytraced AO and shadows.

Metro is said to have raytraced GI and AO as well, and the performance is relatively good.
 

Yeap, performance is terrible, and the difference between screen-space reflections and raytraced reflections is hardly noticeable, at least in a fast-paced shooter.

All RTX cards are still great at rasterization, though. At least there's that.
 
For GI you only need a very basic colour and intensity. You don't need things like specular, texture detail, bumps, etc. For example, snooker: GI light bounces from the table need only evaluate the green and intensity, whereas reflections of that table would need to evaluate the texture including, if going very detailed, the bumps and chalk marks.
Yup.

I'm quite sure that we will see some interesting ways to create and use surface caches and such to store shading results for tracing (or even re-use results from rays).
 
Seems a good and fair video. The improvement in quality over screenspace reflections is unmistakable and very welcome, but as it drops framerates so much, how many gamers are going to be willing to spend significant money on an effect that kills their framerate? Furthermore, given the huge framerates when rasterising without RT, couldn't a few realtime cubemap renders be used to add more realistic environment reflections? We don't have a comparison of what 1080p60 raytraced versus 1080p60 rasterised looks like. And yes, if you're pushing rasterising down to 60 fps you may as well use raytracing, but rasterising could add more fidelity - cast more and larger cubemaps, that sort of thing. As a production card, $1000 for a 2080 is well worth it. As a gaming card, its value is hugely debatable.
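
For reference, here's roughly what those "few realtime cubemap renders" would look like - a hedged sketch where renderSceneToFace() stands in for the engine's normal rasterised pass:

Code:
// Hypothetical: refresh a reflection probe by rasterising the scene into
// six 90-degree faces around the probe position (amortisable over frames).
struct Vec3 { float x, y, z; };
struct Camera { Vec3 pos, forward, up; float fovDeg; };

void renderSceneToFace(const Camera& cam, int probe, int face); // engine stand-in

void updateReflectionProbe(int probe, const Vec3& pos) {
    static const Vec3 dirs[6] = { {1,0,0}, {-1,0,0}, {0,1,0},
                                  {0,-1,0}, {0,0,1}, {0,0,-1} };
    static const Vec3 ups[6]  = { {0,1,0}, {0,1,0}, {0,0,-1},
                                  {0,0,1}, {0,1,0}, {0,1,0} };
    for (int face = 0; face < 6; ++face) {
        Camera cam{ pos, dirs[face], ups[face], 90.0f }; // 90 degree FOV per face
        renderSceneToFace(cam, probe, face);
    }
}
// The spare rasterising budget buys more probes or bigger faces - the
// "more fidelity without RT" trade-off suggested above.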

These first game results again point me to the real reason for these cards: RT accelerators for production, not games. 5 fps realtime rendering is absolutely phenomenal and hugely valuable for content creators. Even 1 fps is a massive improvement and well worth the tech. In games though, framerates have to stay above a minimum, and even more so for gamers who spend big bucks on big cards to get faster framerates.

For me, the next important showcase will be RT lighting performance. The quality should be the most notable improvement over rasterised lighting, lifting visuals to next gen similar to The Tomorrow Children's best visuals, or Uncharted 4's baked interiors. If RT can handle next-gen lighting in realtime at decent framerates, it might well be worth including as a de facto tech.
 
I'm not sure there's any solution to combining features beyond more power. Each aspect requires tracing different rays with zero crossover.
Path tracing.

https://cg.ivd.kit.edu/atf.php
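
To spell out why path tracing is an answer to the "different rays, zero crossover" problem: one estimator, one ray budget. A minimal illustrative loop follows - trace() and sampleBSDF() are assumed interfaces, not any particular engine's API:

Code:
// One path per pixel gathers reflection, GI, AO and shadowing together,
// instead of each effect tracing its own special-purpose rays.
struct Ray { /* origin, direction */ };
struct Hit { bool valid; /* position, normal, material */ };
struct RGB { float r, g, b; };

RGB operator*(RGB a, RGB b) { return { a.r*b.r, a.g*b.g, a.b*b.b }; }
RGB operator+(RGB a, RGB b) { return { a.r+b.r, a.g+b.g, a.b+b.b }; }

Hit trace(const Ray& r);                  // BVH traversal (the RT-core part)
RGB emitted(const Hit& h);                // light leaving emissive surfaces
Ray sampleBSDF(const Hit& h, const Ray& in, RGB& weight); // next bounce

RGB pathTrace(Ray ray, int maxBounces) {
    RGB radiance{0,0,0}, throughput{1,1,1};
    for (int bounce = 0; bounce < maxBounces; ++bounce) {
        Hit hit = trace(ray);
        if (!hit.valid) break;                  // escaped: sky/environment
        radiance = radiance + throughput * emitted(hit);
        RGB weight;                             // mirror BSDF -> reflection,
        ray = sampleBSDF(hit, ray, weight);     // diffuse BSDF -> GI, and
        throughput = throughput * weight;       // occlusion gives AO/shadows
    }
    return radiance;  // a few samples per pixel, then denoise (per the link)
}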


Seems a good and fair video. The improvement in quality over screenspace reflections is unmistakable and very welcome, but as it drops framerates so much, how many gamers are going to be willing to spend significant money on an effect that kills their framerate? Furthermore, given the huge framerates when rasterising without RT, couldn't a few realtime cubemap renders be used to add more realistic environment reflections? We don't have a comparison of what 1080p60 raytraced versus 1080p60 rasterised looks like. And yes, if you're pushing rasterising down to 60 fps you may as well use raytracing, but rasterising could add more fidelity - cast more and larger cubemaps, that sort of thing. As a production card, $1000 for a 2080 is well worth it. As a gaming card, its value is hugely debatable.

These first game results again point me to the real reason for these cards: RT accelerators for production, not games. 5 fps realtime rendering is absolutely phenomenal and hugely valuable for content creators. Even 1 fps is a massive improvement and well worth the tech. In games though, framerates have to stay above a minimum, and even more so for gamers who spend big bucks on big cards to get faster framerates.

For me, the next important showcase will be RT lighting performance. The quality should be the most notable improvement over rasterised lighting, lifting visuals to next gen similar to The Tomorrow Children's best visuals, or Uncharted 4's baked interiors. If RT can handle next-gen lighting in realtime at decent framerates, it might well be worth including as a de facto tech.
1080p@60fps is already amazing for first-gen HRT. Framerate and resolution junkies are unhappy, but who cares? If it were up to them we would be stuck with Quake III-level graphics just so they could play at 8K@144fps.
 
Yeap, performance is terrible, and the difference between screen-space reflections and raytraced reflections is hardly noticeable, at least in a fast-paced shooter.
Seems a good and fair video.
Once again they used the terribly bugged Tirailleur stage. That's why their results are significantly lower than other outlets'. They admit multiplayer stages offer better fps, yet they never bothered to test one. So no, neither a great nor a fair video. Look to other outlets for that.
DXR Performance degraded in maps which feature a lot of foliage
This particularly affects War Stories "Liberte" and "Tirailleur"
Status: Currently investigating
https://forums.battlefield.com/en-us/discussion/161023/battlefield-vs-known-issues-list
 
What am I supposed to be looking at here?

1080p@60fps is already amazing for first-gen HRT. Framerate and resolution junkies are unhappy, but who cares? If it were up to them we would be stuck with Quake III-level graphics just so they could play at 8K@144fps.
You're using an extreme. Ignoring the resolution and framerate junkies, who is the audience for high-end GPUs, and why? As I understand it (correct me if I'm wrong), it's for high framerates (120Hz+) because that results in the smoothest gameplay, and/or 60fps at higher resolutions. This point always comes up in the framerate/resolution/graphical quality discussions and I don't recall any clear answer, but when PC gamers choose their settings, do they prioritise framerate or graphical quality? If on the whole gamers prefer to play at 30 fps with everything turned up, then raytracing at 1080p is going to fit their preferences. If, however, they prefer lower quality and faster framerates, raytracing won't be valued.

What you or I prefer doesn't matter - it's what the market wants that'll determine adoption of raytracing.

Once again they used the terribly bugged Tirailleur stage. That's why their results are significantly lower than other outlets'. They admit multiplayer stages offer better fps, yet they never bothered to test one. So no, neither a great nor a fair video. Look to other outlets for that.
Fair enough.
 
What am I supposed to be looking at here?

You're using an extreme. Ignoring the resolution and framerate junkies, who is the audience for high-end GPUs, and why? As I understand it (correct me if I'm wrong), it's for high framerates (120Hz+) because that results in the smoothest gameplay, and/or 60fps at higher resolutions. This point always comes up in the framerate/resolution/graphical quality discussions and I don't recall any clear answer, but when PC gamers choose their settings, do they prioritise framerate or graphical quality? If on the whole gamers prefer to play at 30 fps with everything turned up, then raytracing at 1080p is going to fit their preferences. If, however, they prefer lower quality and faster framerates, raytracing won't be valued.

What you or I prefer doesn't matter - it's what the market wants that'll determine adoption of raytracing.
An algorithm that combines all the features you mention and runs decently on non-Turing hardware already.

The audience for these GPUs is people who want the best quality graphics. PC gamers aren't a monolithic group: some prioritize framerate, others resolution, and others graphics. The reason you see the first two the most is that barely any PC games push graphics nowadays. The market is filled with console ports. There are no more Crysis-class games.
 
An algorithm that combines all the features you mention and runs decently on non-Turing hardware already.
Rather than linking me to a sprawling site of papers (and in particular the link takes me to denoising), how about linking to the specific paper or reference that shows how ray sampling can be homogenised across different requirements? I'm not saying you're wrong, but there's no way I'm going to spend however many hours searching through that site to find research showing that, yes, you can reuse data and samples for reflections and lighting. ;)

The audience for these GPUs is people who want the best quality graphics. PC gamers aren't a monolithic group: some prioritize framerate, others resolution, and others graphics. The reason you see the first two the most is that barely any PC games push graphics nowadays. The market is filled with console ports. There are no more Crysis-class games.
Firstly, how big is that audience? How many millions of gamers will devs have to target their raytracing R&D towards? Secondly, you haven't answered the question about what gamers actually use. It's not a monolithic group; there are lots of gamers. Do they choose higher quality settings at lower framerates on average, or lower quality settings to get higher framerates?
 
Once again they used the terribly bugged Tirailleur stage. That's why their results are significantly lower than other outlets'. They admit multiplayer stages offer better fps, yet they never bothered to test one. So no, neither a great nor a fair video. Look to other outlets for that.

https://forums.battlefield.com/en-us/discussion/161023/battlefield-vs-known-issues-list

I'm not too sure the slow raytracing in the presence of foliage is a bug.
See: Effectively Integrating RTX Ray Tracing into a Real-Time Rendering Engine
"Note that the dependent memory access issue typical to any hit shaders can be especially pronounced in the alpha test shader as it is so trivial, and the compiler doesn't have many opportunities for latency hiding."
 
It's a bug because it happens only in these two stages; other story stages and multiplayer maps don't suffer the same problem.

Why do you claim to know more than DICE? They write:
"Status: Currently investigating"
That means the cause isn't known yet.
 
Why do you claim to know more than DICE? They write:
"Status: Currently investigating"
That means the cause isn't known yet.
Why do you think they listed this as an issue or bug in the first place? Are you following the Battlefield forums?
It's listed because these two stages are the only ones suffering this problem.
That means the cause isn't known yet.
You are contradicting yourself: if this were normal raytracing behaviour due to extensive alpha testing, DICE wouldn't list it as a known issue. They wouldn't say the cause is unknown.
 
Why do you think they listed this as an issue or bug in the first place? Are you following the Battlefield forums?
It's listed because these two stages are the only ones suffering this problem.

You are contradicting yourself: if this were normal raytracing behaviour due to extensive alpha testing, DICE wouldn't list it as a known issue. They wouldn't say the cause is unknown.

You are wrong to imply that an issue, such as slow rendering, is automatically a bug. (DICE does not list it as a bug.)
I'm just speculating about the cause with reasonable arguments; I don't claim to know the cause either.
It could be a bug, or it could be inherent to the any hit shader used for alpha testing of foliage.
 