Next gen lighting technologies - voxelised, traced, and everything else *spawn*

If the performance disappoints you (rhetorical), turn RT off.
Exactly, RT really does look beautiful! The inclusion of DLSS should add a boost to RTX performance. Looking forward to games incorporating not only these two features but also the other "new" features in the Turing stack.
 
The performance is as expected, and on low it's OK for such a new feature: a 2070 at 60fps in 1080p and a 2080Ti above 60fps in 1440p.
But BF V is just the wrong game for RTX. In a multiplayer game I want higher fps. They already plan to include DLSS in Hitman; RTX would have been a much better fit there.
 
Here is BFV running @ Ultra 1080p with RTX on in a live multiplayer match, achieving 90fps and more on an RTX 2080Ti.


Nice, we can finally see some raytracing in action, albeit used only for reflections.
Some observations regarding the opening scene, which shows a window with that window reflected on the floor.
Right off the bat, the quality difference between the scenery through the window and the scenery reflected on the ground is jarring.
The scenery in the window is nicely anti-aliased, while the reflected scenery is full of aliasing artifacts.
Not sure how difficult it would be to get decent anti-aliasing in raytraced reflections.
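For what it's worth, the usual remedy is the same as for primary rays: jitter the reflection ray a little each frame and let temporal accumulation resolve the edges (brute-force supersampling being too expensive). Here's a minimal CPU-side sketch of that accumulation idea - the Vec3 type and traceScene() are invented stand-ins, not anything from Frostbite:

```cpp
#include <cstdio>
#include <cmath>
#include <random>

// Minimal vector type for the sketch.
struct Vec3 { float x, y, z; };

static Vec3 add(Vec3 a, Vec3 b)    { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 scale(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

// Placeholder for the actual ray cast into the scene (hypothetical).
static Vec3 traceScene(Vec3 origin, Vec3 dir) {
    (void)origin;
    // Pretend the colour depends on the direction so the jitter has an effect.
    return { std::fabs(dir.x), std::fabs(dir.y), std::fabs(dir.z) };
}

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<float> jitter(-0.002f, 0.002f);

    Vec3 origin      = {0.0f, 1.0f, 0.0f};   // point on the floor
    Vec3 mirrorDir   = {0.0f, 0.5f, 0.5f};   // ideal mirror reflection direction
    Vec3 accumulated = {0.0f, 0.0f, 0.0f};

    // One jittered reflection ray per frame, blended over many frames.
    // This trades a bit of ghosting for smoother reflection edges,
    // much like TAA does for the primary image.
    const int frames = 64;
    for (int f = 0; f < frames; ++f) {
        Vec3 dir = { mirrorDir.x + jitter(rng),
                     mirrorDir.y + jitter(rng),
                     mirrorDir.z + jitter(rng) };
        accumulated = add(accumulated, traceScene(origin, dir));
    }
    Vec3 resolved = scale(accumulated, 1.0f / frames);

    std::printf("resolved reflection colour: %.3f %.3f %.3f\n",
                resolved.x, resolved.y, resolved.z);
    return 0;
}
```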
 
Nice, we can finally see some raytracing in action, albeit used only for reflections.
Some observations regarding the opening scene, which shows a window with that window reflected on the floor.
Right off the bat, the quality difference between the scenery through the window and the scenery reflected on the ground is jarring.
The scenery in the window is nicely anti-aliased, while the reflected scenery is full of aliasing artifacts.
Not sure how difficult it would be to get decent anti-aliasing in raytraced reflections.
We might be looking at the difference between high and low DXR settings.
Not sure though, but good catch.

Still early though in the process, I'm interested to see how this will evolve.
 
Here is BFV running @ Ultra 1080p with RTX on in a live multiplayer match, achieving 90fps and more on an RTX 2080Ti.
Yeah, but we come back to the argument that gamers aren't buying a 2080Ti to game at 1080p. And it's only reflections. Once we get titles trying to do shadows and lighting as well, it's going to be in the realm of the unplayable.

Will gamers with 2080Tis and 4K monitors continue playing at 1080p w/RTX after they've stared at building windows and car reflections for a while? I don't know, but I kinda doubt it. Depends on how they feel their experience has changed. Especially since most of the maps won't have surfaces doing anything with RTX. Maybe with DLSS in combination (though we don't know about the Tensor usage combining denoising and DLSS) we might be looking at 1440p-level rendering with image quality approaching 4K.
 
Yeah, but we come back to the argument that gamers aren't buying a 2080Ti to game at 1080p. And it's only reflections. Once we get titles trying to do shadows and lighting as well, it's going to be in the realm of the unplayable.
Why? Metro is doing fine in its early demos with ray traced GI and AO.
You only start to become bottlenecked when you pile up every RTX effect in existence, which won't happen in this early stage of the tech. Every RTX game is choosing to support a single RTX effect at a time: Reflections, Shadows, or GI.
Especially since most of the maps won't have surfaces doing anything with RTX.
RTX works on every water or mud surface (puddles, lakes, rivers, seas), frozen ice, most window or vehicle glass, vehicle metallic surfaces (cars, trucks), and all weapon surfaces, not to mention various other objects such as metal poles, handrails on stairs, glossy paint, most types of flooring, and metal buildings.

So in every map and in every area you have an RTX reflection being applied to something.
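For illustration only - this is a guess at how such a material mask could look, not DICE's actual code - the engine really only needs a per-material test on something like roughness and reflectance to decide whether a pixel spawns a reflection ray or keeps the cheaper SSR/cubemap fallback:

```cpp
#include <cstdio>

// Hypothetical material description; the fields and thresholds are made up for the sketch.
struct Material {
    const char* name;
    float roughness;    // 0 = mirror, 1 = fully diffuse
    float reflectance;  // how strong the specular response is
};

enum class ReflectionPath { RayTraced, ScreenSpaceOrCubemap };

// One plausible heuristic: smooth-ish, reflective-ish surfaces get rays,
// everything else keeps the cheaper rasterised fallback.
ReflectionPath chooseReflectionPath(const Material& m) {
    const float kRoughnessCutoff   = 0.6f;  // assumed threshold
    const float kReflectanceCutoff = 0.2f;  // assumed threshold
    if (m.roughness < kRoughnessCutoff && m.reflectance > kReflectanceCutoff)
        return ReflectionPath::RayTraced;
    return ReflectionPath::ScreenSpaceOrCubemap;
}

int main() {
    const Material surfaces[] = {
        {"puddle",       0.05f, 0.9f},
        {"car paint",    0.15f, 0.7f},
        {"window glass", 0.02f, 0.5f},
        {"dry concrete", 0.85f, 0.1f},
        {"wet mud",      0.40f, 0.4f},
    };
    for (const Material& m : surfaces) {
        bool rt = chooseReflectionPath(m) == ReflectionPath::RayTraced;
        std::printf("%-12s -> %s\n", m.name, rt ? "ray traced" : "SSR/cubemap");
    }
    return 0;
}
```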
Maybe with DLSS in combination (though we don't know about the Tensor usage combining denoising and DLSS) we might be looking at 1440p-level rendering with image quality approaching 4K.
One of DICE's community managers hinted at an imminent DLSS release, especially as DICE stated they are using their own shader denoising solution, which means that tensor cores are now free to do DLSS.
 
https://www.guru3d.com/news-story/battlefield-v-raytracing-features-are-now-enabled.html


DOA? Performance is laughably bad.
 
Every RTX game is choosing to support a single RTX effect at a time: Reflections, Shadows, or GI.
Which is kind of the problem. As a technique, raytracing offers this simplified, unified creation and rendering pipeline that produces the whole scene with superb realism. That's what we want. However, it's not fast enough, meaning you have to take a typical game and add a bit of RT on top, meaning everything else remains very similar. These are very pricey effects, which in turn undermines the value of these GPUs to gamers (all that money, not much for it), which in turn makes one wonder if sales will suffer, which in turn means a smaller market for devs, which would ordinarily lead to them overlooking the inclusion of the tech because it's more trouble than it's worth.

A common theme so far has been 'wait until you see engines designed for RTX', but why is anyone going to write games targeting RT when it's so niche in numbers?

Is 30% of the silicon, or whatever it is, for raytraced reflections in BFV worth it versus the higher resolutions and framerates obtainable without it?
 
meaning you have to take a typical game and add a bit of RT on top, meaning everything else remains very similar
I don't see this as a negative. What is the alternative? More FPS than monitors can handle? More resolution than 4K?
It's already running ultra settings; it's layering DXR on top of that.

You can't make the claim that the silicon could have been put to better use elsewhere. It's already beating those expectations as well. These cards without DXR can, across the board, run the game at 4K60. The 2070 might need some slight tweaking, but what is the discussion here?

That we need 144fps@4K? That sounds a hell of a lot more niche than a person with an expensive GPU and a 1080p monitor. The game isn't even 7 days old yet, this is among the cream of the crop in graphics across the industry, and these new cards already have it beat.
 
Performance is great actually.

Which is kind of the problem. As a technique, raytracing offers this simplified, unified creation and rendering pipeline that produces the whole scene with superb realism. That's what we want. However, it's not fast enough, meaning you have to take a typical game and add a bit of RT on top, meaning everything else remains very similar. These are very pricey effects, which in turn undermines the value of these GPUs to gamers (all that money, not much for it), which in turn makes one wonder if sales will suffer, which in turn means a smaller market for devs, which would ordinarily lead to them overlooking the inclusion of the tech because it's more trouble than it's worth.

A common theme so far has been 'wait until you see engines designed for RTX', but why is anyone going to write games targeting RT when it's so niche in numbers?

Is 30% of the silicon, or whatever it is, for raytraced reflections in BFV worth it versus the higher resolutions and framerates obtainable without it?
Is WHATEVER_GRAPHICAL_FEATURE worth it versus running at 8K? Or 200fps?

Some people sure are eager to write RT off.
 
We don't know the minimum frame rates, which we'd need to make sensible decisions.
 
I don't see this as a negative. What is the alternative? More FPS than monitors can handle? More resolution than 4K?
It's already running ultra settings; it's layering DXR on top of that.

You can't make the claim that the silicon could have been put to better use elsewhere. It's already beating those expectations as well. These cards without DXR can, across the board, run the game at 4K60. The 2070 might need some slight tweaking, but what is the discussion here?

That we need 144fps@4K? That sounds a hell of a lot more niche than a person with an expensive GPU and a 1080p monitor. The game isn't even 7 days old yet, this is among the cream of the crop in graphics across the industry, and these new cards already have it beat.
That line of thinking is only true if there's nothing else that can be improved and the only way forward for rasterised graphics without RT is higher resolution and framerate. BFV is a current-gen game designed for current-gen limitations. It's kinda like looking at a PS3 game and thinking next-gen would just be more of the same at higher resolutions and framerates. What actually happened was PBR, procedural shaders, compute-based reconstruction and anti-aliasing, etc.

The things RT offers are things we want improved next gen. However, there are other things that can also be improved: physics, so clothing moves naturally and we have more practical destruction, or procedural textures instead of baked textures, which could create visually richer worlds. Also, the more these things are applied, such as procedural geometry, the slower RT becomes, because the shaders need to be resolved (in part at least) for the tracing.
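A toy way to see why: with opaque geometry a ray can accept the closest hit and shade once, but procedural or alpha-tested surfaces have to run (part of) their material shader at every candidate hit just to decide whether the ray passes through. The numbers below are invented purely to show the scaling:

```cpp
#include <cstdio>

// Toy model: how many material-shader evaluations does one ray cost?
// The counts are invented for illustration only.
struct GeometryKind {
    const char* name;
    bool needsShaderToResolveHit;  // alpha test / procedural surface
    int  candidateHitsAlongRay;    // how many surfaces the ray crosses
};

int shaderEvaluationsPerRay(const GeometryKind& g) {
    // Opaque geometry: accept the closest hit, shade once.
    if (!g.needsShaderToResolveHit) return 1;
    // Procedural / alpha-tested geometry: evaluate at every candidate hit
    // just to decide whether the ray passes through, then shade the survivor.
    return g.candidateHitsAlongRay + 1;
}

int main() {
    const GeometryKind cases[] = {
        {"opaque wall",              false, 1},
        {"foliage (alpha-tested)",   true,  6},
        {"procedural fence/grating", true, 10},
    };
    for (const GeometryKind& g : cases)
        std::printf("%-26s -> %d shader evaluations per ray\n",
                    g.name, shaderEvaluationsPerRay(g));
    return 0;
}
```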

Is WHATEVER_GRAPHICAL_FEATURE worth it versus running at 8K? Or 200fps?

Some people sure are eager to write RT off.
No-one's writing it off. It's a valid question, and 'worth it versus 8K or 200 FPS' isn't the question. The 2070 is 1080p60. People like Iroboto say they can't go back to 1080p after experiencing 4K, while competitive shooter players will most likely prefer 120 Hz over 60 Hz with nice reflections. Furthermore, the sales pitch for raytracing talked about all the things it can improve - realistic lighting, correct shadows, true reflections and refractions - but if the reality is more a case of 'pick one', it's a very different value proposition.
 
Which is kind of the problem. As a technique, raytracing offers this simplified, unified creation and rendering pipeline that produces the whole scene with superb realism. That's what we want
Baby steps first.
These are very pricey effects
As they should be; show me a game that has this many dynamic, true and accurate reflections across dozens of objects like this.
which in turn means a smaller market for devs, which would ordinarily lead to them overlooking the inclusion of the tech because it's more trouble than it's worth.
Performance improves with successive generations; each DX feature had this very same problem. It's nothing new, in fact it's the standard.

Also, NVIDIA has proven they can push their own technology even when it is proprietary and exclusive; that's what they did with TXAA, HairWorks, PhysX and other effects. GameWorks stuff thrives to this day. This time DXR is a standard, and it's already seeing a good adoption rate before it has even lifted off. They can support it until it flourishes.

The things RT offers are things we want improved next gen. However, there are other things that can also be improved: physics, so clothing moves naturally and we have more practical destruction.
Dynamic destruction is hard to do with baked GI because it looks horrendous when it breaks the lighting; it needs dynamic GI, and we now have that courtesy of RT. Physics generally demands more CPU power than GPU power, and that includes clothing and destruction too. The whole argument around GPU PhysX is that most of the effects can be done in a convincing manner on the CPU, especially now that we have so many unused CPU cores in our PCs.
the sales pitch for raytracing talked about all the things it can improve - realistic lighting, correct shadows, true reflections and refractions - but if the reality is more a case of 'pick one', it's a very different value proposition.
Pick one for now, on the first RTX gen; by gen 4 or 5 we could be talking about all of them combined, and with more rays per pixel too.
 
No-one's writing it off. It's a valid question, and 'worth it versus 8K or 200 FPS' isn't the question. The 2070 is 1080p60. People like Iroboto say they can't go back to 1080p after experiencing 4K, while competitive shooter players will most likely prefer 120 Hz over 60 Hz with nice reflections. Furthermore, the sales pitch for raytracing talked about all the things it can improve - realistic lighting, correct shadows, true reflections and refractions - but if the reality is more a case of 'pick one', it's a very different value proposition.
It can improve all those things at once, you just have to be willing to pay the performance price. There are already demos of real-time 1spp path tracing running on non-Turing hardware.

Some people are resolution whores, others are performance whores and others are graphics whores. The first two have been spoiled this past decade, it's only fair the latter get some love too :p
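For context on why 1spp needs heavy denoising: a path tracer is a Monte Carlo estimator, so noise only falls off roughly with the square root of the sample count. A tiny standalone example with a toy integrand (nothing engine-specific) shows the trend:

```cpp
#include <cstdio>
#include <cmath>
#include <random>

// Estimate a simple integral by Monte Carlo and report the error.
// Error typically shrinks only as 1/sqrt(samples), which is why going
// from 1 spp to a clean image needs either lots of samples or a denoiser
// that fakes the missing ones.
int main() {
    std::mt19937 rng(1234);
    std::uniform_real_distribution<double> u(0.0, 1.0);

    // Integral of x^2 over [0,1] has the known value 1/3.
    const double exact = 1.0 / 3.0;

    const int sampleCounts[] = {1, 16, 256, 4096};
    for (int samples : sampleCounts) {
        double sum = 0.0;
        for (int i = 0; i < samples; ++i) {
            double x = u(rng);
            sum += x * x;
        }
        double estimate = sum / samples;
        std::printf("%5d spp -> estimate %.4f (error %.4f)\n",
                    samples, estimate, std::fabs(estimate - exact));
    }
    return 0;
}
```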
 
That line of thinking is only true if there's nothing else that can be improved and the only way forward for rasterised graphics without RT is higher resolution and framerate. BFV is a current-gen game designed for current-gen limitations. It's kinda like looking at a PS3 game and thinking next-gen would just be more of the same at higher resolutions and framerates. What actually happened was PBR, procedural shaders, compute-based reconstruction and anti-aliasing, etc.

The things RT offers are things we want improved next gen. However, there are other things that can also be improved: physics, so clothing moves naturally and we have more practical destruction, or procedural textures instead of baked textures, which could create visually richer worlds. Also, the more these things are applied, such as procedural geometry, the slower RT becomes, because the shaders need to be resolved (in part at least) for the tracing.
But those restrictions will always be there, and I think your counter-argument, while it has truth to it, is forgetting this point. Budget, time to develop, labour costs, R&D time: all those things are competing with each other just to move the graphical barrier. BFV has been in the pipeline for a long time, and they're constantly updating the Frostbite engine from one game to the next. And in all that time, this is the best they could release in terms of features from a rasterization perspective.

These features - "PBR, procedural shaders, compute-based reconstruction and anti-aliasing, etc." - would still have been developed even if HRT had been released 3 generations ago. They can work together to produce a better image; there should be no implication that they compete against each other.

I'm not saying that things can't improve with rasterization, but there needs to be more consideration of how long it's taking to get to the next leap in graphics; we're talking 5 generations of Frostbite running on what feels like the 20th generation of rasterization hardware. And in what appears to be just a few months, with new drivers and a new API, they churned out ray traced reflections that work everywhere in their game despite the situation.

What else could they have accomplished in those few months other than optimization?
How, or in what way, with the information we have, do we know that they could have accomplished more than they did here with the resources they had?

Hybrid ray tracing is very much about doing what is costly, for cheap. If you don't have the algorithm, or can't change the design of your game to make that algorithm work, then HRT is the cheapest solution, not rasterization. If one day someone invents a non-RT hardware solution that is generic and works wonders for reflections, for AO, for whatever, then that's what everyone will use; and in every other situation where HRT is faster than rasterization at moving the graphical barrier, that's what will be used.
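As a rough sketch of what 'hybrid' means in practice (frame structure only; the pass names are invented, not Frostbite's actual passes): the frame is still rasterised as usual, and the ray traced pass replaces just one link in the chain, reading the G-buffer the rasteriser already produced:

```cpp
#include <cstdio>

// Skeleton of a hybrid frame: every pass is a stub that just announces
// itself, purely to show where the ray traced pass slots in.
static void rasterizeGBuffer()      { std::puts("1. rasterise G-buffer (depth, normals, materials)"); }
static void rasterizeShadowMaps()   { std::puts("2. rasterise shadow maps (unchanged)"); }
static void directLighting()        { std::puts("3. direct lighting from the G-buffer (unchanged)"); }
static void rayTracedReflections()  { std::puts("4. trace reflection rays from G-buffer hits + denoise"); }
static void screenSpaceFallback()   { std::puts("   (fall back to SSR/cubemaps where rays are skipped)"); }
static void postProcessAndTonemap() { std::puts("5. post-processing, TAA, tonemap (unchanged)"); }

int main() {
    // Only step 4 is new; the rest of the rasterised pipeline stays intact,
    // which is why it's "a bit of RT on top" rather than a full path tracer.
    rasterizeGBuffer();
    rasterizeShadowMaps();
    directLighting();
    rayTracedReflections();
    screenSpaceFallback();
    postProcessAndTonemap();
    return 0;
}
```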
 
People like Iroboto say they can't go back to 1080p after experiencing 4K, while competitive shooter players will most likely prefer 120 Hz over 60 Hz with nice reflections. Furthermore, the sales pitch for raytracing talked about all the things it can improve - realistic lighting, correct shadows, true reflections and refractions - but if the reality is more a case of 'pick one', it's a very different value proposition.
But we can still have these things. Seeing these RT-only benchmarks is only half the equation; the AI up-res (DLSS) needs to come next. When they are working in tandem, I hope your opinion on this one changes.
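Some back-of-the-envelope numbers on why rendering at 1440p and upscaling to 4K would buy so much headroom for DXR (just pixel counts, nothing measured):

```cpp
#include <cstdio>

// Pixel counts behind "render at 1440p, upscale to 4K".
// Fewer rendered pixels means proportionally fewer primary rays and shades,
// which is the headroom DLSS is supposed to buy back for DXR.
int main() {
    const long long p1080 = 1920LL * 1080;  // ~2.07 M pixels
    const long long p1440 = 2560LL * 1440;  // ~3.69 M pixels
    const long long p4k   = 3840LL * 2160;  // ~8.29 M pixels

    std::printf("4K has %.2fx the pixels of 1440p\n", (double)p4k / p1440);
    std::printf("4K has %.2fx the pixels of 1080p\n", (double)p4k / p1080);
    return 0;
}
```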
 


Not at all that bad; I mean, this GPU is doing 2560x1440 Ultra settings with RT/DXR on ultra at an avg fps of 77. I thought the performance impact would be much greater. Not bad for a game not even fully optimized for RT, with early drivers etc.
 