What Happens to Rasterization in the Era of Ray Tracing?

It's been roughly 6 years since the introduction of the first consumer cards with dedicated ray tracing hardware, and all these years later we're finally seeing games that require it to work. I imagine that in another 6 years' time we'll see massive gains in ray tracing hardware, and increasingly modest progress on raster performance. So my questions are basically:

A) Will hardware and software reach "peak raster" in the near future? The point where progress on raster basically halts and silicon is increasingly dedicated strictly to ray tracing?

And B) what will happen to all the games produced that require raster hardware? Will future ray-tracing-focused hardware retain, or be capable of emulating in software, the features necessary to play them?
 
I think it'd be interesting to see whether GPUs will become less focused on rasterization hardware and more on compute hardware.
Some "rasterization" hardware will probably always be there, such as triangle setup, the depth buffer, and maybe textures. There might be a case for replacing texture hardware with compute units, although that probably won't happen very soon.
However, very few GPUs today are actually limited by that traditional "rasterization" hardware, such as triangle setup and texture units, so there will be much less incentive to improve on those fronts. For example, the 4090's theoretical pixel fillrate is more than 450 Gpix/s, and that's enough to fill 4K at more than 50,000 FPS. That's obviously way more than enough for practically any situation.
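Quick sanity check on that figure, assuming "4K" means 3840x2160 and taking the ~450 Gpix/s number at face value:

```cpp
#include <cstdio>

int main() {
    // Figures from the post above; purely a back-of-the-envelope check.
    const double fillrate_pix_per_s  = 450e9;            // ~450 Gpix/s theoretical fillrate
    const double pixels_per_4k_frame = 3840.0 * 2160.0;  // ~8.3 Mpix per frame
    std::printf("Fill-limited 4K FPS: %.0f\n", fillrate_pix_per_s / pixels_per_4k_frame);
    // Prints roughly 54,000 - comfortably above the 50,000 FPS mentioned above.
    return 0;
}
```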
 
We are not in the era of pure ray tracing yet; we are doing hybrid raster + ray tracing. Even path-traced games are still hybrid, and this isn't going to change anytime soon. Ray tracing still relies on general-purpose compute hardware, and I think it's going to remain this way for a long time.

Judging by current trends, we are more likely to head toward more hybrid rendering involving raster + ray tracing + machine learning in varying degrees. I think machine learning hardware could see more rapid adoption than the other two.
 
I wouldn't count texture mappers as raster hardware in this context, as their function is equally applicable to RT. This is just a guess, but I think with RT there will be less need to write out pixel buffers to memory. Shadow maps and cube maps will be gone. G-buffers and depth buffers will likely be useful for a very long time for primary visibility and various post-processing effects. Hardware raster, tessellation and blending (ROPs) seem like good candidates for being replaced by software in the medium term, if and when there is less demand for classic pixel rasterization in games.
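To put that in concrete terms, here's a very rough CPU-side sketch of how a hybrid frame like that might be organised; every type and function name below is an illustrative stand-in, not any real engine's API:

```cpp
#include <cstdio>

// All names here are made up for illustration, not a real engine's API.
struct GBuffer { /* would hold albedo, normal, depth and material-id targets */ };

// Primary visibility: the classic hardware raster pass that fills the G-buffer.
void rasterize_primary_visibility(GBuffer&) { std::puts("raster: fill G-buffer"); }
// Secondary visibility: shadows, reflections and GI resolved by tracing rays against a BVH.
void trace_secondary_rays(GBuffer&)         { std::puts("RT: shadow/reflection/GI rays"); }
// Compute / ML passes: denoise and upscale the noisy ray-traced signal.
void denoise_and_upscale(GBuffer&)          { std::puts("compute/ML: denoise + upscale"); }
// Post-processing and presentation, largely unchanged from the pure-raster era.
void post_process_and_present(GBuffer&)     { std::puts("post FX + present"); }

int main() {
    GBuffer frame;
    rasterize_primary_visibility(frame);
    trace_secondary_rays(frame);
    denoise_and_upscale(frame);
    post_process_and_present(frame);
    return 0;
}
```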
 
I'm going to nod knowingly at the responses I've seen so far and take it that raster will stick around for long enough that it will not be much of a problem. Which is good to know!

I have to say, this entire paradigm shift feels weird to me. It's not a clean break (for understandable reasons), but it also feels somewhat directionless. nVidia seems to be the heaviest proponent, with a few devs here and there tagging along. It would be rather interesting if, for example, the next Playstation just goes all out on ray tracing. Same raster performance as the PS5 for backwards compatibility, but all out ray tracing for the rest. Becoming a showcase for what PC hardware could develop into if given a few more years.

To be honest, a lot of the incremental ray tracing stuff we see isn't worth it to me personally. Low-hanging fruit like slightly better reflections or shadows that do little more than strain the hardware for nigh imperceptible differences. A showcase platform might actually make it more exciting to a general consumer like myself. And if there is still enough raster to play all the games released up until now decently, then I say go for it.
 
The consoles will never lead ray tracing (being a "showcase"); they are budget hardware, and the current generation illustrates this quite clearly.
 
It would be rather interesting if, for example, the next Playstation just goes all out on ray tracing. Same raster performance as the PS5 for backwards compatibility, but all out ray tracing for the rest. Becoming a showcase for what PC hardware could develop into if given a few more years.
You're not getting anything "all out" on RT because RT isn't just about tracing rays. That part is required, but what you do with the information you get from it is your usual shading, which means that to get a sizeable performance and/or quality improvement in RT you need to scale the shading part of the h/w to the same degree as you'd scale the RT part. These aren't two parts which exist in parallel and sit idle while the other does things; both are required for rendering anything with RT.
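To illustrate that coupling with a toy CPU example (nothing here reflects how a real GPU schedules work): finding the hit point is the part dedicated RT hardware accelerates, but the hit still has to be shaded with ordinary shader-core math afterwards.

```cpp
#include <cmath>
#include <cstdio>

// Toy illustration only: one hard-coded sphere, one light, Lambert shading.
struct Vec { double x, y, z; };
static Vec    sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec    norm(Vec a)       { double l = std::sqrt(dot(a, a)); return {a.x / l, a.y / l, a.z / l}; }

// Step 1: the part RT cores accelerate - find where the ray hits the scene.
static bool intersect_sphere(Vec orig, Vec dir, Vec center, double radius, double& t) {
    Vec oc = sub(orig, center);
    double b = dot(oc, dir);
    double disc = b * b - (dot(oc, oc) - radius * radius);
    if (disc < 0.0) return false;
    t = -b - std::sqrt(disc);
    return t > 0.0;
}

// Step 2: the part that still runs on the ordinary shader ALUs - turn the hit into a colour.
static double shade_lambert(Vec hit, Vec center, Vec light_dir) {
    Vec n = norm(sub(hit, center));
    double ndotl = dot(n, norm(light_dir));
    return ndotl > 0.0 ? ndotl : 0.0;
}

int main() {
    Vec orig{0, 0, 0}, dir{0, 0, -1}, center{0, 0, -5}, light{0, 1, 1};
    double t;
    if (intersect_sphere(orig, dir, center, 1.0, t)) {
        Vec hit{orig.x + dir.x * t, orig.y + dir.y * t, orig.z + dir.z * t};
        std::printf("hit at t = %.2f, shaded value = %.2f\n", t, shade_lambert(hit, center, light));
    }
    return 0;
}
```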

The "rasterization" part of the h/w is in fact a rather small one which does exactly that - rasterization i.e. turns triangles into pixels. The fact that tech media decided to call non-DXR graphics "rasterized" and DXR graphics "ray-traced" is on them I guess but this is wholly incorrect from both the h/w and s/w perspective. You can have RT in a game which doesn't use DXR (s/w Lumen is a major example of this; Nanite also kinda doesn't use a lot of h/w rasterization though so UE5 is even more interesting in this regards) and you definitely can have rasterization in a game which is using DXR - the vast majority of games with RT use h/w rasterization in fact.

This animosity between RT and "rasterization" exists only in the tech press and nowhere else. It's a product created to gather clicks and divide the audience for drama's sake.
A ray-traced game can use h/w rasterization, a non-RT game can avoid using it, and generally you do need all the h/w a modern GPU has to do ray tracing. There isn't much to remove if all you want to do is RT, and even the stuff which could be removed should probably still be there for b/c reasons and the simple fact that it's not that big a die space consumer anyway.
You don't need the RT h/w in particular if you're not doing RT, but that's hardly something new - games which don't use bilinear filtering haven't needed that h/w since the Voodoo days.
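Since "rasterization" in the h/w sense really is just that triangles-to-pixels step, here's a bare-bones CPU version of it using edge functions, purely for illustration (very roughly the kind of work Nanite moves into compute shaders, hugely simplified):

```cpp
#include <cstdio>

// Minimal edge-function rasterizer for one 2D triangle, CPU-side and purely illustrative.
struct P { float x, y; };

// Signed area test: which side of edge ab the point c lies on.
static float edge(P a, P b, P c) { return (c.x - a.x) * (b.y - a.y) - (c.y - a.y) * (b.x - a.x); }

int main() {
    P v0{2, 1}, v1{12, 3}, v2{5, 10};
    const int W = 16, H = 12;
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            P p{x + 0.5f, y + 0.5f};  // sample at the pixel centre
            float w0 = edge(v1, v2, p), w1 = edge(v2, v0, p), w2 = edge(v0, v1, p);
            // A pixel is covered when it lies on the same side of all three edges.
            bool inside = (w0 >= 0 && w1 >= 0 && w2 >= 0) || (w0 <= 0 && w1 <= 0 && w2 <= 0);
            std::putchar(inside ? '#' : '.');
        }
        std::putchar('\n');
    }
    return 0;
}
```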
 
Thank you for that great explanation; I'd really misunderstood how the hardware works.

I was under the impression that parts of the standard "raster" side of the hardware would become superfluous as RT h/w solutions gained traction. Like a muscle withering from disuse as another is focused on instead.

Correct me if I'm wrong, but would it be better to think of the RT h/w solutions as feeding data to the "raster" part then?
 
Correct me if I'm wrong, but would it be better to think of the RT h/w solutions as feeding data to the "raster" part then?

Not really. Most hardware in a GPU is used by both rasterization and ray tracing. Then there are a few smaller bits that are raster- or RT-specific. Memory pipelines, texturing and shaders are not raster hardware. Raster-specific hardware includes the rasterizer, tessellator, triangle setup, vertex cache and ROP units.
 
Is the vertex cache still a thing? I thought we'd moved on to just compute units with caches running whatever code. Same with tessellators - do these still exist in hardware?
 
The vertex cache probably doesn't exist anymore on most hardware, I'd think.
That was the conclusion from "Revisiting The Vertex Cache: Understanding and Optimizing Vertex Processing on the modern GPU" in 2018.
 
As people said, highly coherent rays are better rasterized, since we have the hardware anyway, and the knowledge to use it.
I'd still like a comparison of a AAA game's dynamic scene running both, to have an idea of how much slower/faster one is compared to the other.
When it comes to smaller budget titles, I wouldn't be so sure... Maybe Overcooked would run as fast completely ray traced... (But you wouldn't sell it, because there's too small a user base for RT-only games ATM.)
 
If I look at the Steam numbers, just for NVIDIA the RT-capable hardware is close to 55%, so with AMD we are close to 60% RT-capable GPUs; the tipping point is close.
 
And B) what will happen to all the games produced that require raster hardware? Will future ray-tracing-focused hardware retain, or be capable of emulating in software, the features necessary to play them?
I asked someone at NV about this once, and they said that even if all aspects of rendering in games are done with rays being traced, they would still want to keep rasterisation hardware, for power reasons, to display the Windows GUI.
 
That soothes my mind. Though I'm sure it's unwarranted, I've had this niggling worry in the back of my head that games could become difficult to run in the future as hardware evolves beyond our current rendering paradigm and loses backwards compatibility. Granted, other hardware might well be fit for purpose to emulate whatever is necessary whenever we might reach such a point. But still.
 
That soothes my mind. Though I'm sure it's unwarranted, I've had this niggling worry in the back of my head that games could become difficult to run in the future as hardware evolves beyond our current rendering paradigm and loses backwards compatibility. Granted, other hardware might well be fit for purpose to emulate whatever is necessary whenever we might reach such a point. But still.
We are already there.
Non-RT GPUs cannot run Indiana Jones.
Nor will they be able to run the new Doom game.
This will only increase, like it or not.
 
We are already there.
Non-RT GPUs cannot run Indiana Jones.
Nor will they be able to run the new Doom game.
This will only increase, like it or not.
But current hardware can still run non-RT games, which was the capability I was mostly worried about losing to some degree in the future.

RT becoming mainstream isn't a problem for me personally. I look forward to seeing what RT will bring for developers and gamers alike. But yea, for those of us with non-RT hardware (I'm firmly in that camp, still sporting my trusty Radeon 5700 XT) it'll force us to upgrade if we want to play these games.
 