DXR games and demos; Where's the beef? *spawn*

But those DX11 cards didn't cost over $1000 while also being a relatively poor performance upgrade compared to the cards they were replacing in their respective category.
That's fair.
That's also a function of the times.
Back when Moore's law still meant cheaper transistors. Back when RAM was readily available and not as expensive.
Back when PC parts weren't competing that heavily with mobile.

I get where you are coming from, but it's not an apples-to-apples comparison here.
There were full generations of DX11 and DX10 cards that were marketed, sold, and succeeded without ever once using their new features. All of it held up by the fact that consoles were still largely stuck at the DX9 feature set.

Which, if we're going to go through this one more time, is why it makes sense for consoles to adopt DXR now, so that the PC market doesn't stagnate and everything you just said doesn't come true.
 
But it was pointless to buy a DX11-based card hoping to get today's games back in 2010.
See anything during the 7570 eras.

But you also got a significant boost in performance, which enabled more features, higher resolution, a more consistent framerate, or any combination of those at a reasonable price increase over the previous generation.

So, depending on the type of gamer you were, you could opt for performance, IQ, resolution, or combinations of those.

With RTX, currently it's the promise that at some point in the future there will be better-looking games, albeit with a larger performance hit. Or a minor-to-modest increase in speed at a massive increase in price.

Even the GeForce 256, which came with a promise of future titles using T&L, at least shipped with a bunch of demos to showcase the effect to the buyer.

Regards,
SB
 
It's a paradigm shift. All those other cards merely improved rasterization, which is why it was easier and faster to implement the new features.
 

A paradigm shift only happens when it's time for the paradigm shift.

RTX may be getting things moving, but it most certainly isn't the paradigm shift yet. The next round of DXR accelerators might be.

I'm looking forward to what NV follows up with. Usually their follow-up to the introduction of a new feature is significantly better than the intro hardware.

GeForce 256 to GeForce 2, for example.

Or Fermi to Kepler.

Regards,
SB
 
This feels like unnecessary "kill it before it gets out of the gate" type of talk. You can't declare anything until you've either seen it go unadopted for a great many years, or seen wide adoption followed by a drop-off because the tech isn't good enough (see VR). You make your point, but it's still way too early.
 
Did the new features of DX10 and DX11 also bring a 50% price increase over the generation before?

Yes, both times.

5870, the first DX11 card: 60% price increase vs the 4890, 43% performance increase, worse perf/dollar than the generation before, linear scaling of perf/$ with the old-gen competitor. https://www.techpowerup.com/reviews/ATI/Radeon_HD_5870/30.html
7970, the first DX11.1 card: 62% price increase vs the 6970, 30% more performance, worse perf/dollar than the generation before, linear scaling of perf/$ with the old-gen competitor. https://www.techpowerup.com/reviews/AMD/HD_7970/28.html
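
To put those in perf/$ terms (taking the launch figures above at face value): relative perf/$ is roughly (1 + performance gain) / (1 + price increase), so the 5870 lands at about 1.43 / 1.60 ≈ 0.89, i.e. roughly 11% worse perf/$ than the 4890 at launch, and the 7970 at about 1.30 / 1.62 ≈ 0.80, roughly 20% worse than the 6970.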

That's the reason I can't understand all this screaming about Turing. Old launches were totally the same. Sure, it seems it's now got to a point where people don't want to pay so much. But costly new gens aren't the problem, as that started years ago. The problem is that gens nowadays keep their pricing for 2 years and don't come down over time.
 
RTX may be getting things moving, but it most certainly isn't the paradigm shift yet. The next round of DXR accelerators might be.
It's the start of the paradigm shift, which is more important than the paradigm shift itself. And this time it's much greater and more visually impactful than most DX features. You can now do proper reflections, translucency in refractions, translucent shadows, transparent shadows and real-time dynamic GI. None of these things are possible with previous versions of DX, only with DXR.
GeForce 256 to GeForce 2, for example.
People laughed at and ridiculed hardware T&L in the days of the GeForce 256 because no games used it, because performance wasn't impressive, because API support was minimal, and look where we are now.

RTX is the opposite of hardware T&L here: several games and engines are lining up to use it, APIs are ready, and performance with ray tracing is much greater than ever before.
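
On the "APIs are ready" point: DXR is exposed as an ordinary D3D12 feature query, so an engine can detect support at runtime and fall back to plain rasterization where it isn't there. Roughly, that check looks like this (just a sketch of my own, assuming a recent Windows 10 SDK and a valid ID3D12Device5):

Code:
#include <windows.h>
#include <d3d12.h>

// Returns true if the device/driver combination exposes any DXR tier at all.
bool SupportsDXR(ID3D12Device5* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;

    // D3D12_RAYTRACING_TIER_NOT_SUPPORTED means the device can't do DXR at all.
    return options5.RaytracingTier != D3D12_RAYTRACING_TIER_NOT_SUPPORTED;
}

As far as I know, that query only comes back positive on the Windows 10 October 2018 update with a DXR-capable driver, so the software side is in place even before the first RTX patches land.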
 
I think it's far too early to say RTX is the opposite. We will need to wait and see how it turns out.
 
With no high-end PC games, gamers have grown accustomed during this past decade to maxing out the graphics settings and still getting high framerates at high resolutions. When an actual high-end feature shows up, they are shocked by the performance cost.
 

That's a rather condescending and ignorant way to look at things. While the PC gaming space isn't as extensive as it once was at the high end of AAA development, there are still developers that take advantage of advances in GPU technology.

Ashes of the Singularity took advantage of async compute and explicit multi-adapter (multiple GPUs), for example.

id Software pushed new advances in rendering on both the PC and consoles.

DICE continues to look at new ways to leverage advances in GPU technology and has for the past decade. RT isn't the first time DICE has done this and it likely won't be the last.

Nixxes often pushes new GPU technology in their PC ports of games (the Tomb Raider series, for example). We have some threads in the PC forum that talk about some of the PC-only GPU features that were pushed in the PC version of the game, sometimes before any other developer.

Square Enix-Eidos pushed and experimented with many new GPU features in the Hitman games.

Etc. etc.

Just because it wasn't RT doesn't mean they didn't push or use the latest advances in GPU tech.

Just because you particularly like RT doesn't mean that it is the only significant GPU advancement in the past 10+ years, or the only one that developers are or have been excited for, or the only one that might be or has been used in games.

It's significant, but excitement and tentative exploration of a feature don't lead to a paradigm shift unless the entire industry moves to it.

There have been multiple technologies that could have led to a paradigm shift and early on looked like they might lead to one, but after multiple years it turned out they didn't. GPU Physics and Tessellation, to name just two.

Some are still in the process and could potentially falter still. Compute has actually led to a paradigm shift, but a subtle one. GPU dispatch is another that is still in the process of potentially becoming a paradigm shift.

People can talk about paradigm shifts all they want, but it isn't one until it actually happens.

This isn't unlike Chemistry and Physics. I know a Nobel Prize-winning research physicist who has, multiple times over the years, proclaimed something cool as a paradigm shift in Physics...some of those are still in the process and may or may not become one...some of them never left the theoretical stage.

All that said, I've already stated that I think it's likely to be a shift, but it hasn't happened yet and it likely won't for a few years still...assuming it does lead to one.

Regards,
SB
 
There have been multiple technologies that could have led to a paradigm shift and early on looked like they might lead to one, but after multiple years it turned out they didn't. GPU Physics and Tessellation, to name just two.
GPU Physics and Tessellation certainly can't be considered paradigm shifts, as they only marginally improve upon existing tech. They don't actually bring revolutionary new effects to the pipeline: physics can be done on the CPU, and models can have a high number of polygons. GPU Physics and Tessellation just accelerate the performance of those techniques.

Yet still, GPU Physics is an important part of games nowadays; GPU particles are found in many games, and the same goes for GPU hair (TressFX, HairWorks). In fact, the whole console generation has learned how to offload many workloads onto the GPU (including water simulation, among many other things). As for Tessellation, it is used in most titles nowadays, if not for characters then for objects, volumetric light, water or terrain. So no, these effects have proved their worth already.
 
I don't see why everyone is surprised that there aren't any games supporting RTX/DXR yet.
Because Nvidia showcased games with RTX enhancements during the RTX launch, said games came out, DXR came out, RTX cards came out, and RTX implementations are nowhere to be seen.


5870, the first DX11 card: 60% price increase vs the 4890, 43% performance increase, worse perf/dollar than the generation before, linear scaling of perf/$ with the old-gen competitor. https://www.techpowerup.com/reviews/ATI/Radeon_HD_5870/30.html
7970, the first DX11.1 card: 62% price increase vs the 6970, 30% more performance, worse perf/dollar than the generation before, linear scaling of perf/$ with the old-gen competitor. https://www.techpowerup.com/reviews/AMD/HD_7970/28.html

Comparing new cards with older cards that saw their price plunge at the end of their lifetime seems awfully convenient.
In both cases the previous cards had a similar MSRP at launch.
nVidia has been steadily increasing the launch MSRP for each performance bracket and, boosted by the mining craze and AMD's lack of competition on the top-end, it culminated in the RTX launch.
The 2080 costs about twice as much as the 980 did at launch.


Just because you particularly like RT,
When referring to @OCASM, that's the understatement of the year...


I, OTOH, haven't seen a single demo that shows real-time raytracing as something we should all look forward to adopting ASAP no matter what.
Rasterization with smart tricks seems great to me.

VR/AR is a much greater paradigm shift than RTRT at the moment IMHO.
2D LCD panels are boring in comparison. And this is coming from someone who recently invested in a 34" UWQHD monitor and a 55" HDR TV.
 
That's a rather condescending and ignorant way to look at things. While the PC gaming space isn't as extensive as it once was at the high end of AAA development, there are still developers that take advantage of advances in GPU technology.

People can talk about paradigm shifts all they want, but it isn't one until it actually happens.
There's a reason why the meme of "but does it run Crysis?" lives on: all high-end eye-candy games are designed for consoles and then ported to PC nowadays. Whatever extra features they have are optional afterthoughts.

We've already seen the transition from rasterization to ray tracing in the CG film industry. The game industry is simply catching on.

Whereas rasterization is a bunch of hacks that try to mimic the look of lighting simulation, ray tracing is lighting simulation. The two work in fundamentally different ways, which is why you need different hardware to run it efficiently. It is a paradigm shift. Of course, it doesn't happen overnight. We're at the beginning of the transition.
 
GPU Physics and Tessellation certainly can't be considered paradigm shifts, as they only marginally improve upon existing tech. They don't actually bring revolutionary new effects to the pipeline: physics can be done on the CPU, and models can have a high number of polygons. GPU Physics and Tessellation just accelerate the performance of those techniques.

Yet still, GPU Physics is an important part of games nowadays; GPU particles are found in many games, and the same goes for GPU hair (TressFX, HairWorks). In fact, the whole console generation has learned how to offload many workloads onto the GPU (including water simulation, among many other things). As for Tessellation, it is used in most titles nowadays, if not for characters then for objects, volumetric light, water or terrain. So no, these effects have proved their worth already.
If tessellation had ushered in Reyes rendering for games it would have been a paradigm shift. The possibility was there but it didn't end up being the right fit for real-time rendering.
 
We've already seen the transition from rasterization to ray tracing in the CG film industry. The game industry is simply catching on.
CG isn't realtime. It has to prioritise the production pipeline and final quality. Gaming has to prioritise production, performance, and accessibility (can your market use it?). The dream of real-time, fully raytraced graphics is one worth pursuing, but just because the CGI industry is using RT now doesn't mean gaming should. RT has to be proven valuable for gaming on its own merits.

That is, RenderMan introduced raytracing in 2006. GPUs aren't just now 'catching on' - "Duh, we should have been raytracing all this time instead of rasterising." No matter what Hollywood was doing, GPUs had to carry on with rasterising as the only viable way to get realtime graphics. It's those decisions, not Hollywood, that should be considered.
 
GPUs are just catching on. We weren't doing ray tracing because it was too slow for real-time use, even in limited form. Not so anymore. The performance cost of high-end rasterization is not that much better than hybrid ray tracing. Production and accessibility are some of the main reasons offline CG producers moved to ray tracing. All the complexity and limitations of rasterization are gone. As Jensen said:

 
"Just catching on" means it could have been done long before but people missed the trick, and are only just now 'catching on'. GPUs aren't 'just catching on'. They are evolving to support more features now that that's possible, primarily because lithography allows large enough processors, and partly because of the evolution of GPGPU to compute as driven by software developers. Everyone knew what RT was and what it offered, but it wasn't possible to do anything about it until now.
 
Oct 3 was the big release for W10 and DXR.
Nvidia driver here to support it:
https://www.nvidia.com/en-us/geforc...update-directx-ray-tracing-game-ready-driver/

BFV DXR patch inc Nov 15.

Microsoft seems to think differently
https://www.microsoft.com/en-us/itpro/windows-10/release-information
Notice: October 8, 2018: We have paused the rollout of the Windows 10 October 2018 Update (version 1809) for all users as we investigate isolated reports of users missing some files after updating. We apologize for any inconvenience this may have caused. We will provide an update when we resume updating customers. For reference, please see Windows 10 update history for additional information.
 