DXR games and demos; Where's the beef? *spawn*

Finally a feature that makes high-end cards sweat.
This is not indicative of the card sweating. It reflects that the pipeline for processing ray tracing isn't as parallel as they suggested.
It indicates that RT cores really aren't doing much beyond basic intersection testing.
It indicates why Nvidia was so secretive about performance information pre-launch and even post-launch.
It indicates and vindicates the people who were critical of this flop of a card, for obvious reasons.
It indicates that they really didn't solve the harder problem in ray tracing: divergent execution.
It indicates that this is literally a beta/prototype card, like people said.
It indicates that the bulk of the algorithm is still serialized in the traditional pipeline.

It indicates that Nvidia just conned the crap out of people, and they were dumb enough to pre-order and keep ordering.
I don't think consumers of such cards care, so it's win/win for everyone.
Those who saw this for what it was months ago knew better than to buy one, for the very reason that is now obvious.

Real-time ray tracing won't be a reality until GPUs move to MCM configurations and ray tracing is broken out onto a completely separate, truly parallel compute chip. As it stands, this is a gimmick for real-time ray tracing. They essentially promoted a sham to get GeForce consumers to subsidize Quadro pro cards (offline rendering acceleration).
 
Real-time ray tracing won't be a reality until GPUs move to MCM configurations and ray tracing is broken out onto a completely separate, truly parallel compute chip.
Ray tracing itself might happen in parallel, but the shading still needs to be done on the regular hardware. Reflections are expensive.
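To put rough numbers on "expensive", here's a back-of-envelope sketch; the resolution, frame rate, and one-reflection-ray-per-pixel budget are illustrative assumptions, not measured figures:

```cpp
#include <cstdio>

// Back-of-envelope: how many extra rays a single reflection bounce adds.
// All numbers below are illustrative assumptions, not measurements.
int main() {
    const long long width = 2560, height = 1440;  // assumed render resolution
    const long long fps = 60;                     // assumed target frame rate
    const long long raysPerPixel = 1;             // one reflection ray per pixel

    long long raysPerFrame  = width * height * raysPerPixel;
    long long raysPerSecond = raysPerFrame * fps;

    // Every ray that hits something still needs a full material shade on the
    // regular shader cores -- the RT cores only find the hit point.
    printf("reflection rays per frame:  %lld\n", raysPerFrame);   // ~3.7 million
    printf("reflection rays per second: %lld\n", raysPerSecond);  // ~221 million
    return 0;
}
```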
 
The only thing RT cores do is intersection testing.
[Attached image: v2-73eb8257213f94dcce63e9c61e3e8c41_hd.jpg]
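For a sense of what that test is, here's a minimal software sketch of the ray-vs-AABB slab test that BVH traversal runs at every node it visits. This illustrates the operation only; it is not Nvidia's actual hardware implementation:

```cpp
#include <algorithm>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Slab test: does a ray hit an axis-aligned bounding box?
// origin is the ray origin; invDir holds 1/direction per component.
bool rayHitsAABB(Vec3 origin, Vec3 invDir, Vec3 boxMin, Vec3 boxMax) {
    float t1 = (boxMin.x - origin.x) * invDir.x;
    float t2 = (boxMax.x - origin.x) * invDir.x;
    float tmin = std::min(t1, t2), tmax = std::max(t1, t2);

    t1 = (boxMin.y - origin.y) * invDir.y;
    t2 = (boxMax.y - origin.y) * invDir.y;
    tmin = std::max(tmin, std::min(t1, t2));
    tmax = std::min(tmax, std::max(t1, t2));

    t1 = (boxMin.z - origin.z) * invDir.z;
    t2 = (boxMax.z - origin.z) * invDir.z;
    tmin = std::max(tmin, std::min(t1, t2));
    tmax = std::min(tmax, std::max(t1, t2));

    return tmax >= std::max(tmin, 0.0f);  // slabs overlap in front of the ray
}

int main() {
    Vec3 origin{0, 0, 0};
    Vec3 invDir{1, 1e9f, 1e9f};  // ray along +X; huge values stand in for 1/0
    bool hit = rayHitsAABB(origin, invDir, {1, -1, -1}, {2, 1, 1});
    printf("%s\n", hit ? "hit" : "miss");  // prints: hit
    return 0;
}
```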

Ray generation and building the acceleration structure are computed on the CUDA cores, which takes resources away from pumping out higher frame rates. Shading, as you correctly state, occurs in the traditional shading pipeline. Lastly, the denoising stage (necessitated by the ridiculously noisy and incomplete ray tracing output) runs in series at the end of the rendering pipeline, adding a new latency block to the overall frame.
This is why performance tanks as badly as it does. Calling this a hybrid pipeline is a bit of a joke.
This is lipstick on a pig. A truer form of real-time ray tracing will have its own dedicated chip, IMO. This is a stop-gap, and the RT cores are doing something less than even the tensor cores in the basic math department; I wouldn't be surprised if it's just chained ALUs. Clearly nothing here resolves the divergent-ray problem, as frame rates drop hugely in scenes that hit it.
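A toy timing model of that serialization point; all the per-stage millisecond costs below are made-up examples, not measurements. If the denoise pass can only start once tracing has finished, its cost adds straight onto frame time instead of overlapping:

```cpp
#include <cstdio>

// Toy model: back-to-back pipeline stages add up rather than overlap.
// Stage costs are made-up examples for illustration only.
int main() {
    const float rasterMs  = 6.0f;  // raster / G-buffer work (assumed)
    const float traceMs   = 7.0f;  // ray tracing pass (assumed)
    const float denoiseMs = 3.0f;  // denoise pass, runs after tracing (assumed)

    float frameMs = rasterMs + traceMs + denoiseMs;  // stages in series
    printf("raster alone: %.1f ms -> %.0f fps\n", rasterMs, 1000.0f / rasterMs);
    printf("full chain:   %.1f ms -> %.0f fps\n", frameMs, 1000.0f / frameMs);
    // Each added stage looks modest on its own, yet the serial chain
    // takes frame time from ~6 ms (~167 fps) to 16 ms (~62 fps).
    return 0;
}
```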

This product is a gimmick. Nvidia's stock is down 15% after-hours on disappointing earnings.
No one's buying this.

[Attached image: 1_rtx-sales-vs-1080ti.png]
 
No one buying it after the announcement can mean 'wait and see' along with 'the 1080 is a good price now.' The important stat will be whether sales pick up as more games show ray tracing.
 
No one buying it after the announcement can mean 'wait and see' along with 'the 1080 is a good price now.' The important stat will be whether sales pick up as more games show ray tracing.
Very few people bought at the 1080-and-up price points with Pascal.
An even smaller minority would entertain the price points of the GeForce 20 series.
On top of that, you're met with the insane FPS drop when you enable ray tracing...
What exactly is going to make sales pick up when every informed reviewer is trashing the cards and it's plain to see why? Especially at ridiculous price points that many consider an immediate write-off?
I'm sorry, but this idiocy is exactly what got Intel into trouble in its late stages: overall stagnation, gimmicky features, and an insane price premium.

Many are in wait-and-see mode, armed with Pascal cards: waiting until 7nm GPUs come along and insane power consumption is brought back down to reality, along with price.
I see no-bid action on a stop-gap product line. Most people who've contacted me end up on 1080 Tis or lower. I advise getting a 1070/1080, which can be offloaded with minimal loss once 7nm GPUs are the standard. Spending $1,200 on this hot garbage only to see frame rates go from 150 fps to 50 fps is downright hilarious... There's nothing more to see. It's what all of the informed people were telling you was coming.
 
An even smaller minority would entertain the price points of the GeForce 20 series.
On top of that, you're met with the insane FPS drop when you enable ray tracing...
The cards are no slouches in rasterization; they are second to none, and with no competition, people will buy them once Pascal dries up (the 1080 Ti already has). Their prices will come down too.

The fact that the 2080 and 2070 are being purchased at all (with the 1080 Ti and 1080 still around) is a strong indication of things to come.
 
Yeah, they'll be high-end rasterisers anyway. People may then choose to run with RT disabled. That's the sort of data we need but will probably never get!
 
The fact that the 2080 and 2070 are being purchased at all (with the 1080 Ti and 1080 still around) is a strong indication of things to come.
Well, a lot of people bought 1080 Tis after the RTX release. I myself upgraded from my 970 to a used 1080 to get me through until the next round of cards from AMD/Nvidia. Many gamers were waiting for the release to decide, and most decided against, meaning they're done with their upgrade until 7nm Turing or whatever comes from AMD.

So will RTX cards see good sales soon? Yes, by virtue of no high-end Pascal cards being available anymore (at least new), but the pool of buyers, I believe, is a lot smaller now.
 
The cards are no slouches in rasterization; they are second to none, and with no competition, people will buy them once Pascal dries up (the 1080 Ti already has). Their prices will come down too.

The fact that the 2080 and 2070 are being purchased at all (with the 1080 Ti and 1080 still around) is a strong indication of things to come.

Sure, but as Malo has mentioned, all those people that just bought a 1080 Ti or 1080 are highly unlikely to buy a 2080 or 2080 Ti, meaning that the pool of buyers for those cards is going to be relatively smaller than for past launches of enthusiast-level cards. I.e., the 1080/1080 Ti at launch had a much larger pool of buyers than the 2080 Ti, and a much larger pool over their lifetime (excluding crypto buyers), because people didn't skip the 1070/1080/1080 Ti the way we're seeing with the 2080/2080 Ti: the Pascal cards offered a big step over the 980/980 Ti, while the 2080/2080 Ti doesn't offer much over Pascal.

Regards,
SB
 
So apparently, just from reading around, we should expect only ray-traced audio for COD.

Which is not quite great, because I posted it in the wrong thread, but also great, because I want to see how good the implementation is. Also, it might be viable for current-gen consoles, maybe?
Hm... I guess it makes sense for a current-gen endeavour, since AMD had shifted towards using some spare CUs (on PC) for ray-traced audio as opposed to fixed-function TrueAudio blocks (?), although I wonder if the consoles' audio hardware is ignored then.
 
Hm... I guess it makes sense for a current-gen endeavour, since AMD had shifted towards using some spare CUs (on PC) for ray-traced audio as opposed to fixed-function TrueAudio blocks (?), although I wonder if the consoles' audio hardware is ignored then.
Yea, that was unverified broken telephone I was peddling there.
Nvidia wouldn't push RT audio only, since anyone can do it without the RT cores.
We will see something later for sure.
 
I forget, and I must have asked this before at some point - anyone looking at ray-traced AI pathing?
 
I forget, and I must have asked this before at some point - anyone looking at ray-traced AI pathing?

On the surface that seems like a pretty interesting implementation. You wouldn't have to worry about any extra overhead for denoising, as having the logic be a little fuzzy isn't a bad thing.

OTOH - if you have more than a few AIs, that might make it performance-prohibitive, no?

Also, RT already impacts GPU performance significantly; would games that hypothetically use it for AI pathing have to significantly simplify the visual rendering load?

Regards,
SB
 
On the surface that seems like a pretty interesting implementation. You wouldn't have to worry about any extra overhead for denoising, as having the logic be a little fuzzy isn't a bad thing.

OTOH - if you have more than a few AIs, that might make it performance-prohibitive, no?

Also, RT already impacts GPU performance significantly; would games that hypothetically use it for AI pathing have to significantly simplify the visual rendering load?

Regards,
SB
RT performance depends on what you're trying to accomplish, SB. If you're not bouncing rays around, it could be quite straightforward. We do ray casting for a lot of things in games today because we're forced to, e.g. mouse clicks, UI, NPC vision, etc.
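For example, here's a minimal 2D sketch of the NPC-vision case: the NPC sees the player only if the segment between them crosses no wall. The geometry and names are illustrative, not from any particular engine:

```cpp
#include <cmath>
#include <cstdio>
#include <utility>
#include <vector>

struct P { float x, y; };

static float cross2(P a, P b) { return a.x * b.y - a.y * b.x; }

// Do segments a1-a2 and b1-b2 intersect?
static bool segmentsIntersect(P a1, P a2, P b1, P b2) {
    P r{a2.x - a1.x, a2.y - a1.y};
    P s{b2.x - b1.x, b2.y - b1.y};
    float denom = cross2(r, s);
    if (std::fabs(denom) < 1e-6f) return false;  // parallel: treat as no hit
    P d{b1.x - a1.x, b1.y - a1.y};
    float t = cross2(d, s) / denom;  // position along the sight line
    float u = cross2(d, r) / denom;  // position along the wall
    return t >= 0 && t <= 1 && u >= 0 && u <= 1;
}

// One "ray cast" per vision check: the NPC sees the player unless a wall blocks it.
static bool npcSeesPlayer(P npc, P player, const std::vector<std::pair<P, P>>& walls) {
    for (const auto& w : walls)
        if (segmentsIntersect(npc, player, w.first, w.second))
            return false;
    return true;
}

int main() {
    std::vector<std::pair<P, P>> walls = {{{5, -5}, {5, 5}}};  // one wall at x = 5
    printf("%s\n", npcSeesPlayer({0, 0}, {10, 0}, walls) ? "seen" : "hidden");  // hidden
    return 0;
}
```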
 
RT performance depends on what you're trying to accomplish, SB. If you're not bouncing rays around, it could be quite straightforward. We do ray casting for a lot of things in games today because we're forced to, e.g. mouse clicks, UI, NPC vision, etc.

Sure, but when it comes to AI pathing, it's all node-based, AFAIK. You can see this quite easily if you sit in place and just watch AI movements. Games with more CPU time allocated to this will have denser node maps, so it's not quite as obvious. I'd imagine using rays in real time to do pathing (you'll still potentially need bounces in order for the AI to determine where to go after it reaches a point) is going to be far more compute-intensive (even when done on the GPU) than using a pre-computed node map, with the CPU determining, via the AI's algorithm, which nodes to use to reach whatever its destination may be.

The load will increase dramatically with more AI agents. I could see something like that working in, say, Alien: Isolation, where there is basically just one AI, or alternatively in a game with very simplistic environments. But if you have more than a handful of AI agents?

Also, how would you combine that with AI awareness of the shape of a location (its boundaries) and of objects within the location that are outside its current "view", such that with ray tracing it can plan a route through an environment it can't see? How many bounces would be required for it to plan its movements? Movements that could then change depending on what it sees when it comes around a corner and reassesses the situation, if it's different from what it was expecting.

With node maps that's relatively easy.

Lighting is far easier, IMO, as there is no need to try to determine where you want the light to go based on AI decision making, you're just tracing out where the light will go from the source of the light.

I guess you'd have to do some kind of hybrid of RT and node-based pathing. Maybe use a node map to determine the overall path you want to take, then RT to determine the immediate path to the general location of a given node?
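Concretely, one existing technique that already looks like that hybrid is path smoothing (sometimes called string pulling): run the node-based search first, then use cheap visibility tests to drop waypoints the agent can walk straight past. A sketch, where the line-of-sight callback is a stand-in for whatever raycast query the engine provides:

```cpp
#include <cstdio>
#include <functional>
#include <vector>

struct P { float x, y; };

// Smooth a node-based path: drop any waypoint the agent could walk straight
// past, using a line-of-sight test (in practice, a physics raycast).
std::vector<P> smoothPath(const std::vector<P>& path,
                          const std::function<bool(P, P)>& los) {
    if (path.size() < 3) return path;
    std::vector<P> out{path.front()};
    size_t anchor = 0;
    for (size_t i = 1; i + 1 < path.size(); ++i) {
        if (!los(path[anchor], path[i + 1])) {
            out.push_back(path[i]);  // next node isn't visible; keep this one
            anchor = i;
        }
    }
    out.push_back(path.back());
    return out;
}

int main() {
    // Toy LOS stand-in: points see each other only on the same side of x = 5.
    auto los = [](P a, P b) { return (a.x < 5) == (b.x < 5); };
    std::vector<P> path = {{0, 0}, {2, 2}, {4, 0}, {6, 2}, {8, 0}};
    auto s = smoothPath(path, los);
    printf("%zu waypoints -> %zu after smoothing\n", path.size(), s.size());  // 5 -> 4
    return 0;
}
```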

Regards,
SB
 
Do you mean nodes as regular grids? A* pathfinding can use geometry and irregular maps with no regularly spaced nodes, only barriers to be avoided. I have a 2D Unity asset called PolyNav which does just that: agents manoeuvre freely around geometry. You create simple bounding geometry to define the level and the agents will work their way around corners to any requested space. I use this to get my ships to pick routes, plus ray-traced collision tests (actually circle-casts) to avoid immediate obstacles.

I doubt ray-traced pathfinding would be an efficient solution, and A* is already highly efficient and completely versatile.
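For reference, here's a minimal A* on a 4-connected grid; the map and unit step costs are illustrative:

```cpp
#include <cstdio>
#include <cstdlib>
#include <functional>
#include <queue>
#include <tuple>
#include <vector>

// Minimal A* on a 4-connected grid, to show how cheap node-based
// pathfinding already is. '#' cells are blocked; S = start, G = goal.
const int W = 8, H = 5;
const char* grid[H] = {
    "........",
    ".######.",
    "......#.",
    ".####.#.",
    "S..#..G.",
};

int heuristic(int x, int y, int gx, int gy) {
    return std::abs(x - gx) + std::abs(y - gy);  // Manhattan distance
}

int main() {
    const int sx = 0, sy = 4, gx = 6, gy = 4;
    std::vector<int> dist(W * H, 1 << 20);  // best known cost per cell

    // Open set ordered by f = g + h, smallest first.
    using Node = std::tuple<int, int, int>;  // (f, x, y)
    std::priority_queue<Node, std::vector<Node>, std::greater<Node>> open;
    dist[sy * W + sx] = 0;
    open.emplace(heuristic(sx, sy, gx, gy), sx, sy);

    const int dx[] = {1, -1, 0, 0}, dy[] = {0, 0, 1, -1};
    while (!open.empty()) {
        auto [f, x, y] = open.top();
        open.pop();
        (void)f;  // f only orders the queue
        if (x == gx && y == gy) {
            printf("path length: %d steps\n", dist[y * W + x]);  // prints: 10 steps
            return 0;
        }
        for (int k = 0; k < 4; ++k) {
            int nx = x + dx[k], ny = y + dy[k];
            if (nx < 0 || ny < 0 || nx >= W || ny >= H || grid[ny][nx] == '#') continue;
            int g = dist[y * W + x] + 1;  // unit cost per step
            if (g < dist[ny * W + nx]) {
                dist[ny * W + nx] = g;
                open.emplace(g + heuristic(nx, ny, gx, gy), nx, ny);
            }
        }
    }
    printf("no path\n");
    return 0;
}
```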
 
I doubt ray-traced pathfinding would be an efficient solution, and A* is already highly efficient and completely versatile.
Yea, I was thinking of NPC vision, in terms of seeing or hearing the player, as ray-traced pathing.

Unless they are NPCs attempting traversal in very fluid ways, I can't see it replacing A*.
 