Impact of nVidia Turing RayTracing enhanced GPUs on next-gen consoles *spawn

Status
Not open for further replies.
True if what we want is the best thing to drive RT progress. Debatable if what we want is the best console in terms of cost, efficiency, and performance for a 5+ year gaming product.

You state that as fact, but it's speculation. A game designed with raytracing in mind might not be able to hit 4K60 no matter what you do, because the minimum possible demands of RT may be too much for the hardware that can be put in a console. There are still plenty of unknowns. The only step forward in our understanding so far is that we can see the first-gen hardware is not powerful enough to offer everything in these first attempts. We need to see more attempts to see how far this first-gen tech can be stretched.
1) Specialized hardware is more efficient than general purpose. Or should we dispense with the rasterization-specific hardware too? It would allow for more flexibility, but say goodbye to speed.

2) Just lower the quality until it hits the mark :p
 
People are acting like consoles demand 100fps, consoles are content with 30fps. In 2020, we can get a good enough GPU that does ray tracing just fine @1440p30.
I don't see Sony or MS designing a console APU now that requires 550mm²+ for the GPU alone to achieve 1440p30, and that's being generous in assuming the 2080 can achieve that with any decent HRT effects.
 
I agree with you, but for argument's sake, the reason to think this is a 'line in the sand' is that ray tracing is a very straightforward solution that has existed for decades, and the only ways to speed it up, which has been a huge priority for the professional industries, are more power and denoising. ;) If there were tricks that could double the throughput of raytracing on a CPU by reusing ray data, they'd already be in use in every raytracer out there. Of course, depending on how the code needs to be implemented there could be bottlenecks in this version, and there might be aspects of a hybrid renderer that can share workloads between rasterising and tracing that'll improve overall performance.
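To illustrate why denoising is the standard companion to low ray counts, here's a toy sketch: a plain box filter over a fake 1-sample-per-pixel render. All names and parameters are illustrative; real denoisers in RTX titles are far more sophisticated (bilateral, temporal, AI-based), but the principle is the same: averaging neighbours trades a little blur for a lot less Monte Carlo noise.

```python
# Toy sketch: why denoising lets you get away with few rays per pixel.
# Everything here is illustrative, not any engine's actual pipeline.
import random

def render_noisy(width, height, true_value=0.5, noise=0.3, seed=1):
    """Fake a 1spp render: the 'true' radiance plus per-pixel Monte Carlo noise."""
    rng = random.Random(seed)
    return [[true_value + rng.uniform(-noise, noise) for _ in range(width)]
            for _ in range(height)]

def box_denoise(img, radius=1):
    """Average each pixel with its neighbours within `radius` (clamped at edges)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            ys = range(max(0, y - radius), min(h, y + radius + 1))
            xs = range(max(0, x - radius), min(w, x + radius + 1))
            vals = [img[yy][xx] for yy in ys for xx in xs]
            out[y][x] = sum(vals) / len(vals)
    return out
```

With radius 1, each output pixel averages up to 9 noisy samples, so the per-pixel error shrinks substantially without firing a single extra ray — which is exactly the trade Turing's denoising-based pipeline leans on.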

I guess another aspect is backlash. We've been hearing about raytracing improving everything, but this first example is only adding reflections and performance absolutely nose-dives. If BFV looked next-gen with the low framerate, it'd be a different story, or if it had reflections at only a mild impact. The initial impression after the beauty of things like the Pica Pica demo hasn't carried over, while of course there's the astronomical price-tag. So, yeah, I can understand a somewhat reactionary "is this it?" response after all the build up giving additional emphasis to this first title.

I'm not sure there's a way to make ray-tracing cache friendly, unless you do cone-tracing. I would expect the efficiencies would be gained in the construction and manipulation of the BVH, ray culling, importance sampling, the material shading, LODs, etc. There are probably a lot of things they can look at, but I'm not sure what the limits of Microsoft's DXR API are. I'd even expect that CPU culling will be of huge importance. You need to keep a model of the world in 360 degrees around the player, and smartly culling that world before building the BVH would be important. I have no idea how Frostbite handled visibility culling and how it had to be hacked for RT. Maybe what they did is great, maybe it isn't.
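The "cull before you build the BVH" idea can be sketched in a few lines. This is a toy illustration, not Frostbite's or DXR's actual approach: note that rays can hit geometry behind the camera, so the cull has to be by distance in all directions rather than by view frustum. All names (`cull_for_rt`, `build_bvh`) and the radius heuristic are made up for the example.

```python
# Toy sketch: distance-cull world objects around the player, then build a
# simple median-split BVH over the survivors. Illustrative only.
from dataclasses import dataclass

@dataclass
class AABB:
    lo: tuple  # (x, y, z) min corner
    hi: tuple  # (x, y, z) max corner

    def center(self):
        return tuple((a + b) / 2 for a, b in zip(self.lo, self.hi))

def cull_for_rt(objects, player_pos, radius):
    """Keep only objects whose AABB center lies within `radius` of the player.
    Unlike raster frustum culling, this keeps geometry behind the camera,
    since reflections and shadows can involve rays in any direction."""
    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    return [o for o in objects if dist2(o.center(), player_pos) <= radius ** 2]

def build_bvh(objects, leaf_size=2):
    """Median-split BVH over object centers; returns a nested (left, right) tree,
    with leaves represented as plain lists of objects."""
    if len(objects) <= leaf_size:
        return objects
    # Split along the axis with the largest spread of centers.
    centers = [o.center() for o in objects]
    spreads = [max(c[i] for c in centers) - min(c[i] for c in centers)
               for i in range(3)]
    axis = spreads.index(max(spreads))
    objects = sorted(objects, key=lambda o: o.center()[axis])
    mid = len(objects) // 2
    return (build_bvh(objects[:mid], leaf_size), build_bvh(objects[mid:], leaf_size))
```

The point of the ordering: every object you cull up front is an object the (expensive, per-frame) BVH build and every subsequent ray traversal never has to touch.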

Graphics is typically a place where real-time leads to a lot of discoveries that otherwise wouldn't happen. A lot of the offline render companies that work in film have real-time raster tools to improve their productivity because real-time ray tracing was so far out of reach. Now that the hardware is available, maybe they'll shift focus to moving some of that to ray tracing, and find smart ways to find performance. The people who work on consoles have to do more with less, which is why it's actually a very good space for ray tracing. PC space, you can get away with brute force and raising the system requirements for the "ultra" setting on the box.

Again, I have no idea where it'll go, but the backlash, to me, is unfounded. Software takes time.
 
I would expect the efficiencies would be gained in the construction and manipulation of the BVH, ray culling, importance sampling, the material shading, lods etc.
This is from the offline world but could prove to be useful for real-time:

 
Darn, too bad for the raytracing lovers: the difference in performance between RTX off and RTX on is huge. This is like Pantene hair in games. The difference in visuals, though, is also noteworthy.

 
The people who work on consoles have to do more with less, which is why it's actually a very good space for ray tracing. PC space, you can get away with brute force and raising the system requirements for the "ultra" setting on the box.
The only problem is that the argument becomes, "hey, console companies, can you add ray tracing hardware for the benefit of advancing ray tracing for the good of the wider industry?" The console companies are only interested in picking hardware that suits their earnings reports.
People are acting like consoles demand 100fps, consoles are content with 30fps. In 2020, we can get a good enough GPU that does ray tracing just fine @1440p30.
Plenty of console players prefer higher framerates. They accept lower framerates for quality as appropriate. Consider one console plays COD/Battlefield/Whatever with raytracing looking 10* Pretty at 30fps, and the other plays the games at 7* Pretty at 60 fps. Which will gamers prefer? If they prefer the pretties and you choose pretties, hurray. If not, if they prefer framerate, boo.

Same question as in the other thread. That question could be answered with PC gaming metrics if we could find out what players actually choose. Likewise, with mid-gen games offering performance versus quality modes, we could see which gamers prefer.
 
I don't see Sony or MS designing a console APU now that requires 550mm+ for the GPU alone to achieve 1440p30,
On 7nm it will be much smaller.
that's being generous for the 2080 to achieve that with any decent HRT effects.
It can with low or medium RTX. We are talking a mere 30fps here. Consoles also don't need to run Ultra rasterization graphics; they settle for High or even Medium, which gives you plenty of wiggle room.

Plenty of console players prefer higher framerates.
Nah, most console gamers are clueless, don't be fooled by the forum visitors or DF followers, they are a tiny drop in an ocean of casual couch players.
Consider one console plays COD/Battlefield/Whatever with raytracing looking 10* Pretty at 30fps, and the other plays the games at 7* Pretty at 60 fps.
Irrelevant question, as almost all beautiful and visually advanced console games are 30fps. You simply can't have strong visuals at higher framerates, especially if you insist on high resolutions like 2.5K or 4K.
 
Nah, most console gamers are clueless, don't be fooled by the forum visitors or DF followers, they are a tiny bit in an ocean of casual couch players.

You say that, and yet all the successful FPS and racing games have been 60 FPS regardless of whether a "clueless" console gamer even knew that FPS could refer to something other than a First Person Shooter. :p COD ratcheted up to god-tier shooter franchise due to its insistence that performance trumps graphics.

BF games ditching 30 FPS in favor of 60 FPS is what's allowing them to kinda sorta catch up to COD a little bit. If a future Battlefield game offers RT at 30 FPS versus regular rendering at 60 FPS on console, it'd likely lose massive sales.

It's not like PC where you can dial up the graphics and go, "Oooooh pretty." before promptly lowering graphics quality for 60 FPS (or higher) gameplay.

Players can feel the difference between 30 and 60 FPS fairly easily. When it matters to gameplay, and when offered a choice, gamers will usually opt for the better performing and thus better "feeling" game.

Just because many popular games are locked to 30 FPS doesn't mean that the many people who play those games don't wish for a better performing version.

Regards,
SB
 
darn, too bad for the Raytracing lovers, the difference in performance between RTX off and RTX on is huge. This is like the Pantene's hair in games. The difference in visuals though is also noteworthy.


I couldn't say which is which if they didn't have labels. Maybe it's my screen (1080p AMOLED phone at the moment), but they look almost identical to me, so RTX feels like a waste of resources.

If they want to sell it, it should make a BIG difference; if I, as a long-term gamer, can't see it as a noteworthy improvement, casuals definitely won't see it either.
 
Nah, most console gamers are clueless, don't be fooled by the forum visitors or DF followers, they are a tiny bit in an ocean of casual couch players.
Link? You may think that, but unless you have data, that's supposition. I know plenty of people who are aware of 'smooth' and not.

Irrelevant question, as almost all beautiful and visually advanced console games are 30fps. You simply can't have strong visuals without that amount of fps, especially if you insist on high resolutions like 2.5K or 4K.
It's exactly the question next-gen console engineers are facing, so it's far from irrelevant. If Console A is released with RTing and 30 fps framerates, and Console B playing the same multiplats without raytracing hits 60 fps, which will buyers prefer? Neither of us has data on that and can only guess.
 
You say that, and yet all the successful FPS and Racing games have all been 60 FPS regardless of
For racing games, 60fps is obtained by sacrificing quality, for example in games like Forza 7. The most graphically advanced racers are 30fps: DriveClub, GT Sport, Forza Horizon 3, Forza Horizon 4, Need For Speed and so on.

For shooters like Call Of Duty and Battlefield (Killzone and Halo maybe?), they obtain 60fps by sacrificing quality and complexity too, especially when it comes to polygon count, physics and resolution. They often look worse than comparable 30fps titles on the same engine.
Just because many popular games are locked to 30 FPS doesn't mean that many people that play those game don't wish for a better performing version of that game.
If Console A is released with RTing and 30 fps framerates, and Console B playing the same multiplats without raytracing hits 60 fps, which will buyers prefer?
Of course people will prefer 60fps, no question about that. That's why I only said "content": console players are usually not up in arms about it. When a game is 30fps (which is most of the time), they're fine with it.
Link? You may think that, but unless you have data, that's supposition.
Indeed, I was just voicing my experience with my console friends.
 
darn, too bad for the Raytracing lovers, the difference in performance between RTX off and RTX on is huge. This is like the Pantene's hair in games. The difference in visuals though is also noteworthy.

It is probably just me, but I prefer the look of the RTX-off version. I don't think just having raytraced reflections is an automatic win in terms of producing beautiful visuals. Beautiful visuals don't have to be correct.
 
Indeed, it could definitely be used in more static, less demanding genres than the highest-profile FPS.
As I said in other posts, I'm excited that we're finally enjoying RTRT in games, so I'm obviously pro-RT in the sense that I appreciate what Nvidia has done now, even though I'm not pro-Nvidia (I honestly don't care about the manufacturer). No, it's not the best solution and it's expensive as fuck, but come on, guys! It's friggin RTRT! Can't we cut them some slack?

Thanks to this step, mainstream developers are implementing and further testing this tech at a larger scale right now. All of this will be used to improve both software and hardware in the future. Should we expect any better, bearing in mind all the circumstances and facts of the current tech/market?

I honestly don't understand the shitstorm against Nvidia. They took a risk and put out better graphics cards with the addition of another hardware solution, which you're totally free not to use if you want to push rasterization further instead. If you don't like the features, you're not forced to buy them either!

Of course I understand all the criticism, but, IMO, let other people praise and enjoy the positive aspects which they already have.
 
, but come on, guys! It's friggin RTRT! Can't we cut them some slack?

Believe me, most people are impressed by the 20x0's standard rasterisation performance; it's a big leap over the previous 10 series. It's a whole new architecture, with RT performance that isn't bad at all for the first hardware implementation on a consumer GPU. I played BFV yesterday and it looks amazing with RT in 64-player MP Conquest maps. No other platform offers this kind of performance even without RT.

RTX is a bit too expensive for me right now, but let's see what AMD and Nvidia might come up with in 2020/2021; with Zen 3 and DDR5 arriving somewhere around 2020, things are going to be interesting.
 