Impact of Nvidia Turing ray tracing-enhanced GPUs on next-gen consoles *spawn

"Agreement across the board of people who just want console graphics at higher framerates/resolution".

The complainers are always the loudest.


Oh you know me so well.

Honestly, what is your problem?? (And DavidGraham)

Literally everyone here only wants next gen consoles to be as powerful, and as capable of producing stupidly good graphics as humanly possible.

We discuss for years on end how the manufacturers should (or should not) make humongous losses on hardware, because ALL WE WANT is to play games at the best possible quality on the hardware we will inevitably get.

NOBODY here wants bad graphics, or anything higher than 60fps (on consoles anyway).

Just bloody relax and learn how to accept criticism without being a douche.

All the “critics” are saying is that RIGHT NOW the trade off with RTX doesn’t look that hot, and BFV only proves that point. While all you do is judge “console gamers” as inferior beings to you, for the crime of stating their well deserved opinions.

Just stop
 
How and when rays are cast is pretty much in the hands of the devs, as well as what to do once you get your hit. That part is pretty open, which is nice.
The black-boxy part is the BVH acceleration structure. As it is right now, you just hand your world geometry to DX12, it builds a BVH however the vendor's driver sees fit, and you have no control over it and don't even know what it looks like.
This might seem like a detail, but in the world of ray tracing optimization, the way you build your acceleration structures is one of the richest areas of study, which is why I'd personally like to see more devs experimenting with it.
My other concern is how performant mixing different primitive types within your trace (triangle mesh / SDF volume / heightfield / screen-space G-buffer / shadow maps) is in the real world. As of now I know of no information on that.
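To make the "black box" point concrete, here's roughly what handing geometry to DXR looks like on the app side. This is only a minimal sketch (single triangle geometry, bottom-level build; resource names like vertexBuffer/scratch/blasResult are placeholders), but note that beyond a few coarse build flags there's nothing here that lets you influence how the BVH is actually constructed:

```cpp
// Minimal sketch of a DXR bottom-level acceleration structure build.
// Resource names are placeholders; scratch/result buffers are assumed to be
// sized via GetRaytracingAccelerationStructurePrebuildInfo beforehand.
#include <d3d12.h>

void BuildBLAS(ID3D12GraphicsCommandList4* cmdList,
               ID3D12Resource* vertexBuffer, UINT vertexCount,
               ID3D12Resource* indexBuffer, UINT indexCount,
               ID3D12Resource* scratch, ID3D12Resource* blasResult)
{
    // Describe the input geometry: this is essentially all the app provides.
    D3D12_RAYTRACING_GEOMETRY_DESC geom = {};
    geom.Type = D3D12_RAYTRACING_GEOMETRY_TYPE_TRIANGLES;
    geom.Flags = D3D12_RAYTRACING_GEOMETRY_FLAG_OPAQUE;
    geom.Triangles.VertexBuffer.StartAddress = vertexBuffer->GetGPUVirtualAddress();
    geom.Triangles.VertexBuffer.StrideInBytes = 3 * sizeof(float);
    geom.Triangles.VertexFormat = DXGI_FORMAT_R32G32B32_FLOAT;
    geom.Triangles.VertexCount = vertexCount;
    geom.Triangles.IndexBuffer = indexBuffer->GetGPUVirtualAddress();
    geom.Triangles.IndexFormat = DXGI_FORMAT_R32_UINT;
    geom.Triangles.IndexCount = indexCount;

    // The only control over the build is a handful of coarse hint flags
    // (prefer fast trace vs. fast build, allow update, etc.).
    D3D12_BUILD_RAYTRACING_ACCELERATION_STRUCTURE_INPUTS inputs = {};
    inputs.Type = D3D12_RAYTRACING_ACCELERATION_STRUCTURE_TYPE_BOTTOM_LEVEL;
    inputs.Flags = D3D12_RAYTRACING_ACCELERATION_STRUCTURE_BUILD_FLAG_PREFER_FAST_TRACE;
    inputs.DescsLayout = D3D12_ELEMENTS_LAYOUT_ARRAY;
    inputs.NumDescs = 1;
    inputs.pGeometryDescs = &geom;

    // The driver builds the BVH however it sees fit; its internal layout is
    // opaque to the application.
    D3D12_BUILD_RAYTRACING_ACCELERATION_STRUCTURE_DESC build = {};
    build.Inputs = inputs;
    build.ScratchAccelerationStructureData = scratch->GetGPUVirtualAddress();
    build.DestAccelerationStructureData = blasResult->GetGPUVirtualAddress();
    cmdList->BuildRaytracingAccelerationStructure(&build, 0, nullptr);
}
```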

Thanks a lot for the info.

I should probably try and find time to dig into this instead of asking too many questions. Right now, it sounds like a general solution that might not be optimal for any one particular case. I suppose the question of allowing developer control over optimisation might be seen as a risk (too many ways to break it), and there's always the question as to whether flexibility slows the solution down more than the ability to optimise would speed it up.

Whatever the case, on console it seems likely that more of the hardware's features would be exposed, which in the long term would lead to more optimal results.
 
I'm in agreement that all parties could take a short break and give the debate and arguments some time to settle before continuing. As always, we need to focus on debating the argument and not directly attacking the posters who put the argument forward.

That being said, some break might be necessary to just go through some of the arguments brought up and attempt to understand them. At this point in time I'm seeing a lot of desire to respond, but not a lot of desire to understand.
 
Just what are "console graphics", given that PC and consoles share architectures, assets and shaders, and given that the 1X is faster than the top 4 GPUs in the Steam survey, and faster than 15 of the top 20? Bleedin' "Haswell" graphics are at number 19!

Trying to separate graphics quality into "PC" and "console", especially when trying to use those as weaponised, loaded terms, is pretty destructive.

And frankly, if you're running "console" graphics at higher frame rates and resolutions than any console can muster, then they're by definition not console graphics. Another reason not to get caught up in the semantics of platform wars.
It's about fidelity. A console game running with better performance is just a console game running with better performance. Compare that with Crysis when it was released back in 2007: it was leapfrogging console games' fidelity by quite a margin. Ray tracing in BFV is similar to that, but not as extreme.

Honestly, what is your problem?? (And DavidGraham)

Literally everyone here only wants next gen consoles to be as powerful, and as capable of producing stupidly good graphics as humanly possible.

We discuss for years on end how the manufacturers should (or should not) make humongous losses on hardware, because ALL WE WANT is to play games at the best possible quality on the hardware we will inevitably get.

NOBODY here wants bad graphics, or anything higher than 60fps (on consoles anyway).

Just bloody relax and learn how to accept criticism without being a douche.

All the “critics” are saying is that RIGHT NOW the trade off with RTX doesn’t look that hot, and BFV only proves that point.

The end.
I'm simply putting the criticism in context. And really, just because a bunch of people believe something doesn't mean it's true or that everybody else should believe it too.

If your thing is pixel quality, not quantity AND you have the money for it, then why wouldn't it be worth it?
 
I'm simply putting the criticism in context. And really, just because a bunch of people believe something doesn't mean it's true or that everybody else should believe it too.

If your thing is pixel quality, not quantity AND you have the money for it, then why wouldn't it be worth it?

No, I just put it into context. This is a console thread, about the impact of RTX on next gen consoles. What exactly is your contribution here?

Let the “stupid console gamers” discuss things in peace; you can continue posting in the hundreds of RTX threads you can find on B3D.
 
It's about fidelity. A console game running with better performance is just a console game running with better performance. Compare that with Crysis when it was released back in 2007: it was leapfrogging console games' fidelity by quite a margin. Ray tracing in BFV is similar to that, but not as extreme.

As pointed out, fidelity is often lower for a PC gamer than a console gamer. And "a console game running with better performance is just a console game running with better performance" is simply a truism in place of an argument. It's noise, and the appearance of meaning without the presence of any.

On most PCs of the day Crysis did not leapfrog "console fidelity" at all. It ran like shit even with settings pared back, and Crytek later talked about all the ways in which they'd been able to improve performance, slash the number of individual buffers needed and save large amounts of memory in subsequent versions of the engine. Don't confuse fidelity and performance, or fidelity and requirements.

And certainly don't misrepresent "PC" as being a fringe of a vast market, then disingenuously set that against non-specific "console graphics" as a way of FUD-bombing an entire discussion about something completely different: the performance of current ray tracing implementations and their potential usefulness to, and implementation in, next-gen consoles.

"PC fidelity". "Console graphics." FFS. We have a new generation of consoles on the horizon and many of us have very little thinking time in the day after the bullshit of work, bills and taxes.
 
And what credentials does this YouTube channel have?

Perhaps some hurt console gamers that wanted RTX in their version but didn't get it? :D

That's an oft-repeated piece of general ignorance, but my poll here suggests very much otherwise. Unless someone has really good data on PC gamer preferences, it'd be best for us all to stop repeating this as if it were fact. For myself, I'm going to stop believing it until presented with some good evidence.

A poll on a forum mostly crowded by pro-console people, yes. People on PC generally seem to want the highest frame rates possible, especially in multiplayer; even in BF4 there are 144 Hz servers, and I don't think I've ever seen such a thing on console. FPS is a bigger thing on PC.

Just what are "console graphics", given that PC and consoles share architectures, assets and shaders, and given that the 1X is faster than the top 4 GPUs in the Steam survey, and faster than 15 of the top 20? Bleedin' "Haswell" graphics are at number 19!

Do more people own a One X than a 1060/RX 580 GPU or higher?

Literally everyone here only wants next gen consoles to be as powerful, and as capable of producing stupidly good graphics as humanly possible.

We discuss for years on end how the manufacturers should (or should not) make humongous losses on hardware, because ALL WE WANT is to play games at the best possible quality on the hardware we will inevitably get.

NOBODY here wants bad graphics, or anything higher than 60fps (on consoles anyway).

Just bloody relax and learn how to accept criticism without being a douche.

All the “critics” are saying is that RIGHT NOW the trade off with RTX doesn’t look that hot, and BFV only proves that point. While all you do is judge “console gamers” as inferior beings to you, for the crime of stating their well deserved opinions.

Just stop

Others are doing exactly the same thing, but the other way around: every new tech and graphics development on PC is discredited as a bad implementation. You guys are even digging for YouTube channels that think RT wasn't worth it, while there are also YouTube channels out there that think it really enhances graphics in the way it's advertised.

Of course everyone wants the PS5 to be as powerful as possible, so that the market isn't held back again for another seven years or more. There's nothing to gain from hoping the PS5 will be a low- to mid-end hardware console like the PS4 was.

I don't think anyone is saying console gamers are inferior. I have both console and PC, and both have their place: consoles for their exclusives, and that's it. For anything else, I personally think I'm better off on PC.
 
As pointed out, fidelity is often lower for a PC gamer than a console gamer.

Isn't it the other way around? I mean, personally I have my consoles not for the graphics but for the games I can't get on PC. I have my PCs if I want the newest tech and graphics, the highest frame rates, image quality, etc.
Are there people here really getting a PS4 for anything other than its exclusives? Perhaps ease of use?

On most PCs of the day Crysis did not leapfrog "console fidelity" at all.

It sure did on my 8800 GTS/Q6600. I couldn't run the highest settings, but even pared back somewhat, and still at a higher resolution than most PS3/360 titles were running, it looked better than anything else out at the time. The consoles got a version too, but it was a far cry from what Crysis was.

bullshit of work, bills and taxes.

What?
 
All the “critics” are saying is that RIGHT NOW the trade off with RTX doesn’t look that hot, and BFV only proves that point.
That's precisely the point. What is the definition of "critic" here? A console gamer? A YouTuber? A performance reviewer?

Your statement here is factually incorrect. I can link you to several outlets right now that believe the visual impact of ray tracing is great, and to several developers who think this is the start of a good thing and that things will only improve from this point onward.

What's wrong here is you trying to generalize your own predetermined convictions as the only truth. You are entitled to your opinion, but to broadcast it as the public consensus is simply a crappy argument, especially at this early stage of the tech, when you and others lack a deeper understanding of the possibilities here and the inner workings of the implementation.
 
What's wrong here is you trying to generalize your own predetermined convictions as the only truth. You are entitled to your opinion, but to broadcast it as the public consensus is simply a crappy argument, especially at this early stage of the tech, when you and others lack a deeper understanding of the possibilities here and the inner workings of the implementation.

This is nothing more than seeing and reading what you want to see and hear. Anyone can filter out the negative reviews and claim they're right. What I did was play the new Battlefield myself, on a 2080 Ti-powered system with RT enabled, with the One X version running in the same location a few metres away. At first sight, from a distance, you don't really see it, especially in MP where everything is moving so fast. But the difference is really there: quite a big difference in the water, reflections and lighting.

You have to keep in mind it's the very first game, not even designed with RT or the new GPUs in mind. It will be interesting to see how the new Metro does with RT.

Even without RT, there are notable differences. Yes, a PC will cost you much more, but that's not my point.

 
What's wrong here is you trying to generalize your own predetermined convictions as the only truth. You are entitled to your opinion, but to broadcast it as the public consensus is simply a crappy argument, especially at this early stage of the tech, when you and others lack a deeper understanding of the possibilities here and the inner workings of the implementation.
What on earth are you talking about?

Me, trying to generalise my own predetermined convictions? Come off it.

My only “predetermined conviction” was, and still is, that ray tracing rocks, and all I (and most others) want is to play games that look as real/CGI/correct as possible. RT is probably the way to get there.

You can’t tell me what you want me to think when I see the only RT game so far looking OK, considering how badly it runs with RT enabled.

Again, this thread is about THE IMPACT OF RT GPUS ON NEXT GEN CONSOLES. This is literally what we’re meant to discuss. Things will improve, of course, but so far we can only look at today’s results and form opinions based on that, and talk about how this will fit in next gen consoles, and most importantly whether it’s all worth it or not.

My opinion right now, formed after seeing the ACTUAL results so far with something priced at two or three times as much as a console, is MEH.

That’s what this thread is about.
 
A poll on a forum mostly crowded by pro-console people, yes. People on PC generally seem to want the highest frame rates possible, especially in multiplayer; even in BF4 there are 144 Hz servers, and I don't think I've ever seen such a thing on console. FPS is a bigger thing on PC.
Evidence! You may well be right, but without evidence you're just spouting an opinion rather than a factual argument. I wouldn't bet anything on my poll being representative, but faced with the question, "do PC gamers value framerate more than console gamers," I did what little I could to ascertain meaningful data rather than just spouting unvalidated opinion.

Turning my mod-hat up to full power: I'm starting to feel the tech forum needs some stronger moderation, because these tech discussions are getting very watered down.
 
Evidence! You may well be right, but without evidence you're just spouting an opinion rather than a factual argument.
I appreciate the intention here on the polls. But the results would largely be inconclusive due to the data set itself.

There are far too many variables, and to make generalized statements we need large data sets that aren't found here.
 
I appreciate the intention here on the polls. But the results would largely be inconclusive due to the data set itself.

There are far too many variables, and to make generalized statements we need large data sets that aren't found here.

And with that, neither of us has real proof. On PC, though, you certainly can achieve much higher frame rates, which can be especially important for esports, whilst on console you're more limited. There are people who can't live with 30 or even 60 fps, and you won't find them on consoles; that's why I think PC gamers are more into the FPS thing.
 
Others are doing exactly the same thing, but the other way around: every new tech and graphics development on PC is discredited as a bad implementation. You guys are even digging for YouTube channels that think RT wasn't worth it, while there are also YouTube channels out there that think it really enhances graphics in the way it's advertised.

That's not really how I've perceived the stance of those of us who are more sceptical of an RTX2080 approach to the next-gen consoles, and I think we might all be off on the wrong foot here.

Here's how I see the gist of the sceptics:

-- Nvidia's first crack at RTRT is obviously only the tip of the iceberg, and so far it's big and expensive.

-- BFV looks better with RT, but not night and day. Admittedly, it's early days, but we can only work with what we have.

-- Nvidia, and the PC space in general, can afford to do fun and mad shit like a ~775mm2 die.

-- If the PS4 and XBoxOne are anything to go by, we're looking at another ~350mm2 APU.

-- We don't necessarily think that a 350mm2 APU should barely increase in traditional rendering power when RTRT is nascent, and only two cards have been released from a single company.

-- Maybe the move to 7nm will put at least RTX2070 performance within reach of the next-gen consoles, but would a ~7.5TF GPU cut it?

-- Maybe Nvidia will improve the architecture to the point where there's a lessened sacrifice of rasterisation, in which case it - or AMD's facsimile - starts to become the more obvious choice.

-- Maybe techniques utilising Nvidia's current iteration of RTX will prove so beneficial that any sacrifice of rasterisation performance is worth it, but if there is any sort of "black box," those techniques would be hastened by letting devs dabble with said box.

So when I refer to the next-gen consoles and say I would expect a more flexible, less performant approach to RTRT, it's because I think we're more likely to see a focus on the relatively known quantity of rasterisation, with some tweaks to hardware that can allow developers to dabble in RTRT.

I don't think anyone here doubts ray tracing, or really even Nvidia's architecture given that it does something that, just a few years ago, we could only dream of whilst nursing semis. It's just early days, maybe too early to influence the next-gen consoles.
 
You can’t tell me what you want me to think when I see the only RT game so far looking OK, considering how badly it runs with RT enabled.
-- BFV looks better with RT, but not night and day. Admittedly, it's early days, but we can only work with what we have.
Unless the game is set in tight metallic corridors full of glass, mirrors and all the shiny things in the world, reflections have little to add to the overall image quality.

Battlefield is an open game; no reflection technology will transform the look of it, even if that technology is cranked to 11. However, the RTX reflections in Battlefield V are impressive for the sheer volume of accurate reflections on most surfaces, even characters' eyes. DICE could have easily slapped RTX reflections on some water puddles or wet surfaces and called it a day; performance would be much higher and amateurs would sing the praises of RTX. But DICE didn't do that; instead they opted to take ray-traced reflections to a whole new level. That's how you push the boundaries of PC graphics.

Again, this thread is about THE IMPACT OF RT GPUS ON NEXT GEN CONSOLES.
The impact is quite clear, despite what the stubborn-headed ray tracing opposition may think: reduce ray tracing effects to the low levels "as seen in BFV", maybe even a tad lower than that, optimize the heck out of it, then stick to a lower resolution and upscale, and hybrid ray-traced games will be possible and very much doable on something like an RTX 2070 shrunk down to 7 nm with the frequency boost that comes courtesy of 7 nm. Not rocket science.
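For illustration only, here's a minimal sketch of the "lower resolution and upscale" part, assuming a DXR pipeline whose shader tables have already been filled in. The function and parameter names are placeholders of mine, not anything from BFV or DICE:

```cpp
// Sketch: trace reflection rays at half the output resolution, then rely on a
// later upsample/denoise pass. The shader table fields of `rays` are assumed
// to be already populated from a real raytracing pipeline state object.
#include <d3d12.h>

void DispatchHalfResReflectionRays(ID3D12GraphicsCommandList4* cmdList,
                                   D3D12_DISPATCH_RAYS_DESC rays,
                                   UINT outputWidth, UINT outputHeight)
{
    // One ray per 2x2 block of output pixels: a quarter of the full-res ray count.
    rays.Width  = outputWidth / 2;
    rays.Height = outputHeight / 2;
    rays.Depth  = 1;
    cmdList->DispatchRays(&rays);
    // A bilateral or temporal upsample pass would then reconstruct the
    // full-resolution reflection buffer before compositing.
}
```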
 