Impact of nVidia Turing RayTracing enhanced GPUs on next-gen consoles *spawn

Status
Not open for further replies.
Ray tracing will open up so many artistic opportunities that it'll be a huge shame if consoles are stuck without it for another full generation.

Edit: Actually, thinking about it more, I wouldn't touch next-gen consoles if they launch without ray tracing, because there's a good chance of a mid-gen upgrade console, like PS4 Pro and One X, that would support it.
 
it may already be flexible - from what we've read so far, it's some additions to the Turing SM.

The tensor core is probably the only fixed-function hardware in there, which is not necessarily directly related to RT at all. They were leveraging it for an improved AA technique.

Aren’t the tensor cores used for denoising?
 
Edit: Actually, thinking about it more, I wouldn't touch next-gen consoles if they launch without ray tracing, because there's a good chance of a mid-gen upgrade console, like PS4 Pro and One X, that would support it.

That is what I'd consider a genuine next generation happening in 2023/2024, since the 2020 generation likely won't have forwards compatibility.
 
I think that is one application of it. But I don’t think all the games were leveraging denoising. I could be wrong though.

That may be true. But currently it seems AI-based denoising is the primary motivator for the tensor cores' inclusion. Tensor cores seem a little much for an AA solution. They could have spent that die area on more general-purpose hardware if that's all they were for.
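As a rough illustration of what a denoiser buys you: real-time ray tracers shoot very few rays per pixel, so the raw image is noisy, and a filter reconstructs a clean result. The real Turing path runs a trained neural filter on the tensor cores; the naive box filter below is just a hand-rolled stand-in to show the idea.

```python
import numpy as np

def box_denoise(img, radius=1):
    """Naive box-filter denoiser: average each pixel with its neighbours.
    Real-time ray tracers use far smarter edge-aware/temporal filters (or a
    trained neural denoiser), but the goal is the same: recover a clean
    image from roughly one noisy sample per pixel."""
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            out[y, x] = img[y0:y1, x0:x1].mean()
    return out

rng = np.random.default_rng(0)
clean = np.full((32, 32), 0.5)                      # "ground truth" shading
noisy = clean + rng.normal(0.0, 0.2, clean.shape)   # ~1 spp Monte Carlo noise
denoised = box_denoise(noisy, radius=2)

# The filtered image lands much closer to the clean one than the noisy input.
print(np.abs(noisy - clean).mean() > np.abs(denoised - clean).mean())  # True
```

A 5x5 box average cuts the noise standard deviation by about 5x here; the trade-off (which the smarter filters exist to avoid) is that it would also blur real edges in an actual rendered image.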
 
Are there any updates (or rumors) about AMD's Super-SIMD designs possibly showing up in the Navi architecture? Because the current GCN CUs don't support that type of instruction or ALU setup.
 
I'm not touching ray tracing until consoles can run it at 4K or at least CBR 4K; I'm not sure I want to put up with 1080p on my 4K TV in 2020+ :). If the mid-gen consoles do come with RT units, then I hope there are options for mode select.
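For anyone unfamiliar with CBR (checkerboard rendering): each frame shades only half the pixels in an alternating checker pattern and fills the rest from the previous frame, roughly halving the shading cost of native 4K. Real engines also reproject the reused pixels with motion vectors; this sketch skips that.

```python
import numpy as np

def cbr_frame(new_shading, prev_frame, frame_idx):
    """One checkerboard-rendering step: shade the pixels on one 'colour'
    of a checkerboard this frame, keep the other half from last frame."""
    h, w = prev_frame.shape
    yy, xx = np.mgrid[0:h, 0:w]
    mask = (yy + xx) % 2 == frame_idx % 2   # alternates every frame
    out = prev_frame.copy()
    out[mask] = new_shading[mask]           # only half the pixels shaded
    return out

target = np.arange(16.0).reshape(4, 4)      # the "true" fully shaded image
frame0 = cbr_frame(target, np.zeros((4, 4)), 0)
frame1 = cbr_frame(target, frame0, 1)

# On a static image, two frames cover every pixel.
print(np.array_equal(frame1, target))  # True
```

The catch, and why some people still prefer native 4K, is that the reused half is a frame old, so fast motion and disocclusions need the reconstruction tricks omitted here.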
 
Reality is for any competitive multiplayer game, I’d turn everything to low to get the highest FPS possible. But I’d take 1080p60 with more eye candy over 4k60 with less eye candy for any single player game.

I won’t be buying next generation consoles if they don’t include whatever hardware features are available on current gpus at the time. I’ll be buying a new video card.
 
Reality is for any competitive multiplayer game, I’d turn everything to low to get the highest FPS possible. But I’d take 1080p60 with more eye candy over 4k60 with less eye candy for any single player game.

I won’t be buying next generation consoles if they don’t include whatever hardware features are available on current gpus at the time. I’ll be buying a new video card.
But in the case of RT, even at 1080/60 you'll still get less eye candy than a 4K/30 mode with all the fluid dynamics, GPU particles and whatnot turned on.
Here's a simple comparison:
RT 1080/60
Rasterization 4k
https://polycount.com/discussion/196941/book-of-the-dead-real-time-unity-demo-teaser
I know which one looks better overall.
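A quick back-of-envelope on that trade-off: the 4K/30 mode pushes exactly twice as many shaded pixels per second as the 1080/60 RT mode, and that's before counting the per-pixel cost of tracing rays, which is why the RT mode has to give up effects elsewhere.

```python
def pixels_per_second(width, height, fps):
    """Shaded-pixel throughput a renderer must sustain for a given mode."""
    return width * height * fps

rt_1080p60  = pixels_per_second(1920, 1080, 60)   # ray-traced mode
raster_4k30 = pixels_per_second(3840, 2160, 30)   # rasterized mode

print(rt_1080p60)                 # 124416000
print(raster_4k30)                # 248832000
print(raster_4k30 / rt_1080p60)   # 2.0
```

So even at half the frame rate, the 4K mode shades 2x the pixels, and each RT pixel is far more expensive than each raster pixel on top of that.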
 
I won’t be buying next generation consoles if they don’t include whatever hardware features are available on current gpus at the time.

Well, it looks like the 2060 cards are going to be branded GTX and won't have ray-tracing tech. The last-gen Nvidia cards lasted about two years, and with the prices Nvidia is charging for their RTX line of cards, it's going to be a long way from mainstream.

I wonder what percentage of PC gamers have a $500+ graphics card in their machine?
 
We have no idea what prices will do, not for consoles either. Two years is also a good timeframe for a new GPU line with refined RT, probably available in lower-end products too. AMD will have to follow suit sometime as well.
In two years I'll get the 3070 or something.

Also, some people are willing to pay more for newer/better tech. Personally I want RT, as many games already have it; it's the new way of rendering.
How many people have a 1070 Ti or better in their machines doesn't affect my or his experience.
 
id Tech 7 supports ray tracing. Adoption is going to be quick.

Games trail new engines by years. Epic released Unreal Engine 4.0 in 2012, and the first games didn't release until 2014 - and even then only four games, and I bet you've never heard of any of them! Many more games released in 2015, but I bet you haven't heard of 99% of those either. It takes a while for new engine features to be adopted. Projects already underway are unlikely to change unless it's a quick/painless win. Even minor changes in engines can result in a lot of time spent on re-testing, and generally you don't want to dive in on something new because you're betting on unproven software.
 
Yet even Digital Foundry was amazed by the number of games supporting ray tracing. It would be cool if Cyberpunk had it implemented for those who want it.
 
How many people have a 1070 Ti or better in their machines doesn't affect my or his experience.

It affects the devs. What's the point of spending part of your financial and time budget on something that's only going to be used by a small audience?
 
It affects the devs. What's the point of spending part of your financial and time budget on something that's only going to be used by a small audience?

A 1080 Ti today won't be taken advantage of 100% like a console will, but if you want the newest AAA games at 4K 60 fps max settings, a 1080 Ti is at least needed for most titles. Seeing what the Cyberpunk demo was running on, I don't doubt an RTX 2080 Ti will be needed at minimum to enjoy native 4K 60 fps at maxed settings. If RT comes into play (that is, if RT doesn't get abandoned by Nvidia/AMD), you will need a high-end GPU even more so.

Small audience compared to what? Your consoles, maybe, but there are quite a lot of people with a 1060/1070, which is at least a match for the One X's GPU performance. How many own a One X versus a 1060/RX 570/GTX 970 or better?
 