Impact of nVidia Turing RayTracing enhanced GPUs on next-gen consoles *spawn

Yes, but resolution and frame rate don't take dev time and budget away.

True, but there are actually people who want native 4K and at least a stable 60fps, in combination with the highest settings possible. That often requires AMD's or Nvidia's highest-end GPUs.

https://wccftech.com/nvidia-geforce-gtx-1080-success/

"High-End GPUs are becoming more and more popular among PC Gamers"

As long as people buy them, they will produce those products. I myself am in the middle; I don't want to spend $700 on a GPU, but something like a 2070 or 3070 down the line could be an option.
Those who want native 4K or higher, 60fps with the highest settings possible, and perhaps even VR/RT in games like Cyberpunk or the next Halo will probably have to go the PC route. Nothing wrong with that, and nothing wrong either for those who go the console route and don't mind 30fps with checkerboarding and a lesser form of AA. And if you want exclusives on a certain platform, you will have to buy that platform and enjoy them there.
 
I don't understand what you're suggesting. Gamers will buy a high-end GPU because it plays their games at higher quality (resolution and framerate), which devs support because it's basically free: they build their engine with a few parameters and the hardware scales. Gamers wanting to buy a 20xx for higher-quality reflections etc. may find devs don't support those features, because it's added work for them with no return.

Unless devs can sell RT features as a downloadable extra, why would they add to their workload to implement these features? What's in it for them?
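To illustrate the "basically free" point: resolution and framerate scaling mostly comes down to a handful of engine parameters that faster hardware simply soaks up, whereas RT reflections need an entirely new code path. A rough sketch (the struct and preset names here are made up for illustration, not from any real engine):

Code:
// Hypothetical quality presets: the same renderer code runs on all of them,
// only the numbers change, so faster hardware scales "for free".
struct QualitySettings {
    float resolutionScale;  // fraction of native output resolution
    int   shadowMapSize;    // shadow map resolution per light
    float drawDistance;     // world units
    bool  rtReflections;    // needs a separate DXR code path, not just a bigger number
};

constexpr QualitySettings kConsolePreset { 0.75f, 2048,  600.0f, false };
constexpr QualitySettings kHighEndPC     { 1.00f, 4096, 1200.0f, false };
// Flipping rtReflections to true is where the extra dev work begins.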
 
True, but there are actually people who want native 4K and at least a stable 60fps, in combination with the highest settings possible. That often requires AMD's or Nvidia's highest-end GPUs.

That's not the issue. Rendering engines have long been designed with scalability in mind, but supporting RT isn't about scalability; it's about writing new code for a tiny segment of the market with DirectX Raytracing-capable hardware.

Microsoft's own SDK for raytracing is labelled as 'experimental'. :rolleyes:
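For what it's worth, the capability check itself is the trivial part; it's everything behind it (acceleration structures, new shader stages, denoising) that is the new code. A minimal sketch of how the D3D12 feature query looks in the DXR API, assuming you already have a valid device:

Code:
#include <windows.h>
#include <d3d12.h>

// Returns true if the device exposes DirectX Raytracing tier 1.0 or better.
bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    // Anything below TIER_1_0 means falling back to the rasterization-only path.
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}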
 
Not only resolution and fps, but higher AA/AF and settings too (draw distance, shadows etc.). But I agree a 2080 won't be taken advantage of in terms of RT etc.
 
Unless devs can sell RT features as a downloadable extra, why would they add to their workload to implement these features? What's in it for them?

True, but at least someone had to start with RT; it's a good start at least. I didn't even expect RT to happen already, and I do think Nvidia, AMD, and soon Intel will develop it further and release GPUs with RT. It will become more mainstream and more titles will support it.
 
Well, it looks like the 2060 cards are going to be branded GTX and won't have raytracing tech, the last-gen Nvidia cards lasted about two years, and with the prices Nvidia are charging for their RTX line of cards, it's going to be a long way from mainstream.

I wonder what percentage of PC gamers have a $500+ graphics card in their machine?

I have no idea how many PC gamers have high-end gpus. I'm just saying what I'm going to do. If I can't afford a gpu when the new consoles launch, then I'll wait another year or two and buy one. I won't buy a new console if the gpu architecture is seriously outdated.
 
But in the case of RT, even at 1080/60 you'll still get less eye candy than a 4K/30 mode with all the fluid dynamics, GPU particles and whatnot turned on.
Here's a simple comparison:
RT 1080/60
Rasterization 4K:
https://polycount.com/discussion/196941/book-of-the-dead-real-time-unity-demo-teaser
I know which one looks better overall.

So a tech demo looks better than an unreleased game using unreleased hardware, beta drivers and a beta sdk, and that's the standard for how all games will look for the next two to three years?
 
High-end GPUs, that's a 1080, perhaps a 1080 Ti or higher. A 1070 is already much more powerful than what you find in a PS4 Pro, let alone a vanilla PS4. How many own a GTX 660-class GPU or higher?
 
Games trail new engines by years. Epic released Unreal Engine 4.0 in 2012, and the first games didn't release until 2014, and even then only four games, and I bet you've never heard of any of them! Many more games released in 2015, but I bet you haven't heard of 99% of those. It takes a while for new engine features to be adopted. Projects already underway are unlikely to change unless it's a quick, painless win. Even minor changes in engines can result in a lot of time spent on re-testing, and generally you don't want to dive in on something new because you're betting on unproven software.

Doom Eternal is idtech 7. The question asked was whether Doom Eternal would support RT, and the answer was idtech7 supported RT, so I guess that means we'll have to wait and see whether Doom Eternal has it or not.

Edit: Unity and UE will most likely add support for ray tracing features. It might take a while for games to come out, but probably well within the time frame of the next generation of consoles, which will most likely run from 2020 to 2026, or something like that. I'd rather buy a GPU with a reasonable performance level for ray tracing in 2021 or 2022 than buy a new console in 2020 that doesn't support ray tracing in a sufficient manner.
 
Doom Eternal is idtech 7. The question asked was whether Doom Eternal would support RT, and the answer was idtech7 supported RT, so I guess that means we'll have to wait and see whether Doom Eternal has it or not.

True, but Doom is to Bethesda what Crysis is to Crytek: a game to showcase the engine. idtech7 isn't, to the best of my knowledge, available to anybody other than people inside Bethesda. idtech7 on PC also supposedly only supports Vulkan.

High-end GPUs, that's a 1080, perhaps a 1080 Ti or higher. A 1070 is already much more powerful than what you find in a PS4 Pro, let alone a vanilla PS4. How many own a GTX 660-class GPU or higher?

I honestly don't understand what point you're trying to make here?
 
I honestly don't understand what point you're trying to make here?

The high-end GPU market (1070 or higher?) is small in comparison to the low- and mid-end market (GTX 660 and up?), but it's the same with consoles, really: the One X and Pro are a small market compared to the vanilla consoles, which are basically nothing more than a very low-end CPU even for 2013 and a mid-range GPU for 2013. It's a given that console hardware is taken advantage of much better, though.

In this way, consoles share the same problem the PC has: if developers could focus on the Pro and One X as a baseline, games wouldn't have to also run on vanilla hardware and would therefore perform/look better.
 
True, but at least someone had to start with RT; it's a good start at least.
RT started with compute. Sebbbi's Claybook is raytraced. The point here is whether the tech is good enough and viable enough for consumer level use, or is it a pro feature? Because normally, new tech starts with the professionals and is then commoditised down to the consumer a few years later. nVidia putting in their pro-level feature into a gaming card could very well just be a marketing ploy of no real use for a good few years.
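And to be clear about what "raytracing on compute" means: there's no magic hardware required, at its heart it's just intersection math that any shader (or CPU) can run; RT cores only accelerate the equivalent traversal and intersection work in fixed function. A toy ray-sphere intersection in plain C++, purely illustrative (Claybook traces signed distance fields rather than analytic spheres, but the principle is the same):

Code:
#include <cmath>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3  sub(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }

// Distance along the ray to the nearest hit, or -1 on a miss.
// 'dir' is assumed to be normalised. The same math ports directly to a
// compute shader, which is how you trace on GPUs without RT hardware.
float intersectSphere(Vec3 origin, Vec3 dir, Vec3 center, float radius)
{
    Vec3  oc   = sub(origin, center);
    float b    = dot(oc, dir);
    float c    = dot(oc, oc) - radius * radius;
    float disc = b * b - c;
    if (disc < 0.0f) return -1.0f;      // ray misses the sphere
    float t = -b - std::sqrt(disc);     // nearer of the two roots
    return (t > 0.0f) ? t : -1.0f;
}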

I'm not really sure why I'm typing this, actually. It's been mentioned several times before in this thread. New features, like everything DX12, appear in hardware but never get used effectively on first-gen hardware and are always underpowered by the time the features become mainstream a few years later. They're typically only worth anything to professionals who can upgrade their cards every year; accelerated RT is going to be huge for content creators.

The discussion should focus on the viability of RT for consoles, given costs and silicon budgets and alternatives. Perhaps it should even be dropped for now, until something comes along to suggest it features in the next consoles? The last few pages are going off on an odd tangent about raytracing and nVidia GPUs.

Edit: Spawned thread.
 
In this way, consoles share the same problem the PC has: if developers could focus on the Pro and One X as a baseline, games wouldn't have to also run on vanilla hardware and would therefore perform/look better.
What has that got to do with raytracing? :???:
 
As much as this:

Clearly that was a whoosh. RT hardware is only in the highest-tier graphics cards, and highest-tier graphics cards, even the 1080, which came out more than two years ago, represent a tiny fraction of cards in use. Ergo, the likelihood of RT technology being exploited on PC, where you can't buy the cards yet and Microsoft's SDK is experimental, is naively optimistic. Those same factors are stacked against RT hardware appearing in next-gen consoles as well, along with Nvidia not referring to the 2060 part as RTX 2060, as commented above.

You follow?
 
RT started with compute. Sebbbi's Claybook is raytraced. The point here is whether the tech is good enough and viable enough for consumer level use, or is it a pro feature? Because normally, new tech starts with the professionals and is then commoditised down to the consumer a few years later. nVidia putting in their pro-level feature into a gaming card could very well just be a marketing ploy of no real use for a good few years.
...

I really don't see why Nvidia would double their transistor count for a marketing feature. From a financial perspective it doesn't make much sense to me.
 
Clearly that was a whoosh. RT hardware is only in the highest-tier graphics cards, and highest-tier graphics cards, even the 1080, which came out more than two years ago, represent a tiny fraction of cards in use. Ergo, the likelihood of RT technology being exploited on PC, where you can't buy the cards yet and Microsoft's SDK is experimental, is naively optimistic. Those same factors are stacked against RT hardware appearing in next-gen consoles as well, along with Nvidia not referring to the 2060 part as RTX 2060, as commented above.

You follow?
We don't actually know if Nvidia are going to put RT cores in mid-level hardware, do we? I don't remember any announcements regarding the 2060's naming. Granted, it probably doesn't make sense to.
 
You follow?

Saying that high-end PC hardware has a small user base can be countered with the fact that the high-end console user base is small too. High end in a console is GTX 1060-level performance, which in the PC gaming world is now actually closer to the low end.
Thus it affects console users with vanilla hardware as well, as buying a One X or a Pro won't net you more than higher resolutions and fps for the most part.

So baseline hardware for consoles is 7850-level performance; on PC you can stay at that level if you want, by either keeping a GTX 680 or something or getting a 1050-class GPU.
Not many PC gamers had 1060-level performance at the One X's launch, but the same can be said of the One X itself.

On a side note, a 2070 is about midrange, as the 2080 and 2080 Ti fill the role of high-end, in my opinion. Just as a 1070 is more midrange than high-end.
 
I'd rather buy a GPU with a reasonable performance level for ray tracing in 2021 or 2022 than buy a new console in 2020 that doesn't support ray tracing

There's nothing wrong with this way of thinking, but it's been this way for a while now. If your big influencer is tech, then you should definitely stick with PCs.
Raytracing is definitely the future and it's fantastic that Nvidia have a solution already.

When it comes to consoles, though, there is a balancing act that has to happen. If they can implement it in next-gen consoles, it would be fantastic for PC because the amount of time and budget given to raytracing would be massively greater.

It feels like just yesterday that people were saying it would be a struggle to get 12 TFLOPS out of the next-gen APU, and that 12 TFLOPS wouldn't be enough of a step up from the current gen (I feel it will be enough). So to think they will have enough silicon to add raytracing tech is ambitious?
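On the 12 TFLOPS point, the arithmetic itself is simple: peak FP32 for a GCN-style GPU is compute units × 64 lanes × 2 FLOPs per clock × clock speed. The numbers below are hypothetical, purely to show the kind of configuration that lands around 12:

Code:
// Back-of-the-envelope peak FP32 throughput for an AMD-style GPU.
// 56 CUs at 1.7 GHz are made-up numbers, not a leak or a prediction.
const int    computeUnits  = 56;
const int    lanesPerCU    = 64;    // shader ALUs per compute unit
const int    flopsPerClock = 2;     // one fused multiply-add per lane per clock
const double clockGHz      = 1.7;

const double teraflops = computeUnits * lanesPerCU * flopsPerClock * clockGHz / 1000.0;
// 56 * 64 * 2 * 1.7 / 1000 ≈ 12.2 TFLOPS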
 