Impact of nVidia Turing RayTracing enhanced GPUs on next-gen consoles *spawn

It's not about power but how they operate. If compute shaders can be made to perform BVH intersection tests well, RT can be done on compute; that's the only part RTX is accelerating. Or, alternatively, perhaps compute won't accelerate the intersection tests themselves, but other aspects of the resolve could be accelerated, meaning the final result is just as good.
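To make that concrete, here is a minimal CPU-side sketch in C++ of the two operations being discussed: the ray/AABB "slab" test and an iterative BVH walk. It is only an illustration of what a compute-shader port would have to express in shader code (and what RTX instead moves into fixed-function units); the node layout, names and numbers are invented for the example, not anyone's actual implementation.

```cpp
// Illustrative only: a flattened BVH with a hypothetical node layout.
#include <cstdio>
#include <vector>
#include <algorithm>   // std::min, std::max
#include <utility>     // std::swap

struct Vec3 { float x, y, z; };
struct Ray  { Vec3 origin, invDir; };   // invDir = 1/direction, precomputed
                                        // (a large sentinel stands in for 1/0)
struct Node {
    Vec3 bmin, bmax;     // axis-aligned bounding box
    int  left, right;    // child indices, or -1 when this is a leaf
    int  primitive;      // primitive index when this is a leaf
};

// Classic "slab" test: clip the ray against the three pairs of axis-aligned planes.
bool hitAABB(const Ray& r, const Vec3& bmin, const Vec3& bmax) {
    float t0 = 0.0f, t1 = 1e30f;
    const float o[3]  = { r.origin.x, r.origin.y, r.origin.z };
    const float id[3] = { r.invDir.x, r.invDir.y, r.invDir.z };
    const float lo[3] = { bmin.x, bmin.y, bmin.z };
    const float hi[3] = { bmax.x, bmax.y, bmax.z };
    for (int a = 0; a < 3; ++a) {
        float tNear = (lo[a] - o[a]) * id[a];
        float tFar  = (hi[a] - o[a]) * id[a];
        if (tNear > tFar) std::swap(tNear, tFar);
        t0 = std::max(t0, tNear);
        t1 = std::min(t1, tFar);
        if (t0 > t1) return false;      // slabs no longer overlap: miss
    }
    return true;
}

// Iterative traversal with a small explicit stack -- the pattern a compute-shader
// port would use, since shaders have no recursion.
int traverse(const std::vector<Node>& bvh, const Ray& r) {
    int stack[64];
    int sp = 0;
    stack[sp++] = 0;                    // start at the root
    while (sp > 0) {
        const Node& n = bvh[stack[--sp]];
        if (!hitAABB(r, n.bmin, n.bmax)) continue;
        if (n.left < 0) return n.primitive;   // leaf: report the hit candidate
        stack[sp++] = n.left;
        stack[sp++] = n.right;
    }
    return -1;                          // nothing hit
}

int main() {
    // Tiny three-node tree: a root box containing two leaf boxes.
    std::vector<Node> bvh = {
        { {0, 0, 0}, {4, 4, 4},  1,  2, -1 },
        { {0, 0, 0}, {1, 1, 1}, -1, -1,  7 },
        { {3, 3, 3}, {4, 4, 4}, -1, -1,  9 },
    };
    Ray r { {0.5f, 0.5f, -1.0f}, {1e30f, 1e30f, 1.0f} };  // shooting along +z into leaf 1
    printf("hit primitive: %d\n", traverse(bvh, r));       // prints 7
    return 0;
}
```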

When ATI/AMD introduced unified shaders, nVidia said they were slower than separate vertex and pixel shaders, which is why the 7800 series didn't have them. Now we all use unified shaders, and indeed their evolution into compute...
Oh, I see, we're going back to wishing for magical algorithms that run just as well as fixed-function hardware.

The Titan V struggles at emulating a high-level implementation of ray tracing on PC, on an API for Windows 10 that is so generic it probably misses dozens of optimisation opportunities it could have had were DICE building a game from the ground up with it as a baseline, and it probably also forces an approach that is less optimal than the one the developers would have chosen themselves.
Commercial ray tracing software has seen big improvements thanks to RTX. You're not going to call those unoptimized, are you?

Got to love it when people see or imagine whatever they want just to suit their agenda... Everybody knew from the get-go that SVOGI wasn't going to be a reality in 2012, even Epic. It was simply used as a marketing tool to prop up UE4, which was in development hell. Nobody had done anything with SVOGI at the time, as the technique had literally been invented by Cyril Crassin (in collaboration with Fabrice Neyret, Miguel Sainz, Simon Green and Elmar Eisemann) only a few months prior to UE4's announcement. Once Crassin became a full-time Nvidia employee (by the end of 2011), Nvidia approached Epic and pitched it to them as a joint marketing effort... Epic didn't magically develop Lightmass as a replacement in a few weeks once "SVOGI" was scrapped because they suddenly discovered it would be impossible to run on consoles or even high-end PCs. Seriously.
Those companies (Nvidia, Epic, AMD, Intel or whoever) are all about marketing BS to sell their products and make money. They don't care whether this technique is better than that one; they only care about which one runs/works better on their product. Nvidia promoted the hell out of SVOGI for years because that (particular implementation) was theirs and worked best on their GPUs. The new buzzword is RT, and Nvidia built dedicated HW to accelerate a single part of it, so now they're going to BS their way into claiming it's the best thing since sliced bread (while continuing to work on SVOGI, and they literally released VXGI 2.0 on the day DXR was unveiled, btw... but quietly, so as not to mess up the marketing message). You only need RT "hardware" if you want to do RT the way Nvidia wants you to do it (and this applies to all companies).
"Everybody knew..."

Hindsight is 20/20, isn't it?

Of course, the major difference between SVOGI and DXR is that DXR is a standardized API while SVOGI is just one specific technique. If the consoles don't support it, that would be AMD holding a whole generation back, and they would deserve to crash and burn for it.
 
Can we get back to the topic at hand or take all of this off-topic somewhere else?

If it doesn't concern the impact on next-gen consoles it shouldn't be in this thread.
 
So you are predicting that the trend will shift. Why? The odds are against you; the natural bet is that things will behave as they have before.

PC graphics in general have always been ahead of console graphics; that's simple logic, nothing more.

Weak compared to what? To premium high-end cards of their time?

Current-gen consoles were rather weak in every aspect, imo. The 7850 is a lower-mid-tier GPU from 2012, at least a year ahead of the consoles. A 7970 GHz Edition with 3 or 6 GB could be had at least a year before the PS4 saw the light of day, and it's so much faster it's not even funny.
Premium high-end cards sound more like an R9 290X, or even higher than that, a Titan Black or something. The AMD Fury wasn't far off either.
Their CPUs aren't even worth discussing; the RAM was perhaps the best spec, but 8 GB total isn't that thrilling either.

And as such, it will remain an extra "ultra" optional feature, and never a fundamental pillar of a game's engine and content design.

Perhaps for Sony in a potential Pro version of the PS5. If MS is going to push DXR further, the PS5 will be lagging behind.

...financed by the IHV that wants to use it for marketing.

Or play on console, where you get to live with lower graphics settings, lower framerates and lower resolutions, and, since 2018, missing RT features.

That's how it's been for the last two decades.

I have seriously never seen anyone state that a console is technologically ahead of the PC in any way; it has always been the other way around, especially a few years into a console generation.

You are not thinking straight. You are engaging in wishful thinking.

And you are not? One shouldn't think so rigidly either; there's no world where consoles reign supreme, nor one where the PC does.

Never say never. But again, even if it remains an Ultra option: people buy high-end GPUs for that Ultra option, developers build these Ultra options for the people seeking them, and for their games to still look good years after launch. Which means DXR will still get wider adoption.

He's also forgetting that margins are much bigger in the PC gaming market; there's a reason Nvidia is doing much better than AMD, even though they left the console space a long time ago. He's also forgetting that MS is taking a different strategy this coming gen, and already every Xbox game is on Windows. If their biggest title, Halo, is designed with PC in mind, more and more titles could follow that lead, which means DXR on PC, among other options.


I had already written my post when I got your message :) Some of my comments may be off-topic, but I won't continue.
 
I will refrain from talking to a person that completely misinterprets what I say. You are arguing against points I never made. If you disagree, re-read my posts. If it does not help, read better.
 
As far as I've been able to find by skimming through this thread, there has been no clear indication that the next-generation consoles will actually support real-time ray tracing with dedicated hardware at all. And unless they do, can't it be justifiably concluded that the impact of nVidia's RTX on next-gen consoles will be (very close to) zero? Have I missed something? Have there been leaks from within Sony studios, for instance, that RT hardware is going to be utilised for next-gen titles?

(As far as impact on the gaming industry as a whole, that is largely an economic/volume question. And if the overwhelming majority of the underlying hardware for gaming from mobile through consoles through mass market PCs do not support performant RT, it's a given that impact will be small.)
 
No, you haven't missed anything from a hardware perspective. There is no indication of real-time ray tracing hardware being in next-gen consoles. There's also no solid indication of real-time ray tracing being in next-gen GPUs from AMD or Intel.

There seemed to be a lot of hope that RTRT would be pushed wider. If it were to be pushed into next-gen consoles, it would likely have to do one or more of the following:
1. steal die area from something else (trade off RTRT for a weaker CPU)
2. increase the cost of the console to cover costlier chips
3. increase the complexity of the motherboard if the CPU and GPU are separate chips.
 
Is a dedicated RT DSP/chip possible? Like a CPU/GPU/RT-engine split? As on some arcade boards at one point, where T&L was on a separate chip. That way console makers wouldn't have to wait for a full GPU with RT to be available, but could just add the RT "block" (I guess Sony could design one, or, in a twist, use IMG's RT tech :eek:)?

For the next generation (PS5/Xb2019-20) I guess it's too late, but, I don't know,...
 
It could be a dedicated unit on the SoC, but I don't think anyone would want to deal with the complexities of adding another chip that would also require high-bandwidth access to memory.
 
So, after the reveal at GDC that nvidia is enabling the GTX 10x0 cards to use RT, and Crytek's RT demo, it is safe to assume that "classic" hardware is capable of a bit of RT, enough to get convincing lighting and reflections on a Vega 56. So there won't be any extra hardware in next-gen consoles, just some intelligent shader implementation.
It just wasn't realistic to get extra hardware that can only be used for such a purpose into a $300-$400 console.
 
It's very realistic. What do you mean? The majority of the work is still done by compute shaders; the RT cores only handle intersection. That shouldn't inflate the cost by 200%.
The RTX 2060 or RTX 2070 would be reasonable performance profiles for a 2020/2021 console.
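As a back-of-the-envelope illustration of that split (the share and speedup below are invented placeholders, not benchmark numbers), Amdahl's law shows how accelerating only the traversal/intersection portion bounds the overall frame-time gain:

```cpp
// Hedged illustration only: hypothetical numbers, not measurements.
#include <cstdio>

int main() {
    double intersectShare = 0.5;   // assumed fraction of RT frame time spent in traversal/intersection
    double hwSpeedup      = 10.0;  // assumed speedup of that portion on dedicated RT units

    // Amdahl's law: overall speedup when only one portion of the work is accelerated.
    double overall = 1.0 / ((1.0 - intersectShare) + intersectShare / hwSpeedup);
    printf("overall frame speedup: %.2fx\n", overall);   // ~1.82x with the numbers above
    return 0;
}
```

Whether the fixed-function gain ends up closer to 2x or 10x overall depends entirely on how large that intersection share really is, which is exactly what the benchmarks being argued over would settle.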
 
The performance metrics show otherwise. DXR running on GTX cards makes it obvious that the RT cores are a big deal.
 