Impact of nVidia Turing RayTracing enhanced GPUs on next-gen consoles *spawn

Are we positive raytracing is going to take over?

All rendering is essentially tricks to fake the real thing faster. At what point can you fake 70, 80, 90 percent of what ray tracing does, much faster, on traditional GPUs? And for the remaining 10% of difference RT makes, will people honestly care? Are people going to be that worried about really good reflections if the competitor's card is twice as fast because it isn't dedicating massive die area to ray tracing cores?

It'll be interesting to see how it plays out, in my book anyway. If I'm AMD, I'm using my die area to make my cards 2x as fast as Turing at traditional rendering and seeing what happens. Of course, it's AMD, who seem to intentionally cripple themselves (why in the hell they're still mucking around with HBM while Nvidia cleans their clock with GDDR is beyond me).
 
Why in the hell they're still mucking around with HBM while Nvidia cleans their clock with GDDR is beyond me.
GDDR6 is an interim solution until HBM becomes the norm. There's a physical limit to the energy per bit over an FR4 PCB (effectively a transmission line) and to the detectable threshold within the "eye" of the signal. The reason we have GDDR6 at all is that HBM keeps failing its cost/performance promises. Lots of challenges.

As I understand it, 20 Gbps is the end of the road on a PCB without going to balanced signalling. Any effort to advance HBM will pay off in the future. So... I like that AMD tried, even if it hasn't paid off yet.
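To put rough numbers on that trade-off (illustrative figures of my own, not anything quoted above): GDDR6 gets its bandwidth from very high per-pin rates over a narrow bus routed on the PCB, while HBM uses modest per-pin rates over an extremely wide bus on an interposer. A quick back-of-the-envelope comparison in C++:

#include <cstdio>

// Aggregate bandwidth in GB/s = bus width (bits) * per-pin rate (Gbps) / 8.
// The configurations below are assumptions for illustration only, e.g. a
// 256-bit GDDR6 card at 14 Gbps vs. two HBM2 stacks (2048 bits) at 2 Gbps.
constexpr double BandwidthGBps(int busWidthBits, double gbpsPerPin) {
    return busWidthBits * gbpsPerPin / 8.0;
}

int main() {
    std::printf("GDDR6, 256-bit @ 14 Gbps : %.0f GB/s\n", BandwidthGBps(256, 14.0));   // ~448 GB/s
    std::printf("HBM2, 2048-bit @ 2 Gbps  : %.0f GB/s\n", BandwidthGBps(2048, 2.0));   // ~512 GB/s
    // Similar totals, but GDDR6 has to push roughly 7x the per-pin rate across
    // FR4, which is exactly the eye/energy-per-bit problem described above.
}

Similar ballpark of bandwidth, very different ways of getting there, which is why cost and packaging keep deciding the argument.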
 
RT has already made its entry on the PC; Nvidia will only develop it further, and AMD will have to follow. Consoles lag behind, but that's a given considering the price range and development time; the HW spec of a console was probably finalised before RT was a thing.
I co-wrote a molecular ray-tracer 30 years ago, you mean I will finally take over the world?
AAA titles find most of their revenue on consoles, and that should tell you something about development directions right there. (Not to mention that mobile generates greater revenue than either consoles or PC.) And there are already methods that achieve the main things pure ray tracing solves using current GPU technology. It boils down to whether ray tracing hacks, given better hardware support, can solve those better/faster/cheaper. If not, that transistor and power budget is better spent elsewhere. The jury is still out on that of course, although the distinct lack of hard data at launch was a broad hint about the current state of affairs.
Ray-tracing is the room-temperature superconductivity of rendering. We have been able to use it for decades and have done so where it makes sense. No one has been able to make that approach efficient enough for general gaming use, even with dedicated hardware. I doubt nVidia's hybrid proposal is either; it's just implemented on powerful enough hardware that their margins allow some use of ray tracing while still maintaining reasonable performance. But that is not good enough for the overall industry, where the push towards efficiency is total.
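For what it's worth, the core math of a ray tracer really is decades old and tiny; the problem has always been doing it billions of times per frame. A minimal ray-sphere intersection, purely for illustration:

#include <cmath>
#include <optional>

struct Vec3 { double x, y, z; };

static double Dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3 Sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

// Nearest positive hit distance along the ray o + t*d against a sphere at
// centre c with radius r, i.e. solve |o + t*d - c|^2 = r^2 for t (a quadratic).
static std::optional<double> RaySphere(const Vec3& o, const Vec3& d,
                                       const Vec3& c, double r) {
    const Vec3 oc = Sub(o, c);
    const double a = Dot(d, d);
    const double b = 2.0 * Dot(oc, d);
    const double k = Dot(oc, oc) - r * r;
    const double disc = b * b - 4.0 * a * k;
    if (disc < 0.0) return std::nullopt;             // ray misses the sphere
    double t = (-b - std::sqrt(disc)) / (2.0 * a);   // try the nearer root first
    if (t < 0.0) t = (-b + std::sqrt(disc)) / (2.0 * a);
    if (t < 0.0) return std::nullopt;                // sphere is behind the ray
    return t;
}

The hard part isn't this; it's traversing an acceleration structure for millions of rays against millions of triangles every frame, which is what Turing's RT cores exist to brute-force.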
 
Give me typical next-gen console hardware for 2020, like an Xbox One X Plus (or Xbox Two), without any PC ray tracing hardware, and then give me second- or third-generation PC ray tracing hardware for consoles in 2024. Don't call the 2024 consoles mid-gen upgrades; call them a genuine next gen with full BC with the Xbox One X Plus. Don't give me any of this weaker and currently unused first-generation PC ray tracing hardware.

The 360 gave us first-gen unified shaders.

About the best thing that can happen for subsequent generations of RT hardware on PCs is for first-gen RT hardware to show up in the next consoles.
 
Don't dismiss the fact that their priority is their professional and other high-margin parts. So the bulk of their R&D is going to be spent on products that deliver the best performance for those markets (or that create new, lucrative ones).

Exactly this. Nvidia know that the vast majority of their profits derive from workstation, server and datacentre purchasers. Their market-segment slide suggests gaming is big, but they segment markets by product, and a metric ton of 'gaming' products (everything Geforce branded) go into small, medium and large server farms rather than gaming PCs, to the extent that Nvidia updated the end-user licence terms for Geforce products earlier this year to exclude data-centre usage (not enforceable in many countries).

5-6 years back, Nvidia's R&D was focused on gaming GPUs and everything else was a byproduct, whereas now Nvidia's R&D is focused on what we used to call 'big iron' and gaming GPUs are taking a backseat. The 2080 isn't getting RT because Nvidia think gamers want it; it's because this is what a lot of large-scale commercial server farms want in new server products.
 
My concerns, if I were MS or Sony:
a) A 4-year generation isn't long at all, and it could cause a backlash.
b) Six years from now is a ton of time for things to get going; the Xbox One and PS4 are only 5 years old now, and look how much rendering and algorithms have changed. RT being so fresh could move substantially faster.
c) RT is a cheaper way to develop games; fewer hacks are required for developers to get what they want.
d) RT likely differentiates next gen from this gen more effectively as a marketing tool.
e) RT is something they can market.
f) People always want the new shit.


I think one company launching with RT and another without would be a colossal misstep. They stand to benefit from moving towards RT together.
While I mostly agree with your a-f points, I don't think the last thing you mention would be such a misstep. Yes, imagine one company launching with RT but the other showing off games with better resolution, AA, physics and whatnot, thanks to the current state and evolution of traditional graphics, backed by the additional power of new hardware that doesn't spend any of its resources on the computationally expensive process of RT. From a visual standpoint, the competition is there.

I think the next generation is a transitional one, where the RT question will be resolved either by including it or by leaving it out and waiting for the tech to become more mature and cheaper. So I'm sure that in the generation after that, RT will be a given.
 
c) RT is a cheaper way to develop games; fewer hacks are required for developers to get what they want.

Until RT technology is ubiquitous and all non-RT hardware platforms are dropped, RT is more expensive, because you build the solution for everything non-RT (100% of gaming hardware today) and then expend effort to support RT for the maybe 0.01% of the market that has RT hardware this year and next (a rough sketch of that split follows this post). That also assumes a better solution to the problem Nvidia's RT is aiming to solve isn't developed in the meantime, making the bespoke hardware model obsolete.

Again, like I keep mentioning, look at PhysX hardware, which became obsolete in gaming platforms once compute moved to the core of conventional graphics hardware. If you want an idea of how technology will evolve over the next five years, you can gain valuable insight by looking back at the last fifteen, tracking the technology paradigms and seeing how they went.
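To make the two-code-paths cost concrete: on PC today an engine has to probe for DXR at startup and keep a rasterized fallback for everything the ray traced path covers. The capability check below uses the real D3D12 API; what the engine does with the answer (a hypothetical TraceReflectionRays() versus a screen-space fallback) is the second code path being paid for:

#include <d3d12.h>

// Returns true if the D3D12 device exposes DXR (raytracing tier 1.0 or above).
bool SupportsDXR(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

Until that function returns true on the bulk of the installed base, the non-RT path is the one that actually ships the game.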
 
https://nvidianews.nvidia.com/news/nvidia-announces-financial-results-for-second-quarter-fiscal-2019

The Pro market is the margin cream on top of the volume gaming market for Nvidia: gaming is $1.8 billion in revenue vs $0.281 billion for Pro vs $0.78 billion for datacentre. The gaming market pays the bills across the product range, and the fat margins are earned in those other lines. The difference between a Pro and a gaming card comes down to manufacturing quality (better PCB, solder fill, etc.) and drivers. Nvidia wins in the Pro space almost by default because their Pro drivers are better and have broader support. AMD has stepped up, but Nvidia's investment in CUDA and the tooling means a lot of VFX tools, for example, still rely on CUDA rather than OpenCL. Hardware features have feck all to do with it.
 
The Pro market is the margin cream on top of the volume gaming market for Nvidia: gaming is $1.8 billion in revenue vs $0.281 billion for Pro vs $0.78 billion for datacentre.

No. :nope: For the reason I stated above: Nvidia report all non-Pro/datacentre/server card sales under gaming, even if 20,000 1080 Ti cards go into a single data centre. This is a big problem for Nvidia, who, as I stated above, don't want consumer cards being used in data centres and have updated their end-user licence in a bid to deter data centres from buying 'gaming' cards.
 
I can attest that most game & VFX studios I've personally dealt with use Geforces instead of Quadros.
And as stated earlier, having RT today in the Geforce line is a direct by-product of developing it principally for the professional market, where it was really needed, especially for content creation.
 
I can attest that most game & VFX studios I've personally dealt with use Geforces instead of Quadros.

I manage a large farm and know a lot of people who also manage farms and this is also my experience.
 
Yup, as someone who used to sell Quadro into the digital media space, this is exactly what happened: for the same performance the Quadro card was 4-5 times the cost, so even though the Geforce cards burnt out faster it was still cheaper to buy spares. It was the shift by Adobe to DirectX, and their ceasing to certify specific cards, that really started the rush to Geforce. The Quadro space is now mostly about CAD/CAM, which is largely still OpenGL and will be for a long time yet. Among the customers I worked with, real-time RT was seen as a feature in search of a use case: animation houses need better accuracy and thus preferred offline renders, and architects saw little need. It has been 2 years since I last sold into this space, but it wasn't driving much interest then.

Oh, I agree Nvidia does not want Geforce in servers, as the dedicated server cards are larded to hell and back with margin, same as the Quadro series, which is why I describe them as "cream". Geforce is relatively low-margin compared to the rest, which is why Nvidia has always invested heavily in driver support and certifications in the pro space to ensure they dominate there.
 
PC gaming is bigger than ever, even bigger than console gaming in total, and Nvidia knows this. The 20 series is going to be popular, especially when cheaper variants arrive.
 
But that is not good enough for the overall industry, where the push towards efficiency is total.
Push towards wasteful resolutions, you mean.
 
RT is more expensive because you do the solution for everything not-RT
This is true, unless you don't. Consoles play such a large part in determining base system requirements that, as long as the console manufacturers all support it, you can probably move forward with RT as a base and ignore any sort of rasterization hacks on PC. Developers will need to weigh the cost of rendering hacks, drivers, world builders, artists and everything else that goes into a perfectly hacked scene. I see RT being built into the engine (roughly along the lines of the sketch at the end of this post), but perhaps I'm wrong. Aside from the capital investment into RT methods, I don't see it needing to be the per-scene orchestration that some faked scenes require.

I suspect that by the time 2020 rolls around we will have seen the RTX 2060 and below. AMD should have responded by then; it's not like DXR was announced last week.

The majority of AAA games don't run very well on iGPUs either, and in order to run at all they need to be cut down badly.

I get that all of this is new and the skepticism should be very present and up front. But we are also looking at consoles that will not launch for another 2 years.

2 years is a lot of time for things in this landscape to change.
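If RT really does get built into the engine rather than hand-tuned per scene, the natural shape is an engine-level interface with a ray traced implementation and a raster fallback chosen once at startup. A rough sketch of that idea; every name here is hypothetical, not any shipping engine's API:

#include <memory>

// Hypothetical engine abstraction: the frame graph asks for "reflections"
// and doesn't care whether rays or screen-space tricks produced them.
struct IReflectionPass {
    virtual ~IReflectionPass() = default;
    virtual void Render() = 0;
};

struct RayTracedReflections final : IReflectionPass {
    void Render() override { /* dispatch DXR rays */ }
};

struct ScreenSpaceReflections final : IReflectionPass {
    void Render() override { /* classic SSR hack */ }
};

// Chosen once at device creation, based on a capability check such as the
// SupportsDXR() sketch earlier in the thread.
std::unique_ptr<IReflectionPass> MakeReflectionPass(bool hasRTHardware) {
    if (hasRTHardware)
        return std::make_unique<RayTracedReflections>();
    return std::make_unique<ScreenSpaceReflections>();
}

Most of the per-scene artist work the hacks need lives in the SSR-style branch; if the consoles guarantee RT hardware, that is the branch that eventually withers.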
 
I suspect that by the time 2020 rolls around we will have seen the RTX 2060 and below. AMD should have responded by then; it's not like DXR was announced last week.

Exactly my thoughts too. By the time the PS5 releases, something like an RTX 2060 will be there, or even the next series, a 3070 etc. I guess console hardware will be in the same state as it was with the PS4/Xbox One: below mid-range PC hardware, but good enough graphics thanks to optimization etc.
I'm thinking of AAA games like Cyberpunk, or the next game from CD Projekt, or perhaps the next Battlefield, or even Halo on PC, using RT tech in some form.
 
This is true, unless you don’t.

Sure. Maybe you're already a dev millionaire and are content to market your game to a target audience of a few million people, with prospective sales measured in the tens of thousands or, if you are exceedingly lucky, a couple of hundred thousand buyers, by limiting your game to technology in marginal usage.

Can you take a quick look at games released this year on Steam and tell me how many have minimum hardware requirements less than five years old at the time of release? Let's see how many billionaire devs/publishers are out there who can afford to make such brave decisions.

Greater than the risk of missing out on the latest tech is the risk of heavily investing in obsolete-by-next-year tech. No doubt if you had been posting here in 2006 you would have been advocating for Sony and Microsoft to incorporate PhysX hardware in their consoles. How stupid would Microsoft and Sony have been to pass on that! :runaway:
 
I get that all of this is new and the skepticism should be very present and up front. But we are also looking at consoles that will not launch for another 2 years.

Potentially, but there's still a chance of a launch next year.

If the PS5's coming in 2019, the recent DXR demos won't have any bearing on its direction.
 
Greater than the risk of missing out on the latest tech is the risk of heavily investing in obsolete-by-next-year tech. No doubt if you had been posting here in 2006 you would have been advocating for Sony and Microsoft to incorporate PhysX hardware in their consoles.
Right, I don't disagree with your points; perhaps there's a large degree of selfishness in my perspective. To be clear, I didn't really get into consoles until this gen. Prior to that I mainly purchased consoles for fighting games and whatnot; paying for Gold didn't make sense to me.

But I do recall somewhat the pain of being a PC gamer during that period.
When the Xbox 360 was first released, it shipped with DX9 plus some enhancements; DX9 came with a lot of new features, and the 360 had unified shaders as well.
Fast forward some years: DX10 is skipped, DX11 is released, and the majority of games were still using DX9 as their platform. Video card after video card with newer and newer features was largely ignored; support for those features was dismal.
I bought one DX10 card and three DX11 cards, and none of them were really utilized until after 2013, as far as I can see.
And that's brutal considering compute shaders were ready to go years before the XBO and PS4 landed on the scene, and it still took an additional five-odd years for compute shaders to become a big part of rendering.
But you can't blame the 360/PS3 generation for holding things up, because compute shaders weren't ready then; the APIs hadn't been communicated; nothing was ready when those consoles shipped.

Now we're in the _exact_ same position again: DXR hardware is released 2 years before the next console launch. Are the platform holders really going to miss the boat on this one and hold us up for another 8 years? Because without console support, the PC space is going to see dismal movement.

So I don't know what RT will bring to the market, I really don't. But I do know that if consoles miss the boat on it, RT isn't going to move forward at all, and we're stuck with basically 15+ years of DX11-feature titles, 17+ years if you count from the DX11 release date. It would mean that RT in 2018 wouldn't really ramp up to anything effective until nearly 2030, given how long it takes to ship a title.

This whole discussion is ironic, honestly. Prior to Turing I was a believer in soft transitions and rolling generations, and you believed in hard-cut generations to push new technologies forward.

Now you and I appear to have switched positions entirely on it since the release of Turing.
 