Impact of nVidia Turing RayTracing enhanced GPUs on next-gen consoles *spawn

This may consolidate NV in the PC space, but in the console space it does bupkis for them, as they still can't produce an APU-like product: they lack an x86 license, and for all the hype ARM is still a ways off from being competitive in the console space. This is why we talk retail PC prices. Sure, the integrated version of this chip would be cheaper thanks to bulk discounting etc., but it's not going from $599 -> $188 (est. BOM cost at launch of the PS4 APU + DRAM), and that's without factoring in the cost of adding the CPU.
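To put very rough numbers on that, a back-of-envelope sketch: only the $599 retail price and the ~$188 PS4 APU + DRAM BOM estimate come from the point above; the discount and CPU/integration figures are purely illustrative assumptions.

```cpp
// Back-of-envelope only: the bulk discount and CPU/integration costs below are
// made-up placeholders, not real figures; just the $599 and ~$188 are from the post.
#include <cstdio>

int main()
{
    const double turing_retail        = 599.0; // RTX 2080 launch MSRP (from the post)
    const double bulk_discount        = 0.50;  // assumed: a very generous console-deal discount
    const double cpu_plus_integration = 60.0;  // assumed: adding CPU cores and SoC integration
    const double ps4_apu_dram_bom     = 188.0; // est. PS4 APU + DRAM BOM at launch (from the post)

    const double hypothetical = turing_retail * (1.0 - bulk_discount) + cpu_plus_integration;
    std::printf("Hypothetical Turing-class console silicon: ~$%.0f vs historical target ~$%.0f\n",
                hypothetical, ps4_apu_dram_bom);
    // Even with an aggressive discount the number lands nowhere near the
    // PS4-style APU + DRAM budget - which is the argument being made above.
    return 0;
}
```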

I agree MS will scream about RT all day long, but unless they pony up the cash to internal dev teams it's unlikely to be seen outside of least-effort implementations.
 
anexanhume is correct though in that it's not the retail price that matters but the cost to make. If MS could broker a deal to license the tech, with nVidia offering it to them cheap (for obscure 'we don't want to make money' business reasons), then theoretically MS could get it at cost price and it could appear in a console. Same way we see super cheap Chinese no-brand tablets and phones with specs that rival branded products at 2-3x the retail price.

The question is really why nVidia would want to cheap out and sell silicon without their insane margins. I suppose the answer is to gain a monopoly on future card sales, with nVidia becoming the only platform to buy into, so perhaps 40+ million units to Xbox would mean many, many more units sold to PCs. Might be worth considering.
 
This comes back to the old issue of "Why doesn't NV make a console?"

1) Experience - both Sony (PS3 RSX) and Microsoft (Xbox) have an unhappy history with NV in the console space. It's reasonable to point out it wasn't all NV's fault in either case, but that's two relationships that both failed to last more than a single cycle.
2) No x86 - ARM is getting better, but fundamentally it's still not at the races, even vs Jaguar.
3) Margin - NV loves them some fat, fat margins, and consoles are a slim-margin, high-volume play. That suits AMD down to the ground, as the baseload of console business gives them a return on existing investments in both CPU and GPU design. (NV has lost most of its CPU contracts lately; I wonder how much longer they're going to stay in that game.)
4) Vendor support - This is more nebulous and I'd be happy to be told I'm talking nonsense, but the TX1 USB debacle can't have helped NV's rep; missing a bug so severe that it requires a respin of the chip, for that long, is not going to recommend starting a project with them today.
 
anexanhume is correct though in that it's not the retail price that matters but the cost to make. If MS could broker a deal to license the tech, with nVidia offering it to them cheap (for obscure 'we don't want to make money' business reasons), then theoretically MS could get it at cost price and it could appear in a console. Same way we see super cheap Chinese no-brand tablets and phones with specs that rival branded products at 2-3x the retail price.

The question is really why nVidia would want to cheap out and sell silicon without their insane margins. I suppose the answer is to gain a monopoly on future card sales, with nVidia becoming the only platform to buy into, so perhaps 40+ million units to Xbox would mean many, many more units sold to PCs. Might be worth considering.

I would not be shocked if Navi had some form of hardware for accelerating ray tracing. I doubt Microsoft developed their ray tracing API in secret from AMD.
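For reference, "their ray tracing API" is DXR, layered on top of D3D12. A minimal sketch (my own, requires the Windows 10 SDK; the helper name is made up) of how an engine can ask the driver whether DXR is actually exposed - which is also how a Navi-class part with or without RT hardware would advertise itself:

```cpp
#include <windows.h>
#include <d3d12.h>

// Minimal sketch: query the DXR feature tier. On GPUs/drivers without ray
// tracing support the tier comes back as NOT_SUPPORTED.
bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false; // older runtime/driver: no DXR at all

    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```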
 
The reflections demo in Battlefield V was the best, because it was demoing something that screen-space reflections just can't do. The other demos were kind of trash, especially the GI demo. They compared GI to no GI in a static scene, instead of comparing it to any of the current GI solutions that are available.
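To spell out the "just can't do" part: SSR can only march the reflected ray against the current frame's depth/colour buffers, so anything off-screen (or occluded) simply cannot appear in the reflection - which is exactly what the BF5 demo leaned on. A toy sketch of that limitation (my own simplification, not code from any engine):

```cpp
#include <optional>
#include <vector>

struct Vec2 { float x, y; };

// screenDepth is a WxH depth buffer; positions/directions are in screen space
// with a depth component, all heavily simplified for illustration.
std::optional<Vec2> traceScreenSpaceReflection(
    const std::vector<float>& screenDepth, int W, int H,
    float x, float y, float depth, float dx, float dy, float dDepth)
{
    for (int step = 0; step < 128; ++step)
    {
        x += dx; y += dy; depth += dDepth;

        // The core limitation: once the ray leaves the screen there is no
        // data left to intersect, so SSR has to give up (or fade out).
        if (x < 0 || y < 0 || x >= W || y >= H)
            return std::nullopt;        // reflections of off-screen content are impossible

        const float sceneDepth = screenDepth[int(y) * W + int(x)];
        if (depth >= sceneDepth)        // ray passed behind visible geometry: call it a hit
            return Vec2{ x, y };        // reuse the colour already on screen at this pixel
    }
    return std::nullopt;
}
```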

Crytek's Hunt has voxel GI and nearly everything fits very well into the environment (though a voxel AO would be desirable). With the exception of Alien Isolation and some rare titles, other current GI solutions in games are of poor quality in comparison. However, Crytek's voxel GI relies on direct light sources; I don't think it works in games that mostly take place indoors.
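Roughly why it relies on direct light sources: bounce light can only be gathered from what the lights injected into the voxel grid in the first place, so an area with no direct lighting contributes nothing. A toy sketch of those two steps (my own simplification, not Crytek's code; real SVOGI widens the cone by sampling coarser mip levels rather than marching a thin ray):

```cpp
#include <algorithm>
#include <array>

struct Voxel { float radiance = 0.0f; float occupancy = 0.0f; };
constexpr int N = 32;                       // toy grid resolution
std::array<Voxel, N * N * N> grid;          // voxelised scene

Voxel& at(int x, int y, int z) { return grid[(z * N + y) * N + x]; }

// Step 1: inject direct lighting into voxels covering lit surfaces.
// If a room receives no direct light, nothing is injected and the
// gather below returns zero bounce light for it.
void injectDirectLight(int x, int y, int z, float litRadiance)
{
    at(x, y, z).radiance += litRadiance;
    at(x, y, z).occupancy = 1.0f;
}

// Step 2: gather indirect light by marching a "cone" through the grid,
// front-to-back compositing the injected radiance.
float gatherCone(float px, float py, float pz, float dx, float dy, float dz)
{
    float light = 0.0f, transmittance = 1.0f;
    for (float t = 1.0f; t < N && transmittance > 0.05f; t += 1.0f)
    {
        const int x = std::clamp(int(px + dx * t), 0, N - 1);
        const int y = std::clamp(int(py + dy * t), 0, N - 1);
        const int z = std::clamp(int(pz + dz * t), 0, N - 1);
        const Voxel& v = at(x, y, z);
        light += transmittance * v.radiance;
        transmittance *= (1.0f - v.occupancy);
    }
    return light;
}
```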

Shadows and reflections in games usually look very ugly. Most games don't even have soft shadows, and I can't stand that anymore.

I don't know what you saw, but there have been some very impressive ray tracing effects in the videos released since yesterday.
 
Until GPUs with RT sell for under $200, it's a complete non-starter to consider the tech suitable for inclusion in consoles. That's just the way the console BOM breaks down.

Again with the arbitrary dollar amounts. Die cost plus licensing is the real cost. Let’s see how big the 7nm shrinks are.

anexanhume is correct though in that it's not the retail price that matters but the cost to make. If MS could broker a deal to license the tech, with nVidia offering it to them cheap (for obscure 'we don't want to make money' business reasons), then theoretically MS could get it at cost price and it could appear in a console. Same way we see super cheap Chinese no-brand tablets and phones with specs that rival branded products at 2-3x the retail price.

The question is really why nVidia would want to cheap out and sell silicon without their insane margins. I suppose the answer is to gain a monopoly on future card sales, with nVidia becoming the only platform to buy into, so perhaps 40+ million units to Xbox would mean many, many more units sold to PCs. Might be worth considering.

Because it’s sales they wouldn’t have otherwise. They do not have a practical limit to the number of die they can sell, so any additional sale is good overall, and it doesn’t affect their ability to price desktop, professional, and research use GPUs.
 
I suspect this real-time RT is going to be a reason why the next-gen consoles will be delayed. This feature is going to be a standard; they will want it in, and they will want it in when the manufacturing costs allow for an affordable console that doesn't eat money. For me that's a good thing, because the technology will mature more. The previous gen lasted so many years and we didn't get the jump we wanted.
 
I still think the biggest problem with NV as a partner is that they can't bring an x86 CPU license to the table (either Sony or MS could license ARM themselves). By the time you've sorted out your licensing for both an NV GPU and an Intel/AMD x86 CPU, your costs will be so underwater (as you still need to pay someone to integrate these) that whoever takes option B and just licenses an AMD APU will blow the other out of the water.
 
I still think the biggest problem with NV as a partner is that they can't bring an x86 CPU license to the table (either Sony or MS could license ARM themselves). By the time you've sorted out your licensing for both an NV GPU and an Intel/AMD x86 CPU, your costs will be so underwater (as you still need to pay someone to integrate these) that whoever takes option B and just licenses an AMD APU will blow the other out of the water.
We are only two generations removed from where discrete CPUs and GPUs were the norm. Xbox had an x86 CPU and Nvidia GPU and still launched at $300. If they’re willing to launch at $500 MSRP, there’s a lot they can do.
 
We are only two generations removed from where discrete CPUs and GPUs were the norm. Xbox had an x86 CPU and Nvidia GPU and still launched at $300. If they’re willing to launch at $500 MSRP, there’s a lot they can do.
They launched at $300 but lost billions on the Xbox. It was literally a zombie product that was kept alive just enough so that MS would keep supporting DirectX/Windows gaming and prevent Sony from becoming a monstrous competitor that would have tried to inject itself into personal computers.
If anyone else had launched the Xbox, there would be no Xbox today; they would have exited in the same generation they launched.
I believe Intel and Nvidia were very, very expensive choices, and MS had no control over the manufacturing process that would have allowed for cheaper production later on.
 
I would not be shocked if Navi had some form of hardware for accelerating ray tracing. I doubt Microsoft developed their ray tracing API in secret from AMD.
Potentially. Navi is rumoured to be heavily Sony-guided, and AMD may not be making RT a part of Navi at launch because it's not yet mainstream, so there's no major point. They could be working on their own RT solution for later inclusion. Anything can happen, really.

Where are the engineers who developed PVR RT working these days?
 
Potentially. Navi is rumoured to be heavily Sony-guided, and AMD may not be making RT a part of Navi at launch because it's not yet mainstream, so there's no major point. They could be working on their own RT solution for later inclusion. Anything can happen, really.

Where are the engineers who developed PVR RT working these days?

As a PC gamer foremost, that would be troublesome. AMD shouldn't be risking or waiting for anything to become mainstream (or taking a backseat to Nvidia). Sure, Nvidia RTX cards will contain early-adopter hardware and be supported by a handful of developers, but the problem will be a PR nightmare in gaming for AMD, being two generations late with consumer RT-ready graphics cards. By the time they launch a PC variant of the Navi architecture with RT functionality, Nvidia will more than likely have a more mature RT architecture ready for AMD's first showings, not just hardware-wise but software- and driver-wise. AMD is already lagging behind in overall PC gaming performance and in shadowing Nvidia's features, so adding tardiness on RT just sets them back further in catching up with Nvidia in the high-end / premium space.
 
As a PC gamer foremost, that would be troublesome. AMD shouldn't be risking or waiting for anything to become mainstream (or taking a backseat to Nvidia). Sure, Nvidia RTX cards will contain early-adopter hardware and be supported by a handful of developers, but the problem will be a PR nightmare in gaming for AMD, being two generations late with consumer RT-ready graphics cards.

Not if it's better. PhysX beat Havok to market and look how that turned out. Being first rarely guarantees long-term success, particularly in technology, where people buy products frequently and will be swayed to buy the best product at the time. In two years time, if RT has taken off, people will be basing buying decisions on whether AMD or Nvidia hardware has better DirectX RT performance. Nobody will give a toss if Nvidia launched first.
 
Not if it's better. PhysX beat Havok to market and look how that turned out. Being first rarely guarantees long-term success, particularly in technology, where people buy products frequently and will be swayed to buy the best product at the time.

But that's the challenge of any business... you know, being successful? One doesn't know if a particular product or service will succeed or fail without trying. Nvidia didn't become the GPU market leader by accident. They weren't afraid to challenge the norms or to have certain gimmicky GPU features fail, nor did they allow the competition to stagnate technological advancement. That's not to say Nvidia is perfect... or hasn't run into trouble... but they have always understood the concept of pushing boundaries, even if it meant failure.

In two years time, if RT has taken off, people will be basing buying decisions on whether AMD or Nvidia hardware has better DirectX RT performance. Nobody will give a toss if Nvidia launched first.

But that's the point. Nvidia is making the first move, seeing if it's a valid product for consumers or a valid architecture worth pursuing and refining as time goes along. Nvidia can't sustain its market share (number 1 position), or its gamer mantra "the way it was meant to be played", without challenging the conventional norms others accept.
 
Let's not put NV on a pedestal here. They love them some proprietary tech (CUDA, PhysX, etc.), and they stuck with a heavily raster-biased design when AMD was leaning heavily on compute power in the early DX12 era, because that resulted in better perf for more games (or held back the adoption of compute-heavy designs, if you want to phrase it another way). Still surprised not to see much detail on what the RT cores are actually doing, either. From looking into the Nvidia web resources, they put a lot of emphasis on their ability to denoise low-sample-count data; is it possible that the RT acceleration is a clever denoising algorithm that allows for ray counts low enough that they wouldn't be useful otherwise? They come right out and explain that Tensor cores are MMA units, so why so shy on the RT stuff?
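"Tensor cores are MMA units" just means each one executes a fused matrix multiply-accumulate, D = A*B + C, on a small tile (FP16 inputs, FP32 accumulation). A plain-C++ illustration of that single operation (my own sketch, not NV code; the denoising they talk up runs as networks built out of huge numbers of these):

```cpp
// One tensor-core-style operation on a 4x4 tile: D = A * B + C.
// half_ish is a stand-in; real hardware takes FP16 inputs and accumulates in FP32.
using half_ish = float;

void mma4x4(const half_ish A[4][4], const half_ish B[4][4],
            const float C[4][4], float D[4][4])
{
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
        {
            float acc = C[i][j];                 // accumulate into FP32
            for (int k = 0; k < 4; ++k)
                acc += float(A[i][k]) * float(B[k][j]);
            D[i][j] = acc;                       // D = A*B + C
        }
}
// Whether the separate RT cores are fixed-function BVH traversal/intersection
// hardware or mostly lean on this denoising is exactly what the post above asks.
```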
 
Let's not put NV on a pedestal here

If the GPU in the PS5 is rumoured to have some form of RT cores, there wouldn't be that much bashing. I think NV does a nice job of at least starting with new tech; maybe it's not the greatest, but it's a beginning. And who knows, it might have advantages in engines optimized for those RT cores.

Perhaps consoles are getting some hardware RT acceleration too, but chances are they won't, which means they would be lagging about two years behind in some GPU features; how important that will be remains to be seen. It's no surprise to see people hurt by this on forums like System Wars, but here? :)
 