Impact of nVidia Turing RayTracing enhanced GPUs on next-gen consoles *spawn

Truth be told, I believe AMD can have the edge in RT performance over Nvidia if all the Navi CUs are RT ready - without the need for specialised cores (a lower-latency design).

I don't necessarily think it'll be better, but knowing AMD's modus operandi, it'll be a more transparent and open standard. It's been quite pleasing to see Freesync get so much support while Gsync often gets the bird, but that extends well beyond just AMD into a territory that involves many other GPU makers in different devices.

Nvidia's form of ray tracing, while still currently sophomoric, could in fact be laying the groundwork for the next few years of graphics architectures because of their de facto standards and the status quo in the professional space. I don't think it'll be the 8800GTX all over again, but it's obviously shaking things up immensely. Just look at all the buzz and conversation around it.
 
But that's the challenge of any business... you know, being successful? One doesn't know if a particular product or service will succeed or fail without trying. Nvidia didn't become the GPU market leader by accident. They weren't afraid to challenge the norms, to have certain gimmicky GPU features fail, or to keep pushing while the competition let technology advancement stagnate. That is not to say Nvidia is perfect... or hasn't run into trouble... but they have always understood the concept of pushing boundaries even if it meant failure.

What's that have to do with, how did you put it, the "PR gaming nightmare for AMD being two generations late with consumer RT ready graphics cards"? Look at every successful technology company around, guess what? They weren't first. Apple weren't first with a GUI personal computer, smartphone or MP3 player. Microsoft weren't first with a personal computer operating system, nor a GUI one. Google were way late to the search party. Sony only entered the console market with the fifth generation.

Being first means you take all the risk while your competitors observe and learn from your missteps and follies.
 
Being first means you take all the risk while your competitors observe and learn from your missteps and follies.

So who was supposed to take the first steps towards forging (providing) the first consumer-level RT ready GPUs? AMD? Intel? Because most of your statements are negative-nanny dogpiling on Nvidia for providing the first steps towards such needs. I'm not seeing your point at all.
 
So who was supposed to take the first steps towards forging (providing) the first consumer-level RT ready GPUs? AMD? Intel? Because most of your statements are negative-nanny dogpiling on Nvidia for providing the first steps towards such needs. I'm not seeing your point at all.

I thought my point was clear in the first post, but to reiterate: GPU design has for years been honing ever-more flexible core hardware that can be used to assist any part of the graphics or compute pipeline. Paradigm-shifting back a decade to bespoke single-purpose hardware, which will see questionable use because these processor cores are only deployed on high-end hardware from one manufacturer, is a step backwards. I can fully see the opportunities augmented raytracing will bring, but not for a while. Nvidia are asking you to pay now for bespoke hardware on the promise of what the hardware might be capable of if/when more games support it - assuming you haven't ditched your 2070/2080/Ti for the better gen-2 version in 18 months' time anyway.

Getting back on track, I would be extraordinarily surprised to see any iteration of this technology in the new consoles. If it's present, it will be so watered down as to be next to useless for the intended purpose of enhancing graphics significantly. Not unless AMD have something very cool cooking that they've not shown.
 
GPU design has for years been honing ever-more flexible core hardware that can be used to assist any part of the graphics or compute pipeline. Paradigm-shifting back a decade to bespoke single-purpose hardware, which will see questionable use because these processor cores are only deployed on high-end hardware from one manufacturer, is a step backwards. I can fully see the opportunities augmented raytracing will bring, but not for a while.

I agree with you on a more unified general-purpose approach towards rendering (i.e., less latency, a more efficient design, smaller possible shrinks and, of course, more developer friendliness), without the need for specialized logic [cores] outside the primary rendering array. But the question becomes: why did Nvidia go this route?

If we take Nvidia at their word that the Turing architecture was 10 years in the making (I know, possible PR fluff), somewhere in R&D they must have had multiple designs (possibly baked GPUs): one with a unified approach to rasterization and RT, and the current Turing architecture. If the unified approach had problems, what possible roadblocks stopped Nvidia from going down that path? Was it core size and complexity? TDP issues of such a core design? Yield issues? Current tech? Cost? A mixture of everything? Something had to push Nvidia in the direction of a non-unified rendering design.

Hopefully, AMD has figured out a more unified rendering approach.

Nvidia are asking you to pay now for bespoke hardware on the promise of what the hardware might be capable of if/when more games support it - assuming you haven't ditched your 2070/2080/Ti for the better gen-2 version in 18 months' time anyway.

Let's be honest, this has always been Nvidia's approach: get the gamers excited, more specifically the premium consumers, and get them to invest more and more into the brand, then see what sticks or fails. It's Nvidia's corporate culture of wanting to grow their (already colossal) market share beyond their competitors.

Getting back on track, I would be extraordinarily surprised to see any iteration of this technology in the new consoles. If it's present, it will be so watered down as to be next to useless for the intended purpose of enhancing graphics significantly. Not unless AMD have something very cool cooking that they've not shown.

Disagree. If it's present but watered down, that would be a waste of die space and poor engineering judgement. If it is present and fully capable, then more than likely first-party developers (i.e., Naughty Dog, Guerrilla, SMS, 343 Industries, The Coalition, etc.) would jump at the chance of using RT.
 
Baked Alaska? o_O

...the unified yumminess. *drool*
 
Generic ray casting acceleration would be pretty useful all over the game engine. It doesn't have to be a paradigm shift in rendering, and it doesn't need to be forced on devs. It would allow a much better lighting pass and GI as soon as the major engines support it. Hardware rays will be much more efficient regardless of the size of the chip, and they have a much better chance of being used on a fixed console than on PC.
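To make the "useful all over the engine" point concrete, here's a minimal sketch (plain Python, not tied to any actual RT API) of the kind of visibility query hardware ray casting accelerates: an ambient-occlusion style hemisphere test. The scene_any_hit() callback is a hypothetical stand-in for the BVH traversal an RT unit would answer; shadow rays and GI gather rays boil down to the same query with different directions and distances.

```python
# Toy sketch of the visibility queries hardware ray casting accelerates.
# scene_any_hit(origin, direction, max_dist) is a hypothetical stand-in for
# the BVH traversal an RT unit would answer; everything else is ordinary code.
import math
import random

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def cosine_sample_hemisphere(normal):
    # Random direction biased towards the (unit) surface normal.
    u, v = random.random(), random.random()
    r, phi = math.sqrt(u), 2.0 * math.pi * v
    x, y, z = r * math.cos(phi), r * math.sin(phi), math.sqrt(max(0.0, 1.0 - u))
    a = (1.0, 0.0, 0.0) if abs(normal[0]) < 0.9 else (0.0, 1.0, 0.0)
    t = normalize(cross(a, normal))
    b = cross(normal, t)
    return normalize(tuple(t[i] * x + b[i] * y + normal[i] * z for i in range(3)))

def ambient_occlusion(point, normal, scene_any_hit, samples=16, max_dist=1.0):
    # Fraction of hemisphere rays that escape without hitting geometry.
    # Each scene_any_hit() call is exactly the ray query the hardware answers.
    unoccluded = sum(
        1 for _ in range(samples)
        if not scene_any_hit(point, cosine_sample_hemisphere(normal), max_dist)
    )
    return unoccluded / samples
```

Because the same query shape serves AO, shadows and GI probes, a generic accelerator pays off across the whole engine rather than in one renderer feature.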

The same goes for tensor cores: Sony have been investing in sparse rendering resolvers and checkerboard rendering (CB), and it would equally help PSVR foveated and non-planar rendering too.
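For anyone unfamiliar with the CB idea, here's a rough illustration (a toy spatial resolve only, not Sony's actual technique): shade half the pixels in a checkerboard pattern each frame and reconstruct the holes from their neighbours. Real resolvers also reproject the previous frame using motion vectors, and that reconstruction step is the sort of work tensor-style hardware could take over.

```python
# Toy checkerboard-rendering sketch: shade only half the pixels each frame,
# then fill the unshaded pixels from their horizontal neighbours. Real CB
# resolvers also use the previous frame plus motion vectors; this is only
# the spatial half of the idea.
import numpy as np

def shade_checkerboard(shade_fn, height, width, frame_parity):
    # Shade only pixels where (x + y) % 2 == frame_parity.
    mask = np.fromfunction(lambda y, x: (x + y) % 2 == frame_parity,
                           (height, width), dtype=int)
    img = np.zeros((height, width), dtype=np.float32)
    ys, xs = np.nonzero(mask)
    img[ys, xs] = shade_fn(ys, xs)          # the expensive part, half the pixels
    return img, mask

def resolve_checkerboard(img, mask):
    # Fill each unshaded pixel with the average of its left/right neighbours,
    # which (away from the image border) are shaded by the alternating pattern.
    padded = np.pad(img, 1, mode="edge")
    left, right = padded[1:-1, :-2], padded[1:-1, 2:]
    out = img.copy()
    holes = ~mask
    out[holes] = 0.5 * (left + right)[holes]
    return out
```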

It comes with a big compromise in transistor count per compute block, but on the surface the gain seems worth it. If AMD comes out with something competitive with the 2070, that could end up at a reasonable size and clock on 7nm (or 7nm+) in 2020.
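As a back-of-envelope sanity check on that die-size point: the TU106 area is a public figure, but the 7nm density factor below is my own assumption (shipped 7nm parts have landed somewhere around 1.6-2x effective density over 16/12nm).

```python
# Rough die-size scaling estimate. The TU106 (RTX 2070) area is public;
# density_gain_7nm is an assumed, hedged value, not a quoted spec.
tu106_area_mm2 = 445.0          # TU106 on TSMC 12nm
density_gain_7nm = 1.8          # assumed effective density improvement (~1.6-2x)

equivalent_area_7nm = tu106_area_mm2 / density_gain_7nm
print(f"~{equivalent_area_7nm:.0f} mm^2 for a 2070-class GPU at 7nm")  # ~247 mm^2
```

Roughly 250 mm² of GPU, even with a CPU block added, lands in the same general area class as past console APUs (PS4 Pro / XB1X), so a "reasonable size on 7nm in 2020" looks plausible on paper.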

I might be a bit optimistic, but I don't see next gen without at least some helpers for both GI (be it RT cores or some other idea) and sparse rendering (be it tensor cores or something else).
 
Considering how weak these consoles are for this generation, I’m a bit surprised by the commentary.

I'd rather have weak RT hardware support in 2020 than wait until 2028 for real hardware support.
 
I'm not sure where the idea that these consoles are weak comes from; given the price envelope and available tech, the XB1/PS4 represent good value for the price point (excepting the XB1 at launch - it became a decent offering with the S). Sure, it would have been nice to have a stronger CPU core, but with Bulldozer being the only game in town (Intel doesn't do consoles, Ryzen was unavailable, and ARM was a no-go), 6ish Jaguar cores was a good compromise. I don't think we've crossed that "PC for <$400 beats PS4 Pro" point yet, have we?

If the new boxes get some sort of RT hardware that would be lovely but I don't see it as a deal breaker given the relative immaturity of H/W accelerated RT in general.
 
What's the chance of the next Xbox sticking with Vega? I seem to remember reading somewhere that AMD will have a 7nm Vega.
 
What's the chance of the next Xbox sticking with Vega? I seem to remember reading somewhere that AMD will have a 7nm Vega.

7nm Vega won't be released as a general consumer product. It's for their pro line of cards.

I think it's most likely that Navi will be the basis of next-gen consoles. We know as much for Playstation 5.
 
I think it's most likely that Navi will be the basis of next-gen consoles. We know as much for Playstation 5.

That's what I expect too, but it's that whole "2/3 working on Navi for Sony" thing which made me think about Vega. I still think Intel might be a long shot for Xbox.
 
7nm Vega won't be released as a general consumer product. It's for their pro line of cards.

I think it's most likely that Navi will be the basis of next-gen consoles. We know as much for Playstation 5.
Well, it is not likely for big N ;) Though I really wonder what their options are going to be. Nvidia produces bigger and bigger SoCs that are anything but cheap, and there are no off-the-shelf SoCs in sight that they could use. It is not an emergency for the Switch, but the 3DS is aging and the Switch SoC does not seem to be a good choice.

As for Microsoft, I would not make any bets; IMHO the whole division needs to be rebooted. The XB1X has not had the expected effect, it is obvious they are not sure how to advertise it, they keep talking about upcoming systems, etc. Are they trying to kill it as they killed the decent XB1S? Saying that they are taking the long view is giving them quite some credit. It is a bunch of good communicators sitting on a pile of money and other resources who can barely keep the ship afloat (and arguably aren't).
MSFT needs to reconsider its approach to gaming and do so as a single entity. There are glaring gaps in their offering - the way Windows behaves on a TV comes to mind. You can have a nice gaming mini PC, but the software is not OK.


Sony, well, they are in a pretty good spot now; as in the PS2 era, they can launch pretty much whenever they want as long as the product is competitive from the get-go and backed by solid launch titles.
 
I could see Nintendo asking Qualcomm for a derivative of their SoC family - I like my Switch, but the battery life on that thing is even worse than my 3DS. The key challenge is whether Qualcomm will be bothered to spend the R&D dollars to produce an SDK for a Switch 2 or what have you. They make a lot of money right now just selling chips for Android and supporting those, without having to help build a whole new SDK environment.

I am genuinely unsure how much longer Nvidia are going to be able to keep the SoC division going. In the most recent quarter they announced that they will be supplying DRIVE to Daimler/Bosch for next year, but $123 million in revenue for the last Q makes it < 3% of gross revenue, and it is presumably a bigger drain on R&D than the other three divisions, which can leverage the general GPU work.
 
I agree with you on a more unified general-purpose approach towards rendering (i.e., less latency, a more efficient design, smaller possible shrinks and, of course, more developer friendliness), without the need for specialized logic [cores] outside the primary rendering array. But the question becomes: why did Nvidia go this route?

Don't dismiss the fact that their priority is their professional and other high-margin parts. So the bulk of their R&D is going to be spent on delivering the best performance for those markets (or on creating new, lucrative ones). Creating derivative designs from those products and then finding ways to leverage their technology for the gaming market is going to be a lot cheaper than starting from the ground up with a new design specifically targeting the best performance for games today and over the marketing life of the product. While this doesn't necessarily mean that this route doesn't also deliver the best result for their gaming designs, it is not a given that it was chosen because it was best for gaming.
 