Next Generation Hardware Speculation with a Technical Spin [2018]

Cheap like nothing. Turing 10% smaller than GV100 with 10% fewer units.
The TU102 is a 7.5% smaller GPU than GV100, with 10% fewer SM / Tensor units and 25% fewer ROPs.
40 ROPs of difference is certainly not nothing.
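For reference, a quick back-of-the-envelope check of where those figures come from, using the commonly quoted published specs; which exact chip pairing the 40-ROP figure compares is my reading, not something stated above:

```latex
% Die area: GV100 is ~815 mm^2, full TU102 is ~754 mm^2
\frac{815 - 754}{815} \approx 0.075 \quad (\text{about } 7.5\% \text{ smaller})

% ROPs: full GV100 has 128, full TU102 has 96 (25% fewer),
% while the RTX 2080 Ti's cut-down TU102 exposes 88:
128 - 96 = 32, \qquad 128 - 88 = 40
```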

Especially when you consider that the 2080 Ti's 68 RT units are the bare minimum to achieve those effects at 1080p60.
 
on the Xbox side? Yes, they have marketing rights in the console space:
BFV: https://www.gamespot.com/articles/looks-like-microsoft-has-battlefield-5s-marketing-/1100-6459172/
https://powerup-gaming.com/2018/04/05/microsoft-marketing-exclusive-borderlands-anthem/
  • Borderlands 3
  • Cyberpunk 2077
  • Splinter Cell
  • Battlefield V
  • Madden
  • Anthem
It’s also been claimed that Microsoft marketing will extend to Shadow of the Tomb Raider and The Division 2.
Some of these could be inaccurate (SOTR and Madden are questionable). We Happy Few's developer is an MS-owned studio.

There's more than just the DX team working on DXR. Coalition developers are also there helping out from what I've been able to see thus far.

I thought that the subject at hand was Ray Tracing? And in this case nobody but Nvidia is marketing DXR/RTX support for those games. What does having console marketing rights for those games have to do with what we were talking about? You lost me there.
The Coalition, like all MS 1st-party studios (besides Turn 10 & 343i), is using UE4. They will "freely" get DXR support once it's implemented by Epic later in 2019 (the current internal branch developed with Nvidia is a total hack job done for R&D; UE4.22 will initially only support RT shadows next spring). And all those studios are also PC devs, so once again we can't assume that the console versions of their games will have any kind of RT support just because of this.
 
You can, and you can also do it in hardware and software together. We should be discussing this, instead of shunning off ray tracing completely for consoles.
No-one's shunned it off. Some realistic objections have been presented but weren't addressed with reasonable comebacks, only references to 'raytracing is coming.'

As a pro-RT enthusiast looking forward to the RT'd future, I still have concerns about its economic viability in a console. If RTX was $300 and rendering what we're seeing at 4K30 or 1080p60, we'd be having a very different conversation, with everyone agreeing it was the future based on the data. However, the data so far is that a massive piece of silicon can raytrace, but not amazingly quickly. From that we're left to speculate how things will move, and whether they'll move fast enough that rasterisation won't be able to keep up and a console without RT hardware will date very quickly, particularly with regard to using non-specialist HW to RT so that it's 'good enough'; we already know raytracing as a software solution exists and will continue.
 
I thought that the subject at hand was Ray Tracing? And in this case nobody but Nvidia is marketing DXR/RTX support for those games. What does having console marketing rights for those games have to do with what we were talking about? You lost me there.
My bad. Let me try to do my best.
It's a representation of chumminess. The game developers are coding against DXR, Nvidia releases the drivers that support DXR, and it works. Ultimately the games are being coded against DXR and not some weird Nvidia extension.
Why this is important is down to the nature of DXR: it's a capability flag, so if the hardware supports it the runtime uses the hardware path, and if not it falls back to another path entirely. This operates like any other DirectX feature.
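To make that "flag" concrete, here's a minimal sketch of what the check looks like on the application side in D3D12. The SupportsDXR helper name and the fallback comments are mine, purely illustrative; only the CheckFeatureSupport pattern itself comes from the API.

```cpp
// Minimal sketch: ask the D3D12 runtime whether the device exposes DXR, then pick a path.
#include <windows.h>
#include <d3d12.h>

bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;

    // Tier 1.0 or better means the driver/hardware can service DispatchRays etc.
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

// Elsewhere in the renderer (illustrative only):
//   if (SupportsDXR(device)) { /* build acceleration structures, DispatchRays */ }
//   else                     { /* rasterised / screen-space fallback path      */ }
```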
If you're following me up to here, you need to remember the way MS developed the X1X. They took existing games/code, simulated and profiled their performance, and made modifications to the chip to see the output before they started burning test silicon. This is how they got great performance for hitting 4K without having to do much guesswork or insane amounts of optimization on the developer side of things.

Well, assume they want to do that exact same thing again for their next console. They can already profile 4K performance; X1X games provide all the data points they need there. But they don't have any real DXR games to test against... now they will. That DXR code can stay in the console versions because it'll just be ignored, but when running that code in the simulation it doesn't have to be.

They can then leverage it and profile accordingly.
That is, of course, if they were planning to have RT in a console.

all speculation though. please ignore.
 
I think pretty much everyone agrees (some form of) RT is the future; the question is how far in the future: next gen, or the gen after.

The problem is that we have no idea what AMD has lined up:
  • Fully programmable and flexible
  • Fixed, single-use function
  • Separate accelerator
  • Nothing at all

We all know that NVidia won't be in either the next PS or Xbox.
Will AMD's implementation be like NVidia's? Who knows.
But if the next gen does have RT, it will be used as long as its net effect isn't negative (which would be a huge cockup).
If it's usable, it will be a very good thing, as it will make RT a reasonable baseline for games sooner.

We should all be blaming AMD for us not knowing anything about their lineup :yep2:
 
Well, some of us don't believe nVidia's solution is at all optimal, being based on their AI work rather than a dedicated push for realtime RT for computer games. Fixed-function denoising or compute-based intelligent denoising (using fat buffers instead of machine-learned 2D image comparison) could be a lot more efficient, saving on the Tensor cores. Ray intersect tests sound like something that could be added to compute units as a block?? Then you have the memory access, which might just need some advanced cache thingy (possibly a thingie instead).
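As a rough illustration of the "compute-based intelligent denoising using fat buffers" idea, here's a minimal CPU sketch of a cross-bilateral filter over a noisy 1 spp image, where per-pixel normals and depth gate the blur so it doesn't cross geometric edges. The function name, buffer layout and filter weights are all illustrative assumptions, not any particular shipping denoiser.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// noisy:   1-sample-per-pixel radiance
// normals: per-pixel shading normals (unit length)
// depth:   per-pixel linear depth
// All buffers are width*height, row-major ("fat buffer" inputs from the G-buffer).
std::vector<Vec3> bilateralDenoise(const std::vector<Vec3>& noisy,
                                   const std::vector<Vec3>& normals,
                                   const std::vector<float>& depth,
                                   int width, int height, int radius = 3)
{
    std::vector<Vec3> out(noisy.size());
    const float sigmaSpatial = 2.0f;   // assumed filter widths
    const float sigmaDepth   = 0.05f;

    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            const int c = y * width + x;
            Vec3 sum{0.0f, 0.0f, 0.0f};
            float wSum = 0.0f;

            for (int dy = -radius; dy <= radius; ++dy) {
                for (int dx = -radius; dx <= radius; ++dx) {
                    const int nx = x + dx, ny = y + dy;
                    if (nx < 0 || ny < 0 || nx >= width || ny >= height) continue;
                    const int n = ny * width + nx;

                    // Spatial falloff.
                    float w = std::exp(-(dx*dx + dy*dy) / (2.0f * sigmaSpatial * sigmaSpatial));
                    // Down-weight samples whose normal disagrees (edge in orientation).
                    float nDot = std::max(0.0f, dot(normals[c], normals[n]));
                    w *= nDot * nDot;
                    // Down-weight samples at a different depth (edge in depth).
                    float dz = depth[c] - depth[n];
                    w *= std::exp(-(dz*dz) / (2.0f * sigmaDepth * sigmaDepth));

                    sum.x += noisy[n].x * w;
                    sum.y += noisy[n].y * w;
                    sum.z += noisy[n].z * w;
                    wSum  += w;
                }
            }
            out[c] = wSum > 0.0f ? Vec3{sum.x / wSum, sum.y / wSum, sum.z / wSum} : noisy[c];
        }
    }
    return out;
}
```

The same structure maps naturally onto a compute shader, which is the point: it needs shader ALUs and G-buffer reads, not Tensor cores.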
 
Well, some of us don't believe nVidia's solution is at all optimal, being based on their AI work rather than a dedicated push for realtime RT for computer games. Fixed-function denoising or compute-based intelligent denoising (using fat buffers instead of machine-learned 2D image comparison) could be a lot more efficient, saving on the Tensor cores. Ray intersect tests sound like something that could be added to compute units as a block?? Then you have the memory access, which might just need some advanced cache thingy (possibly a thingie instead).
From a strictly selfish point of view, I find that a far more interesting conversation:
other possible implementations for use in a console, etc.
I.e. if Sony or MS said 'we want some form of RT acceleration', how could they go about it without having to worry about other market considerations like pro cards, etc.
 
Well, some of us don't believe nVidia's solution is at all optimal, being based on their AI work rather than a dedicated push for realtime RT for computer games. Fixed-function denoising or compute-based intelligent denoising (using fat buffers instead of machine-learned 2D image comparison) could be a lot more efficient, saving on the Tensor cores. Ray intersect tests sound like something that could be added to compute units as a block?? Then you have the memory access, which might just need some advanced cache thingy (possibly a thingie instead).

Yes. RT, when it is commercially feasible as a mass-market consumable (in products below 400 USD) and fast enough at those price targets to be a net benefit, is unlikely to look anything like NV's RTX series.

I.e. neither AMD's nor NV's future DXR-performant, consumer-focused accelerators are likely to be similar to RTX. Ike Turner is likely correct in that this was NV trying to find a way to make their pro-market GPU appealing to PC gaming enthusiasts.

Heck, as early as NV's next consumer GPU release we might see something wildly divergent from RTX from them, with fewer fixed-function units. RTX could just as well be NV's backup plan to release something with node transitions coming slower than they might like (i.e. a proper next-gen consumer product may have been reliant on earlier availability of a smaller silicon node).

I wouldn't take RTX as some predictor or foreshadowing of how future GPUs with RT support will look.

Regards,
SB
 
I agree. If there are mid-gen upgraded refresh consoles, they certainly wouldn't be made for the purpose of doing 8K (like PS4 Pro & X1X were, to begin to push 4K).
Rather, they'd be used to warm up developers on raytracing, several years before PS6/XB3.

Now that you mention 8K, I guess that won't be the target. If not RT, then maybe 60 FPS, where the next-gen Pro and X have a heavily improved CPU and not as much of an improved GPU? That kinda sounds less likely unless VR really takes off, where higher framerates matter; plus it seems to be a rabbit hole that console makers don't usually push for, since improved graphics per generation are preferred for easier marketing.
 
I see people being cautious until there is corroborating evidence that RT is the general trend that makes market sense to support.

Currently the market for RT games is less than 1% of all PC gamers. Will it increase in the next 3 years? Absolutely. Will it be over the threshold to release RT-only games in 5 years? Highly improbable. So the general trend for released games in the next 5 years is non-RT.
Hybrid rendering will be dominant for at least the next decade.

I think pretty much everyone agrees (some form of) RT is the future; the question is how far in the future: next gen, or the gen after.

The problem is that we have no idea what AMD has lined up:
  • Fully programmable and flexible
  • Fixed, single-use function
  • Separate accelerator
  • Nothing at all

We all know that NVidia won't be in either the next PS or Xbox.
Will AMD's implementation be like NVidia's? Who knows.
But if the next gen does have RT, it will be used as long as its net effect isn't negative (which would be a huge cockup).
If it's usable, it will be a very good thing, as it will make RT a reasonable baseline for games sooner.

We should all be blaming AMD for us not knowing anything about their lineup :yep2:
My bet is on fixed function. For the first gen, what matters is speed and ease of adoption. Flexibility comes later.

Well, some of us don't believe nVidia's solution is at all optimal, being based on their AI work rather than a dedicated push for realtime RT for computer games. Fixed-function denoising or compute-based intelligent denoising (using fat buffers instead of machine-learned 2D image comparison) could be a lot more efficient, saving on the Tensor cores. Ray intersect tests sound like something that could be added to compute units as a block?? Then you have the memory access, which might just need some advanced cache thingy (possibly a thingie instead).
OTOY uses AI denoising as well. RTRT and denoising go hand in hand. AI denoising makes use of fat buffers too.
 
Now that you mention 8K, I guess that won't be the target.

If not RT, then maybe 60 FPS, where the next-gen Pro and X could have heavily improved CPUs and not as much of an improved GPU compared to the PS4 Pro and 1X.

That could be unlikely unless VR really takes off, where higher framerates matter more; plus chasing a higher framerate standard seems to be a rabbit hole that console makers might not want to go into yet, since much better graphics are preferred for easier marketing of a new generation.

Yikes.

Edited.
 
Professional programs called "game engines"?
Yes, even game engines, but not actually to get it into the final product; rather to see how e.g. lighting conditions behave in real-time at design-time. RT just makes it easier to get the optimal settings, but this is only for design-time.
There are other hybrid "RT technologies" that are already in use on current consoles. We could call them RT, since Nvidia promotes RTX as RT when it really is just a hybrid solution. There may be some lightweight hardware in the next consoles that might be used for something like lighting conditions, but nothing for reflections etc. like we've seen in the BF V demos; that is just too compute-intensive. But this technology can be used to get e.g. better "textures" for screenspace-reflection-like effects at design time.
Yes, MS made an API for it, but that doesn't mean it will be used for games on consumer devices. MS makes many APIs, and right now RT(X) is just a buzzword that needs to be used everywhere, just like VR in the last 2 years, or like MS needed to promote cloud compute for the XB1.
 
So:
  • Phil Spencer going on the record at E3 and mentioning ray tracing for the next Xbox platform
  • Microsoft launching DXR
  • NVIDIA doing RTX
  • AMD saying they are going to do ray tracing too
  • Intel officials excited about ray tracing, and probably integrating it into their dGPUs too, considering their history with ray tracing on Larrabee
  • Major developers making demos and playing with ray tracing in major engines
  • Most developers speaking enthusiastically about doing ray tracing

All of that is not enough to convince some that rasterization has reached its limits, and that the general trend in the industry is to use ray tracing to advance real-time graphics?
Why on God's green earth should that mean that "rasterisation has reached its limits"?
It simply means that ray tracing is the new buzzword to try to sell new stuff to the yokels. Like VR or 3D, or... Surely you've seen this over and over through the years?
Ray tracing has the same issue today as it always had: efficiency. And it actually has a harder battle to fight now, since raster shader approaches do a decent job today compared with, say, fifteen years back. The fact that the prospects for lithographic advances are pretty grim, and that computing has emphatically gravitated towards mobile devices, isn't helping its case either.
The proof of the pudding will be in its eating. If ray tracing approaches will produce a better result than spending the same resources elsewhere, then it will have a case. Otherwise not, particularly in gaming which is all about providing entertainment value.
 
Yes, even game engines, but not actually to get it into the final product; rather to see how e.g. lighting conditions behave in real-time at design-time. RT just makes it easier to get the optimal settings, but this is only for design-time.
There are other hybrid "RT technologies" that are already in use on current consoles. We could call them RT, since Nvidia promotes RTX as RT when it really is just a hybrid solution. There may be some lightweight hardware in the next consoles that might be used for something like lighting conditions, but nothing for reflections etc. like we've seen in the BF V demos; that is just too compute-intensive. But this technology can be used to get e.g. better "textures" for screenspace-reflection-like effects at design time.
Yes, MS made an API for it, but that doesn't mean it will be used for games on consumer devices. MS makes many APIs, and right now RT(X) is just a buzzword that needs to be used everywhere, just like VR in the last 2 years, or like MS needed to promote cloud compute for the XB1.
Screen space reflections are garbage and should disappear as soon as possible.

The rest of your post is just denial.


Why on God's green earth should that mean that "rasterisation has reached its limits"?
It simply means that ray tracing is the new buzzword to try to sell new stuff to the yokels. Like VR or 3D, or... Surely you've seen this over and over through the years?
Ray tracing has the same issue today as it always had: efficiency. And it actually has a harder battle to fight now, since raster shader approaches do a decent job today compared with, say, fifteen years back. The fact that the prospects for lithographic advances are pretty grim, and that computing has emphatically gravitated towards mobile devices, isn't helping its case either.
The proof of the pudding will be in its eating. If ray tracing approaches will produce a better result than spending the same resources elsewhere, then it will have a case. Otherwise not, particularly in gaming which is all about providing entertainment value.
Except that, thanks to reconstruction techniques, ray tracing has taken a massive leap forward in recent years, finally making it viable for realtime use:

 
The rest of your post is just denial.

Is there really any need to be so rude?

Every engine worth its salt had global illumination in from before this generation began, and we can count the number of globally illuminated console games on one hand. One deformed hand, missing several fingers, at that.

There's every chance that RTRT will see the same fate next generation: global illumination becomes the norm, and the occasional game - of a similar scope to The Tomorrow Children or Driveclub - knocks off everyone's eye-socks.

So, before getting so salty over some rays, please just bear in mind that your arguments were applicable to GI only a few years ago, except that RTRT is a less known quantity.
 
RT may be a mid-gen thing for the next console cycle. The current mid-gen's main selling point was 4K, which leads me to believe they'll want an obvious selling point to check off for the next mid-gen refresh. With diminishing returns already somewhat apparent going from 1080p to 4K, and even more apparent when we go beyond that, I could see RT as a great reason to upgrade to a mid-gen console. Also, I'm not sure AMD or NVIDIA has tech that's ready or that makes sense for a console APU launching in 2019/20, but I'm certainly not an expert on that.
 
Except that, thanks to reconstruction techniques, ray tracing has taken a massive leap forward in recent years, finally making it viable for realtime use:
How does that equate to rasterisation having met its limits?

[attached image: Image1.jpg]

Yay, raytracing's gonna solve all our shadowing problems. :p

Kidding aside, raytracing is reliant on hacks to accelerate it, so the ideal, perfect renderer remains a ways off. Furthermore, that video shows the pursuit of really low-level raytracing, before RTX existed. Good quality lighting is being achieved with one sample per pixel, which is in the realm of doable as RT on compute in a next-gen console. If adding RT acceleration structures is cost-effective in silicon, their inclusion makes sense; but if they require considerable compromise of raw shader power, they could potentially be left out without games suffering too much, while maintaining maximum flexibility.
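For a sense of what "RT on compute" versus a fixed intersection block actually boils down to, here's a minimal Möller–Trumbore ray/triangle test, the innermost kernel that either approach ends up executing millions of times per frame. The names are mine and BVH traversal and shading are omitted; it's a sketch, not anyone's shipping implementation.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
static float dot(Vec3 a, Vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Returns true and writes the hit distance t if the ray (orig, dir) hits triangle (v0, v1, v2).
bool rayTriangle(Vec3 orig, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2, float& t)
{
    const float eps = 1e-7f;
    Vec3 e1 = sub(v1, v0);
    Vec3 e2 = sub(v2, v0);
    Vec3 p  = cross(dir, e2);
    float det = dot(e1, p);
    if (std::fabs(det) < eps) return false;   // ray parallel to the triangle plane
    float invDet = 1.0f / det;

    Vec3 s = sub(orig, v0);
    float u = dot(s, p) * invDet;             // first barycentric coordinate
    if (u < 0.0f || u > 1.0f) return false;

    Vec3 q = cross(s, e1);
    float v = dot(dir, q) * invDet;           // second barycentric coordinate
    if (v < 0.0f || u + v > 1.0f) return false;

    t = dot(e2, q) * invDet;                  // distance along the ray
    return t > eps;
}
```

It's a handful of cross products, dot products and compares, which is why it's plausible either as a small fixed block next to the CUs or as plain shader code if the transistor budget is better spent elsewhere.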
 