AMD Radeon RDNA2 Navi (RX 6500, 6600, 6700, 6800, 6900 XT)

The argument is quite simple. WDL gets to 50s FPS on a 3080 with DLSS. The "today's cards will get higher FPS with RT in the future as the tech gets optimized" narrative could only be true if games didn't progress in any way, which we know won't be the case. Games like WDL and CP2077 will be the baseline for next-gen RT performance.



You said that consoles will be the baseline and that tech will be developed for them, but you talk about tech that those consoles don't have... All of that tech is in fact quite impressive, but we don't have it yet. How many games use DLSS 2.0?

I think you're missing the point here. A 3080 will be good in 2 years to play games with rasterization, but RT will get more and more complex, with more and more effects. Even with DLSS you won't have the performance to use most of those effects. So the "future proof" RT won't happen.

Console RT capability isn't changing, and so far it seems even an RTX 2060 is capable of PS5/XSX RT.

So while ultra RT settings may become more demanding, console equivalent settings should always be playable on any GPU with hardware RT in the PC space.
 
My view is that DirectStorage could relieve memory pressure to some extent. Another thing is that 2 years from now we will likely still game in 4K. If framebuffer resolution doesn't increase, would individual frames really require more textures/memory to render than is used today? If it's textures changing over multiple frames/locations in the game, streaming should solve that by pulling in the needed textures. Those items come to mind even before considering that perhaps 2 years from now a 3080-level card is no longer 4K capable, i.e. runs at less than 60 fps maxed out. At that point a DLSS-like solution would be really nice to have to extend the life of the card, the alternative being lowering settings to get good fps.
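The streaming idea above can be sketched as a residency cache: keep a fixed VRAM budget and evict the least-recently-used textures when the current view pulls in new ones. This is a hypothetical minimal sketch, not any engine's actual streamer; the class, names, and sizes are all made up for illustration.

```python
# Toy texture-streaming residency cache with an LRU eviction policy.
# Everything here is illustrative; real streamers work on tiles/mips.

from collections import OrderedDict

class TextureStreamer:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()   # texture name -> size in MB, LRU order

    def request(self, name, size_mb):
        """Called each frame for every texture the current view needs."""
        if name in self.resident:
            self.resident.move_to_end(name)   # mark as recently used
            return
        # Evict least-recently-used textures until the new one fits.
        while sum(self.resident.values()) + size_mb > self.budget_mb and self.resident:
            self.resident.popitem(last=False)
        self.resident[name] = size_mb         # "upload" the texture

streamer = TextureStreamer(budget_mb=100)
for tex in [("rock", 40), ("grass", 40), ("rock", 40)]:
    streamer.request(*tex)
streamer.request("building", 40)              # over budget: evicts "grass"
print(list(streamer.resident))                # ['rock', 'building']
```

The point of the sketch is that resident memory stays bounded by the budget no matter how large the game world is, which is why streaming (plus something like DirectStorage to make the pull-ins fast) can keep per-frame VRAM needs roughly flat at a fixed output resolution.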

I don't believe Hardware Unboxed is biased. However, they are conservative and go with what happened in the past instead of considering whether there are new factors in play, like DirectStorage and rendering resolution stagnating around 4K.

UE5 will be interesting: will it use a ton of memory, or is it going to be more streaming and compute that makes the difference?

edit. It's kind of funny, though, how they think ray tracing performance sucks and doesn't matter. On that point I don't agree at all, and DLSS 2.0 and its future iterations would also probably beg to differ. It would have been nice for them to note that DLSS 2.0 can give performance and memory back, in case the user likes that option to bring performance back up.

edit2. It will be super interesting to see how Cyberpunk 2077 turns out, and whether Hardware Unboxed is forced to walk back some of their talking points.
 
Console RT capability isn't changing, and so far it seems even an RTX 2060 is capable of PS5/XSX RT.

So while ultra RT settings may become more demanding, console equivalent settings should always be playable on any GPU with hardware RT in the PC space.
I wouldn't be so quick to judge off a single cross-gen game by a studio that has never put out a technically competent title. Remember the initial "a 750 Ti is plenty to keep up with a PS4" sentiment?
 
I think worries about VRAM limits come into play at 4K, but are likely not a big issue if you have a 1440p screen, and zero issue with a 1080p screen.

@manux Hardware Unboxed is currently going through their first console cycle. Maybe those guys have been around before that with other outlets, but this is the first console generation they've been through since they started their channel. System requirements tend to jump up when the new consoles hit and then things stabilize. Any PC that people buy now is not going to get worse relative to the new consoles over time. They'll likely keep the same relative performance difference. Where PS5 and Series X run something at 30fps, the 6800XT will probably run it at 60+ fps. Unless they think ray tracing is a dead feature on the consoles, the likelihood that it won't be important in some capacity is very small.
 
I think worries about VRAM limits come into play at 4K, but are likely not a big issue if you have a 1440p screen, and zero issue with a 1080p screen.

@manux Hardware Unboxed is currently going through their first console cycle. Maybe those guys have been around before that with other outlets, but this is the first console generation they've been through since they started their channel. System requirements tend to jump up when the new consoles hit and then things stabilize. Any PC that people buy now is not going to get worse relative to the new consoles over time. They'll likely keep the same relative performance difference. Where PS5 and Series X run something at 30fps, the 6800XT will probably run it at 60+ fps. Unless they think ray tracing is a dead feature on the consoles, the likelihood that it won't be important in some capacity is very small.

This likely only applies to AMD GPUs. There is plenty of historical precedent that this will not be the case for Nvidia.
 
This likely only applies to AMD GPUs. There is plenty of historical precedent that this will not be the case for Nvidia.

History proved that cards with feature sets comparable to the consoles are going to age well. Nvidia had huge troubles in the past, especially with DX12 and stuff like async compute, but those are gone now.

DirectStorage, Sampler Feedback, VRS, Mesh Shaders, and DXR 1.1 are all supported on Nvidia. It's easy to see that Nvidia cards from Turing and up will age well. Same with RDNA2: that architecture will age well too despite the lower RT performance compared to the green team; it is still very sufficient.

A card like the 5700 XT will age horribly, though.
 
Unless they think ray tracing is a dead feature on the consoles, the likelihood that it won't be important in some capacity is very small.

Quite the opposite. They think it will get more complex, so current hardware won't be able to deal with it.

About DLSS: yes, it's a very good answer, but again, how many games actually use it? It doesn't matter if you have a killer feature if only a couple of games actually make use of it. As many of you said, consoles will determine a lot of the new tech and techniques in next-gen games. Well, consoles don't have DLSS. Scaling RT effects to meet PC performance is one thing; adopting a completely different technology like DLSS is quite another (do we even know how much Nvidia asks for the "privilege" of having it?). I think DLSS will be the future, but how long until it is widely adopted?
 
History proved that cards with feature sets comparable to the consoles are going to age well. Nvidia had huge troubles in the past, especially with DX12 and stuff like async compute, but those are gone now.

DirectStorage, Sampler Feedback, VRS, Mesh Shaders, and DXR 1.1 are all supported on Nvidia. It's easy to see that Nvidia cards from Turing and up will age well. Same with RDNA2: that architecture will age well too despite the lower RT performance compared to the green team; it is still very sufficient.

A card like the 5700 XT will age horribly, though.
The 5700 XT that's performing at 2080-to-2080 Ti level in almost every next-gen game we have? Maxwell supported a feature set greater than the consoles' and isn't aging too well. We still don't know how performant these new DX12U features are between the architectures.
 
I wouldn't be so quick to judge off a single cross-gen game by a studio that has never put out a technically competent title. Remember the initial "a 750 Ti is plenty to keep up with a PS4" sentiment?

I'm basing my assumption on 6800 (XT) RT performance, which is not impressive. The console GPUs are not as wide (fewer TMUs) and logically will not be as capable, so baseline RT performance for the next however many years is going to be low.
 
The 5700 XT that's performing at 2080-to-2080 Ti level in almost every next-gen game we have? Maxwell supported a feature set greater than the consoles' and isn't aging too well. We still don't know how performant these new DX12U features are between the architectures.

What next-gen games? We just have cross-gen games, most of them AMD-optimized, and quite frankly I don't see how a 5700 XT can realistically come close to the 2080 Ti, other than the possibility that these games were not optimized for the green team at all.

Godfall is the only "next-gen game", but even that is not using any DX12U features aside from DXR and VRS (and right now only on the new AMD hardware).

Once real next-gen games hit that use the whole DX12U feature set, the 5700 XT will struggle heavily, and frankly it already does, because it's incapable of rendering ray tracing in games that benefit heavily from it, like Control, Metro, and the upcoming Cyberpunk. Hell, once games use RT for their lighting, like RTXGI, which is very cost-efficient, games straight up won't boot on this card anymore, because AMD didn't even bother to enable DXR emulation support. It's truly a lost generation and I'm sorry for everyone who bought it. But you get what you pay for; it does deliver good performance for the price, right now.
 
AMD owes a lot of people 10 bucks... I don't know why companies can't just be clear and say "look, due to COVID and high demand we will have low stock until next year" instead of legally lying about it...

Do we know if the CP trailers use RT? If they don't, I don't know whether I like the fact that it makes so little difference or not.

 
Well, they look like a screen-space effect, in that the shadow for an object will not appear until the object appears on screen, and the shadow will disappear if the object moves off screen. It seems to have the same issue as screen-space reflections. As for how they're doing it, I don't know.

Maybe they use screen-space shadows to complement RT shadows, like in Days Gone or Demon's Souls, where they are used alongside shadow maps (and in DS, shadow maps plus capsule shadows).
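That screen-space limitation can be shown with a toy example: a shadow ray marched through the depth buffer can only test against geometry that is actually on screen, so an occluder cropped out of the frame stops casting its shadow. This is a hypothetical 1D sketch of the general technique, assuming a simple fixed-step march; it is not how any particular game implements it.

```python
# Minimal 1D sketch of why screen-space shadows miss off-screen occluders.
# The depth buffer only stores what's on screen, so a ray marched through
# it can never "see" an occluder outside the frame. All values illustrative.

def screen_space_shadow(depth_buffer, pixel_x, pixel_depth, light_dir_x, steps=16):
    """March from a pixel toward the light in screen space.

    Returns True if an on-screen depth sample occludes the pixel.
    """
    x = float(pixel_x)
    z = pixel_depth
    for _ in range(steps):
        x += light_dir_x          # step across the screen toward the light
        z -= 0.1                  # the ray rises toward the light in depth
        ix = int(round(x))
        if ix < 0 or ix >= len(depth_buffer):
            return False          # ray left the screen: occluder invisible
        if depth_buffer[ix] < z:  # something on screen blocks the ray
            return True
    return False

# A wall at x=2 (depth 0.2) shadows the pixel at x=5 (depth 0.8)...
on_screen = [1.0, 1.0, 0.2, 1.0, 1.0, 1.0]
print(screen_space_shadow(on_screen, 5, 0.8, -1))   # True

# ...but if the camera pans so the wall is cropped out of the frame,
# the shadow vanishes even though the wall still exists in the world.
panned = on_screen[3:]
print(screen_space_shadow(panned, 2, 0.8, -1))      # False
```

RT shadows don't have this failure mode, since the BVH contains world geometry regardless of visibility, which is why mixing cheap screen-space shadows with RT or shadow-map shadows (as described above) is a plausible combination.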

 
I can see the argument that the consoles will last six years and the relative performance of the 6800 and 3080 won't change during those six years. These cards should easily last for six years if you play at console settings, which will include ray tracing. You just can't expect to run at ultra settings for six years.
It's overall a good argument, but it has many weak points. E.g. you can't play on PC at console settings, because PC games don't expose such settings (a console setting may be lower than anything available on PC). Adaptive resolution is used more on consoles (and will be used even more in the future), but isn't usually used by PC gamers, especially high-end gamers. Several FPS-adaptive techniques are not available on PC, and their utilisation on consoles will rise. Future console games will be more tightly optimized for the particular hardware, because developers will have more experience with it, but the same won't be true in the PC space, where current hardware will soon be obsolete and games will be optimized mostly for newer generations. Anyway, users of high-end hardware don't usually keep it for long periods of time to game on with the lowest quality settings. One example for all: the GeForce GTX 680 was considered more powerful than the Xbox One or PlayStation 4. Today there are games which run OK on those consoles but don't run acceptably on a GeForce GTX 680, even at the same resolution.
 
Not sure about the other European countries, but it looked like there were around 200 reference GPUs in total for the 6800 XT here in Russia. I managed to place an order, but it was cancelled half an hour later. Looks like there won't be any non-AIB cards for quite a while, if they ever appear again.
 
There were cards on sale again yesterday on the AMD site. Gone in minutes, but they were there. In the meantime, some users in Italy are receiving their cards these days, most of them next Friday.
 
Can't blame them for not being able to meet demand if it's not physically possible to produce more, but the messaging? Three weeks telling us they were so ready, so much better prepared than NVIDIA... I guess the simultaneous launch of Ryzen 5000, RDNA2 cards, XSX/S, and PS5 has proven too much, and I guess they've known this since probably July.
 