I didn't say anything. You said "the Turing alternative", which is a 2060 Super in this case, not a 2080 Ti.
I'd imagine that anything below a 2080 will be too slow for RT in AW2, but then I haven't seen how the game scales yet.
> In price but not in specs. The 5700 XT has quite a fillrate and flops advantage.
>
> 5700 XT vs 2060 Super vs 2070 Super
> Fillrate (GPixel/s): 122 vs 101 vs 113
> Bandwidth (GB/s): 448 vs 448 vs 448
> TFLOPS (FP32): 9.8 vs 7.2 vs 9.1

It really tells something when people have to ignore price to justify comparing product y to x instead of z to paint a picture fitting their view.
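For what it's worth, the throughput figures quoted above follow from public shader/ROP counts and boost clocks. A quick sketch (spec values taken from public GPU databases; small deltas versus the quoted numbers depend on which clock you assume):

```python
# Theoretical throughput from public specs (boost clocks in GHz, assumed).
# FP32 TFLOPS = 2 * shader_count * clock; pixel fillrate = ROPs * clock.
specs = {
    "5700 XT":    {"shaders": 2560, "rops": 64, "clock_ghz": 1.905},
    "2060 Super": {"shaders": 2176, "rops": 64, "clock_ghz": 1.650},
    "2070 Super": {"shaders": 2560, "rops": 64, "clock_ghz": 1.770},
}

for name, s in specs.items():
    tflops = 2 * s["shaders"] * s["clock_ghz"] / 1000
    fillrate = s["rops"] * s["clock_ghz"]  # GPixel/s
    print(f"{name}: {tflops:.1f} TFLOPS, {fillrate:.0f} GPixel/s")
```

Same ROP count across all three cards, so the fillrate gap is purely a clock-speed gap; the TFLOPS gap between the 2060 Super and the other two is mostly the shader-count deficit.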
> I'm definitely seeing some desperation here, just not from them...

They literally stuck themselves into a corner in the conclusion of the video, resorting to mind-bending justifications as lame as "the 2060 Super can't play the game too!", or "DLSS doesn't matter at 1080p", or "DLSS in general wasn't a good advantage for the 2060 Super over its life span!"
> No, what you did was dishonestly twist everything they've said in order to make them sound unreasonable. Heck, the claim that they were down on DLSS2 early on is a straight-up lie.

Again, no. The latest example of them downplaying the DLSS advantage is loud and clear here. They still refuse to say they were wrong about the 5700 XT being better than the 2070 Super on the basis of lacking DLSS, and they still refuse to do it in Alan Wake 2, claiming that DLSS at 1080p is useless even if it's vastly better than FSR.
> Price is the first thing determining what competes with what.

Disagree; NVIDIA cards command a premium over AMD GPUs, especially since the introduction of RTX, on the basis of offering more features and vastly more RT performance.
It really tells something when people have to ignore price to justify comparing product y to x instead of z to paint a picture fitting their view
Price is the first thing determining what competes with what
> What view is that exactly? And what part of my post do you disagree with?

Not disagreeing with your post. It was directed as commentary to someone saying the 5700 XT compete(d) against the 2070S rather than the 2060S.
> Disagree; NVIDIA cards command a premium over AMD GPUs, especially since the introduction of RTX, on the basis of offering more features and vastly more RT performance.

That's just a ridiculous view on any front. If they price something at 500, its first competition is others priced about 500, not some card priced notably cheaper (or more expensive, for that matter), regardless of your personal views on some specific features.
> Alan Wake 2 presents a disaster case for HardwareUnboxed: they championed RDNA1 over Turing. Now here comes a game that not only has bad performance on RDNA1 due to lacking DX12U features, but also bad image quality due to relying on FSR2. The Turing alternative, meanwhile, suffers none of this, has superior DLSS upscaling quality, and has access to ray tracing and path tracing.

The fact that four years after the launch of those GPUs there's only one (?) game that massively benefits from mesh shaders says to me that the lack of mesh shaders in the 5700 XT had close to zero significance.
I mean, what settings are they using? Why not lower settings? Do they have post-processing set to high or low? Using anything but low there is an extreme waste of performance on a low-tier GPU.
The fact that four years after the launch of those GPUs there's only one (?) game that massively benefits from mesh shaders says to me that the lack of mesh shaders in the 5700 XT had close to zero significance.
> The fact that four years after the launch of those GPUs there's only one (?) game that massively benefits from mesh shaders says to me that the lack of mesh shaders in the 5700 XT had close to zero significance.

If you sum these up and add in the fact that DLSS has been there since 2018 ('19 for v2+), you may see why the original HUB stance on RDNA1 seems quite a bit shaky at the moment.
Ray tracing support, on the other hand, is a feather in the cap of the 2060S, but not because of Alan Wake 2. That GPU just isn't powerful enough. I don't know what "access to path tracing" even means here... using it for photo mode? Wow... big win.
Yep the ship has sailed. For it to matter there would need to be a few super popular games in the next year using mesh shaders at playable frame rates on 5 year old cards. Unlikely to happen.
The mesh shader situation makes the progress on raytracing in the same timeframe that much more amazing.
> Mesh shaders are a core feature in the sense that they're a core part of the pipeline. I don't think there are automatic fallbacks. If you want to support mesh shaders, it's a new geometry pipeline. If you want to run on cards that support mesh shaders and also cards that don't, you need to engineer two geometry pipelines. It's a lot of work to do twice. Most developers are not going to do that. Basically they took the approach of waiting until there was enough market coverage and then making the switch.
>
> Ray tracing obviously requires a ton of engineering too, but the fact that you can choose to just do AO, or shadows, or certain reflections allowed developers to ease in.

RT is the same. You need to make two versions of some approach if you want to be compatible with both RT and non-RT h/w.
> If you sum these up and add in the fact that DLSS has been there since 2018 ('19 for v2+), you may see why the original HUB stance on RDNA1 seems quite a bit shaky at the moment.

Yes, DLSS has certainly been an advantage too. However, using mesh shaders as an argument is grasping at straws. Not sure where the need to make arguments as desperate as that comes from... Just sounds like bad marketing, tbh.
> Mesh shaders are a core feature in the sense that they're a core part of the pipeline. I don't think there are automatic fallbacks. If you want to support mesh shaders, it's a new geometry pipeline. If you want to run on cards that support mesh shaders and also cards that don't, you need to engineer two geometry pipelines. It's a lot of work to do twice. Most developers are not going to do that. Basically they took the approach of waiting until there was enough market coverage and then making the switch.

But they (Remedy) didn't make the switch; AW2 works without mesh shaders too, despite the initial wrong information.
> However, using mesh shaders as an argument is grasping at straws.

Is it? The game's unplayable (or close to it) on the 5700 XT because of that. Seems like a pretty big deal. A bigger one than RT, I'd say, since you can't just turn mesh shaders off and get the usual performance back.
Mesh shaders are a core feature in the sense that they're a core part of the pipeline. I don't think there are automatic fallbacks. If you want to support mesh shaders, it's a new geometry pipeline. If you want to run on cards that support mesh shaders and also cards that don't, you need to engineer two geometry pipelines. It's a lot of work to do twice. Most developers are not going to do that. Basically they took the approach of waiting until there was enough market coverage and then making the switch.
Ray tracing obviously requires a ton of engineering too, but the fact that you can choose to just do AO, or shadows, or certain reflections allowed developers to ease in.
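To make the "two pipelines" point concrete, here's a minimal, purely illustrative sketch (all names hypothetical, not Remedy's code): an engine checks mesh-shader support at startup, e.g. via D3D12's CheckFeatureSupport reporting a MeshShaderTier, and then has to maintain a separate stage list and culling strategy for each path.

```python
# Illustrative sketch only: why supporting both paths doubles the work.
# Every geometry feature needs a mesh-shader implementation AND a classic
# vertex-shader fallback. All names here are hypothetical.

def build_geometry_pipeline(supports_mesh_shaders: bool) -> list[str]:
    """Pick the stage list from a capability check (e.g. D3D12's
    MeshShaderTier query). Two code paths = two pipelines to maintain."""
    if supports_mesh_shaders:
        # Modern path: meshlets with GPU-driven per-meshlet culling.
        return ["amplification_shader", "mesh_shader", "pixel_shader"]
    # Legacy path: same content, rebuilt for the classic geometry stages.
    return ["input_assembler", "vertex_shader", "rasterizer", "pixel_shader"]

# A Turing/RDNA2-class GPU and a 5700 XT would get entirely different paths:
print(build_geometry_pipeline(True))
print(build_geometry_pipeline(False))
```

The branches share nothing: assets (meshlets vs. index buffers), shaders, and culling logic all diverge, which is why an engine that only built the first branch simply can't run on pre-DX12U hardware without extra engineering.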
> Is it? The game's unplayable (or close to it) on the 5700 XT because of that.

Yes, it absolutely is. Again, it's one game four years after the GPUs launched. It changes nothing in the big picture when looking back at recommendations between the 2060S and 5700 XT dating back 3-4 years. Funny to even have to argue about this, but somehow I'm not surprised.
> If they price something at 500, its first competition is others priced about 500.

Your argument falls apart when discussing alternatives priced way down due to market forces. The Arc A770 is priced way lower than it should be; should I now compare it against the RTX 3050 because they are similar in price?
> AW2 works without mesh shaders too, despite the initial wrong information.

There are visual glitches and many crashes, so it doesn't really work in an optimal manner.
> Yes, it absolutely is. Again, it's one game four years after the GPUs launched. It changes nothing in the big picture when looking back at recommendations between the 2060S and 5700 XT dating back 3-4 years. Funny to even have to argue about this, but somehow I'm not surprised.

It's one game now; who knows how many games a year from now. It's also a heavy game where this difference isn't very important, but what if the next one is a lighter game where the 2060S is capable of outputting >60 fps?