AMD Radeon RDNA2 Navi (RX 6500, 6600, 6700, 6800, 6900 XT)

Console and mobile gaming are the common denominators of low-value gaming; you do not need a PC to game at 30 or 40 fps. We understand why Angry Birds doesn't need 60 FPS for you to enjoy it, and even certain PC genres don't need high frame rates; compare DOTA to Cyberpunk. Multiplayer games typically need more fps, but in a game like DOTA, with its isometric top-down view, aiming is not time-critical... and the same goes for single-player games like Cyberpuke, where nothing is critical, or matters. Game is saved.

But games that are all vistas and first-person view need to be as fluid and unfettered as possible, so your in-game movement should never be hindered by hardware. 60 Hz + 60 FPS is the bare minimum for PC; anything less on the Windows PC platform and you are just lying to yourself. Get a console or a retro Pi.
 
I am satisfied with the way I play on PC. So please don't project your frustration over framerates onto other people, and least of all try to tell me what I must and must not do. Thanks.
 
Which data? CP2077 is perfectly playable at 40 FPS. Those needing 60+ FPS in all games have e-penis issues; this is not an ultra-competitive FPS. Also, Ultra ray tracing is a scam: you lose basically nothing in IQ going a notch down, but you gain a lot of FPS. So saying "max settings" was a little too much, but using the "High" RT setting instead of "Ultra" loses you nothing IQ-wise.

Don't say maximum settings when it's not true... simple as that.
 
AMD's Radeon RX 6800M in the Asus ROG Strix G15 review - ComputerBase
July 12, 2021
With the mobile Radeon RX 6000M, AMD proves in the notebook as well that the RDNA 2 architecture is, ray tracing excepted, competitive with Nvidia's Ampere GPUs when both platforms meet in games at comparable power dissipation. At a sustained 145 watts, the Radeon tends to draw slightly less power for the same rasterizer performance, even if the figures published by Asus (145+ watts and a platform TDP of up to 180 watts) initially suggest the opposite. But because AMD SmartShift distributes the available budget between GPU and CPU far more flexibly than Nvidia's Dynamic Boost 2.0, which at most grants the GPU extra power on top of its fixed TGP, the Radeon's in-game consumption is often not at the value you would expect: the CPU simply gets a larger share than anticipated.
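Purely as an illustration of that difference, here is a minimal sketch of the two power-sharing schemes as the review describes them. All wattages are made-up placeholders, not Asus/AMD/Nvidia specifications:

```python
# Toy model of the two laptop power-sharing schemes described above.
# All wattages are illustrative placeholders, not vendor specs.

def smartshift(platform_budget_w, gpu_demand_w, cpu_demand_w):
    """SmartShift-style: one shared platform budget, shifted toward
    whichever chip is asking for more at the moment."""
    total = gpu_demand_w + cpu_demand_w
    scale = min(1.0, platform_budget_w / total)
    return gpu_demand_w * scale, cpu_demand_w * scale

def dynamic_boost(gpu_tgp_w, headroom_w, gpu_demand_w, cpu_demand_w):
    """Dynamic-Boost-2.0-style: the GPU gets its fixed TGP plus, at most,
    some extra headroom; the CPU budget is independent of the GPU."""
    return min(gpu_demand_w, gpu_tgp_w + headroom_w), cpu_demand_w

# CPU-heavy scene: SmartShift hands the CPU a larger slice of the pool,
# so the GPU can end up drawing less than its nominal TGP.
print(smartshift(platform_budget_w=180, gpu_demand_w=150, cpu_demand_w=60))
print(dynamic_boost(gpu_tgp_w=145, headroom_w=15,
                    gpu_demand_w=150, cpu_demand_w=60))
```

In the SmartShift case the GPU lands below its 145 W TGP because the CPU takes a share of the common pool, which is the review's point about measured consumption not matching the headline number.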
 
Yes, but think of all the e-peen centimeters you lose by not playing Cyberpunk with ULTRA RAY TRACING.

The GI is quite a bit better, but then again... since AMD cannot do it at playable framerates, nobody should, right? ;)
The argument you are making is actually an argument that you should drop the PC and go console, since IQ doesn't matter to you.

But that would be a lie, looking at your posts in the console sections.

You cannot have your cake and eat it too.

And no one should say maximum settings when it is not true...are you encouraging lies? ;)

EDIT:
I would actually say that it is trying to "e-peen" by claiming maximum settings (in order to elevate your hardware) when it is not true...thanks for that own goal ;)
 
The GI is quite a bit better, but then again... since AMD cannot do it at playable framerates, nobody should, right? ;)
The argument you are making is actually an argument that you should drop the PC and go console, since IQ doesn't matter to you.

But that would be a lie, looking at your posts in the console sections.

You cannot have your cake and eat it too.

And no one should say maximum settings when it is not true...are you encouraging lies? ;)

Again, I've clarified this, because there is nothing to hide (and the only change from max settings is RT not set to "Ultra", which changes nothing). Anyway, your point being? If we consider CP2077, practically no current card can run it at max settings at 60+ FPS at its marketed target resolution, if you want to consider it "playable" only at 60+ FPS, that is. In the link you gave, the 3090 WITH DLSS manages a poor 39.8 FPS with all settings maxed at 4K (and 1440p does not reach a stable 60+ FPS either). Imagine spending more than $2000 to play at 4K and not being able to do so. No current hardware can.

But it does not end there. In practice, the game sometimes drops below 30 FPS, quite probably due to CPU saturation. So is the game unplayable for everyone? Of course not. The game has its slowdowns, but since it never turns into a slideshow it is still OK to play: it is not a high-speed, high-precision FPS in the first place, so even at fairly low frame rates it does not create a lot of issues, at least for me. In other games you can perceive the stuttering or microstuttering clearly; in CP2077 it's much better.

It's a matter of what you find "acceptable". If your limit for "acceptable" is 60+ FPS only, then stay far away from Cyberpunk. If you want to PLAY the game rather than simply tell everyone else "I maxxed CP framerates, oh I'm God", then a 2070 is quite OK at FHD resolution. Simple as that.
 
Console and mobile gaming are the common denominators of low-value gaming; you do not need a PC to game at 30 or 40 fps. We understand why Angry Birds doesn't need 60 FPS for you to enjoy it, and even certain PC genres don't need high frame rates; compare DOTA to Cyberpunk. Multiplayer games typically need more fps, but in a game like DOTA, with its isometric top-down view, aiming is not time-critical... and the same goes for single-player games like Cyberpuke, where nothing is critical, or matters. Game is saved.

But games that are all vistas and first-person view need to be as fluid and unfettered as possible, so your in-game movement should never be hindered by hardware. 60 Hz + 60 FPS is the bare minimum for PC; anything less on the Windows PC platform and you are just lying to yourself. Get a console or a retro Pi.

I admit that I quite thoroughly enjoyed Lego Builder's Journey for the short while it lasted on my HD 7970, dipping below 20 fps at times. For that game, it mattered not to me.
OTOH, my performance in MechWarrior Online got much better in 2014, once I had upgraded my then-ancient PC to a more recent one and was able to play at three-digit framerates.

It all depends. It even depends on your skill level how much better an experience you get out of 60 or more fps.
 
Again, I've clarified this, because there is nothing to hide (and the only change from max settings is RT not set to "Ultra", which changes nothing). Anyway, your point being? If we consider CP2077, practically no current card can run it at max settings at 60+ FPS at its marketed target resolution, if you want to consider it "playable" only at 60+ FPS, that is. In the link you gave, the 3090 WITH DLSS manages a poor 39.8 FPS with all settings maxed at 4K (and 1440p does not reach a stable 60+ FPS either). Imagine spending more than $2000 to play at 4K and not being able to do so. No current hardware can.

But it does not end there. In practice, the game sometimes drops below 30 FPS, quite probably due to CPU saturation. So is the game unplayable for everyone? Of course not. The game has its slowdowns, but since it never turns into a slideshow it is still OK to play: it is not a high-speed, high-precision FPS in the first place, so even at fairly low frame rates it does not create a lot of issues, at least for me. In other games you can perceive the stuttering or microstuttering clearly; in CP2077 it's much better.

It's a matter of what you find "acceptable". If your limit for "acceptable" is 60+ FPS only, then stay far away from Cyberpunk. If you want to PLAY the game rather than simply tell everyone else "I maxxed CP framerates, oh I'm God", then a 2070 is quite OK at FHD resolution. Simple as that.

No, once again:
Do not claim maximum settings when that is a lie.
Simple as that.

You made the false claim, don't get angry at me for your dishonesty /shrugs
 
No, once again:
Do not claim maximum settings when that is a lie.
Simple as that.

You made the false claim, don't get angry at me for your dishonesty /shrugs

Again,
1) I clarified, something you did not acknowledge. The only difference from max settings was not using "Ultra" RT.
2) I never claimed 60+ FPS. I only said it was playable.

And you are calling someone else "dishonest", lol. Look at yourself before calling someone else "angry" or "dishonest" because it suits your narrative.
 
Again,
1) I clarified, something you did not acknowledge. The only difference from max settings was not using "Ultra" RT.
2) I never claimed 60+ FPS. I only said it was playable.

And you are calling someone else "dishonest", lol. Look at yourself before calling someone else "angry" or "dishonest" because it suits your narrative.

All I objected to was your claim about maximum settings... which was false, as I suspected.

All the rest is just you.
 
Which data? CP2077 is perfectly playable at 40 FPS. Those needing 60+ FPS in all games have e-penis issues; this is not an ultra-competitive FPS. Also, Ultra ray tracing is a scam: you lose basically nothing in IQ going a notch down, but you gain a lot of FPS. So saying "max settings" was a little too much, but using the "High" RT setting instead of "Ultra" loses you nothing IQ-wise.


When you're used to 60 fps or more, you really see the difference with 40 fps, even with a FreeSync/G-Sync monitor...
 
When you're used to 60 fps or more, you really see the difference with 40 fps, even with a FreeSync/G-Sync monitor...

It depends on the game, mostly. Many games can help "hide" the lower frame rate with an appropriate use of motion blur, for example. And yes, my laptop has a G-Sync screen, so that alleviates the issue a lot. Then again, seeing the difference and judging the game unplayable below 60 FPS (which happens a lot in CP2077 whatever system you have) is more a matter of e-peen than objectivity in a non-competitive environment.
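To put rough numbers on why the difference is visible even with variable refresh, here is the plain frame-time arithmetic (nothing game-specific, just t = 1000/fps):

```python
# Frame time in milliseconds for a given frame rate: t = 1000 / fps.
for fps in (60, 40, 30):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")

# Dropping from 60 to 40 fps stretches every frame from ~16.7 ms to 25 ms,
# i.e. ~50% longer. VRR (FreeSync/G-Sync) removes tearing and judder from
# the refresh-rate mismatch, but cannot give back that temporal resolution.
print(f"extra cost per frame: {1000 / 40 - 1000 / 60:.1f} ms")
```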
 
Errrmmmhhh?
[Attached benchmark graph: Cyberpunk 2077 at 1080p, "Ultra + RT Ultra + DLSS Quality"]


Max settings at 1080p on CP2077 with NO DLSS at maximum settings (these numbers are Ultra, not Psycho, WITH DLSS)... I call not true...
Cyberpunk 2077 DLSS + Ray Tracing Benchmark | TechSpot

Please provide data to back up your claim.

Ummm, did you bother to even look at the graph you posted? It says right in the graph "ULTRA + RT Ultra + DLSS Quality". I'm pretty sure that's not NO DLSS. :p At least try to link to something that shows what you're saying.

Regards,
SB
 
Ummm, did you bother to even look at the graph you posted? It says right in the graph "ULTRA + RT Ultra + DLSS Quality". I'm pretty sure that's not NO DLSS. :p At least try to link to something that shows what you're saying.

Regards,
SB

TRY READING:
"Max setting at 1080p on CP2077 with NO DLSS at maximum settings (these numbers are Ultra, not Psycho WITH DLSS)...I call not true..."

This shows me unplayable settings at lower IQ even with DLSS...
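As a footnote on what the "DLSS Quality" label in that graph implies: DLSS renders internally at a reduced resolution and upscales to the output resolution, which is why "with DLSS" and "maximum settings" are not the same claim. A minimal sketch, assuming the commonly cited DLSS 2.x per-axis scale factors (an assumption for illustration, not an official figure):

```python
# Internal render resolution per DLSS 2.x quality mode. The per-axis
# scale factors below are the commonly cited ones and are assumed here
# for illustration; they are not taken from an official spec sheet.
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_resolution(width, height, mode):
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

# "Ultra + RT Ultra + DLSS Quality" at 1080p rasterizes at roughly 720p
# before the upscale:
print(internal_resolution(1920, 1080, "Quality"))  # -> (1280, 720)
```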
 