My initial claim was not about broad equivalence but about performance. Using DLSS to match image quality does not make the GPU equivalent in performance. However, that's not the point of this discussion.
You claimed specifically that "consoles are more powerful than ~85% of PCs", and you specifically referenced the Steam Hardware Survey in that claim, despite clearly not checking the Steam Hardware Survey before you made it.
As already noted, both the 3060 and 2070S generally offer more performance than the PS5 in RT enabled games. So the claim that the consoles are more powerful than those GPUs is already on shaky ground, especially in a debate that is focused on RT performance.
However, even removing those two GPUs from the equation, bringing the lowest performers down to the 6700XT/RTX 2080, we are still looking at about 30% of PC GPUs being faster than the consoles (again excluding mobile parts completely). So either way, your claim was wrong.
And to add to that, the original argument I put forward was that the 2060 "should be able to offer a roughly console-level RT experience". So you're attempting to create a strawman by making this about raw raster performance rather than the end result seen on screen in RT enabled games, where DLSS absolutely plays a huge factor vs the consoles.
Well it’s about options isn’t it? When you don’t have an option and you want to play the game, you just deal with it. When players have options, they’re rejecting the bad option.
Which again, is a totally different argument to the question of whether gaming is viable at 30fps. It clearly is, since entire console generations were built around that frame rate, and even today virtually every console game ships with a viable 30fps mode which many gamers choose to use.
The question being debated here is whether the 2060 is a viable gaming card for RT use. And the answer is that in the vast majority of cases it can achieve a solid 30fps with some level of RT enabled in most RT titles.
Arguing that 30fps isn't a viable gaming frame rate when there are literally hundreds of millions of purchases of 30fps-only games out there makes no sense. And arguing that the 2060, a six-year-old lower mid-range part at release, MUST play all games at 60fps also makes no sense. If people want 60fps gaming, higher tier options are available in the PC space. If people are content with 30fps gaming and middling image quality coupled with some graphical compromises, then the 2060 has been a viable option for the last half decade plus. Obviously expecting it to run every single latest and greatest RT enabled game six years after its launch with RT enabled is unrealistic, but that is entirely different to saying it wasn't a viable GPU for RT when launched, and for many years after.
How many console games have RT modes? Of those, which console games are using 700p-800p as an input resolution in RT mode? How many of them fall into that category? What percentage of console games using RT does that represent? Of the console games using RT, how many are using FSR2 vs an alternative upscaling method? You surely can't expect to throw that statement out there with absolutely no data to back it up and expect us to just accept it as true?
Ah, the old 'I'll ask a question that is literally impossible for anyone to answer so that I look good when the person I'm arguing against isn't able to answer it'... nope, we're not doing that. Of course I don't know the above statistics, nor do I need to in order to defend the point I made, which is: if console RT modes are considered viable (which they must be, or they would not exist and would not be used by anyone), then the 2060 should also be considered a viable GPU for RT, because it will generally be able to offer a roughly equivalent experience to those consoles in those RT modes thanks to its higher RT performance and its ability to run at lower internal resolutions for an overall similar experience thanks to DLSS.
Well, that only works if viability is defined as being used by at least one person. In the context of this discussion, that's certainly not how I'd define viability.
Except it's not "just one person" defining 30fps as viable, it's hundreds of millions of gamers who were perfectly happy gaming at that framerate for the last several console generations, along with the tens of millions who continue to do so today on the current-gen consoles despite having 60fps options in many cases.
And from a resolution perspective, HWU's own results show that at 1080p DLSS Quality, most modern RT enabled games can hit a minimum of 30fps with some level of RT enabled on the 2060. Are you claiming that 1080p DLSS Quality is an unviable level of image quality for a six-year-old lower mid-range GPU?
Console gamers are enjoying it so much that numerous complaints have been made about games with poor image quality this gen? So much enjoyment that the general sentiment regarding UE5 is poor across PC and console gamers alike, if various discussion forums (from here to YouTube comments) are to be believed? Like I said above, the options posed to console gamers are to either deal with the issues or not play the game. Do not equate the lack of an option with enjoyment.
Wishing for better image quality is entirely different to the game being unviable at said image quality. Sure, console gamers don't have a choice and have to accept whatever image quality they get. But on PC, you tailor your GPU purchase to the image quality and framerates that you desire. The 2060 is a six-year-old lower mid-range part, so anyone choosing it should expect compromises in settings, resolution and frame rate. But if they can play those games at at least 30fps with passable image quality, then they are viable.
3840x800? Firstly, is that a typo or do you really game at 24:5? Secondly, finding it acceptable does not negate my earlier statement. I said DLSS flaws are far too visible when the input resolution is lower than 1920x1080. If you're happy ignoring the flaws, that is fine. It doesn't change the fact that the flaws are visible.
It's quite obviously a typo, considering the very next sentence states my input resolution is 1920x800 at DLSS Performance. And as I noted, I generally find image quality to be excellent at these settings. I totally accept that you may prefer better image quality, and that's entirely your prerogative. However, that's not the argument you're making here. You're trying to claim that this resolution is actually unviable to play games at, that it's literally so bad the game cannot be played. This is clearly absurd. This level of image quality is better than the vast majority of console games' 60fps modes, which you've previously argued are the only way games should be played on consoles.
Yes, anything is possible for sure. You can game on console with a keyboard and mouse. It doesn’t make it the predominant preferred input choice for the platform….
You're basing your argument that PC gaming is unviable at 30fps on the premise that you must use a mouse. Yet you do not need to use a mouse for modern PC gaming, which invalidates that argument. PC gaming is viable at 30fps using a control pad, which is an entirely viable method of playing all modern RT enabled PC games.
The most popular games on PC do not require high-end hardware. That is why most PC gamers have weaker systems than consoles. Sometimes I don't think people on here realise how irrelevant they are in terms of market sentiment. The discussions that exist on this forum are almost never reflective of general market sentiment. For instance, you use a 4070 Ti, so only ~3% of PC gamers have a better GPU than you. I think you should keep this in mind when making arguments.
I don't see how me gaming on a high-end GPU has anything to do with whether significant numbers of PC gamers are willing to accept a lower-than-60fps target. Do you have any kind of evidence suggesting that all (or at least the overwhelming majority of) PC gamers are gaming at at least 60fps all the time?
We certainly do have ample evidence that hundreds of millions of console gamers are content to play at 30fps where no other options exist, so is it your suggestion that PC gamers simply have higher standards as a rule? Even when gaming on much weaker hardware than consoles? Seems a bit of a stretch....