PC Gaming Market breakdown or something *spawn*

The Steam survey is all we really have to go on, and it indicates there are more PCs out there that are closer to the last-generation PS4 Pro than to the current-generation PS5.
Again, if you look at active PlayStation gamers, most of them are on PS4 as well, by an almost identical margin in both percentage and raw numbers.

Why not? I have stated consoles in my previous replies, you decided to just use PS5.
So why aren't you including Xbox One, PS4, Switch, Atari VCS and Evercade numbers as well? You are simply adding competing platforms' numbers to try to trivialize the install base of higher-end PCs while discarding the lower-performance consoles that are still being used. The fact remains that the RTX install base is roughly equal to the PS5's. Why did I choose PS5? It's the better-selling high-end console. If I was trying to make PC look better I would have picked the Series consoles, since they sold less.
 
The PC gaming market has the upper hand in average performance per user, if these numbers are anything to go by. In raw rasterization the PS5 is equal to a 6600 XT, or ballpark a 2070, and it falls well below that once you factor in current-gen features like RT and DLSS.

This was what DF pointed out last year and it's still true. The gap will widen as time goes by.
 
You do know consoles have been upscaling for years right?

And is it not native 1440p on PS5 vs native 1080p with DLSS on your laptop?

So it would seem it's not a better experience due to better RT acceleration (DF have shown multiple times that the PS5 is equivalent to an RTX 2060/Super in regard to RT acceleration, and your laptop variant has a decent clock-speed drop so would be even slower) and DLSS, but more about developer choices on PS5.

EDIT: TechPowerUp has the RTX 2060 Mobile (I'm assuming this is yours?) being 40% slower than the desktop variant, so it's nowhere near the PS5 in terms of RT or raster performance.

1440p to 4K on PS5 in Control looks very much like 1440p though; the upscaling is far inferior to DLSS. On my laptop, 720p actually looks very close to 1440p thanks to DLSS, and when you add the better RT acceleration it's pretty easy to see how my laptop can achieve 60 FPS with ray tracing.

The mobile 2060 is not 40% slower than a desktop 2060; it's probably around 15-20% slower, on par with a GTX 1070.
 
Basically, as you were alluding to, "PC gamers" are a very diverse lot that can't easily be generalized. The list could go on and on about how one PC gamer prefers to do things X way while another PC gamer prefers to do things Y way. :p
You missed the weirdo that is me. Every 3-4 years I will completely rebuild my main (non-laptop) PC and I'll shove as much PSU, CPU, GPU and cooling into that bastard as around £5k will cover, just so that I don't have to spend more time upgrading it. Storage and RAM are a bit easier. But frankly, once that thing is built and running, I do not want to have to crack out so much as a screwdriver for anything.

I bet there are dozens of us. DOZENS! Roll on the GeForce 4080. :yes:
 
Which is interesting, because the RTX 3000 series launched a few months before the current-generation consoles. And don't forget the Xbox!

This is true, but sales of PC GPUs of console capability or greater accelerate at a much greater rate than console sales the further we get into the generation, as the price of that performance level drops. It's at the xx60 level with Ampere, but the pricing issues have greatly dampened sales this generation. The Ada generation should see prices return to more normal levels, while the xx50 tier should be able to match or exceed console performance. That should result in a very significant shift in the average PC performance point.
 
This is true, but sales of PC GPUs of console capability or greater accelerate at a much greater rate than console sales the further we get into the generation, as the price of that performance level drops.
I would hope so. That fixed performance target of launch consoles will plummet in cost for PC cards over time.
 
1440p to 4K on PS5 in Control looks very much like 1440p though, the upscaling is far inferior to DLSS.
Actually, I'm not sure Control is even using any form of TAAU reconstruction on the consoles to begin with. 1440p in Control on my PC (without DLSS) actually looks exactly like the PS5 version, aside from some basic quality improvements due to the settings. I've never heard that Control does any form of reconstruction on consoles; it certainly doesn't look anywhere near what other titles do when using temporal upscaling from ~1400p, or 1920x2160 for checkerboarding. Perhaps the engine just isn't well suited to whatever they're using, if it isn't simply native bilinear upscaling - it's a particularly noisy/fuzzy game (even DLSS doesn't fully rectify that).
 
Actually, I'm not sure Control is even using any form of TAAU reconstruction on the consoles to begin with. 1440p in Control on my PC (without DLSS) actually looks exactly like the PS5 version, aside from some basic quality improvements due to the settings. I've never heard that Control does any form of reconstruction on consoles; it certainly doesn't look anywhere near what other titles do when using temporal upscaling from ~1400p, or 1920x2160 for checkerboarding. Perhaps the engine just isn't well suited to whatever they're using, if it isn't simply native bilinear upscaling - it's a particularly noisy/fuzzy game (even DLSS doesn't fully rectify that).

Control is using temporal reconstruction on console just like it does on PC. It'd be rather strange if Remedy chose not to use temporal reconstruction after all the work they did to lay the groundwork for it in their engine with Quantum Break.

Regards,
SB
 
Control is using temporal reconstruction on console just like it does on PC. It'd be rather strange if Remedy chose not to use temporal reconstruction after all the work they did to lay the groundwork for it in their engine with Quantum Break.

Regards,
SB

Yeah, it stands to reason, given that temporal reconstruction is how the engine creates its lighting and effects - but what I'm talking about is that I'm just not sure it's doing it in terms of the 'render resolution vs display resolution' options in the settings menu.

tF5VniU.png


There does not seem to be any reconstruction in changing the display resolution.

We know there's a cost to temporal reconstruction, but in these two examples with the same render res but 4K vs 1440p output res, there's absolutely no difference in GPU % required, and I can't tell the difference in quality either (save perhaps the UI?). The PS5 version looks exactly like this, if anything even a tad blurrier.

Control, no DLSS, Render Res: 1440p, Display Resolution: 2160p

loxyg5l.jpg


Control, no DLSS, Render Res: 1440p, Display Resolution: 1440p

4j0Xyc7.png
 
The Steam survey has come under criticism for its accuracy many times. Alex/DF shared an article a while ago showing there are actually more RTX 2060 Super-or-better GPUs out there than there are current-generation consoles combined in the same hardware class.
Don't forget either that there are about 100 million PS4s and a huge number of Xbox One S consoles out there. Console gamers are generally on older hardware, probably more so than PC gamers.



Don't think so; I'll side with Kingston's statement that PC gamers are eager to upgrade and usually don't want to be stuck at console quality in games.

Steam's survey numbers can be very misleading because it tracks a lot of data points. It bothered me just staring at it and trying to mentally parse out the numbers, so I copied and pasted them into Excel.

There are 22 SKUs related to RTX, so those 0.5% to 5% figures add up to 25%. There are more RTX 3000 series users than RTX 2000 series users. There are more 3080 and 3060 users than 2080 and 2060 users. 2070 users slightly outnumber 3070 users. Also, since there is no such thing as a 2050 or a 2090, there are roughly 30% more RTX 3000 series owners than RTX 2000 owners. Given there are 120 million users of Steam, that's roughly 17.3 million users (30 million RTX users overall) who game on an RTX 3000 GPU. That's more than enough gamers for any dev to support features only an RTX GPU can provide.
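The arithmetic above can be sanity-checked in a few lines. A sketch of the same back-of-envelope math, using the post's rounded inputs (the 25% combined share, the ~1.3 ratio of 3000- to 2000-series owners, and the 120 million user figure), not the raw survey rows:

```python
# Back-of-envelope check of the Steam survey numbers quoted above.
# Inputs are the post's rounded figures, not the actual per-SKU survey rows.
steam_users = 120_000_000   # claimed active Steam users
rtx_share = 0.25            # the ~22 RTX SKU rows summed (~25%)
ratio_30_vs_20 = 1.3        # "roughly 30% more 3000-series than 2000-series owners"

rtx_users = steam_users * rtx_share                           # total RTX users
rtx_3000 = rtx_users * ratio_30_vs_20 / (1 + ratio_30_vs_20)  # 3000-series slice
print(f"RTX users: {rtx_users/1e6:.0f}M, of which 3000-series: ~{rtx_3000/1e6:.0f}M")
```

This lands at roughly 17 million 3000-series users out of 30 million RTX users overall, the same ballpark as the 17.3 million quoted.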

Sales numbers tied to users would offer a lot of insight. But even without them, I'd bet that those RTX users are bigger spenders on software than the 1000 series owners.

Steam Hardware & Software Survey (steampowered.com), April 2022 data
upload_2022-5-17_14-43-24.png
 
Why not? I have stated consoles in my previous replies, you decided to just use PS5.
Combining consoles together is not needed; the relevant point here is that PCs with RT-capable GPUs and competent upscaling are a match for a next-gen platform. Developers can use this fact to develop next-gen games on PC without worrying that their game won't be purchased due to a lack of capable hardware.

You also deemed it fine to combine RTX2000 and 3000 series sales for your numbers.
Of course it's fine, they are the same platform. Just like combining Series S and Series X numbers, or PS4 and PS4 Pro numbers.

We know from DF videos that the low end RTX cards can't match PS5, so do we discount them from your 20% number?
I think when you look at all the variables it could be argued only 10-15% of all gaming PCs are as fast or faster than the PS5.
Definitely NOT. The Steam numbers alone put RTX GPUs at 25%, not counting the AMD RX 6000 series, not counting pirates, or people who don't use Steam and use other gaming services instead.
 
Steam's survey numbers can be very misleading because it tracks a lot of data points. It bothered me just staring at it and trying to mentally parse out numbers so I copied and pasted into excel.
The Steam numbers aren't misleading, they are precise. It tracks individual GPUs, CPUs, RAM configurations and so on.

I think that's the point.
 
I'm just going to risk wading into it. 10% of those RTX cards have 6GB or less of VRAM. From the PC gaming side, as someone who isn't entirely content with developers feeling constrained by 8GB going forward, that's a bit concerning to me in terms of implications.
 
I'm just going to risk wading into it. 10% of those RTX cards have 6GB or less of VRAM. From the PC gaming side, as someone who isn't entirely content with developers feeling constrained by 8GB going forward, that's a bit concerning to me in terms of implications.
The Series S has even less: 7.5 GB of shared memory for games results in just around 4 GB of video memory in most games if you compare texture quality. That's the baseline devs have to work with, and the reason why entry-level GPUs have 4 GB as well.

But don't fret. Software gets more efficient with things like virtual texturing and SFS; thanks to SSDs, much less data has to be kept in VRAM than before, which can be used to increase texture quality for the same amount of VRAM.

Tech demos like Valley of the Ancient and the Matrix demo have crisp 8K textures and only use around 6-7 GB of VRAM at 4K.
 
The Steam numbers aren't misleading, they are precise. It tracks individual GPUs, CPUs, RAM configurations and so on.

I think that's the point.

Steam numbers can be misleading, as the survey samples a small number of users and then projects totals based on what it samples.
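To illustrate the sampling point: Valve does not publish its methodology or sample size, so the numbers below are made-up assumptions, but this sketch shows how much pure sampling error a projected share carries under simple random sampling:

```python
import math

# Hypothetical figures - Valve does not publish the real survey sample size.
sample_size = 200_000     # assumed number of users polled in a month
observed_share = 0.25     # e.g. the combined RTX share seen in the sample

# Standard error of a proportion under simple random sampling
se = math.sqrt(observed_share * (1 - observed_share) / sample_size)
low, high = observed_share - 1.96 * se, observed_share + 1.96 * se
print(f"projected share: 25% (95% CI roughly {low:.2%} to {high:.2%})")
```

At that scale the pure sampling error is tiny (well under half a percentage point); the more serious accuracy question is selection bias in who gets prompted and who opts in, which no amount of sample size fixes.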
 
Yeah, it stands to reason, given that temporal reconstruction is how the engine creates its lighting and effects - but what I'm talking about is that I'm just not sure it's doing it in terms of the 'render resolution vs display resolution' options in the settings menu.

tF5VniU.png


There does not seem to be any reconstruction in changing the display resolution.

We know there's a cost to temporal reconstruction, but in these two examples with the same render res but 4K vs 1440p output res, there's absolutely no difference in GPU % required, and I can't tell the difference in quality either (save perhaps the UI?). The PS5 version looks exactly like this, if anything even a tad blurrier.

Control, no DLSS, Render Res: 1440p, Display Resolution: 2160p

loxyg5l.jpg


Control, no DLSS, Render Res: 1440p, Display Resolution: 1440p

4j0Xyc7.png

In the 4K vs 1440p display resolution comparison, do you have it locked to 60 fps? Because both show the same framerate. If you do have it locked, the only difference would show in GPU utilization %, which is about 6% higher for the 4K display resolution, but maybe that's jumping around too much even in like-for-like scenes.

I have no idea if that's the normal amount of overhead to expect with traditional temporal reconstruction, but it does make sense that the 4K display res would show a slightly higher utilization percentage than running at native res.
 
The mobile 2060 is not 40% slower than a desktop 2060; it's probably around 15-20% slower, on par with a GTX 1070.

The relative performance listed on TechPowerUp is 40% slower than the desktop RTX 2060.

The closest desktop GPU is the GTX 1660 at 4% faster, with the GTX 1070 listed as being 19% faster.

The clock speeds alone (base and boost) are 30%+ lower than the desktop variant's, so that alone shows it can't be 15-20% slower as you stated.

Here's a piece from the TechSpot review of the RTX 2060 Mobile:

If you're looking for a direct desktop comparison to the RTX 2060 laptop GPU, it's hard to give a specific one, but so far performance looks to be between the GTX 1660 and the GTX 1660 Ti.

This would put your 2060 Mobile some 30-40% slower than the PS5 when running games with RT, and up to 50% slower when running games without RT.
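The clock-speed argument can be roughed out from public specs. A sketch assuming paper FP32 throughput = cores × 2 FLOPs × boost clock, using the commonly listed reference boost clocks (real sustained clocks vary with each laptop's power limit and cooling):

```python
# Paper FP32 throughput: desktop RTX 2060 vs the RTX 2060 laptop variant.
# Clock figures are the commonly listed reference boosts; a 60 W laptop
# configuration will sit lower still than the 80-90 W class shown here.
def tflops(cuda_cores: int, boost_mhz: int) -> float:
    return cuda_cores * 2 * boost_mhz * 1e6 / 1e12  # 2 FLOPs/core/clock (FMA)

desktop = tflops(1920, 1680)  # desktop RTX 2060
mobile = tflops(1920, 1200)   # RTX 2060 laptop GPU (80-90 W class)
gap = 100 * (1 - mobile / desktop)
print(f"desktop ~{desktop:.1f} TFLOPS, mobile ~{mobile:.1f} TFLOPS "
      f"({gap:.0f}% lower on paper)")
```

That works out to a high-20s-percent deficit from clocks alone, before any power-throttling under sustained load, which is consistent with the point that the gap can't be only 15-20%; real-game differences depend on more than clocks, of course.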
 
The relative performance listed on TechPowerUp is 40% slower than the desktop RTX 2060.

The closest desktop GPU is the GTX 1660 at 4% faster, with the GTX 1070 listed as being 19% faster.

The clock speeds alone (base and boost) are 30%+ lower than the desktop variant's, so that alone shows it can't be 15-20% slower as you stated.

Here's a piece from the TechSpot review of the RTX 2060 Mobile:



This would put your 2060 Mobile some 30-40% slower than the PS5 when running games with RT, and up to 50% slower when running games without RT.
Then it's referencing the 60-watt 2060 in the older G14.

Most laptops use the 80-watt variant, which is around GTX 1070 performance.
 
The relative performance listed on TechPowerUp is 40% slower than the desktop RTX 2060.

The closest desktop GPU is the GTX 1660 at 4% faster, with the GTX 1070 listed as being 19% faster.

The clock speeds alone (base and boost) are 30%+ lower than the desktop variant's, so that alone shows it can't be 15-20% slower as you stated.

Here's a piece from the TechSpot review of the RTX 2060 Mobile:



This would put your 2060 Mobile some 30-40% slower than the PS5 when running games with RT, and up to 50% slower when running games without RT.

TechPowerUp's charts are not much to go by, unfortunately. By no means is an RTX 2060 Mobile in a laptop going to beat the PS5's GPU; it shouldn't either, as not even the desktop 2060 does that.
With DLSS and ray tracing though, I can see the laptop 2060 being very close to the PS5 in actual performance/what you get on screen, which is impressive considering it's a lower-end laptop GPU originating from the 2018 Turing line-up.
A 3060 Mobile/3070 Mobile, though, is a match for the PS5 in raw rasterization without DLSS. With RT and/or DLSS it's the better performer all around. That's very impressive on many fronts; a decade ago a laptop beating consoles wasn't much of a thing, much less so two decades ago.

Anyway, there are more dGPUs out there that are at least as fast as the PS5, and faster. Actually, with everything combined - AMD, laptops and probably soon enough Intel - there are more PC GPUs out there than all current-generation consoles combined in the same power class (and again, faster too). It's not reasonable either to assume that gamers with an RTX GPU are going to have it slotted into a system with just a mechanical drive, a slow CPU and a tiny amount of system RAM. No, NVMe M.2 is very common, as Kingston noted, and so are Zen CPUs and at least 8GB of system RAM, more often 16GB for gaming-oriented systems.

The serious gamers on PC most likely have a higher class of hardware than console gamers do, who are largely left with low-end 2013 hardware (100m PS4s vs 19m PS5s, not to forget Switch and Xbox One consoles).

The gap is going to widen the further you move into a generation. Since the PC stood tall already at launch, it's only going to grow wider. The supply shortage and all is just making matters worse.
 