Next-gen console versus PC comparison *spawn

It's way worse than GTA V. It's exceedingly rare to have even four or five cars on screen on PS5, whereas GTA will often render dozens. Likewise pedestrians. Except in specifically-scripted areas like markets, the number of pedestrians around the city is pretty sparse.

I really liked my 35ish hours in Cyberpunk, but every patch detracted from the atmosphere of this being a living, breathing, crammed city of the future. So I've put a pause on it until either they tweak the patches so the density of cars and pedestrians is cranked up - this doesn't seem a massive undertaking; there is literally a setting for this in the PC version - or until the full monty next-gen version is released.

But you should go play GTA V again to see what a busy city looks like! :yes: Cyberpunk ain't it. :no:

GTA V puts a lot less strain on my system and looks a LOT more dated than CP2077... comparing apples and bananas gets you nowhere.
 
It does when using raytracing.
Raytracing is appearing in more and more games, and the consoles really do not have the RT performance to keep up now; in the future that gap will only expand.
Based on Alex's analysis in WD:L and Control, the PS5 looks to be performing a bit slower than a 2060 Super in ray tracing. So it should comfortably beat that 1080 Ti. And then the absolute performance delta vs. RTX 3080 should be roughly in line with what we saw last generation with the 7850 vs. R9 290X/GTX 970.

Only this time, we have a comparatively much faster CPU and an IO system that is on the cutting edge of PC performance. These were the big factors that CDPR listed as limiting what they could do with Cyberpunk 2077 on the PS4.
 
Based on Alex's analysis in WD:L and Control, the PS5 looks to be performing a bit slower than a 2060 Super in ray tracing. So it should comfortably beat that 1080 Ti. And then the absolute performance delta vs. RTX 3080 should be roughly in line with what we saw last generation with the 7850 vs. R9 290X/GTX 970.

Only this time, we have a comparatively much faster CPU and an IO system that is on the cutting edge of PC performance. These were the big factors that CDPR listed as limiting what they could do with Cyberpunk 2077 on the PS4.

I am not living 4 years in the past, and seeing how RT performance increased from Turing to Ampere, I have no doubt it will increase substantially again next gen... meaning the consoles are locked (HARD) in the past, meaning more and more loss of visual fidelity compared to the PC. Ampere is already way ahead, so I see nothing changing the points I made.

The current-gen consoles really had bad timing: raytracing is just taking off and AMD kinda dropped the ball on raytracing performance, meaning this generation is really underpowered for the future.
 
I am not living 4 years in the past, and seeing how RT performance increased from Turing to Ampere, I have no doubt it will increase substantially again next gen... meaning the consoles are locked (HARD) in the past, meaning more and more loss of visual fidelity compared to the PC. Ampere is already way ahead, so I see nothing changing the points I made.

The current-gen consoles really had bad timing: raytracing is just taking off and AMD kinda dropped the ball on raytracing performance, meaning this generation is really underpowered for the future.
The current console generation is not significantly more underpowered than the last one was, relative to the PC. Maybe the PC will shoot far ahead, but so far we have had stagnation in the mid range GPU space, and Ampere only slightly reduces the relative performance cost of ray tracing.
 
Based on Alex's analysis in WD:L and Control, the PS5 looks to be performing a bit slower than a 2060 Super in ray tracing.

Weird, I was under the impression the PS5 is more like the 2070 Super. Actually, the CPU he is using should also be way better than the 3600X-type CPU in the PS5. So I would say maybe the GPU is almost 2070Ti to 2080 level.


 
The current console generation is not significantly more underpowered than the last one was, relative to the PC. Maybe the PC will shoot far ahead, but so far we have had stagnation in the mid range GPU space, and Ampere only slightly reduces the relative performance cost of ray tracing.

Ampere is practically double Turing in ray-triangle intersection checking rate, so I disagree.
 
Weird, I was under the impression the PS5 is more like the 2070 Super. Actually, the CPU he is using should also be way better than the 3600X-type CPU in the PS5. So I would say maybe the GPU is almost 2070Ti to 2080 level.



If you subtract the "cheats" (like dynamic resolution), the "consolitis" (not being able to scale features), the AMD vs NVIDIA bias in this game, and the PC port issues... you are really cherry-picking IMHO
 
Weird, I was under the impression the PS5 is more like the 2070 Super.
I'm talking about ray tracing performance though. In that clip you link to, Alex mentions that the 2060 Super "edges out" the PS5 in WD:L.

Obviously it does much better in rasterization.

Ampere is practically double Turing in ray-triangle intersection checking rate, so I disagree.
Yes, and the 3080/3090 (GA102) also has more than double the transistors of the 2080 (TU104). The 3080/3090 enables a higher absolute level of performance, but the relative performance hit incurred by enabling ray tracing is similar.

https://www.techspot.com/article/2109-nvidia-rtx-3080-ray-tracing-dlss/
 
It's way worse than GTA V. It's exceedingly rare to have even four or five cars on screen on PS5, whereas GTA will often render dozens. Likewise pedestrians. Except in specifically-scripted areas like markets, the number of pedestrians around the city is pretty sparse.

I really liked my 35ish hours in Cyberpunk, but every patch detracted from the atmosphere of this being a living, breathing, crammed city of the future. So I've put a pause on it until either they tweak the patches so the density of cars and pedestrians is cranked up - this doesn't seem a massive undertaking; there is literally a setting for this in the PC version - or until the full monty next-gen version is released.

But you should go play GTA V again to see what a busy city looks like! :yes: Cyberpunk ain't it. :no:

My post isn't a PC vs console comparison, it's a comparison of city density in GTA V on PS4 versus city density in Cyberpunk on PS4.
 
If you subtract the "cheats" (like dynamic resolution), the "consolitis" (not being able to scale features), the AMD vs NVIDIA bias in this game, and the PC port issues... you are really cherry-picking IMHO
Actually, the PC version does do Dynamic Resolution as per Alex. It might be higher than on PS5 but it's there.


Also, the CPU differences matter.
 
I'm talking about ray tracing performance though. In that clip you link to, Alex mentions that the 2060 Super "edges out" the PS5 in WD:L.

Obviously it does much better in rasterization.


Yes, and the 3080/3090 (GA102) also has more than double the transistors of the 2080 (TU104). The 3080/3090 enables a higher absolute level of performance, but the relative performance hit incurred by enabling ray tracing is similar.

https://www.techspot.com/article/2109-nvidia-rtx-3080-ray-tracing-dlss/

Both have 68 RT cores (2080 Ti/3080), but the RT performance of Ampere is much higher... the devil is in the details.
 
Both have 68 RT cores (2080 Ti/3080), but the RT performance of Ampere is much higher... the devil is in the details.
Yes, so rather than increasing the number of cores, they scaled performance by increasing the throughput per core. Just like they kept the number of SMs the same as the 2080 Ti but doubled FP32 throughput per SM.

The end result was that absolute performance in both ray tracing and rasterisation went up, but the cost of enabling ray tracing decreased only slightly. In several of the titles in the link I gave, the cost decreased by 10% or less.
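A minimal back-of-the-envelope sketch of that point, using purely hypothetical fps numbers rather than measured benchmarks: if Ampere roughly doubles both raster and RT throughput, absolute fps goes way up while the relative hit from enabling RT barely moves.

```python
# Toy model only: illustrative fps numbers, not real benchmark data.
def rt_cost(fps_raster, fps_rt):
    """Fraction of performance lost when ray tracing is enabled."""
    return 1.0 - fps_rt / fps_raster

turing = {"raster": 60.0, "rt": 40.0}    # hypothetical Turing-class card
ampere = {"raster": 120.0, "rt": 82.0}   # ~2x the throughput in both modes

for name, fps in (("Turing", turing), ("Ampere", ampere)):
    print(f"{name}: {fps['rt']:.0f} fps with RT on, "
          f"{rt_cost(fps['raster'], fps['rt']):.0%} cost to enable it")
# Turing: 40 fps with RT on, 33% cost to enable it
# Ampere: 82 fps with RT on, 32% cost to enable it
```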
 
Based on Alex's analysis in WD:L and Control, the PS5 looks to be performing a bit slower than a 2060 Super in ray tracing. So it should comfortably beat that 1080 Ti. And then the absolute performance delta vs. RTX 3080 should be roughly in line with what we saw last generation with the 7850 vs. R9 290X/GTX 970.

Only this time, we have a comparatively much faster CPU and an IO system that is on the cutting edge of PC performance. These were the big factors that CDPR list as limiting what they could do with Cyberpunk 2077 on the PS4.

Yes, for the GPU they are in around the same ballpark as in 2013; perhaps the gap is somewhat wider now in normal rendering (and then we have the 3090/6900 XT too).
If we factor in ray tracing and DLSS, they are further behind than last generation. We can't just omit reconstruction tech and ray tracing anymore, since all hardware has it and uses it.

On the CPU side they are better off than in 2013, I think, yes. Compared to what's available it's still mid-range though (and a generation behind in CPU tech). A Zen 2 3700X-class CPU isn't really all that 'cutting edge' anymore; actually it's a bit old now, and Zen 3 has improved a lot. Not to forget the console CPUs are lowly clocked by PC standards.
The IO is high-end (especially in the PS5), but in raw speed it has already been surpassed. And though the 2013 consoles had a rather huge memory advantage with 8 GB of GDDR5, they certainly don't have the same advantage over the PC today. DRAM is still king even today...

The current console generation is not significantly more underpowered than the last one was, relative to the PC. Maybe the PC will shoot far ahead, but so far we have had stagnation in the mid range GPU space, and Ampere only slightly reduces the relative performance cost of ray tracing.

They are in around the same ballpark as they were in 2013. They traded gobs of GDDR for faster IO. The CPU side is better compared to the Jaguar in 2013, though.

So I would say maybe the GPU is almost 2070Ti to 2080 level.

Alex quite clearly noted the PS5 is ballpark RTX 2070 in normal rendering (Hitman and others). It sits where expected (alongside the RX 5700 XT). In Valhalla it might perform somewhat better, but across the board it's RTX 2070 (discounting RT and DLSS).

So without the walled garden, you had no interest in the consoles right?

I would still be interested in them; it's still technology, and they're still a bit different in some ways from a PC. But I wouldn't have had any need for a PS5 myself (I would still have one for the kids).
So far I have had zero need for it, since none of the exclusives are worth it to me. I wanted it for one sole reason: Horizon Forbidden West. That is a cross-gen game and will most likely make its way to PC at a later date, but Horizon is something I'm a fan of, and I want to explore the new world at least in the year it releases.

On the other hand, with the number of exclusives a PlayStation sees these days over a seven-year lifespan, and considering one doesn't get to like ALL of them, there aren't really many games I play on the PS4 (and PS5).

It's a dual-purpose machine: the kids like console time, and I get to play the occasional exclusive that interests me and that I can't get on PC. Since this is a PC vs console topic, I'd say that yes, the PC again is quite far ahead even in normal rendering, and when factoring in RT and DLSS the gap only widens; games available for both will have (and already do have) their best performance etc. on the PC (at a worse price-to-performance ratio).

However, I do respect the consoles and have nothing against them. The PC is superior and that will be heard, just like PS fans like to talk about their faster SSD as opposed to the XSX etc.
As long as we can do this in peace; some of your posts are a bit too 'anti-console' and that escalates quickly. I haven't been an angel either; it's just about adapting a bit to the B3D style of conversing :p
 
They are in around the same ballpark as they were in 2013. They traded gobs of GDDR for faster IO. The CPU side is better compared to the Jaguar in 2013, though.
For rasterization, the 3080 is 1.97x faster than the 5700 XT at 4K, and the 3090 is 2.16x faster, according to a recent TechPowerup review summary (https://www.techpowerup.com/review/nvidia-geforce-rtx-3060-ti-founders-edition/35.html). For ray tracing, it looks like the 3080 and 3090 are around 2.4x and 2.7x faster respectively than the 2060 Super comparison point.

Meanwhile, the TechPowerup index for the 7850 (https://www.techpowerup.com/gpu-specs/radeon-hd-7850.c1055) puts the R9 290X at 2.3x faster.

The difference is that the R9 290X launched at $549, while the 3090 launched at $1499. And within a year, the 970 came along delivering 290X class performance for only $329. So you could buy a PC GPU offering 2.3x the performance of the PS4 for $70 less than the price of the console itself.
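To make the arithmetic explicit, here's a minimal sketch using only the figures quoted above (the TechPowerup multipliers and launch prices); the PS4's $399 launch price is the one added reference point.

```python
# Numbers from the post above (TechPowerup indices, launch prices).
ps4_price = 399.0          # PS4 launch price (reference point)
gtx_970_price = 329.0      # ~R9 290X performance, a year after the consoles
r9_290x_vs_7850 = 2.3      # last gen: top dGPU vs PS4-class GPU

rtx_3090_vs_5700xt = 2.16  # this gen: 4K rasterization vs PS5-class GPU
rtx_3090_price = 1499.0

print(f"2014: {r9_290x_vs_7850:.1f}x the PS4 GPU for "
      f"${ps4_price - gtx_970_price:.0f} less than the whole console")
print(f"2020: {rtx_3090_vs_5700xt:.2f}x the PS5-class GPU for "
      f"${rtx_3090_price:.0f}, roughly three times the console's price")
```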
 
For rasterization, the 3080 is 1.97x faster than the 5700 XT at 4K, and the 3090 is 2.16x faster, according to a recent TechPowerup review summary (https://www.techpowerup.com/review/nvidia-geforce-rtx-3060-ti-founders-edition/35.html). For ray tracing, it looks like the 3080 and 3090 are around 2.4x and 2.7x faster respectively than the 2060 Super comparison point.

Meanwhile, the TechPowerup index for the 7850 (https://www.techpowerup.com/gpu-specs/radeon-hd-7850.c1055) puts the R9 290X at 2.3x faster.

The difference is that the R9 290X launched at $549, while the 3090 launched at $1499. And within a year, the 970 came along delivering 290X class performance for only $329. So you could buy a PC GPU offering 2.3x the performance of the PS4 for $70 less than the price of the console itself.

Well yes, price wasn't really the discussion, or perhaps somewhat, but I also noted in my post that the price-to-performance ratio is in the consoles' favour, as it always has been.
I think some were talking purely from a performance perspective, and there they are sitting in roughly the same ballpark as they were in 2013 (which is fine, and nothing wrong with that).
Ray tracing (and reconstruction) aren't all that performant on RDNA2, which widens the gap, especially the combination of the two, and that's before next-gen games come around (the consoles already sacrifice settings to get performant RT).

It's features like ray tracing that come on top of the already more-than-double performance of the dGPUs. This time around they are much better equipped thanks to more VRAM (16 GB or even higher), whereas last time 2 GB was the norm versus the consoles' 8 GB.

Prices have been adjusting since Ampere, and I can see that happening again due to stiffer competition and emerging GPU markets.
 