Console optimisation, a bygone myth? RE3R runs at 140+ fps avg. on a GTX 1650 and at 40-50 fps on a 10-year-old GPU

Cyan

There is a lot of talk about optimization on consoles, but today console and PC hardware is very similar, so you no longer have to exploit Xenos or the Cell to the maximum, since the graphics APIs now access the hardware close to the metal.

With Windows 10's Game Mode allocating resources to games, and nVidia and AMD releasing drivers optimized for each game as new titles come out, this is the result.

The GTX 480, a 10-year-old GPU with 1.5GB of VRAM and one of the first GPUs to support DirectX 11, runs the game at 40-50 fps, and the 4GB GTX 1650 runs it at an average above 140 fps, reaching 170+ fps.

When you see this, and then you see that the Xbox One X can't achieve 60 fps and runs at 40-50 fps, you wonder whether this console optimization thing is gone for good and was a myth all along.

 
I don't have the time to check, but what are the resolution and detail settings on the PC with the old GPU?
 
Since it is one of the first GPUs with DirectX 11 support, there are some problems with it, but in DX11 mode the game does run on it, only at 1280x800, no other resolutions. The settings are:

DirectX 11
1280x800 (DirectX 11 won't run the game at any other resolution on that card; it gives an error message)
Rendering Mode: Interlaced
Image quality: 100%
Refresh 60Hz
Framerate: Variable
Vsync: Off
FXAA
AFx16
Mesh: High
Shadow Quality: Min
Shadow Cache: On
SSR: On
SSSA: On
Volumetric Lighting: High (lol)
Particle Lighting Quality: High
HBAO+
Bloom: On
Lens flare: On
Motion Blur: Off
Depth of Field: Off
Lens Distortion: On
FidelityFX CAS + Upscaling: On
 
1280x800 w/ Interlaced mode (probably reconstruction)

Texture quality is low, shadows are at minimum, and motion blur and DOF are off. The rest of the settings are on.

Cyan's being very disingenuous. >_>
No, I am not. You didn't mention that he sets Volumetric Lighting to High, which costs a lot, and that he uses HBAO+ even though SSAO is more efficient.

Look at this video from @Dictator, where he explains every setting, if you don't trust the facts.

 
The XboneX has a pretty good GPU. If the consoles struggle with something like this, I reckon it's the pathetic CPUs holding them back. Their single-thread performance is absolutely horrible, worse than a modern smartphone's, I think.
 
Haven't watched the video, but if the consoles struggle to get 60 fps, maybe it's a CPU issue?

Edit: gah, hadn't read homerdog's post.
 
DF had a look at the demo as well:

Frame rate is really awful on the One X, but while it's running at a significantly higher resolution - 4K vs 1620p, ~1.78x the pixels, both with reconstruction - it can sometimes be worse than the shader-power difference alone would suggest; possibly it's the ROP advantage the 4Pro has during those explosions (a common theme in other games too).

The devs aren't slouches, so hopefully they're still tweaking things to hit a more stable 60 considering it's an old demo.

----

If the 4Pro is at 60 fps, then 1.78x pixel scaling alone should drop the frame rate to ~34, while +50% shading power should bump it back up to ~50 fps, and there are a number of times that the One X is lower than that.

Perhaps they should only have bumped the resolution to 1800p.
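To spell out that scaling estimate, here is a minimal back-of-the-envelope sketch. It assumes frame rate scales inversely with pixels rendered and linearly with shader throughput, and it ignores the CPU, bandwidth and ROP factors mentioned above:

```python
# Rough scaling check for the 4Pro -> One X comparison above.
# Assumes fps is inversely proportional to pixels rendered and directly
# proportional to shader throughput; ignores CPU, bandwidth and ROP limits.

pro_fps = 60.0        # PS4 Pro, capped at 60 fps
pixel_ratio = 1.78    # 4K vs 1620p, ~1.78x the pixels
shader_ratio = 1.5    # "+50% shading power" on the One X

resolution_penalty = pro_fps / pixel_ratio             # ≈ 33.7 fps
with_shader_boost = resolution_penalty * shader_ratio  # ≈ 50.6 fps

print(f"resolution scaling alone: {resolution_penalty:.1f} fps")
print(f"with +50% shader power:   {with_shader_boost:.1f} fps")
```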
 
What is this thread? Interlaced mode basically cuts the total number of pixels rendered in half, if I'm remembering correctly. He also has FidelityFX Upscaling enabled; I'm not sure how much further that drops the resolution. When you factor that in, the results suddenly aren't impressive.
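For rough context, here is a quick pixel-count sketch. It assumes interlaced mode really does halve the pixels shaded per frame, and it leaves out the FidelityFX upscaling factor since that isn't specified in the thread:

```python
# Pixels shaded per frame on the GTX 480 run (1280x800, interlaced) versus a
# plain 1080p frame. Assumes interlacing halves the per-frame pixel work; the
# additional FidelityFX upscaling factor is unknown and omitted.

pix_1080p = 1920 * 1080             # ≈ 2.07 Mpix
pix_800p = 1280 * 800               # ≈ 1.02 Mpix
pix_800p_interlaced = pix_800p / 2  # ≈ 0.51 Mpix actually shaded per frame

print(f"1080p frame:      {pix_1080p / 1e6:.2f} Mpix")
print(f"800p frame:       {pix_800p / 1e6:.2f} Mpix")
print(f"800p, interlaced: {pix_800p_interlaced / 1e6:.2f} Mpix")
print(f"share of a 1080p frame: {pix_800p_interlaced / pix_1080p:.0%}")
```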
 
I’m really not understanding your point here, @Cyan.

Lowering settings yields higher frame rate on PC. How is that new? And how is that proof that console optimisation is dead?
 
What is this thread? Interlaced mode basically cuts the total number of pixels rendered in half, if I'm remembering correctly. He also has FidelityFX Upscaling enabled; I'm not sure how much further that drops the resolution. When you factor that in, the results suddenly aren't impressive.
What about the 1650 then? As for FidelityFX Upscaling, it's a new setting that is enabled by default in the game, and I am sure the console versions use it. I have it enabled because it was there by default, despite the fact that I run the game on a GTX 1080, which is enough to run this game well.

Interlaced mode is present on consoles too, at least in RE2 Remake, with different settings, but they use some kind of temporal reconstruction.
I’m really not understanding your point here, @Cyan.

Lowering settings yields higher frame rate on PC. How is that new? And how is that proof that console optimisation is dead?
There are no concessions with the GTX 1650, which is a medium-to-low-end card in the PC world. 140 fps on average at 1080p means the game could run on that card at 1440p at a locked 60 fps, far better than any console, since consoles rely on upscaling and so on. The Xbox One X doesn't run at native 4K but upscaled.

I guess it might be dead, not that it is dead. Maybe it's just me, but I am not seeing the good ol' interviews with developers where they say they are using certain instructions or other guru stuff like in the Cell/Xenos era. Maybe I've missed something, but I am not seeing them.
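A quick sanity check of that 1440p extrapolation, as a sketch that assumes the game is purely GPU-limited and that frame rate scales inversely with pixel count (real games only roughly follow this):

```python
# Naive 1080p -> 1440p extrapolation, assuming a purely GPU-limited game whose
# frame rate scales inversely with the number of pixels rendered.

fps_1080p = 140.0
pix_1080p = 1920 * 1080
pix_1440p = 2560 * 1440   # ~1.78x the pixels of 1080p

fps_1440p_estimate = fps_1080p * pix_1080p / pix_1440p
print(f"estimated 1440p average: {fps_1440p_estimate:.0f} fps")  # ≈ 79 fps, above 60
```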
 
RG used the 1650 Super, a ~4.5TF GPU, not the ~3TF 1650 (vanilla). They’re all paired with a Ryzen 5 1600, a 3.2GHz base clock 6 core 12 thread Zen CPU that’s (conservatively?) ~2x faster than the ~2GHz ~7 core Jaguar CPUs in the “pro” consoles.

The 4.2TF PS4Pro runs the RE3 demo at 1620p at ~60fps (capped). 1080p has ~56% fewer pixels than 1620p, and 60fps is 43% of 140fps, which isn't terrible considering the CPU and vsync handicaps.
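As a crude way to put the two results on a common footing, here is a sketch that normalises pixels per second by TFLOPs. It ignores reconstruction, the console's 60 fps cap and the CPU difference, so treat it as a rough sanity check only:

```python
# Crude throughput comparison: megapixels per second per TFLOP.
# Ignores checkerboard/interlaced reconstruction, the console's 60 fps cap
# and CPU differences, so this is only a ballpark figure.

def mpix_per_sec_per_tf(width, height, fps, tflops):
    return width * height * fps / tflops / 1e6

pc  = mpix_per_sec_per_tf(1920, 1080, 140, 4.5)  # GTX 1650 Super result
pro = mpix_per_sec_per_tf(2880, 1620, 60, 4.2)   # PS4 Pro result (capped)

print(f"GTX 1650 Super: {pc:.0f} Mpix/s per TF")  # ≈ 65
print(f"PS4 Pro:        {pro:.0f} Mpix/s per TF") # ≈ 67
```

Both land in the same ballpark, which is roughly the point being made above.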

The GTX 480 is 1.35TF, so 50fps at 800p seems in line with the 1.4TF X1 and its anemic bandwidth.

Horizon: Zero Dawn's PC release should make for more interesting comparisons, especially vs a PS5/XSX and their actually contemporary CPUs.
 
With respect to the OP, the console optimization advantage was always somewhat of a myth. Since PC hardware keeps advancing, at some point console "optimization" often ends up with devs trying to figure out how to make something work on console that can easily be done on modern PC hardware. For the majority of its life the G70 based PS3 was competing with DX10/11 class PC hardware which was far superior.
 
RG used the 1650 Super, a ~4.5TF GPU, not the ~3TF 1650 (vanilla). They’re all paired with a Ryzen 5 1600, a 3.2GHz base clock 6 core 12 thread Zen CPU that’s (conservatively?) ~2x faster than the ~2GHz ~7 core Jaguar CPUs in the “pro” consoles.

The 4.2TF PS4Pro runs the RE3 demo at 1620p at ~60fps (capped). 1080p has ~56% fewer pixels than 1620p, and 60fps is 43% of 140fps, which isn't terrible considering the CPU and vsync handicaps.

The GTX 480 is 1.35TF, so 50fps at 800p seems in line with the 1.4TF X1 and its anemic bandwidth.

Horizon: Zero Dawn's PC release should make for more interesting comparisons, especially vs a PS5/XSX and their actually contemporary CPUs.

It would be interesting to compare HZD on an HD 7850 and a GTX 660/650 Ti Boost, as those were the closest GPUs to the PS4 in terms of rendering power. I suspect that would paint a very different picture, particularly on the Nvidia side. Death Stranding would also be a great candidate. FWIW, I saw a video some time ago, possibly by NXGamer, where a 750 Ti doesn't even keep up with an Xbox One anymore in the more optimized titles.
 
With respect to the OP, the console optimization advantage was always somewhat of a myth. Since PC hardware keeps advancing, at some point console "optimization" often ends up with devs trying to figure out how to make something work on console that can easily be done on modern PC hardware. For the majority of its life the G70 based PS3 was competing with DX10/11 class PC hardware which was far superior.
This! Nowadays there are games on PC that reward 120 fps, like Doom, and many people think you have to sacrifice graphics to play at 120+ fps, which isn't true: I've played RE2R for dozens of hours with better graphics than the console versions, at high frame rates.

For instance, this guy uses 120 fps to his favour and beats the RE2 Remake world record, playing totally fair.

 
It would be interesting to compare HZD on an HD 7850 and a GTX 660/650 Ti Boost, as those were the closest GPUs to the PS4 in terms of rendering power. I suspect that would paint a very different picture, particularly on the Nvidia side. Death Stranding would also be a great candidate. FWIW, I saw a video some time ago, possibly by NXGamer, where a 750 Ti doesn't even keep up with an Xbox One anymore in the more optimized titles.
That would surprise me. I've been using an Alienware Alpha (a 2C/4T Haswell i3 and a ~750 Ti) as a base-PS4 equivalent (1080p30, thanks to adaptive half-rate vsync) to mostly good effect. Funny that it can manage Outer Worlds and even Plague Tale Innocence but struggles with Operencia (probably because it's running windowed rather than exclusive fullscreen).
 
It would be interesting to compare HZD on an HD 7850 and a GTX 660/650 Ti Boost, as those were the closest GPUs to the PS4 in terms of rendering power. I suspect that would paint a very different picture, particularly on the Nvidia side. Death Stranding would also be a great candidate. FWIW, I saw a video some time ago, possibly by NXGamer, where a 750 Ti doesn't even keep up with an Xbox One anymore in the more optimized titles.
Kind of a strawman comparing the PS4 to those old cards, since the vast majority of PS4s were purchased well after the console launched, when those cards were already obsolete. Still, I think the HD 7850 will provide a similar experience to a base PS4 in most games even today.
 
I don't think it's a strawman at all. It's the fairest comparison there is: equivalent tech from the same time period. Why does it matter when someone purchased the PlayStation? They're getting the same hardware someone who purchased it at launch got.
 
Fair enough, but devs and IHVs barely bother to support such old hardware (which wasn't even high end to begin with) these days, and the devs that do tend not to release their games on console (Blizzard, Riot, etc.). But if you can find modern games that do support those cards, by all means give it a shot. I think you'll find that they still perform about on par with the PS4, especially the HD 7850.
 
Isn't that the entire point, though? The thread title questions whether or not console optimization is a myth. Sure, if you upgrade to Nvidia's new architecture every single generation, console optimization will seem like a myth to you.
 