Console optimisation: a myth that's gone? RE3R runs at 140+ fps avg. on the GTX 1650 & 40-50 fps on a 10-year-old GPU

Discussion in 'PC Gaming' started by Cyan, Mar 25, 2020.

  1. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    8,890
    Likes Received:
    2,520
There is talk about optimization on consoles, but today console and PC hardware are very similar, so you no longer have to squeeze Xenos or the Cell to the maximum, since modern graphics APIs access the hardware close to the metal.

With the Windows 10 Game Mode allocating resources to games, and with NVIDIA and AMD releasing drivers optimized for each game as new versions come out, this is the result.

The GTX 480, a 10-year-old 1.5GB GPU that was one of the first to support DirectX 11, runs the game at 40-50 fps, and the 4GB GTX 1650 runs it at an average above 140 fps, peaking at 170+ fps.

When you see this, and then see that the Xbox One X can't reach 60 fps and runs at 40-50 fps, you wonder whether this console optimization thing is forever gone, or just a myth.

     
  2. Nesh

    Nesh Double Agent
    Legend

    Joined:
    Oct 2, 2005
    Messages:
    11,920
    Likes Received:
    2,277
I don't have the time to check, but what are the resolution and detail settings on the PC with the old GPU?
     
  3. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    21,171
    Likes Received:
    6,525
    Location:
    ಠ_ಠ
    1280x800 w/ Interlaced mode (probably reconstruction)

    Low Texture quality, Minimum shadows, Motion blur and DOF are off. Rest of settings are on.

    Cyan's being very disingenuous. >_>
     
  4. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    8,890
    Likes Received:
    2,520
Since it is one of the first GPUs with DirectX 11 support, there are some problems with it, but in DX11 mode the game does run on it, only at 1280x800, no other resolutions. The settings are:

    DirectX11
1280x800 (DirectX 11 won't run the game at any other resolution on that card; an error message is given)
    Rendering Mode: Interlaced
    Image quality: 100%
    Refresh 60Hz
    Framerate: Variable
    Vsync: Off
    FXAA
    AFx16
    Mesh: High
    Shadow Quality: Min
    Shadow Cache: On
    SSR: On
Subsurface Scattering: On
    Volumetric Lighting: High (lol)
    Particle Lighting Quality: High
    HBAO+
    Bloom: On
    Lens flare: On
    Motion Blur: Off
    Depth of Field: Off
    Lens Distortion: On
    FidelityFX CAS + Upscaling: On
     
    Nesh likes this.
  5. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    8,890
    Likes Received:
    2,520
No, I am not. You didn't mention that he sets Volumetric Lighting, which costs a lot at High, to High, and that he uses HBAO+, which is less efficient than SSAO.

Look at this video from @Dictator where he explains every setting, if you don't trust the facts.

     
  6. homerdog

    homerdog donator of the year
    Legend Veteran Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,216
    Likes Received:
    1,001
    Location:
    still camping with a mauler
The XboneX has a pretty good GPU. If the consoles struggle with something like this, I reckon it's the pathetic CPUs holding them back. Their single-thread performance is absolutely horrible; worse than a modern smartphone, I think.
     
    Pete and orangpelupa like this.
  7. orangpelupa

    orangpelupa Elite Bug Hunter
    Legend Veteran

    Joined:
    Oct 14, 2008
    Messages:
    7,870
    Likes Received:
    1,585
Haven't watched the video, but if the console struggles to get 60 fps, maybe it's a CPU issue?

Edit: gah, hadn't read homerdog's post.
     
  8. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    21,171
    Likes Received:
    6,525
    Location:
    ಠ_ಠ
    DF had a look at the demo as well:


Frame rate is really awful on the One X. It is running at a significantly higher resolution (4K vs 1620p, ~1.78x the pixels, both with reconstruction), but it can sometimes be worse than the shader-power difference alone would suggest; possibly it's the ROP advantage the 4Pro has during those explosions (a common theme in other games too).

    The devs aren't slouches, so hopefully they're still tweaking things to hit a more stable 60 considering it's an old demo.

    ----

If the 4Pro is at 60fps, then 1.78x pixel scaling should drop the frame rate to ~34, while +50% shading power should bump it back to ~50fps, and there are a number of times when the One X is lower than that.

    Perhaps they should only have bumped the resolution to 1800p.
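The scaling estimate above can be sketched as a quick back-of-the-envelope calculation. This is a rough sketch assuming purely pixel-bound, linear scaling; the +50% shading-power figure is taken from the post:

```python
# Back-of-the-envelope scaling: PS4 Pro at 1620p hitting 60fps; One X renders
# 4K with roughly +50% shading power (figure taken from the post above).
pro_pixels = 2880 * 1620                    # 1620p
onex_pixels = 3840 * 2160                   # 2160p ("4K")
pixel_ratio = onex_pixels / pro_pixels      # ~1.78x the pixels
fps_pixel_scaled = 60 / pixel_ratio         # ~33.8fps if purely pixel-bound
fps_with_shading = fps_pixel_scaled * 1.5   # ~50.6fps with +50% shading power
print(round(pixel_ratio, 2), round(fps_with_shading, 1))
```

So anything meaningfully below ~50fps on the One X points at a bottleneck other than raw shading power.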
     
    orangpelupa likes this.
  9. techuse

    Newcomer

    Joined:
    Feb 19, 2013
    Messages:
    197
    Likes Received:
    108
What is this thread? Interlaced mode basically cuts the total number of pixels rendered in half, if I'm remembering correctly. He also has FidelityFX Upscaling enabled; I'm not sure how much further that drops resolution. When you factor that in, the results suddenly aren't impressive.
     
    London-boy likes this.
  10. London-boy

    London-boy Shifty's daddy
    Legend Subscriber

    Joined:
    Apr 13, 2002
    Messages:
    21,750
    Likes Received:
    5,709
    I’m really not understanding your point here, @Cyan.

    Lowering settings yields higher frame rate on PC. How is that new? And how is that proof that console optimisation is dead?
     
  11. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    8,890
    Likes Received:
    2,520
What about the 1650 then? As for FidelityFX Upscaling, this is a new setting that is enabled by default in the game settings, and I am sure the console versions use it. I have it enabled because it was there by default, even though I run the game on a GTX 1080, which is enough to run this game well.

Interlaced mode is present on consoles, at least in RE2 Remake, with different settings, but they use some kind of temporal reconstruction.
There are no concessions with the GTX 1650, which is a low-to-mid-range card in the PC world. 140fps on average at 1080p means the game can run on that card at 1440p at a perfect 60 fps. Far better than any console, which all use upscaling and so on. The Xbox One X doesn't run at native 4K; it's upscaled.

I guess it might be dead, not that it is dead. Maybe it's just me, but I am not seeing the good ol' interviews with developers where they say they are using certain instructions or other guru stuff like in the Cell/Xenos era. Maybe I've missed something, but I am not seeing them.
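The 1440p claim can be sanity-checked with naive pixel scaling. This is a rough sketch only; real games are rarely perfectly pixel-bound, so treat the result as an upper-bound estimate:

```python
# Naive check of the 1440p claim: scale the 1080p average frame rate down
# by the increase in pixel count (assumes purely GPU/pixel-bound scaling).
p1080 = 1920 * 1080
p1440 = 2560 * 1440
est_fps_1440p = 140 / (p1440 / p1080)   # ~79fps, comfortably above 60
print(round(est_fps_1440p, 1))
```

Under that assumption, ~79fps at 1440p leaves plenty of headroom over a 60fps target.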
     
  12. Pete

    Pete Moderate Nuisance
    Moderator Legend Veteran

    Joined:
    Feb 7, 2002
    Messages:
    5,032
    Likes Received:
    467
RG used the 1650 Super, a ~4.5TF GPU, not the ~3TF vanilla 1650. They're all paired with a Ryzen 5 1600, a 3.2GHz base clock, 6-core/12-thread Zen CPU that's (conservatively?) ~2x faster than the ~2GHz, ~7-core Jaguar CPUs in the "pro" consoles.

The 4.2TF PS4 Pro runs the RE3 demo at 1620p at ~60fps (capped). 1080p has ~44% of the pixels of 1620p, and 60fps is 43% of 140fps, which isn't terrible considering the CPU and vsync handicaps.

The GTX 480 is 1.35TF, so 50fps at 800p seems in line with the ~1.3TF X1 and its anemic bandwidth.

Horizon: Zero Dawn's PC release should make for more interesting comparisons, especially vs a PS5/XSX and their actually contemporary CPUs.
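The percentages above can be reproduced with a quick per-pixel throughput comparison. This is a sketch using the figures quoted in the post (1650 Super averaging 140fps at 1080p, PS4 Pro capped at 60fps at 1620p); pixel counts are computed from the resolutions:

```python
# Per-pixel throughput comparison, using the figures quoted above:
# GTX 1650 Super at 1080p averaging 140fps vs PS4 Pro at 1620p capped at 60fps.
p1080 = 1920 * 1080
p1620 = 2880 * 1620
pixel_share = p1080 / p1620        # 1080p renders ~44% of 1620p's pixels
fps_share = 60 / 140               # the Pro's 60fps is ~43% of the PC's 140fps
# Pixels pushed per second, Pro relative to the PC:
throughput_ratio = (p1620 * 60) / (p1080 * 140)
print(round(pixel_share, 2), round(fps_share, 2), round(throughput_ratio, 2))
```

A throughput ratio near 1.0 means the capped Pro is pushing almost as many pixels per second as the uncapped PC setup, despite the PC's higher frame rate.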
     
    Cyan, techuse, BRiT and 2 others like this.
  13. homerdog

    homerdog donator of the year
    Legend Veteran Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,216
    Likes Received:
    1,001
    Location:
    still camping with a mauler
    With respect to the OP, the console optimization advantage was always somewhat of a myth. Since PC hardware keeps advancing, at some point console "optimization" often ends up with devs trying to figure out how to make something work on console that can easily be done on modern PC hardware. For the majority of its life the G70 based PS3 was competing with DX10/11 class PC hardware which was far superior.
     
    London-boy and Cyan like this.
  14. techuse

    Newcomer

    Joined:
    Feb 19, 2013
    Messages:
    197
    Likes Received:
    108
It would be interesting to compare HZD on an HD 7850 and GTX 660/650 Ti Boost, as those were the closest GPUs to the PS4 in terms of rendering power. I suspect that would paint a very different picture, particularly on the Nvidia side. Death Stranding would also be a great candidate. FWIW, I saw a video some time ago, possibly by NXGamer, and a 750 Ti doesn't even keep up with an Xbox One anymore in the more optimized titles.
     
  15. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    8,890
    Likes Received:
    2,520
This! Nowadays there are games on PC that encourage playing at 120fps, like Doom. And many people think that you sacrifice graphics to play at 120+fps, which isn't true; I've played RE2R for dozens of hours with better graphics than the console versions, at high fps.

For instance, this guy uses 120fps to his advantage and beats the RE2 Remake world record, playing totally fair.

     
  16. Pete

    Pete Moderate Nuisance
    Moderator Legend Veteran

    Joined:
    Feb 7, 2002
    Messages:
    5,032
    Likes Received:
    467
That would surprise me. I've been using an Alienware Alpha (i3 2C4T Haswell and ~750 Ti) as a base PS4 equivalent (1080p30, thanks to adaptive half-rate vsync) to mostly good effect. Funny that it can manage The Outer Worlds and even A Plague Tale: Innocence but struggles with Operencia (probably because it runs windowed, not exclusive fullscreen).
     
  17. homerdog

    homerdog donator of the year
    Legend Veteran Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,216
    Likes Received:
    1,001
    Location:
    still camping with a mauler
Kind of a strawman comparing the PS4 to those old cards, since the vast majority of PS4s were purchased well after the console launched, when those cards were already obsolete. Still, I think the HD 7850 will provide a similar experience to a base PS4 in most games even today.
     
  18. techuse

    Newcomer

    Joined:
    Feb 19, 2013
    Messages:
    197
    Likes Received:
    108
I don't think it's a strawman at all. It's the fairest comparison there is: equivalent tech from the same time period. Why does it matter when someone purchased the PlayStation? They are getting the same hardware someone who purchased it at launch got.
     
    #18 techuse, Mar 27, 2020
    Last edited: Mar 27, 2020
  19. homerdog

    homerdog donator of the year
    Legend Veteran Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,216
    Likes Received:
    1,001
    Location:
    still camping with a mauler
    Fair enough, but devs and IHVs barely bother to support such old hardware (that wasn't even high end to begin with) these days, and the devs that do tend to not release their games on console (Blizzard, Riot etc.). But if you can find modern games that do support those cards by all means give it a shot. I think you'll find that they still perform about on par with the PS4, especially the HD7850.
     
    Pixel likes this.
  20. techuse

    Newcomer

    Joined:
    Feb 19, 2013
    Messages:
    197
    Likes Received:
    108
Isn't that the entire point, though? The thread title questions whether or not console optimization is a myth. Sure, if you upgrade to Nvidia's new architecture every single generation, console optimization will seem like a myth to you.
     