720p or 1080p?

Discussion in 'Console Industry' started by Barso, Jul 27, 2014.

  1. Barso

    Newcomer

    Joined:
    Nov 24, 2008
    Messages:
    67
    Likes Received:
    0
    I was wondering whether setting my PS4 to 720p would lessen the burden on the GPU, since it would be pushing fewer pixels, but I've learnt that the PS4 renders the game at its full resolution and then scales it to whatever res I set it to.
    Why don't devs have a 720p mode where they push the frame-rate, AA and AF higher if 1080p can't support it?
    I am now enjoying the better image quality at 1080p, as I had no idea that even with the output set to 720p the game was still being rendered at its full resolution on the PS4 and then downscaled to 720p.
    Thank you for any help and advice.
     
  2. tongue_of_colicab

    Veteran

    Joined:
    Oct 7, 2004
    Messages:
    3,600
    Likes Received:
    773
    Location:
    Japan
    Because that setting only relates to what resolution you want your PS4 to output, not what resolution it's actually rendering at. That is decided by the game engine. So if you have a Full HD TV, pick 1080p; there won't be any performance difference.
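    A quick back-of-the-envelope sketch (hypothetical numbers, just to illustrate that the output mode doesn't change the GPU's workload, only what the scaler does with the finished frame):
    Code:
    // Toy illustration: the engine shades a fixed internal resolution no matter
    // what output mode the console is set to; the scaler only resizes the result.
    #include <cstdio>

    int main() {
        const int renderW = 1920, renderH = 1080;              // decided by the game engine
        const int outputs[][2] = {{1920, 1080}, {1280, 720}};  // console output modes

        for (const auto& out : outputs) {
            // Shading cost tracks the render resolution, not the output setting.
            std::printf("output %dx%d: pixels shaded %d, pixels sent to the TV %d\n",
                        out[0], out[1], renderW * renderH, out[0] * out[1]);
        }
        return 0;
    }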
     
  3. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,577
    Likes Received:
    16,028
    Location:
    Under my bridge
    There are some games (or at least one) where you can force the output resolution and gain framerate, but very few games render at 1080p anyway. Almost all are 720p or thereabouts (or worse!) and upscaled to 1080p.
     
  4. DuckThor Evil

    DuckThor Evil Anas platyrhynchos
    Legend Veteran

    Joined:
    Jul 9, 2004
    Messages:
    5,878
    Likes Received:
    897
    Location:
    Finland
    He was actually talking about the PS4.
     
  5. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,577
    Likes Received:
    16,028
    Location:
    Under my bridge
  6. BRiT

    BRiT Verified (╯°□°)╯
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    15,905
    Likes Received:
    14,820
    Location:
    Cleveland
    That significantly impacts development effort and testing time, neither of which they have to spare.
     
  7. Barso

    Newcomer

    Joined:
    Nov 24, 2008
    Messages:
    67
    Likes Received:
    0
    Thanks for all the info.
    So if I have a 1080p set connected to my PS4, I get no benefit at all from setting the PS4 to 720p?
    Thanks, but I thought that running it lower would be the equivalent of setting my PC graphics card to a lower resolution with vsync engaged to take the strain off the GPU.
     
  8. Jedi2016

    Veteran

    Joined:
    Aug 23, 2005
    Messages:
    1,021
    Likes Received:
    0
    Consoles don't work like PCs.

    Your best bet will always be to run the game at your display's native resolution. Even for PC, resolution is usually the absolute last thing you'll want to reduce to try to improve performance.

    And yes, I'm one of those "Give me 1080p or give me death" people. I've owned a 1080p TV since before last gen, and I'm absolutely sick and f'in tired of not being able to play games at my TV's native resolution.
     
  9. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,577
    Likes Received:
    16,028
    Location:
    Under my bridge
    They kinda do now. They are PCs in design (maybe not XB1, which needs better software balancing). If middleware supports resolution scaling and different quality levels on PC, that'll be implementable on PS4 in exactly the same way: you set the resolution in software and can enable/disable features.
     
  10. joker454

    Veteran

    Joined:
    Dec 28, 2006
    Messages:
    3,819
    Likes Received:
    139
    Location:
    So. Cal.
    Yeah, that's the primary reason you don't see options in console games. It's not because they can't do it, because surely they can, but they need to keep testing as simple as possible by removing all the variables. In theory they could ship with default tested settings and have optional "at your own risk" settings available, but let's face it, people would still be pissed if a game glitched or crashed when they messed with the "at your own risk" settings, and would probably tell the world about how crap the game is on Facebook, Amazon ratings, etc., because that's what people do. So it's just not worth the risk.
     
  11. Jedi2016

    Veteran

    Joined:
    Aug 23, 2005
    Messages:
    1,021
    Likes Received:
    0
    You know what I meant. I'm not talking about hardware.
     
  12. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,577
    Likes Received:
    16,028
    Location:
    Under my bridge
    But the traditions are steeped in legacy hardware design, where you output to the only resolution available (based on the graphics mode you chose during development). We don't need to hold to them any more. If the market decides it wants options, devs can give them as readily as they can on PC. Is testing going to be more complex? Yes, but at the same time a lot of the options should be pretty trivial, and games aren't properly QA'd anywhere any more. Get something out there and patch it is the current modus operandi. So I can see games targeting a spec and then providing options to tweak, buyer beware. Joker may have a point about backlash, but then maybe not. Reaching a wider audience while allowing them to personalise the experience sounds good on paper.
     
  13. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    15,965
    Likes Received:
    3,203
    That's what's always baffled me about consoles: resolution is always the first sacrifice, not AO or shadow quality or shader quality or filtering etc.
     
  14. NRP

    NRP
    Veteran

    Joined:
    Aug 26, 2004
    Messages:
    2,712
    Likes Received:
    293
    Isn't that because most devs say:

    "better pixels" > "more pixels"?
     
  15. Jedi2016

    Veteran

    Joined:
    Aug 23, 2005
    Messages:
    1,021
    Likes Received:
    0
    Yeah, but that's not what most players say when actually given the choice (i.e. PC gamers). For most of what devs could conceivably cut down (shaders and the like), most people probably wouldn't even notice the difference.
     
  16. SlimJim

    Banned

    Joined:
    Aug 29, 2013
    Messages:
    590
    Likes Received:
    0
    No: BioShock Infinite, for example, had a "disable v-sync" option; time it took to implement this: about 500 seconds, plus 60 seconds per language version for the translation.
    Total time: under 20 minutes.

    In the same way, games could ship with a "resolution=1280*720, af=16, aa=msaa4, vsync=60" option, something like the sketch below.
    I'll ask Guerrilla to implement this so you will see it in the next Killzone.
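    A rough sketch of the idea (the struct and applyPreset() are made up for illustration, not any real console SDK API; the point is just how little data such a preset involves):
    Code:
    #include <cstdio>

    // Hypothetical per-title render preset; not a real SDK structure.
    struct RenderPreset {
        unsigned width, height;   // internal render resolution
        unsigned anisotropy;      // AF level: 1, 2, 4, 8 or 16
        unsigned msaaSamples;     // MSAA sample count, 0 = off
        bool     vsync;           // lock presentation to the display refresh
    };

    // Stand-in for whatever the engine really does with these values
    // (resize render targets, set sampler state, choose a swap interval, ...).
    void applyPreset(const RenderPreset& p) {
        std::printf("render %ux%u, AF x%u, MSAA x%u, vsync %s\n",
                    p.width, p.height, p.anisotropy, p.msaaSamples,
                    p.vsync ? "on" : "off");
    }

    int main() {
        const RenderPreset quality     {1920, 1080,  8, 0, true};
        const RenderPreset performance {1280,  720, 16, 4, true};  // the mode the OP asked for
        applyPreset(quality);
        applyPreset(performance);
        return 0;
    }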
     
  17. joker454

    Veteran

    Joined:
    Dec 28, 2006
    Messages:
    3,819
    Likes Received:
    139
    Location:
    So. Cal.
    ...and Siberia on the original Xbox had a 1080p option, which in no way means it's an easy thing to do on a wide scale, nor that developers are interested in making their already complicated lives even more complicated. I could make a long list of bugs I encountered over the years from "simple" changes like just disabling vsync, but would you even believe me? Fun stuff like a game I worked on that would randomly lock up and no one knew why, and it turned out to be because that particular build didn't have vsync enabled, which caused thread deadlocks on very rare occasions. Good times. Oh, and about that "simple" 1080p option on Siberia? It had a bug, documented later, that would cause the game to eventually crash, a bug that didn't affect the normal resolution mode.


    Let us know how that goes.
     
  18. Inuhanyou

    Veteran Regular

    Joined:
    Dec 23, 2012
    Messages:
    1,114
    Likes Received:
    286
    Location:
    New Jersey, USA
    As far as I know, only the PS3 did this, because it didn't have a good enough scaler to handle output resolutions.

    I know that if you forced your output to 720p, in certain games the framerate would run much smoother. In FFX HD, for example, there are a lot of frame drops when summoning and doing super attacks, but at 720p they are gone.
     
  19. SlimJim

    Banned

    Joined:
    Aug 29, 2013
    Messages:
    590
    Likes Received:
    0
    The Siberia thing is obviously something that came at a big performance penalty; I think most people can imagine a higher resolution requiring a larger frame buffer and thus leading to a memory shortage.

    Can disabling v-sync cause a game to crash? As an anomaly, sure.

    But if you really believe that, for example, v-sync is something that requires extensive testing, then by that notion you would also think that PC games require hundreds of thousands of playthroughs in testing to cover every setting variable.
    Unless PC games ship in a state where they crash about every minute or something.

    edit: if the animations or the game engine require a certain frame time, then disabling v-sync could of course lead to many anomalies.
     
  20. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,577
    Likes Received:
    16,028
    Location:
    Under my bridge
    PC has virtualised hardware that distances the developers, enabling a broader array of configurations to work from the same code base at a hardware-performance penalty. Up until now, consoles have allowed very direct coding, resulting in better performance extraction from finely tuned but more vulnerable code.

    Ummm... yes. Okay, broadly they work, but we see plenty of nasty issues, like the frame pacing bugs and virtual texture bugs recently discussed here.

    Joker's mention of multithreaded conflicts goes to show how vulnerable games can be to code changes. That said, I believe we're mostly past those days of low-level code (save for ND and the like writing awesome exclusives), and the hardware is being abstracted enough that things like resolution and AA/AF amount can be determined by users. e.g. these are exposed in Unity for any Unity-based game. It'd be fairly trivial to add an AF quality slider in an options menu, something like the sketch below, and that should work for any console rendition of the game. Obviously that's from an indie POV. AAA is going to be somewhat different, so Guerrilla are going to have to develop their engine to support this, should you want it. For other (the majority of?) games, any middleware like UE should be handling the rendering, and handling it in a way that allows render-setting tweaks on PC, which means the same engine should allow rendering tweaks on console unless it's a highly optimised engine.

    In summary, I think you grossly underestimate the difficulties in supporting variable render modes and the bugs that can be produced, but I also think that for the majority of games it wouldn't actually be an issue any more, and Joker's experience is outdated. I could be wrong though!
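    As a minimal sketch of what such a slider amounts to (setAnisotropy() here is a stand-in for whatever call the engine or middleware actually exposes; the real API and value ranges will differ per engine):
    Code:
    #include <cstdio>

    // Stand-in for the engine/middleware call that actually sets the sampler
    // state; the name is hypothetical.
    void setAnisotropy(int level) {
        std::printf("anisotropic filtering set to x%d\n", level);
    }

    // Map a 0..4 options-menu slider position to an AF level.
    int afLevelFromSlider(int slider) {
        static const int levels[] = {1, 2, 4, 8, 16};
        if (slider < 0) slider = 0;
        if (slider > 4) slider = 4;
        return levels[slider];
    }

    int main() {
        int sliderPosition = 3;   // e.g. read back from the options menu
        setAnisotropy(afLevelFromSlider(sliderPosition));
        return 0;
    }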
     