720p or 1080p?

Barso

Newcomer
I am asking if setting my PS4 to 720p would lessen the burden on the GPU, as it would be pushing fewer pixels, but I learnt that the PS4 renders the game at its full resolution and then scales it to whatever res I set it to.
I am now enjoying the better image quality at 1080p, as I had no idea that even on 720p the full 1080p image was still being rendered on the PS4 and then downscaled to 720p.
Why don't devs have a 720p mode where they push the frame-rate, AA and AF higher if 1080p can't support it?
Thank you for any help and advice.
 
Because that setting only controls what resolution you want your PS4 to output, not what resolution it's actually rendering at. That is decided by the game engine. So if you have a Full HD TV, pick 1080p; there won't be any performance difference.
 
There are some games (or at least one) where you can force the resolution and gain some framerate, but very few games render at 1080p anyway. Almost all are 720p or thereabouts (or worse!) and upscaled to 1080p.
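Conceptually, the split between render resolution and output resolution looks something like this (a minimal sketch with invented names, not any console's actual API):

```cpp
#include <cstdio>

// Hypothetical values and names ("RenderTarget" is not a real console API);
// this just illustrates that the system-level output setting and the
// engine's internal render resolution are two different things.
struct RenderTarget { int width, height; };

int main() {
    RenderTarget game_buffer{1600, 900};   // what the engine actually renders
    RenderTarget display_out{1920, 1080};  // what the console sends to the TV

    // The engine draws every frame into game_buffer; the hardware scaler then
    // resamples that image to the output mode. Changing the output mode in the
    // system menu changes display_out only, so the GPU load stays the same.
    std::printf("render %dx%d -> output %dx%d\n",
                game_buffer.width, game_buffer.height,
                display_out.width, display_out.height);
    return 0;
}
```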
 
Why don't devs have a 720p mode where they push the frame-rate, AA and AF higher if 1080p can't support it?

That significantly increases development effort and testing time, neither of which they have to spare.
 
Thanks for all the info.
So if I have a 1080p set connected to my PS4 then I get no benefits at all if I set the PS4 to 720p?
Thanks, but I thought that running it lower would be the equivalent of setting my PC graphics card to a lower resolution with vsync engaged, to take the strain off the GPU.
 
Consoles don't work like PCs.

Your best bet will always be to run the game at your display's native resolution. Even for PC, resolution is usually the absolute last thing you'll want to reduce to try to improve performance.

And yes, I'm one of those "Give me 1080p or give me death" people. I've owned a 1080p TV since before last-gen, and I'm absolutely sick and f'in tired of not being able to play games at my TV's native resolution.
 
Consoles don't work like PCs.
They kinda do now. They are PCs in design (maybe not the XB1, which needs better software balancing). If middleware supports resolution scaling and different quality levels on PC, that'll be implementable on PS4 in exactly the same way: you set the resolution in software and can enable/disable features.
 
That significantly increases development effort and testing time, neither of which they have to spare.

Yeah, that's the primary reason you don't see options in console games. It's not that they can't do it, because surely they can, but they need to keep testing as simple as possible by removing all the variables. In theory they could ship with default, tested settings and have optional "at your own risk" settings available, but let's face it: people would still be pissed if a game glitched or crashed when they messed with the "at your own risk" settings, and would probably tell the world about how crap the game is on Facebook, Amazon ratings, etc., because that's what people do. So it's just not worth the risk.
 
They kinda do now. They are PCs in design (maybe not the XB1, which needs better software balancing). If middleware supports resolution scaling and different quality levels on PC, that'll be implementable on PS4 in exactly the same way: you set the resolution in software and can enable/disable features.
You know what I meant. I'm not talking about hardware.
 
You know what I meant. I'm not talking about hardware.
But the traditions are steeped in legacy hardware design, where you output to the only resolution available (based on the graphics mode you chose during development). We don't need to hold to them any more. If the market decides it wants options, devs can give them as readily as they can on PC. Is testing going to be more complex? Yes, but at the same time a lot of the options should be pretty trivial, and games aren't properly QA'd anywhere any more. "Get something out there and patch it" is the current modus operandi. So I can see games targeting a spec and then providing options to tweak, buyer beware. Joker may have a point about backlash, but then maybe not. Reaching a wider audience while allowing them to personalise the experience sounds good on paper.
 
Even for PC, resolution is usually the absolute last thing you'll want to reduce to try to improve performance.
That's what's always baffled me about consoles: resolution is always the first sacrifice, not AO or shadow quality or shader quality or filtering, etc.
 
That's what's always baffled me about consoles: resolution is always the first sacrifice, not AO or shadow quality or shader quality or filtering, etc.
Isn't that because most devs say:

"better pixels" > "more pixels"?
 
Yeah, but that's not what most players say when actually given the choice (i.e. PC gamers). For most of what they could conceivably cut down (shaders and the like), most people probably wouldn't even notice the difference.
 
Yeah, that's the primary reason you don't see options in console games. It's not that they can't do it, because surely they can, but they need to keep testing as simple as possible by removing all the variables. In theory they could ship with default, tested settings and have optional "at your own risk" settings available, but let's face it: people would still be pissed if a game glitched or crashed when they messed with the "at your own risk" settings, and would probably tell the world about how crap the game is on Facebook, Amazon ratings, etc., because that's what people do. So it's just not worth the risk.

No: BioShock Infinite, for example, had a "disable v-sync" option; the time it took to implement this was about 500 seconds, plus 60 seconds per language version for the translation.
Total time: under 20 minutes.

In the same way, games could ship with a "resolution=1280*720, af=16, aa=msaa4, vsync=60" option.
I'll ask Guerrilla to implement this so you will see it in the next Killzone.
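For what it's worth, such a preset is not much data either; here's a minimal sketch of what it could look like in a hypothetical engine (all names and values invented for illustration, not Guerrilla's actual code):

```cpp
#include <cstdio>

// Hypothetical settings block; the struct and field names are made up.
struct RenderSettings {
    int  width, height;
    int  anisotropy;     // AF level
    int  msaa_samples;   // 0 = no MSAA
    bool vsync;
};

// The "performance" preset from the post: 720p, 16x AF, 4x MSAA, v-synced.
const RenderSettings kPerformancePreset = {1280, 720, 16, 4, true};
// A default preset the game is actually QA'd against.
const RenderSettings kDefaultPreset     = {1920, 1080, 4, 0, true};

int main() {
    // At boot the engine would read one of these and size its render targets,
    // sampler states and swap interval accordingly.
    const RenderSettings& s = kPerformancePreset;
    std::printf("%dx%d, AF x%d, MSAA x%d, vsync %s\n",
                s.width, s.height, s.anisotropy, s.msaa_samples,
                s.vsync ? "on" : "off");
    return 0;
}
```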
 
No: BioShock Infinite, for example, had a "disable v-sync" option; the time it took to implement this was about 500 seconds, plus 60 seconds per language version for the translation.
Total time: under 20 minutes.

...and Siberia on the original Xbox had a 1080p option, which in no way means it's an easy thing to do on a wide scale, nor that developers are interested in making their already complicated lives even more complicated. I could make a long list of bugs I encountered over the years from "simple" changes like just disabling vsync, but would you even believe me? Fun stuff like a game I worked on that would randomly lock up and no one knew why; it turned out that particular build didn't have vsync enabled, which caused thread deadlocks on very rare occasions. Good times. Oh, and about that "simple" 1080p option on Siberia? It had a later-documented bug that would cause the game to eventually crash, a bug that didn't affect the normal resolution mode.


In the same way, games could ship with a "resolution=1280*720, af=16, aa=msaa4, vsync=60" option.
I'll ask Guerrilla to implement this so you will see it in the next Killzone.

Let us know how that goes.
 
As far as I know, only the PS3 did this, because it didn't have a good enough scaler to handle output resolutions.

I know that if you forced your output to 720p, in certain games the framerate would run much smoother. In FFX HD, for example, there are a lot of frame drops when summoning and doing super attacks, but at 720p they are gone.
 
...and Siberia on the original Xbox had a 1080p option, which in no way means it's an easy thing to do on a wide scale, nor that developers are interested in making their already complicated lives even more complicated. I could make a long list of bugs I encountered over the years from "simple" changes like just disabling vsync, but would you even believe me? Fun stuff like a game I worked on that would randomly lock up and no one knew why; it turned out that particular build didn't have vsync enabled, which caused thread deadlocks on very rare occasions. Good times. Oh, and about that "simple" 1080p option on Siberia? It had a later-documented bug that would cause the game to eventually crash, a bug that didn't affect the normal resolution mode.

Let us know how that goes.

The Siberia thing is obviously something that came with a big performance penalty; I think most people can imagine a higher resolution requiring a bigger frame buffer, thus leading to a memory shortage.
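For rough numbers (just a single 32-bit colour buffer, e.g. a standard-definition buffer versus a 1080-line one; real renderers also carry depth, MSAA and intermediate targets, which multiply the cost):

```cpp
#include <cstdio>

// Back-of-the-envelope arithmetic only: 4 bytes per pixel for one colour
// buffer. The real delta on a console is larger once every other buffer
// is counted.
int main() {
    const double mib = 1024.0 * 1024.0;
    const double buf_480p  =  640.0 *  480.0 * 4 / mib;  // ~1.2 MiB
    const double buf_1080p = 1920.0 * 1080.0 * 4 / mib;  // ~7.9 MiB
    std::printf("480p colour buffer: %.1f MiB, 1080p: %.1f MiB\n",
                buf_480p, buf_1080p);
    return 0;
}
```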

Can disabling v-sync cause a game to crash? As an anomaly, sure.

But if you really believe that, for example, v-sync is something that requires extensive testing, then by that notion you will also think that PC games require hundreds of thousands of playthroughs in testing to cover every settings variable.
Unless PC games ship in a state where they crash about every minute or something.

edit: if the animations or the game engine require a certain frame time, then disabling v-sync could of course lead to many anomalies.
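The classic case is gameplay code that hard-codes the frame time instead of using the measured delta; a minimal sketch (invented function names, not any particular engine):

```cpp
#include <cstdio>

// Frame-time-dependent update: only correct while the game runs at a locked
// 60 Hz, i.e. while v-sync holds the frame time at 1/60 s.
void update_assuming_vsync(float& position, float speed_units_per_sec) {
    position += speed_units_per_sec * (1.0f / 60.0f);  // hard-coded frame time
}

// Delta-time update: still correct when v-sync is off and frame times vary.
void update_with_delta(float& position, float speed_units_per_sec, float dt) {
    position += speed_units_per_sec * dt;
}

int main() {
    float a = 0.0f, b = 0.0f;
    const float dt = 1.0f / 144.0f;     // an uncapped, faster-than-60Hz frame
    update_assuming_vsync(a, 10.0f);    // moves as if 1/60 s had passed
    update_with_delta(b, 10.0f, dt);    // moves by the real elapsed time
    std::printf("assumed-vsync step: %.4f, delta-time step: %.4f\n", a, b);
    return 0;
}
```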
 
But if you really believe that, for example, v-sync is something that requires extensive testing, then by that notion you will also think that PC games require hundreds of thousands of playthroughs in testing to cover every settings variable.
PC has virtualised hardware that distances the developers, enabling a broader array of configurations to work from the same code base at a hardware performance penalty. Up until now, consoles have allowed very direct coding resulting in better performance extraction from finely tuned, more vulnerable code.

Unless PC games ship in a state where they crash about every minute or something.
Ummm... yes. Okay, broadly they work, but we see plenty of nasty bugs, like the frame pacing and virtual texture bugs recently discussed here.

edit: if the animations or the game engine require a certain frame time, then disabling v-sync could of course lead to many anomalies.
Joker's mention of multithreaded conflicts goes to show how fragile game code can be. That said, I believe we're mostly past those days of low-level code (save for ND and the like writing awesome exclusives), and the hardware is being abstracted enough that things like resolution and the amount of AA/AF can be determined by users; e.g. these are exposed in Unity for any Unity-based game. It'd be fairly trivial to add an AF quality slider to an options menu, and that should work for any console rendition of the game. Obviously that's from an indie POV. AAA is going to be somewhat different, so Guerrilla would have to develop their engine to support this, should they want to. For other (the majority of?) games, any middleware like UE should be handling the rendering, and handling it in a way that allows render setting tweaks on PC, which means the same engine should allow rendering tweaks on console unless it's a highly optimised engine.
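As a rough illustration of how small that surface can be, here's a hypothetical options-menu hook in C++ (the names are invented; in Unity the equivalent lives in its QualitySettings API):

```cpp
#include <cstdio>

// Hypothetical renderer handle; "set_max_anisotropy" stands in for whatever
// the engine's real sampler-state call is.
struct Renderer {
    void set_max_anisotropy(int level) {
        std::printf("sampler states rebuilt with %dx AF\n", level);
    }
};

// Called when the player moves the AF slider: snap to a supported level and
// push it to the renderer. No gameplay or content code is touched.
void on_af_slider_changed(Renderer& renderer, int slider_value) {
    const int supported[] = {1, 2, 4, 8, 16};
    int level = 1;
    for (int s : supported)
        if (s <= slider_value) level = s;   // round down to a supported level
    renderer.set_max_anisotropy(level);
}

int main() {
    Renderer r;
    on_af_slider_changed(r, 16);  // player picks the maximum setting
    return 0;
}
```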

In summary, I think you grossly underestimate the difficulties in supporting variable render mode and the bugs that can be produced, but I also think for the majority of games, it wouldn't actually be an issue any more, and Joker's experience is outdated. I could be wrong though!
 