720p or 1080p?

Yeah, but that's not what most players say when actually given the choice (i.e. PC gamers). Most of what could conceivably be cut down (shaders and the like), most people probably wouldn't even notice the difference.

Most people wouldn't notice a drop in resolution either. In fact, most people don't really give a rat's ass about technical shenanigans whatsoever. Does it play fine? Does it look good? If the answer is yes in both cases, then it's a job well done.
 
Thanks for all the info.
So if I have a 1080p set connected to my PS4 then I get no benefits at all if I set the PS4 to 720p?
Well, the game is still rendered at 1080p, then downscaled to 720p, and then upscaled by the TV back to 1080p.

This gives ~2x SSAA for the 720p image, which helps with some of the aliasing in the image. Also, if there is already some form of AA in the image, the TV's scaling can produce quite a nice-quality upscale to 1080p.

In some games, using the 720p mode may end up more stable and better-looking than native 1080p.
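
For what it's worth, the arithmetic behind the "~2x" figure is easy to check. Here's a back-of-envelope sketch in Python (the resolutions are the standard ones; everything else is plain arithmetic):

```python
# Rough arithmetic behind the "~2x SSAA" claim: rendering at 1080p and
# downscaling to 720p averages several rendered pixels into each displayed
# pixel, which is effectively ordered-grid supersampling.

render = (1920, 1080)   # internal render resolution
output = (1280, 720)    # resolution the console is set to output

samples_per_pixel = (render[0] * render[1]) / (output[0] * output[1])
print(f"{samples_per_pixel:.2f} rendered pixels per output pixel")  # 2.25

# Per axis the factor is only 1.5, so it isn't a clean 2x2 box filter; the
# scaler has to blend fractional pixel footprints, which is where the
# softness/blur comes from.
print(f"{render[0] / output[0]:.2f}x horizontal, {render[1] / output[1]:.2f}x vertical")
```

So "~2x" is really 2.25 samples per pixel, but spread unevenly across both axes.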
 
Well, the game is still rendered at 1080p, then downscaled to 720p, and then upscaled by the TV back to 1080p.

This gives ~2x SSAA for the 720p image, which helps with some of the aliasing in the image.
By blurring. Downscaling to 720p for display on a 1080p set makes little sense. You could possibly turn the sharpness on your TV down as low as it'll go and get the same result.
 
By blurring. Downscaling to 720p for display on a 1080p set makes little sense. You could possibly turn the sharpness on your TV down as low as it'll go and get the same result.
Pretty much, although if the TV has a good scaler it can preserve sharp edges 'decently'.

This isn't preferable where the game already has nice, stable image quality, but in some cases it might end up looking better (e.g. if the game uses dithered sampling patterns, etc.).
 
Because that setting only relates to the resolution you want your PS4 to output, not the resolution it's actually rendering at. That is decided by the game engine. So if you have a Full HD TV, pick 1080p; there won't be any performance difference.

Does that mean the image in 720p mode is downscaled from the 1080p one? Thx!
 
The Syberia thing is obviously something that came with a big performance penalty; I think most people can imagine a higher resolution requiring a larger frame buffer, thus leading to a memory shortage.

It wasn't performance for Syberia, because it didn't render much of anything; it's a point-and-click adventure. I think it was due to a memory leak when running in 1080p mode, so things would get progressively more and more garbled on screen and then it would crash, because unlike on PC there was no virtual memory to save it. I finished the game in 1080p mode on Xbox, hence why I experienced the crash, but you could see it coming, so I was able to save my game and reboot before it took out my progress.


But if you really believe that, for example, v-sync is something that requires extensive testing, then by that notion you must also think that PC games require hundreds of thousands of playthroughs in testing to cover every settings variable.
Unless PC games ship in a state where they crash about every minute or something.

The way console game testing used to be handled, any change, no matter how small, required the entire test suite to be redone. So even if you just recompiled a new build that had no code changes at all, but was a forced clean-and-build, all testing started over from scratch. Likewise, every variable had to run through a test suite, so graphics options would all have to be tested in various states to make sure they didn't inadvertently affect the game.

On PC there is no governing body that can fail your game if it doesn't meet certain criteria like there is on console, so you have to test heavily on console to make sure you don't fail certification. Say you add all these graphics options and it turns out that some combination of them causes a crash, glitch, or some kind of gameplay issue far later in the game. On PC it doesn't matter, because you can patch it later, but if it causes you to fail certification on console then you have just lost 2 to 3 weeks of time, whatever that costs, and have to go back, fix it, do your in-house tests all over again, and resubmit.
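
To put rough numbers on that combinatorial problem, here's a quick sketch (every option name, state count, and the hours figure below is invented purely for illustration):

```python
# Illustration of why exposed graphics options multiply certification risk:
# the test matrix grows as the product of the option state counts.
from math import prod

# Hypothetical options a console game might expose (all made up):
options = {
    "resolution": 3,       # e.g. 720p / 900p / 1080p
    "vsync": 2,            # on / off
    "aa": 4,               # off / FXAA / MSAA 2x / MSAA 4x
    "texture_detail": 3,   # low / medium / high
    "shadow_quality": 3,   # low / medium / high
}

combinations = prod(options.values())
print(f"{combinations} setting combinations to cover")  # 216

hours_per_full_pass = 40  # assumed cost of one full test-suite run
print(f"{combinations * hours_per_full_pass} tester-hours for exhaustive coverage")
```

Even with only five small options, exhaustive coverage means hundreds of full passes, which is exactly the cost the post above is describing.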


Joker's experience is outdated. I could be wrong though!

Yeah, to an extent it is, because there is so much middleware available now, so in theory it should be far easier to enable visual settings in console games. But there is still the certification issue to deal with. With so many variables, I don't think many developers will want to risk a certification failure to add PC-style graphics options, especially when it's unclear whether console gamers even care about such things.
 
I really don't like the idea of options in my console games. Here is why:

Consoles have weak hardware. Their only advantage is exclusive developers who can optimize the game down to the metal.

Options mean either less optimization (see PC as a reference) or a tremendous increase in dev time, i.e. dev money.

No to options. Yes to hardware/software ninja tricks.
 
Consoles have weak hardware. Their only advantage is exclusive developers who can optimize the game down to the metal.
Exclusive developers will be first/second party, and can lock those games down to the metal and hard-coded functionality. For the mainstream, who aren't going to the metal and are just using middleware, these options should have very little overhead (depending on what they are) and be relatively easy to employ.

No to options. Yes to hardware/software ninja tricks.
Ninja tricks would cost far more money. That sort of dev is a dying breed.
 
I have noticed that running the Xbox One at 1080p through my 720p Samsung plasma actually improves the image quality of certain games; Forza 5 is one example. When the TV downscales the 1080p image to 720p, I notice much less aliasing than when I play it on my native 1080p RCA LCD.
 
The way console game testing used to be handled, any change, no matter how small, required the entire test suite to be redone.
Um, yes, that's all great. Sorry to be Larry Logic here (if that dude exists),
but is this change
only for 720p / 900p / 1080p / 30fps / 60fps / LDR / HDR / WGFDR,
or is it because we want blue balloons on level one instead of red balloons, or perhaps we want a different font on the title page?
 
or is it because we want blue balloons on level one instead of red balloons, or perhaps we want a different font on the title page?
And if this is true (which it sorta is),
why even change anything at all? It's just costing us time ;)
 
720p with better pixel quality all the way. I don't like the AMD software that tries to optimize game settings on my PC with native res as the priority; on Kaveri that means everything gets set to low.
I like Tomb Raider because I can even choose the res independently of the screen aspect ratio; I can get away with 1024x768 at 16:9. I don't even mind disabling AA (at lower res the blur effect is too visible, so I'd rather not have it).
IIRC, Tomb Raider on PC has FXAA or SSAA. One is too cheap and blurry, and the other is too expensive. Because of that, I use neither.
 

I can imagine that it was indeed very different before, so respect to the people who programmed games back then.

I just thought of something:

When you boot the console game and enter... the Konami code,
an editable file appears, one that reverts every time you boot the game,
in which you can set the resolution, lock the framerate (soft v-sync for all I care), change AA settings, and change AF settings.
When you change a setting it automatically shows "save game backed up", and it also displays a disclaimer: "The performance of the game is not guaranteed in this mode; it is only for enthusiasts who want to change some settings. Online play is disabled."

That way everybody is happy.
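
Just to make the idea concrete, here's a sketch of what that hypothetical file could look like, parsed with Python's standard configparser (nothing like this exists on any console; every section and key name here is made up):

```python
# A sketch of the hypothetical "Konami code" settings file described above.
# Purely illustrative: all sections, keys, and values are invented.
import configparser

DEFAULTS = """
[video]
resolution = 1080p      ; reverts to the game's default on every boot
framerate_lock = 30     ; soft v-sync cap
aa = fxaa
af = 8x

[meta]
online_play = disabled  ; per the disclaimer: no online play in this mode
"""

config = configparser.ConfigParser(inline_comment_prefixes=(";",))
config.read_string(DEFAULTS)

print("save game backed up")
print("Disclaimer: performance is not guaranteed in this mode; "
      "it is only for enthusiasts. Online play is disabled.")
for key, value in config["video"].items():
    print(f"{key} = {value}")
```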
 
Using AMD, don't you also have the option of MLAA?

From the game itself, no. Maybe from the Catalyst Control Panel; I haven't messed with it. Anyway, I already finished the game (and 100%'d everything minus the achievements, because I don't care about those). I only needed to look up a walkthrough to find three things at the beach. For me, the minimum acceptable resolution is 720p (or 1024x768 if the game decouples res from aspect ratio). I tried to go as low as 480, but it was a bridge too far; text became too blurry to read. Basically, modern games were created for HD, so any resolution lower than that is a no-go.
I do play Guacamelee and Trine 2 @1080p because it's practically free. But if a game drops fps too much, the first thing I do is drop the res.
 