1080p Dilemma

No.

They should have run it at a resolution that allowed them to hit somewhere even remotely close to their performance target.

1080p with aniso-less, full-screen-blurred, sub-30 fps "60 fps", tearing-afflicted graphics is like proving to fuckwits what a man you are by cutting your own balls off.

Seriously, this is the worst generation ever. The 1080p cheerleaders have wrought the destruction of console gaming through ignorance, hubris and wilful stupidity.

Horrendous.

So basically, PS4 should be 900p with more restriction on tearing, and Xbox One should be 792p (or something).

Yet more 1080p chasing madness.

Since function is under the impression that 1080p is ruining gaming... should 3rd-party developers just opt out of the 1080p drama and go for lower native resolutions regardless of system capabilities, while internal and 1st-party teams strive for the best possible native IQ?
 
Quality over quantity.

I remember when that was an old Nintendo tag-line... :LOL: But for the most part, I do agree.

Could this be why certain people are claiming PS4 doesn't have any games... because of quality? j/k :LOL:
 
If the conversation is in reference to Lords of the Fallen, I think the devs need to do a better job of optimizing their games. Same with The Evil Within. I'm not a big fan of games being 1080p just to meet a spec that is "next gen". Whether people like it or not, games will most likely resort to sub-native res towards the end of this console gen. I guess it all comes down to what is better for the title in the end: 1080p without a stable framerate, or sub-native with a better framerate. Of course, not every title is going to have an issue with either.
 
I'm not sure what developers in general expected out of this gen's consoles, but it feels like they wanted something more powerful... or overestimated their ability to optimize and cut enough corners to get the desired result. I'd say go for 900p or under if needed. Try to get that stable 30fps and some AA method that's not extremely blurry. Those who care about framerates above that, or higher resolution, can always get the PC version (outside console exclusives, of course).

For example, I have no idea why Ubisoft wanted to push thousands of mobs when a hundred mobs would have been a better solution (for this generation of consoles; PC is a different story, or the next gen if there is one). Aim lower I say :cool:
 
Since function is under the impression that 1080p is ruining gaming...

Not what I'm saying.

1920 x 1080 is just a resolution. It's one choice among many. The significance that particular resolution has taken on goes far beyond its actual importance in determining the visual (or other) qualities of a game.

1920 x 1080 isn't hurting gaming; making wrong choices to appease the ill-informed and opinionated is hurting gaming. Unplayable framerate due to resolution? Wrong resolution.

Also: quality above quantity (to repeat the words of the sage Brit). The quantity is largely invisible even to the mouthbreathers who huff bullshit all over the internets, as demonstrated time after time. The quality, however, is very much something that affects everyone.
 
I am currently still thinking that most of the framerate issues could be solved by moving CPU AI code to GPU AI code. Since that's not trivial, and still new for a fair number of developers, it'll take a little while before that becomes common. And since the games releasing now are mostly still developed primarily on PC, I wouldn't be surprised if they're still very much tailored to the PC's strengths (strong CPU) and weaknesses (draw calls...). As porting is much easier, we're seeing a lot more low-quality ports of games that otherwise wouldn't have made it over at all.
 
So, should ACU lower its resolution to 720p (or lower)? From all indications, the framerate isn't consistent, nor is the texture quality. So is 900p the culprit? Or could it be the awful choices Ubi has made overall with ACU so far (super-uber-duber A.I. (sarcasm), hundreds of NPCs, etc.)?

In other words, if a game is performing poorly regardless of resolution - shouldn't the Internet buzzwords be "poor development choices", not the boogeyman 1080p?

IMHO, 1080p just sounds like scapegoating for poor design choices.
 
Resolution is one of many tick box features one can play with. If devs really wanted to, they could scale back on shadow detail, LOD, lighting, or whatever parts of the graphics engine they want to sacrifice if they really needed 1080p as a constant without affecting the framerate.
The fact that we get 1080p with tearing and sub-30fps gameplay is, in my humble opinion, just a matter of bad decisions in the game making process.
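
To make that concrete, here's a rough sketch of the kind of trade-off I mean; the knobs and the toy millisecond costs are purely hypothetical, not taken from any shipped engine:

```cpp
// Hypothetical quality knobs that can be traded before touching resolution.
struct QualitySettings {
    int  shadowMapSize = 2048;   // drop to 1024 to buy back GPU time
    int  lodBias       = 0;      // raise to swap in lower-poly models sooner
    bool volumetricFog = true;   // expensive lighting extra, first candidate to cut
};

// Toy cost model (made-up numbers) just to show the idea: keep cutting the
// cheaper features until the frame fits the 30 fps budget at native 1080p.
float estimateGpuMs(const QualitySettings& q)
{
    float ms = 28.0f;                               // base scene at 1920x1080
    ms += (q.shadowMapSize == 2048) ? 4.0f : 2.0f;  // shadow rendering
    ms += q.volumetricFog ? 3.5f : 0.0f;            // volumetric lighting
    ms -= 0.5f * static_cast<float>(q.lodBias);     // coarser LODs are cheaper
    return ms;
}

void fitBudget(QualitySettings& q, float budgetMs = 33.3f)
{
    while (estimateGpuMs(q) > budgetMs) {
        if (q.volumetricFog)              q.volumetricFog = false;
        else if (q.shadowMapSize == 2048) q.shadowMapSize = 1024;
        else                              ++q.lodBias;
    }
}
```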
 
1920 x 1080 isn't hurting gaming; making wrong choices to appease the ill-informed and opinionated is hurting gaming. Unplayable framerate due to resolution? Wrong resolution.
Does 1080p need AA at all? I don't think so.

I wonder to what degree they could improve performance by removing AA altogether, and the aberration effect they use, too.

Still, I like Lords of the Fallen; it's from a small developer and the result isn't that bad for a first experience on these consoles.
 
I am currently still thinking that most of the framerate issues could be solved by moving CPU AI code to GPU AI code.

The conundrum there is that shifting CPU code to the GPU will require that they forgo the 1080p bullet point. In the current age of "1080p or we scream lazy devs", I'd say marketing will not want to go there.
 
Does 1080p need AA at all? I don't think so.
1080p definitely needs AA when viewed on large screens. 50"+ TVs are dirt cheap nowadays and people buy them for the same living rooms that used to have 28" CRTs. 1080p on a modern big-screen TV produces roughly the same pixel size as 480p did some years ago on an average TV set.

Post AA is not enough for 1080p, not even on a small computer monitor. Edge crawling is still too distracting, and it's an even bigger problem nowadays since geometry complexity has increased (more small details and more draw distance -> more high-contrast edges).

I'd say that (max) four geometry subsamples per pixel is a good compromise when combined with a smart custom resolve. Sampling doesn't need to be brute force. You don't need the same amount of sampling information at every screen location (not always even a single sample per pixel). It's the high-contrast areas that matter.

I personally feel that every console game should output at native 1080p. Scalers always cause image quality degradation. But I want to emphasize that this doesn't mean that the game should brute-force sample everything at the same frequency (a fixed distribution of ~2M samples per frame). Also, throwing away all the work done in previous frames is stupid. Game developers should definitely learn from video codecs: 1080p video wouldn't be possible if it was uncompressed and no data was reused (every pixel stored again for every frame at full quality).

So I believe 1080p is still the way to go (even at 60 fps).
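
A minimal sketch of the "smart resolve" idea above, assuming a toy luminance buffer and four subsamples per pixel (illustrative only; a real implementation would be a compute shader working on full colour):

```cpp
#include <algorithm>
#include <array>
#include <cstddef>
#include <vector>

// Toy "smart" custom resolve: 4 geometry subsamples per pixel, but the full
// averaging work is only spent where the subsamples actually disagree
// (high-contrast edges). Flat interior pixels just take one sample.
struct Pixel { std::array<float, 4> sub; };

std::vector<float> smartResolve(const std::vector<Pixel>& frame,
                                float edgeThreshold = 0.05f)
{
    std::vector<float> out(frame.size());
    for (std::size_t i = 0; i < frame.size(); ++i) {
        const auto& s = frame[i].sub;
        float mn = s[0], mx = s[0];
        for (float v : s) { mn = std::min(mn, v); mx = std::max(mx, v); }
        if (mx - mn < edgeThreshold)
            out[i] = s[0];                                  // flat: one sample is enough
        else
            out[i] = (s[0] + s[1] + s[2] + s[3]) * 0.25f;   // edge: full 4x resolve
    }
    return out;
}
```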
 
Food for thought: I just had a look, and there were actually quite a few games on PS360 that were 1080p, e.g. Full Auto 2 was 1080p with 4xMSAA! (none of your cheap afterwards-added AA). OK, these games aren't as complicated as what's expected on today's consoles, but today's consoles have an order of magnitude more GPU power.
 
So I believe 1080p is still the way to go (even at 60 fps).
I see... So 1080p isn't just a buzzword then. Maybe, if I may be permitted the metaphor, I am a traditionalist (as bkilian would say), and maybe I'd rather have some old-fashioned roller skates than a motorcycle, or than what for others should be nothing short of a Mercedes Benz, :p to visit my neighbour who lives 100 metres from my house.

I just hope you don't forget about gameplay when making your games. Would dropping the resolution from 1080p to something else be open to negotiation for you, in order to create 3D stereoscopic games? :smile2: 120 or 240 fps?
 
The conundrum there is that shifting CPU code to the GPU will require that they forgo the 1080p bullet point. In the current age of "1080p or we scream lazy devs", I'd say marketing will not want to go there.
I'd say that the Xbox One CPU could be the iceberg of the Xbox One. :smile2:

I mean... with a dedicated 30GB/s bus, and judging from this Metro 2033 Digital Foundry interview on how they created the game for all the platforms, that might be the case.

http://www.eurogamer.net/articles/digitalfoundry-metro2033-article (also note, towards the end of the article, what a beast the X360 GPU was in its time)
 
The important question is: What is GPGPU? Is animation GPGPU? Is occlusion culling GPGPU? Is scene setup (matrix & constant buffer update) GPGPU? Is texture transcoding GPGPU? Traditionally games have been doing all of these tasks purely on the CPU (and many still do), but modern engines can do all of these on the GPU (in the compute pipeline). You can free at least two whole CPU cores to gameplay code if you move these graphics engine tasks to the GPU. Is lighting or particle rendering GPGPU if I do them in a compute shader (not using any fixed function graphics hardware)? What about raytraced reflections or octree traversals (traditional CPU tasks)?
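
To make one of those cases concrete, here's a rough sketch of the "animation" example, assuming simple 4-bone linear-blend skinning (hypothetical code, not from any particular engine): the per-vertex work is a pure function with no shared mutable state, which is exactly why it moves so naturally from a CPU loop to a compute shader and frees those CPU cycles for gameplay.

```cpp
#include <vector>

// Rough sketch of 4-bone linear-blend skinning as a per-vertex "kernel".
// The loop body reads shared constant data (the bone matrices) and writes only
// its own output, so it maps directly onto a compute-shader thread instead of
// burning CPU time every frame.
struct Mat4   { float m[16]; };                       // row-major 4x4
struct Vertex { float pos[3]; int bone[4]; float weight[4]; };

static void transformPoint(const Mat4& m, const float in[3], float out[3])
{
    for (int r = 0; r < 3; ++r)
        out[r] = m.m[r*4+0]*in[0] + m.m[r*4+1]*in[1] + m.m[r*4+2]*in[2] + m.m[r*4+3];
}

void skinVertex(const Vertex& v, const std::vector<Mat4>& bones, float out[3])
{
    out[0] = out[1] = out[2] = 0.0f;
    for (int i = 0; i < 4; ++i) {
        float p[3];
        transformPoint(bones[v.bone[i]], v.pos, p);
        for (int c = 0; c < 3; ++c)
            out[c] += v.weight[i] * p[c];             // weighted blend of bone transforms
    }
}
```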

Are modern game engines flexible enough that one can change this decision, maybe even frame by frame?

Say, if the scheduler sees that CPU cycles are left over, put e.g. texture transcoding there; if the CPU is taken up by higher-priority processes (i.e. parts which are not flexible enough for GPGPU) and nothing is left on the CPU, use the GPGPU algorithm. And spinning the thing even further: scale the dynamic resolution such that the target frametime is guaranteed...
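
A minimal sketch of that last part, the dynamic-resolution idea, assuming a simple proportional controller fed by the measured frame time (hypothetical names and numbers):

```cpp
#include <algorithm>

// Hypothetical dynamic-resolution controller: nudge the render scale each
// frame so the measured frame time converges on the budget, instead of letting
// a fixed 1080p target blow past 33.3 ms.
struct DynamicResolution {
    float scale    = 1.0f;    // 1.0 = native 1080p, 0.75 ~= 810p, etc.
    float minScale = 0.7f;
    float maxScale = 1.0f;

    void update(float frameMs, float targetMs = 33.3f)   // 30 fps budget
    {
        // Proportional step: over budget -> drop resolution, under -> raise it.
        float error = (targetMs - frameMs) / targetMs;
        scale = std::clamp(scale + 0.1f * error, minScale, maxScale);
    }
};
```

Each frame you would feed in the measured frame time and render at scale × 1920 by scale × 1080, upscaling to the 1080p output.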
 
I guess there are good arguments for and against 1080p. I say leave it up to the devs, just as long as they are able to hit a rock-solid framerate at their target. If that requires a lower resolution then so be it. Hopefully later this gen these issues will be largely resolved, or the decision-making process going into it will be easier.
 
Devs that actually put the effort into it, especially on exclusive titles, seem to have no trouble hitting 1080p at good performance levels. I wouldn't use someone like Ubisoft as a barometer for what devs in general are capable of doing on the hardware.
 
I can't think of any sub-1080p titles on the PS4 from Sony's own internal/1st-party teams, other than KZ:SF MP rendering 1080p in a different manner.
 