1080p Dilemma

I can't think of any sub-1080p titles on the PS4 from Sony's own internal/1st party teams, other than KZ:SF MP rendering 1080p in a different manner.
It was rendering 960x1080 in terms of new pixels drawn every frame. Also, "The Order" can be considered to render <1080p due to letterboxing; it's roughly 20% fewer pixels. Thanks to the aspect ratio we can still consider The Order '1080p' in the horizontal, but technically fewer than a full 1080p frame's worth of pixels are being drawn.
 
Yep. But, if you care more about the one-to-one pixel aspect of the resolution than the actual number of pixels, native is native. ;)

I think Call of Duty opting for a dynamic resolution rather than just the old 900p on Xbox One is promising though - in my experience so far with games that do that (like Wipeout HD already did on PS3 to achieve 1080p @ 60fps!), it's a much better solution.

Also, added Call of Duty with the reddit information ... http://techingames.net/Games/Details/55
 
1080p definitely needs AA when viewed on large screens. 50"+ TVs are dirt cheap nowadays and people put them in the same living rooms that used to have 28" CRTs. 1080p on a modern big-screen TV produces roughly the same pixel size as 480p did some years ago on an average TV set.

Post AA is not enough for 1080p, not even on a small computer monitor. Edge crawling is still too distracting, and it's an even bigger problem nowadays since geometry complexity has increased (more small details and more draw distance -> more high contrast edges).

I'd say that (max) four geometry subsamples per pixel is a good compromise when combined with a smart custom resolve. Sampling doesn't need to be brute force. You don't need the same amount of sampling information at every screen location (not always even a single sample per pixel); it's the high contrast areas that matter.

I personally feel that every console game should output at native 1080p. Scalers always cause image quality degradation. But I want to emphasize that this doesn't mean the game should brute-force sample everything at the same frequency (a fixed distribution of ~2M samples per frame). Also, throwing away all the work done in the previous frames is stupid. Game developers should definitely learn from video codecs: 1080p video wouldn't be possible if it was uncompressed and no data was reused (every pixel stored again at full quality for every frame).
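To make the adaptive-sampling idea a bit more concrete, here is a minimal CPU-side sketch of an edge-adaptive resolve; the luma threshold and all the names are purely illustrative, not taken from any shipped engine. Pixels whose geometry subsamples actually differ pay for a full average, everything else just passes a single sample through.

```cpp
#include <array>
#include <cmath>

struct Color { float r, g, b; };

// Decide whether a pixel sits on a high-contrast edge by comparing subsample
// lumas. The 0.1f threshold is an arbitrary illustration value.
static bool isHighContrast(const std::array<Color, 4>& s, float threshold = 0.1f)
{
    auto luma = [](const Color& c) { return 0.2126f * c.r + 0.7152f * c.g + 0.0722f * c.b; };
    float mn = luma(s[0]), mx = mn;
    for (int i = 1; i < 4; ++i) { float l = luma(s[i]); mn = std::fmin(mn, l); mx = std::fmax(mx, l); }
    return (mx - mn) > threshold;
}

// Edge-adaptive resolve: only pixels flagged as high contrast pay for the full
// 4-sample average; everything else forwards a single subsample untouched.
Color resolvePixel(const std::array<Color, 4>& samples)
{
    if (!isHighContrast(samples))
        return samples[0];
    Color out{0, 0, 0};
    for (const Color& c : samples) { out.r += c.r; out.g += c.g; out.b += c.b; }
    out.r *= 0.25f; out.g *= 0.25f; out.b *= 0.25f;
    return out;
}
```

A real implementation would of course run this in a shader, and the same edge mask could also feed a smarter, wider reconstruction filter.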

So I believe 1080p is still the way to go (even at 60 fps).

And yet you shipped Trials on X1 at <1080p, prioritising solid frame rate and pixel quality (absolutely the right choice, of course). Resolution was the last thing to be set, with frame rate being an absolute priority.

This is in stark contrast to the games that piss me off - 1080p games with sorry frame rates.

I agree with almost everything above, except the idea that native res should always trump pixel quality. Lighting and AA - particularly edge AA - are huge. MSAA with a custom resolve and downsample (possibly after an additional shader-based pass) could easily look better than native 1080p with poor or no AA, while being faster.

Most of those cheap 50" TVs will never be run at native res, btw. :)
 
Does 1080p need AA at all? I don't think so.

You can make a fairly direct comparison by using Forza 5 and Forza Horizon 2. Both are native 1080p, one with 4xMSAA and one without any AA.

These are obviously my own observations and different eyes can come to different conclusions:

Horizon 2 looks to me to be considerably cleaner at all ranges. Detail can be discerned at much further distances compared to Forza 5 without the distraction of shimmering and pixel crawl.

Even though Forza 5 runs at 60fps, it still feels to me like there is slightly more temporal resolution in Horizon 2. Obviously there isn't - it's technically half the temporal resolution when talking about framebuffers - but what the eyes and brain can resolve is not the same as what a TV can output.

Smoothness/image clarity through good AA is still important at 1080p IMO.
 
And yet you shipped Trials on X1 at <1080p, prioritising solid frame rate and pixel quality (absolutely the right choice, of course). Resolution was the last thing to be set, with frame rate being an absolute priority.

Lol I was going to mention that! I have to agree though, for twitch gameplay like Trials framerate had to be prioritised over everything.

Sebbbi, did you ever get to have a shot at patching Trials with the June SDK, or had that window of opportunity passed, (i.e. no budget left for post-release performance patching)?


I agree with almost everything above, except the idea that native res should always trump pixel quality. Lighting and AA - particularly edge AA - are huge. MSAA with a custom resolve and downsample (possibly after an additional shader-based pass) could easily look better than native 1080p with poor or no AA, while being faster.

No need for 'could' there - just look at Ryse. Even at 900p upscaled it has image quality that I've never seen in another game at any resolution.

Compare that to Forza 5 which is 1080p with no AA. There is a world of difference, in Ryse's favour.
 
It was rendering 960x1080 in terms of new pixels drawn every frame. Also, "The Order" can be considered to render <1080p due to letterboxing; it's roughly 20% fewer pixels. Thanks to the aspect ratio we can still consider The Order '1080p' in the horizontal, but technically fewer than a full 1080p frame's worth of pixels are being drawn.

I forgot about "The Order" resolution setup.
 
Are modern game engines flexible enough that one can change this decision, maybe even frame by frame?

Say, if the scheduler reports that spare CPU cycles are available, put e.g. texture transcoding there; if the CPU is occupied by higher-priority work (i.e. the parts which are not flexible enough for GPGPU) and nothing is left on the CPU, use the GPGPU algorithm instead.

This would likely be tough, because the way (and where) you lay out your data in memory will often differ depending on which processor you intend to have chew through it, in order to get the most optimal use of it. Likewise you may use a totally different algorithm depending on which piece of hardware you want to process it, which would likely also require the data be stored in different ways and possibly in different locations.
 
Are modern game engines flexible enough that one can change this decision, maybe even frame by frame?
That's possible, but would be tricky to implement... however most graphics setup and animation code runs super quickly on GPU, as that kind of (math heavy + super parallel) code is much better suited for the GPU than the CPU. On GPU side you can use GPU generated structures (such as the depth buffer) to do more precise culling, freeing more GPU cycles than the culling stages cost. It might sound a bit strange, but sometimes moving processing to GPU saves BOTH CPU and GPU cycles.
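As a rough illustration of the depth-buffer-based culling mentioned above, here is a sketch of a conservative occlusion test against a downsampled max-depth mip; it assumes a standard depth convention where larger values are farther, and the types and names are hypothetical. A real engine would run this in a compute shader over the whole instance list rather than on the CPU.

```cpp
#include <vector>

struct ScreenRect { int x0, y0, x1, y1; };                 // object's footprint in HiZ texels
struct HiZMip { int width; std::vector<float> maxDepth; }; // farthest stored depth per texel

// Conservative test: the object can be culled only if, under every texel it
// covers, even the farthest already-rendered surface is still closer than the
// object's nearest point.
bool isOccluded(const ScreenRect& r, float objectNearestDepth, const HiZMip& hiz)
{
    for (int y = r.y0; y <= r.y1; ++y)
        for (int x = r.x0; x <= r.x1; ++x)
            if (objectNearestDepth <= hiz.maxDepth[y * hiz.width + x])
                return false;   // object could be in front somewhere -> keep it
    return true;                // behind everything under its whole footprint
}
```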

spinning the thing even further: scale the dynamic resolution such that the target frametime is guaranteed...
That's definitely possible. Alternatively you could always output at 1080p and vary the pixel quality (use some sort of iterative refinement technique and always give it the same time budget). For example this OpenGL extension (timeAMD) can be used to do GPU-side timing: https://www.opengl.org/registry/specs/AMD/gcn_shader.txt.
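A minimal sketch of what a frame-time-driven resolution controller could look like, assuming the GPU frame time is already measured by a timer query (such as the timeAMD counter mentioned above); the damping factor and clamp range are made-up illustration values.

```cpp
#include <algorithm>
#include <cmath>

// Nudge the render scale so the measured GPU frame time converges on the
// target. Cost scales roughly with pixel count, i.e. with scale^2, so correct
// by the square root of the time ratio and damp it to avoid oscillation.
float updateRenderScale(float currentScale, float gpuFrameTimeMs, float targetMs = 16.6f)
{
    float correction = std::sqrt(targetMs / gpuFrameTimeMs);
    float damped = 1.0f + 0.5f * (correction - 1.0f);          // apply 50% of the correction per frame
    return std::clamp(currentScale * damped, 0.5f, 1.0f);      // e.g. never drop below half width/height
}
```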
And yet you shipped Trials on X1 at <1080p, prioritising solid frame rate and pixel quality (absolutely the right choice, of course). Resolution was the last thing to be set, with frame rate being an absolute priority.

This is in stark contrast to the games that piss me off - 1080p games with sorry frame rates.
I always choose locked 60 fps over 1080p. With Trials Fusion we unfortunately had to render at 900p on Xbox One. But it was a "launch window" game and we had to support DX10 PCs and Xbox 360 using the same code base, meaning that it was impossible for us to implement a brand new fully GPU compute based engine for the game. Xbox 360 was a high priority for us since we had such a large fan base on that console.

It's nice to see that people are moving to the next gen consoles faster than anyone expected. It took more than 3 years until PS3 games could compete with PS2 games in sales. Fast next gen adoption will definitely allow developers to start focusing on the next generation sooner, and I am sure it will improve the quality of the game visuals.
I agree with almost everything above, except the idea that native res should always trump pixel quality. Lighting and AA - particularly edge AA - are huge. MSAA with a custom resolve and downsample (possibly after an additional shader-based pass) could easily look better than native 1080p with poor or no AA, while being faster.
You don't always need to sacrifice pixel quality to increase your output resolution. I'd say that 720p image reconstructed to 1080p (using for example a 1080p depth buffer) would look better than 900p in most cases (as the edge quality would be equal to 1080p). Obviously to reach a perfect result everywhere you'd need more input data (than just a depth buffer) at 1080p and you wouldn't want to lock in the material processing quality to some fixed resolution (like 720p). You'd want to analyze the screen and determine the sample locations dynamically. Definitely you can do more with less, if you are clever. Sometimes you can have both: good pixel quality (where it matters) and good resolution (where it matters).
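For illustration, here is a sketch of that reconstruction idea as a depth-guided (joint bilateral) upsample of a single output pixel; the function and its inputs are hypothetical, not a description of any actual engine.

```cpp
#include <array>
#include <cmath>

struct Color { float r, g, b; };

// Upsample one high-res pixel from the four nearest low-res color taps,
// guided by the full-resolution depth buffer: taps whose depth disagrees with
// the full-res depth (an edge) get almost no weight, so edges follow the
// 1080p depth buffer instead of the 720p color buffer.
Color upsamplePixel(const std::array<Color, 4>& lowResColor,
                    const std::array<float, 4>& lowResDepth,
                    const std::array<float, 4>& bilinearWeight,
                    float fullResDepth)
{
    Color out{0, 0, 0};
    float wSum = 0.0f;
    for (int i = 0; i < 4; ++i)
    {
        float depthDiff = std::fabs(lowResDepth[i] - fullResDepth);
        float w = bilinearWeight[i] / (1e-4f + depthDiff);
        out.r += lowResColor[i].r * w;
        out.g += lowResColor[i].g * w;
        out.b += lowResColor[i].b * w;
        wSum += w;
    }
    out.r /= wSum; out.g /= wSum; out.b /= wSum;
    return out;
}
```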
 
I'd say that 720p image reconstructed to 1080p (using for example a 1080p depth buffer) would look better than 900p in most cases (as the edge quality would be equal to 1080p).
Has anyone ever tried YUV type rendering using half/quarter-res chrominance? It'd need a completely different pipeline to RGB but should give very acceptable results going by video compression if high framerate can be maintained.
 
Has anyone ever tried YUV type rendering using half/quarter-res chrominance? It'd need a completely different pipeline to RGB but should give very acceptable results going by video compression if high framerate can be maintained.
http://www.pmavridis.com/research/fbcompression/
It's almost impossible to see the difference (even the error difference pictures at 2x multiplier are mostly too dark for the eye to see).

There is actually a whole thread here in the forums already about this:
http://forum.beyond3d.com/showthread.php?t=63704
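For the curious, this is roughly what the chroma-subsampling idea looks like in code: the standard YCoCg transform plus a checkerboard that keeps full-resolution luma but only one chroma component per pixel. The packing here is a simplified illustration, not the paper's exact scheme.

```cpp
// Standard RGB <-> YCoCg transform.
struct Rgb   { float r, g, b; };
struct YCoCg { float y, co, cg; };

YCoCg toYCoCg(const Rgb& c)
{
    return {  0.25f * c.r + 0.5f * c.g + 0.25f * c.b,
              0.5f  * c.r               - 0.5f  * c.b,
             -0.25f * c.r + 0.5f * c.g - 0.25f * c.b };
}

Rgb toRgb(const YCoCg& c)
{
    return { c.y + c.co - c.cg,
             c.y        + c.cg,
             c.y - c.co - c.cg };
}

// Per-pixel storage: luma always, plus Co on "even" pixels and Cg on "odd"
// ones. The missing component is reconstructed from neighbours at read time.
struct PackedPixel { float y; float chroma; };

PackedPixel pack(const Rgb& c, int x, int y)
{
    YCoCg ycocg = toYCoCg(c);
    bool even = ((x + y) & 1) == 0;
    return { ycocg.y, even ? ycocg.co : ycocg.cg };
}
```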
 
That's possible, but would be tricky to implement... however most graphics setup and animation code runs super quickly on GPU, as that kind of (math heavy + super parallel) code is much better suited for the GPU than the CPU. On GPU side you can use GPU generated structures (such as the depth buffer) to do more precise culling, freeing more GPU cycles than the culling stages cost. It might sound a bit strange, but sometimes moving processing to GPU saves BOTH CPU and GPU cycles.


That's definitely possible. Alternatively you could always output at 1080p and vary the pixel quality (use some sort of iterative refinement technique and always give it the same time budget). For example this OpenGL extension (timeAMD) can be used to do GPU-side timing: https://www.opengl.org/registry/specs/AMD/gcn_shader.txt.

I always choose locked 60 fps over 1080p. With Trials Fusion we unfortunately had to render at 900p on Xbox One. But it was a "launch window" game and we had to support DX10 PCs and Xbox 360 using the same code base, meaning that it was impossible for us to implement a brand new fully GPU compute based engine for the game. Xbox 360 was a high priority for us since we had such a large fan base on that console.

It's nice to see that people are moving to the next gen consoles faster than anyone expected. It took more than 3 years until PS3 games could compete with PS2 games in sales. Fast next gen adoption will definitely allow developers to start focusing on the next generation sooner, and I am sure it will improve the quality of the game visuals.

You don't always need to sacrifice pixel quality to increase your output resolution. I'd say that 720p image reconstructed to 1080p (using for example a 1080p depth buffer) would look better than 900p in most cases (as the edge quality would be equal to 1080p). Obviously to reach a perfect result everywhere you'd need more input data (than just a depth buffer) at 1080p and you wouldn't want to lock in the material processing quality to some fixed resolution (like 720p). You'd want to analyze the screen and determine the sample locations dynamically. Definitely you can do more with less, if you are clever. Sometimes you can have both: good pixel quality (where it matters) and good resolution (where it matters).
Thanks for the insightful reply. I think what developers are unaware of is how "outsiders" now think of the new consoles. :cry: By outsiders I mean people who are on the fence about whether to buy a new console, PC gamers, and so on.

In many cases they have a feeling that this generation is a total failure. :cry: And I am so happy with these new consoles and disagree with them with the heat of a thousand suns, but I can't shake off that feeling of powerlessness when they talk about them like that.

I also blame that on developers, because they choose fanciness over steady framerates and perfect cohesion, leaving the consoles vulnerable to attacks from the outside.

You hear complaints from people saying that these new consoles run games at 900p and can't go over 30 fps, and that they are a sheer failure because they even struggle to achieve 30 fps, and bla bla bla.

It certainly hurts and I hate that.
 
You can make a fairly direct comparison by using Forza 5 and Forza Horizon 2. Both are native 1080p, one with 4xMSAA and one without any AA.

These are obviously my own observations and different eyes can come to different conclusions:

Horizon 2 looks to me to be considerably cleaner at all ranges. Detail can be discerned at much further distances compared to Forza 5 without the distraction of shimmering and pixel crawl.

Even though Forza 5 runs at 60fps, it still feels to me like there is slightly more temporal resolution in Horizon 2. Obviously there isn't - it's technically half the temporal resolution when talking about framebuffers - but what the eyes and brain can resolve is not the same as what a TV can output.

Smoothness/image clarity through good AA is still important at 1080p IMO.
Well, from what I can gather from your post, I gotta agree with you on that; although Forza 5 has AA, as pointed out already it's not very good. AA might be the least of developers' concerns though - it seems to have stopped being an issue - but fully switching to the new generation definitely is.

This interesting and very recent article (from today), on how high resolution for its own sake is irrelevant, sums up some interesting points - the guy says 1080p is fine; 4K is the resolution he doesn't seem to like that much.

http://www.examiner.com/article/next-gen-gaming-and-the-deception-of-4k-resolution

gamers love to brag about the 4K capabilities of the highest end rigs, even calling the current gen consoles "outdated." Apparently, the future of gaming lies in the resolution. I couldn't disagree more.

1080p in my opinion hits the sweet spot. If we stayed in 1080p for the next decade I'd have no problem at all. What game developers need to strive for are the graphics, not the resolution.

What am I talking about? Take a look at the movie Jurassic Park, which came out over 20 years ago. The dinosaurs rendered in that film look significantly more realistic and more lifelike than any creature we've seen in any current video game. Skip about a decade to Peter Jackson's King Kong. King Kong looks and moves like a real gorilla. Then take a look at Rango, its detailed worlds, textures and realistic lighting.

My point is, the visuals of these films are light years ahead of anything in any video game today, even at 720p or 480p. Developers need to strive for graphics that rival King Kong and Jurassic Park. 4K gaming is just a diversion and a misdirection, nothing else.
 
You don't always need to sacrifice pixel quality to increase your output resolution. I'd say that 720p image reconstructed to 1080p (using for example a 1080p depth buffer) would look better than 900p in most cases (as the edge quality would be equal to 1080p).

How about a 4xMSAA depth?

Aaaand all the pixel counters will go crazy :p

We're not already? :runaway:
 
Has anyone ever tried YUV type rendering using half/quarter-res chrominance? It'd need a completely different pipeline to RGB but should give very acceptable results going by video compression if high framerate can be maintained.

Crysis 3 & Ryse G-buffers have certain YUV components.

Crysis 3: Albedo Y stored in A-chan, Albedo Cb/Cr stored in B-chan

Ryse: GBA-chans
Specular color stored as YCbCr to better support blending to GBuffer (e.g. decals)
Allow blending of non-metal decals despite not being able to write alpha during blend ops
Can still break when blending colored specular (rare case that was avoided on art side)
Specular chrominance aliased with transmittance luminance
Exploiting mutual exclusivity: colored specular just for metal, translucency just for dielectrics
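As a purely hypothetical illustration of that mutual-exclusivity trick (the names and encoding below are mine, not Ryse's actual layout): one shared G-buffer channel holds specular chrominance for metals and transmittance for dielectrics, since a material never needs both at once.

```cpp
// One G-buffer channel does double duty: metals have colored specular but no
// translucency, dielectrics may be translucent but have monochrome specular.
struct PackedSpecular { float specLuma; float sharedChannel; };

PackedSpecular packSpecular(float specLuma, float specChroma,
                            float transmittance, bool isMetal)
{
    PackedSpecular out;
    out.specLuma = specLuma;
    out.sharedChannel = isMetal ? specChroma : transmittance;
    return out;
}
```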
 
My point is, the visuals of these films are light years ahead of anything in any video game today, even at 720p or 480p.
But Jurassic Park, King Kong and Rango were all rendered at higher than 1080p, on machines with greater power than today's consoles.
If they were rendered at 480p (and resized to, say, 1080p), the person that wrote that article would go "WTF, turn on the antialiasing, this looks horrible".
 
From a consumer point of view - I'd rate my priority as

framerate > resolution >> overall visual complexity

Depending on the genre, I'd happily trade 1080p for 60fps, like in fast paced games where the detail of a single pixel is less of a factor because it's always moving at high speed. This would be racers for instance, or fast paced games like shooters.

In other genres, where you have slow-moving pixels and details are important, I guess there's a valid case for prioritizing resolution over framerate. I.e. a strategy game with a C&C type of view, or games that play on a 2D plane (Resogun and others) where the speed at which the camera moves is predictable.

In a perfect world I'd like to have 1080p60 in every single game, but some games just don't require it, so I guess going 1080p30 or 720p60 is a valid tradeoff. I do think however that most games would be better off if they were forced to target 60fps, perhaps even 1080p60 if need be, in order to have a more level playing field and a more consistent focus on gameplay across the board.

<puts on flamesuit> :p
 
How about a 4xMSAA depth?
That's possible as well. And even better if you want to do nasty tricks with the GPU depth compression structures (as keeping the full depth buffer around is overkill for edge detection / weighting purposes only). I hope that someday we get abstracted access to the GPU hierarchical depth/color compression structures on PC as well (without needing to use Mantle or some other vendor specific API).
 