1080p Dilemma

Would be nice to require 1080@60 or 720@60; that's the kind of nice thing a console maker could put in place that players would appreciate.
Ideally it would be mandatory to support either 1080@60, or both 1080@30 and 720@60, with the player picking the one they prefer.
 
Would be nice to require 1080@60 or 720@60; that's the kind of nice thing a console maker could put in place that players would appreciate.
Ideally it would be mandatory to support either 1080@60, or both 1080@30 and 720@60, with the player picking the one they prefer.

I thought it had been established on here and outside that 1080/30 is not easily interchangeable with 720/60 for a variety of reasons.
 
But Jurassic Park, King Kong and Rango were all rendered at higher than 1080p, on machines with far greater power than today's consoles.
If they had been rendered at 480p (and resized to, say, 1080p), the person who wrote that article would go, "WTF, turn on the antialiasing, this looks horrible."

I sometimes wonder, though, if we could do better with higher quality assets and better AA at the cost of resolution.
I think 1080p can be irrelevant if the artists put the right details where they're needed.
 
I thought it had been established on here and outside that 1080/30 is not easily interchangeable with 720/60 for a variety of reasons.

Not if you target 1080p30, but if a developer were to target, say, 720p60 (with 1080p30 in the back of their mind), it would probably be easier and quite doable.

Although personally, I'd rather just have developers target one goal, that being any HD resolution at 60fps, so that every player has a consistent experience. Half the fun of playing Trials Fusion, for example, is knowing that everyone is playing it the same way you are, so the leaderboards are representative.
 
I thought it had been established on here and outside that 1080/30 is not easily interchangeable with 720/60 for a variety of reasons.

I'm talking about making the game with those constraints in mind. It would mean slightly different effects and other changes for each version, but it would give players a choice; although maybe "eye candy" and "responsiveness" could be the two modes instead.
The thing is, it's quite possible to target both, even if it's a little more work.

Of course people would prefer everything at maximum resolution and minimum acceptable framerate (which is 60Hz :p, not 30 :p)

Dunno, it was just an idea anyway, one that console makers could enforce if they dared.
 
I'm not sure I'd want the choice. I don't like these kinds of choices. I'd just end up dazed and confused, unable to decide; I'd turn off the system, hide in a corner and cry.
 
Of course people would prefer everything at maximum resolution and minimum acceptable framerate (which is 60Hz :p, not 30 :p)

Dunno, it was just an idea anyway, one that console makers could enforce if they dared.

MS mandated 720p with MSAA at the beginning of last gen for the 360 but dropped that pretty quickly.

I doubt they would make any such demands again, as mandating a bullet point resolution or framerate and potentially crippling the creative vision of a game really does not benefit anyone.
 
Realistically though, 720p60 isn't a difficult target to reach on current consoles, unlike on the 360 (and PS3 for that matter) last gen, where it could be for some titles. If MS had mandated 540p60, that mandate could have stuck.

But yes, it should really be up to the devs to choose. And the devs should make the right choices! Maybe an incentives programme to encourage certain standards? How about something like "The True Console Experience" as a prominent subset of titles on the online stores, for any game with a locked 60fps and a reasonable enough resolution?
 
I know we've had this discussion some time ago, and usually I agree that developers should get as much freedom as possible in how, and with what tradeoffs, they develop their games...

- but in this case, I do think a mandatory resolution and framerate target would generally be a good thing, not only for consumers but for developers themselves too.

Why? Because on many levels, visuals, and as a result screenshots (because they portray nicely how impressive your game looks), sell games. The problem I see is that 30fps is often looked at as the preferable tradeoff because it's good enough for the majority of [mainstream] gamers. If all developers ever targeted 30fps as the ultimate benchmark, who would dare target 60fps, when it effectively means giving up half of the per-frame budget to get there? The result is usually an inferior-looking game, which is bad for PR. If hardware makers enforced a framerate target, everyone would play by the same rules - meaning developers could worry more about getting the most bang for buck out of every 1/60th of a second, or simply concentrate more on other areas of their games, because the [visual] difference between games would probably be smaller than when some games target 30 and fewer target 60.
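Just to put rough numbers on that tradeoff (plain arithmetic, nothing platform specific), here's a tiny sketch of the per-frame budgets involved:

```cpp
// Back-of-the-envelope frame budgets; plain arithmetic, nothing platform specific.
#include <cstdio>

int main() {
    const double ms30 = 1000.0 / 30.0;  // ~33.3 ms of frame time at 30 fps
    const double ms60 = 1000.0 / 60.0;  // ~16.7 ms of frame time at 60 fps
    std::printf("30 fps budget: %.1f ms per frame\n", ms30);
    std::printf("60 fps budget: %.1f ms per frame\n", ms60);
    std::printf("a 60 fps target gets %.0f%% of the per-frame time a 30 fps target gets\n",
                100.0 * ms60 / ms30);
}
```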

In other words, if developers were forced to target 60, we wouldn't be discussing, in topics like the DriveClub one, how much worse the game would look if the developer had done this or that - we'd simply be appreciating what they got out of the hardware.

Obviously, this would also mean that some games, like perhaps very slow-paced games that don't benefit from a high framerate, would effectively be wasting some resources on it... but I do think there are only a few of them. On the whole, I'm pretty much convinced that any game out there that is already great at 30fps would play quite a bit better at 60fps, even if some find this hard to believe.

Tomb Raider: Definitive Edition demonstrates this nicely IMO. So does TLoU. Great, great games on the PS3, but from a gameplay perspective, just better and more fluid at double the framerate.

Perhaps the dream is that someday hardware will offer so much performance, and resolutions will top out at perhaps 4K, that developers will find their way back to targeting more 60fps games. The PS2 seemed to be one of those consoles - I think it was the only console I ever owned where the majority of games I played were actually at 60fps.
 
Realistically though, 720p60 isn't a difficult target to reach on current consoles, unlike on the 360 (and PS3 for that matter) last gen, where it could be for some titles.

30fps obviously gives you twice as much CPU time per frame to run your game world compared to 60fps.

That can mean less complexity (dumber AI, less precise physics, fewer things on screen, etc.), and it's not always possible to fudge this. The Halo CE remaster in the Master Chief Collection has this issue (30Hz world simulation, 60Hz graphics), which DF says causes judder.
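As a rough illustration of what that decoupling looks like (a generic fixed-timestep pattern, not a claim about how the MCC is actually put together internally; World, step() and render_state() are made-up names), the sketch below shows a 30Hz simulation feeding a 60Hz renderer. Skip the interpolation at the end and every object holds its position for two display frames, which is exactly the kind of judder DF describes:

```cpp
// Generic sketch of a 30 Hz simulation feeding a 60 Hz renderer.
// Purely illustrative; not how the MCC remaster actually works internally.
struct World {
    double position = 0.0;   // stand-in for all simulation state
    double velocity = 1.0;
    void step(double dt) { position += velocity * dt; }
};

// Interpolating between the last two sim states gives the renderer something
// new to draw every 1/60 s; drawing the latest sim state twice in a row
// instead is what produces the "held frame" judder.
double render_state(const World& prev, const World& curr, double alpha) {
    return prev.position * (1.0 - alpha) + curr.position * alpha;
}

int main() {
    constexpr double sim_dt   = 1.0 / 30.0;  // fixed 30 Hz world update
    constexpr double frame_dt = 1.0 / 60.0;  // 60 Hz display refresh
    World prev, curr;
    double accumulator = 0.0;
    for (int frame = 0; frame < 120; ++frame) {   // two seconds of display frames
        accumulator += frame_dt;
        while (accumulator >= sim_dt) {           // run sim steps as time allows
            prev = curr;
            curr.step(sim_dt);
            accumulator -= sim_dt;
        }
        double alpha = accumulator / sim_dt;      // fraction between sim steps
        double drawn = render_state(prev, curr, alpha);
        (void)drawn;                              // a real game would render here
    }
    return 0;
}
```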

I'm happy letting devs decide where the balance should be.
 
Are there any situations where a platform would not be able to run a game or any application at 60fps, regardless of how low the resolution is?
 
If platform holders mandated frame rates then titles like Shadow of the Colossus would likely never have seen the light of day. That would be a very sad situation indeed.

By fixing games to 1/60th of a second per refresh you are limiting the scope of what a game can do, what it can be. The compromises would be too great IMO.

AI and physics might have to be limited, and environments will likely be sparse or enclosed. Accommodate complex AI, physics and expansive, complex environments together in your 16ms refresh and you could end up with incredibly basic-looking games which simply aren't very marketable.

I love responsive games. Fighters, racers and twitch shooters obviously shine at 60fps. I also like games with deep gameplay and realism brought about through interaction with the virtual world, and I'd hate to have some of my favourite game genres dumbed down because of a desire to tick completely arbitrary checklists.

Using the Tomb Raider or TLoU remasters as examples of why 60fps > 30fps doesn't hold any water, as those games were originally released on machines that could not have accommodated them at 60fps without absolutely massive compromises, which would have made them very different games from what they turned out to be.
 
I sometimes wonder, though, if we could do better with higher quality assets and better AA at the cost of resolution.
I think 1080p can be irrelevant if the artists put the right details where they're needed.
The thing is, these movie scenes (Jurassic Park etc.) are rendered at 4K+ and downsampled to 720p (or whatever the DVD is). If you rendered them at 720p with lots of MSAA, it's still going to look worse, since that only helps the edges; the textures etc. are not going to look as good. Ignore what the examiner.com article writer said, it's bollocks.
 
Are there any situations where a platform would not be able to run a game or any application at 60fps, regardless of how low the resolution is?

Titanfall and Dead Rising 3 might fall under that category.

Games with 4 player splitscreen may present problems as well.
 
Quality over quantity.

I recall running FEAR 1 way back on a 6600GT. About halfway through the game I switched from 1024x768 with no AA to 800x600 with 4xMSAA, and it looked so much better. My friends liked to watch me play the game (all crowded around a 17" CRT :D) and everyone agreed.

It would be fantastic if new console games just focused on a great looking experience with high quality AA and AF at 720p.
 
Actually, that's not how it's done; brute-force supersampling wouldn't be good enough, as movie VFX and CG animation cannot have any aliasing at all.
Offline renderers use sophisticated antialiasing methods - they are indeed supersampling, but much more cleverly. Pixar's Renderman in particular (used on most movies up until a few years ago) is especially complicated, as it decouples shading from sampling and uses stochastic patterns; most other renderers have all kinds of trickery as well, like adaptive supersampling and so on.
Some info on PRMan:
http://www.hradec.com/ebooks/CGI/RMS_1.0/mtor/rendering/Renderman_Globals/rg-reyes.html
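For anyone curious what "stochastic patterns" means in the simplest possible terms, here's a toy sketch of stratified, jittered supersampling of a single pixel. It's nothing like the real REYES/PRMan pipeline, and shade() is just a made-up stand-in for an object edge crossing the pixel:

```cpp
// Toy stratified-jittered supersampling of one pixel.
// Vastly simplified compared to a real offline renderer; it only shows the
// core idea of randomizing many sample positions and averaging the results.
#include <cstdio>
#include <random>

// Hypothetical "scene": 1.0 inside a circular object whose edge crosses the
// pixel, 0.0 outside. A real renderer would run a full shading computation here.
static double shade(double x, double y) {
    return (x * x + y * y < 0.5) ? 1.0 : 0.0;
}

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> jitter(0.0, 1.0);

    const int grid = 4;   // 4x4 strata -> 16 samples for this one pixel
    double sum = 0.0;
    for (int sy = 0; sy < grid; ++sy) {
        for (int sx = 0; sx < grid; ++sx) {
            // One randomized sample inside each stratum of the unit pixel.
            double x = (sx + jitter(rng)) / grid;
            double y = (sy + jitter(rng)) / grid;
            sum += shade(x, y);
        }
    }
    // The averaged result is a fractional coverage value, so the edge resolves
    // to a smooth intermediate tone instead of a hard aliased step.
    std::printf("pixel value: %.3f\n", sum / (grid * grid));
}
```

Real renderers go far beyond this (adaptive sample counts, decoupled shading rates, carefully designed sample distributions), but averaging many jittered samples per pixel is the common core.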

Most movie material is rendered at about 2000 pixels of horizontal resolution (referred to as 2K) but has practically no aliasing at all. When CG is composited into live action, there's also some slight blur, DOF, and film grain applied in post, but no artifacts are left in the source image despite that.

You are right, though, that when it's converted to DVD resolution - 720x576 or so - the process kind of acts as an additional step of supersampling.
 
Isn't 1080p 2k? Pretty much? With 4k being approx 4000x2000 (can't remember the exact numbers), therefore 2000 vertical res? But they called it 4k based on the horizontal res? This is all so confusing.
 