More pixels or prettier pixels?


Here's why pixel quality matters far more than resolution. And if I have the choice, I prefer 1080p/60fps over 4K/30fps (assuming graphical settings are identical).
But guess what: that video is not 60fps, it's good old 30fps with balls-to-the-wall visuals, and not a single person was complaining about its "lack of smoothness" :).
Do people not see what I mean by sticking with 30fps? To get as close to CGI visuals as possible and achieve the biggest visual impact, you need every last drop of power spent on quality pixels, not framerate. As for resolution, 1440p should be the baseline if 4K is the TV standard for the PS5 generation, although 4K CBR or Temporal Injection is probably the best balance for next gen. And of course, people with 75"+ TVs might suffer a bit in sharpness.
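Quick back-of-envelope on what those targets cost, assuming GPU cost scales roughly with shaded pixel count (a simplification that ignores fixed per-frame work; the Python below is just arithmetic):

```python
# Shaded pixels per frame at common render resolutions. Assumes GPU cost
# scales roughly linearly with pixel count, ignoring fixed per-frame costs.
NATIVE_4K = 3840 * 2160
resolutions = {
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    # Checkerboard rendering shades roughly half the target pixels and
    # reconstructs the rest temporally from previous frames.
    "4K CBR (approx)": NATIVE_4K // 2,
    "4K native": NATIVE_4K,
}
for name, px in resolutions.items():
    print(f"{name:>16}: {px:>9,} px/frame ({px / NATIVE_4K:.0%} of native 4K)")
```

So 1440p shades about 44% of the pixels of native 4K, and CBR about half, which is the headroom argument in a nutshell.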
 
But guess what: that video is not 60fps, it's good old 30fps with balls-to-the-wall visuals, and not a single person was complaining about its "lack of smoothness" :).
1) They're not playing, but watching.

2) We don't complain about stuff that's been stuck with us and isn't changing. I hate movies' 24 fps, but I don't complain about it every time I go to the cinema.

3) Some games are less sensitive to framerate issues than others. Would you really prefer to play a photorealistic, cinematic Wipeout at 24 fps?
 
Sure, but I'd find it hard to believe that the majority would hate, or complain about, playing those CGI graphics at 30fps, though :).
Wipeout is a game I don't care about, but if it looked like Tron: Legacy's CG at 24fps, I'm sure the experience would be more transcendent than 60fps with struggle-street graphics.
 
Do people not see what I mean by sticking with 30fps? To get as close to CGI visuals as possible and achieve the biggest visual impact, you need every last drop of power spent on quality pixels, not framerate.

I agree, but if everything else is equal, then I prefer a higher framerate. That choice typically exists on PC and not on consoles.
 
Higher framerate = higher sense of speed. Detail is not that important as long as you get your materials and lighting right.
 

Here's why pixel quality matters far more than resolution. And if I have the choice, I prefer 1080p/60fps over 4K/30fps (assuming graphical settings are identical).
What am I supposed to be seeing in this ad, with its cutscene, that proves pixel quality matters more? Oh, and how does 60fps come into play with Sony's almost-always-30fps cinematic third-person games?

Anyway, can't wait until the PS5 comes out so this "1440p is enough" stuff can be put to rest.
 
What am I supposed to be seeing in this ad, with its cutscene, that proves pixel quality matters more?

Because, in my opinion, this looks much better than any real game. I think people would be more impressed by such graphics in real time than by the most impressive current PC game running at 4K/60fps.

Oh, and how does 60fps come into play with Sony's almost-always-30fps cinematic third-person games?

I only wanted to say that resolution is the least important factor to me once you've reached 1080p:

1) Pixel quality
2) Framerate
3) Resolution

But this comment mainly concerns PC. I prefer to play at 1080p/60fps with max settings instead of 4K/30fps with max settings. But if a higher framerate means lower graphical quality, as on console, then it's much more debatable.
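For what it's worth, the raw throughput numbers behind that preference (a sketch; real GPU scaling is never perfectly linear):

```python
# Pixels pushed per second for each mode; with a fixed GPU budget, lower
# throughput means more shading work can be spent on each pixel.
modes = {
    "1080p/60fps": 1920 * 1080 * 60,   # ~124 M px/s
    "4K/30fps":    3840 * 2160 * 30,   # ~249 M px/s
}
for name, pps in modes.items():
    print(f"{name}: {pps / 1e6:.0f} M pixels/s")
```

Even at half the framerate, 4K still pushes twice the pixels per second of 1080p/60, which is why identical settings are so much heavier at 4K.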
 
Not this again!

The reason the new GOW (or any game, really) doesn't look like that CGI commercial is not, repeat, NOT because "1080p is enough".

Any hardware available today would melt if it tried to render that kind of CGI in real time, at 1080p or any other resolution.
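Some illustrative numbers on that gap; the two-hours-per-frame figure below is an assumption (film render times vary wildly), but hours per frame on a render farm is a commonly cited ballpark:

```python
# Order-of-magnitude gap between offline CGI and a real-time budget.
# The 2-hours-per-frame figure is illustrative, not a measured number.
offline_s = 2 * 60 * 60        # assumed offline render time per frame
realtime_s = 1 / 30            # 30fps budget: ~33 ms per frame
print(f"Real-time budget: {realtime_s * 1000:.1f} ms/frame")
print(f"Offline frame is ~{offline_s / realtime_s:,.0f}x over that budget")
```

That's roughly five orders of magnitude, which no resolution drop can buy back.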
 
Because, in my opinion, this looks much better than any real game. I think people would be more impressed by such graphics in real time than by the most impressive current PC game running at 4K/60fps.
I don't see anything special in it, or at least nothing in the various gameplay clips that have been released that looks different from Gears 4, Uncharted 4, Horizon, Assassin's Creed Origins, Star Wars Battlefront 2, or whatever. A very good-looking, big-budget game; that's about it. How that proves that games should stick to 1080p or 1440p, I have no idea.
 
I don't see anything special in it, or at least nothing in the various gameplay clips that have been released that looks different from Gears 4, Uncharted 4, Horizon, Assassin's Creed Origins, Star Wars Battlefront 2, or whatever. A very good-looking, big-budget game; that's about it.

I'm only talking about the CGI video.

How that proves that games should stick to 1080p or 1440p, I have no idea.

Because you have far more resources per pixel at 1080p than at 4K...
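Roughly what that means per pixel, assuming cost is dominated by per-pixel shading (fixed per-frame work ignored):

```python
# Per-pixel time budget at a fixed 30fps frame time.
FRAME_BUDGET_NS = (1 / 30) * 1e9   # ~33.3 ms, in nanoseconds
for name, px in (("1080p", 1920 * 1080), ("4K", 3840 * 2160)):
    print(f"{name}: ~{FRAME_BUDGET_NS / px:.0f} ns of GPU time per pixel")
# 4x the pixels at 4K means a quarter of the per-pixel budget.
```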

Any hardware available today would melt if it tried to render that kind of CGI in real time, at 1080p or any other resolution.

Honestly, I think something close could be possible at 1080p/30fps on the most powerful PC.
 
I don't see anything special in it, or at least nothing in the various gameplay clips that have been released that looks different from Gears 4, Uncharted 4, Horizon, Assassin's Creed Origins, Star Wars Battlefront 2, or whatever. A very good-looking, big-budget game; that's about it. How that proves that games should stick to 1080p or 1440p, I have no idea.
Really, nothing special? So which of those games you listed has an extremely detailed, mountain-sized boss enemy, GPU particles, and dense, high-quality environments at least as detailed, if not more so? If the Pro rendered it at native 4K, you would see the kind of downgrade that would cause a riot. 4K CBR is the perfect balance here.
 
Gears of War 4.
 
https://www.rtings.com/tv/reviews/by-size/size-to-distance-relationship

A person with 20/20 vision cannot distinguish any difference between 50" 1080p and 4K images from 1.93 m away.


A console is designed to connect to a TV, which is often placed in a larger space like a living room. The viewing distance for a TV is often longer than for a PC monitor; for example, 1.93 m for a 50" TV is very common in a living room. Therefore 1080p should still be a good option for next gen.
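For anyone curious, here's the acuity geometry behind that figure, using the standard assumption that 20/20 vision resolves about 1 arcminute (treating the threshold as one pixel per arcminute is a simplification); it lands close to the 1.93 m number from the rtings chart:

```python
import math

# Distance beyond which a 20/20 eye (~1 arcminute resolution) can no
# longer separate adjacent pixels, for a 16:9 screen.
def acuity_distance_m(diagonal_in: float, horizontal_px: int) -> float:
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)  # screen width
    pixel_m = width_m / horizontal_px                         # pixel pitch
    one_arcmin = math.radians(1 / 60)                         # acuity limit
    return pixel_m / one_arcmin                               # small-angle approx

for res, px in (("1080p", 1920), ("4K", 3840)):
    print(f'50" {res}: pixels blend beyond ~{acuity_distance_m(50, px):.2f} m')
```

This gives roughly 1.98 m for 50" 1080p and 0.99 m for 50" 4K, so at typical couch distances the extra pixels are below the acuity limit.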

Besides, a 1080p image with integer-ratio scaling plus a 4K user interface is a perfect fit for a 4K TV, because the image is not blurred.
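A minimal sketch of why integer-ratio scaling stays sharp: 1080p to 4K is exactly 2x on each axis, so every source pixel becomes a clean 2x2 block and nothing gets filtered (NumPy used just for brevity):

```python
import numpy as np

def integer_upscale(img: np.ndarray, factor: int) -> np.ndarray:
    """Map each source pixel to an exact factor-by-factor block; no
    neighbouring pixels are blended, so edges stay sharp."""
    return img.repeat(factor, axis=0).repeat(factor, axis=1)

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)   # dummy 1080p frame
print(integer_upscale(frame, 2).shape)  # (2160, 3840, 3): exact fit into 4K
```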


However, some players still sit closer to their TV, so an additional high-resolution mode is helpful. I can see multiple display modes in future games even next gen: for example, 1080p/30fps with the best pixel quality, 1080p/60fps, and 4K/30fps. In fact, a lot of games already have multiple display modes on the mid-gen consoles.
 
https://www.rtings.com/tv/reviews/by-size/size-to-distance-relationship

A person with 20/20 vision cannot distinguish any difference between 50" 1080p and 4K images from 1.93 m away.
I expect Zed to appear. ;)
20/20 isn't perfect vision. As Zed is sure to tell you, many people have better than that.

But there's no need for science on this. There's loads of empirical evidence of people with 4K sets playing 4K games and watching 4K movies and saying they are much clearer. So unless you consider all those people liars/delusional, you should accept that 4K has value.
 
https://uihc.org/health-topics/what-2020-vision

Only about 35% of people have 20/20 vision without glasses, contact lenses, or corrective surgery.

That's why many people say "1080p is enough": 1080p pixels are already indistinguishable to a lot of players at typical viewing distances.

If a person has better than 20/20 vision, or sits closer to the TV, then they may need a higher resolution. So multiple display modes will still be common next gen. However, 1080p should be the baseline for next gen, since it already meets the needs of many players.
 
Movies have effectively unlimited supersampling, so they can't be compared exactly to games. I can usually see the differences between Full HD, 1620p, 1800p, and UHD when playing games. However, I agree that UHD Blu-rays require at least 65 inches, and you still have to sit close to see the advantage in sharpness. Also, native UHD for gaming is a waste of resources; I prefer to invest in frame rate and graphics instead.
 
I think ray tracing should pursue the quality of 1080p Blu-ray rather than chasing 4K. And as always, I'll take 120Hz before 4K, but that's just me.
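The frame-time budgets behind that trade-off, for reference:

```python
# Frame-time budget at each refresh rate: the renderer must finish
# everything inside this window to hold the target rate.
for hz in (30, 60, 120):
    print(f"{hz:>3} Hz -> {1000 / hz:5.2f} ms per frame")
# 120Hz leaves ~8.3 ms, a quarter of the 33.3 ms available at 30Hz,
# which is the real cost of taking 120Hz before 4K.
```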
 