*split* multiplatform console-world problems + Image Quality Debate

If the assets, AA and AF are the same, of course the version that runs at a higher resolution will be better. But what if the 1080p version has no AA at all and only 2xAF, while the 720p or 900p version has 2xMSAA+FXAA and 8xAF? Which is the better of the two? That's what I mean when I say the obsession lately with resolution is getting out of hand, with comments like "I will not buy a 720p title on my PS4! 720p is so last gen & 1080p is next gen! OMG!" etc. So if we get a title on either console with, say, Samaritan tech demo levels of visuals at 720p/30fps in an open-world environment, will that game be last gen and not worth calling next gen just because it's 720p/30fps? It's ridiculous to judge a game's tech only by its resolution.
In those cases, a logical discussion of which would be better could be had. When it's the same game and one version runs at a higher resolution AND a more stable framerate, there are no what ifs.
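Just for scale, a rough sketch of the raw pixel counts behind those labels (assuming the standard 16:9 render targets; plenty of games use odd sizes):

```python
# Rough pixel-count comparison of the resolutions being argued over.
# Assumes standard 16:9 render targets; actual games may use non-standard sizes.
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}

base = 1280 * 720
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the pixels of 720p)")
# 720p:  921,600 pixels (1.00x)
# 900p:  1,440,000 pixels (1.56x)
# 1080p: 2,073,600 pixels (2.25x)
```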
 
Some TVs offer 1:1 pixel mapping so you could definitely bypass the TV's internal scaler regardless of resolution input.

So in that case, if you ran Ryse (1600x900), could you have it at that res with black borders, or will the console always scale?
I'll give an example of a released game on a released console:
Alan Wake on the 360 runs at 960x544 (or Call of Duty 3 at 1040x624, 1088x624 on PS3). Could I have it at that res with black borders?
 
I don't think you can set Ryse to output 1600x900, because it's not the X1's hardware scaler it's using.
For other 720p games you might be able to set the console to output a 720p image; any other resolutions are probably not possible at all.
 
It's almost like the N64 versus the PS1 all over again, minus the cartridge-versus-CD aspect.

It's getting to the point where it's like a bunch of highly respected chefs arguing over who has the better culinarians, McDonald's or Burger King.

You don't go to McDonald's for fine dining any more than you buy a console for exquisite IQ; that's what a PC is for.

I doubt that as time goes by we will see native 1080p become standard on what are to be the current-gen consoles. Native 4K isn't around the corner, so as we move forward and PC visuals progress, consoles will have a hard time keeping up if console devs are stuck trying to produce native 1080p games. I think both the PS4 and XB1 will deploy more advanced upscaling techniques and try to imitate PC visuals by pushing fewer pixels while providing the same quality per pixel.

We will see a ton of 1080p games, but it will probably be pretty much what we have now on consoles, where most 1080p titles are PSN or Arcade titles.
 
So in that case, if you ran Ryse (1600x900), could you have it at that res with black borders, or will the console always scale?
I'll give an example of a released game on a released console:
Alan Wake on the 360 runs at 960x544 (or Call of Duty 3 at 1040x624, 1088x624 on PS3). Could I have it at that res with black borders?

Does HDMI accept those non standard resolutions? My TV is set to 1:1 but this is only valid when it gets a 1080p signal, which of course maps 1:1 to the screen. Anything else is scaled either by the TV or whatever device is connected.
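For what it's worth, the "black borders" idea is just centring the smaller framebuffer 1:1 inside the 1080p output. A rough sketch of the arithmetic (illustrative only; the helper name is made up and neither console actually exposes this option):

```python
# Purely illustrative: where a sub-native framebuffer would sit if the console
# simply centred it 1:1 inside a 1920x1080 output instead of scaling it.
OUTPUT_W, OUTPUT_H = 1920, 1080

def letterbox_offsets(render_w, render_h):
    """Top-left corner and border sizes for 1:1 centring in the 1080p frame."""
    x = (OUTPUT_W - render_w) // 2
    y = (OUTPUT_H - render_h) // 2
    return {"top_left": (x, y), "side_borders": x, "top_bottom_borders": y}

print(letterbox_offsets(1600, 900))  # Ryse: 160px pillars, 90px bars
print(letterbox_offsets(960, 544))   # Alan Wake on 360: 480px pillars, 268px bars
```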
 
That at parity of pixel quality, the higher resolution will always look better, which is what is happening in reality with the unoptimised XO versions of some launch games.

In those cases, a logical discussion of which would be better could be had. When it's the same game and one version runs at a higher resolution AND a more stable framerate, there are no what ifs.

As I already said, if the assets are identical with the same or comparable AA and AF (like BF4 on PS4 and XB1), of course the version that runs at a higher resolution will have the better IQ. My post was aimed more at the posts I see in a lot of gaming forums lately saying that IQ at 720p or 900p is unacceptable (when, for example, Ryse looks really good IQ-wise), when there is more to why a game can look smooth and clean than resolution.

Hmm, this may be news to you, but higher resolution will actually make the jaggies and shimmering less noticeable. I thought everyone understood this.

Then how can you explain that Demon's Souls has noticeably more jaggies and shimmering than Dark Souls, even though Dark Souls runs at a lower resolution? I chose those because they run on the same engine and have very similar color palettes.

Are AA and AF irrelevant or of little importance when we're talking about IQ? Personally I'd take American Nightmare's IQ over Portal 2's even though the former runs at 1120x576. Hell, even in a game with identical assets the AA method can make one hell of a difference: just look at Portal 2 on PS3 with MLAA versus Portal 2 on 360 with edge blur; the IQ on the PS3 version is so much better even though both run at 720p. I don't get how anyone can discard AA and AF when we're talking about good IQ.

Maybe I'm in the minority here, but I'd take a softer looking image that is clean and smooth over a crisper looking image with jaggies and shimmering all over the place.
 
BF4 is the only "blurry" game that we have seen running on PS4, and that is pegged down to an overly aggressive AA solution.

PS4 games are not inherently like that; that should be common sense.

I guess we'll really be sure at the next DF's PS4/PC face-offs, won't we?

Because the other compared versions also had aggressive AA with worse upscaling, for instance PS4 vs PS3 in IGN's current/next-gen comparison. And I still have not seen really sharp PS4 footage. Even the latest KZ:SF footage (even if beautiful and next gen in itself) wasn't especially sharp, despite very low compression. Surely it's the video capture method?

A lot of people all over the Internet (including Digital Foundry) have noticed this recurrent problem in PS4 videos, and I can't help having a bad feeling about it... Anyway, when is the inevitable COD Ghosts DF face-off?
 
A lot of people all over the Internet (including Digital Foundry) have noticed this recurrent problem in PS4 videos, and I can't help having a bad feeling about it... Anyway, when is the inevitable COD Ghosts DF face-off?

You are literally the first person I have ever seen suggest such a thing. At this point we are in FUD territory; there is no evidence of what you are suggesting beyond some BF4 screenshots. The people who actually played the game almost unanimously said it looks just like the PC version.
 
You are literally the first person I have ever seen suggest such a thing. At this point we are in FUD territory; there is no evidence of what you are suggesting beyond some BF4 screenshots. The people who actually played the game almost unanimously said it looks just like the PC version.

You are probably right.

By the way, sorry for my final contribution to the derailment of this thread; the derailment did occur just AFTER my last post. For a moment I couldn't understand what was happening with this thread. :LOL:
 
Then how can you explain that Demon's Souls has noticeably more jaggies and shimmering than Dark Souls, even though Dark Souls runs at a lower resolution? I chose those because they run on the same engine and have very similar color palettes.
Simple, here's the answer:
Dark Souls = 1024x720 (2xAA)
Demon's Souls = 1280x720 (no AA)

25% extra res but no AA vs 2xAA.
Dark Souls uses ~60% more samples along polygon edges, thus you will see less edge aliasing.
http://en.wikipedia.org/wiki/Multisample_anti-aliasing
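Here's that arithmetic spelled out as a quick sketch (assuming 2xMSAA really does give two coverage samples per edge pixel, and counting edge samples naively across the whole frame):

```python
# Back-of-the-envelope version of that sample math. Assumption: with 2xMSAA
# every edge pixel carries two coverage samples, with no AA it carries one.
dark_souls   = (1024, 720, 2)   # width, height, samples per edge pixel
demons_souls = (1280, 720, 1)

ds  = dark_souls[0] * dark_souls[1] * dark_souls[2]        # 1,474,560 edge samples
des = demons_souls[0] * demons_souls[1] * demons_souls[2]  #   921,600 edge samples

print(f"Demon's Souls renders {1280 / 1024 - 1:.0%} more pixels")    # 25%
print(f"Dark Souls has {ds / des - 1:.0%} more samples along edges") # 60%
```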
 
Yes, lots. I use a photo print service that I supply with 300dpi images (1800x1200, ~2.1 MP), and the results on Fujifilm paper are every bit the equal of film prints, comparing Canon EOS digital to Canon EOS analogue using the same lenses. 6 MP printed on that service won't look any better.

Chiming in just because I have a passive interest in photography: it isn't always true that 2MP is the equivalent of film. It depends a lot on what you are shooting. I think the reason people don't see much of a difference is that the old film camera everyone took on their trips had a resolving power approximately equal to a 2 MP camera. That is assuming you used 35mm ISO 100 film. The actual "pixel" equivalency varied, though. Depending on the type of film and the camera, the range was generally estimated to be between 4MP and 16MP for 35mm.

Larger formats increase in megapixels dramatically. Your large format films used in movies and high end cameras can have between 200 and 800 megapixel equivalency (depends on whether you are talking 4x5 or 8x10). Then again, they are meant to be viewed on screens that are up to 3 stories tall, so there is a reason for the extra.

I do agree with your overall point though. With a standard 4x6 print you are generally in the area where you aren't going to see much difference. I think you still need to be above 6MP though. Most "mid range" cameras fall in the 12-20 range. For example, my Rebel T3i is 18MP while my EOS 70D is 20MP. It is interesting to note that the MP count has not dramatically increased even though there were nearly 3 years difference between my two cameras. A good deal of that extra resolution is for cropping. It is very difficult to get a shot that is good from corner to corner. If you are at 6MP, you can't edit the photo at all without risking crappy prints. If you are at 18MP, you can cut away a third of the photo and be fine.
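To put rough numbers on the print and crop figures above (a sketch only, assuming the usual 300dpi print target):

```python
# Rough print-size and cropping arithmetic, assuming a 300dpi print target.
PRINT_DPI = 300

def print_size_inches(width_px, height_px, dpi=PRINT_DPI):
    return width_px / dpi, height_px / dpi

def megapixels(width_px, height_px):
    return width_px * height_px / 1e6

print(print_size_inches(1800, 1200))  # (6.0, 4.0) -> exactly a 4x6 print
print(megapixels(1800, 1200))         # ~2.16 MP
print(18 * 2 / 3)                     # 12.0 MP left after cropping a third from 18 MP
```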

Basically, I agree with your point in general. At some point, resolution stops mattering a whole lot. I think I would argue that it is far more important for computer games than something like sports. My argument would be that film tends to blur things between frames in a realistic manner. So the human eye has context to interpret things by. Something that is fairly fast moving like sports is fine in 720p for exactly that reason.

On the other hand, computer graphics tend to be full of sharp lines and abrupt transitions. You can try to introduce the same sort of blur in computer generated graphics that exists naturally on film, but if you get it wrong it just looks odd. On the other hand, if you get resolution high enough, your eyes can no longer distinguish the subtle aliasing effects and it becomes less important. I would argue that the difference between the PS4 and the XBone is negligible. It isn't large enough to really hide the aliasing effects that computer graphics introduces.
 
Chiming in just because I have a passive interest in photography: it isn't always true that 2MP is the equivalent of film. It depends a lot on what you are shooting. I think the reason people don't see much of a difference is that the old film camera everyone took on their trips had a resolving power approximately equal to a 2 MP camera. That is assuming you used 35mm ISO 100 film. The actual "pixel" equivalency varied, though. Depending on the type of film and the camera, the range was generally estimated to be between 4MP and 16MP for 35mm.

Larger formats increase in megapixels dramatically. Your large format films used in movies and high end cameras can have between 200 and 800 megapixel equivalency (depends on whether you are talking 4x5 or 8x10). Then again, they are meant to be viewed on screens that are up to 3 stories tall, so there is a reason for the extra.

I do agree with your overall point though. With a standard 4x6 print you are generally in the area where you aren't going to see much difference. I think you still need to be above 6MP though. Most "mid range" cameras fall in the 12-20 range. For example, my Rebel T3i is 18MP while my EOS 70D is 20MP. It is interesting to note that the MP count has not dramatically increased even though there were nearly 3 years difference between my two cameras. A good deal of that extra resolution is for cropping. It is very difficult to get a shot that is good from corner to corner. If you are at 6MP, you can't edit the photo at all without risking crappy prints. If you are at 18MP, you can cut away a third of the photo and be fine.

Basically, I agree with your point in general. At some point, resolution stops mattering a whole lot. I think I would argue that it is far more important for computer games than something like sports. My argument would be that film tends to blur things between frames in a realistic manner. So the human eye has context to interpret things by. Something that is fairly fast moving like sports is fine in 720p for exactly that reason.

On the other hand, computer graphics tend to be full of sharp lines and abrupt transitions. You can try to introduce the same sort of blur in computer generated graphics that exists naturally on film, but if you get it wrong it just looks odd. On the other hand, if you get resolution high enough, your eyes can no longer distinguish the subtle aliasing effects and it becomes less important. I would argue that the difference between the PS4 and the XBone is negligible. It isn't large enough to really hide the aliasing effects that computer graphics introduces.

The whole Pixel vs Grain discussion was born out of nerds suddenly getting photographs based on pixels instead of film/grain. It still borders on religion when this discussion rears its ugly head. Having shot film for many years, then digital, and now on a Canon 1D, there is no doubt in my mind that the number of pixels matters, but most importantly the "quality" of the pixels matters more. And I am not just talking about color depth, noise etc., but also lens quality and, most importantly, the photographer :)

The thing with consoles is that more resolution can make a difference when you go big, or sit close to your TV. And I am not talking about that silly graph doing the rounds with a theoretical max resolution for the human eye. When I sit close to my TV or projector, flaws show up. When you play on a big TV the resolution difference between the XB1 and PS4 WILL be there, visible for everyone to see. But those that have an XB1 won't see the difference unless they have a PS4 to compare with.
 
Larger formats increase in megapixels dramatically. Your large format films used in movies and high end cameras can have between 200 and 800 megapixel equivalency (depends on whether you are talking 4x5 or 8x10). Then again, they are meant to be viewed on screens that are up to 3 stories tall, so there is a reason for the extra.
I was under the impression, from my days in the HD DVD group, that most movies are filmed on bog standard 35mm film. Even if not, they are transferred to 35mm for projection.
 
A bit late to the party, but while reading through this topic this quote kind of stuck with me:

That thread exists because people can't tell what resolution games are at by looking at them on their tv. Hence the existence of the resolution thread. You don't need threads asking about frame rate or weak textures because you can just look at your tv and see that.

The reason there is a resolution thread is that it's nice to put an exact number on things. You are correct, however, that it's not easy to tell the exact resolution just by looking, for numerous reasons:

For one, most people who play on consoles rarely sit 40 inches away from their screen, but rather several feet away. Given the viewing distance, it's comparable to using a device with "retina"-class pixel density in close proximity: a density at which you are unlikely to be able to count pixels, let alone see them (see the rough sketch at the end of this post). Not all people have the same set-ups or similar viewing distances, though.

E.g. I game on a large projected screen where, even at Full HD resolution, the difference between native 720p and 1080p is rather noticeable (but not striking) when watching a supersampled source (movies). However, when playing games with less AA, the difference between resolutions can be larger and more easily noticeable.

While the difference between a 600p and a 720p game upscaled to a Full HD screen might have been small, I would expect the difference between an upscaled 720p and a native 1080p game to be larger, even if you factor in typical viewing distances.

Noticeable to the average uninformed gaming public? Yes. Will they care, or will it influence them to buy a different gaming device? Hardly.

I wouldn't bet too much on "average joe" not noticing, though. The average joe of today has come a long way in educating himself on technical differences. Nowadays every phone is bought and compared by its resolution (and other spec-sheet numbers). Resolution plays a big role in everyday devices, so I wouldn't bet on the average gaming public being as ignorant of resolution differences on consoles as perhaps a couple of years ago.
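To put a rough number on the viewing-distance point above (a sketch only; the 50-inch screen, the 8-foot distance and the helper itself are made-up illustrative figures):

```python
import math

# Rough angular-resolution sketch: pixels per degree of visual angle on a 16:9
# screen. Screen size (50") and viewing distance (8 ft) are assumed values.
DIAGONAL_IN = 50.0
DISTANCE_IN = 8 * 12  # 8 feet in inches

def pixels_per_degree(horizontal_px, diagonal_in=DIAGONAL_IN, distance_in=DISTANCE_IN):
    width_in = diagonal_in * 16 / math.hypot(16, 9)            # width of a 16:9 panel
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return (horizontal_px / width_in) * inches_per_degree

for horizontal in (1280, 1600, 1920):
    print(f"{horizontal} wide: ~{pixels_per_degree(horizontal):.0f} px per degree")
# ~49, ~62 and ~74 px/degree; the usual 20/20-vision rule of thumb is that
# individual pixels stop being resolvable somewhere around 60 px/degree.
```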
 
Simple, here's the answer:
Dark Souls = 1024x720 (2xAA)
Demon's Souls = 1280x720 (no AA)

25% extra res but no AA vs 2xAA.
Dark Souls uses ~60% more samples along polygon edges, thus you will see less edge aliasing.
http://en.wikipedia.org/wiki/Multisample_anti-aliasing

Yeah, I know all of that; that's why I used the Souls games as an example of why "higher resolution will actually make the jaggies and shimmering less noticeable, I thought everyone understood this" is not true, IMO. Just by looking at the res of a game you can't tell whether the IQ is good or not, because there are other factors like the quality of AA (plus things like shader aliasing) and AF that are really important, plus smaller things like motion blur and DoF that can affect IQ.

So where exactly were you disagreeing with me?
 