Digital Foundry Article Technical Discussion Archive [2014]

One thing people seem to be forgetting is that you are not required to download the patch that changes the game from 900p to 1080p. It's your choice which version you play.
Even if you have already installed the patch, you can just delete the install data and reinstall from the disc. I'm not sure how it works with the digital version, though.
 
And Ryse would look better at 1080p

Would it though? I haven't played it, so I'm going to guess here, but there are games that are so heavy on post-processing that the resolution benefit gets largely nullified. It's like how some movies look a bit blurred because the director shot them with the lens wide open throughout, wanting that shallow-DOF, soft-look aesthetic. In a case like that, people wouldn't notice much difference whether the movie was presented in 2K or 4K, just because of how it was filmed. Like I said, I didn't play Ryse... but if that game does excessive mucking around with pixels in post, then maybe resolution would be harder to notice in that title, more so when it's just the 900p -> 1080p difference.
 
And Ryse would look better at 1080p, so what's your point?
You don't know that. In fact, you have no way of knowing if that's true.

People are moronic to care about 1080p? :rolleyes:
People who take one metric and blindly assume that single metric is the be-all, end-all differentiator between "looks great" and "looks like shit" are indeed morons.
 
You don't know that. In fact, you have no way of knowing if that's true.
All other things being equal (which was a clearly implied qualifier), how many people would dispute that?

Strictly speaking it's a matter of opinion (since there's not a strictly-defined way to decide whether anything is "better" than anything else), but really? The only imaginable objection would be something along the lines of "I like to have at least a certain amount of aliasing," which is atypical.
 
All other things being equal (which was a clearly implied qualifier), how many people would dispute that?

Strictly speaking it's a matter of opinion (since there's not a strictly-defined way to decide whether anything is "better" than anything else), but really? The only imaginable objection would be something along the lines of "I like to have at least a certain amount of aliasing," which is atypical.
Exactly. 1080p > 900p, all other things being equal. I can't see how anyone could argue that. You can say subjectively they look the same, but you couldn't say definitively... ever.

I'm just getting tired of people calling others 1080p cheerleaders, morons, or that it's just a checkbox. The counter-arguments are just as bad. I think that the truth of the matter is somewhere in-between.

The difference between 900p and 1080p depends on the native res of the display, the quality of the scaler, the size of the display, the viewing distance, the quality of the viewer's eyesight, etc. In most cases, 900p is very similar to 1080p, but there is a difference, whether you can see it or not. You can't sit there and say others can't. There was a time when a lot of people said there's no difference between 720p and 1080p, but we all know that's false.
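
As a rough, back-of-the-envelope illustration of the viewing-distance point, here is a small sketch (my own assumed numbers, not from any published test) that estimates source pixels per degree of visual angle on a 16:9 screen, against the often-quoted rule of thumb that roughly 60 pixels per degree is where extra detail stops being resolvable for 20/20 vision:

#include <cmath>
#include <cstdio>

// Horizontal source pixels per degree of visual angle for a 16:9 display.
// diagonal_in = screen diagonal in inches, distance_in = viewing distance in inches,
// h_pixels = horizontal pixel count of the rendered image.
double pixels_per_degree(double diagonal_in, double distance_in, double h_pixels) {
    const double kPi = 3.14159265358979323846;
    const double width_in = diagonal_in * 16.0 / std::sqrt(16.0 * 16.0 + 9.0 * 9.0);
    const double fov_deg = 2.0 * std::atan(width_in / (2.0 * distance_in)) * 180.0 / kPi;
    return h_pixels / fov_deg;
}

int main() {
    const double diag = 50.0, dist = 96.0;  // assumed: 50" TV viewed from 8 feet
    std::printf("1080p: %.0f px/deg\n", pixels_per_degree(diag, dist, 1920.0));  // ~75
    std::printf(" 900p: %.0f px/deg\n", pixels_per_degree(diag, dist, 1600.0));  // ~63
    std::printf(" 720p: %.0f px/deg\n", pixels_per_degree(diag, dist, 1280.0));  // ~50
}

At those assumed numbers, 900p already sits near the acuity threshold while 1080p is comfortably above it, which is why the gap can be hard to spot from the couch yet obvious up close or on a larger screen.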
 
Why is 900p to 1080p now considered a marginal difference? It's more than any resolution gulf we had last gen. It's effectively upscale vs no upscale.

It might be less noticeable than usual depending on where the user sits, the size of the TV, and the fact that we don't all have perfect vision, but there is a difference.
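
For the raw numbers behind "upscale vs no upscale" (simple arithmetic, nothing console-specific), the common render resolutions compare like this:

#include <cstdio>

int main() {
    const double p720  = 1280.0 * 720.0;   //   921,600 pixels
    const double p900  = 1600.0 * 900.0;   // 1,440,000 pixels
    const double p1080 = 1920.0 * 1080.0;  // 2,073,600 pixels
    std::printf("1080p vs 900p: %.2fx the pixels\n", p1080 / p900);  // 1.44x
    std::printf("1080p vs 720p: %.2fx the pixels\n", p1080 / p720);  // 2.25x
    std::printf(" 900p vs 720p: %.2fx the pixels\n", p900 / p720);   // 1.56x
}

So 1080p means shading and filling 44% more pixels than 900p, which is a real cost difference for the GPU even if the on-screen difference is debatable.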
 
Once third-party developers start to get really into heavy (I mean, balls to the wall heavy) asynchronous compute, I'm pretty sure the freed Kinect resources aren't going to matter for the 1080p checkbox... but rather for matching compute-for-compute tasks between PS4/XB1. This is where things will get interesting...

I don't expect balls to the wall async compute to materialize this gen outside of a small handful of titles. Especially the async part; it's no magic bullet. With some good planning and care you can get close to peak perf with only the main ring, depending on what exactly you're doing. But regardless GPU time will come at a higher premium than CPU for a long time.

Some stuff runs like a dream on compute. Particle update, dynamic mesh generation, etc. That's the stuff people will move over, potentially freeing up the CPU they used to do it on older consoles. There's a lot of inertia against moving non-graphics work over to compute especially when you just freed a bunch of CPU.

Really though, the CU discrepancy is a problem whether or not you're doing compute and it will never go away. The only thing to do is balance around the weaker console like we did somewhat last generation, but the most surprising thing to me is how unwilling people have been to attempt parity in the titles released so far.
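
To make the "runs like a dream on compute" point concrete, here is a minimal, generic particle-update sketch in plain C++ (an illustration of the workload's shape, not any console API or any particular engine's code). Every iteration is independent of every other, which is exactly why this kind of job maps naturally onto one GPU compute thread per particle instead of burning CPU time:

#include <vector>
#include <cstddef>

struct Particle {
    float px, py, pz;  // position
    float vx, vy, vz;  // velocity
    float life;        // remaining lifetime in seconds
};

// CPU reference loop. On a GPU the loop body becomes a compute kernel dispatched
// with one thread per particle; no iteration reads or writes another's data.
void update_particles(std::vector<Particle>& particles, float dt, float gravity) {
    for (std::size_t i = 0; i < particles.size(); ++i) {
        Particle& p = particles[i];
        p.vy -= gravity * dt;  // integrate acceleration into velocity
        p.px += p.vx * dt;     // integrate velocity into position
        p.py += p.vy * dt;
        p.pz += p.vz * dt;
        p.life -= dt;          // expired particles get recycled elsewhere
    }
}

Whether that dispatch is queued on the main ring or overlapped asynchronously with graphics work is then a scheduling decision, which is the part the post above argues is no magic bullet.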
 
I don't expect balls to the wall async compute to materialize this gen outside of a small handful of titles. Especially the async part; it's no magic bullet. With some good planning and care you can get close to peak perf with only the main ring, depending on what exactly you're doing. But regardless GPU time will come at a higher premium than CPU for a long time.

Some stuff runs like a dream on compute. Particle update, dynamic mesh generation, etc. That's the stuff people will move over, potentially freeing up the CPU they used to do it on older consoles. There's a lot of inertia against moving non-graphics work over to compute especially when you just freed a bunch of CPU.

Really though, the CU discrepancy is a problem whether or not you're doing compute and it will never go away. The only thing to do is balance around the weaker console like we did somewhat last generation, but the most surprising thing to me is how unwilling people have been to attempt parity in the titles released so far.

Yes. Except for most of the EA sports games, which have basically total parity, I think the answer is that many PS4/XB1 titles are now developed first on PC and then ported directly to consoles, with settings optimized depending on the console.

It's probably easier this gen to port directly from PC to PS4/XB1 than it was last gen from PC to PS3/X360.
 
Exactly. 1080p > 900p, all other things being equal. I can't see how anyone could argue that. You can say subjectively they look the same, but you couldn't say definitively... ever.
...

The thing is, with fixed hardware "all other things being equal" never applies. There is always a give and take. You bump res up, other things get cut down. You drop res, other things can be bumped up. That's why people call it a checkbox mentality. You take any game, and you don't know what compromises would have to be made either way. You never get to see the two options. You just see the one the devs decide on. If you say, "I want all games to run at 1080p" you're basically asking the devs to provide resolution above all else, without knowing what the trade offs are.

I'm not saying you're a person that has that mentality. I don't know about you specifically. That just seems to be the mentality of a lot of forum warriors.

Put it this way ... You're a first-party PS4 developer, and you've come up with some new rendering technique that fits into a new physically-based lighting and material system, but you can't get it to be performant enough to run your game at 1080p. You could release the game at 900p with a stable framerate, but you know there is a community of players that post frequently on gaming sites, and gaming sites are now collecting "checkbox" lists of resolutions and framerates like the one at IGN. Do you fit the new rendering feature in at 900p, or do you scrap the feature and release at 1080p hoping to find a way to optimize the feature or make a reasonable approximation for your next game?
 
...gaming sites are now collecting "checkbox" lists of resolutions and framerates like the one at IGN. Do you fit the new rendering feature in at 900p, or do you scrap the feature and release at 1080p hoping to find a way to optimize the feature or make a reasonable approximation for your next game?
I believe we started the pixel counting phenomenon in 2008. What a Pandora's Box. I can only apologise to the poor developers for the way that turned out. :(
 
Why is 900p to 1080p now considered a marginal difference? It's more than any resolution gulf we had last gen. It's effectively upscale vs no upscale.

It might be less noticeable than usual depending on where the user sits, the size of the TV, and the fact that we don't all have perfect vision, but there is a difference.

There are a few reasons.

1) Some of that started last gen, when it was commonly said that PC versions of games weren't much of an upgrade over their console counterparts because they only offered more resolution, higher frame rates, higher-res textures, better shadow filtering, and longer draw distances, while the basic lighting system, like the basic game itself, was often the same. Back then, how the pixels were lit is what mattered more. Now opinion has shifted, and 900p to 1080p in and of itself is considered a noticeable difference, whereas last gen 720p to 1080p + 60fps + better textures + better shadows + better draw distance was not considered enough to matter if the general renderer lighting everything was the same.

2) As long as pixel counters are needed for people to actually know the difference, the people who make games will always assume it's a less important visual metric than other aspects of how the pixels are drawn. After all, if you can't see it without someone counting pixels for you, it's probably not as big a deal as things like cleaner shadows, longer draw distances, and better frame rates. Marketing value is another matter entirely, though.

3) The "better pixels" vs "more pixels" argument has been tested countless times in blind tests, not just by game developers but also by TV makers, with better pixels usually winning out. There's even a recent blind TV test where a 4K LCD display lost to a better-quality 1080p plasma display.

So there is plenty of past and present evidence that something like 900p to 1080p should not be considered as big of a deal as it currently is: everything from what gamers have said, to what gamers can see without being told what they're looking at, to what blind tests have shown.
 
Some of that started last gen, when it was commonly said that PC versions of games weren't much of an upgrade over their console counterparts because they only offered more resolution, higher frame rates, higher-res textures, better shadow filtering, and longer draw distances, while the basic lighting system, like the basic game itself, was often the same.
I don't recall this ever being a thing. Sure, maybe there were some people who held this view, but my perception is that the vast majority accepted that PC versions of multi-platform games were noticeably (as in visibly) better than their console counterparts - on the right hardware, naturally.
 
The thing is, with fixed hardware "all other things being equal" never applies. There is always a give and take. You bump res up, other things get cut down. You drop res, other things can be bumped up. That's why people call it a checkbox mentality. You take any game, and you don't know what compromises would have to be made either way. You never get to see the two options. You just see the one the devs decide on. If you say, "I want all games to run at 1080p" you're basically asking the devs to provide resolution above all else, without knowing what the trade offs are.

I'm not saying you're a person that has that mentality. I don't know about you specifically. That just seems to be the mentality of a lot of forum warriors.

Put it this way ... You're a first-party PS4 developer, and you've come up with some new rendering technique that fits into a new physically-based lighting and material system, but you can't get it to be performant enough to run your game at 1080p.
You could release the game at 900p with a stable framerate, but you know there is a community of players that post frequently on gaming sites, and gaming sites are now collecting "checkbox" lists of resolutions and framerates like the one at IGN. Do you fit the new rendering feature in at 900p, or do you scrap the feature and release at 1080p hoping to find a way to optimize the feature or make a reasonable approximation for your next game?

You mean, for example, Q Games and their voxel cone tracing? I suppose it could be an issue, but then the way to deal with it is to show the results and let the gamers decide if it's a worthy tradeoff.
 
You mean, for example, Q Games and their voxel cone tracing?

I'm not sure what this post is supposed to mean. Is that game running lower than 1080p, or are you trying to prove my hypothetical example "wrong"? All I'm saying is developers have to make choices about new rendering techniques on fixed hardware, and if they know they're going to get backlash for dropping below 1080p, will they shelve good ideas that could look great at a lower resolution?


I suppose it could be an issue, but then the way to deal with it is to show the results and let the gamers decide if it's a worthy tradeoff.

Do you think that's really a realistic proposition?
 
Put it this way ... You're a first-party PS4 developer, and you've come up with some new rendering technique that fits into a new physically-based lighting and material system, but you can't get it to be performant enough to run your game at 1080p. You could release the game at 900p with a stable framerate, but you know there is a community of players that post frequently on gaming sites, and gaming sites are now collecting "checkbox" lists of resolutions and framerates like the one at IGN. Do you fit the new rendering feature in at 900p, or do you scrap the feature and release at 1080p hoping to find a way to optimize the feature or make a reasonable approximation for your next game?

That's the main cause of the problem, actually, because it causes upscaling. For many gamers in search of the ideal image quality, and hence better graphics, 1080p in itself is not a checkbox; it's the native resolution of their displays. For instance, in Europe we played our old-gen games on 768p TVs, meaning upscaling, and sometimes double upscaling for games that didn't even render at 720p.

And for me, upscaling is the worst thing you can do to a game's image quality; it makes otherwise great graphics look comparatively bad.

Take BF4 on PS4/XB1. For all its rendering technique and everything it displays on screen, it's undermined by the forced upscaling from 720p/900p to 1080p, which completely negates the effects and pyrotechnics it shows. And many people who play their PS4 games at 1080p immediately notice the blurriness of BF4 at 900p, hence less impressive graphics than other native games.

What many PC gamers do when setting up their games (and what I was doing in the early years of LCD screens, without at first really understanding the superiority of native resolution the way I do now) is first select the native resolution of their monitor, because they know anything non-native will look poor, and that better effects + an upscaled resolution is worse than fewer effects + native resolution.

I agree that now this full-HD quest is a checkbox list for clickbait articles, but it wasn't that at all at first; it was simply the pursuit of the best image quality, which directly helps immersion.

It's developers who should understand that on LCD screens, effects, shaders, rendering techniques, tessellation, polygon counts, etc. will be completely negated/wrecked by any upscaling (and by blurry post effects, for that matter :rolleyes:). Once you play most of your games at native resolution, it's hard to go back to upscaled games IMO, and I think many devs still haven't really understood the importance of native resolution in the era of the Internet and social media.
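
For what the blurriness complaint amounts to in practice: 1600x900 to a 1920x1080 panel is a non-integer 1.2x scale, so nearly every output pixel is a weighted blend of neighbouring source pixels rather than a copy of one. A minimal bilinear-filter sketch (a generic illustration, not how any particular console's scaler actually works) makes that explicit:

#include <algorithm>
#include <vector>

// Bilinearly resample a single-channel image from (sw x sh) to (dw x dh).
// At 1600x900 -> 1920x1080 the 1.2x ratio means source coordinates land between
// texels, so each output value mixes up to four inputs, which is the source of the softness.
std::vector<float> upscale_bilinear(const std::vector<float>& src,
                                    int sw, int sh, int dw, int dh) {
    std::vector<float> dst(static_cast<std::size_t>(dw) * dh);
    for (int y = 0; y < dh; ++y) {
        for (int x = 0; x < dw; ++x) {
            float sx = (x + 0.5f) * sw / dw - 0.5f;  // map output pixel centre into source space
            float sy = (y + 0.5f) * sh / dh - 0.5f;
            sx = std::clamp(sx, 0.0f, static_cast<float>(sw - 1));
            sy = std::clamp(sy, 0.0f, static_cast<float>(sh - 1));
            const int x0 = static_cast<int>(sx), y0 = static_cast<int>(sy);
            const int x1 = std::min(x0 + 1, sw - 1), y1 = std::min(y0 + 1, sh - 1);
            const float fx = sx - x0, fy = sy - y0;
            const float top = src[y0 * sw + x0] * (1.0f - fx) + src[y0 * sw + x1] * fx;
            const float bot = src[y1 * sw + x0] * (1.0f - fx) + src[y1 * sw + x1] * fx;
            dst[static_cast<std::size_t>(y) * dw + x] = top * (1.0f - fy) + bot * fy;
        }
    }
    return dst;
}

A native 1080p signal skips that blending step entirely, which is essentially the native-resolution argument being made here.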
 
That's the main cause of the problem, actually, because it causes upscaling. For many gamers in search of the ideal image quality, and hence better graphics, 1080p in itself is not a checkbox; it's the native resolution of their displays. For instance, in Europe we played our old-gen games on 768p TVs, meaning upscaling, and sometimes double upscaling for games that didn't even render at 720p.

And for me, upscaling is the worst thing you can do to a game's image quality; it makes otherwise great graphics look comparatively bad.

Take BF4 on PS4/XB1. For all its rendering technique and everything it displays on screen, it's undermined by the forced upscaling from 720p/900p to 1080p, which completely negates the effects and pyrotechnics it shows. And many people who play their PS4 games at 1080p immediately notice the blurriness of BF4 at 900p, hence less impressive graphics than other native games.

What many PC gamers do when setting up their games (and what I was doing in the early years of LCD screens, without at first really understanding the superiority of native resolution the way I do now) is first select the native resolution of their monitor, because they know anything non-native will look poor, and that better effects + an upscaled resolution is worse than fewer effects + native resolution.

I agree that now this full-HD quest is a checkbox list for clickbait articles, but it wasn't that at all at first; it was simply the pursuit of the best image quality, which directly helps immersion.

It's developers who should understand that on LCD screens, effects, shaders, rendering techniques, tessellation, polygon counts, etc. will be completely negated/wrecked by any upscaling (and by blurry post effects, for that matter :rolleyes:). Once you play most of your games at native resolution, it's hard to go back to upscaled games IMO, and I think many devs still haven't really understood the importance of native resolution in the era of the Internet and social media.

So, if I'm to understand this correctly, the beautiful games of last gen, like The Last of Us, had absolutely horrible image quality. Just completely awful. They should have made the game 1080p, because the PS3 could output 1080p, and cut back on the lighting, shading, etc. to make sure the image wasn't being upscaled? Are upscale artifacts really as bad as you make them sound? I highly doubt it. I play upscaled games all the time, and they look great. I'd rather let a dev decide, after rigorous testing, what looks good and what doesn't, and release it as they please, whatever resolution that happens to be.
 
I'm not sure what this post is supposed to mean. Is that game running lower than 1080p, or are you trying to prove my hypothetical example "wrong"? All I'm saying is developers have to make choices about new rendering techniques on fixed hardware, and if they know they're going to get backlash for dropping below 1080p, will they shelve good ideas that could look great at a lower resolution?

Yes, to run their voxel cone tracing they're rendering below 1080p, and no, I'm not trying to prove your hypothetical wrong; I'm concurring with it. Your conclusion, on the other hand...




Do you think that's really a realistic proposition?

Yes, if the tradeoff brings sufficient value. However, if you're just swapping checkbox for checkbox, or worse, for parity, then yeah, you may suffer for choosing wrong.
 
Yes, to run their voxel cone tracing they're rendering below 1080p, and no, I'm not trying to prove your hypothetical wrong; I'm concurring with it. Your conclusion, on the other hand...


Yes, if the tradeoff brings sufficient value. However, if you're just swapping checkbox for checkbox, or worse, for parity, then yeah, you may suffer for choosing wrong.


I didn't know that game was below 1080p; I'd assumed it was. It would be interesting to know how many people are aware of that, and what they think of it.

Not sure what we're arguing about anymore. All I'm saying is devs should be able to judge what's best for the visuals in their own game, and what tradeoffs to make. They're the ones who do all the performance testing and all of the image comparisons. If they forgo new rendering techniques only because they're afraid of 1080p backlash, that's a bad thing. An approval process based on fan input would be impossible to manage, on top of issues like sample bias. I'm not sure what "checkbox for checkbox" refers to here, or where parity fits in.
 