Far Cry 4 Creative Director: Nobody cares aboot 1080p

I'm pretty sure people were really okay pretending that a recent first-party driving game with unrealistic penalties and horrific server performance wasn't subpar!

Please, if you do want to whine about a perceived lack of integrity or intellectual honesty, it'd do a lot of good to stop thinking in such loaded, one-sided terms.

Typical response, and unwarranted. My statement dealt with console-warrior noise from all sides. And your last statement proves what's wrong with gaming now... getting passionate over the wrong things. One thing I learned as a PC gamer (Nvidia vs. ATI vs. Intel vs. whoever else) is not to let your ego stop you from enjoying another product, and to grow a thicker skin when comments or opinions you dislike are made.

What is this? 1996? The PS2/Xbox wars? There is no need for this kind of post here.

Exactly.

back OT/
 
For me, sitting relatively close (2.5 m) to a 42" TV, I find I really do notice whether a game is native or not. However, it helps when the HUD and menus are at least native, which should usually be the case on Xbox One.

I also have a 42" TV, but sit about 1 m away. I notice low resolution quite readily (e.g. when effects are low-res too), so I typically vote for high res.

But of course, the devs need the freedom to choose the best setup for a given fixed hardware console.
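
Out of curiosity, here's a quick back-of-the-envelope acuity check for those two setups. Treat it as a rough sketch: the ~1 arcminute figure for 20/20 vision is a textbook approximation, and the exact distances and panel size are just the ones quoted above:

Code:
import math

# Angular size of a single 1080p pixel on a 42" 16:9 panel at two
# viewing distances. ~1 arcminute is a common approximation of the
# resolving limit of 20/20 vision.
width_m = 42 * (16 / math.hypot(16, 9)) * 0.0254  # screen width, ~0.93 m
pixel_m = width_m / 1920                          # pixel pitch, ~0.48 mm

for dist_m in (1.0, 2.5):
    arcmin = math.degrees(math.atan2(pixel_m, dist_m)) * 60
    print(f"{dist_m} m: {arcmin:.2f} arcmin per pixel")
# 1.0 m: 1.66 arcmin -> single pixels comfortably resolvable
# 2.5 m: 0.67 arcmin -> at or below the 20/20 limit

So single pixels should be far more visible at 1 m than at 2.5 m, which fits the two posts above; upscaling artifacts span multiple pixels, though, so non-native rendering can still be noticeable at the longer distance.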
 
I'm pretty sure people were really okay pretending that a recent first-party driving game with unrealistic penalties and horrific server performance wasn't subpar! Please, if you do want to whine about a perceived lack of integrity or intellectual honesty, it'd do a lot of good to stop thinking in such loaded, one-sided terms.

??? I don't even understand what, if anything, you're trying to say with this post, other than being angry about something ???
 
I really think that resolution is only one parameter of the onscreen experience. If possible, I go as high as I can.

But my PC, for example, is not powerful enough to max out the new Borderlands... but I love the PhysX in this game and want it at least on High settings... so I instantly dropped the res to 900p to get a decent framerate... IMO there is no general conclusion about which resolution is best, unless you game on PC and have invested in an uber-rig!

And it also seems logical that such a conclusion, even for the same game, might differ between PS4 and X1... in this sense, if we let the devs decide, we should get (in their educated opinion) the best possible result on the respective machines.
 
I can't wait until this console cycle ends… full of whiners and hypocrites. Hopefully during the next cycle both systems will be more evenly matched, so that 1st-party games will dictate the better choice. Not this buzzword-checkmark game… nor the game of pretending subpar IQ and performance are now acceptable because you own the subpar system.

Very true indeed. The only thing is, the whole sub-native-res / different-res-across-platforms thing isn't new. It was extremely common on multiplats last gen, but it wasn't made into such a big deal. The problem in my eyes is that we now have a situation in which pixel count and framerate determine whether a game is truly next-gen, instead of gameplay or even graphical effects. People will actually pass on a game if it is 900p because they consider it not next-gen enough, which is technically BS. Games like BF4 (900p) and Ryse (900p) aren't possible on older hardware even at 600p without serious cutbacks in geometry, texture quality and effects quality. Anyone who passes on playing a title on any system simply because it isn't 1080p and/or 60fps is doing themselves a serious disservice.
 
The problem in my eyes is that we now have a situation in which pixel count and framerate determine whether a game is truly next-gen, instead of gameplay or even graphical effects...

Anyone who passes on playing a title on any system simply because it isn't 1080p and/or 60fps is doing themselves a serious disservice.

Who, except for idiots, gives a shit whether a game is "truly next-gen" or not?

However, if an action game is not close to 60 fps, it will take a lot to convince me to take a look at it. I want to feel "in control" as much as possible, and lower frame rates are a serious detriment to that.
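
For rough context, the frame time alone (a simple sketch; real end-to-end input lag also depends on the engine pipeline and the display, which add on top of this):

Code:
# Frame time at common targets; end-to-end input lag is larger, since
# engine pipelining and display processing add delays on top of this.
for fps in (30, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms per frame
# 60 fps -> 16.7 ms per frame

Halving the frame time both tightens the input-to-photon loop and doubles the temporal sample rate, which is a big part of that "in control" feeling.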
 
No one cares about 1080p? Someone had better tell Microsoft, and also ask the Far Cry 4 devs why the hell it is 1080p then (on PS4, at least).

Sounds like more UBI damage control, bought into by saps.
There is no need to worry about where others spend their money. I don't think that affects us, so no.

I can't wait until this console cycle ends… full of whiners and hypocrites. Hopefully during the next cycle both systems will be more evenly matched, so that 1st-party games will dictate the better choice. Not this buzzword-checkmark game… nor the game of pretending subpar IQ and performance are now acceptable because you own the subpar system.
And that's where we are. :???: I say that because attitudes like that don't help the industry. :???:

As long as a game is 60 fps and looks decent, that's enough. Halo: The Master Chief Collection looks incredible even using "old" art, and it is 60 fps.

Which do you prefer, classic horror movies or modern ones? I'd say classic horror movies are better, all the way. :smile2:

What made Jaws such a great movie, if not the limitations of the time? :smile2:

Do you think a modern movie could surpass that kind of vintage classic in quality and sheer terror? I don't think so. It's all about how you capture things: the atmosphere, the artistic choices. :eek:


Then there's the fps. Even some Game Boy games look incredible, like, say, World Heroes on the Game Boy.
 
Yeah, well, he is preaching to the choir, I guess. The better question is why one person's opinion would suddenly become gospel to you. If he had said the opposite, would you have posted it?
I wouldn't, because that wouldn't reflect my opinion. Diablo 3 and Powerstar Golf are my favourite Xbox One games; one of them was 900p until not long ago, and Powerstar Golf (I think, not sure) most probably isn't 1080p.

1080p is not the holy grail of a console or your TV. Fortunately for you, you didn't have to spend a shitload of money for little difference. Maybe you just don't want to admit that 900p and 1080p look almost the same, a negligible difference, despite what the DF experts say.

But if you go along with the consumer-friendly hype, that's what happens: games with little gameplay, but hey, they run at 1080p. :???: You look to me like the guy who is sad because his console can't output 4K.

Wouldn't you buy a Wii U just to play F-Zero, even if it's 720p? :smile: Or wouldn't you get a PS4 if it had Gran Turismo 7, even if Polyphony Digital went crazy with effects and dropped the resolution to achieve them? I would, even if I were a proud X1 owner, because those games are good enough to justify buying a console, regardless of the number of pixels displayed.
 
All things being equal, 1080p is better than sub-1080p. My TV is 1080p. Scaling and fewer pixels hurt IQ. Sure, a good song is a good song even at 96kbps, but it is even better at 128 or 256. Why are you on a crusade about this? A higher sampling rate is not a subjective thing; it is better, both temporally and spatially.

Why do people bring up 4K? Is that relevant? If I had a 4K TV and a console that output at 4K, then it would be objectively better than 1080p. But who has that first-world problem?
 
I really think that resolution is only one parameter of the onscreen experience. If possible, I go as high as I can.

But my PC, for example, is not powerful enough to max out the new Borderlands... but I love the PhysX in this game and want it at least on High settings... so I instantly dropped the res to 900p to get a decent framerate... IMO there is no general conclusion about which resolution is best, unless you game on PC and have invested in an uber-rig!

And it also seems logical that such a conclusion, even for the same game, might differ between PS4 and X1... in this sense, if we let the devs decide, we should get (in their educated opinion) the best possible result on the respective machines.
The amount of power required to run some games at certain resolutions is beyond the capabilities of certain hardware. And forget 1080p: if you look back, Ultra High Definition was already being talked about in 2005-2006, yet we've never seen it realised on our consoles.

UHD was known even as early as 2002, so...

http://en.wikipedia.org/wiki/Ultra-high-definition_television

Additionally, I agree with shrendevain's post above.
 
All things being equal, 1080p is better than sub-1080p. My TV is 1080p. Scaling and fewer pixels hurt IQ. Sure, a good song is a good song even at 96kbps, but it is even better at 128 or 256. Why are you on a crusade about this? A higher sampling rate is not a subjective thing; it is better, both temporally and spatially.

Why do people bring up 4K? Is that relevant? If I had a 4K TV and a console that output at 4K, then it would be objectively better than 1080p. But who has that first-world problem?
I basically agree with you that, all things being equal, 1080p is slightly better; there is a difference. The simple and clever answer to it is: YES, there is (unless you're blind and can't see squat). But then you're back into shallowness. Numbers upon numbers don't mean anything if you're sacrificing gameplay for a few extra pixels. :)

Just imagine certain games running at a perfect 60 fps with all the effects on; that's worth sacrificing 1080p.

The real power of your console doesn't have much to do with your auditory or visual perception. And that's very important. :smile2:

Try this experiment: put one hand in hot water and the other in cold water, then after a while put both hands under tap water. Both hands will adapt to the new temperature, even if your senses tell you something is odd or wrong at first. :smile2:
 
Who, except for idiots, gives a shit whether a game is "truly next-gen" or not?

However, if an action game is not close to 60 fps, it will take a lot to convince me to take a look at it. I want to feel "in control" as much as possible, and lower frame rates are a serious detriment to that.

Oh, I agree. The problem is there's an article from ExtremeTech claiming that the Xbox One is capable of having games that either play next-gen or look next-gen, but not both.
I don't have a link handy, but their only qualification for looking next-gen is 1080p.
ExtremeTech is always touting that they are IGN's sister site; I guess they want the association with IGN to make the site seem important somehow.
 
Oh, I agree. The problem is there's an article from ExtremeTech claiming that the Xbox One is capable of having games that either play next-gen or look next-gen, but not both.
I don't have a link handy, but their only qualification for looking next-gen is 1080p.
ExtremeTech is always touting that they are IGN's sister site; I guess they want the association with IGN to make the site seem important somehow.
The fact that idiots exist, and are on the Internet, is not exactly news.
 
All things being equal, 1080p is better than sub-1080p.

As an academic argument this is sound; however, in the real world "all things being equal" cannot apply when discussing console games. A console is a box with finite capabilities, and compromise in other areas is always required to hit 1080p (or 720p last gen, etc.).


Sure, a good song is a good song even at 96kbps, but it is even better at 128 or 256.

In the finite-capability (i.e. real-world) model, increasing the bitrate would increase the file size. Your 256kbps track would sound awesome until it ended before it reached the guitar solo. (In my mind this song is a power ballad. I'm actually playing air guitar along to it right now. Work colleagues are looking concerned.)
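
To put hypothetical numbers on that fixed-budget trade-off (assuming constant-bitrate encoding and an arbitrary 10 MB cap, both made up for illustration):

Code:
# How many minutes of audio fit in a fixed (hypothetical) 10 MB budget
# at each constant bitrate.
budget_bits = 10 * 1024 * 1024 * 8
for kbps in (96, 128, 256):
    minutes = budget_bits / (kbps * 1000) / 60
    print(f"{kbps} kbps -> {minutes:.1f} minutes")
# 96 kbps  -> 14.6 minutes
# 128 kbps -> 10.9 minutes
# 256 kbps -> 5.5 minutes

The 256kbps version sounds better per second, but the budget runs out almost three times sooner. Hence the missing guitar solo.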


A higher sampling rate is not a subjective thing

Again, academically you are correct. In the real world, though, there are a number of degradations between a digital media source and a human sensory organ.

In the case of music: D/A converters, speakers, background noise, distance to the source, room shape and, most important of all, ears.

In the case of video: the TV, light interference, smoke particles in the air (pass it to the left), viewing distance, viewing angle and eyes.

This makes the notion of an objective measurement for human observers slightly ridiculous because the very involvement of a human makes these measurements inherently subjective.

Even if we take the human out of the equation, quantum physics suggests that there is a chance that, with no observer, the source signal does not exist! ;)

The discussion of this article is not about whether there is more information in a 1080p framebuffer than in a sub-1080p framebuffer. We know there is; that is objective. The discussion is about whether that degraded signal, when it reaches the average human, is detailed enough for small resolution differences to be told apart.

I have no issue with people saying they can tell the difference between resolutions. With the right TV, eyes and viewing conditions, I'm sure a great many people can tell the difference. Some may even be genuinely bothered by it (considerably fewer than the number who claim to be bothered by it, for sure).

I do have a problem with the statement that higher resolutions are objectively better, though. As the saying goes: beauty is in the eye of the beholder.
 
There are some bizarre arguments being thrown around in this thread that I'm struggling to understand: if resolution/framerate/effects have no relevance and next-gen can only be defined by gameplay, then why ever upgrade the hardware in consoles? Surely we'd all still be happy playing on a NES, since new gameplay ideas can happen regardless of power. I don't get it.

Is there much of a numerical differential between 900p and 1080p? Yep, there’s a fair amount.

Can you tell the difference between them? Sure you can.

Does the differential in resolution match the perceptive change? Probably not, because of diminishing returns.
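
For reference, the raw arithmetic behind that "fair amount" (assuming 900p means 1600x900, as it usually does in these comparisons):

Code:
# Pixel counts: 1080p vs 900p (assuming 900p = 1600x900).
p1080 = 1920 * 1080  # 2,073,600 pixels
p900 = 1600 * 900    # 1,440,000 pixels
print(f"1080p is {p1080 / p900:.2f}x the pixels of 900p")  # 1.44x

A 44% pixel-count gap clearly doesn't read as a 44% perceived difference, which is the diminishing-returns point.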

I'm very sorry, but I totally bought a new console so that I could play games at a resolution that matches my television's output, and also because I want the graphical effects to be a lot better, ideally with a better framerate. I'd very much like gameplay to improve with the new consoles too, only I don't see it being massively different from what came before, since you can always innovate and innovation doesn't necessarily require better hardware. I can't say I care too much if I'm playing BF at 900p, even if my preference is still a resolution that matches my TV.

Last gen, the differences in games between the two systems were less about resolution and more about effects. This time, the difference in power between the hardware is easily addressed with a resolution switch. It's as simple as that. Should Xbox One owners care that they're playing at a lower resolution than their PS4-owning counterparts? I sure don't think so.
 
since new gameplay ideas can happen regardless of power. I don’t get it.
Although not wanting to engage in another daft, cyclic resolution discussion (like it hasn't all been said before), power definitely opens up gameplay ideas. Many games are simply impossible on a 4 kB Sinclair ZX81. Ergo, there's the possibility of wanting more power while still rendering at SD resolutions.

With that said, I'll leave you all to go on and on about how resolution does matter/doesn't matter just as everyone has done before. ;)
 
Although not wanting to engage in another daft, cyclic resolution discussion (like it hasn't all been said before), power definitely opens up gameplay ideas.
Yup: we've recently seen that Dying Light's gameplay, as originally envisioned, wasn't working on 360/PS3, and that Shadow of Mordor's enemy system (Nemesis) is extremely pared back on last-gen consoles.

<512 MB is not a lot of RAM for increasingly large and complicated datasets. And while you can argue that the core gameplay (exploring, combat, completing missions) of Shadow of Mordor is there on last gen, a new generation of hardware for me means deeper and more complicated worlds.

Despite not liking the sound of it, I'm curious to see whether AC Unity does anything really not possible on last gen, other than the graphics upgrade.
 
There are some bizarre arguments being thrown around in this thread that I'm struggling to understand: if resolution/framerate/effects have no relevance and next-gen can only be defined by gameplay, then why ever upgrade the hardware in consoles? Surely we'd all still be happy playing on a NES, since new gameplay ideas can happen regardless of power. I don't get it.

Is there much of a numerical differential between 900p and 1080p? Yep, there's a fair amount.

Can you tell the difference between them? Sure you can.

Does the differential in resolution match the perceptive change? Probably not, because of diminishing returns.

I'm very sorry, but I totally bought a new console so that I could play games at a resolution that matches my television's output, and also because I want the graphical effects to be a lot better, ideally with a better framerate. I'd very much like gameplay to improve with the new consoles too, only I don't see it being massively different from what came before, since you can always innovate and innovation doesn't necessarily require better hardware. I can't say I care too much if I'm playing BF at 900p, even if my preference is still a resolution that matches my TV.

Last gen, the differences in games between the two systems were less about resolution and more about effects. This time, the difference in power between the hardware is easily addressed with a resolution switch. It's as simple as that. Should Xbox One owners care that they're playing at a lower resolution than their PS4-owning counterparts? I sure don't think so.

Yup: we've recently seen that Dying Light's gameplay, as originally envisioned, wasn't working on 360/PS3, and that Shadow of Mordor's enemy system (Nemesis) is extremely pared back on last-gen consoles.

<512 MB is not a lot of RAM for increasingly large and complicated datasets. And while you can argue that the core gameplay (exploring, combat, completing missions) of Shadow of Mordor is there on last gen, a new generation of hardware for me means deeper and more complicated worlds.

Despite not liking the sound of it, I'm curious to see whether AC Unity does anything really not possible on last gen, other than the graphics upgrade.
I see your point, and I think you are wrong about a certain detail... Technology can enhance gameplay, sure. Additionally, technology doesn't affect gameplay, but 1080p does, negatively.
 
I see your point, and I think you are wrong about a certain detail... Technology can enhance gameplay, sure. Additionally, technology doesn't affect gameplay, but 1080p does, negatively.

??? :???:

Your statement is confusing... very confusing.

As console and PC technology advances, so do the possibilities for implementing better gameplay mechanics. As for resolution outputs, be it 480p, 720p, 1080p, 4K or even 8K, that doesn't hold back game design. Poor development planning and going beyond the system's limitations create issues. It would be quite silly for me to develop a 4K game on the PS4/XB1 knowing the systems' limitations. So blaming a render output is quite silly, IMHO…
 