Show me a game in 1080p on a top end PC that can beat today's best CGI film running at a DVD resolution in terms of detail and realism...
Not that I disagree with your general point, but CGI films are natively rendered at a much higher resolution than 1080p.
A higher resolution is great if you can see the pixels, but having photo-realistic graphics has little to do with actual resolution.
Damn the public's quest for ever-increasing resolutions! We keep getting everything set back to zero again instead of actually increasing the density of detail.
Maybe the problem is with your TV. I've gamed at 1080p and 720p on mine and it's hard to tell the difference.
The point is that they have a long way to go in image quality without lower resolution being a major drawback. If they could hit DVD quality at 480p, but only Pong-level graphics at 1080p, would you still be cheering for your checkbox number?

What has this argument got to do with actual real-time graphics? You are comparing something that is real-time with something that is not. What is your point? Of course offline CGI does, and should, look better than real-time gameplay. It is always the same weird argument, which in my opinion just does not fit the actual discussion topic.
But aren't most PC games not that well optimised for multicore CPUs? I mean, what percentage of CPU utilisation does a title like BF3 hit on an i7? I doubt it'd be the same as a game running on Xenon or Cell.
Sorry scott, but I do not agree. At the very least, every console generation so far has brought a general increase in resolution. Lots of games are 720p on PS360. So if we get a 720p BF4, there is simply no progress, and that is disappointing. I know that some games this gen had super low resolutions (The Darkness on PS3?!) but they were still higher than last gen. So the reality is that going substantially below 1080p, which 720p is in my opinion, would be disappointing in the same way The Darkness's resolution was on PS3, and I do not expect this from one of the most technically capable devs around!
Furthermore, I clearly see the difference between 720p and 1080p on my screen, and I promise you that you will see the difference as well. I did lots of tests with my gaming PC and it is obvious. I played at 1080p on PC all the time, until GoWA on PS3, which was super jarring for the first hour, although lots of forumites praise this game for its high IQ.
I wonder about the following: if devs want to crank up the detail level of the graphics, e.g. texture and shader detail, 720p will clearly limit them due to the artifacts. I don't want super high detail where the whole image is jagged and constantly shimmering.
In conclusion: I vote 1080p.
Furthermore, let's not forget that it is not 1080p or 60Hz; these are not per se mutually exclusive features. It is in the hands of the devs to deliver both 1080p and 60Hz. In my PC tests, I am much less sensitive to the actual quality of certain expensive effects (medium or high settings are often good enough, no very high needed), so I'd rather turn those down to maintain the framerate at 1080p.
The point is that they have a long way to go in image quality without lower resolution being a major drawback. If they could hit DVD quality at 480p, but only Pong-level graphics at 1080p, would you still be cheering for your checkbox number?
I've had a "good screen" since 2008 (professionally calibrated Pioneer Kuro 151). 720p content with a good art style and effects has looked great on it. I can't imagine what 720p content with all the bells and whistles from the power of the new consoles would look like.
Resolution is just one part of IQ, and one with diminishing returns, hence my 4K comment earlier.
As with every gen, there will be a resource budget that everyone will have to work with. I personally hope that budget is used to show me enhancements in gameplay (AI, animations, particles, physics, more players on a map, destruction, etc.) rather than aiming for a resolution checkbox.
If a developer such as DICE feels that in order to provide the gameplay experience they want, they need to keep the resolution low, I'd like to see the community wait and see the results before bringing out the pitchforks.
CoD is a perfect example of a game that said "FUCK YOU" to the spec whores and forum goers and focused on what they believe would deliver the best experience. I want to see more of that.
If specs are your end all, build a gaming PC and knock yourself out.
I'm not picking on you Billy, it's just that this shit gets old every gen when corporate mouthpieces start spec hyping and enthusiasts can't look past it.
The bolded part there is where everything goes wrong for your comparison. Sure, if your screen is close enough that you can resolve individual pixels you'll be able to notice the difference. There is no way I'd play a 1200p game upscaled to 1600p on my 30" monitor because it's so close I can see the difference.
The important thing here is that this will be on a console. Which uses a TV. Which is generally located multiple meters away from the viewer/game player. My 55" HDTV is 3-4 meters away from my typical viewing seat.
To put that into perspective: if I had a 1080p 24" PC monitor on my desk at my typical 2-foot viewing distance, then to resolve the same amount of detail on my TV at typical viewing distances, it would have to be a screen over 100", or I'd have to move the TV uncomfortably close to my couch (rough numbers are sketched after this post).
To put it into more perspective. Recommended TV viewing distance for a 24" TV would be between 1-3 meters away. Your PC monitor is much MUCH closer to you than any TV manufacturer or calibrator would recommend. Hence it's much easier to see the differences between 720p and 1080p.
Long story short: I'd never upscale a game on my PC monitor because it is so close. On my TV, however? I run ALL PC games at 720p instead of 1080p. There is no discernible difference unless you were to put it side by side with another TV. I've done numerous blind tests with friends and family, and even those who absolutely, positively, without a doubt declared that there was no way they could mistake 720p for 1080p... yeah, they couldn't tell the difference either, as long as I didn't have game text on the screen.
Hence, when PC gaming on my HDTV, I set resolution to 720p and crank up the rendering detail and AA. And end up with a far better looking game in general than if I had to run it at 1080p.
Regards,
SB
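(Aside: a minimal Python sketch of the viewing-distance comparison above. The only inputs are the figures quoted in the post, a 24" monitor at roughly 2 feet and a TV at 3-4 m; the 0.61 m and 3.5 m values are just my conversion and midpoint, and the rest is small-angle arithmetic.)

# How big would a TV have to be at couch distance to fill the same field of
# view as a 24" monitor at desk distance? Distances are the ones quoted above.

MONITOR_DIAGONAL_IN = 24.0   # 1080p desktop monitor
MONITOR_DISTANCE_M = 0.61    # ~2 feet of desk viewing distance
TV_DISTANCE_M = 3.5          # midpoint of the quoted 3-4 m couch distance

# For small angles, apparent size scales roughly linearly with distance, so the
# equivalent diagonal is the monitor diagonal scaled by the ratio of distances.
equivalent_tv_diagonal_in = MONITOR_DIAGONAL_IN * (TV_DISTANCE_M / MONITOR_DISTANCE_M)
print(f"Equivalent TV diagonal at couch distance: {equivalent_tv_diagonal_in:.0f} inches")
# Prints ~138 inches, which is in line with the "over 100-inch screen" figure above.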
I know what you guys are trying to say, don't get me wrong, and I really appreciate that you try to explain it to me so often.
But again, your explanation is too extreme in my opinion, and that is what I am trying to explain to you. The difference in computing resources needed for 720p versus 1080p is imo not so large that one version of the game would look like 'Pong' and the other like 'Avatar' (rough numbers are sketched after this post). So how much of an increase in 'detail' can you expect by avoiding 1080p? We don't know, as it depends on the actual game. E.g. I certainly hope that we are CPU-bound on the new consoles for the new BF due to all the nice destruction physics, and that resolution will not define the performance.
PS: please stop it with this checkbox bs, boooring.
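(Aside: a rough sketch of the pixel-count arithmetic behind the "not that large" point above. The 1280x720 and 1920x1080 figures are just the standard resolutions, and only work that scales per-pixel is counted, which fits the hope of being CPU-bound.)

# How much more per-frame shading work is 1080p compared to 720p?
pixels_720p = 1280 * 720      # 921,600 pixels per frame
pixels_1080p = 1920 * 1080    # 2,073,600 pixels per frame

ratio = pixels_1080p / pixels_720p
print(f"1080p shades {ratio:.2f}x as many pixels as 720p")  # 2.25x

# CPU-side work (physics, AI, networking) and most vertex work do not scale
# with resolution, so the real-world gap is usually smaller than 2.25x.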
I wouldn't think that resolution will be the deciding factor for most people. Battlefield is a totally different game than Killzone, unless you only play the single player.
I think going 720p and 60fps is the best decision they could make. As long as it is native 720p and has some form of AA, it will look fantastic. Being 60fps will make it a much more responsive game, especially in multiplayer.
I'm interested to see why DICE couldn't manage 1080p60, since BF3 could definitely run on PS4 at very high if not ultra settings at 1080p60.
They must have really made a lot of improvements to the engine.
Do you guys think this new dynamic weather system will be the culprit?
Nope. Most of WETA's CG in Avatar was rendered at 1080p.
http://forums.cgsociety.org/showpost.php?p=6496211&postcount=54
http://forums.cgsociety.org/showthread.php?t=878563