According to the DF interview, it seems they mainly had memory issues with resolution on the PS3. It seems that it was a performance issue that reduced the resolution on the 360 (AF was limited to 4x, and tiling causes a performance hit, right?).

Kind of a weird question, but yes, if Crysis 2 were running on the Frostbite engine it would be running at that resolution. 704 is the magic number for a 40x22 grid of tiles if each tile is 32x32 pixels (40x32 = 1280 wide, 22x32 = 704 tall), and you cull the rest of the pixels.
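To spell out that tile arithmetic, here is a minimal sketch (the 32x32 tile size is the one assumed in the post above, not a figure confirmed for any particular engine): with 32x32 screen tiles, 1280x704 divides into an exact 40x22 grid, while 1280x720 leaves a partially covered row of tiles.

```cpp
#include <cstdio>

// Minimal sketch of the screen-tile arithmetic from the post above.
// Assumes 32x32 screen tiles (as the post does); not tied to any engine.
int main()
{
    const int tile = 32;
    const int widths[]  = { 1280, 1280 };
    const int heights[] = {  704,  720 };

    for (int i = 0; i < 2; ++i)
    {
        int w = widths[i], h = heights[i];
        int tilesX = (w + tile - 1) / tile;  // round up so every pixel is covered
        int tilesY = (h + tile - 1) / tile;
        int padded = tilesX * tile * tilesY * tile - w * h;  // pixels in partial tiles
        printf("%dx%d -> %dx%d tiles, %d padding pixels\n",
               w, h, tilesX, tilesY, padded);
    }
    return 0;
}
```

1280x704 comes out as exactly 40x22 tiles with no waste; at 1280x720 the last row of tiles is only half covered, which is presumably why you would drop the final 16 rows and cull those pixels.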
The reason Crysis 2 ran at its resolution on the 360 was probably to avoid tiling, while on the PS3 it was probably for performance.
"720p should not be considered as a magical number that guarantees image quality. 1152x720 is already very close to 1280x720."

I see. Perhaps Crytek should balance their engine out a bit and take it easy on the fillrate and bandwidth. Whatever takes them closer to 1280x720 the better. But that's just me.
"I think that Crytek (and most other AAA games studios recently) have made a good choice by optimizing their resolution to match the EDRAM size (you get performance improvement by both reduced pixels and by reduced geometry passes, and you can use this extra performance to make the game look better and run better)."
This reminds me of a recent case: one of my classmates pretty much always plays COD4 on PS3 on an HDTV, and a couple of weeks ago when I went to his place I saw that he had the console connected to the HDTV while the display was set to an SD resolution... I was pretty surprised that he never knew he could have it displayed at a higher resolution.

Then, given the sell-through of the title to the masses, I have to wonder how many are even connected to HD displays (let alone properly). Most would probably notice the better shaders (even if they don't realize it).
Someone should do some "blind tests" for 720p vs near-720p resolutions. I wonder how many of our average super-critical gaming forum users would be able to recognize the version that doesn't reward them with enough pixels.
Single-console optimization for the SKU that ends up selling the worst, while being the most difficult to work on, makes it a hard sell for third-party developers. There's a finite time and budget for these things.
That one is extremely obvious. I played Warhawk initially on SD, which was very clean but not at all clear. 720p is definitely needed. However, I don't ever play it on a native display. I'm either on a 1366x768 TV or a 1680x1050 screen (letterboxed to 16:9; thank goodness Samsung realised the need for 16:9 support on a 16:10 display in time for me!). 720p looks clear and gorgeous, and I have to stop and look at a 1080p game, like PJMonsters, to notice any advantage. In a typical fast game I doubt many people can notice the real difference. And considering movies and TV programmes are rarely that sharp themselves, the pursuit of resolution really is something of a wild goose chase IMO.

I made the same test with a 720p no-AA shot and an SD 4xAA shot (and better shaders), at the same view distance and with no zoom or Photoshop tools.
They said they were actually CPU bound on 360, and there was still some performance left on the GPU.
They said the GPUs were comparable, except on the vertex side. I did see the part about them being CPU bound on the 360 and running out of memory on the PS3. If there was GPU performance left, why would they say the 360 AF was 4x max due to performance? That wasn't in the DF article/interview; that was in the Crytek PDFs Scofield provided in the CryEngine 3 thread.
"If there was GPU performance left, why would they say the 360 AF was 4x max due to performance?"
Yes, I'm referring to the PDF that their gfx engineer wrote. He may be right, since MP on 360 ran flawlessly, but when AI comes into play it drops frames, mostly in the first third of the game. It's interesting that things get much better after that, even though the game gets even bigger (the night level shown at E3 2010, for example). Maybe it was more of an optimization issue in the first half of the game...
720p should not be considered as a magical number that guarantees image quality. 1152x720 is already very close to 1280x720.
Almost no digital TV sets have been sold with native 720p (1280x720) support. Some old CRT-based HDTVs supported native 720p in the US, and first-generation plasmas had 1024x768 resolution. Basically all HD Ready TVs have been 1366x768 for a long time. So 720p output always requires scaling, for both HD Ready and Full HD televisions. When played on a 1080p Full HD TV, upscaled 1280x720 and 1152x720 are very hard to distinguish (both get equally blurred by the upscaling).

I think that Crytek (and most other AAA game studios recently) have made a good choice by optimizing their resolution to match the EDRAM size (you get a performance improvement both from reduced pixel counts and from fewer geometry passes, and you can use this extra performance to make the game look better and run better). Resolution is just one factor in graphics quality. I personally would even go as low as 333x480 (VHS resolution) if we could make games look as good as old VHS videos. I prefer fewer good-looking pixels over many bad-looking ones.
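To make the EDRAM point concrete, here is a minimal back-of-the-envelope sketch. The 10 MB figure is the real Xenos EDRAM size; the particular surface set (a 32-bit colour target, a 32-bit depth/stencil target and one additional 32-bit target, no MSAA) is purely an assumption for illustration, not Crytek's actual configuration:

```cpp
#include <cstdio>

// Rough EDRAM footprint check for Xbox 360 (Xenos has 10 MB of EDRAM).
// If the render targets for a pass don't fit, the pass must be split into
// tiles (predicated tiling), which re-submits geometry per tile.
int main()
{
    const double edramBytes = 10.0 * 1024.0 * 1024.0;

    // Assumed surface set, purely for illustration:
    // colour (4 B) + depth/stencil (4 B) + one extra 32-bit target (4 B), no MSAA.
    const int bytesPerPixel = 4 + 4 + 4;

    const int widths[]  = { 1280, 1152 };
    const int heights[] = {  720,  720 };

    for (int i = 0; i < 2; ++i)
    {
        double bytes = double(widths[i]) * heights[i] * bytesPerPixel;
        printf("%dx%d: %.2f MB -> %s\n",
               widths[i], heights[i], bytes / (1024.0 * 1024.0),
               bytes <= edramBytes ? "fits, no tiling" : "needs tiling");
    }
    return 0;
}
```

Under those assumed formats, 1280x720 just overflows the 10 MB (about 10.5 MB) while 1152x720 squeezes in (about 9.5 MB); with different formats or with MSAA the exact numbers shift, but this is the kind of trade-off the post above is describing.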
Disagree. The resolution is not "so low" in comparison with other games; it's short of full 720p, but I would bet it's about average for what this gen gave us. And yeah, pixelated and blurry over, say, a 10% difference? Not really. A bigger difference in IQ would come from going from TV to TV and comparing, since all of them operate on different settings. Most people don't even know how to set them up properly, and the difference between TV settings would probably be bigger than a 10% resolution difference.

I've seen the comparison of Crysis 2 1280x720 PC shots vs 1152x720 360 shots, and the difference is very noticeable to me at least; that poor AA implementation didn't help either.
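For reference, the "10%" figure is simply the raw pixel-count gap between 1152x720 and 1280x720; a quick check:

```cpp
#include <cstdio>

// Pixel-count gap between the 360's 1152x720 and full 1280x720.
int main()
{
    const double full = 1280.0 * 720.0;  // 921,600 pixels
    const double sub  = 1152.0 * 720.0;  // 829,440 pixels
    printf("pixel deficit: %.1f%%\n", 100.0 * (1.0 - sub / full));   // 10.0%
    printf("horizontal upscale to 1280: %.3fx\n", 1280.0 / 1152.0);  // ~1.111x
    return 0;
}
```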
I also disagree on graphical features over resolution, especially when you have a fairly decent-sized HDTV. You can have all the shaders and fancy lighting you want in your game, but if your resolution is too low the end result will just look blurry and pixelated on your screen, which voids the effort put into the graphics.
I've seen quite a few games that strike the right balance of resolution, features and frame rate without falling into the sub-HD realm, but of course all developers have different goals and preferred approaches.