Choice of rendering resolution *spawn

I see, perhaps Crytek should balance their engine out a bit and take it easy on the fillrate and bandwidth. Whatever takes them closer to 1280x720, the better. But that's just me.
 
Single-console optimization for the SKU that ends up selling the worst while being the most difficult to work on makes it a hard sell for third-party developers. There's a finite time and budget cost for these things.
 
Kind of a weird question; yes, if Crysis 2 were running on the Frostbite engine it would be running at that resolution. 704 is the magic number for 40x22 tiles if each tile were 32x32 pixels, and you cull the rest of the pixels.
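Just to make those tile numbers explicit, a quick sketch (assuming the 32x32-pixel tiles mentioned above):

Code:
# Rough sketch of the tile arithmetic above; the 32x32 tile size is the assumption from the post.
TILE = 32

def tile_grid(width, height):
    cols, col_rem = divmod(width, TILE)
    rows, row_rem = divmod(height, TILE)
    return cols, rows, (col_rem == 0 and row_rem == 0)

print(tile_grid(1280, 704))  # (40, 22, True)  -> exactly 40x22 full tiles
print(tile_grid(1280, 720))  # (40, 22, False) -> 720 isn't a multiple of 32, so the last row of tiles is partial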

The reason why Crysis 2 ran at its resolution on 360 was probably to avoid tiling, while on PS3 it was probably for performance.
According to the DF interview, it seems they mainly had memory issues with the resolution on the PS3, whereas it was a performance issue that reduced the resolution on the 360 (AF was limited to 4x, and tiling causes a performance hit, right?).
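As a very rough illustration of what pushes a frame into tiling on 360 at all (the 12 bytes/pixel layout below — two 32-bit colour targets plus 32-bit depth/stencil, no MSAA — is purely an assumed example, not Crysis 2's actual render-target setup):

Code:
# Back-of-the-envelope EDRAM check. The 12 bytes/pixel figure is an assumption
# (two 32-bit colour targets + 32-bit depth/stencil, no MSAA), used only to show
# how a modest drop in width can keep a frame inside the 10 MB of EDRAM.
EDRAM_BYTES = 10 * 1024 * 1024  # Xenos EDRAM capacity

def edram_check(width, height, bytes_per_pixel=12):
    footprint = width * height * bytes_per_pixel
    return footprint, footprint > EDRAM_BYTES

for w, h in [(1280, 720), (1152, 720)]:
    size, needs_tiling = edram_check(w, h)
    verdict = "needs tiling" if needs_tiling else "fits in one pass"
    print(f"{w}x{h}: {size / 2**20:.2f} MB -> {verdict}")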
 
I see, perhaps Crytek should balance their engine out a bit and take it easy on the fillrate and bandwidth. Whatever takes them closer to 1280x720, the better. But that's just me.
720p should not be considered a magical number that guarantees image quality. 1152x720 is already very close to 1280x720.

Almost no digital TV sets have been sold with native 720p (1280x720) support. Some old CRT-based HDTVs in the US supported native 720p, and first-generation plasmas had 1024x768 resolution. Basically all HD Ready TVs have been 1366x768 for a long time. So 720p output always requires scaling, for both HD Ready and Full HD televisions. When played on a 1080p Full HD TV, upscaled 1280x720 and 1152x720 are very hard to distinguish (both get equally blurred by the upscaling). I think that Crytek (and most other AAA game studios recently) have made a good choice by optimizing their resolution to match the EDRAM size (you get a performance improvement from both the reduced pixel count and the reduced geometry passes, and you can use that extra performance to make the game look better and run better). Resolution is just one aspect of graphics quality. I personally would even go as low as 333x480 (VHS resolution) if we could make games look as good as old VHS videos. I prefer fewer good-looking pixels over many bad-looking ones.
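To put rough numbers on the scaling point (simple arithmetic, nothing more): neither a 1366x768 panel nor a 1920x1080 panel is an integer multiple of either render resolution, so the TV's scaler is resampling in every case.

Code:
# Scale factors when a given render resolution is shown on common panel resolutions.
panels = [(1366, 768), (1920, 1080)]
renders = [(1280, 720), (1152, 720)]

for pw, ph in panels:
    for rw, rh in renders:
        print(f"{rw}x{rh} on {pw}x{ph}: {pw / rw:.3f}x horizontal, {ph / rh:.3f}x vertical")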
 
I think that Crytek (and most other AAA game studios recently) have made a good choice by optimizing their resolution to match the EDRAM size (you get a performance improvement from both the reduced pixel count and the reduced geometry passes, and you can use that extra performance to make the game look better and run better).

To take another example... there are the Call of Duty games, which have taken the 60fps direction, and given the (obvious) limits of ROPs and ALUs, the EDRAM basically served as a way to narrow down the target resolution to make hitting that render time easier. 2xMSAA is free on 360 in all respects in this situation. On RSX the ROPs are designed for 2 samples per clock, though there is the bandwidth to consider (and of course, lower res helps mitigate that).

And although using MSAA further reduces the raw number of pixels that fit into EDRAM, it does allow them to do more per pixel while still trying to hit 60fps. Had they targeted 720p sans MSAA, the CoD games would probably not look as nice from a shading perspective... Then, given the sell-through of the title to the masses, I have to wonder how many are even connected to HD displays (let alone properly :p). Most would probably notice the better shaders (even if they don't realize it).
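As a rough sanity check on the "EDRAM narrows the resolution down" point (assuming 8 bytes per sample, i.e. one 32-bit colour target plus 32-bit depth/stencil; the 1024x600 figure is the commonly reported CoD render resolution):

Code:
# Illustrative EDRAM footprint with 2xMSAA, assuming 8 bytes per sample
# (32-bit colour + 32-bit depth/stencil). Figures are illustrative, not from any official spec.
EDRAM_BYTES = 10 * 1024 * 1024

def msaa_footprint(width, height, msaa=2, bytes_per_sample=8):
    return width * height * msaa * bytes_per_sample

for w, h in [(1280, 720), (1024, 600)]:
    size = msaa_footprint(w, h)
    verdict = "fits without tiling" if size <= EDRAM_BYTES else "needs tiling"
    print(f"{w}x{h} @ 2xMSAA: {size / 2**20:.2f} MB -> {verdict}")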
 
Then, given the sell-through of the title to the masses, I have to wonder how many are even connected to HD displays (let alone properly :p). Most would probably notice the better shaders (even if they don't realize it).
This reminds me of a recent case: one of my classmates pretty much always plays COD4 on PS3, and a couple of weeks ago when I went to his place I saw that he had the console connected to an HDTV with the display set to an SD resolution... I was pretty surprised that he never knew he could have it displayed at a higher resolution.
 
Someone should do some "blind tests" for 720p vs near-720p resolutions. I wonder how many of our average super-critical gaming forum users would be able to recognize the version that doesn't reward them with enough pixels.
 
In the case of Crysis 2, most of the blur was coming from the post-process AA rather than the HD upscaling itself. Halo Reach, which had a similar resolution, looked sharper and comparable to native 720p.
 
Someone should do some "blind tests" for 720p vs near-720p resolutions. I wonder how many of our average super-critical gaming forum users would be able to recognize the version that doesn't reward them with enough pixels.

Do the same test with 720p no-AA versus SD 4xAA (and better shaders), at the same viewing distance and with no zoom or Photoshop tools ;)
 
Single-console optimization for the SKU that ends up selling the worst while being the most difficult to work on makes it a hard sell for third-party developers. There's a finite time and budget cost for these things.

You also have to look at it the other way. There's quite a crowd on each side of the fence that only has one console. So "parity" doesn't matter to them. But if a game performs badly, it will sell worse. Though after a certain amount of work, performance reaches diminishing returns, no matter how many man-hours you put into it.
 
Do the same test with 720p no-AA versus SD 4xAA (and better shaders), at the same viewing distance and with no zoom or Photoshop tools ;)
That one is extremely obvious. I played Warhawk initially on SD, which was very clean but not at all clear. 720p is definitely needed. However, I don't ever play it on a native display. I'm either on a 1366x768 TV or a 1680x1050 screen (letterboxed to 16:9; thank goodness Samsung realised the need for 16:9 support on a 16:10 display in time for me!). 720p looks clear and gorgeous, and I have to stop and look at a 1080p game, like PJMonsters, to notice any advantage. In a typical fast game I doubt many people can notice the real difference. And considering movies and TV programmes are rarely that sharp themselves, the pursuit of resolution really is something of a wild goose chase IMO.

Okay, this is taking this analysis thread off track, for which I reprimand myself! ;) The take-home point is that pixel counting is a point of interest for looking at the technical issues of rendering realtime graphics on the consoles. It shouldn't be used as a quality benchmark in itself, and people shouldn't fall into the common trap of comparing numbers in isolation, such as assuming a 15-megapixel camera must be better than a 10-megapixel one.
 
According to the DF interview, it seems they mainly had memory issues with the resolution on the PS3, whereas it was a performance issue that reduced the resolution on the 360 (AF was limited to 4x, and tiling causes a performance hit, right?).
They said they were actually CPU bound on the 360, and there was still some performance left on the GPU.
 
They said they were actually CPU bound on the 360, and there was still some performance left on the GPU.
They said the GPUs were comparable, except on the vertex side. I did see the part about them being CPU bound on the 360 and running out of memory on the PS3. If there was GPU performance left, why would they say the 360's AF was capped at 4x due to performance? That wasn't in the DF article/interview; that was in the Crytek PDFs Scofield provided in the CryEngine 3 thread.
 
They said the GPUs were comparable, except on the vertex side. I did see the part about them being CPU bound on the 360 and running out of memory on the PS3. If there was GPU performance left, why would they say the 360's AF was capped at 4x due to performance? That wasn't in the DF article/interview; that was in the Crytek PDFs Scofield provided in the CryEngine 3 thread.
Yes, I'm referring to the PDF their graphics engineer wrote. He may be right, since MP on 360 ran flawlessly, but when AI comes into play it drops frames, mostly in the first third of the game; after that, interestingly, things get much better even though the game gets even bigger (the night level shown at E3 2010, for example). Maybe it was more of an optimization issue in the first half of the game...
 
720p should not be considered a magical number that guarantees image quality. 1152x720 is already very close to 1280x720.

Almost no digital TV sets have been sold with native 720p (1280x720) support. Some old CRT-based HDTVs in the US supported native 720p, and first-generation plasmas had 1024x768 resolution. Basically all HD Ready TVs have been 1366x768 for a long time. So 720p output always requires scaling, for both HD Ready and Full HD televisions. When played on a 1080p Full HD TV, upscaled 1280x720 and 1152x720 are very hard to distinguish (both get equally blurred by the upscaling). I think that Crytek (and most other AAA game studios recently) have made a good choice by optimizing their resolution to match the EDRAM size (you get a performance improvement from both the reduced pixel count and the reduced geometry passes, and you can use that extra performance to make the game look better and run better). Resolution is just one aspect of graphics quality. I personally would even go as low as 333x480 (VHS resolution) if we could make games look as good as old VHS videos. I prefer fewer good-looking pixels over many bad-looking ones.

I think the resolution compromise is something we can easily live with, but knowing that the engine completely 'ignores' the SPUs and a good amount of RAM for graphics is something I find really 'irritating'. I mean, we're talking about many years of development, and this is supposed to be the engine that pushes the PS3 hardware to its maximum? It pushes the RSX, of course, but isn't leaning on the GPU alone the typical multiplatform approach that developers have been trying to move beyond for years? I don't know how long Frostbite 2 has been in development, but it seems a lot less than CryEngine 3 indeed...
 
720p should not be considered a magical number that guarantees image quality. 1152x720 is already very close to 1280x720.

Almost no digital TV sets have been sold with native 720p (1280x720) support. Some old CRT-based HDTVs in the US supported native 720p, and first-generation plasmas had 1024x768 resolution. Basically all HD Ready TVs have been 1366x768 for a long time. So 720p output always requires scaling, for both HD Ready and Full HD televisions. When played on a 1080p Full HD TV, upscaled 1280x720 and 1152x720 are very hard to distinguish (both get equally blurred by the upscaling). I think that Crytek (and most other AAA game studios recently) have made a good choice by optimizing their resolution to match the EDRAM size (you get a performance improvement from both the reduced pixel count and the reduced geometry passes, and you can use that extra performance to make the game look better and run better). Resolution is just one aspect of graphics quality. I personally would even go as low as 333x480 (VHS resolution) if we could make games look as good as old VHS videos. I prefer fewer good-looking pixels over many bad-looking ones.

I've seen the comparison of Crysis 2 1280x720 PC shots vs 1152x720 360 shots, and the difference is very noticeable to me at least; the poor AA implementation didn't help either.
I also disagree on graphical features over resolution, especially when you have a fairly decent-sized HDTV. You can have all the shaders and fancy lighting in your game, but if the resolution is so low, the end result just looks blurry and pixelated on your screen, which voids the effort put into the graphics.
I've seen quite a few games that strike the right balance between resolution, features and frame rate without falling into the sub-HD realm, but of course all developers have different goals and preferred approaches.
 
I think the "sub HD" dilema is more of a problem with fixed pixel displays. Call of Duty looks crisp on my HD CRT compared to my 1080p LCD. Could be mostly scaling low res images too.
 
I've seen the comparison of Crysis 2 1280x720 PC shots vs 1152x720 360 shots, and the difference is very noticeable to me at least; the poor AA implementation didn't help either.
I also disagree on graphical features over resolution, especially when you have a fairly decent-sized HDTV. You can have all the shaders and fancy lighting in your game, but if the resolution is so low, the end result just looks blurry and pixelated on your screen, which voids the effort put into the graphics.
I've seen quite a few games that strike the right balance between resolution, features and frame rate without falling into the sub-HD realm, but of course all developers have different goals and preferred approaches.
Disagree. The resolution is not "so low" in comparison with other games; it's short of full 720p, but I would bet it's about average for what this gen gave us. And yeah, pixelated and blurry over, say, a 10% difference? Not really. A bigger difference in IQ would come from going from TV to TV and comparing, since all of them operate on different settings. Most people don't even know how to set up a TV properly, and the difference between TV settings would probably be bigger than a 10% resolution difference.
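For reference, the "10%" figure is just the pixel-count ratio; a quick check:

Code:
full_720p = 1280 * 720   # 921,600 pixels
crysis2   = 1152 * 720   # 829,440 pixels
print(f"{100 * (1 - crysis2 / full_720p):.0f}% fewer pixels")  # -> 10% fewer pixels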
 