*spin-off* GoWIII: is it 1080p?

He said it will be 1080p, supporting 1080i and 1080p.

How they manage it can be debated, but I think it would be prudent to wait and actually see something of the (near) final product before proclaiming what they will do.


"Supporting" 1080p & "Rendering" at 1080p are different cases.

Supporting basically means that older HDTVs will be able to display the game in HD.
Many old HDTVs only support 1080i and SD resolutions, so a 720p signal won't get displayed and people would be forced to play in SD. To avoid this, they upscale the 720p image to 1080p, and once that's done the HDTV can display it. That's "supporting" 1080p for you!
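
Purely to put rough numbers on that distinction (a back-of-the-envelope sketch, nothing official about how any particular console's scaler works):

[code]
# "Supporting" 1080p by upscaling a 720p render: the extra pixels in the
# output signal are interpolated by the scaler, not rendered by the GPU.
render = (1280, 720)     # what the game actually draws
signal = (1920, 1080)    # what the TV is sent / "supports"

print(signal[0] / render[0], signal[1] / render[1])
# 1.5 1.5 -> each frame is stretched 1.5x per axis

print((render[0] * render[1]) / (signal[0] * signal[1]))
# ~0.44 -> less than half the pixels in the 1080p signal were actually rendered
[/code]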
 
I will side with 720p with 2xAA and chuck in as much eye candy as possible, be it 30 or 60 fps. Going 1080p is just not worth it if it means trading away visuals, especially on HDTVs.
 
:yep2: That's the way to go! I want a gorgeous title with super fun gameplay, just the way GOW2 was! 720p is enough for pleasant viewing on an HDTV! Put all the effort into gameplay, set pieces, and eye candy!
 
I'll take it one step further, my friend.

Recently I watched the Jurassic Park movie on DVD at 480p and felt the graphics were great. Seeing the Tyrannosaurus Rex at 480p, I said, "even at 480p she looks very realistic and scary."

I would prefer to make the game 480p with 4xMSAA, heavy AF, 60fps, and extra-quality shaders and textures. I feel this would look much better than 1080p or 720p with lesser shaders and textures and also less AF and AA.

I feel the overall look is more important than resolution. Maybe for GT5 high res is fine because it already looks great, but God of War 3 still looks like a cartoon, not real.
 
I want framerate first. A cap at 30fps would be very disappointing. A few dips here and there in the odd scenic moment are understandable, but combat action games should strive for 60fps.
 
Recently I watched the Jurassic Park movie on DVD at 480p and felt the graphics were great. Seeing the Tyrannosaurus Rex at 480p, I said, "even at 480p she looks very realistic and scary."

I would prefer to make the game 480p with 4xMSAA, heavy AF, 60fps, and extra-quality shaders and textures. I feel this would look much better than 1080p or 720p with lesser shaders and textures and also less AF and AA.

I have been saying this since 2005 :smile: I want to see a cutting edge game at 480p Widescreen with 4xMSAA and great filtering and the devs to go nuts with lighting, shadowing, and material shaders and a ton of particles and transparency based objects (like grass!). Throw in some hot texture streaming for high fidelity textures and I will take this over a bland 1080p game in most cases. This may even allow for some of the nicer looking GI and AO hacks to be possible.
 
I want framerate first. A cap at 30fps would be very disappointing. A few dips here and there in the odd scenic moment are understandable, but combat action games should strive for 60fps.

The problem with 30fps is, ironically, the lack of 30fps. Most games that aim for 30fps tend to have big drops throughout gameplay. Having a 10fps drop from 60fps isn't as bad as watching your 30fps game dip into the 20s and below in a heavy action area (when you need it the most!). 60 would certainly be ideal, even at the expense of resolution.
 
Having a 10fps drop from 60fps isn't as bad as watching your 30fps game dip into the 20s and below in a heavy action area.

Well, one is a ~3.3ms increase in frame time, the other is ~16.6ms...
(Which, incidentally, is why everyone is going soft v-sync these days.)
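
For anyone who wants the arithmetic behind those two numbers spelled out (a quick sketch; frame time is just 1000/fps in milliseconds):

[code]
# Frame-time cost of the two drops being compared above.
def frame_time_ms(fps):
    return 1000.0 / fps

for before, after in [(60, 50), (30, 20)]:
    delta = frame_time_ms(after) - frame_time_ms(before)
    print(f"{before} -> {after} fps: +{delta:.1f} ms per frame")

# 60 -> 50 fps: +3.3 ms per frame
# 30 -> 20 fps: +16.7 ms per frame
[/code]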
 
I have been saying this since 2005 :smile: I want to see a cutting edge game at 480p Widescreen with 4xMSAA and great filtering and the devs to go nuts with lighting, shadowing, and material shaders and a ton of particles and transparency based objects (like grass!). Throw in some hot texture streaming for high fidelity textures and I will take this over a bland 1080p game in most cases. This may even allow for some of the nicer looking GI and AO hacks to be possible.

I was in the same boat as you guys. However, after going from DVD to Blu-ray recently, I am not sure I feel the same anymore.

However, I wonder why they do not offer this as an option in games at all: better texture filtering (AF + trilinear) + 4xAA @ 480p vs. worse texture filtering + 0/2xAA @ 720p. I would not mind comparing the two. We could maybe test this on PCs. Anyone willing to do that? :)
 
I have been saying this since 2005 :smile: I want to see a cutting edge game at 480p Widescreen with 4xMSAA and great filtering and the devs to go nuts with lighting, shadowing, and material shaders and a ton of particles and transparency based objects (like grass!). Throw in some hot texture streaming for high fidelity textures and I will take this over a bland 1080p game in most cases. This may even allow for some of the nicer looking GI and AO hacks to be possible.

I vote yes!!! I'm really interested in seeing the result.

Is it possible to test this with some PC games on an SDTV, or via a capture? I don't know whether any PC games offer the kinds of features you describe at SD resolutions.
 
I'm pretty sure Carmack made some comments about making better use of fewer pixels rather than pushing more and more of them. I have no idea where I'd find them again.

I pretty much agree. I'd rather see God of War 3 at 720p than at 1080p with reduced effects. It's no question going to be a 720p game anyway.
 
However, I wonder why they do not offer this as an option in games at all: better texture filtering (AF + trilinear) + 4xAA @ 480p vs. worse texture filtering + 0/2xAA @ 720p. I would not mind comparing the two. We could maybe test this on PCs. Anyone willing to do that? :)

Sure. I chose HL2 because the effect of AF is easier to see and edge jaggies are easier to spot, etc. Mind you, I use a 20" monitor with a native 1680x1050 resolution, so 848x480 looks extremely blurry while 1280x720 is passable and higher is much better.

I scaled the 848x480 image to 1280x720 for comparison's sake; however, I included the original-size image for anyone who wants to upscale it with a different algorithm (one way to do that is sketched after the shots below).

848x480 upscaled to 1280x720, 4xAA, TSAA, 16xAF
http://img186.imageshack.us/img186/8351/848x4804xaatsaa16xafa.jpg

1280x720, 2xAA, 0xAF
http://img186.imageshack.us/img186/737/1280x7202xaa0xaf.jpg

1280x960, 2xAA, 0xAF
http://img512.imageshack.us/img512/3883/1280x9602xaa0xaf.jpg

---------------

848x480 not upscaled, 4xAA, TSAA, 16xAF
http://img255.imageshack.us/img255/7227/848x4804xaatsaa16xaf.jpg
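
For anyone who wants to redo the upscale from the original-size shot with a different algorithm, here is one possible way to do it (a minimal sketch assuming Python with Pillow; the filenames are hypothetical and the resampling filter may well differ from whatever produced the shots above):

[code]
# Upscale the native 848x480 screenshot to 1280x720 for comparison.
from PIL import Image

src = Image.open("848x480_4xaa_tsaa_16xaf.jpg")   # hypothetical filename
up = src.resize((1280, 720), Image.LANCZOS)       # try BILINEAR/NEAREST to compare scalers
up.save("848x480_upscaled_to_1280x720.jpg", quality=95)
[/code]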
 
The 720p looks best IMO; it's just too blurry at 480p.

Edit: Thanks for the pics Nebula.
 
Yeah, a lot of fine-grained detail is lost.

But all you did was up the AA and AF, and take a shot at an angle where the benefit of AF is pretty nominal :???: Keep in mind that many, many games don't even have MSAA at 720p, while both the 360 (no tiling, and very cheap MSAA on a single tile) and the PS3 should spit out 4xMSAA at 480p with no issues.

The concept I was proposing was not only to gain AA and AF (IQ) but also to increase the memory footprint and GPU cycles that could be spent on shaders and textures. If you are GPU limited and running 30fps at 720p, roughly halving your pixel count allows more work per pixel.
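
To put rough numbers on that (back-of-the-envelope, assuming a purely fill/shader-bound case):

[code]
# Pixel budget: dropping from 720p to 848x480 in a GPU-limited scenario.
px_720p = 1280 * 720   # 921,600 pixels per frame
px_480p = 848 * 480    # 407,040 pixels per frame

print(px_720p / px_480p)
# ~2.26 -> roughly 2.3x the shading/fill budget per pixel at the same framerate
[/code]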

Take the same screenshot (this time standing on the ground so the ground is at a more acute angle), halve the texture resolution, cut the shadows from high to medium, cut model detail in half, run no MSAA, etc., and I think it would be more akin to what I am proposing for a GPU-limited scenario.
 
I was in the same boat as you guys. However, after going from DVD to Blu-ray recently, I am not sure I feel the same anymore.

That is because the source material is the same either way. What is being proposed here is that the fidelity of the source material would be higher, at the cost of output resolution.
 
But all you did was up the AA and AF, and take a shot at an angle where the benefit of AF is pretty nominal :???: Keep in mind that many, many games don't even have MSAA at 720p, while both the 360 (no tiling, and very cheap MSAA on a single tile) and the PS3 should spit out 4xMSAA at 480p with no issues.

I redid that scene close to the ground. The 848x480 shot uses the highest settings and the 1280x720 shots use medium/low settings. However, the FC2 shots might be more in line with your thoughts.

[strike]Scratch it I think AF was on for both...[/strike]

EDIT: Fixed.

848x480 upscaled to 1280x720 highest settings 4xAA, TSAA, 16xAF
Native 1280x720 shots 2xAA, 0xAF and medium/low settings
Native 1280x720 shots 0xAA, 0xAF and medium/low settings

The concept I was proposing was not only to gain AA and AF (IQ) but also to increase the memory footprint and GPU cycles that could be spent on shaders and textures. If you are GPU limited and running 30fps at 720p, roughly halving your pixel count allows more work per pixel.

Take the same screenshot (this time standing on the ground so the ground is at a more acute angle), halve the texture resolution, cut the shadows from high to medium, cut model detail in half, run no MSAA, etc., and I think it would be more akin to what I am proposing for a GPU-limited scenario.

I did Far Cry 2 too, with the highest settings for the upscaled screenshot and flat medium for the native 1280x720.

848x480 res upscaled to 1280x720, highest settings, 4xAA, TSAA and 16xAF
native 1280x720 res medium settings with 2xAA, 2xAF
Native 1280x720 res medium settings with 0xAA, 2xAF
 
In the Far Cry 2 screenshots, I prefer the upscaled 480p by far over the 720p at medium settings.

HL2 doesn't seem to have a major quality difference between highest/medium.

I suppose that is the real answer. If you set out to make a game that really pushes the graphics, as Crytek are perhaps known for doing, you might see a large difference between medium and high settings, and at that point you might be better off dropping the resolution for the highest settings possible.

However, for something like HL2/Source, where the difference between medium and high isn't big, you might be better off taking the resolution sharpness over subtle IQ/effects differences.

The problem really sits with the fanboys who just can't let go of silly manufacturer promises. Of course, as CoD4 onwards has shown, you can weather such petty storms and still dominate as long as you make the compromises in the right places. The biggest-selling games of this gen, CoD4/WaW/6/Halo 3/MGS4, have done quite well being "sub-HD".
 