Activision's 360 games not running in full 720p?


Do a search on the forums. Bit-tech misreported it based on what all the other sites had. The spec sheet simply said "Anti-Aliasing", and numerous other sources had said 2xMSAA. In the interviews immediately following E3 (and at E3) with ATI and MS representatives (see Anandtech, HardOCP, ExtremeTech, FiringSquad, etc.) it was laid out that 2xMSAA was "free", but they really encouraged 4xMSAA because of the low overhead (1-5%; in retrospect the comment "but it's not what you'd associate normally with 4x multisample AA" takes on a whole new meaning, i.e. "you have to design for it to get this sort of performance"). Anyhow, that link shows that you don't *encourage* developers to use 4xMSAA if it is actually required.

After some prodding in the months immediately after E3 it was made further clear that if a developer opted for DOF, motion blur, or other techniques that could minimize edge aliasing, those too could be acceptable. Yet I believe a developer here noted that you would be pushing your luck trying to get your app through validation without anti-aliasing of some sort.

That said, this seems to demonstrate the gap between the big heads at MS PR and the reality of their developer network. Their biggest game in the first 12 months, Gears of War (well established at E3 2005 and reaffirmed at E3 2006), clearly was not going to have MSAA. MGS also went the route of not bankrolling "ground up" engines, instead financing Mass Effect, Too Human, Lost Odyssey, Crackdown, etc. on UE3, which would not have MSAA. So the bulk of their initial 12-month offerings were using an engine that effectively would not support this bullet point.

Someone at MS had to know that many of the rendering techniques either in use (e.g. some games had no early Z pass) or the software in development (like UE3) were not going to work out, at least not at first; that using a tiled framebuffer -- something not present on the PC side, where much of MS's support comes from -- could take a couple of years to get proper support; and that in general multiplatform development could pose some hurdles. Obviously the marketing heads won. eDRAM is nice for a lot of reasons (even cost reduction), and games without MSAA benefit from it, but you sometimes wonder about MS PR. They leak like crazy, you have crazy people like the gal at E3 telling Epic to take out the chainsaw, and simple things like technical bullet points which, in hindsight, MS should have known would become a big issue.
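To put rough numbers on the tiling issue: the sketch below (my own back-of-the-envelope figures, assuming the commonly cited 8 bytes per sample, i.e. a 32-bit colour buffer plus 32-bit depth/stencil) shows why 720p with MSAA doesn't fit in Xenos' 10 MB of eDRAM in one pass.

```python
# Why 720p + MSAA needs tiling on the 360, assuming 8 bytes per sample
# (32-bit colour + 32-bit depth/stencil). These are rough figures, not specs.
EDRAM_BYTES = 10 * 1024 * 1024  # the 360's 10 MB eDRAM daughter die

def framebuffer_bytes(width, height, samples, bytes_per_sample=8):
    """Total eDRAM needed: every MSAA sample stores colour + depth/stencil."""
    return width * height * samples * bytes_per_sample

def tiles_needed(width, height, samples):
    """Minimum number of screen tiles so that each tile fits in eDRAM."""
    return -(-framebuffer_bytes(width, height, samples) // EDRAM_BYTES)

for samples in (1, 2, 4):
    mb = framebuffer_bytes(1280, 720, samples) / 2**20
    print(f"720p {samples}xAA: {mb:.1f} MB -> "
          f"{tiles_needed(1280, 720, samples)} tile(s)")
```

Under those assumptions, 720p with no AA fits in a single tile (~7 MB), 2xAA needs two tiles, and 4xAA needs three -- which lines up with the tile counts usually quoted, and is exactly the extra work UE3-style deferred approaches weren't initially set up for.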
 
As long as it has AA, it's difficult to tell whether the image is upscaled just by looking at a TV screen, since the pixel edges can't be counted. If cynamite.de did frame grabs with a review kit, then they are correct.
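An idealised sketch of why frame grabs settle the question: if an image were upscaled with simple nearest-neighbour filtering, the duplicated pixel columns would betray the native width. (The 360's hardware scaler actually filters, so real analysis counts stair-step widths on hard edges instead -- this toy example, with made-up helper names, only illustrates the principle.)

```python
# Toy demonstration: recover the native width of a nearest-neighbour
# upscaled row by counting runs of identical adjacent pixels.

def nn_upscale_row(row, out_width):
    """Nearest-neighbour horizontal upscale of a single row of pixels."""
    native = len(row)
    return [row[i * native // out_width] for i in range(out_width)]

def estimate_native_width(row):
    """Count runs of identical adjacent pixels to estimate the source width."""
    return 1 + sum(1 for a, b in zip(row, row[1:]) if a != b)

native_row = list(range(1024))                 # 1024 unique pixel values
scaled_row = nn_upscale_row(native_row, 1280)  # stretched to a 720p width
print(len(scaled_row), estimate_native_width(scaled_row))  # 1280 1024
```

The point is simply that a lossless grab preserves this kind of evidence, while a photo of a TV screen does not.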

Will PGR4 use real 720p?
 
I'm fairly sure that Dead Rising has 4xAA (and LP for that matter). I haven't checked out any other games recently though.
Is this 4xAA? (actual MSAA mini buffer on the right)
[Attachment: 33625-ca03.jpg]
 
You tell me? Grainy JPEG photo of a slide... :???:

I don't read Japanese, but I'm guessing that slide doesn't mention a number.
 
Tap In said:
some games are 4x AA 720p.... perhaps someone with more info than me (and time) will tell us which ones.
Dead Rising, Lost Planet, Ghost Recon, Splinter Cell... I think there are quite a few games with 4xAA.

Now, about COD3: I think that (if true) this may have something to do with the fact that the test version (the version all the screenshots and videos are coming from) looks so rough, while the final version looks like it's from a different universe in terms of image smoothness. At least this is what people who have played the game say.
 
Now, about COD3: I think that (if true) this may have something to do with the fact that the test version (the version all the screenshots and videos are coming from) looks so rough, while the final version looks like it's from a different universe in terms of image smoothness. At least this is what people who have played the game say.

I really don't think that the "test version" just had a lower resolution for fun (if it had that at all) and then they suddenly raised it in the final... This is not the first time games are supposed to run at a lower resolution on the 360 (I think PGR3 was the first one to do so).

The reason for lowering the resolution on the 360 is quite simply - to avoid tiling.

And judging CoD3 just by its "good looks" doesn't prove anything about resolution, since AA can fake that look very well.
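The tiling motive is easy to quantify. A quick check, under the same rough assumption of 8 bytes per sample (32-bit colour + 32-bit depth/stencil), of whether a given resolution and AA combination fits in 10 MB of eDRAM without tiling -- with 1024x600 being PGR3's widely reported internal resolution:

```python
# Does a resolution + AA combination fit in the 360's 10 MB eDRAM untiled?
# Rough assumption: 8 bytes per sample (colour + depth/stencil).
EDRAM_BYTES = 10 * 1024 * 1024

def fits_untiled(width, height, samples, bytes_per_sample=8):
    return width * height * samples * bytes_per_sample <= EDRAM_BYTES

print(fits_untiled(1280, 720, 2))  # ~14.1 MB -> False: 720p 2xAA needs tiling
print(fits_untiled(1024, 600, 2))  # ~9.4 MB  -> True: fits without tiling
```

So dropping from 1280x720 to something like 1024x600 while keeping 2xAA lets the whole frame sit in eDRAM in one pass, which is presumably the temptation being described.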
 
The reason for lowering the resolution on the 360 is quite simply - to avoid tiling.

If you know the answer, why ask it in the original post?

I wonder...

For the other people posting in this thread 'worrying' about the IQ of 360 games: I have an HD projector hooked up to the machine, and it's fine. Really, you should try it some time.
 
And judging CoD3 just by its "good looks" doesn't prove anything about resolution, since AA can fake that look very well.

2xMSAA didn't hide edge aliasing in PGR3 very well. By the time it got stretched out, the aliasing was pretty noticeable in high-contrast areas. I don't know what internal resolution COD3 uses, but if it is using 2xMSAA we should be able to spot some pretty obvious aliasing, especially if it is being scaled.
 
I really don't think that the "test version" just had a lower resolution for fun (if it had that at all) and then they suddenly raised it in the final... This is not the first time games are supposed to run at a lower resolution on the 360 (I think PGR3 was the first one to do so).
No, they don't use a lower res for fun (again, if this is true), but although I'm not a dev, I suppose there are generally quite a few reasons for a test/beta version not to run with the final settings.

As for PGR... Actually we have one launch game (PGR3) running lower than the official/standard resolution, and two or three other (non-launch) games running higher than 720p. So I can't complain, since that game remains a launch exception.

The reason for lowering the resolution on the 360 is quite simply - to avoid tiling.
At the moment all we can do is speculate, so yes, on the one hand maybe they render at a lower res in order to avoid tiling.
On the other hand, maybe the final version uses tiling, and that is how they can get 720p, 2xAA and 60 frames at the same time.
And judging CoD3 just by its "good looks" doesn't prove anything about resolution, since AA can fake that look very well.
From what I know, AA can improve the look of jaggies but cannot make the texture quality better. In all those pre-release shots and videos, the textures look washed out (as if they are running at a lower resolution), while in the final version they look a lot sharper.

Of course all this is just pure speculation, but then again the same goes for the entire post.
I'm curious to see German Activision's response, though.
 

I remember we discussed this some time ago, and I thought the mainstream conclusion was that it was using no AA, let alone 4x. Not sure about the others in your list though, but that one popped out at me given the debate we had before on it.
 
I remember we discussed this some time ago, and I thought the mainstream conclusion was that it was using no AA, let alone 4x. Not sure about the others in your list though, but that one popped out at me given the debate we had before on it.
You remember it well. I have played PDZ and it has some of the most noticeable jaggies I have seen in a 360 game.

Not sure this looks like 4xMSAA: here and here although the game is very pretty
I haven't touched Oblivion, but I recall Bethesda saying something about 2xAA plus HDR for the 360 version, around the same time they announced that the PC version wouldn't support HDR with AA (?)
 
(1) colour palette: there are few cases where you'd see an ultra bright "anything" against a dark "anything". The contrast just isn't really there to make it extremely obvious.
(2) clever angles: jaggies appear most when edges sit at shallow angles with respect to the horizontal or vertical.
(3) they used motion blur and DOF, which can hide aliasing while the camera or objects are moving
 
amazing looking really.
In this case, lower model resolution with normal mapping is a plus (not that I'm saying GeoW is low poly! Talking about jaggies in general here!). With texture filtering, the more detail you can shift onto textures, the less you have to worry about jaggies. And GeoW certainly uses textures very effectively. That and smart art-direction will go a long way.
 