Why still no 4xAA on 360 games???

Status
Not open for further replies.

RobertR1

If 4xMSAA is "free" on the 360, and now that devs have had time to get accustomed to the eDRAM configuration, why aren't we seeing all the titles with 4xAA applied to them? Is it not "free" after all?
 
RobertR1 said:
If 4xMSAA is "free" on the 360, and now that devs have had time to get accustomed to the eDRAM configuration, why aren't we seeing all the titles with 4xAA applied to them? Is it not "free" after all?

The best answers I have, not being a dev, are:

1) UE3 is the base engine behind most X360 games, and as far as we know right now it doesn't play to the strengths of the eDRAM configuration. Since it's easy to use and widespread (it's part of the SDK), most games can and will continue to use it.

2) Based on the assumption in #1, and the fact that no well-known non-UE3 games have been announced, I think that by this winter you may begin to hear rumblings of games built from the ground up to use the strengths of the eDRAM configuration. If a team got the final devkit last August and began analysis etc., later this year they can probably begin to show some of the results of their work.

Of course, all of this is just my guess.
 
RobertR1 said:
If 4xMSAA is "free" on the 360, and now that devs have had time to get accustomed to the eDRAM configuration, why aren't we seeing all the titles with 4xAA applied to them? Is it not "free" after all?

4xMSAA was never said [by anyone of relevance] to be absolutely free, but to have a negligible performance impact: the ROPs are designed for no slowdown up to 4xMSAA, and the eDRAM provides enough bandwidth not to be a hindrance. But there is a small performance hit with MSAA at 720p due to tiling; ATI estimated this at 1-5%, though it could vary. People may use "free" as shorthand because it is relatively free in terms of the impact on framerate when implemented correctly, but from a technical point of view it does have some cost, dependent on the geometry that spans tiles.

Why don't all games support it? One reason is that the game engine needs to be engineered with the feature in mind. It's akin to slapping HDR or other graphical features on at the back end of development. This is not a PC where you just flip a switch (and even then MSAA does not work with all PC games either). We know that the eDRAM requires an early Z pass in the renderer to make use of tiling. As Mintmaster said about a month ago (do a search on Mintmaster's posts on this issue), engine development takes years, and we won't be seeing engines designed with Xenos exclusively in mind for another year or more. He also pointed out that it can create a lot of work to go back and redesign your engine when the focus is getting a game completed, not just hitting technical checkboxes.
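To put some rough numbers on why tiling enters the picture at all, here is a back-of-the-envelope sketch. The 10 MB eDRAM figure is public; the 4 bytes of colour plus 4 bytes of depth/stencil per sample is my assumption for a typical framebuffer setup:

```python
import math

EDRAM_BYTES = 10 * 1024 * 1024  # Xenos daughter die holds 10 MB of eDRAM

def tiles_needed(width, height, samples, bytes_per_sample=8):
    """Tiles required to fit the framebuffer in eDRAM.

    Assumes 4 bytes colour + 4 bytes depth/stencil per MSAA sample
    (8 bytes total) -- an assumed, but typical, configuration.
    """
    framebuffer_bytes = width * height * samples * bytes_per_sample
    return math.ceil(framebuffer_bytes / EDRAM_BYTES)

for samples in (1, 2, 4):
    print(f"720p, {samples}xMSAA: {tiles_needed(1280, 720, samples)} tile(s)")
```

Under those assumptions a 720p frame fits in one tile with no AA, needs two tiles at 2x, and three at 4x, and geometry that crosses a tile boundary has to be resubmitted for each tile it touches; that resubmission is where the small tiling cost comes from.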

And in the UE3/Gears of War thread, ERP went into some detail on why UE3's use of deferred rendering techniques doesn't play nice with MSAA. Epic has made comments that this would be worked on, but if memory serves they were talking about a late 2007 timeframe.

Basically, 4xMSAA is very fast on the 360, with a negligible performance hit *if* the game is designed with the feature in mind. But the 360 is a console, not a PC, and implementing the feature requires developers to make design decisions around it. It is pretty typical for games to take 2+ years to make, and big games 3 or more. The final silicon is less than a year old. That should put it into perspective.
 
Thanks Acert!

So is it safe to assume that since the RSX is more of a "traditional" GPU, the AA implementation is similar to that on a PC, and thus already incorporated into a lot of the engines, as can be seen from the E3 presentations?

Also, if the RSX is doing the AA, and we know that AA takes away a decent amount of resources from the GPU, would the RSX have enough horsepower to run "next gen" games at 1080p, 60fps and 4xMSAA? From what I can gather from E3, the 720p games had AA implemented, yet GT4 HD in 1080p had no AA. I wonder if it simply ran out of juice at this point?
 
According to the devs behind Forza 2, the game will have 4xMSAA, HDR amongst other effects, and be running at 60fps..
 
I wouldn't expect any 1080p game to have AA. It's a painful cost for very little visible benefit (as far as most people are concerned, I imagine). One could argue 1080p is a painful cost with little benefit too though =o

People without 1080p/i screens will be getting a free AA of sorts anyways if it gets straight downsampled. People with 1080p screens will likely appreciate the fact that it isn't upscaled. It's very hard to see pixels at 1080p on a reasonable sized screen at the recommended viewing distance anyways.
 
RobertR1 said:
Also, if the RSX is doing the AA, and we know that AA takes away a decent amount of resources from the GPU, would the RSX have enough horsepower to run "next gen" games at 1080p, 60fps and 4xMSAA?

NV put a PowerPoint slide on the net after last year's E3 noting how anti-aliasing was "here" in high definition on their hardware already. What they showed was games at 1600x1200 (a similar number of pixels to 1080p) with 4xMSAA enabled. They did this on a dozen or so games and averaged the performance difference in percent. It was pretty small, under 10%.

But when you looked through the list, all of the games were old except Half-Life 2, Doom 3, and Far Cry, and those three all topped a 40% performance penalty, if memory is correct.

The PS3 is a closed box, so they can do some things more efficiently. But next-gen games (i.e. stuff like Killzone, Heavenly Sword, etc.) at 1080p at 60fps with 4xMSAA? I don't think so... but maybe a PS3 developer can comment on whether they are aiming for 1080p at 60fps with 4xMSAA with cutting-edge graphical features. Based on the E3 games, the more graphically intense titles tend to be targeting 720p. And pragmatically you do ~50% less work at 720p compared to 1080p, which means roughly 2x the performance in your bottleneck (if it is related to resolution, e.g. fillrate, shaders, etc.).
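The ~50% figure checks out from the raw pixel counts. A quick sanity check, assuming the standard 1280x720 and 1920x1080 frame sizes:

```python
p720 = 1280 * 720     # 921,600 pixels per frame at 720p
p1080 = 1920 * 1080   # 2,073,600 pixels per frame at 1080p

ratio = p720 / p1080
print(f"720p is {ratio:.1%} of the pixel work of 1080p")
print(f"1080p pushes {p1080 / p720:.2f}x the pixels of 720p")
```

So it is actually a bit better than "half the work": 1080p pushes 2.25x the pixels of 720p, so a purely resolution-bound stage speeds up by 2.25x at the lower resolution, not just 2x.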

Personally I would prefer more games at 720p that look better and run more stably. In fact, if the game looks substantially better, I would be fine with 480p widescreen. Watching a 480p widescreen movie with CGI reminds me that resolution helps, but it does not solve every problem. Watching a Pixar movie, or the FF movie, or a movie like I, Robot which has a bit of CGI, it always strikes me: even at this lower resolution it looks way better than any game shown on the PS3/360 yet. The Halo 3 and MGS4 realtime trailers are starting to blur the line some, but we are still pretty far away. I sometimes wonder if we would not be substantially closer if devs could target lower resolutions. I am sure RSX could handle 4xMSAA at 480p widescreen without complaint. It may even be able to do 8x (if it supports that).
 
But the CG stuff is rendered at super high resolutions and then downscaled, so it benefits from the source material being ridiculously high quality. It's the old garbage in/garbage out idea: you're only as good as your source material.

Can we really look at Pixar movies and say resolution doesn't matter? A Bug's Life was originally rendered at 2048x862, which is very low for a CG movie from what I understand, yet still higher than 720p.
http://www.dvdreview.com/html/dvd_review_-_pixar_interview.shtml
 
Yet there is no way games are going to be rendered at super high resolutions. Supersampling adds a lot of fidelity, but IMO we are dealing with bigger problems in games right now: texture quality, soft shadowing and self-shadowing (especially with dynamic objects), AA as standard, texture filtering. I would take a game with 4xMSAA, high-quality AF, 2x the texture and shader detail, and superb lighting and shadowing models at 480p widescreen over a lot of the stuff we are currently seeing. 480p widescreen has less than half the pixel work of 720p. That means 50% less bandwidth, fill rate, pixel work, etc. If you designed well, you could squeeze in 2x as much detail and fidelity.

1080p requires 5x as many pixels as 480p widescreen. There is a lot you can do with 5x as much graphical muscle sitting around. It's not a perfect comparison, but I would rather play PGR3 at 480p widescreen with AA than PGR2 at 1080p with AA. More often than not, resolution that outpaces game detail actually exposes more flaws and ugliness in a game.
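The 5x figure also holds up arithmetically; here is the same kind of quick check, assuming 854x480 as the 16:9 widescreen variant of 480p:

```python
p480w = 854 * 480    # 409,920 pixels for widescreen 480p (16:9 at 480 lines)
p720 = 1280 * 720    # 921,600 pixels for 720p
p1080 = 1920 * 1080  # 2,073,600 pixels for 1080p

print(f"1080p pushes {p1080 / p480w:.2f}x the pixels of 480p widescreen")
print(f"480p widescreen is {p480w / p720:.1%} of 720p's pixel work")
```

That works out to roughly 5.06x for 1080p over widescreen 480p, and about 44% of 720p's pixel work, which also backs up the "less than half" comparison above.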
 