Can CELL do AA?

lip2lip said:
from work done on movie-scale CGI development, Xbox AA will heavily assist with achieving the results of tiling. PS3 has 512 RAM available for video, and does not need to tile. Because of the negative effect tiling causes on the Xbox, I can see how 2x AA on PS3 would look equal to 4x AA on X360. This is just from the noticeable effect of tiling video's requirements of AA for movie CGI. There is, of course, no way not to tile the video on a cluster box.

AFAIK both PS3 and X360 have 512 MB available for everything. PS3 does not have 512 available for video.

As far as I understand it, tiling on X360 is internal to the eDRAM... you won't see a visual disparity on screen.
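Back-of-envelope on why tiling comes up at all (the 10 MB eDRAM figure is real; the 720p target and the 32-bit colour + 32-bit depth/stencil per sample are just assumptions for the sketch):

Code:
# rough numbers: 720p with 4x MSAA vs the Xenos 10 MB eDRAM
EDRAM_BYTES = 10 * 1024 * 1024

width, height = 1280, 720
samples = 4                      # 4x multisampling
bytes_per_sample = 4 + 4         # assumed: 32-bit colour + 32-bit depth/stencil

framebuffer = width * height * samples * bytes_per_sample
tiles = -(-framebuffer // EDRAM_BYTES)   # ceiling division

print(f"4x MSAA 720p buffer: {framebuffer / 2**20:.1f} MB")   # ~28.1 MB
print(f"eDRAM tiles needed: {tiles}")                          # 3

Geometry that crosses tile boundaries gets processed more than once, which is the tiling overhead being talked about; AA multiplies the buffer size, which is what forces the tiling in the first place.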

Finally, AA on PS3 is not supposed to be free, whereas on X360 it basically is...
 
V3 said:
The 1080p broadcast standard is limited to 30 Hz, but most 1080p sets are happy to do 60 Hz. Besides, the standard will be added in the near future.

I'm pretty sure most "1080 sets" are not 1080p, in the first place. Typically, the 1080 set range falls under 1080i native scan displays. This does not mean that such sets or even typical 720p sets will have nothing to display when it comes to a true 1080p signal, however. You are pretty much guaranteed compatibility with formats as far out as 1080p30, but beyond that (the magic 1080p60), things become more "undocumented". So maybe it will do something with a 1080p60 signal or maybe nothing at all, and it is not really that easy to pin down until you actually try doing it with a particular display.

Naturally, all of the broadcast standard formats up to 1080p30 or 1080i60 will be converted into "something" displayable for that set, regardless of the native output. So that leaves a very select few sets that will actually display 1080p natively, let alone 1080p60.
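Rough sketch of why 1080p60 is the odd one out at a set's inputs (the pixel totals assume the standard CEA-861 blanking intervals):

Code:
# pixel-clock arithmetic, assuming standard CEA-861 blanking totals
formats = {
    "720p60":  (1650, 750, 60),    # total pixels/line, total lines/frame, frames/s
    "1080i60": (2200, 1125, 30),   # interlaced: 60 fields = 30 full frames/s
    "1080p30": (2200, 1125, 30),
    "1080p60": (2200, 1125, 60),
}

for name, (h_total, v_total, frames) in formats.items():
    print(f"{name}: ~{h_total * v_total * frames / 1e6:.2f} MHz pixel clock")

720p60, 1080i60 and 1080p30 all land around 74.25 MHz, while 1080p60 needs roughly double that (~148.5 MHz), which goes a long way toward explaining why the lower formats are guaranteed and 1080p60 is the "undocumented" case.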
 
blakjedi said:
Finally, AA on PS3 is not supposed to be free, whereas on X360 it basically is...

I remember this being said for XB, as well. Something for "free" should always be viewed as suspect (a lot of times, it's not any more free than something else that is not free).
 
randycat99 said:
blakjedi said:
Finally, AA on PS3 is not supposed to be free, whereas on X360 it basically is...

I remember this being said for XB, as well. Something for "free" should always be viewed as suspect (a lot of times, it's not any more free than something else that is not free).

It's part of the design spec for the Xenos... I think it really is free, as opposed to speculation.
 
blakjedi said:
It's part of the design spec for the Xenos... I think it really is free, as opposed to speculation.

What Randy's saying I think is that this 'free' item cost 100 million transistors - so it's 'free', but only within the context of not burdening the main chip, not in terms of coming out of nowhere at no cost.
 
Cell probably wouldn't be the best chip to do AA on, and I am sure RSX can handle it fine; the real question is what the hit to performance is when it is turned on.
 
randycat99 said:
I remember this being said for XB, as well. Something for "free" should always be viewed as suspect (a lot of times, it's not any more free than something else that is not free).
The Xbox's "free" AA was only free when the bottleneck was not somewhere else. IIRC, if you had enough memory bandwidth, it was free. That caveat shouldn't apply to the X360, though: the whole system is now designed to do it for free, instead of just the GPU being designed to do it for free. But yeah, we'll have to see.

And I use the word "free" here simply to mean "little to no performance hit." I don't mean that it costs no design work, transistors, etc.
 
ATI said 4xAA caused a 1-4% performance hit, so it was basically free and they said any dev would be "crazy" not to use it.

How much does a typical GPU suffer from 4xAA?? Anyone know a ballpark for the G70??
 
scooby_dooby said:
ATI said 4xAA caused a 1-4% performance hit, so it was basically free and they said any dev would be "crazy" not to use it.

How much does a typical GPU suffer from 4xAA?? Anyone know a ballpark for the G70??

A typical GPU's performance hit with AA varies from game to game and with the in-game settings... assuming bandwidth is maxed out, I believe it's a 75% performance hit for enabling 4xAA.
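For a rough feel of where that kind of hit comes from on a conventional GPU (the numbers are assumptions: 720p60, roughly 2x overdraw, 32-bit colour and Z, and no framebuffer compression):

Code:
# framebuffer traffic with and without 4x MSAA on a conventional GPU
# (assumes 720p60, ~2x overdraw, 4-byte colour + 4-byte Z, no compression)
width, height, fps = 1280, 720, 60
overdraw = 2.0

def fb_traffic_gb_per_s(samples):
    per_pixel = samples * (4 + 4)          # colour + Z per sample
    return width * height * overdraw * per_pixel * fps / 1e9

print(f"1x AA: {fb_traffic_gb_per_s(1):.1f} GB/s")   # ~0.9 GB/s
print(f"4x AA: {fb_traffic_gb_per_s(4):.1f} GB/s")   # ~3.5 GB/s

On a normal GPU that traffic competes with texture and vertex fetches on the same memory bus, so if you are already bandwidth-bound the frame rate can fall off hard; Xenos keeps the multisampled buffer in eDRAM, which is where the "1-4%" claim comes from.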
 
scooby_dooby said:
ATI said 4xAA caused a 1-4% performance hit, so it was basically free and they said any dev would be "crazy" not to use it.

How much does a typical GPU suffer from 4xAA?? Anyone know a ballpark for the G70??

I believe ATI is talking about it being bandwidth-free, and that is a result of the eDRAM. Now, fill rate will most certainly take a hit using FSAA.
 
jvd said:
scooby_dooby said:
ATI said 4xAA caused a 1-4% performance hit, so it was basically free and they said any dev would be "crazy" not to use it.

How much does a typical GPU suffer from 4xAA?? Anyone know a ballpark for the G70??

I believe ATI is talking about it being bandwidth-free, and that is a result of the eDRAM. Now, fill rate will most certainly take a hit using FSAA.

How will fill rate take a hit if it's using multisampling and not supersampling?
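A toy sketch of the distinction being asked about (the function names here are made up, purely for illustration): multisampling shades each pixel once and only replicates the result into the covered samples, while supersampling shades every sample.

Code:
# toy illustration, not real GPU code: MSAA shades once per pixel,
# SSAA shades once per sample
SAMPLES = 4

def shade(colour):
    # stand-in for the expensive pixel shader work (texturing, lighting, ...)
    return colour

def multisample_pixel(colour, coverage):
    result = shade(colour)                      # shader runs ONCE per pixel
    return [result if hit else None             # replicating into covered samples
            for hit in coverage]                #   is ROP/storage work, not shading

def supersample_pixel(colours):
    return [shade(c) for c in colours]          # shader runs once per SAMPLE

print(multisample_pixel("red", [True, True, False, True]))
print(supersample_pixel(["red"] * SAMPLES))

So the texture/shader fill rate stays roughly the same under MSAA; what grows is the per-sample colour/Z storage and blending, which on Xenos is handled by the eDRAM logic.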
 
jvd said:
scooby_dooby said:
ATI said 4xAA caused a 1-4% performance hit, so it was basically free and they said any dev would be "crazy" not to use it.

How much does a typical GPU suffer from 4xAA?? Anyone know a ballpark for the G70??

I believe ATI is talking about it being bandwidth-free, and that is a result of the eDRAM. Now, fill rate will most certainly take a hit using FSAA.

Aren't the Xenos' eDRAM blenders supposed to take care of it fully? Without bandwidth or fill rate hits? Toss the scene (or a section of the scene) to the eDRAM and it does the AA entirely on its own?

If it still had the normal fill rate hit, I don't see how they could claim it's only 1-5% on average.
 
Fox5 said:
scooby_dooby said:
ATI said 4xAA caused a 1-4% performance hit, so it was basically free and they said any dev would be "crazy" not to use it.

How much does a typical GPU suffer from 4xAA?? Anyone know a ballpark for the G70??

A typical GPU's performance hit with AA varies from game to game and with the in-game settings... assuming bandwidth is maxed out, I believe it's a 75% performance hit for enabling 4xAA.
Do you mean 4X would cause a 60 fps app to drop to 15 fps?
 
ralexand said:
Fox5 said:
scooby_dooby said:
ATI said 4xAA caused a 1-4% performance hit, so it was basically free and they said any dev would be "crazy" not to use it.

How much does a typical GPU suffer from 4xAA?? Anyone know a ballpark for the G70??

A typical GPU's performance hit with AA varies from game to game and with the in-game settings... assuming bandwidth is maxed out, I believe it's a 75% performance hit for enabling 4xAA.
Do you mean 4X would cause a 60 fps app to drop to 15 fps?

If it's bandwidth limited without AA then wouldn't it? Maybe pretty close to that kind of drop anyhow.
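For what it's worth, the arithmetic behind that reading (assuming the whole frame scales with whatever takes the 75% hit):

Code:
# a 75% hit leaves a quarter of the throughput, if nothing else limits the frame
base_fps = 60
hit = 0.75

new_fps = base_fps * (1 - hit)
print(f"{base_fps} fps -> {new_fps:.0f} fps "
      f"({1000/base_fps:.1f} ms -> {1000/new_fps:.1f} ms per frame)")
# 60 fps -> 15 fps (16.7 ms -> 66.7 ms per frame)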
 
If it's bandwidth limited without AA then wouldn't it?
It would require the game to be 100% GPU limited for one. Not to mention it kinda implies something is horribly wrong with GPU cache behaviour.

randycat99 said:
You are pretty much guaranteed compatibility with formats as far out as 1080p30, but beyond that (the magic 1080p60), things become more "undocumented"
Well, I'm yet to see an LCD set that refreshes at less than 60 Hz (regardless of the resolution). And HDTV CRTs are more often than not higher than 60 (even at 1080).
 
...but the LCD HDTVs that actually output 1080 native (rather than 720-ish) are few and far between, if any at all. 720p60 is certainly easy to come by, for sure.

As for CRT HDTVs, I was not aware of ones that scan higher than your typical 1080i60, but if you say so.

I feel it bears pointing out (for the general audience) not to confuse allowable input formats/refreshes with what the final output is on a particular display device. For example, you can feed some 1080p60 program into your LCD panel, and if it happens to support the format, you will see something appear on the screen. What appears on the screen won't necessarily be 1080p60, however. Depending on the panel's native output format, what you will be seeing is the original program downsampled to 720p60 (or 768 or whatever, as the case may be). So it's not exactly the same as if you had a display device that actually displayed the program in its native 1080p60 format. Undoubtedly, there will be HDTVs that will successfully accept a 1080p60 program at their inputs. What they actually show on the output will depend entirely on the particular architecture of the display element.
 
randycat99 said:
...but the LCD HDTVs that actually output 1080 native (rather than 720-ish) are few and far between, if any at all. 720p60 is certainly easy to come by, for sure.
Well, if it's not 1080 native then it isn't really relevant to the discussion of refresh rates for 1080, is it? :p
Anyway, my point was that AFAIK LCD panels pretty much all refresh in the 60-75 Hz range at their native resolution, be that 480p, 1080p, or 1200p/1600p.

Actually, speaking of native resolutions, I thought it was CRTs that usually cop out on horizontal resolution in particular (as opposed to having the full 1920), even though they claim native 1080.
But anyway, I haven't really been following HDTV CRTs as much, so I'm sure someone else can provide more info on that issue.
 
I see you have opened up the range to include computer displays, as well as traditional HDTV units. This is fine, and by all means people are free to view hi-def material on computer displays, but let's not lose sight of the fact that nearly all large screen, wide-format LCD HDTV displays intended for living room usage will be of some 720-768p variety. They will refresh at 60... but they do not spontaneously become capable of 1080p60 output simply because a 1080p60 signal appears at the input (therein occurs the pivotal usage of downscaling to make any image a possibility). By your later comments, I think we are both in agreement on that. This was, by far, what my original comment on the matter was addressing.

As it stands now (barring the most extreme examples of what can be purchased with price as no object), you will find a great preponderance of LCD and plasma HDTVs with native output of 720-768p60, and a great preponderance of CRT HDTVs with native output of 1080i60. 1080p60 programming may certainly find use in any of those displays, but finding a display in that home viewing HDTV venue that actually natively does 1080p60 is still a rare, rare item indeed. The Apple display certainly treads into notable size range for home theater use, but it is still on the relatively small end of the range. The resolution for that size (and likely, the available refresh settings) is beyond reproach, no doubt.
 
blakjedi said:
AFAIK both PS3 and X360 have 512 MB available for everything. PS3 does not have 512 available for video.

Slightly contradictory, but to clarify: RSX has access to all the RAM in the system, as does Cell.
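For the record, the split being referred to (sizes are the publicly stated ones; treat the cross-pool comment as a general caveat rather than exact figures):

Code:
# PS3 memory is split but fully addressable; X360 is one unified pool
ps3 = {"XDR (Cell main memory)": 256, "GDDR3 (RSX video memory)": 256}   # MB
x360 = {"unified GDDR3": 512}                                            # MB

print("PS3 total:", sum(ps3.values()), "MB")    # 512 MB, but in two pools
print("X360 total:", sum(x360.values()), "MB")  # 512 MB in one pool
# RSX can also reach the XDR pool (and Cell the GDDR3 pool), which is the
# "access to all the RAM" point above, but going across the link is not as
# fast as staying in a chip's local pool.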
 