So, the RSX can do AA or HDR... but not both?

Can anyone confirm that the Xenos CAN do BOTH HDR and AA at the same time?

Is the whole reason the G70 can't do both HDR and AA (MSAA) at the same time a bandwidth issue?

Is THAT why the Xenos CAN do both... because the eDRAM takes care of the AA, so the rest of the system can take care of the HDR?

Does that sound right?
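For reference, here's the rough back-of-the-envelope arithmetic I had in mind (my own numbers and assumptions: an FP16 colour buffer, one write per pixel, 60 fps, ignoring Z, blending and overdraw, so real figures would be a lot higher):

Code:
# Rough framebuffer-write bandwidth at 720p/60 (assumptions above)
width, height, fps = 1280, 720, 60

def gb_per_s(bytes_per_pixel, samples=1):
    return width * height * samples * bytes_per_pixel * fps / 1e9

print(gb_per_s(4))              # 8-bit RGBA, no AA   -> ~0.22 GB/s
print(gb_per_s(8))              # FP16 RGBA, no AA    -> ~0.44 GB/s
print(gb_per_s(8, samples=4))   # FP16 RGBA, 4x MSAA  -> ~1.77 GB/s

The absolute numbers don't matter much; the point is that FP16 doubles every framebuffer touch and 4x MSAA multiplies it again, which is why I keep hearing "bandwidth" as the answer.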
 
BenQ said:
Can anyone confirm that the Xenos CAN do BOTH HDR and AA at the same time?

Is the whole reason the G70 can't do both HDR and AA (MSAA) at the same time a bandwidth issue?

Is THAT why the Xenos CAN do both... because the eDRAM takes care of the AA, so the rest of the system can take care of the HDR?

Does that sound right?

yes
 
BenQ said:
Is THAT why the Xenos CAN do both... because the eDRAM takes care of the AA, so the rest of the system can take care of the HDR?

Rest of the system? I thought the eDRAM took care of HDR too? :?

If it's "just" a bandwidth issue, I would have thought it'd be up to devs then to figure out if they want to make the tradeoffs to do both, but I thought there was some more fundamental difference in G70 which prevented it.

That said, it makes it puzzling how various demos from NVidia etc. with HDR seemed also to be using AA, though it's easy to mistake some other effects for HDR, so I may be wrong there.
 
That said, it makes it puzzling how various demos from NVidia etc. with HDR seemed also to be using AA, though it's easy to mistake some other effects for HDR, so I may be wrong there.
Some demos definitely had HDR + FSAA, as we were told they had HDR (besides it being obviously present) and had pretty obvious antialiasing as well. They probably used SSAA? That's actually what makes me think 2x SSAA might be a viable solution.
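If they did use supersampling, my guess (and it's only a guess on my part) is that they simply rendered to an oversized FP16 target and downsampled in a shader pass, which sidesteps the hardware MSAA resolve entirely. A toy 1-D version of that kind of resolve:

Code:
import numpy as np

# One scanline rendered at 2x width in FP16 (made-up values, bright/dark edge)
hdr_2x = np.array([5.0, 5.0, 5.0, 0.1, 0.1, 0.1], dtype=np.float32)

# 2x SSAA "resolve": average each horizontal pair down to the final width
resolved = hdr_2x.reshape(-1, 2).mean(axis=1)
print(resolved)   # ~[5.0, 2.55, 0.1] -- the edge pixel gets an in-between value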
 
If it's "just" a bandwidth issue, I would have thought it'd be up to devs then to figure out if they want to make the tradeoffs to do both, but I thought there was some more fundamental difference in G70 which prevented it.
Since you basically can't fit an HDR 720p render context in 10 MB without some kind of rounding or tone mapping, you'd have to go through all the tiling garbage even at 720p if you chose to use HDR.

With G70, I'd imagine, first of all, that SSAA is necessary for purely mathematical reasons -- with HDR, the dynamic difference across an edge can potentially be huge, and something like MSAA or Temporal AA would cause too much bias or be too noticeable. I mean the main reason SSAA even exists as a solution to "jaggies" is because at least 4x supersampling is necessary for the Nyquist limit of the sampling rate to match the image resolution.

But there's no doubt that you need 4x the bandwidth for 128-bit HDR, and that kind of headroom will probably never materialize because bandwidth demands for everything else will grow sooner.
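To put numbers on the 10 MB point (assuming FP16 colour plus a 32-bit depth/stencil buffer, which is my assumption, not something from a spec sheet):

Code:
# 720p HDR render target vs. Xenos' 10 MB of eDRAM
pixels = 1280 * 720
mb = lambda nbytes: nbytes / (1024 * 1024)

colour_fp16 = pixels * 8   # 8 bytes per pixel for FP16 RGBA
depth       = pixels * 4   # 4 bytes per pixel for depth/stencil

print(mb(colour_fp16 + depth))         # ~10.5 MB -- already over 10 MB with no AA
print(mb((colour_fp16 + depth) * 2))   # ~21.1 MB with 2x MSAA -- hence the tiling

So even before you add MSAA samples you're over the 10 MB, which is why I said you'd be into tiling territory at 720p with HDR.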
 
ShootMyMonkey said:
With G70, I'd imagine, first of all, that SSAA is necessary for purely mathematical reasons -- with HDR, the dynamic difference across an edge can potentially be huge, and something like MSAA or Temporal AA would cause too much bias or be too noticeable.
In which case, how does Xenos support it, and does the IQ suffer as a result? Will MSAA produce jaggies between areas of high contrast? Wouldn't SSAA still do the same, or does that work on the backbuffer once rendered (0...255 RGB data)?
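To illustrate what I mean about high-contrast edges (toy numbers and a simple Reinhard-style curve, just my own sketch of the problem):

Code:
# Why the order of AA resolve vs. tone mapping matters for HDR edges
def tonemap(x):                 # maps [0, inf) to [0, 1)
    return x / (1.0 + x)

samples = [20.0, 0.05]          # a very bright and a very dark sample on one edge pixel

resolve_then_tonemap = tonemap(sum(samples) / len(samples))
tonemap_then_resolve = sum(tonemap(s) for s in samples) / len(samples)

print(resolve_then_tonemap)     # ~0.91 -- edge stays almost as bright as the bright side
print(tonemap_then_resolve)     # ~0.50 -- edge becomes a believable mid-blend

Averaging in linear HDR space and then tone mapping leaves the edge looking nearly un-antialiased, which is the kind of thing I'm wondering about with Xenos.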
 
marconelly! said:
That said, it makes it puzzling how various demos from NVidia etc. with HDR seemed also to be using AA, though it's easy to mistake some other effects for HDR, so I may be wrong there.
Some demos definitely had HDR + FSAA, as we were told they had HDR (besides it being obviously present) and had pretty obvious antialiasing as well. They probably used SSAA? That's actually what makes me think 2x SSAA might be a viable solution.

The performance hit with SSAA is quite large (looking at D3 the other day it took a 40% hit with 4x MSAA and another 45% on top of that for 4x SSAA). Demos with AA and HDR probably were SLI setups.
 
cobragt said:
We don't need to worry about the PS3 being able to do both; all I have to say is: Getaway demo. The demo used HDR lighting and AA, no jaggies were seen, and the res was 720p.

Just because those screenshots of the Getaway have no jaggies doesn't mean that it's rendering HDR and AA at the same time.

Developers LOVE to take even the most jaggy console games and clean up the pics with AA before they release them to the public. In fact, if we were to judge solely off pics, one might conclude that FSAA has been available (and implemented) on every console game for YEARS :LOL:
 
Just look at the commercials running for GTA SA on television.

Not a single jaggy in any screen, looks perfect, but when you actually fire up the game it's aliasing galore.
 
It's possible all the other things the Getaway demo was doing were affecting the perception of jaggies, if there was no AA there. They were doing stuff to emulate the optics of a camcorder - auto-focusing, auto white balancing, depth of field, etc. - that may have had an impact (it certainly gave the demo a different look than some of the others; it looked sort of "soft"). Of course, it was also at 1080p, which would help.

I was thinking more of the Luna demo, though. Or the Unreal demo (Tim Sweeney mentioned HDR, and that was actually at 720p, not 1080p). Of course, with no direct feed and no direct shots of those demos, it's harder to tell if AA was there or not. If shots were released, we could possibly take them at face value though... the HS screenies don't look PRed up from an AA perspective ;)
 
Actually, I think there were jaggies in the Getaway demo; I remember seeing them quite clearly when the camera zoomed in on some statue against the sky backdrop.
Jaggies like the ones you get when you play a PC game at 1280x1024 or some other high resolution without anti-aliasing enabled in the drivers.
Nothing that I would much mind in next-gen games, as I don't have a very critical eye for jaggies at high resolution. But then again, I might just have gotten used to them, with my PS2 and Radeon 9600XT card :)
As the Getaway demo was reportedly only running on "Cell" without the RSX, it's safe to assume games of similar graphical complexity will have some AA enabled.
In the Killzone demo I saw no jaggies, nor in any other demos other than Getaway and Warhawk, which had some very visible jaggies.
 
rabidrabbit said:
In the Killzone demo I saw no jaggies, nor in any other demos other than Getaway and Warhawk, which had some very visible jaggies.

Heavenly Sword also had no AA enabled in the E3 demo. You can see small jaggies in shots, but nothing to write home about (most edges actually appear to have none).

Quite literally, it wasn't noteworthy. I remember few comments, if any, about aliasing around E3 when this media hit the web, which may be a reflection of the size of this issue in the wider scheme of things, at least at these resolutions.

(Still, if Deano's reading, 2xAA @ 1080p would be nice! :p)
 
Regarding the Cell Getaway demo, unless someone has an HD-res film from the source, you can't claim AA is involved. All I saw was a 300x200 screen video - 1080p shrunk down to that size is gonna have some nice supersampling effect.
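To put some rough numbers on it (back-of-the-envelope, assuming a straight box downscale):

Code:
# Shrinking a 1080p feed to ~300x200 averages dozens of source pixels per output pixel
src_w, src_h = 1920, 1080
dst_w, dst_h = 300, 200

samples_per_output_pixel = (src_w * src_h) / (dst_w * dst_h)
print(round(samples_per_output_pixel, 1))   # ~34.6 source pixels per output pixel

That's effectively 30-odd-times supersampling for free, so of course it looks clean.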
 
Shifty Geezer said:
Regarding the Cell Getaway demo, unless someone has an HD-res film from the source, you can't claim AA is involved. All I saw was a 300x200 screen video - 1080p shrunk down to that size is gonna have some nice supersampling effect.

Kikizo has HD movies from E3 and pre-E3 conferences.

Link
 
There shouldn't be any FSAA on any of the demos, as we know the NV40 and G70 can't do FSAA with HDR. So any actual content that has HDR won't have FSAA, and any content with FSAA won't have HDR.

Anything that exhibits both is most likely CGI.
 
Acert93 said:
marconelly! said:
That said, it makes it puzzling how various demos from NVidia etc. with HDR seemed also to be using AA, though it's easy to mistake some other effects for HDR, so I may be wrong there.
Some demos definitely had HDR + FSAA, as we were told they had HDR (besides it being obviously present) and had pretty obvious antialiasing as well. They probably used SSAA? That's actually what makes me think 2x SSAA might be a viable solution.

The performance hit with SSAA is quite large (looking at D3 the other day it took a 40% hit with 4x MSAA and another 45% on top of that for 4x SSAA). Demos with AA and HDR probably were SLI setups.

That'd be 58% for 4x SSAA (extrapolating from your numbers), and if 2x is half as expensive, I'd guess 29% for 2x. Not too bad, actually; it could be less if we get any last-minute memory improvements.
 
Do we really need AA at 1900x1200 resolution?

10-30% is a very big price to pay if you could otherwise use it to bump up your 3D engine, don't you guys think?
 
Well, that isn't the 1080p res.

Anyway, playing on my computer I still like 4x FSAA or 6x FSAA (if it's able to run at a good fps) at 1280x1024. Now, of course, I'm much closer to the screen than a console player is. However, a console user will have access to 50-inch+ screens.

So FSAA is always important, and the higher the level you can use, the better.

4x FSAA at 720p isn't optimal, but it's better than no FSAA at 720p, and 2x or 4x would be better than nothing at 1080p.
 
jvd said:
However, a console user will have access to 50-inch+ screens.
It's funny how Americans talk about console gamers. In the EU, 50" is pretty much unheard of. The largest I've seen is in the 40s, and very few at that. The majority of consoles, I'd guess, are played on 28" widescreens and will probably remain that way for ages - if for no other reason than houses aren't big enough for larger screens!
 