Is free AA really worth it?

Jaws said:
Personally I'm glad X360 has eDRAM and PS3 doesn't, if only to entertain topics of discussion until next-next-gen! :p

I think all those who are concerned with whether a console has free AA or not are simply talking about last-gen problems which won't affect this next gen. The Xbox 360 and PS3 both target current HD display resolutions, so they have what it takes to make images look sharper, cleaner, and brighter than anything created in games before.

I think developers should worry more about how to get the most out of multi-core processors than focusing on AA. The PS2 won this last generation without it. All I'm saying is that next gen the focus should be on getting the most out of the multi-core CPUs (like Cell and Xenon) rather than on what mattered little last gen to the overall performance of the console. :?
 
xbdestroya said:
Shifty Geezer said:
I've speculated some of those missing RSX trannies are for PS2 BC, but I doubt they're for eDRAM. KK has said they couldn't fit enough eDRAM to be any use so didn't want it, but 4 MB could surely be used for something weird and wonderful. As the full BW to RSX is about the same as PS2's eDRAM, and with compression the need for BW is much reduced, there seems little need for eDRAM. More likely, IMO, memory controller features will be used to simulate eDRAM access.

Missing G70 trannies - G70 my friend. ;)
Sorry. I meant the extra trannies on RSX beyond what's needed from G70. i.e. take out all the redundant PC features and you've got fewer trannies, but RSX is at least the same size as G70 - ergo those trannies are used for something else. If GS is 43 MT, and 32 MT of that is eDRAM, that's only 11 MT needed to include the GS hardware. Add another 33 MT for 4x GS performance and built-in 2x FSAA... it's the only logical explanation :D
 
Titanio said:
Kutaragi has also spoken about it (eDRAM) in terms such as to paint it as unnecessary. So I think at the very most, if it is there, it'd only be as part of the hardware solution for PS2 backwards compatibility, and not available to PS3 games (which would be so large a waste as to make the prospect very unlikely, imo).

Because it would have to be hacked onto the RSX at this point. Just giving the G70 10 MB of eDRAM won't make it perform that much better. There was more work done in Xenos to make it perform well and use the eDRAM effectively.
 
At the moment, Xenos has three unified shader arrays with 48 ALUs in total, divided into groups of 16, right? That, along with other components, allows them to process so many vertex and pixel shader commands a second. Developers will have to make sure their shaders can be run in time. If those 100M eDRAM transistors were given over to other things, ATi could have added a fourth shader array, giving a 33% increase in shader performance. They compromised between peak shader power and the bandwidth-saving AA enhancement.
There is more to shaders than just the shader hardware. Just adding more doesn't mean the capabilities would increase; memory bandwidth would have to increase too. But you're talking about decreasing memory bandwidth in your example.

Just that they COULD have done things differently, to gain a benefit in one area at the cost of AA performance.
It's not just AA performance. They were able to greatly reduce the bandwidth hit from the framebuffer, which let them use cheaper RAM with less bandwidth but more of it. The bandwidth saved can also be used for other things.

If there were no sacrifices, and no need to leave things out, ATi would produce a 256-shader system with 16 GB of eDRAM.
In some ways that's true. However, a balanced design rarely sacrifices anything. Remember, they had a target resolution and no legacy support was needed, which gave them an achievable goal, and for that goal I doubt there were any sacrifices. After all, in your example above, with a resolution limit of 640x480 you'd be wasting a lot of money on overkill features.
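
To put rough numbers on the framebuffer-bandwidth point in this exchange, here is a back-of-envelope sketch in Python. The 22.4 GB/s GDDR3 and 256 GB/s ROP-to-eDRAM figures are the quoted X360 numbers; the overdraw, frame rate, read-modify-write factor and per-sample sizes are illustrative assumptions, not measurements.

# Rough estimate of raw color+Z traffic for 4x MSAA at 720p, to show why
# keeping this traffic inside the eDRAM (instead of on the shared GDDR3
# bus) matters. All tunable values below are assumptions for illustration.

width, height    = 1280, 720   # 720p render target
samples          = 4           # 4x MSAA
bytes_per_sample = 4 + 4       # 32-bit color + 32-bit Z per sample (assumed)
rw_factor        = 2           # read-modify-write on Z test / blending (assumed)
overdraw         = 4           # average times each pixel is touched (assumed)
fps              = 60          # target frame rate (assumed)

bytes_per_frame = width * height * samples * bytes_per_sample * rw_factor * overdraw
gb_per_second   = bytes_per_frame * fps / 1e9

print(f"~{gb_per_second:.1f} GB/s of multisample framebuffer traffic")  # ~14.2 GB/s here
print("vs. 22.4 GB/s of total GDDR3 bandwidth (also feeding CPU and textures)")
print("vs. 256 GB/s between the ROPs and the eDRAM")

With these (debatable) assumptions, the AA framebuffer alone would eat well over half of the shared bus, which is the saving the reply above is describing.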
 
leechan25 said:
Jaws said:
Personally I'm glad X360 has eDRAM and PS3 doesn't, if only to entertain topics of discussion until next-next-gen! :p

I think all those who are concerned with whether a console has free AA or not are simply talking about last-gen problems which won't affect this next gen. The Xbox 360 and PS3 both target current HD display resolutions, so they have what it takes to make images look sharper, cleaner, and brighter than anything created in games before.

I think developers should worry more about how to get the most out of multi-core processors than focusing on AA. The PS2 won this last generation without it. All I'm saying is that next gen the focus should be on getting the most out of the multi-core CPUs (like Cell and Xenon) rather than on what mattered little last gen to the overall performance of the console. :?

Wow, you really think AA doesn't matter and didn't matter much last gen? You realize sharper and cleaner means more AA, and you also realize that the reason Hollywood CGI looks so nice is because of the high levels of AA. You're confusing performance with sales figures. Last gen people traded fps for image quality; now this eliminates one of the big factors in that decision. I don't understand why anyone would claim that eDRAM is a bad idea.
 
ralexand:
It seems like the MS/ATi engineers took a lot of those ideas from PowerVR. Wonder if we'll see lawsuits?
There are many possible ways to divide a scene up among manageably sized tiles. Tile-based rendering is a fairly general concept.
Does the C1 use display-list rendering in the same way the PowerVR platform does, meaning capturing the entire 3D scene and then rendering it one tile at a time?
I don't believe it does. I think Xenos is still more toward immediate-mode rendering, where it doesn't bin the entire scene before rendering.
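
As a purely illustrative contrast (not a description of how either Xenos or PowerVR hardware actually schedules its work), here is a toy Python sketch of the two extremes being distinguished above:

# Immediate mode touches the framebuffer as primitives arrive; a
# PowerVR-style deferred tiler bins the *whole* scene first and then
# renders one on-chip tile at a time. A "triangle" here is just its
# screen-space bounding box (x0, y0, x1, y1); drawing is simulated.

triangles = [(0, 0, 300, 200), (500, 100, 900, 600), (100, 400, 700, 700)]
SCREEN_W, SCREEN_H = 1280, 720
TILE_W, TILE_H = 320, 240          # arbitrary tile size for the example

def tiles_overlapping(tri):
    x0, y0, x1, y1 = tri
    for ty in range(0, SCREEN_H, TILE_H):
        for tx in range(0, SCREEN_W, TILE_W):
            if x0 < tx + TILE_W and x1 > tx and y0 < ty + TILE_H and y1 > ty:
                yield (tx, ty)

def immediate_mode(tris):
    # Draw each triangle as it arrives; the full-size buffer is
    # read/written continuously while the scene streams through.
    for tri in tris:
        print("draw", tri, "straight into the framebuffer")

def tile_deferred(tris):
    # Bin the entire scene first, then render tile by tile and resolve.
    bins = {}
    for tri in tris:
        for tile in tiles_overlapping(tri):
            bins.setdefault(tile, []).append(tri)
    for tile, contents in sorted(bins.items()):
        print("tile", tile, "-> draw", contents, "then resolve to memory")

immediate_mode(triangles)
tile_deferred(triangles)

Roughly speaking, Xenos' predicated tiling sits between the two: it doesn't bin the whole scene per tile up front the way tile_deferred does in this sketch, but it does render the frame in a few eDRAM-sized pieces when the buffers don't fit.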

Jaws:
It will be interesting to compare X360 and PS3 because this generation, BOTH will have their major chipsets on a 90nm process node and will launch within months of each other...
The RSX would have to be almost twice as powerful graphically to be as advanced as Xenos for their respective times of release, according to nVidia, who say they double their performance every six months (of course, RSX would need to be even more powerful than that according to nVidia, who claim their technology is not just as advanced as ATi's... it's more advanced).
 
Was AA applied before or after the motion blur? Also, I'm not sure where the artifacts would be found to prove AA was used here.

AA is really terrific, and it has been proven that the memory bandwidth for frame buffering on the PS3 is calculated to be near the parent-daughter bandwidth limitation on Xenos. So here's to plenty of AA on the framebuffer for both systems!
 
lip2lip said:
...and it has been proven that the memory bandwidth for frame buffering on the PS3 is calculated to be near the parent-daughter bandwidth limitation on Xenos. So here's to plenty of AA on the framebuffer for both systems!


:oops: How can something that's still theoretical already be proven?
 
lip2lip said:
Was AA applied before or after the motion blur? Also, I'm not sure where the artifacts would be found to prove AA was used here.

AA is really terrific, and it has been proven that the memory bandwidth for frame buffering on the PS3 is calculated to be near the parent-daughter bandwidth limitation on Xenos. So here's to plenty of AA on the framebuffer for both systems!
But the bandwidth between the daughter and parent die is not what's important; the 256 GB/s of bandwidth between the ROPs and the framebuffer is where the needed bandwidth savings are.
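
To attach a number to that: with the multisampled buffers living in the eDRAM, the only framebuffer traffic that has to cross the external bus is the resolved (downsampled) image. A quick sketch, assuming 32-bit resolved color and a 60 fps target (both assumptions):

# Resolve traffic: one downsampled 720p frame crossing back to main
# memory per displayed frame. Everything else stays on-chip.
width, height = 1280, 720
bytes_per_pixel = 4        # resolved 32-bit color (assumed)
fps = 60                   # assumed target frame rate

resolve_gb_per_s = width * height * bytes_per_pixel * fps / 1e9
print(f"~{resolve_gb_per_s:.2f} GB/s of resolve traffic")   # ~0.22 GB/s

So the multisample read-modify-write traffic (on the order of 10+ GB/s in the earlier estimate) rides the 256 GB/s ROP-to-eDRAM path, while only a fraction of a GB/s of resolved pixels competes with the CPU and textures for the 22.4 GB/s GDDR3 bus.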
 
Tap In said:
lip2lip said:
...and it has been proven that the memory bandwidth for frame buffering on the PS3 is calculated to be near the parent-daughter bandwidth limitation on Xenos. So here's to plenty of AA on the framebuffer for both systems!


:oops: How can something that's still theoretical already be proven?

Rhetoric comes to mind.

I'm no genius, but using the entire memory of the 360 GPU as a framebuffer also sounds like a major issue. Won't this cause massive issues with texture buffering and loads?

My personal experience with 3D programming is limited to a few months over the past few years, although I have found having textures at the GPU more important than shaders. As I understand it, this is a major issue for programmers on the system. I hope they have the continued support they need, or I may not be picking up my pre-order this November.
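
For scale, here is a rough sketch of what the render targets themselves cost, assuming 32-bit color and 32-bit Z/stencil per sample and treating the 10 MB of eDRAM as one tile's worth of space (all assumptions for illustration). The eDRAM only ever holds the framebuffer; textures stay in the shared GDDR3 pool either way.

# Framebuffer footprint at 720p for different AA levels, and how many
# 10 MB eDRAM tiles each would need. Per-sample sizes are assumptions.
width, height = 1280, 720
bytes_per_sample = 4 + 4                 # 32-bit color + 32-bit Z/stencil
tile_bytes = 10 * 1024 * 1024            # treating 10 MB of eDRAM as 10 MiB

for samples in (1, 2, 4):
    size = width * height * samples * bytes_per_sample
    tiles = -(-size // tile_bytes)       # ceiling division
    print(f"{samples}x AA at 720p: ~{size / 2**20:.1f} MiB -> {tiles} eDRAM tile(s)")

With these assumptions 1x fits in a single tile, 2x needs two and 4x needs three, which is where Xenos' predicated tiling comes in; either way the texture pool in main memory is untouched by the framebuffer.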
 
If we change some bits, this thread is a copy of all the anti-AA threads we had during the R300/NV30 era, when it became clear that ATI's AA was better.
This thread is not really about technicalities; it's a PR downplaying campaign.
 
PatrickL said:
If we change some bits, this thread is a copy of all the anti-AA threads we had during the R300/NV30 era, when it became clear that ATI's AA was better.
This thread is not really about technicalities; it's a PR downplaying campaign.
Thanks, that's sort of what I thought too... I just don't get these threads!
 
ralexand said:
...Hollywood CGI looks so nice is because of the high levels of AA...

There's more to Hollywood CGI than just AA, of course: hi-res textures, HDR, high geometry, very long shaders, etc... With software rendering you could apply all your resources to AA if you want, at the expense of others, but it doesn't force you to use X or Y resources more than others, as long as it's an 'acceptable' standard for the audience...

ralexand said:
Last gen people traded fps for image quality; now this eliminates one of the big factors in that decision...

Well, it helps devs not to think about it too much! :p But fps is traded EVERY gen for image quality. This is a decision made on a game-by-game basis...

ralexand said:
...I don't understand why anyone would claim that eDRAM is a bad idea...

I'm not sure who's claiming it's a bad idea in this thread; it's more a discussion of the alternative cost of the transistors used...

Lazy8s said:
Jaws:
It will be interesting to compare X360 and PS3 because this generation, BOTH will have their major chipsets on a 90nm process node and will launch within months of each other...
The RSX would have to be almost twice as powerful graphically to be as advanced as Xenos for their respective times of release, according to nVidia, who say they double their performance every six months (of course, RSX would need to be even more powerful than that according to nVidia, who claim their technology is not just as advanced as ATi's... it's more advanced).

Please define,

"powerful graphically"
"advanced"
"performance"

You cannot measure difference without defining what you're measuring...

Just to be clear, I was referring to the whole architecture and their chipsets at 90nm...

digitalwanderer said:
Qroach said:
Why is this still being argued?
No clue, I've never understood the anti-AA crowd. :?

Not speaking for the anti-AA crowd, but here's a simple analogy:

Lots of people like fruit:
all like bananas, apples and oranges,
some prefer bananas over apples and oranges,
some prefer bananas and apples over oranges.

Every day, for life, with their meals, they get a 'free' orange from the orange fairy! :p

Clue: the orange = free AA, and it cost them a one-off payment in transistors!

digitalwanderer said:
PatrickL said:
If we change some bits, this thread is a copy of all the anti-AA threads we had during the R300/NV30 era, when it became clear that ATI's AA was better.
This thread is not really about technicalities; it's a PR downplaying campaign.
Thanks, that's sort of what I thought too... I just don't get these threads!

Welcome to A vs. B because of this and that... ;)
 
digitalwanderer said:
Thanks, that's sort of what I thought too... I just don't get these threads!
GameSpot had a pretty good article on this phenomenon: link... Quote: "Innovation is what it's called when your console has a unique feature. Otherwise, it's called a gimmick."
 