Is free AA really worth it?

The Xbox 360's ATI chip is supposed to have free AA, right?

From what I've heard, in order to get AA with no penalty you need around 100 million transistors of embedded DRAM plus some specialized control logic just to support it. That doesn't exactly sound free.

For me, the lack of AA is barely noticeable at HD resolution.
So couldn't that die area be used for something more useful?

Has this been discussed here yet?
 
It has been discussed somewhat... the general consensus seems to be:

Wait and see.

We don't really know if it was worth it -- hopefully it was.
 
On Xbox 360, the AA is handled off-chip, inside a bank of memory that contains some of the GPU logic.

They aren't really trading off die space for ram.

BTW, you'll still see aliasing at HD resolutions without AA.
 
Qroach said:
They aren't really trading off die space for ram.

He wasn't talking about RAM; he was talking about ATi using a big chunk of its transistor budget to move computing power onto a daughter die, which is what ultimately gives it its "free" AA.

Obviously, while this feature is "free" for developers to use, it ultimately isn't free from a hardware perspective since transistors are used. At the end of the day, both ATi and Nvidia are playing with more or less equal amounts of transistors; the question of which company has made the better tradeoff is yet to be answered.

I guess it all depends on whether developers want less shading power in exchange for fixed, free AA or the other way around (assuming that Nvidia is opting for more shading performance, though we really have yet to find out).
 
My impression was that the DRAM + logic is on a daughter die within the same package as the GPU. So they really are trading off a certain amount of die space within the same-size package (maybe not the full amount, though).

Aliasing is still there at HD resolutions, but as graphics get better, things like lighting and texture quality are more noticeable to me.


^^^ yeah, what he said.
 
Did they, though? The Xenos is bigger than the R420.

232M vs. 160M transistors.

So it's already much bigger than their previous chip. I don't really see them sacrificing transistors for eDRAM, and it looks like they won't be sacrificing yields either, since they are two different chips.
 
seismologist said:
The Xbox 360's ATI chip is supposed to have free AA, right?

From what I've heard, in order to get AA with no penalty you need around 100 million transistors of embedded DRAM plus some specialized control logic just to support it. That doesn't exactly sound free.
It's "free" in the sense that games can enable it and take almost no performance hit. Obviously, the feature was not "free" from a hardware implementation standpoint or we'd already have it in every GPU.

For me, the lack of AA is barely noticeable at HD resolution.
So couldn't that die area be used for something more useful?
I think AA is quite useful, even at HD resolutions. Plus, some of the die area you speak of that aids "free" AA does other tasks, so it's not as simple as removing that stuff from the chip.
 
jvd said:
So it's already much bigger than their previous chip. I don't really see them sacrificing transistors for eDRAM, and it looks like they won't be sacrificing yields either, since they are two different chips.
Well, if instead of using 100 million transistors' worth of die space they had used it on shader hardware... yes, they made a sacrifice.

If I had a large-screen HDTV I'd want AA. Lack of AA is yucky. On a small screen it won't make as much difference. I think, from the sounds of Xenos, it was a good choice coupled with the unified shader architecture, but we'll need field testing to see if this pans out.
 
Remember that computer graphics is all about optical illusion, and the greater the viewing distance, the less visible the jaggies.

That "free" 4x AA is a great thing for the Xenos GPU, although I don't know how useful it is on HDTV screens (or normal screens).

The recommended viewing distance is 8 to 12 ft on a 42" plasma TV and 12 to 16 ft or more on a 50" screen. (Most people also watch TV at these distances.)

This gen, most people playing on their Xbox/PS2 grab an extra chair (or sit on the ground) with their heads almost glued to the screen, mainly because of the controllers. Next gen we have wireless controllers, so you can expect people to sit at the recommended distance.
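
For what it's worth, here's a quick back-of-the-envelope check of how big a single pixel looks at those distances. The screen size, resolution, and distance below are just my own assumptions for illustration, nothing official:

```python
import math

# Rough check: angular size of one pixel on a 50" 16:9 set showing 720p,
# viewed from the recommended 12 ft. Normal visual acuity is usually quoted
# at about 1 arcminute. All numbers here are assumptions for illustration.

diagonal_in = 50.0            # assumed screen diagonal in inches
h_res, v_res = 1280, 720      # assumed 720p output
distance_in = 12.0 * 12       # 12 ft viewing distance, in inches

aspect = h_res / v_res
width_in = diagonal_in * aspect / math.hypot(aspect, 1.0)   # physical screen width
pixel_in = width_in / h_res                                 # width of one pixel

# Angle subtended by one pixel, converted to arcminutes
pixel_arcmin = math.degrees(2 * math.atan(pixel_in / (2 * distance_in))) * 60
print(f"one pixel spans roughly {pixel_arcmin:.2f} arcminutes")   # ~0.8
```

That comes out to roughly 0.8 arcminutes per pixel, close to the ~1 arcminute acuity figure, though stair-stepping and edge crawl in motion tend to stay visible beyond what a single static pixel suggests.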
 
Yes, it was worth it, and I can't understand why you don't feel the need for AA at HD. I game on my PC at 1600x1200 exclusively since that's my native res, and no AA bugs the crap out of me.

AA in all games is the #1 thing I'm looking forward to next gen.
 
Well, if instead of using 100 million transistors' worth of die space they had used it on shader hardware... yes, they made a sacrifice.
Unless more shader hardware wouldn't help.

The amount of work per clock could have been limited by the number of ROPs.

It's all a balance inside the chip for image quality; obviously AA is part of that. But the chip is more powerful than their last attempts. The logic part of the eDRAM also seems to help in other areas. Not to mention they can also do FP10 HDR for free.
 
I think it was also done that way to meet their desire for (relatively) low heat and low power consumption on the GPU.
 
They also take care of Z/Stencil and...other things too... ack... just read Dave's article. :p
 
3roxor said:
Remember that computer graphics is all about optical illusion, and the greater the viewing distance, the less visible the jaggies.

That "free" 4x AA is a great thing for the Xenos GPU, although I don't know how useful it is on HDTV screens (or normal screens).

The recommended viewing distance is 8 to 12 ft on a 42" plasma TV and 12 to 16 ft or more on a 50" screen. (Most people also watch TV at these distances.)

This gen, most people playing on their Xbox/PS2 grab an extra chair (or sit on the ground) with their heads almost glued to the screen, mainly because of the controllers. Next gen we have wireless controllers, so you can expect people to sit at the recommended distance.

I hereby invite everyone to my house to see just how shitty no AA looks on HD sets; you can sit as far back as you want... I'm having a hard time seeing how AA is a bad thing, or an unneeded feature.
 
This argument can't be made without showing that the X360 is deficient in some other graphical area, like shading performance. Otherwise, the system's design could've achieved competitive performance all-around and still implemented better faculties for AA because it had also achieved extra headroom above and beyond other designs.

Trying to make this argument when the design is competitive all-around in graphics exposes only personal preference for visual features, not a design imbalance.

The premise for this line of reasoning is confused anyway. The eDRAM is not some feature trade-off for AA; it's a repositioning of the pipeline to keep bandwidth-intensive operations off the external bus. It impacts the whole rendering scheme.

But, no, embedded framebuffers and 100M transistors are not needed for free AA. The goal is to move sampling off the external buses to save bandwidth and, ideally, out of the framebuffer too to save memory. Tile-accelerated rendering can keep sampling on-chip, and display list rendering lets only the final image be written into the framebuffer at the end, saving memory storage while sampling:
http://www.pvrdev.com/pub/PC/doc/f/PowerVR Tile based rendering.htm
http://www.pvrdev.com/pub/PC/doc/f/Display List Rendering.htm
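
To make the tile idea concrete, here's a minimal toy sketch (my own illustration, not PowerVR's or Xenos' actual pipeline): all the multisample storage for one screen tile lives in a small on-chip buffer, and only the resolved pixels get written to external memory, once each. The tile size, resolution, and shade() stand-in are assumptions purely for illustration:

```python
# Toy sketch of tile-based multisampling: rasterize/shade a tile's samples
# into a small on-chip buffer, resolve there, and write each final pixel
# to the external framebuffer exactly once.

WIDTH, HEIGHT = 128, 64   # tiny toy resolution so the loops run instantly
TILE = 16                 # assumed tile size in pixels
SAMPLES = 4               # 4x multisampling, resolved on-chip

def shade(px, py, sample):
    """Stand-in for rasterization/shading: one colour value per sample."""
    return (px + py + sample) % 256 / 255.0

framebuffer = bytearray(WIDTH * HEIGHT)   # represents external memory
external_writes = 0

for ty in range(0, HEIGHT, TILE):
    for tx in range(0, WIDTH, TILE):
        # On-chip tile buffer: TILE*TILE pixels with SAMPLES samples each.
        tile = [[shade(tx + x, ty + y, s) for s in range(SAMPLES)]
                for y in range(TILE) for x in range(TILE)]
        # Resolve on-chip, then write each final pixel out exactly once.
        for i, samples in enumerate(tile):
            x, y = tx + i % TILE, ty + i // TILE
            framebuffer[y * WIDTH + x] = int(sum(samples) / SAMPLES * 255)
            external_writes += 1

print("external framebuffer writes:", external_writes)          # one per pixel
print("samples kept on-chip per tile:", TILE * TILE * SAMPLES)
```

Compare that with an immediate-mode renderer keeping a 4x multisampled buffer in external memory, which has to read-modify-write those samples across the bus for every fragment it touches.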
 
seismologist said:
For me, the lack of AA is barely noticeable at HD resolution.
So couldn't that die area be used for something more useful?
Usefulness is subjective. The discussions about high resolutions making AA irrelevant are as old as the history of this board! The consensus boils down to this: higher resolutions lessen the impact of aliasing but do not replace AA. Even at 1600x1200, AA is still desirable, and that's on a monitor far smaller than the average HDTV set.

That being said, the daughter die isn't only there for the "free" AA; it also performs other tasks and, most importantly, essentially removes one of the biggest bandwidth hogs from the console's main memory bus: the framebuffer and all the operations revolving around it. The X360 was designed around this technology; it's an integral part of it and couldn't just be traded in for something else. IMO it's one of the most innovative and exciting designs in 3D technology in years...
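
Some rough numbers to illustrate the point; the formats, overdraw, and frame rate below are just my assumptions, not official figures:

```python
# Back-of-the-envelope estimate of multisampled framebuffer traffic that
# would otherwise hit the external bus. All values here are assumptions.

width, height    = 1280, 720   # 720p render target
samples          = 4           # 4x multisampling
bytes_per_sample = 4 + 4       # assumed 32-bit colour + 32-bit Z/stencil
overdraw         = 3           # assumed average overdraw per pixel
rw_factor        = 2           # blending / depth test: read then write
fps              = 60

bytes_per_frame = width * height * samples * bytes_per_sample * overdraw * rw_factor
print(f"~{bytes_per_frame * fps / 1e9:.0f} GB/s of sample traffic")
```

Even with conservative assumptions that's on the order of 10 GB/s, a sizable slice of a main memory bus in the low tens of GB/s, and it's exactly the traffic the daughter die keeps internal.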

SanGreal said:
Yes, it was worth it, and I can't understand why you don't feel the need for AA at HD. I game on my PC at 1600x1200 exclusively since that's my native res, and no AA bugs the crap out of me.

AA in all games is the #1 thing I'm looking forward to next gen.
Maybe not my #1 thing, but yeah ... it ranks pretty high!
 
jvd said:
Well, if instead of using 100 million transistors' worth of die space they had used it on shader hardware... yes, they made a sacrifice.
Unless more shader hardware wouldn't help.

The amount of work per clock could have been limited by the number of ROPs.
Well then they could have used those 100 million transistors for shaders and ROPs :p The point is, the rendering power of Xenos was abbreviated to provide for bandwidth-saving, AA-improving features. This was a design decision about where to spend their resource budget, and they sacrificed some features to make room for others.
 