Is free AA really worth it?

c0_re said:
I can't believe people are still talking like they're going to see games in 1080p. They may be upscaled to 1080p from 720p or 480p, but no developer in their right mind is going to target their game at a resolution that far less than 0.01% of the population can even view.



I went to Ultimate Electronics today; they had one 1080p TV for sale out of the 300+ TVs on display, and it was $13,000. Best Buy and CC had none. Gimme a break.
Not that I necessarily agree with it, but if the PS3 is limited to using FSAA with HDR, then it's conceivable that devs might opt for outputting a 1080p signal with no AA to save bandwidth. Then if you display it at 720p, you get some supersampling, and when most people use it at SDTV resolutions, there'll be crazy amounts of SS going on.

BTW, this topic should consider that 90% of gamers will be running at SDTV resolutions, and that neither system will break a sweat running 4-8x AA at 480p. PEACE.
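To make the supersampling point concrete: downscaling a 1080p render to a lower display resolution is effectively ordered-grid SSAA, with each output pixel averaging a block of rendered samples. A minimal sketch of that resolve (my illustration, not from any console SDK), assuming an integer scale factor and a packed RGB8 buffer:

```c
/* Box-filter downscale = ordered-grid supersampling resolve.
 * Assumes an integer scale factor n for simplicity; real video scalers
 * also filter at fractional ratios (1080 -> 720 is 1.5x, for instance). */
#include <stdint.h>
#include <stddef.h>

void ss_resolve(const uint8_t *src, int sw, int sh, /* high-res RGB8, sw x sh */
                uint8_t *dst, int n)                /* n = downscale factor */
{
    int dw = sw / n, dh = sh / n;
    for (int y = 0; y < dh; ++y)
        for (int x = 0; x < dw; ++x) {
            unsigned sum[3] = {0, 0, 0};
            for (int sy = 0; sy < n; ++sy)          /* average the n x n block */
                for (int sx = 0; sx < n; ++sx) {
                    const uint8_t *p =
                        src + 3 * ((size_t)(y * n + sy) * sw + (x * n + sx));
                    sum[0] += p[0]; sum[1] += p[1]; sum[2] += p[2];
                }
            uint8_t *q = dst + 3 * ((size_t)y * dw + x);
            q[0] = sum[0] / (n * n);                /* one filtered pixel out */
            q[1] = sum[1] / (n * n);
            q[2] = sum[2] / (n * n);
        }
}
```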
 
digitalwanderer said:
Jaws said:
I grew up on super-scaling blocky sprites on SEGA arcades and to me ART > ALL GRAPHICS.
Dude, I grew up on Pong, but that don't mean I can't grow and acquire higher standards.

Art is only good if the graphics show it off well for me.

Strange, I'd think the reverse is true. Good art can hide deficiencies in the use of cutting-edge visual technology. Take Okami, for example.
 
Jaws said:
Listen to this guy?

digitalwanderer
Democratically Voted Village Idiot.

I didn't vote but... ;)
Ha? :?

Anyway, it's subjective...
Well, duh. Of course we see and perceive differently.

I grew up on super-scaling blocky sprites on SEGA arcades and to me ART > ALL GRAPHICS.
The technical aspects are a necessary evil in communicating the art, so the less evil and intrusive they are, the better. Hence, AA is good.

Maybe you in particular don't notice a big difference, but on the grand scale, it matters not.
 
Oda said:
While that pic of Top Spin 2 clearly does have jaggies, look at how barely noticeable they are. Only on a few surfaces (such as the net or the top stair) is the aliasing really noticeable.

I think you would revise your opinion if you saw it in motion.
 
Inane_Dork said:
Jaws said:
Listen to this guy?

digitalwanderer
Democratically Voted Village Idiot.

I didn't vote but... ;)
Ha? :?

<Explains the seemingly obvious>

"listen to the guy?"
<me bolds guys tag>
"Democratically Voted Village Idiot."
<me comments that I didn't vote for him but..me explains point>
Why should I listen to an idiot?
<@Digi, nothing personal! ;)>

Inane_Dork said:
Anyway, it's subjective...
Well, duh. Of course we see and perceive differently.

Well, double duh! ...follow the posts, and you'll see I have to explain AGAIN that the point was it's SUBJECTIVE...

Inane_Dork said:
I grew up on super-scaling blocky sprites on SEGA arcades and to me ART > ALL GRAPHICS.
The technical aspects are a necessary evil in communicating the art, so the less evil and intrusive they are, the better. Hence, AA is good.

Who said AA was bad? See my repeated point about subjectivity and priorities...

Inane_Dork said:
Maybe you in particular don't notice a big difference, but on the grand scale, it matters not.

Exactly. See my repeated point about subjectivity and priorities...
 
The worst thing I noticed in that Top Spin picture was the white pixels on the stairs. Nothing is worse while playing a game than seeing pixel-popping seams.

I hate broken seams more than jaggies :p
 
blakjedi said:
...
The 192 high-speed SIMD units on the eDRAM can be used for anything...

Sorry, I must've missed something, but where did you get the idea that those 192 ALUs are programmable SIMD units?
 
Lazy8s said:
This argument can't be made without showing that the X360 is deficient in some other graphical area, like shading performance. Otherwise, the system's design could've achieved competitive performance all-around and still implemented better facilities for AA, because it had also achieved extra headroom above and beyond other designs.

Trying to make this argument when the design is competitive all-around in graphics exposes only personal preference for visual features, not a design imbalance.

The premise for this line of reasoning is confused anyway. The eDRAM is not some feature trade-off for AA; it's a repositioning of the pipeline to keep bandwidth-intensive operations off the external bus. It impacts the whole rendering scheme.

But, no, embedded framebuffers and 100M transistors are not needed for free AA. The goal is to move sampling off the external buses to save bandwidth, and ideally out of the framebuffer too, to save memory storage. Tile-accelerated rendering can keep sampling on-chip, and display list rendering can ensure only the final image is written into the framebuffer at the end, saving memory storage while sampling:
http://www.pvrdev.com/pub/PC/doc/f/PowerVR Tile based rendering.htm
http://www.pvrdev.com/pub/PC/doc/f/Display List Rendering.htm
Those are two very interesting articles. It seems like the MS/ATI engineers took a lot of those ideas from PowerVR. I wonder, will we see lawsuits? Does the C1 use display list rendering in the same way that the PowerVR platform does, meaning storing the entire 3D scene first and then rendering it one tile at a time?
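For anyone skimming those links, here's a toy sketch of the core idea (my own illustration, not actual PowerVR or Xenos code): each tile is rasterized with all its supersamples held in a small "on-chip" buffer, and only one resolved pixel per screen location ever crosses the external bus.

```c
/* Toy tile-based renderer with on-chip AA resolve. Flat-shaded,
 * counter-clockwise triangles; a real TBR bins triangles into
 * per-tile lists first instead of replaying the whole scene. */
#include <stdint.h>
#include <string.h>

#define TILE 16            /* tile size in pixels */
#define SS   2             /* 2x2 supersamples per pixel */

typedef struct { float x, y; uint8_t r, g, b; } Vert;
typedef struct { Vert v[3]; } Tri;

static float edge(float ax, float ay, float bx, float by, float px, float py)
{
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax);
}

/* Render one tile: all sampling traffic stays inside tilebuf. */
static void render_tile(const Tri *tris, int ntris, int tx, int ty,
                        uint8_t *fb, int fbw, int fbh)
{
    uint8_t tilebuf[TILE * SS][TILE * SS][3];
    memset(tilebuf, 0, sizeof tilebuf);

    for (int t = 0; t < ntris; ++t) {         /* replay scene for this tile */
        const Vert *v = tris[t].v;
        for (int sy = 0; sy < TILE * SS; ++sy)
            for (int sx = 0; sx < TILE * SS; ++sx) {
                float px = tx * TILE + (sx + 0.5f) / SS;
                float py = ty * TILE + (sy + 0.5f) / SS;
                float w0 = edge(v[1].x, v[1].y, v[2].x, v[2].y, px, py);
                float w1 = edge(v[2].x, v[2].y, v[0].x, v[0].y, px, py);
                float w2 = edge(v[0].x, v[0].y, v[1].x, v[1].y, px, py);
                if (w0 >= 0 && w1 >= 0 && w2 >= 0) { /* sample inside tri */
                    tilebuf[sy][sx][0] = v[0].r;
                    tilebuf[sy][sx][1] = v[0].g;
                    tilebuf[sy][sx][2] = v[0].b;
                }
            }
    }
    /* Resolve: one averaged write per pixel crosses the "external bus". */
    for (int y = 0; y < TILE && ty * TILE + y < fbh; ++y)
        for (int x = 0; x < TILE && tx * TILE + x < fbw; ++x) {
            unsigned sum[3] = {0, 0, 0};
            for (int sy = 0; sy < SS; ++sy)
                for (int sx = 0; sx < SS; ++sx)
                    for (int c = 0; c < 3; ++c)
                        sum[c] += tilebuf[y * SS + sy][x * SS + sx][c];
            uint8_t *q = fb + 3 * ((size_t)(ty * TILE + y) * fbw + tx * TILE + x);
            for (int c = 0; c < 3; ++c)
                q[c] = sum[c] / (SS * SS);
        }
}
```

The point, per the articles above, is that the supersample traffic never leaves the tile buffer; only the resolved TILE x TILE pixels get written to external memory.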
 
jvd said:
But what other stuff?
At the moment, Xenos has 3 unified shader arrays with 48 ALUs in total, divided into groups of 16, right? That, along with other components, lets them process so many vertex and pixel shader instructions a second. Developers will have to make sure their shaders can be run in time. IF those 100M eDRAM transistors were given over to other things, ATi could have added a fourth shader array, giving a 33% increase in shader ALUs. They compromised between peak shader power and bandwidth-saving AA enhancement.

I don't see where they wasted transistors on eDRAM
I never said they did!!! Just that they COULD have done things differently, to gain a benefit in one area at the cost of AA performance. I've already said I like the Xenos design (several times!!). I would say ATi sacrificed shader performance in terms of ALUs to make room for the eDRAM, to produce an effective, balanced rendering system that might well outperform the alternative without eDRAM in some/many areas - but they still had to decide what to include and exclude in the die space, which meant leaving some things out.

If there were no sacrifices and no need to leave things out, ATi would produce a 256-shader system with 16 GB of eDRAM ;)
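The arithmetic behind that hypothetical fourth array, spelled out using the thread's figures (clocks and issue rates deliberately ignored):

```c
/* Back-of-envelope for the trade-off above: 3 arrays x 16 ALUs today
 * versus a hypothetical 4th array in place of the eDRAM die. */
#include <stdio.h>

int main(void)
{
    int alus_actual = 48;              /* 3 arrays x 16 ALUs */
    int alus_hypothetical = 64;        /* 4 arrays, if eDRAM budget went to ALUs */
    printf("ALU increase: %.0f%%\n",
           100.0 * (alus_hypothetical - alus_actual) / alus_actual);
    return 0;                          /* prints: ALU increase: 33% */
}
```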
 
If the X360's graphics performance is competitive in all ways with comparable systems of its time, it needed fewer silicon resources to rival the competition and gave itself the advantage of better AA capabilities (among other benefits) with the extra silicon headroom.
 
Lazy8s said:
If the X360's graphics performance is competitive in all ways with comparable systems of its time, it needed fewer silicon resources to rival the competition and gave itself the advantage of better AA capabilities (among other benefits) with the extra silicon headroom.

Depends how you want to look at it. You can argue that if they had more silicon headroom, they would've used it. Or instead of adding more silicon, they could 'clock' higher to gain more performance and ultimately reach the thermal design limit of the 'whole' system, case enclosure included.

As with any design, it will be bounded by costs and physical design limits. It will be interesting to compare the X360 and PS3 because this generation, BOTH will have their major chipsets on a 90nm process node and will launch within months of each other...

On paper the PS3 has the raw performance, so we'll have to wait and see how that is leveraged by devs...
 
I think the key point of Xenos isn't the eDRAM as much as the unified shader architecture. How good is it in real-world applications? Does it fulfill its promise? If the improved efficiency matches a conventional architecture with more pipes, they'll have managed the same performance in less space, giving room to provide AA. If the performance is only comparable with the same transistor count's worth of conventional shaders, the eDRAM will be providing AA at the cost of shader power.
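A toy model of the efficiency argument (my illustration with made-up workload figures, not ATI's actual scheduler): a fixed vertex/pixel split leaves one pool idle whenever a frame is lopsided, while a unified pool puts every ALU on whichever queue has work.

```c
/* Why unified shaders can beat a fixed vertex/pixel split. */
#include <stdio.h>

/* Work available this frame, in ALU-cycles. */
typedef struct { int vertex_work; int pixel_work; } Frame;

/* Fixed split: dedicated units can't help the other queue. */
static int cycles_fixed(Frame f, int valus, int palus)
{
    int vc = (f.vertex_work + valus - 1) / valus;   /* ceil division */
    int pc = (f.pixel_work + palus - 1) / palus;
    return vc > pc ? vc : pc;     /* frame ends when the slower side does */
}

/* Unified: the whole pool chews through the combined workload. */
static int cycles_unified(Frame f, int alus)
{
    int total = f.vertex_work + f.pixel_work;
    return (total + alus - 1) / alus;
}

int main(void)
{
    Frame pixel_heavy = { 200, 4000 };  /* hypothetical lopsided frame */
    printf("fixed (16V+32P): %d cycles\n", cycles_fixed(pixel_heavy, 16, 32));
    printf("unified (48):    %d cycles\n", cycles_unified(pixel_heavy, 48));
    return 0;
}
```

With the lopsided frame above, the unified pool finishes in 88 cycles against 125 for the fixed split; whether real workloads behave this cleanly is exactly the open question.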
 
Shifty Geezer said:
I think the key point of Xenos isn't the eDRAM as much as the unified shader architecture. How good is it in real-world applications? Does it fulfill its promise? If the improved efficiency matches a conventional architecture with more pipes, they'll have managed the same performance in less space, giving room to provide AA. If the performance is only comparable with the same transistor count's worth of conventional shaders, the eDRAM will be providing AA at the cost of shader power.

Obviously, each system has novel components that are leveraged in its design. But ultimately it's the TOTAL embedded system design/architecture that is key, and the potential it provides for creating games...
 
I don't even see this issue as debatable.

Implementing AA has been a thorn in EVERYONE'S side for FAR too long.

There are VERY FEW console games that have any kind of AA, and no one likes turning on AA only to watch their framerate drop like a stone, or in some cases bring their GPU to its knees.

If the Xenos daughter die works as well as it's supposed to, I wouldn't be surprised to see PC GPUs in the not-too-distant future have them as well (or something very similar).

PC GPUs get faster and more powerful, and if you have a new GPU, you CAN play OLD games with essentially free AA.

But a new GPU + new games has ALWAYS equated to a far-more-than-trivial performance hit.

It is a blessing for gamers AND developers that, at least for the XBOX 360, AA implementation is no longer an issue. It will be in EVERY game, and it really will be as simple as "turning it on" for developers. No more deciding if implementing AA is really "worth it" relative to the performance hit.
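For a sense of what "turning it on" means in practice, here's the PC-side equivalent in Direct3D 9 (an illustration only; the Xbox 360 SDK interface differs and isn't shown here). On conventional GPUs this one flag is where the bandwidth cost comes from; the claim above is that Xenos' eDRAM absorbs it.

```c
/* Requesting a 4x multisampled back buffer in Direct3D 9. */
#include <d3d9.h>

void enable_4x_msaa(D3DPRESENT_PARAMETERS *pp)
{
    pp->MultiSampleType    = D3DMULTISAMPLE_4_SAMPLES; /* 4x MSAA */
    pp->MultiSampleQuality = 0;
    pp->SwapEffect         = D3DSWAPEFFECT_DISCARD;    /* required for MSAA */
}
```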

Anyone who believes that jaggies on an HDTV at 720p are "barely noticeable" is flat-out WRONG.
 
The issue is debatable because those 100 million transistors could've been used for something else. If Nvidia is to be believed, the performance hit at 720p is minimal ("practically free" on the G70, and we can all deduce why they released that information :p ), and so with those freed transistors they'd have a hell of a lot more shading power available.

If RSX doesn't perform AA well, it's a matter of complexity vs. image quality. If it does, then the transistors from the daughter die might've been better spent on the main chip.
 
Personally I'm glad the X360 has eDRAM and the PS3 doesn't, if only to entertain topics of discussion until next-next-gen! :p
 
Nicked said:
The issue is debatable because those 100 million transistors could've been used for something else. If Nvidia is to be believed, the performance hit at 720p is minimal ("practically free" on the G70, and we can all deduce why they released that information :p ), and so with those freed transistors they'd have a hell of a lot more shading power available.

If RSX doesn't perform AA well, it's a matter of complexity vs. image quality. If it does, then the transistors from the daughter die might've been better spent on the main chip.

AA is FAR from "free" for the G70. I'm sure you're thinking of those charts nVidia released, right?

Well, take a LOOK at the games they were using. Most of them were circa 2002-2004 (they even had the original Unreal Tournament on there :LOL: ).

Sure, if you're planning on playing ports of 2002-2004 PC games on your PS3, then yes..... AA will be "almost free."

But did you notice the performance hit that a newer game such as Doom 3 was taking (which can just BARELY be considered a "next-gen game")? It was taking a 17% hit at 1024x768 and a whopping 44% hit at 1600x1200.

In the context of NEXT GEN games for the PS3, AA will be FAR from free.

On top of that, it is only the uninformed among us who believe that the entire 100M transistors making up the daughter die are used for "just" AA. :LOL:
 