Alternative AA methods and their comparison with traditional MSAA

Doesn't seem to be any meaningful posting anymore.
No reason to let a good thread degenerate.
Closed for now.
 
One thing I'll just add:

Resource usage figures for games are an interesting metric, but ultimately there is very little meaningful information you can take away from them.
Extrapolating and saying 'ohh, they are only using 50%, that means their next game will do twice the stuff' is the worst form of speculation.

Three very simple examples of why this is the case:


You can't apply a single metric to total system performance / resource usage:


A developer says they are using 50% of a system's processing power in flops (which would actually be quite an achievement). That could mean they don't have enough parallelism, or they have hit the system memory bandwidth wall - or a host of other things. Chances are it's not a trivial problem at all. Sure, in theory it could be 2x faster...
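To make that concrete, here is a minimal roofline-style sketch (the peak numbers are made up, not any real console's) of why a flops-utilisation figure alone says nothing about headroom: a kernel with low arithmetic intensity hits the bandwidth ceiling long before the ALUs are busy.

```python
# Minimal roofline-style check; the peak numbers are illustrative, not any real console's.
PEAK_FLOPS = 200e9       # peak arithmetic throughput, FLOP/s (assumed)
PEAK_BANDWIDTH = 25e9    # peak memory bandwidth, bytes/s (assumed)

def attainable_flops(flops_per_byte):
    """Best-case FLOP/s for a kernel with the given arithmetic intensity."""
    return min(PEAK_FLOPS, PEAK_BANDWIDTH * flops_per_byte)

# A streaming kernel doing 4 FLOPs per byte moved:
ceiling = attainable_flops(4.0)
print(f"ceiling: {ceiling / PEAK_FLOPS:.0%} of peak FLOPS")   # -> 50% of peak

# A '50% FLOPS utilisation' figure here means the kernel is already saturating
# memory bandwidth; the idle ALUs are not usable headroom.
```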


The law of diminishing returns:


In a room full of elephants, nothing stands out. Smart developers target key performance bottlenecks first: the systems that will see the maximum gain for the least work. And usually, the smaller the gain, the harder the work. As soon as you are dealing with a few percent here or there, chances are it's not worth the time and money needed. The 80/20 rule, etc.
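A quick Amdahl's-law style calculation (the percentages are hypothetical) shows why the smaller the slice, the less even a heroic optimisation moves the whole frame:

```python
def overall_speedup(fraction, local_speedup):
    """Amdahl's law: whole-frame speedup when only `fraction` of the
    time is accelerated by `local_speedup`."""
    return 1.0 / ((1.0 - fraction) + fraction / local_speedup)

# Doubling the speed of a system that takes 40% of the frame: a real win.
print(overall_speedup(0.40, 2.0))   # ~1.25x overall
# Doubling the speed of something that takes 5% of the frame: barely measurable.
print(overall_speedup(0.05, 2.0))   # ~1.03x overall
```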


No game uses 100% of a system's resources 100% of the time:


This should be obvious, but people seem to miss it a lot. Uncharted 1 may well have only averaged 30% SPU activity (as a hypothetical example); however, that does not mean it didn't spike.
The way around this is to design your game so the worst case is bounded and you can safely run with very little headroom, e.g. you know you will only ever see four enemies up close.
I imagine this was why Uncharted 2 was so consistently high quality. The side effect is the player usually has less freedom to deviate from a scripted path, or simply sees less variation.
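As an illustration (with entirely invented numbers), averaging per-frame utilisation samples hides exactly the spikes you have to budget for:

```python
# Hypothetical per-frame SPU utilisation samples over a short stretch of play.
frames = [0.22, 0.25, 0.31, 0.28, 0.95, 0.30, 0.27, 0.88, 0.24, 0.30]

average = sum(frames) / len(frames)
peak = max(frames)

print(f"average: {average:.0%}, peak: {peak:.0%}")   # average: 40%, peak: 95%

# A '40% average' headline hides the frames that already run close to the
# ceiling; the worst frame, not the average, is what sets the budget.
```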


There is also the simple matter of doing things smarter: Mass Effect 2's characters use very similar texture budgets to the first game's, yet the perceptual difference is night and day. This is simply making better use of the available resources.
Art process maturity, if anything, is a far more significant reason why games look better the further you get into a console generation - not a sudden ability to run twice as much code.
 
The GoW3 and Metro implementations are entirely different. In truth, we know how Metro does its thing, but the specifics of GoW3 are largely unknown right now - unless anyone knows differently.
 
No game uses 100% of a system's resources 100% of the time
I think a percentage-use figure can be interesting when:
a) it clearly isolates the component it refers to (most likely the CPU)
b) it follows a reasonable interpretation of what counts as work. Time spent waiting for another component to deliver input data, even when (or rather especially when) it's a busy wait, should never be claimed as "used". That's overhead.

In the ideal case you'd count the time spent in code that contributes anything to i/o (graphics, audio, networking) and disregard the rest.
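A toy example of how much the definition matters (the timing breakdown below is invented): the naive figure counts a busy wait on the GPU as "used", the stricter one doesn't, and the headline number changes dramatically.

```python
# Hypothetical breakdown of one core's frame time, in milliseconds.
frame_ms = {
    "simulation":      6.0,   # real work
    "command_buffers": 3.0,   # real work (feeds the GPU)
    "busy_wait_gpu":   4.0,   # spinning until the GPU delivers data
    "idle":            3.7,
}

total = sum(frame_ms.values())

# Naive figure: anything that isn't idle counts as "used".
naive = (total - frame_ms["idle"]) / total
# Stricter figure: only time that contributes output counts; busy-waiting is overhead.
strict = (frame_ms["simulation"] + frame_ms["command_buffers"]) / total

print(f"naive: {naive:.0%}, strict: {strict:.0%}")   # naive: 78%, strict: 54%
```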

Using many-core processors efficiently is hard. That's what makes it so interesting to see what studios can achieve. But the figures need to follow some base standards to be even remotely comparable.
 
... not to mention the scale has been reversed. In software engineering, it is considered an achievement to use less resources to do the same thing. e.g., If GoW3 can use half the resources and look almost as good as UC2, that's a Good Thing (TM).
 
I found the implementation to be staggeringly good with a minimum of pixel-popping. Only on the Guitar Hero bit does it show some of the limitations of other implementations.
 
Just a strange little idea regarding MLAA.

Could you incorporate MLAA into the upscale process? You do have the edges in memory, so it should be possible to render the game at whatever resolution and scale it up with clean antialiased edges.
Yes, the edges would still jump around, but in theory it might look better than low-resolution MLAA followed by a basic upscale.
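For what it's worth, here is a rough sketch of that idea (not how GoW3 or any shipping title actually does it): keep the image crisp with a nearest-neighbour upscale, but where the MLAA edge detection flagged an edge, compute the smooth blend at the target resolution instead. `edge_aware_upscale` and the numpy implementation are purely illustrative.

```python
import numpy as np

def edge_aware_upscale(img, edge_mask, scale):
    """Upscale `img` (H x W x 3, float) by an integer `scale` factor.

    Non-edge texels are simply replicated (stays crisp); texels flagged in
    `edge_mask` (H x W, bool, e.g. from an MLAA edge-detection pass) are
    resampled bilinearly so the antialiased gradient is produced at the
    target resolution instead of being blurred up from the low one.
    """
    h, w, _ = img.shape
    out = np.repeat(np.repeat(img, scale, axis=0), scale, axis=1)  # nearest-neighbour

    # Map target-space pixel centres back into source space.
    ys = (np.arange(h * scale) + 0.5) / scale - 0.5
    xs = (np.arange(w * scale) + 0.5) / scale - 0.5
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 2)
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 2)
    fy = np.clip(ys - y0, 0.0, 1.0)[:, None, None]
    fx = np.clip(xs - x0, 0.0, 1.0)[None, :, None]

    bilinear = (img[y0][:, x0]           * (1 - fy) * (1 - fx)
                + img[y0][:, x0 + 1]     * (1 - fy) * fx
                + img[y0 + 1][:, x0]     * fy       * (1 - fx)
                + img[y0 + 1][:, x0 + 1] * fy       * fx)

    # Use the smooth resample only where the low-res pass found an edge.
    edge_hi = np.repeat(np.repeat(edge_mask, scale, axis=0), scale, axis=1)
    out[edge_hi] = bilinear[edge_hi]
    return out
```

Proper MLAA derives its blend weights from the edge shapes and could in principle evaluate them directly at the target resolution; the bilinear resample along flagged edges here is just a stand-in to keep the sketch short.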
 
... not to mention the scale has been reversed. In software engineering, it is considered an achievement to use less resources to do the same thing. e.g., If GoW3 can use half the resources and look almost as good as UC2, that's a Good Thing (TM).

To me it seems even better than Uncharted 2 :LOL:
 