Range of graphics effects in console games *spawn

It's the only smart way to do AF. The PC way of brute-force "all or nothing" is just another case of the dumb waste of resources typical of PC development.

However, several devs do have customised AF, with different amounts applied depending on what is rendered.
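To make the idea concrete, here's a minimal sketch of per-surface AF selection. The surface names and levels are purely illustrative assumptions, not any real engine's scheme; the point is just that AF matters most on surfaces seen at grazing angles (floors, roads), so a console renderer can spend its filtering budget there and fall back to cheap filtering elsewhere.

```python
# Illustrative only: surface categories and AF levels are assumptions.
AF_BY_SURFACE = {
    "ground": 8,   # near-horizontal, heavily minified: AF pays off most
    "road": 8,
    "wall": 2,     # usually closer to screen-facing: little benefit
    "skybox": 1,   # effectively no anisotropy: plain bilinear/trilinear
}

def max_anisotropy(surface_type: str) -> int:
    """Pick a per-surface max AF level; default to cheap filtering."""
    return AF_BY_SURFACE.get(surface_type, 1)
```

In a real renderer this would map to per-material sampler state rather than a Python dict, but the budget-allocation idea is the same.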

On fixed hardware you have to be more serious with that.

I understand that it's for performance reasons. I just never thought the consoles were this limited when handling texture filtering; even with a smart way of doing it, it's still greatly limited!

Can barely believe it considering I've had 8-16xAF in every game I played since my ATI 9800 in early 2004. And the performance impact has been low. :cool:
 
What is it about AF that gives the consoles so much trouble? Because like neb said 8xAF has been very cheap on midrange graphics hardware for a while now.
 
Well... those are REALLY old GPUs... 2007 and older. Already looking at the most recent 8800GTS (which I own), you can see the impact is MUCH smaller than with the older cards.
 
Anyway, what is it about AF that gives console GPUs so much trouble? And what changed so that PC GPUs can do it so easily for the last 4 years?
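Part of the answer is that AF's cost is mostly extra texture fetches, which means bandwidth. A rough worst-case model (my simplification, not from the thread): each anisotropic probe is itself a full trilinear fetch, so 16x AF can touch up to 16 × 8 = 128 texels for a single fragment. Real GPUs adapt the probe count per pixel, so the average cost is far below this bound.

```python
def worst_case_texel_reads(max_af: int, trilinear: bool = True) -> int:
    """Upper bound on texels touched per fragment.

    Simplified model: each of the up-to-max_af anisotropic probes is
    a full bilinear (4-texel) or trilinear (8-texel) fetch. Hardware
    takes fewer probes on most pixels, so this is a worst case only.
    """
    texels_per_probe = 8 if trilinear else 4
    return max_af * texels_per_probe

worst_case_texel_reads(1)   # plain trilinear: 8 texels
worst_case_texel_reads(16)  # 16x AF with trilinear probes: 128 texels
```

On a bandwidth-starved GPU that 16x blow-up on the worst pixels is exactly where the frame time goes, which is why per-surface AF is attractive on consoles.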
 

Now that I think about it, ROPs and texture units? Both consoles got 8 ROPs.

The 7800GTX series had 16; the ATI 9800 Pro and 6600GT had 8 ROPs IIRC. Since then, efficiency has been improved and texture unit counts have increased with refined tech.
 
Bandwidth is also quite limited, especially compared to a modern PC graphics card.
 
Yeah, just checked it, my GTX460 has like 4 times the bandwidth of an X360's GPU. Even if I'd run Crysis 2 in 1080 (my monitor is only 1680*1050 so I won't) I'd still have nearly twice the bandwidth available so I could probably use both MSAA and AF without significant performance hits.
Of course it's also worth remembering that this card costs more than an X360, and it can't play games like Reach or Gears 3 for example ;) or KZ/UC for PS3 owners.

Now a 580's memory transfer rate is like 8 times as high as what the XGPU has...
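The figures above can be sanity-checked from bus width and effective memory data rate. The specific specs below are the commonly listed reference numbers (my assumption, not from the thread): Xenos's GDDR3 runs a 128-bit bus at 1400 MT/s, the GTX 460 1 GB a 256-bit bus at 3600 MT/s (the 768 MB variant is 192-bit, which is where the "like 4 times" figure lands), and the GTX 580 a 384-bit bus at 4008 MT/s.

```python
def bandwidth_gb_s(bus_bits: int, effective_mt_s: float) -> float:
    """Peak memory bandwidth = (bus width in bytes) * (transfers per second)."""
    return bus_bits / 8 * effective_mt_s / 1000  # MT/s in -> GB/s out

# Assumed reference specs, not quoted from the thread:
xenos  = bandwidth_gb_s(128, 1400)   # X360 GDDR3 -> 22.4 GB/s
gtx460 = bandwidth_gb_s(256, 3600)   # GTX 460 1 GB GDDR5 -> 115.2 GB/s
gtx580 = bandwidth_gb_s(384, 4008)   # GTX 580 GDDR5 -> ~192.4 GB/s
```

The 580/Xenos ratio comes out around 8.6x, which matches the "like 8 times" figure above.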
 
So if I understand correctly, bandwidth is the limiting factor. Might still be a tough nut to crack next gen on a 128bit bus.
 

Well we could see XDR2 in a console on a 128 bit bus to the GPU......

And in comparing cost of PC graphics to Xbox 360, it's amazing what $75 can get you, hell even $50. It's crazy to think that very easily, the OS can be the most expensive part of your machine outside of the main processor.
 
Am I the only one that thinks Uncharted 2's IQ is horrid?

It's a matter of perspective. If you're used to gaming on a PC with full-screen supersampling bullshot mode, it doesn't look so good. But for a PS3 game it is very clean.
 
Am I the only one that thinks Uncharted 2's IQ is horrid?
Horrid?! You do appreciate that on a sliding scale from 'catastrophic mess with no AA or filtering' to 'picture-perfect 64x supersampling quality', horrid places U2 near the bottom of the scale? And that's plainly not true. U2's IQ is definitely higher than average for console games. Perhaps far from perfect, but definitely even further from catastrophic.
 