I'm disappointed with this; it looks good, but then again I was expecting this.
http://www.tmalliance.com/uploads/024176/960.jpg
What's so special about that screen?
And then we need to subtract the "AA" from the AC5 image!
hopefully with NEXT gen, we'll have at least 16x AA in every game.
QFT.

Not going to happen and imo that's a good thing. Anything above 4x is a waste of processing power imo. I'd be perfectly happy with 1080p with 4x AA in next gen, and I'm not even expecting to actually get that.
Realtime AO calculations of some form should be possible for photorealistic lighting.
Ok, 23 posts on 2 pages must be a new record.
Reminder: This is now obviously labelled as a sensitive thread (God knows how AC6 turned out to be controversial). Anything that goes in here should be reconsidered at least once if it's too flamey, etc. There'll be no leniency.
P.S.: Changed topic title.
Modern PC GPUs are doing 16x AA, and with SLI or CrossFire, they're doing up to 32x AA.
That's because the high end and SLI/CrossFire setups have processing power to waste. PC games are built for low- to mid-range cards, so the higher-end cards are basically "too good" for them; that's why they can crank up the image quality, but it comes at a rather high price. In the console world development is somewhat different, with only one SKU. Unless the GPU manufacturers figure out some super efficient way to do AA, I don't think we'll see over 4x used even in next gen, because it would be a heavy tradeoff.
Modern GPUs have 384-bit and even 512-bit buses. It's effectively double that for SLI. I doubt we'll see more than 256-bit next gen.
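To put some rough numbers on that tradeoff (my own back-of-the-envelope math, nothing measured), here's how raw color + depth framebuffer traffic scales with the MSAA sample count if you assume a naive renderer with no compression or fast clears; real GPUs are much smarter than this, but the way the cost scales is the point:

[code]
# Back-of-the-envelope MSAA framebuffer traffic (rough math, not measured).
# Assumes 32bpp color + 32-bit depth/stencil written for every sample, every
# frame, with no compression -- real GPUs do better, but the scaling holds.

BYTES_COLOR = 4   # 32bpp color
BYTES_DEPTH = 4   # 24-bit depth + 8-bit stencil

def frame_traffic_mb(width, height, samples, overdraw=2.0):
    """Very rough per-frame color + depth traffic in MB at a given MSAA level."""
    per_sample = BYTES_COLOR + BYTES_DEPTH
    return width * height * samples * per_sample * overdraw / (1024 * 1024)

if __name__ == "__main__":
    for aa in (1, 2, 4, 8, 16):
        mb = frame_traffic_mb(1920, 1080, aa)
        gbps = mb * 60 / 1024  # bandwidth for this traffic alone at 60 fps
        print(f"{aa:>2}x AA: ~{mb:6.0f} MB/frame, ~{gbps:5.1f} GB/s at 60 fps")
[/code]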
I don't think anti-aliasing is a waste of processing power, so long as it doesn't suck up a large amount of GPU resources. The Xbox 360's Xenos eDRAM chip doing 4x AA at very low performance cost is pointing the way forward, IMO. I don't see why next-gen consoles can't do 16x AA for 720p games and 8x AA for 1080p games.

You can't forget the space needed either. If consoles still use eDRAM next gen (and I don't see why not), I'd say 64MB is the most they'll have. That's enough for 1080p, 32bpp, 4xAA. Devs can then forget about tiling. If they have to choose between 4xAA and 16xAA with tiling, what are they going to choose? If most console gamers today don't even care about 4xAA, why would they care about 16xAA over 4xAA? I most certainly don't, and I care about AA far more than a typical console gamer.
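As a quick sanity check on that 64MB figure (again my own back-of-the-envelope math, nothing official), if you assume color and depth are both stored per sample like on Xenos, 1080p with 4xAA just barely fits in 64MB, while 16xAA would need roughly four tiles:

[code]
# Sanity check on the eDRAM numbers above (rough arithmetic, not a spec).
# Assumes 32bpp color plus 32-bit depth/stencil stored for every MSAA sample,
# Xenos-style, so 8 bytes per sample.
import math

def framebuffer_mib(width, height, samples, bytes_per_sample=8):
    """Framebuffer footprint in MiB for the given resolution and MSAA level."""
    return width * height * samples * bytes_per_sample / (1024 ** 2)

def tiles_needed(width, height, samples, edram_mib):
    """How many tiles the frame must be split into to fit inside eDRAM."""
    return math.ceil(framebuffer_mib(width, height, samples) / edram_mib)

if __name__ == "__main__":
    for aa in (4, 16):
        size = framebuffer_mib(1920, 1080, aa)
        tiles = tiles_needed(1920, 1080, aa, 64)
        print(f"1080p {aa:>2}xAA: {size:5.1f} MiB -> {tiles} tile(s) with 64 MiB eDRAM")
[/code]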
4x AA is not enough to achieve a "CGI-ish" look, which is what next-gen consoles should be aiming for.

At 1080p I don't think you're right. Maybe when examining screenshots up close, but not while playing, and not for most people.