Extreme ROPs? Uttar -- what's a "beefier" ROP partition?
Sorry, I just couldn't help myself..
Transparency SS or adaptive AA might give excellent results quality-wise, but if your scene is overloaded with alpha-test textures, the gain over full-scene SSAA isn't all that much after all. The unfortunate thing about Coverage sampling is that you only get as many Transparency samples as Z/stencil samples are being stored.
Heh, I read Ail's post as "the unfortunate thing about Coverage sampling is that you get as many Transparency (color) samples as Z/stencil samples are being stored."
So that for 16xCSAA you would get 16 Transparency color samples taken for alphas.
Also, talk of G90 and so on seems to have missed one factor: if G80 turns out to be a dud in comparison with R600, what kind of turn-around is NVidia going to have to do?
It's looking pretty certain, now, that R600 will have ~50% more GFLOPs (345 versus >500). Am I meant to believe that GFLOPs will not be important in the D3D10 generation?
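For what it's worth, the "~50% more" claim is easy to sanity-check against the two figures quoted. A quick sketch, with the assumption that ">500" is treated as exactly 500 GFLOPs (so this is a lower bound on the rumored advantage):

```python
# Rumored shader throughput figures quoted in the post above.
g80_gflops = 345   # G80 (8800 GTX) estimate from the thread
r600_gflops = 500  # R600; ">500" taken as exactly 500 here

# Relative advantage of R600 over G80, as a percentage.
advantage = (r600_gflops - g80_gflops) / g80_gflops
print(f"R600 advantage: {advantage:.0%}")  # ~45%, i.e. roughly "50% more"
```

So the quoted numbers actually give about a 45% edge at the minimum; anything above 500 GFLOPs pushes it past 50%.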
16x -> 4 Z/stencil = 4x TMAA or 4x TSAA
Isn't 16x -> 4 color + 16 z/stencil !?
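The storage math you two are debating can be written out explicitly. A minimal sketch, assuming 16x CSAA keeps 4 color and 4 Z/stencil samples plus 16 coverage samples per pixel (the 4 Z/stencil reading from the post above; the coverage encoding isn't public, so the one-bit-per-coverage-sample figure and the byte counts are illustrative only -- swap in 16 Z/stencil samples to test the other reading):

```python
def fb_bytes_per_pixel(color_samples, zs_samples, coverage_samples=0,
                       color_bpp=4, zs_bpp=4, coverage_bits=1):
    """Rough framebuffer cost per pixel for an AA mode (illustrative only)."""
    return (color_samples * color_bpp
            + zs_samples * zs_bpp
            + coverage_samples * coverage_bits / 8)

# 16x MSAA: 16 full color + Z/stencil samples per pixel.
msaa16 = fb_bytes_per_pixel(color_samples=16, zs_samples=16)
# 16x CSAA as read above: 4 color + 4 Z/stencil + 16 coverage samples.
csaa16 = fb_bytes_per_pixel(color_samples=4, zs_samples=4, coverage_samples=16)

print(msaa16, csaa16)  # 128.0 vs 34.0 bytes/pixel
```

Whichever reading is right, the point of the scheme is that coverage samples are far cheaper to store than full color/Z samples, which is exactly why the Transparency sample count gets tied to the stored sample count.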
Ah, New Meat with an attitude. I predict a short and fiery tenure if you don't dial the latter back about a notch and a half.
I love how the thread started as someone asking about a midterm upgrade and has progressed to people talking out their a$$ about AA. As for the OP, and the discussion about future nVidia products, I just wanted to point out some things that people are apparently missing.

To start, the G80s (8800GTX/GTS) already support GDDR4. Secondly, people are missing the boat on this whole DX10.1 crap. M$ is working on a whole new architecture beyond DX, but they fubar'd it like they did with WinFS; supposedly both will come as updates to Vista in SP1 or SP2.

As for the R600, right now I don't think any aspect looks promising for AMD/ATI. With the new technologies coming out of Intel that they hold long-term patents on, and nVidia's completely new architecture coupled with CUDA, AMD/ATI has a lot of work ahead. Personally I think AMD is going to use ATI primarily to provide better in-house chipsets rather than focusing on high-end GPUs. But WHO knows. OK, I just gave all the ******s about 6 tangents they can run with... let's see where the thread goes from here.
He might have, had he not posted that shit 15 days after the thread was dead, and coupled it with rather... unstructured speculation of his own. PopinFRESH, consider yourself warned; if your second post is not less retarded than your first one, you're in for a lot of trouble, because that kind of posting style isn't welcome here, no matter if some of your points are partially valid.

I think he has some valid points... *shrug*