> they dont have any real performance?

A bit overstretched, isn't it?
> Gee what a surprise, ATI pinning their hopes on some useless bullet point because they dont have any real performance?

Would you like having the capability of having fully compatible MSAA in all games?
> Would you like having the capability of having fully compatible MSAA in all games?

I would. Will Radeons allow for AA in Stalker and all the other older games that don't support AA, or are you just teasing us here?
> Prove it. They stated in the conference call the margins on G92 are higher than the corporate average, likely an amazing 55%. [ Source Reference ]

So call me a skeptic. G92 looks to be quite a bit larger than RV670, and it's not rocket science to think that G92 costs more per die than RV670 on the area difference alone. Hence, nvidia won't want to compete at the same price point as RV670, because it would make less money than AMD.
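To put rough numbers on the area argument: with the commonly cited die sizes of ~324 mm² for G92 and ~192 mm² for RV670, a standard dies-per-wafer estimate plus a simple Poisson yield model suggests the larger die costs roughly twice as much to produce. This is a back-of-the-envelope sketch, not from either post; the wafer cost and defect density are placeholders, so only the ratio between the two results means anything.

```cpp
// Back-of-the-envelope die-cost comparison. Die areas are commonly cited
// figures for G92 (~324 mm^2) and RV670 (~192 mm^2); wafer cost and defect
// density are placeholders, so only the ratio of the results is meaningful.
#include <cmath>
#include <cstdio>

int main() {
    const double kPi           = 3.14159265358979;
    const double waferDiameter = 300.0;   // mm
    const double waferCost     = 5000.0;  // USD, placeholder
    const double defectDensity = 0.002;   // defects per mm^2, placeholder
    const double areas[2]      = {324.0, 192.0};
    const char*  names[2]      = {"G92", "RV670"};

    const double r = waferDiameter / 2.0;
    for (int i = 0; i < 2; ++i) {
        const double a = areas[i];
        // Standard dies-per-wafer approximation: gross area minus edge loss.
        const double dies  = kPi * r * r / a
                           - kPi * waferDiameter / std::sqrt(2.0 * a);
        // Poisson yield: the chance a die of area a has zero defects.
        const double yield = std::exp(-defectDensity * a);
        const double cost  = waferCost / (dies * yield);
        std::printf("%-5s: %4.0f dies, %4.1f%% yield, ~$%.0f per good die\n",
                    names[i], dies, 100.0 * yield, cost);
    }
}
```

With these placeholder numbers the G92-sized die comes out at roughly 2.3x the per-good-die cost of the RV670-sized one. The different process nodes (65 nm vs 55 nm), actual wafer pricing, and binning could easily move that ratio, but it shows why the per-die-cost argument is at least plausible alongside the quoted margin figure.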
> I would. Will Radeons allow for AA in Stalker and all the other older games that don't support AA, or are you just teasing us here?

It's down to the developer to support it, but indeed, titles that use deferred shading (which includes STALKER, UT3, Gears of War, R6V, etc.) are incompatible with MSAA. This has left IHVs sometimes trying to hack the renderer in order to support AA, with the potential side effects of IQ issues and lower performance, because you don't necessarily know which buffers need AA and which don't.
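To see why those hacks are fragile, consider what "just force MSAA on" means for a deferred renderer: lighting is a non-linear function of the G-buffer, so lighting an averaged (resolved) G-buffer is not the same as averaging the lit samples. A toy sketch of one 4x edge pixel, in plain C++ and not based on any real renderer:

```cpp
// Toy demonstration: lighting a resolved (averaged) G-buffer is not the same
// as resolving lit samples, because lighting is non-linear. One 4x MSAA edge
// pixel: two samples face the light, two face away.
#include <algorithm>
#include <cstdio>

struct Vec3 { float x, y, z; };

float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Simple diffuse term; the clamp to zero is the non-linearity.
float shade(const Vec3& normal, const Vec3& lightDir) {
    return std::max(0.0f, dot(normal, lightDir));
}

int main() {
    const Vec3 lightDir = {0.0f, 0.0f, 1.0f};
    const Vec3 gbufferNormals[4] = {   // per-sample normals in one edge pixel
        {0.0f, 0.0f,  1.0f}, {0.0f, 0.0f,  1.0f},  // foreground, facing light
        {0.0f, 0.0f, -1.0f}, {0.0f, 0.0f, -1.0f},  // background, facing away
    };

    // Light each sample, then resolve (what per-sample access enables).
    float perSample = 0.0f;
    for (const Vec3& n : gbufferNormals) perSample += shade(n, lightDir);
    perSample /= 4.0f;

    // Resolve the G-buffer first, then light once (naive forced MSAA).
    Vec3 avg = {0.0f, 0.0f, 0.0f};
    for (const Vec3& n : gbufferNormals) {
        avg.x += n.x / 4.0f; avg.y += n.y / 4.0f; avg.z += n.z / 4.0f;
    }
    const float resolvedFirst = shade(avg, lightDir);

    std::printf("light then resolve: %.2f (smooth half-lit edge)\n", perSample);
    std::printf("resolve then light: %.2f (edge information lost)\n", resolvedFirst);
}
```

The per-sample path produces the expected half-lit edge (0.50); the resolve-first path averages the normals to a zero vector and lights the pixel to black, which is exactly the kind of IQ issue a driver-side hack risks when it guesses which buffers to touch.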
Yeah Spyhawk, but what happens here? http://www.iax-tech.com/video/3870/38706.htm
I hope Catalyst 7.11 will perform better here. Even if it is a synthetic benchmark, it shows that something is wrong with performance when it comes to lights, and this IMHO is a bottleneck for all new games, for example Crysis or COD4.
> Will Radeons allow for AA in Stalker and all the other older games that don't support AA, or are you just teasing us here?

:smile: I think he meant that when:
- DX10.1 unifies the MSAA landscape, AND
- NVidia comes out with parts that support DX10.1, AND
- these capabilities are widespread enough for devs to start supporting them, AND
- hardware can handle the load it brings (take a look at HardOCP's Gears of War performance with 4xAA),

then you'll "have fully compatible MSAA in all games", which should be about when R800 is ready to ship. But with HD3800 you "have the capability" now.
DX10.1 gives the developer control of the MSAA buffers, so titles that use deferred shading could still enable AA in the future from a DX10.1-enabled application.
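As a concrete illustration of that control: Direct3D 10.1 lets an application create a multisampled depth buffer that is also bindable as a shader resource, a combination D3D10.0 refused, so a deferred lighting pass can read individual depth/G-buffer samples and resolve after lighting. A minimal resource-setup sketch follows; the helper name and parameters are illustrative and error handling is omitted.

```cpp
// Minimal D3D10.1 sketch: a 4x MSAA depth buffer that the lighting pass can
// also read as a Texture2DMS shader resource. Error handling omitted;
// 'device' is assumed to be a valid ID3D10Device1*.
#include <d3d10_1.h>

void CreateMsaaDepth(ID3D10Device1* device, UINT width, UINT height,
                     ID3D10DepthStencilView** dsv,
                     ID3D10ShaderResourceView1** srv) {
    D3D10_TEXTURE2D_DESC td = {};
    td.Width            = width;
    td.Height           = height;
    td.MipLevels        = 1;
    td.ArraySize        = 1;
    td.Format           = DXGI_FORMAT_R24G8_TYPELESS;  // typeless: both views work
    td.SampleDesc.Count = 4;                           // 4x MSAA
    td.Usage            = D3D10_USAGE_DEFAULT;
    td.BindFlags        = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;

    ID3D10Texture2D* tex = nullptr;
    device->CreateTexture2D(&td, nullptr, &tex);

    // Depth-stencil view for the geometry pass.
    D3D10_DEPTH_STENCIL_VIEW_DESC dd = {};
    dd.Format        = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dd.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    device->CreateDepthStencilView(tex, &dd, dsv);

    // Shader resource view for the lighting pass: per-sample depth reads.
    D3D10_SHADER_RESOURCE_VIEW_DESC1 sd = {};
    sd.Format        = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    sd.ViewDimension = D3D10_1_SRV_DIMENSION_TEXTURE2DMS;
    device->CreateShaderResourceView1(tex, &sd, srv);

    tex->Release();  // the views hold their own references
}
```

On the HLSL side the same buffer would be declared as Texture2DMS<float, 4> and read with Load(coord, sampleIndex), so each sample can be lit before the final resolve.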
> hardware can handle the load it brings (take a look at HardOCP's Gears of War performance with 4xAA)

Ah, Gears runs fine on my 8800GT with AA, and I've heard other Nvidia users as well as ATI users say the same. Not sure what was up with [H]'s benchmarks.
World in Conflict, DX10, 1280x1024: HD3870=17fps, 8800GT=32fps.
What does DX10.1 add vs. DX10 that makes such a big difference? I was under the impression DX10 had everything necessary for MSAA with deferred shading. Sure, DX10.1 makes it slightly more convenient, but I hadn't noticed anything that makes a possible/impossible kind of difference, or even a significant performance difference.