I think we can only compare AF quality.
http://img717.imageshack.us/img717/2024/unigine2010030517220684.jpg
http://bbs.expreview.com/attachments/month_1003/10030521585c1936be60baceff.jpg
Texture quality is lower on the 5800, check the wall and stones on the bottom right side or the patch of land on the top left.
Almost looks like there's no AF on the 5800, OR it is running with SSAA, because there's a slight blur even on the shadows and cobblestones on the floor. Are you running with SSAA, fellix?
"Despite the fact that Nvidia will have a massive volume launch with thousands of cards"Serious or sarcastic? I'm having an argument with someone:
And I used the default AF setting to run the test, which is lower than the 16x AF seen in the GF100 shot.
"Why would GTX470 have considerably less bandwidth? Did I miss something? Is the memory bus severely crippled and clocked low?"
I don't know what NVidia was saying, but GTX470, with apparently considerably less bandwidth than HD5870, is faster.
A comparison with HD5850 seems better as both seem likely to have the same bandwidth. Here GTX470 is about 32% faster (29 versus 22 fps).
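Just to show where the 32% comes from (it's the plain fps ratio from the quoted numbers, nothing more):

```python
# Sanity check of the "~32% faster" figure from the 29 vs 22 fps numbers above.
gtx470_fps = 29
hd5850_fps = 22
speedup = gtx470_fps / hd5850_fps - 1
print(f"{speedup:.1%}")  # -> 31.8%
```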
"Why would GTX470 have considerably less bandwidth? Did I miss something? Is the memory bus severely crippled and clocked low?"
"Care to elaborate on what exactly it is you're doing, and what the respective performance numbers are...? *curious*"
I can't go into details unfortunately, but one building block that works very well is atomics, which are becoming increasingly important in compute workloads (including graphics). Local atomics (shared memory) in particular are *really* fast on Cypress, attaining rates like ~280 uncontended atomics/clock across the chip even on midrange hardware. This opens up a lot of algorithmic options and makes some fairly complex stuff viable.
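A quick back-of-envelope on what ~280 uncontended atomics/clock implies in absolute terms. The core clock is my assumption, not from the post: I'm plugging in 850 MHz, an HD5770-class midrange clock, purely for illustration:

```python
# Throughput implied by the quoted rate of ~280 uncontended local atomics per clock.
# Assumption (not stated in the post): an 850 MHz core clock (HD5770-class midrange part).
atomics_per_clock = 280        # figure quoted in the post
core_clock_hz = 850e6          # assumed clock
rate = atomics_per_clock * core_clock_hz
print(f"~{rate / 1e9:.0f} billion atomics/s")  # -> ~238 billion atomics/s
```

That order of magnitude is why building blocks like privatized histograms in local memory become viable.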
If it's one game, there's a good chance that a negative LOD bias was used. I've seen it from racing games before, especially ones destined for consoles. I don't know if NVidia and ATI handle AF with a negative bias the same way.
On a different note, if you want to see how bad the AF under-sampling on HD5000 hardware is, check out the TrackMania Nations game.
"Why would GTX470 have considerably less bandwidth? Did I miss something? Is the memory bus severely crippled and clocked low?"
Rumours suggest it's crippled with an 800MHz clock and a 320-bit width.
HD5850 (128GB/s) and GTX470 (~128GB/s) both have about the same bandwidth as HD4890 (124.8GB/s).
Secondly, why on earth would a comparison with HD5850 seem better? This isn't a bandwidth test. ATI could easily get more FPS with the same bandwidth if it made a bigger chip. A 4870 is never 80% faster than a 4850.
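Those bandwidth figures all fall out of the standard GDDR5 formula (4 data transfers per pin per memory-clock cycle, times bus width), using the clocks quoted or rumoured in this thread:

```python
def gddr5_bandwidth_gbs(mem_clock_mhz, bus_width_bits):
    """GDDR5 transfers 4 bits per pin per memory-clock cycle (quad data rate)."""
    return mem_clock_mhz * 1e6 * 4 * bus_width_bits / 8 / 1e9

print(f"HD5870:           {gddr5_bandwidth_gbs(1200, 256):.1f} GB/s")  # -> 153.6 GB/s
print(f"HD5850:           {gddr5_bandwidth_gbs(1000, 256):.1f} GB/s")  # -> 128.0 GB/s
print(f"GTX470 (rumour):  {gddr5_bandwidth_gbs(800, 320):.1f} GB/s")   # -> 128.0 GB/s
print(f"HD4890:           {gddr5_bandwidth_gbs(975, 256):.1f} GB/s")   # -> 124.8 GB/s
```

So the rumoured 800MHz on a 320-bit bus lands exactly on HD5850's number, well short of HD5870's.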
"Is it 320-bit for the GTX470?"
I believe the GDDR5 is supposedly clocked at 800MHz, compared to 1200MHz on the 5870?
"So what's the reason for using 800 MHz GDDR5? They couldn't clock the memory controller high enough? You'd think that with experience from GT215 they could at least get this right."
Power, I suspect. GDDR5 I/O on a GPU uses a fair bit of power. And the rumours suggest a low clock and commensurately low voltage.
Jawed
"Power, I suspect. GDDR5 I/O on a GPU uses a fair bit of power. And the rumours suggest low clock and commensurately low-voltage."
That's craziness. ATI is getting more bandwidth out of a 256-bit bus than NVidia does out of a 320-bit bus due to power constraints? Wow.
Does that mean the GTX480 is even more boned?
So the Unigine benches look legit, which brings us to that rumor from a while back concerning the GTX 470's price of $299. If it is true, then it's not that bad, although from a performance perspective, it seems to be poised for disappointment (still waiting for actual game benches with proper drivers to confirm that).
Good analysis, very insightful. Thanks, that fits pretty well with my thinking.
"Despite the fact that Nvidia will have a massive volume launch with thousands of cards"
It would be moronic to say something like that without being sarcastic ... yet he never once made any really critical note about the credibility of the release dates while pushing them onwards week by week on his site ... so why should I believe now that he's not just being a moron? That said, I think it's more about plausible deniability ... this way he can say he is pushing the message and still pretend to himself he is placing a critical note (which goes right over the heads of his remaining readership, I'd say). So basically he's chicken, and this is as far as he is willing to go.