Relatively speaking, yes - I don't think anybody is claiming that bandwidth is never important, nor that it's the sole factor in a graphics card's overall performance. But as the resolution increases, the relative performance hit from reduced bandwidth grows far more slowly than the pixel count does. I've run a couple more 3DMark06 tests to show this:
Code:
3DMark06 - Graphics Test 1: Return to Proxycon
(clocks listed as core / shader / memory, MHz)

640 x 480 - No AA / Trilinear             640 x 480 - 8x AA / Trilinear
576 / 1350 / 900 - 49.50 fps              576 / 1350 / 900 - 47.37 fps
576 / 1350 / 450 - 46.49 fps (-6.1%)      576 / 1350 / 450 - 37.16 fps (-21.6%)

1680 x 1050 - No AA / Trilinear           1680 x 1050 - 8x AA / Trilinear
576 / 1350 / 900 - 38.97 fps              576 / 1350 / 900 - 26.16 fps
576 / 1350 / 450 - 31.90 fps (-18.1%)     576 / 1350 / 450 - 18.05 fps (-31.0%)
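If anyone wants to check my arithmetic, here's a minimal Python sketch that recomputes those percentage drops from the raw frame rates - the fps values are just the ones in the table, nothing else is assumed:

Code:
# Relative performance drop from halving the memory clock (900 MHz -> 450 MHz),
# computed from the frame rates quoted in the table above.
results = {
    "640 x 480,   no AA": (49.50, 46.49),
    "640 x 480,   8x AA": (47.37, 37.16),
    "1680 x 1050, no AA": (38.97, 31.90),
    "1680 x 1050, 8x AA": (26.16, 18.05),
}

for setting, (full_bw_fps, half_bw_fps) in results.items():
    drop = (full_bw_fps - half_bw_fps) / full_bw_fps * 100.0
    print(f"{setting}: -{drop:.1f}%")   # matches the -6.1% / -21.6% / -18.1% / -31.0% figures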
Take the non-AA results first: increasing the resolution by a factor of 5.74 (307,200 to 1,764,000 pixels) only increases the performance hit from halving the bandwidth by a factor of 2.97 (a 6.1% drop becomes an 18.1% drop). The reason, as already pointed out earlier in this thread, is that the bandwidth requirement per pixel doesn't increase. The AA figures show the same thing: scale the resolution by a factor of 5.74 again, and the performance drop only scales by a factor of 1.43 (21.6% to 31.0%). Obviously more bandwidth is being used in total, but not per pixel, so there comes a point where the resolution is so high that differences in bandwidth between graphics cards matter far less than their other merits, such as the number and speed of the SPs, ROPs, etc. I wish I had a bigger monitor to look at this in more detail.
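The scale-factor comparison is just arithmetic on those numbers; here's a quick sketch of it (using the rounded percentages from the table, so the last factor comes out as roughly 1.4 rather than exactly 1.43):

Code:
# How the penalty for halving bandwidth scales compared with the resolution itself.
low_res_pixels  = 640 * 480      # 307,200
high_res_pixels = 1680 * 1050    # 1,764,000
print(f"Resolution scales by {high_res_pixels / low_res_pixels:.2f}x")   # 5.74x

# Performance drops (%) from halving the memory clock, as listed in the table.
drop_no_aa_low, drop_no_aa_high = 6.1, 18.1
drop_8xaa_low,  drop_8xaa_high  = 21.6, 31.0

print(f"No-AA penalty scales by {drop_no_aa_high / drop_no_aa_low:.2f}x")  # ~2.97x
print(f"8x-AA penalty scales by {drop_8xaa_high / drop_8xaa_low:.2f}x")    # ~1.4x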
Anyway, what Mintmaster was showing was that, in the games examined, the graphics card's performance was only significantly affected by its bandwidth for a percentage of the time that's probably a lot lower than people were expecting. Just like the amount of RAM or the number of TMUs, it's simply not the be-all and end-all it's often claimed to be. Which, funnily enough, is what I believe you were claiming and Mintmaster was showing!