NVIDIA G92: Pre-review bits and pieces

OK, just tell us why, although R600 has more bandwidth and more ALU power than G80, it's beaten again and again.
To me that means a bad balance of priorities. Do you think something else, and where/how do you think it can be improved?
It does really well when there's no AA. I think the problem is pretty obvious.
 
Do you think the R600 will ever be able to stretch its legs?
Well, as the compiler improves, sure, it will help in ALU-bound apps. But, I don't think you're seeing that many strictly ALU-bound apps once you turn on AA with the R600. CFAA's performance hit is too high for this to have been the plan all along.
 
It does really well when there's no AA. I think the problem is pretty obvious.
Well, to be more specific, no AA and no AF, which is a rather useless test. 1280x1024 w/ 16xAF has more detail than 1600x1200 w/o AF for nearly every scene, but is usually faster.

Here's some recent comparisons:
http://www.xbitlabs.com/articles/video/display/dx10-part2.html
http://www.xbitlabs.com/articles/video/display/enemy-territory-performance.html

(I don't know any other sites that do NoAA/16xAF testing. This site also disables filtering optimizations, which probably penalizes ATI a bit more than G80 for a negligible IQ boost.)

Most of the time, even without AA R600 is trailing the GTS. In a few games, you're right that AA has a bigger hit on R600 than G80, but its problems are deeper than that.
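
To put rough numbers on why the lower resolution plus AF tends to win, here is a minimal arithmetic sketch (plain Python; the comment about AF cost is a rule of thumb, not a measurement from any of the linked reviews):

```python
# Pixel-count arithmetic for the two display modes being compared above.
lo_res = 1280 * 1024   # ~1.31 Mpixels, run with 16xAF
hi_res = 1600 * 1200   # ~1.92 Mpixels, run without AF
print(f"{hi_res / lo_res:.2f}x more pixels to shade and fill")  # ~1.46x

# 16xAF only adds texture samples on surfaces at oblique angles, so its cost
# is usually well below the ~46% of extra per-pixel work the higher
# resolution forces on every frame; hence lower-res + 16xAF can look better
# and still run faster.
```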
 
Well, to be more specific, no AA and no AF, which is a rather useless test. 1280x1024 w/ 16xAF has more detail than 1600x1200 w/o AF for nearly every scene, but is usually faster.

Here's some recent comparisons:
http://www.xbitlabs.com/articles/video/display/dx10-part2.html
http://www.xbitlabs.com/articles/video/display/enemy-territory-performance.html

OK then, and what about the 4xAA and noAF case? Does anyone do that kind of test? Not that it's very lifelike, but it would be nice to see more clearly where the problem lies.
Anyway computerbase.de does tests without AA and AF, although it's in German. ;)
 
OK then, and what about the 4xAA and noAF case? Does anyone do that kind of test? Not that it's very lifelike, but it would be nice to see more clearly where the problem lies.
Anyway computerbase.de does tests without AA and AF, although it's in German. ;)

AA in itself is somewhat broken, considering that in some scenarios the performance drop is catastrophic (as in greater than the drop experienced by X19xx cards, for example). This happens with plain box filtering, without even applying the (fairly useless) tent filters or the (currently much too expensive, performance-wise) edge-detect mode. AF works as it should, given the number of RBE units in R600: the performance hit is a pinch smaller than it was on the R5xx generation, but it's significantly inferior to the behemoth that G80 is when it comes to texturing capability.
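
For anyone unfamiliar with the resolve modes being contrasted here, below is a minimal illustrative sketch in plain Python (not actual hardware, driver or shader code; the sample layout, weights and tent radius are invented for the example) of the difference between a box resolve and a wider tent-style resolve that also pulls in samples from neighbouring pixels:

```python
# Minimal, hypothetical sketch of two MSAA resolve filters. Sample positions,
# colours and the tent radius below are invented purely for illustration.

def box_resolve(samples):
    """Classic box resolve: average only the samples inside this pixel."""
    return sum(samples) / len(samples)

def tent_resolve(own_samples, neighbour_samples, radius=1.5):
    """Wide tent resolve: also weight in samples from neighbouring pixels,
    with weight falling off linearly with distance from the pixel centre."""
    weighted, total_w = 0.0, 0.0
    for colour, dist in own_samples + neighbour_samples:
        w = max(0.0, 1.0 - dist / radius)  # linear "tent" falloff
        weighted += w * colour
        total_w += w
    return weighted / total_w

# Toy greyscale samples as (value, distance-from-pixel-centre) pairs.
own = [(0.8, 0.25), (0.9, 0.35), (0.7, 0.25), (0.85, 0.35)]
near = [(0.2, 0.9), (0.3, 1.1), (0.25, 1.0), (0.1, 1.2)]

print(box_resolve([c for c, _ in own]))  # 4 samples touched per output pixel
print(tent_resolve(own, near))           # 8 samples touched: more work per pixel
```

The only point of the sketch is that the wider filters touch and weight more samples per output pixel than a plain box resolve, which fits with the extra performance hit being described.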

Probably the first thing to be really math-bound will be Crysis; they seem to be doing a metric ton of work per pixel in their higher-quality modes through DX10. The trouble is that the currently available 7.10 drivers seem fairly weak in those scenarios, and a later version that I'm using, whilst improving performance, produces rendering artifacts.

So even though in theory it could be a battlefield that R600 should like (I don't think a lot of people are ready to accept the performance drop associated with enabling AA in this game, so no-AA testing suddenly becomes probable and relevant), it's again hamstrung by its drivers. IMHO, even though the one-release-per-month scheme was a good idea at one point in time, relaxing the schedule and adding hotfixes/betas/whatever, like nV does with drivers released for every major title, would be better for them. Especially given that R600 drivers still seem to be less than stellar when it comes to DX10 rendering, so many months after the initial release.
 
LOL, just got my 8800GT (Leadtek) and this is indeed on the box: "10-35% faster than R670". Look at the attachment.
 

Attachment: S1.jpg
WTF are they smoking?


They aren't smokin' ;). Those two games favor nV's GPUs as it is, and then you look at current benchmarks of current cards. Well, can we really expect a huge change in RV6xx architecture performance without any major changes to the GPU?
 
The sad/funny part (apart from the silly scale of the graph) is that many people won't even know what an RV670 is!

They aren't smokin' ;). Those two games favor nV's GPUs as it is, and then you look at current benchmarks of current cards. Well, can we really expect a huge change in RV6xx architecture performance without any major changes to the GPU?

My "what are they smoking" comment is in reference to the above. I'd be surprised if more than 1% of the target audience even knows what RV670 is.
 
Like I said earlier, if it is 10-35% faster according to the PR, expect a much closer race in reality. :D
 
So Jawed, how much longer until we can drop the "it's all drivers" and "the focus is on stability" explanations for R600 performance abnormalities? 2009? 2010? I mean, it's not like it's reached the end of its f-ing life or anything at this point.
I guess not being desperate to conclude that R600 is bug-ridden (round here it seems fashionable to conclude it is buggy) marks me out as an apologist for it.

Come up with some evidence for bugs.

Jawed
 
OK, just tell us why, although R600 has more bandwidth and more ALU power than G80, it's beaten again and again.
To me that means a bad balance of priorities. Do you think something else, and where/how do you think it can be improved?
Fillrate and texel rate. I'm losing track of the number of times I've posted this.

I agree it's a bad balance of priorities. Which will only be "fixed" with the 2x chips and the hell that is CrossFire.

The more I look at R600, the more I see something that would never, under any circumstances, have been more than 16 TUs and 16 RBEs. This "4 quads" configuration has been around since 2004 and appears to be as far as ATI wants to go, before going multi-chip.

Jawed
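
As a rough back-of-envelope illustration of the fillrate/texel-rate point (plain Python; the unit counts, clocks and bandwidth below are the commonly published approximate specs for the HD 2900 XT and the 8800 GTX, not figures from this thread):

```python
# Back-of-envelope peak-rate comparison. Unit counts, clocks and bandwidth
# are the commonly published (approximate) specs; output is illustrative only.

def rates(name, rops, tex_units, core_mhz, mem_gbps):
    pixel_fill = rops * core_mhz / 1000.0       # Gpixels/s of colour fill
    texel_rate = tex_units * core_mhz / 1000.0  # Gtexels/s, bilinear
    print(f"{name:8s} fill ~{pixel_fill:4.1f} Gpix/s, "
          f"texels ~{texel_rate:4.1f} Gtex/s, bandwidth ~{mem_gbps:.0f} GB/s")

rates("R600 XT", rops=16, tex_units=16, core_mhz=742, mem_gbps=106)  # HD 2900 XT
rates("G80 GTX", rops=24, tex_units=32, core_mhz=575, mem_gbps=86)   # 8800 GTX (texture address units)

# Despite R600's bandwidth (and ALU) advantage, G80 leads on both pixel fill
# and texel rate: the imbalance of priorities described above.
```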
 
I guess not being desperate to conclude that R600 is bug-ridden (round here it seems fashionable to conclude it is buggy) marks me out as an apologist for it.

Come up with some evidence for bugs.

Jawed

I thought my question was fairly straightforward. The product has reached EOL, and the "driver bugs" have still not been fixed. At what point will that stop being a valid excuse? Just give me a date.
 
This is my take on his words...

Many say the potential of the HD2900XT/X is held back by drivers. The HD2900XT is end-of-lifed. It is no longer being made. It is no longer a product. To me that implies the "driver bugs" are really hardware bugs. Thus, the HD2900XT has reached its full potential. There is no magical driver that can polish the HD2900XT. If you put lipstick and a dress on a pig, it's still a pig. The sooner they stop deluding themselves, the better off the discussions will be. Can we please stop with the AMD excuses?
 