Do you expect GFFX performance improvements with drivers?

Do you expect any significant GeForce FX performance improvements with future drivers?

  • No, I do not expect any performance improvements at all
  • No, I do not expect any significant performance improvements
  • Performance improvements be damned, just reduce the noise!

  • Total voters: 225

Reverend

Looking at the various previews with the benchmarks and image quality issues, what are your expectations of NVIDIA's NV30 driver team?
 
I think the question about performance improvements is not "if" but rather "when".

That is, performance improvements are certain, but when will they occur? Will the performance improve by 5%-10% in the next month? Will it take three months to do that?

Personally, I expect around 5%-10% performance improvements in specific game scenarios (not global improvement) within a month, and 20%-30% global improvement in high-detail scenarios within six months.
 
Surely the situations in which it chokes the most right now are memory bandwidth limited. Aren't such cases the *least* likely to benefit from driver optimisation?

I personally expect about a 10% increase generally and about 20% for VS performance. I really don't expect much improvement in the situations where it is really needed.

MuFu.
 
It would be totally unrealistic to not expect any performance improvements at all. History has shown that we can surely expect very significant improvements in some applications, and some not-so-significant improvements across most apps.
 
I would say there will definitely be some improvements, but I couldn't even begin to guess how much or in what areas.
 
3DMark will definitely see a great improvement, if nothing else :LOL:

I would not be surprised to see improvements in big titles though...

If anything, I think there will be more optimizations in games rather than improvements on the driver side of things.
 
I would think that there will be performance improvements; however, I don't think they'll be quite as significant as they may have been in the past (i.e. with prior products).

The reason why I think this may be the case is that I'm sure the driver team has been under extreme pressure to provide the most performance they could in the initial set of drivers released to the reviewing media.

As NVIDIA never had much of a performance benchmark to meet in the past, they would've balanced prior cards' initial drivers much more evenly between best performance and driver stability.

My guess is that the driver probs we've seen in current reviews are due to the team focusing on performance rather than quality.

Pure speculation of course! ;) Nvidia, I hope you prove me wrong!
 
I'm not sure about 20% better. Has anyone actually gone back and checked how much the performance of the GF3, R8500, etc. increased over their lifetimes?
 
Yea, I think NVIDIA can pull off 20% and maybe even more within 6 months.
I wish I could vote for (1) and (4) together though ;) I voted for (1) as I don't think they will ever get that noise level down to acceptable levels within the same time period I mentioned above.
 
Nagorak said:
I'm not sure about 20% better. Has anyone actually gone back and checked how much the performance of the GF3, R8500, etc. increased over their lifetimes?

Over the months and years, the computer hardware around the graphics card improves, so you really can't say how much a certain card's speed has improved on its own.

What you would have to do is take one system, put a card in it (GF3, 8500, etc.), install the shipping drivers the card came with, and then install newer drivers all the way up to the latest set, testing each one to see how much the drivers alone have improved speed.
 
Brent said:
Nagorak said:
I'm not sure about 20% better. Has anyone actually gone back and checked how much the performance of the GF3, R8500, etc. increased over their lifetimes?

Over the months and years, the computer hardware around the graphics card improves, so you really can't say how much a certain card's speed has improved on its own.

What you would have to do is take one system, put a card in it (GF3, 8500, etc.), install the shipping drivers the card came with, and then install newer drivers all the way up to the latest set, testing each one to see how much the drivers alone have improved speed.

Don't Rivastation and Guru3D do something similar to what you mention, Brent?

edit: Here is one such comparison from Guru3D comparing the detonators and performance with a GF4 Ti 4600.

http://www.guru3d.com/detonator-dbase-xp/
 
Brent said:
Nagorak said:
I'm not sure about 20% better. Has anyone actually gone back and checked how much the performance of the GF3, R8500, etc. increased over their lifetimes?

Over the months and years, the computer hardware around the graphics card improves, so you really can't say how much a certain card's speed has improved on its own.

What you would have to do is take one system, put a card in it (GF3, 8500, etc.), install the shipping drivers the card came with, and then install newer drivers all the way up to the latest set, testing each one to see how much the drivers alone have improved speed.

Was that you volunteering... :LOL:
 
After reviewing that article from Guru3D, which looked only at UT2003, Quake3, and 3DMark (can't blame them - they had tons of drivers to go through), I would like to revote... I now expect incremental performance increases over the next 6 months of about 5% cumulative. :oops: :oops: :oops:
 
I don't expect any real performance increases in DX7 or DX8 based games. There is a strong possibility of some largish improvements using anisotropic filtering, but I expect that to be accompanied by some caveats regarding quality. DX9, I don't know. To the extent the card works like a GF4 for basic rendering, I think performance is pretty well tapped out already by way of drivers. The possibility of performance increases by way of the aniso optimization that Unwinder (the RivaTuner creator) exposed for GF3 and GF4 hardware is the big question mark imo.
 
MuFu said:
Surely the situations in which it chokes the most right now are memory bandwidth limited.

I'm not so sure about that. Specifically, if you look at the relative hit the GFfx takes for enabling 4xMSAA (which should be a pure bandwidth hit), it's not much different from the hit the 9700 takes for enabling 4xMSAA: probably slightly larger overall, but actually smaller in a number of cases. The GFfx is usually taking a larger hit from the combination of AA and AF, but surprisingly it appears AF could be contributing just as much or more. (It also depends crucially on whether the comparison is with R300's performance AF or quality AF.)

A lot of credit is evidently due to GFfx's framebuffer compression which, if I understand it correctly, is hardwired for 4:1 compression (i.e. either all 4 subsamples are the same or they're not, with per-pixel flags specifying which is the case??), as opposed to a more flexible implementation in R300? (Sorry for the question marks; there was a thread here earlier to this effect, but I didn't get an entirely clear picture based on what was discussed there.)
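
For concreteness, here's a minimal sketch of what that kind of flag-based 4:1 colour compression could look like under the all-subsamples-equal reading above - the names and data layout are illustrative guesswork on my part, not anything from NVIDIA:

```python
# Hypothetical sketch of flag-based 4:1 MSAA colour compression.
# Assumption: each pixel stores either one colour (all 4 subsamples
# identical, as on triangle interiors) or all 4 subsamples (edges),
# with a per-pixel flag recording which case applies.

def compress_pixels(pixels):
    """pixels: iterable of 4-tuples of subsample colours.
    Returns (flags, payload)."""
    flags, payload = [], []
    for subsamples in pixels:
        if all(s == subsamples[0] for s in subsamples):
            flags.append(1)               # compressed: one write instead of four
            payload.append(subsamples[0])
        else:
            flags.append(0)               # uncompressed: store every subsample
            payload.extend(subsamples)
    return flags, payload
```

Since interior pixels vastly outnumber edge pixels in a typical scene, almost every pixel would take the compressed path, which is how 4xMSAA could end up costing far less than 4x the bandwidth.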

But it's still a very notable result, considering the bandwidth/fillrate ratio is barely half that of R300. Not to mention Anand's sketchy but intriguing comment that OC'ing the memory barely increased performance.
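
To put numbers on the "barely half" claim, here's a back-of-envelope check using the commonly quoted launch specs (my figures, not from this thread): GFFX 5800 Ultra at 500 MHz with 8 pipes and 128-bit DDR at 1 GHz effective, versus 9700 Pro at 325 MHz with 8 pipes and 256-bit DDR at 620 MHz effective:

```python
# Back-of-envelope bandwidth-per-pixel comparison. The clock/pipe/bus
# figures are the commonly quoted launch specs, assumed here.

def bytes_per_pixel(core_mhz, pipes, bus_bits, mem_mhz_eff):
    fillrate = core_mhz * 1e6 * pipes                  # pixels/second
    bandwidth = (bus_bits / 8) * mem_mhz_eff * 1e6     # bytes/second
    return bandwidth / fillrate

gffx = bytes_per_pixel(500, 8, 128, 1000)  # ~4.0 bytes/pixel
r300 = bytes_per_pixel(325, 8, 256, 620)   # ~7.6 bytes/pixel
print(gffx / r300)                          # ~0.52, i.e. barely half
```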

IMO, people seem to be concentrating their (non-FXFlow related) GFfx complaints on AA/AF performance being even with, or slightly worse than, the 9700 Pro's. But this should have been expected based on the raw bandwidth difference. The large negative surprise, to me at least, is that performance with no AA/AF is only slightly better than the 9700 Pro's. No one seems to talk about this much, presumably because the GFfx is still in front and because these scores are rightly considered less important for the immediate end-user experience.

But these only-slightly-better "plain" scores are the big performance disappointment of the GFfx in its current state, and it is entirely likely (although by no means certain) that driver improvements could help out a lot here. Whether those improvements will trickle down to AA/AF performance is another question, of course.
 
Tahir said:
After reviewing that article from Guru3D, which looked only at UT2003, Quake3, and 3DMark (can't blame them - they had tons of drivers to go through), I would like to revote... I now expect incremental performance increases over the next 6 months of about 5% cumulative. :oops: :oops: :oops:

Yeah, maybe drivers are a little bit overrated in terms of their impact. Although does this review include the pre-shipping drivers? I'd say the biggest increase is probably from the buggy first build to one that's actually available when the card is on the shelf.

Also kind of funny how 3DMark increases so much compared to everything else. But then again, everyone optimizes for it, so it's all good.
 
Where it counts - with FSAA and anisotropic performance - I wouldn't expect much; it's pretty much all hardware there.
 
Himself said:
Where it counts - with FSAA and anisotropic performance - I wouldn't expect much; it's pretty much all hardware there.

Couldn't aniso see an improvement via drivers if, say, they improve the algorithm that determines when to apply aniso? Something like that.

And with AA, I would think you could get improvements via drivers as well.

I think we saw this with the 9700 Pro - didn't ATI come out with some drivers a while back that improved AA in some games?
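
Something along those lines is plausible in principle: the degree of anisotropic filtering can be chosen per pixel from the texture footprint, so a driver that tuned that heuristic could cut sample counts where high aniso buys nothing. A minimal sketch of the idea (the names and inputs are illustrative, not NVIDIA's or ATI's actual algorithm):

```python
import math

# Hypothetical per-pixel anisotropy selection: the sample count follows
# the elongation of the pixel's footprint in texel space, capped at the
# user-selected maximum. Inputs are texture-coordinate derivatives.

def aniso_degree(du_dx, dv_dx, du_dy, dv_dy, max_degree=8):
    ax = math.hypot(du_dx, dv_dx)    # footprint extent along screen x
    ay = math.hypot(du_dy, dv_dy)    # footprint extent along screen y
    major = max(ax, ay)
    minor = max(min(ax, ay), 1e-8)   # avoid division by zero
    # Nearly-square footprints (ratio ~1) need no extra probes; only
    # strongly elongated ones (oblique surfaces) get the full degree.
    return min(max_degree, max(1, math.ceil(major / minor)))
```

Making that heuristic less conservative is exactly the kind of change that could show up as "free" aniso performance in a driver update, usually with the quality caveats mentioned earlier in the thread.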
 
Tahir said:
Don't Rivastation and Guru3D do something similar to what you mention, Brent?

edit: Here is one such comparison from Guru3D comparing the detonators and performance with a GF4 Ti 4600.

http://www.guru3d.com/detonator-dbase-xp/
Only problem is, a comparison like that isn't particularly informative. One with drivers throughout the lifecycle of, say, the TNT, GeForce DDR, or GeForce3 would be far more telling. The drivers were already reasonably mature at the release of the GeForce4 (since it was a refresh part).
 