Correlation of attributes with Framerate Scores

Nite_Hawk

Hi Guys,

Well, since we have a nice clean forum, I figured I'd post a new topic to try to get things started on a good track. :) Lately I've been working with the data set I collected from different reviews off the web (I have about 3600 framerate scores from about 40 different reviews for various cards/CPUs/configurations), and I'm working on determining which attributes are most correlated with the framerate.

What I have found interesting so far is that you have the best chance of classifying a score based on the resolution the game is run at, the antialiasing level, and the anisotropic filtering level. Other things like GPU speed and CPU speed are often somewhat correlated, but so far the quality settings are better at determining the score than the hardware being used. This is interesting, because it indicates that at least for the cards in my dataset (Ti 4200s up to 9800 Pros), quality settings are still the deciding factor in determining how fast a game will play.
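
For the curious, a minimal sketch of this kind of per-attribute test, assuming the scores sit in a CSV (the file and column names here are hypothetical stand-ins for my real data set):

```python
# Per-attribute Pearson correlation with FPS; framerate_scores.csv and the
# column names (resolution_pixels, aa_level, af_level, gpu_mhz, cpu_mhz, fps)
# are hypothetical.
import pandas as pd

df = pd.read_csv("framerate_scores.csv")

for attr in ["resolution_pixels", "aa_level", "af_level", "gpu_mhz", "cpu_mhz"]:
    r = df[attr].corr(df["fps"])  # Pearson r between the attribute and FPS
    print(f"{attr}: r = {r:+.3f}")
```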

Right now I'm going through and analyzing two-attribute situations, to see if there is a co-dependence of variables that might be skewing the results. For instance, newer video cards are likely to have newer drivers, so is it the drivers or the new video card that causes the changes in the score?
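
One rough way to check this sort of thing (a sketch only, with hypothetical column names) is a partial correlation: regress the shared factor out of both variables and correlate the residuals:

```python
# Does driver version still track FPS once GPU clock is held fixed?
# Column names (fps, gpu_mhz, driver_version) are hypothetical.
import numpy as np
import pandas as pd

df = pd.read_csv("framerate_scores.csv")  # hypothetical file name

def residuals(y, x):
    # Residuals of a least-squares line fit of y on x.
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

fps_resid = residuals(df["fps"], df["gpu_mhz"])
drv_resid = residuals(df["driver_version"], df["gpu_mhz"])
print(np.corrcoef(fps_resid, drv_resid)[0, 1])
```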

I really need to collect more data; 3600 sample points isn't enough to do a good analysis across 13-14 attributes. I can't wait to add some of the newer cards (the X800 and 6800 series) to my data sets as well. It would be really interesting to see if this changes the correlation of the quality settings (and it would be interesting to see the correlation of the quality settings when singling out specific cards). I'm also working on determining whether our intuition on certain things (like UT2003 botmatch being more CPU dependent than UT2003 flyby) is supported by the correlation tests I'm running.

Nite_Hawk
 
Test boxes at most of the sites tend to have some of the best CPUs around, so I would be surprised if you could draw much CPU information from the sites, especially as their reviews should, if they're any good, be doing all they can to stress the GPU as much as possible and the CPU as little as possible.
 
Quitch said:
Test boxes at most of the sites tend to have some of the best CPUs around, so I would be surprised if you could draw much CPU information from the sites, especially as their reviews should, if they're any good, be doing all they can to stress the GPU as much as possible and the CPU as little as possible.

It's not quite as bad as you might think. Various hardware review sites occasionally do CPU roundups focusing on either different chipsets and CPUs or just different CPUs. They use the same GFX card in multiple systems, so I have at least some information about how different CPUs do on various benchmarks with the same GFX card. The problem with these is that they tend not to be as high quality as, say, a B3D review, and are often missing information about things like what texture filtering is being used. Still, I did manage to get some variation on the CPU. With the two-attribute tests I should be able to see how strong the correlation is between GPUs and CPUs to see how bad it actually is.

Nite_Hawk
 
Which should make sense, because you can multiply the required fillrate and memory bandwidth per frame many times with different resolution/AA/AF settings.
 
Chalnoth said:
Which should make sense, because you can multiply the required fillrate and memory bandwidth per frame many times with different resolution/AA/AF settings.

In some ways I wish I had delved into the GPU a bit and made fillrate an attribute, to see how fillrate affects things outside of the different GPU architectures. I'd probably need even more samples to do this, though. Still, it'd be rather interesting to see how fillrate and video memory throughput affect performance under different circumstances (various CPU configurations, different quality settings, etc.).
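
Just as a back-of-the-envelope illustration of Chalnoth's point (sample counts only; this deliberately ignores overdraw, AF, and compression):

```python
# How much resolution and AA multiply the per-frame fill requirements.
def samples_per_frame(width, height, aa_samples):
    return width * height * aa_samples

low = samples_per_frame(1024, 768, 1)    # 786,432 samples
high = samples_per_frame(1600, 1200, 4)  # 7,680,000 samples
print(high / low)                        # roughly 9.8x more fill work per frame
```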

Nite_Hawk
 
Well, what you might do is take resolution out as a factor by measuring fillrate instead of FPS, and see what that does for your correlations.
 
That doesn't eliminate resolution as a factor. Chips become more efficient as the resolution rises, because average polygon sizes grow and the required texture bandwidth falls as the amount of magnification filtering increases.

This effect will clearly be seen even if you are 100% hardware limited.
 
It doesn't remove its effects, but that won't prevent you from taking it out of the correlations. I suppose you could leave it in, but that would basically just tell you whether or not the particular graphics card in question was CPU-limited at that resolution. I was thinking of taking it out entirely by averaging the scores over multiple resolutions.
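
Something along these lines, say (a sketch only; the file and column names are hypothetical):

```python
# Convert FPS into an effective fillrate, then average each card's scores
# across resolutions so resolution drops out as an explicit factor.
# File and column names (width, height, fps, card) are hypothetical.
import pandas as pd

df = pd.read_csv("framerate_scores.csv")
df["pixels_per_sec"] = df["width"] * df["height"] * df["fps"]
print(df.groupby("card")["pixels_per_sec"].mean())
```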
 