Hi Guys,
Well, since we have a nice clean forum, I figured I'd post a new topic to try to get things started on a good track. Lately I've been working with the data set I collected from various reviews around the web (about 3600 framerate scores from roughly 40 different reviews covering various cards/CPUs/configurations), and I'm working on determining which attributes are most strongly correlated with the framerate.
What I've found so far that's interesting is that you have the best chance of classifying a score based on the resolution the game is run at, the antialiasing level, and the anisotropic filtering level. Other things like GPU speed and CPU speed are often somewhat correlated, but so far the quality settings are better predictors of the score than the hardware being used. This is interesting, because it indicates that, at least for the cards in my data set (Ti4200s up to 9800 Pros), quality settings are still the deciding factor in how fast a game will play.
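For anyone curious, the single-attribute pass boils down to something like this sketch (the column names and numbers here are made up for illustration, not the real data set):

```python
import pandas as pd

# Hypothetical mini version of the review data set -- attribute names
# and values are invented for illustration only.
df = pd.DataFrame({
    "resolution_pixels": [480000, 786432, 1310720, 786432, 1310720, 480000],
    "aa_level":          [0, 2, 4, 0, 4, 2],
    "gpu_mhz":           [250, 275, 325, 380, 412, 250],
    "fps":               [120.0, 85.0, 52.0, 98.0, 60.0, 110.0],
})

# Correlate every attribute with the framerate, then rank the
# attributes by correlation strength (absolute value).
corr = df.corr(numeric_only=True)["fps"].drop("fps")
ranked = corr.abs().sort_values(ascending=False)
print(ranked)
```

With the real data you'd just load the full table instead of the toy frame; the ranking at the end is what tells you whether the quality settings or the hardware attributes track the score better.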
Right now I'm going through and analyzing two-attribute situations, to see if there is a co-dependence between variables that might be skewing the results. For instance, newer videocards are likely to be paired with newer drivers, so is it the drivers or the new videocard that causes the change in score?
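One way to untangle that kind of co-dependence is a partial correlation: regress both the card and the score on the driver version and correlate the residuals. Here's a rough sketch (the numeric encodings and values are invented for illustration):

```python
import numpy as np

# Invented numeric encodings for illustration -- not the real data.
card_gen   = np.array([1, 1, 2, 2, 3, 3, 4, 4], dtype=float)
driver_ver = np.array([40, 41, 43, 44, 45, 45, 52, 53], dtype=float)
fps        = np.array([45, 50, 60, 62, 75, 78, 90, 95], dtype=float)

# 1) Are the two predictors themselves correlated?
r_pred = np.corrcoef(card_gen, driver_ver)[0, 1]

# 2) Partial correlation of card_gen with fps, controlling for
#    driver_ver: regress each on driver_ver, correlate the residuals.
def residuals(y, x):
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

r_partial = np.corrcoef(residuals(card_gen, driver_ver),
                        residuals(fps, driver_ver))[0, 1]
print(r_pred, r_partial)
```

If `r_pred` is high and `r_partial` collapses toward zero, the card's apparent effect on the score was mostly riding on the driver version (or vice versa if you flip which variable you control for).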
I really need to collect more data. 3600 sample points isn't enough to do a good analysis across 13-14 attributes. I can't wait to add some of the newer cards (X800 series and 6800 series) to my data sets as well. It would be really interesting to see whether this changes the correlation of the quality settings (and it would be interesting to see the correlation of the quality settings when singling out specific cards). I'm also working on determining whether our intuition about certain things (like UT2003 botmatch being more CPU dependent than UT2003 Flyby) is supported by the correlation tests I'm running.
Nite_Hawk