Intel slashes prices on Core 2 Quads to counter Phenom II *check*

Yeah, I hope so. I'm not getting a new GPU right now (there's just no card I want ATM), so I hope my GTX can scale some. Maybe I'll OC it.
 
My GTA4 testing with 4GB memory (3GB for X58), 4870X2 (Catalyst 8.12) and Vista Ultimate x86 SP1 was along these lines:
PDC E5200: 9fps
C2D E8600: 22fps
C2Q Q9400: 35fps
C2Q QX9770: 45fps
Core i7-920: 54fps

Numbers are based on an average of 5 runs, using my pre-defined 'Medium' setting in an automated testing tool I built (Textures medium, Render quality high, view distance 30, detail distance 60, shadow density 10). The benchmark itself is the built-in test.
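Roughly, the run-averaging step looks like this (Python, illustrative only - the real tool also launches the built-in benchmark and parses its output, which isn't shown, and the sample numbers below are made up, not from the table above):

```python
# Illustrative sketch of averaging benchmark runs; the helper name
# and these sample fps values are assumptions, not the actual tool.
def average_fps(runs):
    """Mean fps across several benchmark runs."""
    if not runs:
        raise ValueError("need at least one run")
    return sum(runs) / len(runs)

runs = [53, 55, 54, 54, 54]   # fps from 5 runs (made-up values)
print(average_fps(runs))      # -> 54.0
```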
 
What is it with you 939 people? ;)

What is so exciting about the Athlon 64 X2 offerings on Socket AM2 that makes you so jealous? We have Windsor core, which is kinda nice if you really want another couple hundred MHz over what Toledo could do. And we have Brisbane, which is slower per clock by quite a bit and doesn't run that much cooler or clock much higher. Wow.

Phenom was never going to be re-engineered for DDR1 and 939. And Phenom was never really all that exciting anyway!

In retrospect, with RAM prices having gone where they have, I think it's pretty obvious why 939 was left behind. Equipping a 939 mobo with gobs of RAM costs a whole lot more than the $40 for 2x2 gigs of DDR2. DDR2 was the future because Intel had left DDR1 behind. Whenever the sales volume of a RAM tech drops off significantly and the tech is at a dead end, pricing gets ugly. SDR SDRAM was the same way. And EDO for that matter.

I thought 939 had a good run. It went from having only single core CPUs to getting dual cores running at 2.8 GHz or so. I suspect that the 939 angst is from before Core 2 launched, when a few hundred extra MHz on an Athlon 64 X2 was still a bit exciting.
 
How about those poor buggers using Rambus?

I don't believe that was CPU socket-specific; you could have had an S478 rig with Rambus and still had an upgrade path for several years.

Perhaps the better example would be Intel's Socket 423, which lived a very short life at the beginning of the P4 era. That was a truly worthless socket...
 
My GTA4 testing with 4GB memory (3GB for X58), 4870X2 (Catalyst 8.12) and Vista Ultimate x86 SP1 was along these lines:
...
Core i7-920: 54fps

You gotta love those i7's :D

Your settings are much heavier on the CPU than console settings though. At console settings I get about 34fps in the benchmark on an E6600.
 
You gotta love those i7's :D

Your settings are much heavier on the CPU than console settings though. At console settings I get about 34fps in the benchmark on an E6600.

Console settings?

I have to say I defined mine almost arbitrarily - picking middle-ground settings and trying to keep GPU memory usage under 512MB - but medium on my benchmark utility is roughly equal to what I run the game at on my home system (i7-965, 4870 512MB).
 
What is it with you 939 people? ;)

...

I thought 939 had a good run. It went from having only single core CPUs to getting dual cores running at 2.8 GHz or so. I suspect that the 939 angst is from before Core 2 launched, when a few hundred extra MHz on an Athlon 64 X2 was still a bit exciting.

Fastest chip - an FX-60 / Opty 185 - ran at 2.6 GHz and cost approximately a million pounds. Still does on eBay. Being able to drop a 3 GHz part in for a pittance was a hope that was cruelly ripped away from us ...
 
Fastest chip - an FX-60 / Opty 185 - ran at 2.6 GHz and cost approximately a million pounds. Still does on eBay. Being able to drop a 3 GHz part in for a pittance was a hope that was cruelly ripped away from us ...

I remember lusting after that particular CPU...good ol' days!
 
Console settings?

I have to say I defined mine almost arbitrarily - picking middle-ground settings and trying to keep GPU memory usage under 512MB - but medium on my benchmark utility is roughly equal to what I run the game at on my home system (i7-965, 4870 512MB).

Yep:

Texture Quality: Probably Medium
Render Quality: Low
View Distance: 21
Detail Distance: 10
Shadow Density: 0
Vehicle Density: ?? (I guessed 20)

Those are the settings I run at but I have 16xAF set through the control panel.
 
Yep:
...
Those are the settings I run at but I have 16xAF set through the control panel.

That'll explain it to some extent - that's lower in some respects than my Low settings.
Interesting how even with increased graphics settings it seems to scale with the CPU rather than the GPU.
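The CPU-scaling point is easy to see in the fps table from earlier in the thread - a quick sketch (Python, purely illustrative; the core-count labels are my own annotations) comparing each result against the E8600 baseline:

```python
# Rough check of the CPU-scaling observation: if GTA4 were GPU-bound,
# fps would barely move between CPUs at a fixed graphics load. These
# figures are the ones posted earlier in this thread (same 4870X2).
results = {
    "C2D E8600 (2C)": 22,
    "C2Q QX9770 (4C)": 45,
    "Core i7 (4C/8T)": 54,
}

baseline = results["C2D E8600 (2C)"]
for cpu, fps in results.items():
    ratio = fps / baseline
    print(f"{cpu}: {fps} fps ({ratio:.2f}x vs E8600)")
# fps roughly doubles going from 2 to 4 cores, i.e. CPU-limited here
```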
 
Yep:
...
Those are the settings I run at but I have 16xAF set through the control panel.
But render quality defines AF. Try setting it to highest in-game and to application-controlled in the control panel.
But render quality defines AF. Try setting it to highest in game and application control in the control panel.
 
But render quality defines AF. Try setting it to highest in-game and to application-controlled in the control panel.

Yeah, I've only done it like that because I try to standardise my control panel settings as much as possible - i.e. just use the global settings for every game rather than individual game profiles, as those can get lost when you upgrade the drivers.
 