Sweeeeeeet, new CPU to play with!

Nearly all of the new "Penryn"-based chips stop somewhere around the low-4GHz mark (ignoring FSB issues) unless you cryo-cool them. Feeding them more volts pretty much doesn't help; most people are finding the limitation is in the motherboard rather than the processor.

Anand's basic take on it was this: we've almost come to the point where overclocking is limited by the silicon, not the voltage or (reasonable) cooling.
 
Heh, I just provide the Cliffs Notes. Anand's article was far better and more precise ;) but I can't argue with it... I use watercooling as a carryover from my Northwood overclocking days, but I STRONGLY doubt it's doing anything for my overclock now, with the notable exception of keeping temps even lower while staying quiet.

I'm of the mind that either my E8400 or my Q9450 would hit the same clocks on air without any fuss, albeit perhaps with a bit more heat.
 
I have an X3350 (the Xeon version of the Q9450) on the way that's been proven to do 3.6GHz at <1.3V; I'll post results when I get it. I was going to buy an E8400 and shoot for 4.0GHz, but this deal came up and I couldn't resist :D.
 
If it's anything like mine, it'll do it on stock volts and load up to about 48°C. Definitely a nice CPU :)
 
I'm saying you shouldn't pay for more speed than you'll use. An E8200 at stock speeds will have no problem chewing through whatever you throw at it, in my opinion. If you think you need more, then turn it up a little -- these Penryn dual cores seem to love the speed.
What my statement whittles down to is this: if pushing CPU speed has negligible benefit in games, and my system would have multiple programs and potentially a game running at once, then a quad core is more beneficial in my case.

Yeah, I've found that I have a tough time noticing the difference between a Core 2 E6300 @ 1.86GHz and a Core 2 Quad @ 3.6GHz in most games. Maybe all games. Hell, I have a hard time telling the difference between my Athlon 64 X2 and a Core 2 Quad in games. The load is mostly on the video card unless you run 1024x768 or are running a Celeron D. The bulk of games out there will usually reside on 1 or 2 cores; I know of nothing that gets a noticeable improvement from more. SupCom and UT3 sort of hit a 3rd core, but I can't tell if it does anything for gameplay. Every CPU comparison I've seen shows that quads don't bring returns in games.

On the other hand, with stuff like video encoding where I can peg all 4 cores, there is a very obvious difference. The problem is that it's very hard to find apps that do this. I've only managed it with said video encoding (x264, WMV, DivX, or 2x XviD), 4x LAME MP3 encodes (lol), 2x 7-Zip instances... The other thing that comes up here is I/O: you'll max out the hard disks before the CPU, so you have to plan jobs to read from one drive and write to another, and avoid hitting any single drive with more than one process.
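For what it's worth, here's a rough sketch of that "one job per drive" idea in Python. The paths are made up and it assumes the lame CLI is on your PATH, so treat it as an illustration rather than a ready-made script:

```python
import subprocess
import threading

# One queue of .wav files per physical drive (placeholder paths),
# so no drive ever serves two readers at once.
jobs_by_drive = {
    "C": [r"C:\music\a.wav", r"C:\music\b.wav"],
    "D": [r"D:\music\c.wav", r"D:\music\d.wav"],
}

def encode_queue(files):
    # Sequential within one drive so it streams instead of seeking.
    for wav in files:
        mp3 = wav.rsplit(".", 1)[0] + ".mp3"
        subprocess.run(["lame", "-V2", wav, mp3], check=True)

# One worker thread per drive: drives run in parallel, and each LAME
# instance still gets its own core without I/O thrash.
threads = [threading.Thread(target=encode_queue, args=(files,))
           for files in jobs_by_drive.values()]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Sequential within a drive, parallel across drives -- that's the whole trick.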

I doubt that a modeling app like Lightwave or 3DStudio would benefit much from a quad unless you're doing hours of final scene renders. For the modeling step, a Quadro GPU would probably be dramatically more beneficial; I've seen the lowest Quadros outperform SLI'ed 8800GTX cards in these apps. Quadros are only something to look at if you're very serious about modeling, however. This is something I'd see an engineering firm investing in (I have some friends using this hardware).

I have a hard time seeing multi-tasking various mostly-idle applications ever pushing a quad core hard enough to show a noticeable benefit. Maximizing I/O performance will make for a quicker system here, IMO.

Berek: I think you should invest in SSDs. Almost completely getting rid of HDD access times and their horrible random-access throughput will make for one hell of a snappy (if expensive) system. Or at least try to set things up so major apps aren't on the same HDDs.

Considering that quads are approaching $200, the only disadvantage to them is power usage, IMO. Max overclock isn't going to be a big deal in the end. But they do use almost 2x the power, because they are simply two dual cores on the same package.
 
Colin McRae: DiRT can use up to 8 cores, I believe. How well, though, I've no idea ;)

I think Lost Planet can use four cores, and it does pretty well depending on the scene complexity. Can't remember where I saw the benches, but in places where the CPU was truly the bottleneck (they were running at 1024x768 for just that reason) you could see it pegging all four cores.
 

http://tech-report.net/articles.x/14052/6

The reviewer's comments mention the game utilizing all 8 cores. Both tests involve lots of enemy A.I., and the biggest gains showed up in the second benchmark.
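If you want to verify the core usage yourself rather than trusting the benches, something like this little Python monitor (it assumes the third-party psutil package) will print per-core load once a second while the game runs:

```python
import psutil

# percpu=True returns one utilization percentage per logical core;
# interval=1.0 samples over a full second for stable numbers.
# Run in a second window while the game is going; Ctrl+C to stop.
while True:
    loads = psutil.cpu_percent(interval=1.0, percpu=True)
    print(" ".join(f"{pct:5.1f}%" for pct in loads))
```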
 
One area where I've personally seen nice dual-to-quad-core benefits is the work I've been doing with ETQW and the 1.4 patch with the threaded renderer enabled.

The benefits are two-fold. First, the quad core with a single 3870 would average ~90fps and stay above 60fps almost constantly, while a dual core would come in around 70fps average with the occasional batch of slowdown and stuttering. Adding a second 3870 actually DECREASED average frame rates on the dual core due to the increased CPU overhead. This was running 1680x1050, medium detail.

A more subtle difference I managed to pick up on, however, was the loading of the game's megatexture. On respawning or changing spectator view, the game (I believe) has to load in a section of the map's megatexture. On the dual core this would take a couple of seconds; on the quad core it was near instant.
I presume the decompression of the texture is handled by another thread to prevent the game stalling as it loads in; on the dual core that thread is fighting for CPU time, while on the quad the resources are available.
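In rough terms, I'd guess it's something like the background-loader pattern below. Every name here is invented and this obviously isn't ETQW's actual code, just a sketch of the idea:

```python
import queue
import threading
import zlib

tile_requests = queue.Queue()  # tile loads the renderer has asked for
ready_tiles = {}               # decompressed tiles, keyed by tile id
ready_lock = threading.Lock()

def decompress_worker():
    # Runs forever on its own thread so the render loop never blocks.
    while True:
        tile_id, compressed = tile_requests.get()
        pixels = zlib.decompress(compressed)  # stand-in for the real codec
        with ready_lock:
            ready_tiles[tile_id] = pixels

threading.Thread(target=decompress_worker, daemon=True).start()

# On a quad core this worker gets a core largely to itself; on a loaded
# dual core it competes with the game threads, hence the slow loads.
```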

UT3 is very well threaded indeed. Running a 32-bot flythrough, the dual core I was using would average 120fps, while an identically clocked quad core would give about 190-200fps. Again at 1680x1050, medium detail, and again adding CrossFire or SLI would reduce performance on the dual core.
Of course, this time the frame rates are high enough that it isn't massively important, but in this case what is better now will still be what is better in the future.
 
Yeah, I've found that I have a tough time noticing the difference between a Core 2 E6300 @ 1.86GHz and a Core 2 Quad @ 3.6GHz in most games.
I guess you don't play CS:S; that game is CPU-bottlenecked even on my Q6600 @ 3.4GHz.
 

It may be bottlenecked (although I seriously doubt that), but what he is saying is he doesn't notice the difference...

Would you notice a difference from 60fps to 90fps? I don't think so... CPU may be the bottleneck there, but it doesn't matter.

CPUs start to top out around the mid-3GHz mark. Going from 2GHz to 3GHz can net you double-digit percentage increases in performance, while going from 3.5GHz to 4GHz (Penryns here, boys and girls...) will only net you 3 or 4 percentage points at best.
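A toy frame-time model shows the shape of that curve. The numbers below are invented purely for illustration: each frame needs some CPU work that scales with clock, plus some GPU work that doesn't:

```python
def fps(clock_ghz, cpu_ms_at_1ghz=24.0, gpu_ms=18.0):
    # CPU time per frame shrinks with clock speed; GPU time is unaffected.
    frame_ms = cpu_ms_at_1ghz / clock_ghz + gpu_ms
    return 1000.0 / frame_ms

for ghz in (2.0, 3.0, 3.5, 4.0):
    print(f"{ghz:.1f} GHz -> {fps(ghz):5.1f} fps")
# With these made-up numbers: 2 -> 3 GHz gains ~15%; 3.5 -> 4 GHz under 4%.
```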
 
Would you notice a difference from 60fps to 90fps? I don't think so...
I notice a 60 to 90fps difference immediately; it's like night and day. In fact I can see a difference up to 130ish, but my monitor and eyes are very good. I say this because I have two maxfps binds for COD4, one at 125fps for hard jumps and one at 100fps for max packet rate, and I can tell the difference in smoothness right away when I switch between the two. And in CS:S, if you can't maintain at least 100fps at all times, you can't use 100 rates and your hit reg won't be as good as it could be. But I agree, most users with regular OEM mice and LCD screens can't tell more than 60fps.
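Putting frame times on those numbers makes the point clearer (simple arithmetic, nothing game-specific):

```python
# 60 -> 90 fps shaves ~5.6 ms off every frame; 100 -> 125 fps only ~2 ms,
# so the gains get harder to perceive as the numbers climb.
for fps in (60, 90, 100, 125):
    print(f"{fps:3d} fps = {1000 / fps:5.2f} ms per frame")
```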
 
I demand double-blind testing!

Also, I think Geo and I both bit the Yorkfield bullet. That $300 Q9450 was too tempting to pass up (and I now have an enormous machine that I can actually use to benchmark things for some papers I'm working on).
 
In fact I can see a difference up to 130ish, but my monitor and eyes are very good.

What kind of monitor do you have? I don't think any LCD goes above 60Hz, so it must be a CRT?
 
Also, I think Geo and I both bit the Yorkfield bullet. That $300 Q9450 was too tempting to pass up.

It is a great deal, and it's been a great processor for me at least. I wasn't immediately sure about downclocking from my E8400 to my Q9450, but after a week of actually playing with it (versus just doing overclocking stability tests and benches), I continue to like the quad more and more.
 