Most CPU intensive games

Yeah, FSX is a good one. I would love to see some CPU scaling benchmarks for that game using Nehalem as well as C2Q/C2D.
 
FSX does use more than 2 cores; it doesn't like SLI either.
It's a shame Tom's uses FSX in the VGA charts but not the CPU charts.
 
With SP1, multi-threaded support for scenery loading and terrain texture synthesis during flight was added, which works across up to the 32 cores addressable by SetThreadAffinityMask.
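
For the curious, here's a minimal Win32 sketch (my illustration, not FSX's actual code) of what SetThreadAffinityMask does. The mask is a DWORD_PTR, which is 32 bits on 32-bit Windows, and that's where the 32-core ceiling comes from:

Code:
/* Minimal sketch, not FSX's actual code: pin the calling thread to
   logical core 2. Each bit n of the mask allows the thread to run on
   core n; a 32-bit DWORD_PTR gives the 32-core limit noted above. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DWORD_PTR mask = (DWORD_PTR)1 << 2;  /* core 2 only */
    DWORD_PTR prev = SetThreadAffinityMask(GetCurrentThread(), mask);
    if (prev == 0)
        printf("SetThreadAffinityMask failed: %lu\n", GetLastError());
    else
        printf("previous affinity mask: 0x%llx\n", (unsigned long long)prev);
    return 0;
}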

SLI is done through driver magic; read the NV doc on it, which has advice on what not to do so you don't stuff up the magic the driver does.

Yes, I am surprised FSX hasn't been used in more CPU tests.
 
Has anyone tried the Intel quad-core benchmark (the fugly one with the hovercraft) on an i7?

PS: If anyone is thinking "Phil Taylor... I've heard that name before", well, you're right: the above poster is indeed "Philthy" Phil Taylor of Motorhead fame :D
 
Dungeon Siege. Went from a Radeon 9800 Pro to an X850 XT PE and the fps improvement was zero. DX7 game, though. :p
 
This is kinda relative, but stick with me.

Counter-Strike: Source and Team Fortress 2. They aren't the hardest hitters, but in these two games it is absolutely critical to maintain at least 100 and 66 frames per second, respectively, for connection-quality reasons. Their respective versions of the Source engine aren't multi-threaded (maybe TF2 is a bit, but it crashes when you force it fully) and it takes a lot of GHz to manage that.
 
I vsync-lock at 60fps in TF2 and never have any noticeable connection-quality issues. But when I do get the occasional slowdown to ~30fps, reducing AA and AF and overclocking the GPU have no impact on performance, so the game is CPU-limited on my 3.2GHz E6750 + GTX 260 at 1440x900 with _all_ settings maxed.
 
Servers tick at 66 updates/second, and unless you have at least 66fps at all times you can't really run "cl_cmdrate 66". Many servers don't care that you are vsynced at 60; they force 66 rates on you, so you basically play with 6 choke all the time. It's even worse in CS:S, where servers tick at 100 and you need 100fps for cl_cmdrate 100. People with 100fps and 100 cmdrate/updaterate will have significantly better hit reg. My E8400 runs at 4.2GHz and I still hit times in both CS:S and TF2 when my framerate dips below 100 and 66, respectively. r_3dsky 0 helps a lot, but Valve really needs to update both games to the L4D engine, where multi-threading finally works.
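
For reference, a minimal autoexec.cfg sketch of the settings described above; the exact rate value is my illustration, not something from the post:

Code:
// sketch for a 100-tick CS:S server (use 66 on stock TF2 servers)
cl_cmdrate 100     // commands sent to the server per second -- capped by your actual fps
cl_updaterate 100  // world updates requested from the server per second
rate 30000         // bandwidth cap in bytes/sec, raised so the rates above aren't throttled
r_3dsky 0          // skip 3D skybox rendering to help hold the framerate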

BTW, vsync is a bad idea in Source. If you are synced at 60Hz and your framerate dips below that, without triple buffering it gets halved to 30, and if you had set your cmdrate to 60 to take advantage of running at 60fps, you just got 30 choke. And it always happens when it hurts you the most: in firefights. Not to mention that if you cap yourself to 60fps with vsync you can't use 100 rates in CS:S, and most of the other players will have 40 more updates per second to and from the server than you. You can force triple buffering with D3DOverrider, but the input lag is not worth it imo. I play at a high enough refresh rate not to see the tearing anymore, but with a 60Hz LCD you have to pick your poison; imo it's much better to turn vsync off so you can use the best rates and just try to ignore the tearing.
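
If you want to watch the choke described above happen live, the engine's built-in net_graph overlay shows it:

Code:
net_graph 3   // on-screen overlay with fps, ping, and choke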
 
Interesting. I use vsync and triple buffering in TF2 'cause the lag doesn't really matter when I'm demo spamming :D. In CS:S I choose to leave vsync off and suffer the tearing.
 
Morrowind - what's the deal with that game engine? PC HW review sites don't use it as a reference game anymore, and I've always wondered how it scales, if at all, on late-model HW, and where the hang-up is/was: CPU or GPU. My suspicion is that it's more CPU- than GPU-bound, but ya never know. As a previous poster noted, it's probably just another poorly coded game with flags not dropping, resulting in ever-accumulating memory leaks.
 
I picked up Morrowind a few months back. Performance was fine, but I got wicked mouse lag when vsync was enabled, as I do in Oblivion and Fallout 3. Something's screwy with that engine, there is.
 