If a 5GHz 2500K can't keep 60fps, I submit to you that poor coding is at fault.
Eh? What is happening onscreen in those games then? I bet you're still GPU-limited (especially with just one GPU)... A 5GHz 2500K can't keep minimum fps above 60fps in every game.
Bullshit.
If I turn off GPU-only features like tessellation, 60fps is no issue. The GPUs or drivers are the bottleneck.
Eh? What is happening onscreen in those games then? I bet you you're still GPU-limited (especially with just 1 GPU)...
Slotting in a second graphics board doesn't increase CPU load to any discernible degree, so running with just one won't offer BETTER performance; it just gets worse all round.
To properly test whether you really are CPU-limited, you should knock the screen resolution down as low as it will go and turn off antialiasing and any advanced shaders, unless the shader level is baked into some other detail setting that also affects CPU load.
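The logic behind that test can be sketched in a few lines. This is just a toy illustration of the reasoning, with made-up frame rates and an arbitrary 10% tolerance, not a real benchmarking tool: if dropping the resolution barely changes the frame rate, the GPU was never the limiter, because the CPU was already feeding frames as fast as it could.

```python
# Toy sketch of the resolution-drop test (hypothetical numbers throughout).
# Run the game at native res, then again at minimum res with AA and fancy
# shaders off, and compare the frame rates.

def classify_bottleneck(fps_native, fps_low_res, tolerance=0.10):
    """If the low-res fps is within `tolerance` of the native-res fps,
    the frame rate didn't scale with GPU load, so call it CPU-bound."""
    if fps_low_res <= fps_native * (1 + tolerance):
        return "CPU-bound"
    return "GPU-bound"

# Example: 2560x1600 with 4x AA vs 640x480 with everything off.
print(classify_bottleneck(fps_native=45, fps_low_res=48))   # barely moved
print(classify_bottleneck(fps_native=45, fps_low_res=110))  # scaled with res
```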
Correct me if I'm wrong, guys, but the GTX 285 cards, besides being frighteningly old, aren't DX11 parts, right? I think DX10 is as far as they go. Also, I keep reading rumours on the internet about some new cards coming out over Christmas. It probably makes sense to hold out for those instead of splashing out cash on a pair of 580s, right?
Also, yeah, I got the 990X for 545 bucks with taxes and whatnot. I read the reviews, and compared with Sandy Bridge there's no noticeable difference IMHO. Can't wait to get that sucker put into my rig.
On average yes but your minimum frame rate will still dip stupidly low. Can you maintain a minimum framerate of 60fps in the original Crysis benchmark tool running the 'Harbor' benchmark?
The 990X maxes out at around 4.4GHz with a good water-cooling loop; Sandy Bridge can hit 5GHz, with the average being around 4.7-4.8GHz. You won't see the difference with a single graphics card, but run two or more of them and you'll see Sandy Bridge's extra power blow everything else away.
What exactly do you think would cause a game like this to be CPU-bound? Certainly not AI...so then what? Physics could do it if you had that going on, but in many instances this simply isn't the case.
I submit to you that if Harbor tanks the fps it's the GPU that's the bottleneck.
You are probably right but I think this review says otherwise:
http://www.tomshardware.com/reviews/core-i7-990x-extreme-edition-gulftown,2874-13.html
Regardless, I'm not going to overclock any higher than 4GHz. I was quite happy with my i7 930, tbh, but when I found out I could get a stonking deal on this 990X I just had to jump on it.
As for my video card upgrade, it's not so much that I'm worried about being obsolete; it's just that I want to keep the wife happy, hence Q4 is when I've decided to upgrade my video cards, hehe. I'm very happy with Nvidia cards: from the 8800 Ultras in SLI to the 285s in SLI, they've been pretty much destroying everything I can throw at them at 2560x1600 with everything else maxed out... except for AA. I usually do 4x AA, and even that is a bit overkill. So my next cards will most likely be Nvidia. I just know my way around them, and it seems more games just work better with Nvidia out of the box. It could be due to Nvidia having deep pockets and whatnot, but yeah, my next cards will be Nvidia for sure... unless someone persuades me to think otherwise.
AI, physics, explosions, particles... there are loads of different things. Try tossing a grenade into a firefight and watch the GPU usage drop.
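A toy sketch of why that grenade hurts the CPU rather than the GPU: before anything can be drawn, the engine has to integrate every live particle on the CPU each frame, so a sudden burst of spawns multiplies the per-frame CPU work. This is deliberately simplified pseudo-engine code with made-up numbers, not how any real engine is structured:

```python
# Naive single-threaded particle update: everything here is CPU work
# that must finish every frame before the GPU gets any draw commands.

def update_particles(particles, dt=1.0 / 60.0):
    for p in particles:
        p["vy"] -= 9.8 * dt        # gravity
        p["x"] += p["vx"] * dt     # integrate position
        p["y"] += p["vy"] * dt

# A quiet scene with 10,000 ambient particles...
particles = [{"x": 0.0, "y": 0.0, "vx": 1.0, "vy": 5.0} for _ in range(10_000)]

# ...then a "grenade" spawns 20,000 more in one frame, tripling the
# CPU-side update cost while the GPU sits waiting.
particles += [{"x": 0.0, "y": 2.0, "vx": 0.5, "vy": 8.0} for _ in range(20_000)]

update_particles(particles)
print(len(particles), "particles updated on the CPU this frame")
```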
A 5GHz Sandy Bridge CPU can't even manage a 60fps minimum on the Contact fly-through benchmark with a pair of 850MHz core-clocked 5850s, so it'll tank like hell on Harbor, as that's a lot more stressful on the CPU side of things.
I always thought the high CPU load of Crysis was due to the incredible number of draw calls the game issues at times (something I suspect could be alleviated with DX11, even on DX10 hardware).
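The draw-call point can be illustrated with some back-of-the-envelope arithmetic. The overhead figures below are assumptions for illustration only (real per-call CPU cost varies wildly by API and driver), but they show why issuing one instanced call for many identical objects is so much cheaper on the CPU than one call per object:

```python
# Hypothetical costs: each draw call carries a fixed CPU-side driver
# overhead, while each extra instance in an instanced call is nearly free.
PER_CALL_OVERHEAD_US = 30.0   # assumed fixed CPU cost per draw call, in µs
PER_INSTANCE_COST_US = 0.5    # assumed marginal CPU cost per instance, in µs

def naive_draw_cost_us(n_objects):
    """One draw call per object: CPU cost scales with object count."""
    return n_objects * PER_CALL_OVERHEAD_US

def instanced_draw_cost_us(n_objects):
    """One instanced call for all objects: one fixed overhead, cheap extras."""
    return PER_CALL_OVERHEAD_US + n_objects * PER_INSTANCE_COST_US

# 4,000 objects in a dense scene (an illustrative number, not measured):
print("naive:     %.1f ms of CPU time" % (naive_draw_cost_us(4000) / 1000))
print("instanced: %.1f ms of CPU time" % (instanced_draw_cost_us(4000) / 1000))
```

Under these assumed numbers the naive path burns over a hundred milliseconds of CPU time per frame while the instanced path takes a couple, which is the kind of gap that makes a game CPU-bound regardless of how fast the GPU is.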
Does it ever drop in the actual game?
Aye, and with the decline in PC gaming's status and performance (market-wise, I mean), there's not much pressure to keep new generations rolling out either. Gone are the days of six-month product cycles... and maybe for the better, too. It was a pretty brutal system, with hardware getting obsoleted what felt like almost overnight! I wouldn't count on new high-end cards by Christmas, though. Just a hunch, but these things seem more and more delayed every year.
Yes, I'm sure you're right. The problem is, two GTX 580s can happily and quite easily burn 600W all on their own. A new generation with similar performance but a less brutal power envelope would have been much appreciated by many. I've been dreading summer for the last three years, ever since buying my first really high-spec PC back in 2007, I think it was. Of course, it was only with the launch of the 8800 GTX that power consumption really crashed through the roof, so this is a relatively new phenomenon after all... The GTX 580 is blindingly fast; two of them should destroy any game for the next two years at least.