Crysis 2 PC edition OT

If a 5GHz 2500K can't keep 60fps, I submit to you that poor coding is at fault.
 
If a 5GHz 2500K can't keep 60fps, I submit to you that poor coding is at fault.

A 5GHz 2500K can't keep minimum fps above 60fps in every game; there are quite a few where it drops very low.

Coding can't be blamed for everything when it comes to performance.
 
Bullshit.
If I turn off GPU-only functions like tessellation, etc., 60fps is no issue. GPUs or drivers are the bottleneck.
 
A 5GHz 2500K can't keep minimum fps above 60fps in every game
Eh? What is happening onscreen in those games then? I bet you're still GPU-limited (especially with just 1 GPU)...

Slotting in a second graphics board doesn't increase CPU load to any discernible degree, so running with just one won't offer BETTER performance, it just gets worse all-round.

To properly test whether you really are CPU-limited, you should knock the screen resolution down as low as it will go and turn off antialiasing and any advanced shaders, unless the shader level is baked into some other detail setting that also affects CPU load.
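
For what it's worth, you can set all of that from the console or an autoexec.cfg in one go. A sketch of what I'd use, based on the CryEngine 2 CVar names as I remember them from the original Crysis; treat the exact names (r_FSAA especially) as unverified and check them in your own build's console:

```
r_Width = 800       -- minimal resolution, so the GPU has almost nothing to do
r_Height = 600
r_FSAA = 0          -- antialiasing off
sys_spec = 1        -- force Low spec so shader cost is negligible
r_DisplayInfo = 1   -- fps/frametime overlay to read the results from
```

If the framerate barely moves between this and your normal settings, the CPU was the wall all along.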
 
Bullshit.
If I turn off GPU-only functions like tessellation, etc., 60fps is no issue. GPUs or drivers are the bottleneck.

On average, yes, but your minimum frame rate will still dip stupidly low. Can you maintain a minimum framerate of 60fps in the original Crysis benchmark tool running the 'Harbor' benchmark?

Eh? What is happening onscreen in those games then? I bet you're still GPU-limited (especially with just 1 GPU)...

Slotting in a second graphics board doesn't increase CPU load to any discernible degree, so running with just one won't offer BETTER performance, it just gets worse all-round.

To properly test whether you really are CPU-limited, you should knock the screen resolution down as low as it will go and turn off antialiasing and any advanced shaders, unless the shader level is baked into some other detail setting that also affects CPU load.

And slotting in a second GPU increases the driver overhead, which requires more CPU grunt.

And to test properly whether you're CPU-limited, you just look at the GPU usage :rolleyes:

If your GPUs are not at 99% load with little to no on-screen action, then you're CPU-bottlenecked; and if both GPU usage and frame rate drop with heavy on-screen action, then you're CPU-limited.
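
If you log a run instead of eyeballing the overlay, the same rule is easy to apply afterwards. A minimal Python sketch, assuming a CSV exported from your monitoring tool of choice with fps and gpu_usage columns (the file name and column names here are placeholders, not any tool's actual format):

```python
import csv

# Heuristic from above: a GPU pegged near 100% means the GPU is the
# bottleneck; anything well below that means the CPU can't feed it
# fast enough (the GPU is starved of rendering work).
def classify(row, gpu_bound_threshold=97.0):
    return "gpu-bound" if float(row["gpu_usage"]) >= gpu_bound_threshold else "cpu-bound"

with open("benchmark_log.csv", newline="") as f:
    samples = list(csv.DictReader(f))

cpu_bound = sum(1 for s in samples if classify(s) == "cpu-bound")
min_fps = min(float(s["fps"]) for s in samples)
print(f"{cpu_bound}/{len(samples)} samples CPU-bound, minimum {min_fps:.1f}fps")
```

If the CPU-bound samples line up with the big framerate dips, no GPU upgrade is going to fix them.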

There's no CPU available that can do a 60fps minimum frame rate in every game, and that is a true FACT. And it's pointless blaming drivers or coding; coding and drivers can't be blamed for everything.
 
So, does anyone else think the DX11 upgrades have put the game on the same graphical level as the Samaritan demo from Epic, if not above it? Maybe the character model isn't as detailed as Samaritan's? I'm just fascinated by the fact that it only takes one GTX 580 to run Crysis 2 on DX11, compared to the tri-SLI GTX 580 setup for Epic's demo.
 
I certainly don't. Crysis 2 looks quite meh to me even after these latest patches. Not far into it yet though. Samaritan footage looks very nice.
 
Got my new PC (570 + i5 2500). Really satisfied with the performance at 1080p + DX11 on Ultra. Mostly 30-40fps, which is a big improvement on the 20-30fps at 720p DX9 High I got on my laptop. :D
 
Correct me if I am wrong, guys, but the GTX 285 cards, besides being frighteningly old, are not DX11 parts, right? I think DX10 is as far as they go? Also, I keep reading on the internet that there are rumors about some new cards coming out over Christmas? It probably makes sense to hold out for those instead of splashing out cash for a pair of 580s?

Also, yeah, I got the 990X for 545 bucks with taxes and whatnot. Read the reviews, and compared with Sandy Bridge there is no noticeable difference IMHO. Can't wait to get that sucker put into my rig.
 
The GTX 285 is DX10.0 (as opposed to 10.1).

I wouldn't count on new high-end cards by Christmas. Just a hunch, but these things seem more and more delayed every year. The GTX 580 is blindingly fast; two of them should destroy any game within the next two years at least (until the next consoles come out).

But if you're really worried about being obsolete by year's end, two 6950s modded to 6970s should be nearly as fast and about half the price.
 
Correct me if I am wrong, guys, but the GTX 285 cards, besides being frighteningly old, are not DX11 parts, right? I think DX10 is as far as they go? Also, I keep reading on the internet that there are rumors about some new cards coming out over Christmas? It probably makes sense to hold out for those instead of splashing out cash for a pair of 580s?

Also, yeah, I got the 990X for 545 bucks with taxes and whatnot. Read the reviews, and compared with Sandy Bridge there is no noticeable difference IMHO. Can't wait to get that sucker put into my rig.

The 990X maxes out at 4.4GHz with a good water-cooling loop; Sandy Bridge can hit 5GHz, with the average being around 4.7-4.8GHz. You won't see the difference with a single graphics card, but run two or more of them and you'll see Sandy Bridge's extra power blow everything else away.
 
On average, yes, but your minimum frame rate will still dip stupidly low. Can you maintain a minimum framerate of 60fps in the original Crysis benchmark tool running the 'Harbor' benchmark?

What exactly do you think would cause a game like this to be CPU-bound? Certainly not AI...so then what? Physics could do it if you had that going on, but in many instances this simply isn't the case.

I submit to you that if Harbor tanks the fps it's the GPU that's the bottleneck.
 
The 990X maxes out at 4.4GHz with a good water-cooling loop; Sandy Bridge can hit 5GHz, with the average being around 4.7-4.8GHz. You won't see the difference with a single graphics card, but run two or more of them and you'll see Sandy Bridge's extra power blow everything else away.

You are probably right, but I think this review says otherwise:
http://www.tomshardware.com/reviews/core-i7-990x-extreme-edition-gulftown,2874-13.html

Regardless, I am not going to overclock any higher than 4GHz. I was quite happy with my i7 930, tbh, but when I found out I could get a stonking deal on this 990X I just had to jump on it.

As for my video card upgrade, it is not so much that I am worried about being obsolete; it's just that I want to keep the wife happy, hence Q4 is when I have decided to upgrade my video cards, hehe. I am very happy with Nvidia cards: from the 8800 Ultras in SLI to the 285s in SLI, they have been pretty much destroying everything I can throw at them at 2560x1600 with everything else maxed out... except for AA. I usually do 4x AA, and even then that is a bit overkill. So my next cards will most likely be Nvidia. I just know my way around them, and it seems more games just work better with Nvidia out of the box. It could be due to Nvidia having deep pockets and whatnot, but yeah, my next cards will be Nvidia for sure... unless I get persuaded by someone to think otherwise :D
 
What exactly do you think would cause a game like this to be CPU-bound? Certainly not AI...so then what? Physics could do it if you had that going on, but in many instances this simply isn't the case.

I submit to you that if Harbor tanks the fps it's the GPU that's the bottleneck.

AI, physics, explosions, particles... there are loads of different things. Try tossing a grenade into a firefight and watch the GPU usage drop.

Harbor is CPU-limited, as GPU usage drops along with the frame rate, meaning the GPU(s) are starved of rendering information.

A 5GHz Sandy Bridge CPU can't even manage a 60fps minimum on the 'Contact' fly-through benchmark with a pair of 850MHz core-clocked 5850s, so it'll tank like hell on Harbor, which is a lot more stressful on the CPU side of things.
 
You are probably right, but I think this review says otherwise:
http://www.tomshardware.com/reviews/core-i7-990x-extreme-edition-gulftown,2874-13.html

Regardless, I am not going to overclock any higher than 4GHz. I was quite happy with my i7 930, tbh, but when I found out I could get a stonking deal on this 990X I just had to jump on it.

As for my video card upgrade, it is not so much that I am worried about being obsolete; it's just that I want to keep the wife happy, hence Q4 is when I have decided to upgrade my video cards, hehe. I am very happy with Nvidia cards: from the 8800 Ultras in SLI to the 285s in SLI, they have been pretty much destroying everything I can throw at them at 2560x1600 with everything else maxed out... except for AA. I usually do 4x AA, and even then that is a bit overkill. So my next cards will most likely be Nvidia. I just know my way around them, and it seems more games just work better with Nvidia out of the box. It could be due to Nvidia having deep pockets and whatnot, but yeah, my next cards will be Nvidia for sure... unless I get persuaded by someone to think otherwise :D

I've just moved back to Nvidia. EVGA's GTX 570s with 2.5GB of VRAM are perfect: loads of memory to match the shader power, but without the price tag the 3GB-equipped GTX 580s have.
 
AI, physics, explosions, particles... there are loads of different things. Try tossing a grenade into a firefight and watch the GPU usage drop.

I always thought the high CPU load of Crysis was due to the incredible number of draw calls the game issues at times (something I suspect could be alleviated with DX11, even on DX10 hardware).
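
As a toy model of why that matters: per-draw submission overhead lands on the CPU, while resolution and shading land on the GPU, so past a certain draw-call count the CPU alone caps the framerate. A quick Python sketch with entirely made-up numbers, just to show the shape of it:

```python
# Hypothetical costs, purely illustrative -- not measured from Crysis.
DRAW_CALL_COST_MS = 0.01   # CPU-side submission cost per draw call
GPU_FRAME_MS = 10.0        # GPU render time per frame at a given resolution

def frame_time_ms(draw_calls):
    cpu_ms = draw_calls * DRAW_CALL_COST_MS
    # CPU and GPU work overlap; whichever side finishes last sets the pace.
    return max(cpu_ms, GPU_FRAME_MS)

for calls in (500, 2000, 5000):
    print(f"{calls} draws -> {1000 / frame_time_ms(calls):.0f} fps")
```

Once the CPU term dominates, lowering GPU settings changes nothing, which is exactly the "GPU usage drops along with the framerate" symptom being described; DX11-style batching and instancing attack the draw-call count itself.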

A 5GHz Sandy Bridge CPU can't even manage a 60fps minimum on the 'Contact' fly-through benchmark with a pair of 850MHz core-clocked 5850s, so it'll tank like hell on Harbor, which is a lot more stressful on the CPU side of things.

Does it ever drop in the actual game?

Suryad, if you want to stick with NVIDIA, the twin 570s are a very good option. But I don't know if you'll need the 2.5GB versions, even at 1600p.
 
I always thought the high CPU load of Crysis was due to the incredible number of draw calls the game issues at times (something I suspect could be alleviated with DX11, even on DX10 hardware).

Does it ever drop in the actual game?

CryEngine does some funky stuff with regard to CPU performance. If you go into the game directory, there's actually a .cmd file to start a second CPU benchmark, set in the later snow levels, that's pure explosions and physics. It absolutely destroys CPUs... dropping the GPU settings didn't help my Phenom II at all.

And of course it will drop in the actual game. If you can only just manage 60fps as a minimum in a fly-through with no AI, particles, explosions, or physics running on the CPU, do you really think you can still do that in the actual game with all of the above placing strain on the CPU?
 
I wouldn't count on new high-end cards by Christmas. Just a hunch, but these things seem more and more delayed every year.
Aye, and with the decline in PC gaming status and performance (market-wise, I mean), there's not much pressure to keep new generations rolling out either. Gone are the days of six-month product cycles...and maybe for the better too. It was a pretty brutal system, with hardware getting obsoleted what felt like almost overnight!

The GTX 580 is blindingly fast; two of them should destroy any game within the next two years at least
Yes, I'm sure you're right; the problem is, two GTX 580s can happily and quite easily burn 600W all on their own. A new generation with similar performance but a less brutal power envelope would have been much appreciated by many. I've been dreading summer for the last three years, ever since buying my first really high-spec PC back in 2007, I think it was. Of course, it was only with the launch of the 8800 GTX that power consumption really crashed through the roof, so this is a relatively new phenomenon after all...
 