[PC] Grand Theft Auto IV

Trying again. :)

here are my benchmarks

run 1 51.65
run 2 51.63
run 3 51.76
run 4 51.31
run 5 51.77
run 6 51.45
run 7 50.87
run 8 51.45
run 9 51.70
run 10 51.84
run 11 51.53
run 12 51.28
run 13 46.84
run 14 51.50

Q9450@2.6 GHz, HD4850 Crossfire, CC 9.3, Vista64

Don't know what happened on run 13, but I don't seem to be losing performance.
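For anyone who wants to sanity-check numbers like these, here is a minimal C++ sketch (the values are simply copied from the runs above) that computes the mean and standard deviation and flags any run that sits well outside the rest:

Code:
// Mean/stddev over the 14 runs quoted above; flags any run more than
// two standard deviations from the mean (a crude outlier test).
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

int main() {
    std::vector<double> fps = {51.65, 51.63, 51.76, 51.31, 51.77, 51.45, 50.87,
                               51.45, 51.70, 51.84, 51.53, 51.28, 46.84, 51.50};
    double sum = 0.0;
    for (double f : fps) sum += f;
    const double mean = sum / fps.size();

    double var = 0.0;
    for (double f : fps) var += (f - mean) * (f - mean);
    const double sd = std::sqrt(var / fps.size());

    std::printf("mean %.2f fps, stddev %.2f fps\n", mean, sd);
    for (std::size_t i = 0; i < fps.size(); ++i)
        if (std::fabs(fps[i] - mean) > 2.0 * sd)
            std::printf("run %zu looks like an outlier: %.2f fps\n", i + 1, fps[i]);
}

With these figures it reports roughly 51.2 fps mean with run 13 as the only outlier, which matches the eyeball impression that nothing else is drifting.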
 
I don't get this part....etc

I wouldn't say linearly, but I didn't have much data to rely on to make more detailed comments; they were basically based on this article.

But let's keep in mind GTA4 was built with multi-core consoles as the main target, so even with a quickly patched PC port we would expect some distinct scaling on CPUs with 2 extra cores (compared to the mainstream C2D), totaling 4.8 GHz (2.4 GHz x 2 each), not just an extra 7 fps; that is a small increase!
Something the C2D E8600 could match with an extra 600 MHz, while the quad cores have far more headroom in total frequency even counting just one extra core.

And the i7 920, even at a lower clock frequency of 2.66 GHz compared to the C2Q Extreme @ 3.6 GHz (yes, I know there's more to it than just clock frequency, but it still gives some indication of where the performance boost is mainly coming from), leaves it far behind, with minimum fps around/above 50 when the GPU isn't the bottleneck, while the C2Q Extreme is stuck at 31 fps.

Hence why I thought the main factor behind such small performance increases is more related to inter-core bandwidth than anything else.
 
here are my benchmarks

run 1 51.65
run 2 51.63
run 3 51.76
run 4 51.31
run 5 51.77
run 6 51.45
run 7 50.87
run 8 51.45
run 9 51.70
run 10 51.84
run 11 51.53
run 12 51.28
run 13 46.84
run 14 51.50

Q9450@2.6 GHz, HD4850 Crossfire, CC 9.3, Vista64

Don't know what happened on run 13, but I don't seem to be losing performance.

Thanks, interesting.

My numbers are
run 1 49.50
run 2 48.17
run 3 47.05
run 4 44.71
run 5 43.97
run 6 43.68
run 7 43.39
run 8 43.60
run 9 43.45
run 10 41.99
run 11 40.99
run 12 40.02
run 13 39.55

As you can see, it goes down with each run, and by the last run the game is very stuttery and bad. It also behaves as if -noprecache is enabled, because objects and textures disappear as I turn away and reappear as I turn towards them again.
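Just to put a number on the slowdown, here's a quick sketch using the figures from this post:

Code:
// Total and per-run slowdown across the 13 runs listed above.
#include <cstdio>
#include <vector>

int main() {
    std::vector<double> fps = {49.50, 48.17, 47.05, 44.71, 43.97, 43.68, 43.39,
                               43.60, 43.45, 41.99, 40.99, 40.02, 39.55};
    const double total_drop = (fps.front() - fps.back()) / fps.front() * 100.0;
    std::printf("run 1 -> run %zu: -%.1f%% (about %.1f%% lost per run)\n",
                fps.size(), total_drop, total_drop / (fps.size() - 1));
}

That works out to about a 20% drop from the first to the thirteenth run, roughly 1.5-2% lost per run.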
 
I wouldn't say linearly, but I didn't have much data to rely on to make more detailed comments; they were basically based on this article.

But let's keep in mind GTA4 was built with multi-core consoles as the main target, so even with a quickly patched PC port we would expect some distinct scaling on CPUs with 2 extra cores (compared to the mainstream C2D), totaling 4.8 GHz (2.4 GHz x 2 each), not just an extra 7 fps; that is a small increase!
Something the C2D E8600 could match with an extra 600 MHz, while the quad cores have far more headroom in total frequency even counting just one extra core.

And the i7 920, even at a lower clock frequency of 2.66 GHz compared to the C2Q Extreme @ 3.6 GHz (yes, I know there's more to it than just clock frequency, but it still gives some indication of where the performance boost is mainly coming from), leaves it far behind, with minimum fps around/above 50 when the GPU isn't the bottleneck, while the C2Q Extreme is stuck at 31 fps.

Hence why I thought the main factor behind such small performance increases is more related to inter-core bandwidth than anything else.

You can't compare clock rates across different architectures, though. Looking at the table, a pure apples-to-apples comparison between Conroe with 2 cores and Conroe with 4 cores equates to a 50% performance increase or more. That's pretty good.
 
Maybe it's an NV driver issue then? Something to do with how it clears out the video memory? *total wild guess of course*
Yes, this is what I think too. I've been trying to convince Rockstar support that I have this issue, and at first they were going through the motions, telling me to run without any background programs and to set max pre-rendered frames to 1.

Last I heard from them, they asked me to reply if the 1.0.3.0 patch didn't resolve my issue. It didn't, and I told them that; I even let them have the benchmarks.

Hopefully they will escalate it and tell me to log something, or maybe they can reproduce it. It is annoying because the performance and visual quality degrade to the point where it's a stuttering, reloading mess, and I love the game and want to be able to finish it.
 
You can't compare clock rates across different architectures, though. Looking at the table, a pure apples-to-apples comparison between Conroe with 2 cores and Conroe with 4 cores equates to a 50% performance increase or more. That's pretty good.

My main focus was the small increase in performance from 2 to 4 cores.
I mentioned the i7 920 just to point out that inter-core bandwidth seems likely to be the bottleneck when comparing C2D with C2Q (something the i7s have a nice increase in, at 40~50%+ compared to the predecessor architecture).
 
My main focus was the small increase in performance from 2 to 4 cores.
I mentioned the i7 920 just to point out that inter-core bandwidth seems likely to be the bottleneck when comparing C2D with C2Q (something the i7s have a nice increase in, at 40~50%+ compared to the predecessor architecture).

The performance increase is large, dual-core vs. quad-core. Don't know what else to say. Of course better inter-core bandwidth, cache management, architecture, triple DDR3 channels etc. all factor in.

http://www.pcgameshardware.com/aid,...ark-review-with-13-processors/Reviews/?page=2

*Max detail + 50% view distance*, average/min fps

C2D E6600 2.4GHz, 19/14fps

C2Q Q6600 2.4GHz, 28.8/23fps


You also seem to have missed this, which I posted above.

EDIT: I benched and checked recorded results in SiSoft Sandra 2009 Professional regarding interconnect bandwidth and latency.

My CPU E8400 - 10.7GB/sec, 37ns


Ext QX6950 ----- 20.3GB/sec, 23ns (Core 2 extreme 3.3GHz QX6950 1333MHz FSB)

QX6950 ---------- 18GB/sec, 29ns (1333MHz FSB)

Q6600 ----------- 14.9GB/sec, 35ns


Phenom 9750 -- 3.4GB/sec, 137ns

Phenom 8750 -- 2.9GB/sec, 159ns


Core i7 920 ----- 33.8GB/sec, 17ns


Athlon FX60 ----- 2.9GB/sec, 95ns
 
The performance increase is large, dual-core vs. quad-core. Don't know what else to say. Of course better inter-core bandwidth, cache management, architecture, triple DDR3 channels etc. all factor in.

http://www.pcgameshardware.com/aid,...ark-review-with-13-processors/Reviews/?page=2

*Max detail + 50% view distance*, average/min fps

C2D E6600 2.4GHz, 19/14fps

C2Q Q6600 2.4GHz, 28.8/23fps


You also seem to have missed this, which I posted above.

I saw it, and it was the reason I mentioned that a dual core like the E6600 only needed a 600 MHz OC to match another CPU with 2 extra cores @ 2.4 GHz each, which is what leads me to think the bottleneck is inter-core bandwidth, especially since the C2D and C2Q have to clone each other's cache.
Looking at the Phenom II in your tests, the shared L3 shows it doesn't need as much inter-core bandwidth to achieve similar results; L3 cache management efficiency and latency is the bottleneck in the Phenom II's case, hence the similar results.

I think we've made our remarks more than clear already, and I'll leave them at this point since there's no need to keep rehashing statements.
 
Nebula, I wouldn't put much stock in Sandra's quoted inter-core numbers for the Core 2 Quads. The latency and bandwidth figures seem to suggest it's only testing cacheline sharing between processors on the same die (which happens via the shared L2, IIRC); going through the FSB should be much slower.
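For anyone curious how core-to-core latency is usually measured directly (to be clear, this is a generic sketch, not what Sandra does internally), a common approach is a cache-line "ping-pong" between two threads. To actually compare same-die vs. cross-die pairs on a Core 2 Quad you would also pin the two threads to specific cores (e.g. with SetThreadAffinityMask on Windows); that part is left out here.

Code:
// Rough inter-core round-trip estimate: two threads bounce a single
// atomic flag (one cache line) back and forth and we time the loop.
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    constexpr int kIters = 1000000;
    std::atomic<int> flag{0};

    const auto start = std::chrono::steady_clock::now();

    std::thread pong([&flag] {
        for (int i = 0; i < kIters; ++i) {
            while (flag.load(std::memory_order_acquire) != 1) { /* spin */ }
            flag.store(0, std::memory_order_release);
        }
    });

    for (int i = 0; i < kIters; ++i) {
        flag.store(1, std::memory_order_release);
        while (flag.load(std::memory_order_acquire) != 0) { /* spin */ }
    }
    pong.join();

    const auto ns = std::chrono::duration_cast<std::chrono::nanoseconds>(
        std::chrono::steady_clock::now() - start).count();

    // One iteration = one full round trip = two cache-line handoffs,
    // so divide by 2 * kIters for a rough one-way latency.
    std::printf("approx one-way core-to-core latency: %.1f ns\n",
                static_cast<double>(ns) / (2.0 * kIters));
}

The result from a toy like this is very sensitive to which physical cores the OS happens to schedule the threads on, which is exactly why the same-die vs. over-the-FSB distinction matters for the numbers quoted above.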
 
Can anyone else try what I mentioned above? Run 13 or 15 or however many benchmark runs in succession, without closing the game between runs. I lose about 20% from the first benchmark to the thirteenth or so.

Use the highest settings you can get away with without using any commandline arguments that remove restrictions.

My settings are: everything at absolute max except textures, which are set to Medium. Remember that texture filtering has a step above Very High called Highest. Whether the settings are playable is not what this is about; it's the delta in average FPS between the first and last runs.

Post your CPU, GPU, GPU driver and OS too.

i7 920@3.2 GHz, GTX 285, GeForce 182.08 WHQL, XP32SP3

Just a quick update. My issue is not resolved with the 182.46 beta driver that nvidia just released.
 
I saw it, and it was the reason I mentioned that a dual core like the E6600 only needed a 600 MHz OC to match another CPU with 2 extra cores @ 2.4 GHz each.

You're comparing Conroe to Penryn there.

The Conroe-to-Conroe comparison puts the 3 GHz dual core (25% more clock speed) quite a bit slower than the 2.4 GHz quad, i.e.:

25.4/21 vs. 28.8/23 (avg/min fps)

But then swap the quad out for the 2.4 GHz dual-core Conroe in the above figures:

25.4/21 vs. 19/14

The increase in clock speed actually produces a greater-than-linear performance increase in the figures above, while, as mentioned before, keeping the same clock rate but doubling the cores gives well over 50% extra performance.
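Spelling out the arithmetic behind that, using the average-fps numbers quoted from the PCGH table (just a worked check; note the Conroe-vs-Penryn caveat about the 3 GHz part raised above):

Code:
// Percentage gains implied by the quoted average-fps figures.
#include <cstdio>

int main() {
    const double dual_24ghz = 19.0;   // C2D E6600 @ 2.4 GHz
    const double dual_30ghz = 25.4;   // 3 GHz dual core from the same table
    const double quad_24ghz = 28.8;   // C2Q Q6600 @ 2.4 GHz

    std::printf("+25%% clock (2.4 -> 3.0 GHz): +%.1f%% avg fps\n",
                (dual_30ghz / dual_24ghz - 1.0) * 100.0);
    std::printf("2 -> 4 cores at 2.4 GHz:     +%.1f%% avg fps\n",
                (quad_24ghz / dual_24ghz - 1.0) * 100.0);
}

So the 25% clock bump buys about 34% on average fps, and the extra two cores about 52% (and more on minimums: 23 vs. 14 is roughly +64%).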
 
I've finally tried this game out now. My bro bought it on Steam and we share the acct. :)

I couldn't get it to stop crashing on load initially. It turns out that Securom dislikes D3DOverrider. Nice. I also dislike the Social Club junk it forces you to install and load.

My biggest problem with the game is the inability to have FSAA. That really sucks a whole bunch. I'm playing it on my 50" TV thru my 8800GTX + C2Q Q6600. Runs butter smooth at 1280x720 but what a mess that is without any AA. Bumped it up to 1920x1080 and it looks a good bit cleaner but gets a bit choppy sometimes. Getting sick of deferred renderers that don't support AA.

I basically just started a new game and then immediately wandered out of the apartment and onto the streets. Not that interested in doing the "campaign" missions. I just like going native with sandbox games like this.
 
Hey, swaaye, would you mind doing the experiment I've outlined in this post? If you read my post before the one I am linking to and my subsequent ones, you should know what it entails. :)
 
Hey, swaaye, would you mind doing the experiment I've outlined in this post? If you read my post before the one I am linking to and my subsequent ones, you should know what it entails. :)

I have an ATI graphics card, but I really want to help you; the problem is that after each bench the game throws me into gameplay!

I'll make some time to do it anyways.
 
I've finally tried this game out now. My bro bought it on Steam and we share the acct. :)

I couldn't get it to stop crashing on load initially. It turns out that Securom dislikes D3DOverrider. Nice. I also dislike the Social Club junk it forces you to install and load.

My biggest problem with the game is the inability to have FSAA. That really sucks a whole bunch. I'm playing it on my 50" TV thru my 8800GTX + C2Q Q6600. Runs butter smooth at 1280x720 but what a mess that is without any AA. Bumped it up to 1920x1080 and it looks a good bit cleaner but gets a bit choppy sometimes. Getting sick of deferred renderers that don't support AA.

I basically just started a new game and then immediately wandered out of the apartment and onto the streets. Not that interested in doing the "campaign" missions. I just like going native with sandbox games like this.

Press "P" to enable DOF/blur filter. No setting is optimal but it is soft/blurrier look with much less shimmering and jaggies or shimmery jaggy mess! :mad:
 
I have an ATI graphics card, but I really want to help you; the problem is that after each bench the game throws me into gameplay!

I'll make some time to do it anyways.
Yeah, and you have to go to options and run a new benchmark. That's the way it works. It's tedious, I know. Thanks. :)

PS, the blur filter is horrible. I hate it so much. I never play with it enabled.
 
I'm a bit curious about cards with 1792MB or 2GB of RAM (real, effective video RAM), since the game says it needs 1.5GB on the highest setting.
Does anyone here have such a thing? I'm seeing a GTX260 with 1792MB at 240 euros, which doesn't seem too terrible. Not that I can buy it, but I'm wondering. These cards are so powerful anyway. GTA 4 is probably a lousy example of a game consuming that much VRAM, but I wonder how games would look using über amounts of memory.

Does GTA 4 look good in "extreme mode"? (And why do they provide such a silly option but not even supersampling?)
 