COD2 benches... a comparison point?

BTW, slightly off topic.... Excuse me if I'm mistaken, but aren't games like this built around the most powerful cards available at the time of release? Meaning, the more powerful cards play the game as it's meant to be played, no better no worse. Then if you're playing on a less powerful card the engine gives you the options to scale down effects or turn them off so the game will run better?
 
Stand on your right foot and touch your left knee with your right elbow while licking your right thumb...? While trying to scream "You were right, I was wrong!!" 10 times...?



;)
 
Hardknock said:
BTW, slightly off topic.... Excuse me if I'm mistaken, but aren't games like this built around the most powerful cards available at the time of release? Meaning, the more powerful cards play the game as it's meant to be played, no better no worse. Then if you're playing on a less powerful card the engine gives you the options to scale down effects or turn them off so the game will run better?


PC games are not built from the ground up for the high-end cards, not these days at least.
They're built for a somewhat "middle ground", with things added for higher-end cards or taken out for lower-end cards.
A game REALLY built from the ground up for the highest-end card wouldn't work too well when scaled down to lower-end cards.
Then again, most games just tend to be built according to DirectX standards to make them compatible with as much hardware as possible. A PC developer can't really develop a game specifically for one card and then just "scale down", because that one card would have proprietary features other cards just don't have, or do things in a "proprietary way" that would be difficult to replicate on other cards.
That's why DirectX exists: the developers only have to make their games compatible with whatever version of DirectX they want, and let the card (and the drivers) do the rest.

EDIT: Obviously some devs can add "specific features" of one card to the game, like they used to with Truform, but it doesn't seem to be the same these days. It's all SM2.0 or SM3.0 or whatever DirectX standard there is at the time.
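To illustrate with a minimal D3D9-style sketch (purely illustrative, not from any actual engine; the thresholds are just examples): at startup a renderer asks DirectX what the card can do and picks a code path from that, rather than targeting one specific GPU.

#include <d3d9.h>

// Pick a shader path from the caps the card reports, not from its name.
// Error handling omitted; illustrative thresholds only.
int ChooseShaderPath(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        return 3; // SM3.0 path
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return 2; // SM2.0 path
    return 1;     // SM1.x / fixed-function fallback
}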
 
I don't think comparisons can be made between consoles and PCs based solely off PC benchmarks. A console is much easier to optimise for than an open platform like the PC, neither is likely to be fully utilising the hardware etc.
But FWIW, assuming all was equal and the frame rates quoted for each were absolute (60fps @ 720p X360 v. 40.2 @ 1280x1024), the X360 fares only 5% better than the 7800 GTX/FX-57 (ex. AA) or 22% better adding AA (2xAA, 8x AF, 34.6 @ 1280x1024). If we were to linearly extrapolate RSX figures based on clockspeed, we see it edge out the X360. See how easy (and foolish) it is to draw conclusions that suit our purposes?
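For reference, the "linear extrapolation" above is nothing more than clock scaling - a sketch assuming the announced ~550MHz RSX clock against the 7800 GTX's 430MHz core, and perfect scaling, which no real game ever shows:

// Hypothetical linear clock scaling - the source of the ~28% figure in this thread.
const double gtx_clock = 430.0;                  // MHz, 7800 GTX core
const double rsx_clock = 550.0;                  // MHz, announced RSX clock
const double scale     = rsx_clock / gtx_clock;  // ~1.28
const double rsx_fps   = 40.2 * scale;           // ~51.4fps at 1280x1024, no AA
// 51.4fps at 1280x1024 is ~67.4M pixels/sec vs the X360's ~55.3M - hence "edge out".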
 
Hardknock said:
BTW, slightly off topic.... Excuse me if I'm mistaken, but aren't games like this built around the most powerful cards available at the time of release?

Not at all. For most games, the most powerful card at release will have only been available for a fraction of the game's development time. I'd be pretty confident in saying there are currently NO games built from scratch around the most powerful cards available now. By the time there are, those cards will be mainstream or lower end (like how Doom3 was built from scratch around the GeForce3, IIRC - at release there were much more powerful cards available).
 
Nicked said:
I don't think comparisons can be made between consoles and PCs based solely off PC benchmarks. A console is much easier to optimise for than an open platform like the PC, neither is likely to be fully utilising the hardware etc.
But FWIW, assuming all was equal and the frame rates quoted for each were absolute (60fps @ 720p X360 v. 40.2 @ 1280x1024), the X360 fares only 5% better than the 7800 GTX/FX-57 (ex. AA) or 22% better adding AA (2xAA, 8x AF, 34.6 @ 1280x1024). If we were to linearly extrapolate RSX figures based on clockspeed, we see it edge out the X360. See how easy (and foolish) it is to draw conclusions that suit our purposes?

While I understand the point you are trying to make, your percentages appear to be way off :???:

There is only a 10% pixel resolution difference between them. With Xbox 360 having a 33% better framerate you would then need to take 10% off that, which makes it still 30% better performance. With 2x AA added, again there is roughly a 45% performance advantage, but a 10% resolution decrease that needs to be taken into account, which still leaves Xbox 360 with a 41% better framerate...
 
Bill said:
http://www.firingsquad.com/hardware/call_of_duty_2_performance_ati_nvidia/default.asp

Finally! I have been waiting for these.

Since the X360 version is supposedly just as good, and running at 60 FPS..

Without AA a 7800GTX ruins it at 53.5 FPS at 1024X768 and 40.2 at 1280X1024, the closest rez's to 720P.

You can add 28% for RSX, as well.
How many times do we have to go through this?

RSX != 7800GTX

PS3 != PC CPU + PC GPU + other unknown PC parts running COD2 along with Windows XP, Norton Antivirus, etc.

Just because it fits your agenda doesn't mean it's a good comparison. I'm tired of seeing PS3 vs X360 comparisons in just about every thread as of late, especially with all the retarded arguments from people coming from console-specific forums. We're comparing performance on a closed platform vs. some PC benchmarks to draw conclusions on how the first system compares to a third, unrelated platform now?

Penis.

There, I feel better now.
 
Hardknock said:
While I understand the point you are trying to make, your percentages appear to be way off :???:
Not at all. If it appears that way it is because you're comparing 720p to 1280x1024 as if they are equal, which they clearly aren't.
720p, 60fps:
55,296,000 pixels/sec
1280x1024, 40.2fps:
52,690,944 pixels/sec
~5% difference (a tad less).

720p, 60fps:
55,296,000 pixels/sec
1280x1024, 34.6fps:
45,350,912 pixels/sec
~22% difference (again, a tad less).
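The same sums as a snippet, for anyone who wants to rerun them with different numbers (720p taken as 1280x720):

// Pixel throughput = fps * width * height.
const double x360   = 60.0 * 1280 * 720;    // 55,296,000 pixels/sec @ 720p
const double gtx    = 40.2 * 1280 * 1024;   // 52,690,944 pixels/sec, no AA
const double gtx_aa = 34.6 * 1280 * 1024;   // 45,350,912 pixels/sec, 2xAA/8xAF
// x360 / gtx    - 1 = ~0.049 -> ~5% in the X360's favour
// x360 / gtx_aa - 1 = ~0.219 -> ~22% in the X360's favour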
 
Wasn't Call of Duty DirectX based, and aren't nVidia cards better optimised for OpenGL code?

Edit: Oh yes, and only 8 players online for the Xbox 360 version? No anisotropic filtering?
 
Bill said:
http://www.firingsquad.com/hardware/call_of_duty_2_performance_ati_nvidia/default.asp

Finally! I have been waiting for these.

Since the X360 version is supposedly just as good, and running at 60 FPS..

Without AA a 7800GTX ruins it at 53.5 FPS at 1024X768 and 40.2 at 1280X1024, the closest rez's to 720P.

You can add 28% for RSX, as well.

hmmm....

weird...

my pc is running CoD2 @ 39-72 fps on a 6800 (no AA @ 1024 x 768)

and it looks a lot better than the 360 kiosk I've played...
 
Games are built for one of the following:

1) Some kind of random middleground based on developers not really optimising much at all.
2) Whichever card the engine coders happen to have in their machine.
2b) The latest card a manufacturer has given the developers to put in their machines.
3) Whichever card the developer's publisher has done a bundle deal with.

The "minimum" spec for development purposes is usually an out of date estimate of what machine most people will have by the time the game comes out. When it gets released the minimum spec on the box will be the spec of the worst machine anyone in QA got it to run on.

The recomended spec for development will be some kind of ludicrous estimate based on hype and wishing for miracles on the part of the publisher, typically involving buying the best possible machines and assuming 4x performance for an affordable machine in a years time. The spec on the box will probably be whatever costs around $1500-$2000 when the game launches, or alternatively whatever the producer bought with his completion bonus.
 
Thought I'd point out that the 360 is locked at 60 FPS - meaning it NEVER drops below, and never goes above.

You're looking at average figures for the PC cards and comparing it to the minimum of the X360. While not incredibly accurate, perhaps comparing the minimum FPS of the PC cards with the 60 FPS would be more realistic?

Obviously, if locked at 60 FPS, the 360 is completely capable of rendering the scenes at much more than 60 FPS; it's simply limited by the framerate the televisions are outputting.
 
shortround said:
Thought I'd point out that the 360 is locked at 60 FPS - meaning it NEVER drops below, and never goes above.

You're looking at average figures for the PC cards and comparing it to the minimum of the X360. While not incredibly accurate, perhaps comparing the minimum FPS of the PC cards with the 60 FPS would be more realistic?

Obviously, if locked at 60 FPS, the 360 is completely capable of rendering the scenes at much more than 60 FPS; it's simply limited by the framerate the televisions are outputting.

Where on earth are you getting *that* from? It *never* runs less than 60?? Have they achieved some kind of infinite performance? It's pretty much impossible to completely guarantee you'll never drop frames in all but the simplest and most conservative of cases. Console games tend not to be in that category.

Last time I saw the system, stuff was running a whole lot less than 60..
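For what it's worth, a frame "lock" is just a cap on the top end - a hypothetical loop (a sketch, nothing from the actual game) shows why it can't prevent drops:

#include <chrono>
#include <thread>

void UpdateAndRender() { /* hypothetical per-frame game work */ }

// Present at most once per 1/60s tick - a stand-in for waiting on vblank.
// The cap only limits the top end: if UpdateAndRender() takes longer
// than ~16.7ms, the frame is late and the rate drops below 60 anyway.
void GameLoop()
{
    using clock = std::chrono::steady_clock;
    const auto frame = std::chrono::microseconds(16667);
    auto next = clock::now() + frame;
    for (;;) {
        UpdateAndRender();
        std::this_thread::sleep_until(next);
        next += frame;
    }
}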
 
I think the love affair between hardknock and l-b is so cute and cuddly wuddley... errrr

CoD2 looks exciting for whenever it gets released...

need to find a 360 kiosk...
 
london-boy said:
Wh'y d'o yo'u hav'e to b'e so pick'y!!!!!one


;)


Seriously, the level of grammar on these boards makes (NOT MAKE'S) me cringe, especially when it's (=IT IS) their (NOT THERE or THEY'RE) first language!!
!eVo!-X Ant UK I take it is from the UK... Dear god... And I'm not British myself!!! English is more or less my second language!!


[/rant]

Yes london boy i am indeed from the UK
 
MrWibble said:
I'm going to hate myself for this post, but honestly I'm beginning to get a headache deciphering your posts.

Apostrophes should not be used for plurals - they're for possessives and contractions. They are not simply inserted before the last letter in a random selection of words because they look pretty.

"That's" should have had one because it's short for "That is".

"Get's" shouldn't, "News" isn't even a plural, and I'm not sure what "Code'd" is all about either.

I'd actually almost rather see people not use punctuation than liberally sprinkle it where it doesn't belong... that's just painful.

OK, you can return to your regularly scheduled flamewar now :)

That just the way ive been taut at school and the way ive always done things, but to make you happy ill do it the way every one else prefers
 
!eVo!-X Ant UK said:
That just the way ive been taut at school and the way ive always done things, but to make you happy ill do it the way every one else prefers

Now your doin it on purpos aintcha ;)
 
Totally off topic, but I've met kids with high-grade GCSEs who don't even know what a comma is used for. There was a change a while back away from 'stifling people's creativity by constricting them to regimental grammar' and as a result these people are oft incapable of expressing themselves clearly in a way people can understand. It's like the flippen' Middle Ages revisited without any defined spellings or grammatical laws. Try reading original Chaucer for an example of the very best English could manage back then. In contrast, check things like reports written about the Mary Rose and associated articles, which used different spellings for the same word in the same paragraph!

I let most spelling and grammatical errors slide on the Internet as there's an awful lot of non-native English speakers doing a very impressive job of wielding the language, plus when typing at speed it's easy to make common mistakes, and correcting them is often time spent without affecting an already readable text. But ugh, the way English as a language is heading makes me cringe. Clear expression will become very hard if the standards that took English society hundreds of years to develop to make things easier are dropped as quickly as they are dropping. I was actually talking with friends a while back on Messenger using smilies, and felt we were heading back to using pictographs. Perhaps in 300 years' time electronic pictographs will be the norm :???:
 
Well, Xbox 360's GPU is supposed to have kickass shader performance, right?

BTW, I downloaded the demo of COD2 and played it at 1280x1024 with the max settings it would allow (minus AA, AF, and I think it only let texture quality go up to medium). I believe it ran about 20-30fps on my 6600GT.

If they're going to compare the console version to a PC version though, they should eyeball it, adjusting the PC version until it looks identical to the console version, not just max out the PC version and assume that's equal.
 