COD2 benches... a comparison point?

I don't recall exactly.

There were some, though. This guy saying how much better the Emotion Engine was, etc.

And the sparks in Burnout 3? I don't know if you've heard, but they're better on PS2.
 
Bill said:
I don't recall exactly.

There were some, though. This guy saying how much better the Emotion Engine was, etc.

And the sparks in Burnout 3? I don't know if you've heard, but they're better on PS2.


Thanks...

Now, have you got anything to contribute to the thread?
 
Enough with those kiddie replies

The uninformative comments about "haters/trolls" should stop. And will stop.

When someone brings up what you consider an incorrect point, you have two options:
- Address the point and explain what you consider wrong.
- Or simply ignore/dismiss the claim.

Nothing else. Discussing other people's intentions, dark motives or hidden agendas is not constructive, nor even interesting in the first place (for God's sake, who cares what poster X or Y likes or hates).
Above all, it's not tolerated. Not at all.

Discussing a person's motives when he or she has made a rational point means that you cannot handle a coherent discussion, which is usually made of conflicting viewpoints.

This post especially concerns the new members.
 
You have a nice overclocked 6800LE, probably with 16 pixel pipelines and 6 vertex shaders unlocked, close to 6800GT spec... but then again, I get your point. :)
 
london-boy said:
Hardknock, the last thing I want is to side with pjliverpool, but if you read what you quoted from him, I don't think his statement is so over the top as to deserve your response.
He said he wants to see the game running under those conditions and will not trust an MS PR man until he sees it with his own eyes.
Is that really unacceptable? Don't think so.
You've said much more over-the-top things in your history here; or does your response stem from the fact that his scepticism is towards MS and not the other companies you keep doubting (sometimes in a much less civil way) all the time on here?

First of all, it was not an "MS PR man"; it was an interview with the developer of the game, who stated a rock-solid 60fps.

Is it unacceptable to automatically discredit them? In my eyes, yes. The developers have no reason to lie; this is not going to boost sales to any considerable extent. The only thing we have to go on is their word, and unless you have some proof to show they're wrong, there's nothing to argue about.

One thing of note: you sure like to call me out when the mood suits you. Your presumption that I doubt other companies (obviously referring to Sony, knowing your posting history) just because I like Xbox is not only ludicrous but becoming redundant. Unlike some people here, I support all consoles. If something's wrong with Xbox 360, I call that out just as quickly as I'll call out something for PS3. Stop trolling people in threads and discuss the topic at hand for once. Thank you.
 
I once knew a producer - we'll call him Bob - who was working on a game. Let's call the game "Sandwich Attack" (can you tell it's lunchtime?)

Bob gave interviews to the press. He claimed his game would be at 60fps. He even showed it to people and said to them "this game is running at 60."

Sandwich Attack was in fact a bit of a turkey technically. It was lucky if it managed 30 a lot of the time - I don't think I ever saw it get to 60.

However the press not only quoted Bob in previews, but in fact went on to state in final reviews that the game played at a smooth 60fps.

You know what? I'm not even sure if Bob knew his game didn't run at 60. Obviously the reviewers didn't, whether they played it or not. To me it sticks out like a sore thumb (though depending on the game I might not care), but others, including some very highly qualified and experienced graphics engineers, can't see a difference at all.

This is a true story - the names have been changed to protect the innocent/incompetent. The point is that when it comes to things like framerates there's very little point debating any kind of claims, regardless of who they come from, developers included.

I certainly wouldn't start crunching numbers and making any kind of statement about performance...
 
Vysez said:
The uninformative comments about "haters/trolls" should stop. And will stop.

When someone brings up what you consider an incorrect point, you have two options:
- Address the point and explain what you consider wrong.
- Or simply ignore/dismiss the claim.

ok

But what is the point of a thread like this?
It all started by comparing the X360 and PC versions of COD2.

And I posted the proof (with links to video):
we have a review from a trusted site that says we have 60 solid, LOCKED fps in every situation.
Then we have the clear, final comment from the president of IW, who says plainly that the X360 version has a better framerate than the top PC available.

So the thread is over, but some people continue to say "I don't want to believe anyone", others make false statements like "COD2 for X360 is 1088x613" or "there are framerates other than 30 and 60 fps with vsync, I'm SURE", and so on.

So I think this thread has no reason to exist,
and excuse me if I think that some people post "to attack" one platform, as that seems to be the case.
But if saying that is against the internal rules of the board, OK, I'll shut my mouth. Excuse me, but all this is sad :cry:
 
Is there a FRAPS for consoles?

I'd like more hard numbers.

For example, Xbox gets criticized for 30 FPS games a lot, yet I bet if you tested both libraries, it has far more 60 FPS games than PS2 as a percentage.
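For what it's worth, the usual workaround is to capture the console's video output at a fixed rate and count duplicate frames: a 60Hz capture of a 30fps game shows every frame twice. A rough sketch of that idea in Python, assuming OpenCV is installed and "capture.avi" is a hypothetical 60Hz capture file:

```python
# Estimate a console game's framerate from a fixed-rate video capture by
# counting frames that actually changed. "capture.avi" is hypothetical.
import cv2
import numpy as np

cap = cv2.VideoCapture("capture.avi")
capture_fps = cap.get(cv2.CAP_PROP_FPS) or 60.0  # fall back to 60Hz if unknown

unique, total = 0, 0
prev = None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    total += 1
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # A frame counts as "new" if it differs noticeably from the last one.
    if prev is None or np.mean(cv2.absdiff(gray, prev)) > 1.0:
        unique += 1
    prev = gray
cap.release()

if total:
    print(f"average game framerate: ~{capture_fps * unique / total:.1f} fps")
```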
 
SynapticSignal said:
No, he's totally right.
He's the president of Infinity Ward; you're a guy who has doubts about a simple vsync question -> you don't have the "knowledge base" to refute him ;) (don't blame me for this)
You can't know how the X360 compares to the PC with COD2, but he can.
Simple, clean ;)

Ok, so for the record are you saying that you interpret his statement to mean that Xenos can match the performance of the R600 and that you agree with that statement?

Tell me, if ATI is capable of such an incredible feat (in volume), then what's the deal with the R520? Why did they spend so much time and money getting it to work when it's vastly inferior to the Xenos in every way and is about to be shown up by the GTX 512MB? I know some of the Xenos patents are owned by MS, but wouldn't it have made far more sense for ATI simply to release a variant of it on the PC?


This example is totally wrong. I'm assuming that, maybe, you are one of the X360 haterz or what? :rolleyes:

I don't believe the example is wrong. It simply shows that a GPU can be technically two generations ahead of another GPU while still being slower, hence proving the point that technical advancement does not equal raw rendering power.

And no, I don't hate the X360; read my sig, I already have it on pre-order.


Unbeliever, what kind of proof do you need after the words of trusted reviewers and the words of the president of IW?

Come on, this thread is becoming ridiculous ("I don't wanna believe the trusted sources") and should be closed by an admin :rolleyes:

Just like you or me, a reviewer, even a trusted and great one, can't tell that a game never dips below 60fps just by looking at it. No one has played the full game all the way through yet, to say nothing of multiplayer, so only once that has happened, and no one anywhere at any time has experienced any slowdown in the game whatsoever, can you be certain that the lowest framerate it ever achieves is 60fps.

And as for the president of IW, like I said, he's marketing his own game. The same claim has been made of many console games in the past which, when they actually got to the shelves, demonstrated visible slowdown, hence proving the claims of the developers false.

Again, you don't know how vsync works.
Peace.

Actually I do, and I was completely correct in my assertion that vsync doesn't have to halve your framerate when you drop below the refresh rate. With triple buffering it goes down in thirds, so you have 40fps to fall back on. And again, vsync doesn't have to be turned on in a console game - they could just cap the framerate at 60fps (like Doom 3).
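To make the quantization point concrete, here is a back-of-the-envelope sketch (my own illustration, not anything from the thread) of how double-buffered vsync rounds frame times up to whole refresh intervals on a 60Hz display, while a third buffer lets the average rate track the raw render rate:

```python
# Why double-buffered vsync quantizes framerate on a 60Hz display: a new
# frame can only be shown on a vblank, so a frame that takes even slightly
# over 16.7ms occupies two refresh intervals (30fps), three (20fps), etc.
# With triple buffering the GPU keeps rendering during the wait, so the
# average presented rate stays close to the raw render rate.
import math

REFRESH_MS = 1000.0 / 60.0  # one 60Hz refresh interval

def double_buffered_fps(frame_ms):
    # Each frame occupies a whole number of refresh intervals.
    return 1000.0 / (math.ceil(frame_ms / REFRESH_MS) * REFRESH_MS)

def triple_buffered_fps(frame_ms):
    # Rendering overlaps the wait, so on average only the 60Hz cap applies.
    return min(60.0, 1000.0 / frame_ms)

for frame_ms in (16.0, 18.0, 25.0, 34.0):
    print(f"{frame_ms:4.1f} ms/frame: "
          f"double-buffered {double_buffered_fps(frame_ms):4.1f} fps, "
          f"triple-buffered {triple_buffered_fps(frame_ms):4.1f} fps")
```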
 
MrWibble said:
I once knew a producer - we'll call him Bob - who was working on a game. Let's call the game "Sandwich Attack" (can you tell it's lunchtime?)

Bob gave interviews to the press. He claimed his game would be at 60fps. He even showed it to people and said to them "this game is running at 60."

Sandwich Attack was in fact a bit of a turkey technically. It was lucky if it managed 30 a lot of the time - I don't think I ever saw it get to 60.

However the press not only quoted Bob in previews, but in fact went on to state in final reviews that the game played at a smooth 60fps.

You know what? I'm not even sure if Bob knew his game didn't run at 60. Obviously the reviewers didn't, whether they played it or not. To me it sticks out like a sore thumb (though depending on the game I might not care), but others, including some very highly qualified and experienced graphics engineers, can't see a difference at all.

This is a true story - the names have been changed to protect the innocent/incompetent. The point is that when it comes to things like framerates there's very little point debating any kind of claims, regardless of who they come from, developers included.

I certainly wouldn't start crunching numbers and making any kind of statement about performance...


Thank you very much.
 
Bill said:
Is there a FRAPS for consoles?

I'd like more hard numbers.

For example, Xbox gets criticized for 30 FPS games a lot, yet I bet if you tested both libraries, it has far more 60 FPS games than PS2 as a percentage.

As a percentage it would be normal, since PS2 has many times the number of games the Xbox has.
One will always find PS2 at a disadvantage "in percentage" for anything: percentage of AAA titles compared to the rest, percentage of bouncing boobs compared to the rest, percentage of games that don't require your brain compared to the rest...

PS2's "rest" is so big that the percentages of one particular group of games will always be lower than on other consoles.
 
SynapticSignal said:
Haters never stop :rolleyes:

When a game runs at 60fps with vsync, a little drop in frames per second causes a huge visual drop from 60 to 30 fps,
and this is CLEARLY VISIBLE.

Some of you don't (want to) believe trusted reviewers or the president of Infinity Ward...
so who can we ask, the Pope?
:LOL:

Triple buffering eliminates that drop. Many GameCube games used triple buffering, though very few Xbox games did. Since triple buffering is now part of the DX9 spec, I'd imagine that it's also a standard feature on the Xbox 360. I'm not sure if you can disable vsync on a TV though; I know that when using the video out on my video card, vsync is always enabled.

That, and COD2 could always lower detail to avoid a drop.

BTW, for the guy with the COD2 demo running on a 6800LE...pretty sure the demo doesn't allow you to max settings. For instance, I think it only allows up to medium texture quality.

Again, you don't know how vsync works.
Peace.

It's so frustrating that people don't acknowledge triple buffering! It's been around since practically the dawn of 3D graphics! It's part of the DX9 spec and enabled by default!

I know some of the Xenos patents are owned by MS, but wouldn't it have made far more sense for ATI simply to release a variant of it on the PC?

Xenos has a rather low fillrate compared to current ATI parts, plus the 10MB eDRAM would be smaller than some high-resolution framebuffers.
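As a rough check on that last point, here is my own arithmetic (assuming 4 bytes of colour and 4 bytes of depth/stencil per sample at 720p); on those assumptions, even a plain 720p target is in the same ballpark as the eDRAM, and anti-aliased targets are well past it:

```python
# Back-of-the-envelope framebuffer sizes for a 1280x720 render target,
# assuming 4 bytes of colour and 4 bytes of depth/stencil per sample.
width, height = 1280, 720
for samples in (1, 2, 4):  # no AA, 2xAA, 4xAA
    size = width * height * (4 + 4) * samples
    print(f"{samples}x: {size / 2**20:.1f} MiB")  # ~7.0, ~14.1, ~28.1 MiB
```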
 
MrWibble said:
I once knew a producer - we'll call him Bob - who was working on a game. Let's call the game "Sandwich Attack" (can you tell it's lunchtime?)

Bob gave interviews to the press. He claimed his game would be at 60fps. He even showed it to people and said to them "this game is running at 60."

Sandwich Attack was in fact a bit of a turkey technically. It was lucky if it managed 30 a lot of the time - I don't think I ever saw it get to 60.

However the press not only quoted Bob in previews, but in fact went on to state in final reviews that the game played at a smooth 60fps.

You know what? I'm not even sure if Bob knew his game didn't run at 60. Obviously the reviewers didn't, whether they played it or not. To me it sticks out like a sore thumb (though depending on the game I might not care), but others, including some very highly qualified and experienced graphics engineers, can't see a difference at all.

This is a true story - the names have been changed to protect the innocent/incompetent. The point is that when it comes to things like framerates there's very little point debating any kind of claims, regardless of who they come from, developers included.

I certainly wouldn't start crunching numbers and making any kind of statement about performance...

So the moral of the story is all developers lie and we can't believe anything anyone says?

Just want to make sure you and L-B hold this opinion for every developer on all platforms from this day forward. Henceforth, we will never again know the framerate of any console game; they all lie.

When Polyphony comes out and says GT4 is 60FPS, or Full Auto's developer does, everyone just goes "ok", but for some reason IW (one of the most respected developers on the planet) lies through their teeth. Anyone wanna explain why we are assuming this?

Sorry, but around here we give developers the benefit of the doubt, especially when making TECHNICAL claims about their product, so unless you have some proof that IW would outright lie to their fanbase, keep the mindless pessimism and preconceived opinions out of this thread (that wasn't directed at you, Wibble).
 
I just have to ask the question: if we can't even believe that the PC Editor of 1UP.com is capable of determining the difference between 60FPS and low 30's, 40's and 50's, then why the hell does it matter that every game be at 60FPS?

I mean, it's either noticeable or it's not. It's either important, or it's not. This guy is as close as you can get to an expert in the field, and if he is incapable of telling the difference who IS capable? And why does it matter?

This is a little OT, just wondering how some people can claim 60FPS is "essential for next-gen gaming", then at the same time say that nobody, not even PC game reviewers, can tell the difference between 60FPS and lower.
 
scooby_dooby said:
So the moral of the story is all developers lie and we can't believe anything anyone says?

Well, kind of :)

What I'm really trying to say is that sometimes developers *do* lie, but sometimes they just get things a little wrong, and sometimes things change. A lot of the time one would hope that what they say is reasonably accurate.

I have no reason to doubt this particular game runs at 60. I would probably give any developer the benefit of the doubt about any claim that wasn't in some way hard to believe (a game running at 60fps sounds completely reasonable to me).

My point is really just that when some doubt does creep in, there is little point arguing over who is the most reliable source of information when, in only a short amount of time, we can just go and look at the game and establish the truth. And even having done so, the only thing we will have established is whether or not this particular console can run this particular version of this particular game. It's really not a good enough sample to start making estimates of how powerful the machine is.

Not everything is black and white - there is lots of uncertainty involved here and until that gets removed by actually observing the final product, this entire argument seems rather futile.
 
Hello there,

I am not sure if the Xenos has been optimized and/or recommended for triple buffering, and it is quite rare in games as far as I know. For one, you are forced (unless it is 480p without AA) to store one of the back buffers in main memory, which will effectively consume bandwidth, something that is not desired. Another thing is that it will also take up a good chunk of memory. I am not sure if you need to store z-depth in the back buffer kept in main RAM, but with or without z-depth it will take several megabytes to store, at least 7 megabytes if the rendered resolution is 720p. The main advantage of triple buffering is to reduce the tearing effect; another nice "side effect" is that slowdowns will look somewhat smoother than with double buffering.
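For reference, the arithmetic behind figures like that "at least 7 megabytes" (my own numbers, assuming 32-bit colour and, optionally, a 32-bit z/stencil plane at 1280x720):

```python
# Memory cost of one extra 1280x720 buffer: about 3.5 MiB for colour
# alone, about 7 MiB if a z/stencil plane is stored alongside it.
width, height = 1280, 720
pixels = width * height
print(f"colour only:    {pixels * 4 / 2**20:.1f} MiB")  # ~3.5 MiB
print(f"colour + depth: {pixels * 8 / 2**20:.1f} MiB")  # ~7.0 MiB
```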
 
scooby_dooby said:
I just have to ask the question: if we can't even believe that the PC Editor of 1UP.com is capable of determining the difference between 60FPS and low 30's, 40's and 50's, then why the hell does it matter that every game be at 60FPS?

I mean, it's either noticeable or it's not. It's either important, or it's not. This guy is as close as you can get to an expert in the field, and if he is incapable of telling the difference who IS capable? And why does it matter?

This is a little OT, just wondering how some people can claim 60FPS is "essential for next-gen gaming", then at the same time say that nobody, not even PC game reviewers, can tell the difference between 60FPS and lower.
That isn't OT; that is essential to your misunderstanding of the problem. There are plenty of us who can tell the difference between 60 and 30 frames a second, but then there are plenty of people like you who say "it's either noticeable or it's not" when they obviously can't tell the difference between 30fps and something higher, and then some of those people go and make bold claims that the game runs at a rock-solid 60fps when that is hardly the case. Again, maybe the final game does run locked at 60fps, and that would be great, but I can assure you that the demo does not.
 
Shompola said:
Hello there,

I am not sure if the Xenos has been optimized and/or recommended for triple buffering, and it is quite rare in games as far as I know. For one, you are forced (unless it is 480p without AA) to store one of the back buffers in main memory, which will effectively consume bandwidth, something that is not desired. Another thing is that it will also take up a good chunk of memory. I am not sure if you need to store z-depth in the back buffer kept in main RAM, but with or without z-depth it will take several megabytes to store, at least 7 megabytes if the rendered resolution is 720p. The main advantage of triple buffering is to reduce the tearing effect; another nice "side effect" is that slowdowns will look somewhat smoother than with double buffering.
Vsync is what stops the tearing effect, regardless of how many buffers you use. Also, triple buffering has been around for ages, and the extra buffer doesn't need z-depth or take much bandwidth; it is basically just an extra front buffer to be used in case the back buffer isn't finished with a new frame in time for the refresh.
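To illustrate that "extra front buffer" idea, here is a toy sketch of my own (illustrative only; real swap chains live in the display driver, and the buffer bookkeeping here is simplified):

```python
# Toy model of a swap chain: the renderer draws into free buffers and
# queues finished frames; at each vblank the display shows the newest
# finished frame and recycles the rest. With three buffers the renderer
# rarely stalls; with two, it stalls whenever a finished frame is still
# waiting for the next vblank.
from collections import deque

class SwapChain:
    def __init__(self, num_buffers=3):  # 2 = double, 3 = triple buffering
        self.free = deque(range(num_buffers))
        self.ready = deque()  # frames finished but not yet displayed

    def try_render(self):
        if not self.free:
            return False  # renderer must stall until a vblank frees a buffer
        self.ready.append(self.free.popleft())  # "draw" and queue the frame
        return True

    def vblank(self):
        # Display the newest finished frame; recycle everything older.
        while len(self.ready) > 1:
            self.free.append(self.ready.popleft())
```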
 
kyleb said:
That isn't OT; that is essential to your misunderstanding of the problem. There are plenty of us who can tell the difference between 60 and 30 frames a second, but then there are plenty of people like you who say "it's either noticeable or it's not" when they obviously can't tell the difference between 30fps and something higher, and then some of those people go and make bold claims that the game runs at a rock-solid 60fps when that is hardly the case. Again, maybe the final game does run locked at 60fps, and that would be great, but I can assure you that the demo does not.

Maybe you're missing my point. The guy is a PC editor; it's his JOB to play and review PC games all day long. My point is this isn't just some random "guy on the internet"; this guy should have plenty of experience to draw from and should have a good idea what framerate games are running at. And his exact words were "It doesn't skip a beat... no slowdown at all".
 