xbox360 kiosk + CoD2 - FSAA - Aniso + Trilinear

Chalnoth said:
Well, duh, but you can expect that most of the time the framerate will be above 60 fps if the average is 100 fps.
I agree with Chalnoth, an average 100FPS means you will be getting a constant frame rate of over 60FPS. So I'll gladly take an average 100FPS game over a constant/average 60FPS game when both run at the same level of detail and resolution.
 
Alstrong said:
If I were a developer, I would want the majority to experience the game at its best visual quality. The X360 is a much more economical means to that end. More people would have a much better chance at experiencing the game at such high detail.

What does that have to do with which system I choose to buy? Anyway, even with the same core graphics the PC version of any X360 game can look and play better due to superior image quality and more FPS.

If I want to play all my games at 1920x1200 with 6x Adaptive FSAA/16x Quality AF at a solid 60+ fps, is the X360 going to let me do that?

Besides, after a year or two devs start implementing extra graphical features into the PC versions of games anyway, so clearly most don't agree with your reasoning. Why deliberately hobble a game on the PC when mainstream PCs are capable of so much more?

100fps in CoD2 at max detail by the end of this month, you say? Is that a locked fps or are you counting the multitude of drops common with most games?

Of course it's an average, who talks about PC framerates in terms of what they are locked at? Do you have some proof that the X360 version of the game never falls below 60fps? How do you know that's not just an average as well?

How much are you going to have to spend just to brag?

What does that have to do with it? I already stated above - assume I have the money to buy such a PC and spending that amount of money on a PC is not a problem for me. So it's not for you to question my spending habits, since it could be easier for me to purchase a high-end PC than for you to purchase an X360 (I'm talking hypothetically here by the way - I don't have that kind of money and that's why I'm getting an X360). What matters is the end result - which provides the better experience.
 
pjbliverpool said:
What does that have to do with which system I choose to buy? Anyway, even with the same core graphics the PC version of any X360 game can look and play better due to superior image quality and more FPS.

If I want to play all my games at 1920x1200 with 6x Adaptive FSAA/16x Quality AF at a solid 60+ fps, is the X360 going to let me do that?

What does that have to do with it? I already stated above - assume I have the money to buy such a PC and spending that amount of money on a PC is not a problem for me. So it's not for you to question my spending habits, since it could be easier for me to purchase a high-end PC than for you to purchase an X360 (I'm talking hypothetically here by the way - I don't have that kind of money and that's why I'm getting an X360). What matters is the end result - which provides the better experience.

A lot of this comes down to where and what kind of games you play. Microsoft talks a lot about the '10 foot experience', where a system is designed to be utilized from the couch (or at least from 10 feet if you sit on the floor). This experience is not improved a whole heck of a lot by adding things like 16xAF or 1920x1200 resolution (especially because there's less than half a percent of people with displays that could leverage such an output).

The problem with the PC game is that it's about an 18" experience where you can see EVERY detail, so every bump in AF, AA, or resolution improves the experience.

Clearly, the 360 has been designed from the ground up for the 720p/10 foot experience. What really needs to be compared is the PC at 18 inches and the 360 at 10 feet, because that's what they were designed for. From a price/performance perspective, the 360 blows the PC out of the water. You would need to spend $300 on just the video card to get 720p.

IF you take price out of it and focus on what you PREFER, there ARE things that the 360 just can't replace, such as the MMORPG experience and the preferred KB/mouse FPS control scheme. But sports games, on Live, on a 110" DLP projector, well that's just awesome. :)

There is no real answer here. We can argue it, but it ends up coming down to the experience YOU want, how you play, what you can live with, and your budget. I don't think anyone can question that a 360 is a tremendous value though.
 
expletive said:
You would need to spend $300 on just the video card to get 720p.
No way. I played at 1280x960 with 2x AA and 16-degree AF in nearly all of my games just fine on a 6600 GT. A few newer games force me to go lower, but this is only a $140 card, which is far below the $300 limit you set.
 
Chalnoth said:
Well, duh, but you can expect that most of the time the framerate will be above 60 fps if the average is 100 fps.

BTOA said:
I agree with Chalnoth, an average 100FPS means you will be getting a constant frame rate of over 60FPS. So I'll gladly take an average 100FPS game over a constant/average 60FPS game when both run at the same level of detail and resolution.


Well, I'd rather have a consistent frame rate than one that jumps all over the place. I don't care if it jumps from 60 to 140fps during the course of the game, I'd still notice frame rate drops and it's annoying. ;) And therefore, not the best experience (for me). Sure, I could enable V-sync on an LCD (for 60fps), but that defeats the purpose of arguing for hardware that can do 100fps, does it not? You spend the money to get that high performance, or at least the ability to see that performance at some high resolution that you don't see on the console.

If you meant that with 100fps average, you're guaranteed 60fps all the time (and enabling V-Sync does help me), then....say so. (I am not going to assume what you "really" meant, so please be precise).

pjbliverpool said:
What does that have to do with which system I choose to buy?
Anyway, even with the same core graphics the PC version of any X360 game can look and play better due to superior image quality and more FPS.

Nothing, I was just making a statement.

Besides, after a year or two devs start implementing extra graphical features into the PC versions of games anyway, so clearly most don't agree with your reasoning. Why deliberately hobble a game on the PC when mainstream PCs are capable of so much more?

That's funny you mention PC versions, because usually the PC enthusiasts complain night and day about crappy ports, with few exceptions. And now you're talking about mainstream and extra graphical specs? Most mainstream users have cards that are less capable than the console at this time, probably even for another couple of years yet. If you hadn't realized it yet, Xenos is beyond the SM3.0 spec.

Why hobble a game? You've got to have a higher state of quality in the first place to hobble the game. The "best" the developer decided on going for is the best you'll see. It is also a balance between how much money they are going to spend on the target audience. As I said in my statement previously, the targets are the majority of users, and there is no denying that most users on PC will not be experiencing the game at max detail and decently high resolution at a good framerate compared to the Xbox 360's value here.

Of course it's an average, who talks about PC framerates in terms of what they are locked at? Do you have some proof that the X360 version of the game never falls below 60fps? How do you know that's not just an average as well?

Exactly, on PC the framerates fluctuate wildly unless you throw a crazy amount of hardware specs at it. ;) Usually on consoles, developers who say 30 or 60fps mean constant. It's a different lingo/expectation. So, no I do not know for sure because I have not played the Xbox 360 version. All I have is a direct feed video, which does not drop frames at all.

What does that have to do with it?

The thing is that you are removing the problem with PCs by stating that you have no qualms about spending money. Everyone has problems getting to such a high level of computer hardware specs. Cost is a critical factor when comparing PCs and consoles by nature.

Previously, you mentioned comparing best console to best PC barring price. The thing is that there will be millions more users of the best console compared to the highest end graphics chips in conjunction with high end CPUs because of the price. (All talking within 1-2 years of launch, by then you get into the advantages of PC: waiting for new tech).

If money were no object, of course you could spend enough money to buy yourself an IMAX and enough computing hardware to display games at whatever resolution you want for the best experience possible. By removing price, you are now able to go to the extreme. I might as well remove the free-space limit for the speed of light just so I can travel faster (which I can actually).
 
Well, I'd rather have a consistent frame rate than one that jumps all over the place. I don't care if it jumps from 60 to 140fps during the course of the game, I'd still notice frame rate drops and it's annoying.
How can you be complaining about a game running at an average 100FPS? A game running an average 100FPS should dip between 80-120FPS, not 60-140. Besides, people wouldn't notice that much of a difference when a game runs at 60FPS+.

I just don't see how you can complain about games that run at a 60FPS+ rate, it's just retarded. Now maybe if it was an average 30-45FPS game, then I would agree with you.

If you meant that with 100fps average, you're guaranteed 60fps all the time (and enabling V-Sync does help me), then....say so. (I am not going to assume what you "really" meant, so please be precise).
You should already know an average 100fps game would have a frame rate of about 80-120FPS. Even if so, you shouldn't notice that much of a difference when a game runs at 60FPS+.
 
Alstrong said:
That's funny you mention PC versions, because usually the PC enthusiasts complain night and day about crappy ports, with few exceptions. And now you're talking about mainstream and extra graphical specs? Most mainstream users have cards that are less capable than the console at this time, probably even for another couple of years yet. If you hadn't realized it yet, Xenos is beyond the SM3.0 spec.
Night and day? Hardly. PC enthusiasts like the PC for a reason: we like PC games. And we typically prefer games that are not ports. The vast majority of the games I play are not available on a console at all.

Why hobble a game? You've got to have a higher state of quality in the first place to hobble the game. The "best" the developer decided on going for is the best you'll see. It is also a balance between how much money they are going to spend on the target audience. As I said in my statement previously, the targets are the majority of users, and there is no denying that most users on PC will not be experiencing the game at max detail and decently high resolution at a good framerate compared to the Xbox 360's value here.
Most PC ports today have slight graphical improvements. The complaints of bad ports are typically with respect to low performance for the graphics (compared to other PC games), or poor controls.

Exactly, on PC the framerates fluctuate wildly unless you throw a crazy amount of hardware specs at it. ;) Usually on consoles, developers who say 30 or 60fps mean constant.
God. This is only because PC users typically run at much higher refresh rates, and often will even run with VSYNC disabled (this is the default for most games, actually). With any game, no matter what, you're going to have high fluctuations in framerate if vsync is disabled. It's simply unavoidable, because sometimes there's more stuff on the screen than at other times.

If you prefer to enable VSYNC and get constant framerate, that's going to be quite possible on a PC, but you may need to run at a slightly lower resolution.

The thing is that you are removing the problem with PCs by stating that you have no qualms about spending money. Everyone has problems getting to such a high level of computer hardware specs. Cost is a critical factor when comparing PCs and consoles by nature.
Sure it is. And if you want to have a high-end PC for other reasons (for me those are research and video encoding), then the marginal cost to make it into a great gaming PC can be relatively small (it's just a video card, basically).

And due to the nature of PCs, they are much more incremental in their improvements. So you typically expect a brand-new console to be better than the PC at release, but start to look worse within a year of introduction. Now we have PCs that are going to remain better than the X-Box 360 at the 360's launch, but they will cost a lot. In a year, we'll have much cheaper PCs that will be better than the X-Box 360, and likely have graphics cards in the $150 range that will beat the 360's.

Previously, you mentioned comparing best console to best PC barring price. The thing is that there will be millions more users of the best console compared to the highest end graphics chips in conjunction with high end CPUs because of the price. (All talking within 1-2 years of launch, by then you get into the advantages of PC: waiting for new tech).
And look at today. Today pretty much any store-bought computer that has anything but integrated graphics will do better than today's consoles at games. This has been the case for a couple of years now. And it will be the case again in a couple of years.
 
Chalnoth said:
Now we have PCs that are going to remain better than the X-Box 360 at the 360's launch, but they will cost a lot. In a year, we'll have much cheaper PCs that will be better than the X-Box 360, and likely have graphics cards in the $150 range that will beat the 360's.

That's a big, HUGE maybe. I don't see any PCs with a triple-core SMT CPU that outputs 115 GFLOPS with three VMX units attached in the near future.
 
blakjedi said:
That's a big, HUGE maybe. I don't see any PCs with a triple-core SMT CPU that outputs 115 GFLOPS with three VMX units attached in the near future.
Sure, but we'll have dual-core CPUs with similar processing capabilities. The PowerPC core that these consoles are using isn't actually that great compared to PC CPUs.
 
Chalnoth said:
No way. I played at 1280x960 with 2x AA and 16-degree AF in nearly all of my games just fine on a 6600 GT. A few newer games force me to go lower, but this is only a $140 card, which is far below the $300 limit you set.

A 6600 GT may be fine for some current games, but if you tried running some of the games that the 360 WILL be capable of (e.g. the shader-intensive ones coming down the pike), there's no way that card could do it. I have a 7800 GTX and F.E.A.R. makes it chug even at 1152x864 with 2xAA (and this is with an A64 3800+ and 2GB RAM).
 
I think this time we may see PC games taking a lot more time to overtake next-gen console visuals, compared to this gen. But as major PC (FPS) titles start appearing on next-gen consoles, the wait may be even longer.
 
BTOA said:
You should already know an average 100fps game would have a frame rate of about 80-120FPS. Even if so, you shouldn't notice that much of a difference when a game runs at 60FPS+.

pardon my intrusion, but your "knowledge" is astounding.
humor me, try calculating the average of the following sequence:

120, 120, 120, 120, 20

now let's try to relate that to reality. take 1 second, split it into equal parts of 200ms, imagine for the first 4 parts you watch a framerate of 120fps, i.e. 24 frames per 200ms, and for the last 1/5th of the second you see the whopping total of 4 frames. now, imagine you're playing an fps under those conditions where you're constantly moving your view. ..*waiting for you to imagine*.. now, compare that to a steady 60fps. still not much of a difference?
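
To spell that arithmetic out, here is a minimal Python sketch of the scenario above (the slice length and frame counts are the ones from the post; the script itself is only illustrative):

slice_seconds = 0.2
frames_per_slice = [24, 24, 24, 24, 4]   # four slices at 120fps, one at 20fps

total_frames = sum(frames_per_slice)                  # 100 frames
total_time = slice_seconds * len(frames_per_slice)    # 1.0 second
print(total_frames / total_time)                      # 100.0 "average" fps

for frames in frames_per_slice:
    fps = frames / slice_seconds
    frame_time_ms = 1000 * slice_seconds / frames
    print(f"{fps:.0f} fps slice -> {frame_time_ms:.1f} ms per frame")
# The last slice holds each frame for 50 ms, versus roughly 16.7 ms at a steady 60fps.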
 
expletive said:
A 6600 GT may be fine for some current games, but if you tried running some of the games that the 360 WILL be capable of (e.g. the shader-intensive ones coming down the pike), there's no way that card could do it. I have a 7800 GTX and F.E.A.R. makes it chug even at 1152x864 with 2xAA (and this is with an A64 3800+ and 2GB RAM).
This is why I said it will take a couple of years before a mid-range graphics card will outperform the X360.
 
darkblu said:
now let's try to relate that to reality. take 1 second, split it into equal parts of 200ms, imagine for the first 4 parts you watch a framerate of 120fps, i.e. 24 frames per 200ms, and for the last 1/5th of the second you see the whopping total of 4 frames. now, imagine you're playing an fps under those conditions where you're constantly moving your view. ..*waiting for you to imagine*.. now, compare that to a steady 60fps. still not much of a difference?
So the reality is, if you count the average framerate simply by taking the average of the framerates for each frame, you're doing it quite wrongly (for in-game benchmarks).

The proper way to find the average framerate for in-game benchmarks would be to take the total number of frames rendered, and divide by the amount of time it took to render them. This will give the proper behavior where one second at 100 fps and one second at 50 fps will average to 75 fps.

If you're doing a timedemo, however, it will be recorded at a constant framerate (typically 30Hz). So the proper thing to do is to actually take the average framerate for each individual frame (since each frame now represents a specific space in time, as opposed to normal play).
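
As a rough illustration (not something from the thread itself), here is how the two averaging methods differ on the one-second-at-100fps, one-second-at-50fps example:

frame_times = [1 / 100] * 100 + [1 / 50] * 50    # 150 frames: 1 s at 100 fps, then 1 s at 50 fps

# In-game benchmark: total frames divided by total rendering time.
print(len(frame_times) / sum(frame_times))                    # ~75 fps

# Averaging the instantaneous framerate of each frame over-weights the fast frames.
print(sum(1 / t for t in frame_times) / len(frame_times))     # ~83.3 fps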
 
Chalnoth said:
So the reality is, if you count the average framerate simply by taking the average of the framerates for each frame, you're doing it quite wrongly (for in-game benchmarks).

the 'averages for each frame'? there's no such thing. you sure you did not mean each second? if you meant that - i just used an arbitrary, not too long timespan. i could've picked a minute for that matter. or an hour. but a second was more helpful to my point.

The proper way to find the average framerate for in-game benchmarks would be to take the total number of frames rendered, and divide by the amount of time it took to render them. This will give the proper behavior where one second at 100 fps and one second at 50 fps will average to 75 fps.

sure. i never said otherwise. you just jumped to conclusions.

If you're doing a timedemo, however, it will be recorded at a constant framerate (typically 30Hz).

not necessarily. it depends on many things, the most important of them being whether the engine's physics is variable or fixed step.

So the proper thing to do is to actually take the average framerate for each individual frame (since each frame now represents a specific space in time, as opposed to normal play).

seems to me you're stepping into unfamiliar territory. in normal gameplay each frame represents a timespan too (regardless of whether it was displayed with or w/o motion blur), as it stays for a non-zero timespan on the screen. there's no fundamental difference between timedemos and gameplay in this regard (and gameplay can be at a constant framerate of 30Hz too)
 
Deepak said:
I think this time we may see PC games taking a lot more time to overtake next-gen console visuals, compared to this gen. But as major PC (FPS) titles start appearing on next-gen consoles, the wait may be even longer.

For high-end or low-end cards? Because I see the exact opposite for high-end cards: I think they'll be able to outrun the new consoles in a relatively short period of time.
 
darkblu said:
the 'averages for each frame'? there's no such thing. you sure you did not mean each second? if you meant that - i just used an arbitrary, not too long timespan. i could've picked a minute for that matter. or an hour. but a second was more helpful to my point.
Yes there is. The framerate for one frame is the inverse of the amount of time it takes to render that frame.

not necessarily. it depends on many things, the most important of them being whether the engine's physics is variable or fixed step.
Possibly. But for GPU limitation, it makes more sense to just look at the framerate in the way I described it (averaging over frames instead of times). CPU limitations may be different, of course. But since CPU limitations will merely cause this method of measuring framerate to result in too low an average, this isn't that much of a problem.

seems to me you're stepping into unfamiliar territory. in normal gameplay each frame represents a timespan too (regardless of whether it was displayed with or w/o motion blur), as it stays for a non-zero timespan on the screen. there's no fundamental difference between timedemos and gameplay in this regard (and gameplay can be at a constant framerate of 30Hz too)
Well, of course, but my point was that in a timedemo, each frame represents the same fixed timespan. This is not going to be the case in-game (and even if it happened to be due to very special circumstances, both methods will report the same average, so it's a moot point).
 
A hypothetical dual-core 4 GHz Athlon 64 would peak at 32 GFLOPS.

4 FP ops/cycle (SSE/3DNow!) * 4 GHz * 2 cores = 32 GFLOPS.

If we counted new instructions providing 4-way MADDs (doubling the number of FP ops per cycle) and said there were two SSE units per core, sure, we would get to 128 GFLOPS, beating the Xbox 360 CPU.

Somehow I do not see such an Athlon 64 coming out in a month or two ;).
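
For reference, a small Python sketch of the peak-FLOP arithmetic above (the Athlon figures are the ones from the post; the 115 GFLOPS console number is the one quoted earlier in the thread, with the per-cycle breakdown simply back-derived from it):

ghz = 4.0
cores = 2

# Hypothetical dual-core 4 GHz Athlon 64: 4 FP ops per cycle (SSE/3DNow!) per core.
print(4 * ghz * cores)         # 32.0 GFLOPS

# With 4-wide MADDs (8 ops/cycle) and two SSE units per core (16 ops/cycle).
print(16 * ghz * cores)        # 128.0 GFLOPS

# Xbox 360 CPU figure quoted earlier: 3 cores at 3.2 GHz, which implies
# 12 FP ops per core per cycle to reach the quoted number.
print(12 * 3.2 * 3)            # 115.2 GFLOPS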
 
Peak performance is far away from real-world performance, though. You won't see any games make full use of the X360's CPU for some time to come. The rumblings I've been hearing on these boards indicate that it will actually underperform current P4 and A64 CPUs for launch titles (titles not designed for the CPU from the beginning).
 