xbox360 kiosk + CoD2 - FSAA - Aniso + Trilinear

Chalnoth said:
Yes, there is. The framerate for one frame is the inverse of the amount of time it takes to render that frame.

again, there's no such thing as 'average [framerate] per a single frame', just the same way as there's no average number of Jons per single Jon. yes, it's computable. no, that does not make it any more meaningful - it's a measure that measures nothing. furthermore, how you read that into my original post is beyond my comprehension.

Possibly. But for GPU limitations, it makes more sense to just look at the framerate in the way I described it (averaging over frames instead of over time).

those frames you want to measure over take up a certain amount of time - whether you average framerate over n frames or over the time of those n frames is the same - it's a matter of convention.

CPU limitations may be different, of course. But since CPU limitations will merely cause this method of measuring framerate to result in too low an average, this isn't that much of a problem.

aha. and what was the relevance of you introducing timedemos into this discussion of framerate? you have n frames for t time (usually measured in 'frames per second') - where those come from is totally irrelevant (could just as well be your vcr).

Well, of course, but my point was that in a timedemo, each frame represents the same fixed timespan. This is not going to be the case in-game (and even if it happened to be due to very special circumstances, both methods would report the same average, so it's a moot point).

again, the relevance to my original post is nil. my original post was about 'average framerates vs stable/minimal framerates'. but hey, in case you're looking for an intelligent discussion, open up a new thread on whatever topic amuses you and i promise to participate ; )
 
CMAN said:
For high end or low end cards? Because I see the exact opposite for high end cards, I think they'll be able to outrun the new consoles in a relatively short period of time.

He said visuals. SLI 7800 GTX must have more raw power than RSX or Xenos, but "visually speaking" I don't see many upcoming games reaching the graphical level of future good PS3/X360 games... (and even 2^32x antialiasing won't make a PC game good looking)
 
And except for FPS, the other types of games on PC still look flat... The only fancy thing they have compared to this-gen consoles is higher res and AA (and texture res, which is still crap).
 
darkblu said:
again, there's no such thing as 'average [framerate] per a single frame'.
I didn't say that. You did.

those frames you want to measure over take up a certain amount of time - whether you average framerate over n frames or over the time of those n frames is the same - it's a matter of convention.
No, it's not the same. It leads to different numerical results.

Consider the case I just outlined: one second at 100 fps, the following second at 50 fps.

If you average over time, you'll get a total of 150 frames rendered in 2 seconds, or 75 average fps.

If you average over frames, you'll get 100 frames rendered at 100 fps, and 50 frames rendered at 50 fps, resulting in an average of 83.3 fps.
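
For illustration, here's the same calculation as a short Python sketch (the two-segment sequence is the hypothetical one above):

Code:
# two hypothetical segments: (duration in seconds, framerate during that segment)
segments = [(1.0, 100.0), (1.0, 50.0)]

total_frames = sum(d * fps for d, fps in segments)   # 150 frames
total_time   = sum(d for d, _ in segments)           # 2 seconds

# averaging over time: total frames divided by total time
time_avg = total_frames / total_time                 # 75.0 fps

# averaging over frames: each frame votes with the fps it was rendered at
frame_avg = sum(d * fps * fps for d, fps in segments) / total_frames  # ~83.3 fps

print(time_avg, frame_avg)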

aha. and what was the relevance of you introducing timedemos into this discussion of framerate? you have n frames for t time (usually measured in 'frames per second') - where those come from is totally irrelevant (could just as well be your vcr).
If we assume that what matters in measuring the average framerate is accurately weighting by the percentage of time the user sees each framerate, then there is a distinct difference between timedemos and gameplay benchmarks.

That is to say, with a timedemo, since each frame represents a fixed amount of time no matter what framerate it is actually rendered at, you want to average over frames. This gives the proper interpretation, because when high-framerate areas zip by quickly, they leave much less of an impact on the final average.
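
As a sketch of that distinction, assuming made-up per-frame render times (Python; the 10 ms and 20 ms figures are invented to echo the 100/50 fps example):

Code:
# hypothetical timedemo: each frame stands for the same fixed slice of game time,
# but takes a varying amount of wall time to render
render_times = [0.01] * 100 + [0.02] * 50    # seconds per frame

per_frame_fps = [1.0 / t for t in render_times]

# average over frames: every frame weighted equally
frame_avg = sum(per_frame_fps) / len(per_frame_fps)   # ~83.3 fps

# average over wall time: total frames / total render time
time_avg = len(render_times) / sum(render_times)      # 75.0 fps

print(frame_avg, time_avg)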
 
rosman said:
And except for FPS, the other types of games on PC still look flat... The only fancy thing they have compared to this-gen consoles is higher res and AA (and texture res, which is still crap).
As I've said before, if you personally prefer console games, fine, leave it at that.

But there are many game types available on the PC that are just not available on a console, or have a distinctly different feel on a console (examples: FPS, RPG, simulation, strategy, MMORPGs, modifiable games, to name a few).
 
Chalnoth said:
Peak performance is far away from real-world performance, though. You won't see any games make full use of the X360's CPU for some time to come. The rumblings I've been hearing on these boards indicate that it will actually underperform current P4 and A64 CPUs for launch titles (titles not designed for the CPU from the beginning).
Perhaps. I'm skeptical myself. Even so, those are LAUNCH titles. What about 2 years down the line when the hardware's being better used? I think it's far too early to say an Athlon 64 will trounce XeCPU in actual game performance. And of course there's also the PS3 to consider in the PC vs console debate, which is even more of an unknown.
 
Shifty Geezer said:
Perhaps. I'm skeptical myself. Even so, those are LAUNCH titles. What about 2 years down the line when the hardware's being better used? I think it's far too early to say an Athlon 64 will trounce XeCPU in actual game performance. And of course there's also the PS3 to consider in the PC vs console debate, which is even more of an unknown.
And what about two years down the line when PC hardware will itself be much advanced? Within two years I believe both AMD and Intel will have moved to their next generation architectures.
 
Chalnoth said:
And what about two years down the line when PC hardware will itself be much advanced? Within two years I believe both AMD and Intel will have moved to their next generation architectures.
I don't. I don't think we'll see a desktop PC part with 200+ GFLOPS gaming performance from Intel or AMD for a good few years at best. They're not thinking that way, and so won't have the performance for top-level physics. Which of course GPUs and PPUs will see about fixing, and maybe GPGPU will come to the PC's aid, but it won't be anywhere near as well used. If a GPGPU came out next year that could perform certain functions, how likely is it to be used?
Plus, wasn't the point of this conversation to say a PC is a better buy than an XB360 for those with the money, rather than a better buy in 2-3 years' time when the technology's caught up? (not that I was really paying attention; 'my X is better than your X' arguments are pretty banal IMO)
 
No, what I've been attempting to say is that the PC can hold its own, and isn't going to be dramatically more costly than a console for those who already want an up to date PC. And yes, I do believe you're dramatically overestimating what game developers are going to be capable of doing with the X-Box 360's CPU.
 
Chalnoth said:
No, what I've been attempting to say is that the PC can hold its own, and isn't going to be dramatically more costly than a console for those who already want an up to date PC. And yes, I do believe you're dramatically overestimating what game developers are going to be capable of doing with the X-Box 360's CPU.

You have to remember that PC developers don't develop for the best-of-breed gaming PC; they shoot for somewhere in the middle to bottom. VERY few will write code that uniquely leverages a dual-core CPU to its fullest. The difference with consoles is that whatever is in the box is the highest/middle/lowest, so it gets fully utilized. So IMO, you have to ask: in 2 years, will the lowest common denominator CPU be more powerful as a gaming CPU than the Xenon?

Also, in terms of cost, there's a good chance a 360 will be $199 in two years, so take that into account.
 
Chalnoth said:
No, it's not the same. It leads to different numerical results.

Consider the case I just outlined: one second at 100 fps, the following second at 50 fps.

If you average over time, you'll get a total of 150 frames rendered in 2 seconds, or 75 average fps.

If you average over frames, you'll get 100 frames rendered at 100 fps, and 50 frames rendered at 50 fps, resulting in an average of 83.3 fps.

perfect. now show me exactly where i used this ingenious method of yours.

to reiterate that notorious example of mine:

me earlier said:
..average of the following sequence [ed: of FPSs over 5 spans of 200ms]:

120, 120, 120, 120, 20

or in other words:

Code:
  800 ms @ 120fps = 96 frames
+ 200 ms @ 20fps = 4 frames
=
 1000 ms @ 100fps = 100 frames
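
or, the same arithmetic as a throwaway python snippet (purely illustrative):

Code:
# five hypothetical 200 ms spans and the fps measured in each
spans = [(0.2, 120), (0.2, 120), (0.2, 120), (0.2, 120), (0.2, 20)]

frames  = sum(t * fps for t, fps in spans)   # 96 + 4 = 100 frames
seconds = sum(t for t, _ in spans)           # 1.0 second

print(frames / seconds)                      # 100.0 fps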

so you must disagree with the above (otherwise your arguing with me is pointless) - please explain how/why you disagree. and please, i don't want to hear about 'timedemos vs gameplay'.

when i said:

those frames you want to measure over take up a certain amount of time - whether you average framerate over n frames or over the time of those n frames is the same - it's a matter of convention.

i meant exactly this - whether you measure 100 frames and find out they span a second or you measure frames for a second and you find they're 100 - it's a matter of convention.

k?

ps: did you consider my advice to open a separate topic of your own where you can argue to no end?
 
expletive said:
So IMO, you have to ask: in 2 years, will the lowest common denominator CPU be more powerful as a gaming CPU than the Xenon?
No, but the PC market is different. The GPU is more important in the PC market, and in two years there will be a lot of market penetration for GPUs that can exceed the performance of the Xenos.
 
Darkblu, if you're not going to bother to read and understand my posts, please don't bother to reply either.
 
Chalnoth said:
Darkblu, if you're not going to bother to read and understand my posts, please don't bother to reply either.

first, i don't remember my original post being directed at you. second, if you can't explain exactly what's wrong with my original post, please don't bother commenting on it.
have a good night.
 
darkblu said:
first, i don't remember my original post being directed at you. second, if you can't explain exactly what's wrong with my original post, please don't bother commenting on it.
have a good night.
It doesn't matter whether it was directed at me or not. Your original post was based upon some very wrong assumptions about framerate averaging.

And I already have explained what's wrong:
Averaging the framerate on a per-frame basis is different from averaging per time.
 
Chalnoth said:
And yes, I do believe you're dramatically overestimating what game developers are going to be capable of doing with the X-Box 360's CPU.
Well, I'm not overestimating anything, because I'm sitting on the fence on this one. Not working with either next-gen CPU myself, I can't really say at all what they are capable of. My point was that it's probably too early to say with confidence that PCs will be just as quick, when they work in a fundamentally different way. But from the paper details, if XeCPU can't manage a lot more than PC CPUs for a while, it's a terrible waste of potential and a badly designed processor that leaves its FPU capabilities crippled by other limitations and sitting under-used. I don't think IBM and MS are that inept.
 
Chalnoth said:
Averaging the framerate on a per-frame basis is different from averaging per time.
Why would you want to average on a per-frame basis? It just leads to lots of extra work, since most of the time each frame takes a (sometimes hugely) different time to render, and in the end what you come up with is a frames-per-time number anyway. It's just a much more cumbersome method (and more prone to rounding errors, I'd say).
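
For what it's worth, the conventional in-game counter is exactly that frames-per-time number. A minimal Python sketch, with a sleep standing in for the game's real render call:

Code:
import time

def render_frame():
    time.sleep(0.008)            # stand-in for real rendering work

frames = 0
t0 = time.perf_counter()
while time.perf_counter() - t0 < 1.0:   # count frames for one second
    render_frame()
    frames += 1

elapsed = time.perf_counter() - t0
print(frames / elapsed, "fps")          # frames per time, no per-frame bookkeeping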
 
Just a few things.

If you're playing a game whose framerate sometimes changes drastically within a second, it's going to be an unpleasant experience.

I don't want to be playing a game at 180 fps if within that second it's going from 1 fps to whatever.
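
A quick sketch of how the average alone can hide that (Python, with made-up frame times):

Code:
# two hypothetical seconds with the same average fps but a very different feel
steady = [1 / 120.0] * 120            # 120 frames, ~8.3 ms each
spiky  = [0.5 / 119.0] * 119 + [0.5]  # 119 quick frames plus one 500 ms hitch

for times in (steady, spiky):
    avg_fps = len(times) / sum(times)  # both come out to 120 fps
    min_fps = 1.0 / max(times)         # 120 fps vs 2 fps
    print(avg_fps, min_fps)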


expletive said:
So IMO, you have to ask: in 2 years, will the lowest common denominator CPU be more powerful as a gaming CPU than the Xenon?

Chalnoth said:
No, but the PC market is different. The GPU is more important in the PC market, and in two years there will be a lot of market penetration for GPUs that can exceed the performance of the Xenos.

Not only this, but remember that on the PC you have other hardware assisting the CPU.

Sound? Got it covered with dedicated sound cards.

Physics? Over the next few years we will most likely see physics add-on boards, and from what ATI is saying, physics on our GPUs.

Not to mention that we get new GPUs / refreshes every 6-12 months.

Look at the jump from the 6800 Ultra / X850 to the 7800 GTX and X1800 XT. We will see the refreshes of these products when the PS3 launches in Japan.


Sure, not all PCs have these things. But developers still program for them, and of course as time goes on these things start to move into the low end.
 
No offense intended to anybody, but the last few pages have had nothing to do with the thread topic.
 