Provocative comment by Id member about PS2 (and Gamecube)!

Quincy:

Qroach said:
That would be true only if the capture source was a camera of some sort. Like say you went to E3 and recorded footage of a game with your camcorder directly off a monitor. If the monitor was displaying 60 frames per second, or even 30 for that matter, you'll probably notice motion blur on your video tape.

If a game outputs its own video clip (which is very easy to make on Xbox), you won't get motion blur between the frames at all. It will just drop those extra frames when it outputs each frame to the video clip compressor.

Anyway, no matter what you do, you're still only seeing 30 frames per second on these video clips taken directly from games. Judging the framerate of a game from a video clip is pretty much impossible (unless it dips below 30).

Obviously. The capture source of the Halo 2 presentation was a camera, and the fact that it was 30 fps at the very maximum is strikingly obvious. Not only that, but I am almost 99% certain that the developers had already confirmed the framerate before showing at E3. If you weren't able to spot the difference, good for you - although I find that very unlikely, given the slow-paced action of the Xbox shooter that makes it more than obvious.

Qroach said:
Ok, I think the majority of us can agree that what Paul said about bandwidth being limited on Xbox compared to PS2, and bandwidth being the reason many games run at 30 frames per second, simply isn't true.

Regarding the 30 fps nonsense, you can find games on both platforms that are 30 or 60 fps. Both platforms have their fair share of these games, and it's always a design consideration by the developer on what they want to achieve. It certainly isn't some sort of fault of the hardware.

As Simon and others have noted, there are many advantages and disadvantages to both architectures, and you can't claim one to be a big winner over the other in bandwidth because the platforms are so different. Arguing with numbers doesn't tell you how that bandwidth is being used (high bandwidth on PS2 needed to upload textures on the fly, and Xbox having real texture caches and lots of system memory).

Not only that, many PS2 games, as Simon noted, run at lower resolutions than other consoles, so it's not always an apples-to-apples comparison. Something also has to be said for visual quality in the argument. Anyway, I think all that needs to be said has been said on this topic. If Paul can't see how he was wrong in his comments, then there's no point in arguing further...

Nice post, though I don't quite see Paul being the real issue here, despite him arguing that Xbox's limiting factor is bandwidth. I think the point was rather to urge the people constantly bringing up image quality as an annoyance on PS2 to realize that Xbox has an annoyance of equal gravity in regard to framerate. Just as you say, framerate, image quality and anything else in-game related are often the result of clever (or not) design choices - design choices that are made within the hardware's given limits and advantages. To say PS2 can't do good IQ is as ignorant as saying Xbox can't sustain a constant 60 fps framerate.
 
chaphack said:
Same thing with ZOE2. Highly unstable framerates when things get messy. No idea how it compares to AOD though..

I've only played through it once so far (rental), but I really didn't lose frames where I expected to. Throw a few billion missiles and homing laser shots on the screen? No problem! Ten enemies to slash apart? Pshaw! In fact, at this point I don't remember WHERE it slowed down, as I kept cringing coming up to the places I expected to get hit, and sailed smoothly through--must've been too distracted. :) Makes me think they were coding so hard toward and around all the clutch points that they simply didn't have time to polish up some of the others. Hehe...
 
Qroach:

You forgot all those high-end 30fps Xbox games I mentioned.

Many high-end Xbox games run at 30fps. If it's not bandwidth, then why?
 
cthellis42 said:
chaphack said:
Same thing with ZOE2. Highly unstable framerates when things get messy. No idea how it compares to AOD though..

I've only played through it once so far (rental), but I really didn't lose frames where I expected to. Throw a few billion missiles and homing laser shots on the screen? No problem! Ten enemies to slash apart? Pshaw! In fact, at this point I don't remember WHERE it slowed down, as I kept cringing coming up to the places I expected to get hit, and sailed smoothly through--must've been too distracted. :) Makes me think they were coding so hard toward and around all the clutch points that they simply didn't have time to polish up some of the others. Hehe...

Maybe it was another skewed set of samples; I only read the complaints, but the complaints may be down to problems with pirated copies.
 
Simon F said:
As for your bandwidth arguments on non-PS2 HW, I suspect that you are completely forgetting that those architectures are very likely to have real texture caches. These would significantly reduce the need to access the external texture memory.

OTOH, the GS' memory is the texture memory (well, apart from the significant % that has to be used for frame and Z-buffers!). It's not a true cache as commonly understood in computer terms. AFAIU, since that memory is so small, the application has to actively manage it, doing the swapping itself.

So, if I understand this right, according to you, an application-managed "cache" is a good thing if it's on Gamecube, but a bad thing on PS2?
With all due respect, Simon F, I fail to see the logic.
 
Paul said:
Qroach:

You forgot all those high-end 30fps Xbox games I mentioned.

Many high-end Xbox games run at 30fps. If it's not bandwidth, then why?

It's by design.

Xbox is sold as the most powerful console. It needs to have the nicest graphics. You don't see the framerate in screenshots.
So you can sacrifice fps for better graphics.
 
<sigh> Sad but true. Sad because they can still do better graphics, better IQ, AND higher framerates most of the time, except that graphics may not appear "that much better" to the untrained eye, which is everyone's target first and foremost.
 
Somehow I have the idea that for Japanese dev houses framerate is more of a priority, whereas most European/American ones don't prioritize it so much.

It could be interesting to look at the percentage of 60 fps titles developed on each continent,
and the percentage of sub-30 fps titles.

And do the same thing by platform.
 
The incidence of 30 or 60 fps can be a design decision... So, claiming the Xbox has some major bandwidth problem for 60fps would only make sense if it could be shown that its 60fps games generally lose a significant amount of their technical advantage over PS2's 60fps games. However, some of the Xbox's strongest technical performances all around actually are 60fps and are far and away some of the most impressive games on any platform.

Panzer Dragoon Orta boasts heavy particle effects, massive fleets of detailed characters on-screen, stunning character skinning, some of the most liberal use of complex multitexturing I've seen, and dramatic distortion effects for depth-of-field and motion blur. The environments in Jet Set Radio Future are just unreal in their geometric complexity. Toe Jam and Earl 3 amazes in lighting and related special effects as light filters through tree tops, and complex shadows of individual leaves blowing in the wind get cast with great definition onto the characters (which also goes to show how high-poly the characters are). Some of the other most technically impressive games: Ninja Gaiden, The House of the Dead 3, NBA 2K3 in 720p, Dead or Alive 3, SEGA GT 2002, Soul Calibur II in 720p, GUNVALKYRIE, Dead or Alive Xtreme Beach Volleyball, etc. The outstanding graphical achievements of 60fps games like these come matched with a presentation that includes consistent texture integrity through superior filtering, native proscan, 32-bit color, and a healthy usage of pixel-level shading effects.

IQ on the PS2, on the other hand, truly is limited compared to even the older and less expensive Dreamcast. Image quality is the product of precision in z-buffer, color depth, image resolution, update resolution (degree of interlacing), and properties of the like. A robust display of these elements will tax the available memory where cached textures are also competing for space. Even in the best PS2 efforts like the much admired Killzone, these framebuffer attributes get compromised to make enough room for textures, manifesting a comparative limitation that the DC, with twice the display RAM, doesn't suffer from. PS2 games that boost certain areas of IQ, like FSAA via super-sampling with Baldur's Gate: Dark Alliance, then just trade it off in other areas of IQ, like being limited to field rendering and 16-bit color. Super-sampling has also been used in DC games (Space Channel 5 pt 2 looking smooth enough), and the hardware consistently provides the necessary resources for 32-bit z-buffering, full resolution updates for native progressive scan, the highest-quality image output available through native VGA, and 640x480 image resolutions - all standard throughout a full library of games, no compromises. Another aspect of IQ is the presentation of the image and the textures, and more aliasing and artifacts are present in PS2 games because of the lower usage of mip-mapping and filtering.
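To put rough numbers on that trade-off, here is a back-of-the-envelope sketch only (the figures aren't taken from any shipped game, and the GS's page-granular memory allocation is ignored) of how much of the GS's 4MB the display buffers alone can consume at a 640x448 target, and how much the usual compromises free up for textures:

```c
#include <stdio.h>

/* Hypothetical budget for the PS2 GS's 4MB of embedded video memory,
 * using the 640x448 target and buffer formats discussed above.
 * Illustrative only; real titles juggle these numbers differently. */
int main(void)
{
    const double MB = 1024.0 * 1024.0;
    const double gs_edram = 4.0 * MB;      /* PS2 GS embedded DRAM   */
    const double dc_vram  = 8.0 * MB;      /* Dreamcast display RAM  */
    const int width = 640, height = 448;

    /* Full-quality setup: double-buffered 32-bit color plus 32-bit Z. */
    double color32 = width * height * 4.0;
    double z32     = width * height * 4.0;
    double full    = 2.0 * color32 + z32;

    /* Common compromise: field rendering (half-height buffers) with
     * 16-bit color, keeping a 32-bit Z for the half-height target. */
    double color16 = width * (height / 2) * 2.0;
    double z16     = width * (height / 2) * 4.0;
    double reduced = 2.0 * color16 + z16;

    printf("Full buffers:    %.2f MB used, %.2f MB left for textures\n",
           full / MB, (gs_edram - full) / MB);
    printf("Reduced buffers: %.2f MB used, %.2f MB left for textures\n",
           reduced / MB, (gs_edram - reduced) / MB);
    printf("Dreamcast display RAM for comparison: %.0f MB\n", dc_vram / MB);
    return 0;
}
```

At full quality the buffers eat roughly 3.3MB of the 4MB, leaving well under 1MB for textures, which is exactly the pressure that pushes games toward 16-bit color or field rendering.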

Many PS2 games field render and so only provide half the detail with each update, and many of the others with full buffers won't display correctly or will crash when the VGA hack tries to make them output progressively. Also, the visible detail advantage of 640x480 over 640x448 depends on how much viewable area a screen has and thus will vary, but progression in screen technology, especially in flatness, has continually decreased distortion and expanded the viewable area. A TV like my Mitsubishi shows the black borders at the bottom and top of the screen in the low-res PS2 games (like MGS2 and GT3), so Dreamcast games end up appearing more visibly detailed.

randycat99:
If it fluctuates between 30 and 60, then that means at its worst it will look as "jittery" as something that is running 30 all the time. So nothing was really lost. If you are running at "jittery 30" all of the time, that would seem to be the worst case situation (the Xbox situation, if you will).
No. We can interpret a fast enough succession of still frames as motion because of their consistency... consistency in the rate and allowable degree of change between them. Fluctuations noticeably harm this illusion of motion, lowering believability beyond that of just a minimized framerate.

The extra damage to believability caused by an unstable framerate is one of the reasons why designers wouldn't try to produce a slow-motion special effect by just making a scene suddenly update at a lower frequency. Instead, they keep the update constant to avoid that nasty choppiness and simply lessen the degree of change between the animation/motion of objects.
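To make that concrete, here is a toy sketch (the numbers are invented, not measurements of any game): a stream flipping between 60 and 30 delivers frames faster on average than a locked 30, but its frame-to-frame timing varies, and that variation is exactly the inconsistency being described.

```c
#include <stdio.h>
#include <math.h>

/* Compares frame delivery times of a locked 30fps stream against one
 * that alternates between 60 and 30.  The fluctuating stream has a
 * higher average rate but non-zero frame-to-frame timing jitter. */
static void report(const char *name, const double *ms, int n)
{
    double mean = 0.0, jitter = 0.0;
    for (int i = 0; i < n; i++)
        mean += ms[i];
    mean /= n;
    for (int i = 1; i < n; i++)
        jitter += fabs(ms[i] - ms[i - 1]);   /* change between frames */
    jitter /= (n - 1);
    printf("%-12s avg frame time %.1f ms, avg frame-to-frame change %.1f ms\n",
           name, mean, jitter);
}

int main(void)
{
    double locked30[8], fluctuating[8];
    for (int i = 0; i < 8; i++) {
        locked30[i]    = 1000.0 / 30.0;               /* 33.3 ms every frame */
        fluctuating[i] = (i % 2) ? 1000.0 / 30.0
                                 : 1000.0 / 60.0;     /* 16.7 / 33.3 ms alternating */
    }
    report("locked 30:", locked30, 8);
    report("60<->30:", fluctuating, 8);
    return 0;
}
```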
 
Squeak said:
Simon F said:
As for your bandwidth arguments on non-PS2 HW, I suspect that you are completely forgetting that those architectures are very likely to have real texture caches. These would significantly reduce the need to access the external texture memory.

OTOH, the GS' memory is the texture memory (well, apart from the significant % that has to be used for frame and Z-buffers!). It's not a true cache as commonly understood in computer terms. AFAIU, since that memory is so small, the application has to actively manage it, doing the swapping itself.

So, if I understand this right, according to you, an application-managed "cache" is a good thing if it's on Gamecube, but a bad thing on PS2?
With all due respect, Simon F, I fail to see the logic.

Isn't the Gamecube's texture address space the 24MB of off-chip memory? If so (and I believe so), then the 1MB of on-chip texture memory is a cache (hardware-controlled), rather than a scratchpad (software-controlled) like the PS2's eDRAM.
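To illustrate the practical difference being described, here is a purely hypothetical, self-contained sketch (the slot count and the round-robin policy are made up and don't resemble any real engine or SDK) of what "the application doing the swapping itself" amounts to: with a software-controlled scratchpad, the game code tracks residency and performs the uploads that a hardware-controlled cache would otherwise handle in silicon.

```c
#include <stdio.h>
#include <string.h>

#define SLOTS 4                   /* pretend the scratchpad holds 4 textures */
static int resident[SLOTS];       /* which texture id sits in each slot      */
static int next_victim = 0;       /* trivial round-robin eviction policy     */
static int uploads = 0;           /* count of explicit "DMA uploads"         */

static void use_texture(int id)
{
    for (int i = 0; i < SLOTS; i++)
        if (resident[i] == id)
            return;               /* already resident: just draw with it */

    /* Miss: the application itself picks a victim and uploads. */
    resident[next_victim] = id;
    next_victim = (next_victim + 1) % SLOTS;
    uploads++;
}

int main(void)
{
    memset(resident, -1, sizeof resident);          /* nothing resident yet */
    int frame[] = { 1, 2, 3, 4, 5, 1, 2, 6 };       /* textures used per draw */
    for (size_t i = 0; i < sizeof frame / sizeof frame[0]; i++)
        use_texture(frame[i]);
    printf("Explicit uploads this frame: %d\n", uploads);
    return 0;
}
```

With a hardware cache the same draw sequence would need none of this bookkeeping in the game code; the fetch-on-miss behaviour is built into the chip.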
 
Lazy8s said:
randycat99:
If it fluctuates between 30 and 60, then that means at its worst it will look as "jittery" as something that is running 30 all the time. So nothing was really lost. If you are running at "jittery 30" all of the time, that would seem to be the worst case situation (the Xbox situation, if you will).

No. We can interpret a fast enough succession of still frames as motion because of their consistency... consistency in the rate and allowable degree of change between them. Fluctuations noticeably harm this illusion of motion, lowering believability beyond that of just a minimized framerate.

I don't know where this is written in stone. If you have a 60 to 30 fluctuation, then the persistence of motion is no worse than having 30 all of the time. At those rates, it's too fast not to see fluid motion (unless you want to argue that 30 isn't appropriately fluid enough, but then that puts a lot of Xbox titles in a perilous state). If you are getting a 60 to 20 or 30 to 20, that would make "consistency" a more applicable point. You'll definitely see a discontinuity of motion then.

I would imagine you will endeavor to push the "consistency" bandwagon for obvious reasons. However, I would personally file this along with your other "strange" theory about half-resolution screens.
 
randycat99:
I don't know where this is written in stone. If you have a 60 to 30 fluctuation, then the persistence of motion is no worse than having 30 all of the time.
That's true, but the persistence of motion isn't the only quality we perceive.

We can also sense incongruity in the update rates, and such fluctuations are not interpreted as natural. This is because our eyes don't sample the real world at perceptibly varying rates, so to see such an effect makes its artificial nature obvious.
I would imagine you will endeavor to push the "consistency" bandwagon for obvious reasons.
The obvious reason being that it's true, of course.
However, I would personally file this along with your other "strange" theory about half-resolution screens.
There's nothing strange about admitting full resolution is superior to half resolution and that there is a visible difference between proscan and interlaced output. The only thing strange is your willingness to compromise your reasonability by pretending it isn't so.
 
Lazy8s said:
randycat99:
I don't know where this is written in stone. If you have a 60 to 30 fluctuation, then the persistence of motion is no worse than having 30 all of the time.
That's true, but the persistence of motion isn't the only quality we perceive.

:rolleyes: Yes, erm, I would also like to submit that the flavor of the silicon used in a particular console can also have a "perceptible" quality. Personally, I prefer the vanilla flavored wafer. Banana has its perks, too, but ultimately it is less "accurate" and "natural", contrary to what uninformed videogame players would tell you. Now chocolate- that's a flavor only the most demanding "avid frame watcher" could appreciate!


We can also sense incongruity in the update rates, and such fluctuations are not interpreted as natural. This is because our eyes don't sample the real world at perceptibly varying rates, so to see such an effect makes its artificial nature obvious.

Yeah, as natural as staring at a glass screen with little dots of color that make a picture. It's a miracle that we can see anything remotely discernible or "natural" at all, in such a situation, right?

I would imagine you will endeavor to push the "consistency" bandwagon for obvious reasons.
The obvious reason being that it's true, of course.

Psychobabble aside, the more obvious reason would be one of console preference (but we'll never get an admission of that).

However, I would personally file this along with your other "strange" theory about half-resolution screens.
There's nothing strange about admitting full resolution is superior to half resolution and that there is a visible difference between proscan and interlaced output. The only thing strange is your willingness to compromise your reasonability by pretending it isn't so.

Sure there is a difference. Whether it amounts to the all encompassing difference in use that you make it out to be, that's another question altogether. Do you think a 1000 Hz interlaced display would still suffer from the dreaded "half-resolution" phenomenon? What about a 1 Hz interlaced display? Jumping from the "theoretical" to the "real" should force you to realize that said effect could be anywhere from perceptible, to barely perceptible, to imperceptible depending on the refresh rate and the video subject material.

I'm afraid you'll just have to agree to disagree on this one, lest you risk falling into the same "non-interactive" category as DMGA (for instance).
 
Perhaps on cheaper setups.

Have you ever seen what a Faroudja can do for a big screen with an interlaced input? That will break your comment altogether. Go check one out.
 
Minimum frame rate is the most important number. You can have a game that averages 60fps, which is what you're talking about, but can dip as low as 1fps. Or you can have a game that runs at 30fps and never drops from it. Of course, the one that never drops has the better framerate. That's why in PC games most people want much higher framerates: so the dips disappear.
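As a made-up illustration of why the minimum tells you more than the average (the numbers below are invented purely for the example):

```c
#include <stdio.h>

/* Toy numbers showing how an average hides dips: most frames arrive at
 * 60fps, one stalls badly, and the average still looks respectable
 * while the minimum tells the real story. */
int main(void)
{
    double fps[] = { 60, 60, 60, 60, 60, 60, 60, 60, 60, 8 };  /* one bad stall */
    int n = sizeof fps / sizeof fps[0];
    double sum = 0.0, min = fps[0];
    for (int i = 0; i < n; i++) {
        sum += fps[i];
        if (fps[i] < min)
            min = fps[i];
    }
    printf("average %.1f fps, minimum %.1f fps\n", sum / n, min);  /* 54.8 vs 8.0 */
    return 0;
}
```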
 
randycat99 said:
Perhaps on cheaper setups.

Have you ever seen what a Faroudja can do for a big screen with an interlaced input? That will break your comment altogether. Go check one out.

A Faroudja deinterlacer will not make an interlaced source look as good as a native progressive scan source.
 
Dural said:
randycat99 said:
Perhaps on cheaper setups.

Have you ever seen what a Faroudja can do for a big screen with an interlaced input? That will break your comment altogether. Go check one out.

A Faroudja deinterlacer will not make an interlaced source look as good as a native progressive scan source.

Certainly, they are not equivalent, but it can be very close in motion, especially with the higher-end models.
 