Provocative comment by Id member about PS2 (and Gamecube)!

maskrider said:
Dural said:
randycat99 said:
Perhaps on cheaper setups.

Have you ever seen what a Faroudja can do for a big screen with an interlaced input? That would undercut your comment altogether. Go check one out.

A Faroudja deinterlacer will not make an interlaced source look as good as a native progressive-scan source.

Certainly, they are not equivalent, but it can be very close in motion, especially with the higher-end models.


And those higher-end models cost more than most high-end HDTVs; hell, the low-end models cost more than many high-end HDTVs.
 
Dural said:
maskrider said:
Dural said:
randycat99 said:
Perhaps on cheaper setups.

Have you ever seen what a Faroudja can do for a big screen with an interlaced input? That would undercut your comment altogether. Go check one out.

A Faroudja deinterlacer will not make an interlaced source look as good as a native progressive-scan source.

Certainly, they are not equivalent, but it can be very close in motion, especially with the higher-end models.


And those higher-end models cost more than most high-end HDTVs; hell, the low-end models cost more than many high-end HDTVs.

Yep, and I'm really confused as to why somebody would bring this expensive deinterlacer up as an argument in the first place. :rolleyes: :LOL:

Seems like the PS2's limited selection of progressive-scan games is forcing some people to go to extremes to get progressive video, AKA a very expensive band-aid. ;)
 
jvd said:
Minimum frame rate is the most important number. You can have a game that averages 60 fps, which is what you're talking about, but dips as low as 1 fps. Or you can have a game that runs at 30 fps and never drops from it. Of course, the one that never drops has the better framerate. That's why in PC games most people want much higher framerates: so that the dips disappear.

Well, technically I would rather have an FPS running at 60... uh... fps that dips to 1 fps ONCE than one running at 30 fps solid. ^_^

An extreme example, of course, but it's similar to an attitude many people have: spot ONE flaw and savage a title/platform as much as possible, usually blowing things well out of proportion. (And it's usually only done by those with particular platform distastes.) Many people list "irritations" from a game--be it slowdown, an occasionally noticed IQ hit, or whatever else--as just that, but SO often I see it blown out of proportion in many painful, painful ways. -_-

Noticing Aragorn's scabbard on the wrong side in a scene didn't make The Two Towers a bad movie for me, after all. (It took MANY more irritations to do that! :p )
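
To put numbers on the min-vs-average point from jvd's quote above, here is a minimal Python sketch. The frame times are invented for illustration, not taken from any real game:

```python
# Invented frame times: roughly a minute at 60 fps with one 1-second hitch.
frame_times_ms = [16.7] * 3598 + [1000.0, 16.7]

average_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
minimum_fps = min(1000.0 / t for t in frame_times_ms)

print(f"average: {average_fps:.1f} fps")  # ~58.9 -- still looks like "a 60 fps game"
print(f"minimum: {minimum_fps:.1f} fps")  # 1.0 -- the dip the average hides
```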
 
cthellis42 said:
jvd said:
Minimum frame rate is the most important number. You can have a game that averages 60 fps, which is what you're talking about, but dips as low as 1 fps. Or you can have a game that runs at 30 fps and never drops from it. Of course, the one that never drops has the better framerate. That's why in PC games most people want much higher framerates: so that the dips disappear.

Well, technically I would rather have an FPS running at 60... uh... fps that dips to 1 fps ONCE than one running at 30 fps solid. ^_^

An extreme example, of course, but it's similar to an attitude many people have: spot ONE flaw and savage a title/platform as much as possible, usually blowing things well out of proportion. (And it's usually only done by those with particular platform distastes.) Many people list "irritations" from a game--be it slowdown, an occasionally noticed IQ hit, or whatever else--as just that, but SO often I see it blown out of proportion in many painful, painful ways. -_-

Noticing Aragorn's scabbard on the wrong side in a scene didn't make The Two Towers a bad movie for me, after all. (It took MANY more irritations to do that! :p )
I'm sorry, so you're saying you would rather have a moment in the game where it becomes unplayable and you might get killed? I can tell you I never want that, which is why I always upgrade my video card.
 
When the framerate dips, it's usually when the action is most intense, i.e., exactly when you most need a good framerate. :(
 
chaphack said:
So the FaraWHAT(!?) is just some fancy deinterlacer? pfft. :LOL: o_O

The company in question is the technology leader in the consumer video-processing market.

You wouldn't even want to buy the lowest-end product, which is still around US$3,000, unless your pockets are very deep.

No typical gamer (not even most hardcore gamers) will want to buy such a device.
 
jvd said:
I'm sorry, so you're saying you would rather have a moment in the game where it becomes unplayable and you might get killed? I can tell you I never want that, which is why I always upgrade my video card.

Depending on where it happens, it could of course be bad, but in most places where I've noticed slowdown, it just breaks one's stride momentarily rather than doing anything harmful. Even in games with BAD slowdown (and I get this a lot on my PC setup), it's workable unless it lasts way too long.

It's not "good," but I'm not sure why people place so much evil on one or a few instances of "bad" that rarely amount to anything more than the equivalent of a momentary stumble over a lump in the carpet, while the rest of the game can be entirely pleasurable.

I guess it depends on how one started with gaming, but the last time I WASN'T making gaming concessions was probably in 1997, when the first PC I bought was "80% of max" or so and the Voodoo2 was still king of the hill. (I'd just gotten the 8 MB model instead of the 12 MB one.)

I would RATHER it not happen at all, but I don't usually have the money to upgrade that often, so hey. Even so, I'd rather drive to work at 60 through a few lights that MIGHT be red than take the highway capped at 30 mph. ;)
 
cthellis42 said:
jvd said:
I'm sorry so your saying you wuold rather have a moment in the game where it becomes unplayable and you might get killed ? I can tell you i never want that . which is why i allways upgrade my video card.

Depending on where it happens, it could of course be bad, but in most places where I've noticed slowdown, it just breaks one's stride momentarily rather than doing anything harmful. Even in games with BAD slowdown (and I get this a lot on my PC setup), it's workable unless it lasts way too long.

It's not "good," but I'm not sure why people place so much evil on one or a few instances of "bad" that rarely amount to anything more than the equivalent of a momentary stumble over a lump in the carpet, while the rest of the game can be entirely pleasurable.

I guess it depends on how one started with gaming, but the last time I WASN'T making gaming concessions was probably in 1997, when the first PC I bought was "80% of max" or so and the Voodoo2 was still king of the hill. (I'd just gotten the 8 MB model instead of the 12 MB one.)

I would RATHER it not happen at all, but I don't usually have the money to upgrade that often, so hey. Even so, I'd rather drive to work at 60 through a few lights that MIGHT be red than take the highway capped at 30 mph. ;)
I'd rather be capped at 30 than be driving on the highway at 60 mph when the car in front of me suddenly slows to 5 mph... wouldn't you?
 
maskrider said:
chaphack said:
So the FaraWHAT(!?) is just some fancy deinterlacer? pfft. :LOL: o_O

The company in question is the technology leader in the consumer video-processing market.

You wouldn't even want to buy the lowest-end product, which is still around US$3,000, unless your pockets are very deep.

No typical gamer (not even most hardcore gamers) will want to buy such a device.
Yeah, the chip is nice, but like Dural said, it doesn't really compare to a high-quality native progressive-scan source. Panasonic uses the Faroudja DCDi chip in a number of their progressive-scan DVD players. The people who want the chip usually want it foremost because it lacks the chroma bug.
 
DeathKnight said:
maskrider said:
chaphack said:
So the FaraWHAT(!?) is just some fancy deinterlacer? pfft. :LOL: o_O

The company in question is the technology leader in the consumer video-processing market.

You wouldn't even want to buy the lowest-end product, which is still around US$3,000, unless your pockets are very deep.

No typical gamer (not even most hardcore gamers) will want to buy such a device.
Yeah, the chip is nice, but like Dural said, it doesn't really compare to a high-quality native progressive-scan source. Panasonic uses the Faroudja DCDi chip in a number of their progressive-scan DVD players. The people who want the chip usually want it foremost because it lacks the chroma bug.

The chip is another product line to make more money out of their already-spent R&D costs. I have real-world working experience with the chip and know the chips you are referring to really well.

Even the lower-end Faroudja products are not just about chips; there is much more to them than that. Not everyone is willing to pay the price, though.
 
jvd said:
I'd rather be capped at 30 than be driving on the highway at 60 mph when the car in front of me suddenly slows to 5 mph... wouldn't you?

Yeah, but you keep trying to suggest that crashes ALWAYS happen, whereas I find that they rarely do. Following this analogy out, it would simply force me to brake hard or swerve to avoid it, and the main effect would just be me breathing faster with my heart racing.

...and if I were a thrill-seeker, that'd just be an added bonus! ;)
 
cthellis42 said:
jvd said:
I'd rather be capped at 30 than be driving on the highway at 60 mph when the car in front of me suddenly slows to 5 mph... wouldn't you?

Yeah, but you keep trying to suggest that crashes ALWAYS happen, whereas I find that they rarely do. Following this analogy out, it would simply force me to brake hard or swerve to avoid it, and the main effect would just be me breathing faster with my heart racing.

...and if I were a thrill-seeker, that'd just be an added bonus! ;)
If you say so, but dips in frame rate are bad. That is the one thing I thought most of us could agree on. I know all PC fans agree on that.
 
jvd said:
PC-Engine said:
Don't framerate fluctuations cause motion sickness?
They can cause other, much more serious things.

Whoa, what kind of more serious things? I could understand a low refresh rate causing problems, but the TV or monitor should stay at the same hertz no matter the framerate, shouldn't it?
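
For what it's worth, here's a rough sketch of the point being made: with vsync, the display keeps refreshing at its fixed rate no matter how long the game takes to render, and a slow frame simply stays on screen for extra refresh cycles. The function name and numbers below are mine, purely for illustration:

```python
import math

REFRESH_HZ = 60                       # the display's fixed refresh rate
REFRESH_MS = 1000.0 / REFRESH_HZ

def display_time_ms(render_time_ms):
    """How long a frame actually occupies the screen: whole refresh cycles."""
    refreshes = max(1, math.ceil(render_time_ms / REFRESH_MS))
    return refreshes * REFRESH_MS

for t in (10.0, 20.0, 40.0, 200.0):   # hypothetical render times in ms
    print(f"rendered in {t:5.1f} ms -> on screen for {display_time_ms(t):6.1f} ms")
```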
 
maskrider said:
chaphack said:
So the FaraWHAT(!?) is just some fancy deinterlacer? pfft. :LOL: o_O

The company in question is the technology leader in the consumer video-processing market.

You wouldn't even want to buy the lowest-end product, which is still around US$3,000, unless your pockets are very deep.

No typical gamer (not even most hardcore gamers) will want to buy such a device.

Lowest end US$3,000!!!! :oops: :oops: :oops: Oh my... why was this product even brought up?

I had thought this faraWHAT was sorta like the Dolby Pro Logic II technology format, you know, an intermediate stopgap for digital TVs, something to keep analog TVs going for a while, something built into upcoming TVs. But it's a breakout box, and one helluva EXPENSIVE one... :LOL:
 
jvd said:
...and if I were a thrill-seeker, that'd just be an added bonus! ;)
If you say so, but dips in frame rate are bad. That is the one thing I thought most of us could agree on. I know all PC fans agree on that.

Certainly, it's easy to say so. But almost ANYTHING has concessions--you take some bad for some good. After that, it's just personal preference as to what counts as "better" or "worse".
 
Someone earlier said that being locked at 30 fps meant there could be no drops in framerate from there. I don't believe that is correct. The rate can certainly dip in direct correspondence to high graphics loads (as it would on any console architecture); all that is certain is that the fps won't exceed 30. Thus 30 Hz locked is just as vulnerable to inconsistency below 30 as 60 Hz unlocked. Basically, all a lock does is eliminate the inconsistency between 30 and 60, whose effect is certainly less objectionable than inconsistency below 30 (which is not addressed at all).
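
A minimal sketch of that paragraph, with invented per-scene rates: the cap removes the variation between 30 and 60, but the sub-30 dip survives it untouched.

```python
# Invented per-scene frame rates for an "uncapped" renderer.
uncapped_fps = [60, 57, 60, 41, 60, 22, 60, 58]

# A 30 fps lock only clamps from above; it cannot lift the slow scenes.
capped_fps = [min(rate, 30) for rate in uncapped_fps]

print(capped_fps)  # [30, 30, 30, 30, 30, 22, 30, 30] -- the 22 fps dip remains
```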

FWIW, I'd like to address why I mentioned Faroudja (spelling I'm still unsure of) in the first place. I brought it up in response to someone's assertion that progressive vs. interlaced becomes more of an issue at bigger screen sizes. The counter-argument is that either choice becomes a secondary problem by virtue of the scanlines themselves. Basically, it all comes down to the visibility of scanlines as screen size increases. Considering the number of scanlines available in NTSC, it was surely never intended for the typical screen sizes of large modern TVs. Some would say 24" was the official limit before quality suffers. Pushed to the limit, I would say 35" is still acceptable, and 40" is really skirting the limit. Going to much larger screen sizes will require some processing, or it will look really bad. Scanlines at such screen sizes, whether the signal is progressive or interlaced, are just too discernible. At that point, it isn't even the interlacing that is so offensive so much as the low number and large size of the scanlines themselves.

That is where Faroudja comes into play. It deinterlaces, converts to progressive, and does further adaptive blending and/or scaling to allow impeccable image replay on 40"+ devices, whether the source is interlaced or progressive. Can you get by without one? Sure... but maybe not as completely as you think. Many high-quality large-screen TVs employ their own versions of this functionality, with varying levels of success. Some look decent and some look really quite bad. This is one of those things where the results rely on the quality and sophistication of the processing, not just a marketing bullet point that says "onboard line doubler". Large-screen TVs that show the direct signal without any sort of processing look just plain bad even with a progressive signal source, as there are just too few lines to fill the large screen. That's why I hinted at "better-quality" large-screen TVs earlier.
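
As a toy illustration of what even the crudest line-doubling step looks like, here is a basic "bob" interpolation in Python/NumPy. This is my sketch, and it is far simpler than the adaptive, motion-aware processing Faroudja actually sells; it just shows why naive doubling exists at all:

```python
import numpy as np

def bob_deinterlace(field: np.ndarray) -> np.ndarray:
    """Turn one field of scanlines (h, w) into a (2h, w) progressive frame."""
    h, w = field.shape
    frame = np.empty((2 * h, w), dtype=field.dtype)
    frame[0::2] = field                             # keep the real scanlines
    frame[1:-1:2] = (field[:-1] + field[1:]) / 2.0  # interpolate the gaps
    frame[-1] = field[-1]                           # no line below the last one
    return frame

# A tiny 3-line "field"; the output has 6 lines with the gaps filled in.
field = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 0.0]])
print(bob_deinterlace(field))
```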

...but all this is about getting the utmost quality from the experience, right? Well, Faroudja has a product that is a ticket to that destination. After demoing their latest projector product at a local store, I have to say that an interlaced source can indeed reach the heights of a progressive source, if not to the point where the difference is imperceptible. It's really rather amazing what they have made possible. Even plain progressive scan isn't enough to complement the sharpness and size of modern large-screen TVs.

Now, if you get off on just playing games on your little 28" HDTV in progressive or hovering over a 20" computer monitor, that's fine and dandy, but you're still missing out on half the experience, IMO. Size counts for something here as well. Going larger, you need "the processing", and that processing happens to benefit interlaced sources as well (if not bring them to par, as unbelievable as that seems). The price of that "processing" may lie not so much in the technical hardware involved as in the exclusive/proprietary implementation of the Faroudja product itself (which is likely the running lead of the industry).
 
PC-Engine said:
Yep, and I'm really confused as to why somebody would bring this expensive deinterlacer up as an argument in the first place. :rolleyes: :LOL:

Seems like the PS2's limited selection of progressive-scan games is forcing some people to go to extremes to get progressive video, AKA a very expensive band-aid. ;)
Bingo!!!

The only reason it was brought up in the first place was as an excuse for interlaced games. The best way to fix the setback is at the source rather than through outside intervention. Take DVDs, for example... the format is interlaced to allow movies to fit on today's DVD sizes. To get movies back as close as possible to their original source, they must be converted to progressive scan (line doubling, 3:2 pulldown, 4:4:4 processing, etc.). However, this setback is being fixed at the source with newer DVD technology, i.e., high-definition DVD.
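
For reference, the 3:2 pulldown mentioned above maps 24 film frames per second onto 60 interlaced fields per second by alternately holding each frame for three fields, then two. A minimal sketch (frame labels invented):

```python
def three_two_pulldown(frames):
    """Spread film frames across fields in the repeating 3-2 pattern."""
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

print(three_two_pulldown(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
# 4 frames -> 10 fields, so 24 frames/s -> 60 fields/s. A progressive-scan
# player runs this in reverse (inverse telecine) to recover the 24p frames.
```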
 