Provocative comment by Id member about PS2 (and Gamecube)!

Can't argue with that being the general thought on the subject. It seems a bit pointless, IMO, to even bother getting bigger and bigger screens just so you can sit proportionally farther away and the screen stays perceptibly the same size (the one exception being if you move into a bigger place). When I shop for a bigger screen, I am more than likely planning to view it at nearly the same distance as I do currently, so the screen is that much more immersive. That's why I'm paying for it: to see a bigger image.

The rule of thumb for viewing distance, IIRC, is 3x the height of the screen or something like that. Most people who go to the movie theater will get the best picture from the middle to the back seating sections, not the front, as your eyes will get a real workout in the horizontal and, to a lesser extent, the vertical direction if you sat in the front. Same with a big TV in the living room. Also, saying you'll be sitting at nearly the same distance when moving to a bigger TV means very little when you don't state the initial and final screen sizes. If you're moving from a 32" to a 36", then obviously you won't be sitting that much farther back, if at all.
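To put rough numbers on that 3x rule, here's a quick sketch (assuming 4:3 sets and taking the multiplier at face value):

```python
import math

def recommended_distance(diagonal_in, aspect=(4, 3), multiplier=3.0):
    """Rule-of-thumb viewing distance: roughly 3x the screen height.

    diagonal_in -- screen diagonal in inches
    aspect      -- width:height ratio of the screen
    multiplier  -- the rule-of-thumb factor (3x, per the rule above)
    """
    w, h = aspect
    # Height from the diagonal: height = diagonal * h / sqrt(w^2 + h^2)
    height = diagonal_in * h / math.hypot(w, h)
    return multiplier * height

for size in (32, 36):
    d = recommended_distance(size)
    print(f'{size}" set -> about {d:.0f} in ({d / 12:.1f} ft)')
```

That works out to about 58" for a 32" set versus 65" for a 36" set, only around seven inches of difference, which bears out the point that a 32"-to-36" upgrade hardly moves your seat.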
 
DeathKnight said:
For example, I'd rather buy a progressive scan DVD player
No doubt, especially considering that Panasonic progressive scan DVD players with DCDi de-interlacing by Faroudja can be had for less than $200.

The availability of expensive, high-quality de-interlacers from Faroudja, and/or higher-end televisions with Faroudja de-interlacing built in, is not a plausible excuse for the abundance of interlaced rendering and lower quality output on the PS2.

If it is your claim that a $200 player with Faroudja-based hardware will give equivalent performance to a $3000 Faroudja video processor, then you should have no problem imagining that a sub-$200 processor can exist to help your PS2 quandary.
 
PC-Engine said:
Most people who go to the movie theater will get the best picture from the middle to the back seating sections, not the front, as your eyes will get a real workout in the horizontal and, to a lesser extent, the vertical direction if you sat in the front. Same with a big TV in the living room. Also, saying you'll be sitting at nearly the same distance when moving to a bigger TV means very little when you don't state the initial and final screen sizes. If you're moving from a 32" to a 36", then obviously you won't be sitting that much farther back, if at all.

No one said you had to sit at the very front of the theater. Similarly, there is no requirement that you sit a mere foot from your TV, either.

When I mention distance in my TV watching scenario, it is simply the distance that I have found comfortable, practical, and reasonably large (no, I don't follow the 3x rule as if it were religion). If I go to a considerably larger screen (not just 32" to 36"), I'd expect not to change my distance much (despite recommendations), because I wanted a bigger screen in the first place.
 
randycat99 said:
PC-Engine said:
Most people who go to the movie theater will get the best picture from the middle to the back seating sections, not the front, as your eyes will get a real workout in the horizontal and, to a lesser extent, the vertical direction if you sat in the front. Same with a big TV in the living room. Also, saying you'll be sitting at nearly the same distance when moving to a bigger TV means very little when you don't state the initial and final screen sizes. If you're moving from a 32" to a 36", then obviously you won't be sitting that much farther back, if at all.

No one said you had to sit at the very front of the theater. Similarly, there is no requirement that you sit a mere foot from your TV, either.

When I mention distance in my TV watching scenario, it is simply the distance that I have found comfortable, practical, and reasonably large (no, I don't follow the 3x rule as if it were religion). If I go to a considerably larger screen (not just 32" to 36"), I'd expect not to change my distance much (despite recommendations), because I wanted a bigger screen in the first place.

Still doesn't change the fact that the effective pixel size does not change in relation to the recommended viewing distance. If your viewing distance requires you to sit closer than the recommended one, then it's your choice to see pixels :LOL:
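To put that in concrete terms (a back-of-the-envelope derivation, assuming a set with $N$ visible scanlines and the 3x-height guideline from earlier): the scanline pitch is $H/N$ for a screen of height $H$, so at the recommended distance $d = 3H$ each line subtends an angle of roughly

$$\theta \approx \frac{H/N}{3H} = \frac{1}{3N},$$

which is independent of $H$. Scale the screen up and sit proportionally farther back, and every pixel subtends exactly the same angle as before.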
 
randycat99 said:
If it is your claim that a $200 player with Faroudja-based hardware will give equivalent performance to a $3000 Faroudja video processor, then you should have no problem imagining that a sub-$200 processor can exist to help your PS2 quandary.
You're all over the place aren't you?

I'm not the person who brought up Faroudja de-interlacing in this thread in the first place. That was you. You used it as an excuse for the field-rendering and lower output quality on the PS2. If you're the type of person who wants to spend ungodly amounts of money on a de-interlacer to bring interlaced output on the PS2 to progressive quality, then more power to you. The problem here is that you took it a step further and used it as an absolutely terrible excuse for such a thing. We wouldn't be in this mess if you hadn't said what you did in the first place and then tried to defend it until the cows come home.
 
PC-Engine said:
I assume you're talking about TV broadcasts? Well that's moving towards HDTV too ;)

Among other things, but certainly that's the biggest one right now.

And all things considered, weren't we supposed to get HDTV, like, last decade or something? ;) I'm not holding my breath...
 
PC-Engine said:
Still doesn't change the fact that the effective pixel size does not change in relation to the recommended viewing distance. If your viewing distance requires you to sit closer than the recommended one, then it's your choice to see pixels :LOL:

Yes, of course...and moving to larger screens is then mostly useless. This then makes Chap's comment (which precipitated this whole discussion) about progressive vs. interlaced wrt larger screens that much less relevant.

It will also be your choice to see pixels or not, not only wrt distance, but also wrt the quality of equipment you buy. The larger screen with higher quality processing, and thus higher performance, will then give the "better" presentation in this case.
 
DeathKnight said:
I'm not the person who brought up Faroudja de-interlacing in this thread in the first place.

Yes, but you sure are dutiful in following up on why it isn't worthwhile. It's only natural that I be allowed to respond.

You used it as an excuse for the field-rendering and lower output quality on the PS2.

You saw it as an excuse, but my intent was nothing of the sort. My suggestion was wrt interlaced formats at large. Naturally, this includes the PS2 as a potential recipient of the cited improvements. That does not constitute an "excuse". It just means it isn't the dire dead end you make it out to be.

If you're the type of person who wants to spend ungodly amounts of money on a de-interlacer to bring interlaced output on the PS2 to progressive quality, then more power to you.

You just said a $200 piece of hardware could do the equivalent thing. How is that ungodly? ...but the more relevant point here is that it isn't simple deinterlacing that makes the magic happen on large screens. So a $200 box won't do, and neither will plain progressive output from an Xbox, a DVD player, what have you... The "processing" is the key for large screen sizes, given a fixed source format. What you expect in video quality and what you are willing to pay are entirely up to you.
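For anyone curious why "simple" deinterlacing isn't enough, here's a minimal sketch of the basic techniques (hypothetical illustration code, not how any Faroudja chip actually works; real processors such as DCDi add edge-directed interpolation and much more on top of this):

```python
import numpy as np

def weave(top_field, bottom_field):
    """Weave: interleave the two fields into one frame. Perfect for
    static scenes, but moving objects 'comb' because the two fields
    were sampled 1/60 s apart."""
    h, w = top_field.shape
    frame = np.empty((2 * h, w), dtype=top_field.dtype)
    frame[0::2] = top_field     # even lines from the top field
    frame[1::2] = bottom_field  # odd lines from the bottom field
    return frame

def bob(field):
    """Bob: line-double a single field. No combing, but half the
    vertical resolution is thrown away and fine detail flickers."""
    return np.repeat(field, 2, axis=0)

def motion_adaptive(top_field, bottom_field, threshold=16):
    """Crude motion-adaptive blend: weave where the fields agree
    (static areas), bob where they differ (moving areas)."""
    woven = weave(top_field, bottom_field)
    bobbed = bob(top_field)
    # Per-pixel motion estimate: a big field-to-field difference
    # suggests motion (crude -- it also trips on fine vertical detail).
    moving = np.abs(top_field.astype(int) - bottom_field.astype(int)) > threshold
    mask = np.repeat(moving, 2, axis=0)
    return np.where(mask, bobbed, woven)
```

Weave is perfect on static material but combs on motion; bob never combs but gives up half the resolution. Everything interesting happens in how well that per-pixel motion decision is made, and that is exactly where the quality (and the price) of the processing varies.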

The problem here is that you took it a step further and used it as an absolutely terrible excuse for such a thing. We wouldn't be in this mess if you hadn't said what you did in the first place and then tried to defend it until the cows come home.

Now you are just spinning things. Get a grip and realize it was simply a reply to a comment Chap had made (which turned out to be a situation not as simple as he had thought). The "mess" you attribute seems to come from those who would argue at all costs that video processing couldn't possibly be worthwhile.
 
I think the bottom line here is that PS2, with its interlaced output, will always look worse than Xbox with its progressive output, regardless of whether there's video processing or not, since both benefit from video processing where/when available.


cthellis42 said:
PC-Engine said:
I assume you're talking about TV broadcasts? Well that's moving towards HDTV too ;)

Among other things, but certainly that's the biggest one right now.

And all things considered, weren't we supposed to get HDTV, like, last decade or something? ;) I'm not holding my breath...

The FCC says 2007 for a near-complete phase-out, IIRC.
 
PC-Engine said:
I think the bottom line here is that PS2, with its interlaced output, will always look worse than Xbox with its progressive output, regardless of whether there's video processing or not, since both benefit from video processing where/when available.

That is completely contradictory to what has been fleshed out here so far. The appropriate video processing could yield essentially identical screens, especially where large screen presentations are involved. The bottom line is that worrying about interlaced vs. progressive is dubious, given what is possible with the right technologies (which exist today and are accessible to the public).
 
randycat99 said:
PC-Engine said:
I think the bottom line here is that PS2, with its interlaced output, will always look worse than Xbox with its progressive output, regardless of whether there's video processing or not, since both benefit from video processing where/when available.

That is completely contradictory to what has been fleshed out here so far. The appropriate video processing could yield essentially identical screens, especially where large screen presentations are involved. The bottom line is that worrying about interlaced vs. progressive is dubious.

PS2's image quality problems are not restricted to interlacing. It's just one factor ;)

Also, Chap and jvd said they can see a quality difference between PS2's output and Xbox's on an HDTV. Then you brought up the expensive deinterlacer as a remedy/option, which still doesn't compare to pure progressive output on Xbox. This option is just an expensive band-aid required for the PS2. That still hasn't changed. Nobody is arguing that more money buys better equipment, making the difference between PS2 and Xbox output quality closer, but even with all the money spent on additional video processing it's still worse. Bottom line is, if PS2 had progressive output for 99% of the games available, we wouldn't be arguing about interlacing in the first place. Instead we'd be arguing about washed out colors, no mipmapping, no trilinear filtering, no bumpmapping, etc. ;)
 
Wow... 15 pages and a complete subject change to respond to something that was addressed in the first page! :?
 
I think arguing about XBox vs. PS2 at this point is kind of silly, as we're already seeing a trickle of news about XBox2 and PS3. You might as well argue about NES vs. Master System.
 
PC-Engine said:
PS2's image quality problems are not restricted to interlacing. It's just one factor ;)

...and here we have the swing to side issues, since the central one can no longer be argued to satisfaction.

Also, Chap and jvd said they can see a quality difference between PS2's output and Xbox's on an HDTV.

The performance and capabilities of the specific HDTV would make a great deal of difference. They may all be HDTVs, but that doesn't mean they handle various inputs with the same finesse. Suffice it to say, most instances of built-in video processing are decidedly not representative of the best that is available. It is more a choice of reasonable performance with minimal impact on the target price of the set. Given that, saying you can see a difference on just any HDTV means very little. It entirely depends on the quality of the video processing (as has been stated ad nauseam).

Then you brought up the expensive deinterlacer as a remedy/option, which still doesn't compare to pure progressive output on Xbox.

Wrong. I brought up a device as an example of what is possible, in order to dispel a popular misconception.

This option is just an expensive band-aid required for the PS2.

It's also an "expensive band-aid" for the Xbox when it comes to true large screen presentation...if you choose to only see it as a "band-aid".

Nobody is arguing that more money buys better equipment, making the difference between PS2 and Xbox output quality closer, but even with all the money spent on additional video processing it's still worse.

Worse? You've been able to confirm this, personally? The reality is you have a preconception that it must be worse.

Bottom line is, if PS2 had progressive output for 99% of the games available, we wouldn't be arguing about interlacing in the first place.

Well, it wouldn't matter to those who seem to live and die by the bulleted product features printed on a box.

Instead we'd be arguing about washed out colors, no mipmapping, no trilinear filtering, no bumpmapping, etc. ;)

Evidently, that consortium takes great pleasure in arguing over that regardless, day after day. Surely one day they will evolve to realize how petty it is to quibble over this endlessly just to feel more secure about their own console of choice. ...or maybe not: XBox2 and PS3 will come out, and they'll start all over again... Seems like their only hope is to simply grow out of consoles.
 
If Chap or jvd had both a PS2 and an Xbox hooked up to the same output device, and the Xbox ends up with the better image quality, then so be it. No point in bringing up expensive Faroudja deinterlacers to try and narrow the gap. Nobody was claiming that Xbox's image quality couldn't be improved further with advanced video processing; same with PS2.

You can spend as much money as you want to try and make a Pinto perform as well as a Mustang too...your point?
 
What you are not accounting for is that different HDTVs can yield different results. Things have gotten far more sophisticated than just a wire connected to a tube. There's a whole lot of electronics and processing that goes on in between. An outboard Faroudja simply represents the best possible case, as it reduces the ambiguity of what is at work inside the HDTV. As such, there will be various alternative pieces of equipment occupying performance points across the entire range between a mediocre HDTV and a full-on HDTV with dedicated outboard processing. Potentially, the one that can make either source look good will be better than one that only does well with one kind of source.

Your comparisons to Pintos and Mustangs are telling of your inability to be descriptive w/o being inflammatory. It's a shame you are unable to discuss things w/o employing digs. It shows a lack of respect. It also shows that your presence in this topic is more about trolling than conversation (...course this stuff flies under the moderators' radar all the time when it is pro-Xbox).
 
Faroudjas, hmm. There are HDTVs that use their chips. I wonder how those compare to their standalone scalers?

Fafalada said:
Don't framerate fluctuations cause motion sickness?
The people I know who suffer from motion sickness all get it from being exposed to 3D animation, most often when it's a 1st-person camera. And usually the more constant the framerate, the worse the effects they suffer - I don't really know why though, my field isn't medicine :p
To the best of my knowledge, it has something to do with the lack of other sensory feedback (the screen is only audio/visual).

I'm one of those people. It only happens significantly in first-person games, like you mentioned :?
 
randycat99 said:
...or Mustang and Camaro. Different strokes for different folks.

I wasn't aware you needed to spend loads of money to make one perform on par with the other. I always thought they were equal out of the box. ;)
 