I’m not trying to drag your thread off topic, Deepak (although it really doesn’t look like it’s a meaty enough subject to continue for long), but reading the VF4: Evo review got me thinking about interlace and 30fps vs. 60fps again. That was the subject of a discussion on a Danish forum some time ago.
The discussion was about whether (and why) it is beneficial to render to a full-height front buffer, and also about the Dreamcast’s interlace-field line-blending hardware feature.
It was never really resolved satisfactorily, so I thought I would bring the question to some real experts (that’s here, if anyone was wondering).
My question is this: the heavy aliasing found in VF4 (corrected in VF4: Evo) and many other PS2 games is due to a slight vertical displacement between the interlace fields, because the GS renders only the lines the TV needs right now, as a 640x240 interlace field, to save VRAM. That’s why the jaggies get enhanced, right?
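To make sure I’m picturing this right, here is a toy sketch of my own (not actual GS code): a 640x240 field buffer is equivalent to point-sampling every other line of the full 640x480 frame, so the stair-steps on a diagonal edge are effectively twice as tall within a single field.

```python
# Toy illustration (my own, sizes scaled down from 640x480):
# rasterize a diagonal edge into a full-height "frame", then pull
# out the two interlace fields by taking every other line.

WIDTH, HEIGHT = 16, 12  # tiny stand-in for 640x480

def render_frame():
    """Full-res frame: 1 where a diagonal edge covers the pixel."""
    return [[1 if x < y * 0.7 else 0 for x in range(WIDTH)]
            for y in range(HEIGHT)]

frame = render_frame()
even_field = frame[0::2]  # lines 0,2,4,... -> what a 240-line field buffer holds
odd_field  = frame[1::2]  # lines 1,3,5,...

# Each field contains no information at all from the skipped lines,
# so its vertical stair-steps are twice the height of the frame's.
```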
But why? Why should it get better if your game renders to a full front buffer? If the game runs at 60fps, like VF4 and many other jaggilicious PS2 games do, you would still inevitably get a displacement between the fields, even if you rendered a full frame of two fields each time (and then threw one field away), because at 60fps every displayed image is really a 640x240 field.
Is it line blending that makes some games look better than others? That wouldn’t explain why even some 30fps games are plagued by aliasing, since in that case the two interlace fields should fit together, being drawn from the same frame.
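The timing argument I’m making above can be sketched like this (my own framing, nothing console-specific): at 60fps every 60Hz field comes from a freshly rendered image, so consecutive fields always show two different moments in time, while at 30fps the two fields of a frame come from the same render and should mesh.

```python
# Sketch: which render (by index) feeds each 60Hz field, and with
# which scanline parity, at a given game framerate.

def fields_shown(fps, n_fields=4):
    """Return (render_index, field_parity) for the next n_fields fields."""
    out = []
    for field in range(n_fields):
        render = field * fps // 60  # latest finished render for this field
        parity = field % 2          # even/odd scanlines this field
        out.append((render, parity))
    return out

print(fields_shown(60))  # every field comes from a different render
print(fields_shown(30))  # each render supplies both fields of one frame
```

At 30fps the even and odd fields share a render index, which is why I’d expect them to fit together spatially; at 60fps they never do.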
In short: what is the real advantage of rendering a full 640x480 frame, when all the TV ever scans out is a 640x240 field?
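One candidate answer I can think of (an assumption on my part, not something I can confirm from any SDK docs): if the full 480-line frame exists, each 240-line output field can be built by blending each target line with its vertical neighbour, which is the line-blending / flicker-filter idea. That softens jaggies and interlace flicker, and it is only possible if the skipped lines were actually rendered.

```python
# Sketch of the assumed line-blending advantage of a full-height buffer.

def field_naive(frame, parity):
    """Field rendering: just the lines the TV scans this field."""
    return frame[parity::2]

def field_blended(frame, parity):
    """Blend each field line 50/50 with the adjacent frame line."""
    out = []
    for y in range(parity, len(frame), 2):
        nxt = frame[min(y + 1, len(frame) - 1)]
        out.append([(a + b) / 2 for a, b in zip(frame[y], nxt)])
    return out

# A hard 0/1 edge in the frame becomes a 0 / 0.5 / 1 ramp in the
# blended field: the jaggy step is softened, at the cost of some
# vertical blur -- the trade-off line blending is known for.
frame = [[0, 0, 1, 1],
         [0, 1, 1, 1],
         [1, 1, 1, 1],
         [1, 1, 1, 1]]
print(field_blended(frame, 0))
```

If that is roughly what the Dreamcast’s hardware feature (and well-behaved PS2 titles) do, it would explain why a full front buffer helps even though only 240 lines ever reach the screen per field.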