As far as the system is concerned, is it the same to render 1080i as 1080p?

skilzygw

Newcomer
From a graphics card, system, and processor point of view, would the same amount of rendering power be needed for a 1080i signal as for a 1080p signal?

e.g. for an Nvidia card, is outputting a game at 1080i @ 60fps the same as the card outputting 1080p @ 60fps?

Or the Xbox 360, for that matter, which is what really brought about this question, since I know it plays a lot of games at 1080i now.

Thanks.
 
No, 1080i has half the pixels of 1080p per field, I believe. Each field keeps the full horizontal resolution, but vertically only every other line is shown at a time.
 
If they are doing it correctly, I would say yes.

1080i still gives 1920x1080 resolution, it just refreshes the odd and even horizontal scan lines 60 times every second, interlaced. 1080p gives 1920x1080 non-interlaced at a lower refresh rate.
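
To illustrate the line split (a minimal sketch, not any particular hardware's implementation; it assumes a packed 8-bit grayscale framebuffer, and extract_field is a made-up name):

Code:
#include <string.h>

#define WIDTH  1920
#define HEIGHT 1080

/* Copy every other scan line of a progressive frame into a 540-line
   field buffer. parity 0 = even lines (top field), 1 = odd lines. */
void extract_field(const unsigned char *frame, unsigned char *field, int parity)
{
    for (int y = parity; y < HEIGHT; y += 2)
        memcpy(field + (y / 2) * WIDTH, frame + y * WIDTH, WIDTH);
}

Each field is 1920 x 540; the two fields together cover all 1080 lines.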
 
skilzygw said:
From a graphics card, system, and processor point of view, would the same amount of rendering power be needed for a 1080i signal as for a 1080p signal?

e.g. for an Nvidia card, is outputting a game at 1080i @ 60fps the same as the card outputting 1080p @ 60fps?

Or the Xbox 360, for that matter, which is what really brought about this question, since I know it plays a lot of games at 1080i now.

Thanks.

1080p = 1920 x 1080, progressive
1080i = 1920 x 540 per field, two fields per frame (making up 1920 x 1080, interlaced)
720p = 1280 x 720, progressive
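
In raw pixel-throughput terms the comparison is just arithmetic; a quick sketch (plain C, nothing platform-specific):

Code:
#include <stdio.h>

int main(void)
{
    /* Pixels that must be produced per second for each mode. */
    long p60 = 1920L * 1080 * 60; /* 1080p, 60 full frames/s */
    long i60 = 1920L * 540  * 60; /* 1080i, 60 half-height fields/s */
    long p30 = 1920L * 1080 * 30; /* 1080p, 30 full frames/s */

    printf("1080p60: %ld px/s\n", p60); /* 124,416,000 */
    printf("1080i60: %ld px/s\n", i60); /*  62,208,000 */
    printf("1080p30: %ld px/s\n", p30); /*  62,208,000 */
    return 0;
}

So 1080i @ 60 fields/s and 1080p @ 30 frames/s move the same number of pixels per second, while 1080p @ 60 moves twice as many.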

Some game and hardware sites have reported that Xbox 360 games aren't really doing true 720p graphics on the GPU, but rather a pseudo-720p through the video chip's scaling capabilities. So I have a problem believing that the system is doing true 1080i graphics without the aid of the video chip. I can't fault the Xenos GPU per se; I fault the borderline 10MB of EDRAM they decided to use in conjunction with it. Maybe 15-18MB would have done the trick, but using 10MB with AA, AF, and other post-processing techniques turned on was just begging for trouble... IMO.
 
I believe some older PS2 games used field rendering... I can't give you any more specific detail than that, but it's different from rendering an entire frame, AFAIK :p
 
Sean*O said:
If they are doing it correctly, I would say yes.

1080i still gives 1920x1080 resolution, it just refreshes the odd and even horizontal scan lines 60 times every second, interlaced. 1080p gives 1920x1080 non-interlaced at a lower refresh rate.
Nope. When rendering 1080i you only need to render half as many lines per frame as 60 Hz 1080p. That's assuming, as the OP asked, that 1080p is capable of 60 Hz. If it can't go above 30 Hz, then yes, there is no difference.
 
Umm, yes it can: 1080p @ 60 Hz is a supported standard. It's not widely used (the bandwidth is 2x that of 1080p @ 30 Hz), but it does exist in the standards.
 
Shifty Geezer said:
Nope. When rendering 1080i you only need to render half as many lines per frame as 60 Hz 1080p. That's assuming, as the OP asked, that 1080p is capable of 60 Hz. If it can't go above 30 Hz, then yes, there is no difference.
But the system (GPU) doesn't render alternating fields every frame, or does it?
I assume the GPU passes the complete image to the scaler, which then sends the image to the TV, alternating the horizontal lines into an interlaced set.
 
It'll either render alternating fields, or render a 1920x1080 framebuffer every 1/30th of a second and show the same buffer as alternating fields for two frames. I guess you could render the full framebuffer and only show half of it each time, but that's a waste of resources.
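
As a sketch of those two options (the render/present calls are hypothetical stand-ins for whatever the platform actually provides):

Code:
/* Hypothetical platform hooks, named for illustration only. */
void render_scene_to_field(int parity); /* draw one 1920x540 field  */
void render_scene_to_frame(void);       /* draw one 1920x1080 frame */
void present_field(int parity);         /* scan out one field       */

/* Option A: true field rendering -- a new half-height field every
   1/60th of a second, alternating between even and odd lines. */
void field_rendered_loop(void)
{
    int parity = 0;
    for (;;) {
        render_scene_to_field(parity);
        present_field(parity);
        parity ^= 1; /* even <-> odd on every vsync */
    }
}

/* Option B: render a full frame every 1/30th of a second and show
   the same buffer as two consecutive fields. */
void frame_rendered_loop(void)
{
    for (;;) {
        render_scene_to_frame();
        present_field(0); /* even lines of the new frame */
        present_field(1); /* odd lines of the same frame */
    }
}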
 
Shifty Geezer said:
It'll either render alternating fields, or render a 1920x1080 framebuffer every 1/30th of a second and show the same buffer as alternating fields for two frames. I guess you could render the full framebuffer and only show half of it each time, but that's a waste of resources.

In fact you would likely render to the full 1080-line framebuffer, since the deinterlacing circuit will use both fields.

You could technically render in fields (like early PS2 games did for 480i), but at that point you would have to maintain 60 fps; dropping frames would produce ugly artifacts.
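
For reference, the simplest deinterlacing mode ("weave") just interleaves the two fields back into a full frame, which is exactly why both fields need to come from the same moment in time; a sketch with the same assumed buffer layout as above:

Code:
#include <string.h>

#define WIDTH  1920
#define HEIGHT 1080

/* "Weave" deinterlacing: interleave a 540-line top field (even lines)
   and bottom field (odd lines) back into a 1080-line frame. Clean only
   if both fields were drawn at the same instant -- hence the 60 fps
   requirement for field rendering. */
void weave(const unsigned char *top, const unsigned char *bottom,
           unsigned char *frame)
{
    for (int y = 0; y < HEIGHT; y++) {
        const unsigned char *src = (y & 1) ? bottom : top;
        memcpy(frame + y * WIDTH, src + (y / 2) * WIDTH, WIDTH);
    }
}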
 
Shifty Geezer said:
It'll either render alternating fields, or render a 1920x1080 framebuffer every 1/30th of a second and show the same buffer as alternating fields for two frames. I guess you could render the full framebuffer and only show half of it each time, but that's a waste of resources.

So does that mean that in a 60 fps 1080i game, it could render a single frame every 1/30th of a second and just display it twice? If that's the case, it kinda seems like a 60 fps 1080i game requires the same work as a 30 fps 1080p game...
 
How does interlaced display the whole frame if it only shows half of it at a time?
If the first half of frame A comes first, which one comes after: the second half of A, or the next frame, B?
 
weaksauce said:
How does interlaced display the whole frame if it only shows half of it at a time?
If the first half of frame A comes first, which one comes after: the second half of A, or the next frame, B?

Generally the output chip uses a weighted average of horizontal lines; 3- and 5-tap filters are relatively common.

This reduces the "flickering" that can be seen when the two fields have disparate pixels at edges.
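
Something along these lines (a sketch of a 3-tap vertical filter with 1:2:1 weights; real hardware coefficients vary):

Code:
#define WIDTH  1920
#define HEIGHT 1080

/* Produce one field line as a weighted average of a frame line and its
   vertical neighbours (3-tap, weights 1:2:1). Softening sharp
   horizontal edges keeps them from flickering between fields. */
void filtered_field_line(const unsigned char *frame, unsigned char *out, int y)
{
    int above = (y > 0) ? y - 1 : y;          /* clamp at the edges */
    int below = (y < HEIGHT - 1) ? y + 1 : y;
    for (int x = 0; x < WIDTH; x++) {
        int sum = frame[above * WIDTH + x]
                + 2 * frame[y * WIDTH + x]
                + frame[below * WIDTH + x];
        out[x] = (unsigned char)(sum / 4);
    }
}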
 