"If you seriously think 1080i @ 30fps is 1080p @ 30fps, then I don't what to say."
1080i at 30fps run on a native 1080p set is the same as native 1080p. A native 1080p set like mine cannot display an interlaced signal, so it deinterlaces the signal and displays it at 60Hz. Done this way, there is no loss of information compared to a 1080p signal if the game is running at 30fps.
From Wikipedia:
http://en.wikipedia.org/wiki/1080p
1080i film-based content can become true 1080p
The following examples refer to content that is encoded in progressive-scan form during recording or transmission—what would be considered "native" progressive signals. However, where 24 fps film-based material is concerned, a 1080i encoded/transmitted stream can become a true "1080p" signal during playback by deinterlacing to re-combine the split field pairs into progressive film-scanned frames.

Regarding 24 fps film-source material presented in conventional 1080i60 form, the deinterlacing process that achieves this goal is usually referred to as "3:2 pulldown reversal" [also known as "inverse telecine"]. The importance of this is that, where film-based content is concerned, all 1080-interlaced signals are potentially 1080p signals given the proper deinterlacing. As long as no additional image-degradation steps were applied during signal mastering (such as excessive vertical-pass filtering), the image from a properly deinterlaced film-source 1080i signal and a native-encoded 1080p signal will look approximately the same.

It should be noted that Blu-ray Disc and HD DVD sources are 1080p with no vertical filtering, therefore, 1080i output from players can be perfectly reconstructed to 1080p with 3:2 pulldown reversal.
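To make the quoted passage concrete, here's a minimal toy model of 3:2 pulldown and its reversal. Frames are just labels here (a real implementation works on pixel data, and real inverse-telecine filters must also detect the cadence), but it shows why the reconstruction is lossless: every field of every film frame survives, only the repeats are discarded.

```python
# Toy model: 3:2 pulldown spreads 4 film frames across 10 interlaced
# fields (24fps -> 60 fields/sec), and inverse telecine recovers the
# original frames exactly. Frames are stand-in labels, not pixel data.

def pulldown_3_2(frames):
    """Emit fields in a 3,2,3,2 cadence (every other frame repeats a field)."""
    fields = []
    for i, frame in enumerate(frames):
        top, bottom = (frame, "top"), (frame, "bottom")
        fields += [top, bottom]
        if i % 2 == 0:          # every other frame contributes a 3rd field
            fields.append(top)  # the repeated field carries no new info
    return fields

def inverse_telecine(fields):
    """Skip the repeated fields and recover one frame per cadence group."""
    frames, i, k = [], 0, 0
    cadence = [3, 2]            # fields consumed per original frame
    while i < len(fields):
        frames.append(fields[i][0])   # the frame both fields came from
        i += cadence[k % 2]
        k += 1
    return frames

film = ["A", "B", "C", "D"]     # 4 film frames -> 10 fields
fields = pulldown_3_2(film)
assert len(fields) == 10
assert inverse_telecine(fields) == film   # perfect reconstruction
```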
"1080i gaming is not 1080i film. Not even close."
As long as it's 30fps or below, it's the same thing: the set is still deinterlacing and displaying at 60Hz on a progressive-scan HDTV.
There is no such thing as "interlaced frames"; frames are deinterlaced from every two interlaced fields of content. In the case of 1080i, that means 60 interlaced fields of 1920x540 which combine to make 30 progressive 1920x1080 frames. That said, I'm certain that Oblivion renders the same on the 360 regardless of whether you set the console to output 720p or 1080i, and I highly doubt that any 360 game bothers to render interlaced fields rather than progressive frames, no matter what the 360 is set to output. And, for clarification, if it really were 1080i, the console would have to output 60 fields per second (not necessarily what the game itself is running at), or you'd get a disastrous 15fps slideshow once the fields were combined.
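The field/frame relationship described above can be sketched in a few lines. This is a toy weave deinterlacer (lists of scanline labels stand in for 1920-pixel rows): two half-height fields, one holding the even lines and one the odd lines, interleave back into the full progressive frame with nothing lost, provided both fields came from the same source frame.

```python
# Toy weave-deinterlace: a progressive frame splits into two fields
# (even and odd scanlines), and weaving them back together recovers
# the frame exactly. Scanlines are labels here, not pixel rows.

def split_fields(frame):
    """Split a progressive frame into top (even lines) and bottom (odd lines)."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Interleave two fields back into one full progressive frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame += [t, b]
    return frame

# 6-line stand-in for a 1080-line frame; real 1080i fields are 1920x540.
frame = [f"line{n}" for n in range(6)]
top, bottom = split_fields(frame)
assert weave(top, bottom) == frame   # lossless for a same-frame field pair
```

Note this only holds when both fields describe the same instant; if the fields came from two different moments in time (true 60-fields-per-second interlaced video of motion), weaving produces combing artifacts instead.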
Oblivion on the 360 looks like 1280x720 with 2xAA and basic trilinear filtering to me, regardless of the output resolution.
There is no such thing as "interlaced frames", frames are deinterlaced from every two interlaced fields of content.
Interlaced rendering was the exception, never the rule, certainly on Xbox (I'm not sure how many titles did interlaced rendering on PS2 or GCN, but likely not many either). Achieving 60fps was not contingent on interlaced frames or on rendering at a lower resolution, e.g. the Team Ninja games.
"Well, it sure seems strange how they can achieve 60fps without interlaced rendering (or of course you could render all 60 frames progressively and only show half of each frame)."
Very easily, the same way the HD DVD add-on puts out 1080i60 even though we're only presenting it with 24 frames per second. The 360 has a chip that outputs any resolution and framerate, irrespective of the source resolution and framerate. It'll just repeat frames as necessary (and create fields for interlaced output).
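A hypothetical model of that output stage, under the assumptions above (not the 360's actual hardware logic, just the arithmetic): a 30fps progressive source fills a 1080i60 stream by emitting both fields of each frame, while a 24fps source needs frames spread in a 3:2 cadence to hit the fixed 60-fields-per-second output rate.

```python
# Assumed/simplified model of a fixed-rate interlaced output stage:
# whatever the source framerate, the output is always 60 fields/sec.
# Frames are stand-in values, not pixel data.

def to_1080i60(frames, source_fps):
    """Turn a progressive frame sequence into a 60-fields/sec stream."""
    fields = []
    if source_fps == 30:                 # 30 frames -> 60 fields (2 per frame)
        for f in frames:
            fields += [(f, "top"), (f, "bottom")]
    elif source_fps == 24:               # 24 frames -> 60 fields via 3:2 cadence
        for i, f in enumerate(frames):
            n = 3 if i % 2 == 0 else 2   # alternate 3 fields, then 2
            fields += [(f, "top"), (f, "bottom"), (f, "top")][:n]
    else:
        raise ValueError("unsupported source framerate in this sketch")
    return fields

# One second of source in each case yields exactly 60 output fields.
assert len(to_1080i60(list(range(30)), 30)) == 60
assert len(to_1080i60(list(range(24)), 24)) == 60
```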
Hmm? That's weird. I see nothing demanding about Oblivion's graphics, and it's only a first-gen 360 game. I see no reason why improvements such as 1080p can't be achieved on either console, especially when we are expecting future games that will visually blow Oblivion away in every respect.
Oblivion is in no way an example of top graphics for me.
It's not top graphics for you, but it brings quite a lot of PCs to their knees, so yes, it's sort of hardware-demanding.
Oblivion (X360) is HDR + 4xAA (maybe it's 2x; I haven't played in a good while).
I think Oblivion looks great outdoors. Indoors, there's no denying it looks like trash, but outdoors it's easily one of the best-looking games available today, IMO.
Here's what I mean:
http://img.photobucket.com/albums/v68/pjbliverpool/Oblivion2.jpg
http://img.photobucket.com/albums/v68/pjbliverpool/Oblivion3.jpg
http://img.photobucket.com/albums/v68/pjbliverpool/Oblivion4.jpg