Oblivion at 1080p on PS3?

"1080i gaming is not 1080i film. Not even close."

As long as it's 30 FPS or below it's the same thing. It's still deinterlaced and displayed at 60 Hz on a progressive-scan HDTV.
 
"If you seriously think 1080i @ 30fps is 1080p @ 30fps, then I don't what to say."

1080i at 30 FPS run on a native 1080p set is the same as native 1080p. A native 1080p set like mine cannot display an interlaced signal, so it deinterlaces it and displays it at 60 Hz. There is no loss of information compared to a 1080p signal this way if a game is running at 30 FPS.


From Wikipedia:
http://en.wikipedia.org/wiki/1080p

1080i film-based content can become true 1080p


The following examples refer to content that is encoded in progressive-scan form during recording or transmission—what would be considered "native" progressive signals. However, where 24 fps film-based material is concerned, a 1080i encoded/transmitted stream can become a true "1080p" signal during playback by deinterlacing to re-combine the split field pairs into progressive film-scanned frames. Regarding 24 fps film-source material presented in conventional 1080i60 form, the deinterlacing process that achieves this goal is usually referred to as "3:2 pulldown reversal" [also known as "inverse telecine"]. The importance of this is that, where film-based content is concerned, all 1080-interlaced signals are potentially 1080p signals given the proper deinterlacing. As long as no additional image-degradation steps were applied during signal mastering (such as excessive vertical-pass filtering), the image from a properly deinterlaced film-source 1080i signal and a native-encoded 1080p signal will look approximately the same. It should be noted that Blu-ray Disc and HD DVD sources are 1080p with no vertical filtering, therefore, 1080i output from players can be perfectly reconstructed to 1080p with 3:2 pulldown reversal.
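The 3:2 pulldown and its reversal that the article describes can be sketched in a few lines of Python. This is a toy model (frame labels stand in for real fields; no actual video API is involved) just to show why the original 24 FPS frames survive the round trip:

```python
def telecine_23(frames):
    """2:3 pulldown: 4 film frames -> 10 interlaced fields (24fps -> 60i)."""
    fields = []
    for i, f in enumerate(frames):
        n = 2 if i % 2 == 0 else 3  # alternate: 2 fields, then 3 fields, per frame
        for _ in range(n):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((f, parity))  # fields strictly alternate top/bottom
    return fields

def inverse_telecine(fields):
    """3:2 pulldown reversal: drop the repeats and re-pair fields into frames."""
    frames = []
    for frame_id, _ in fields:
        if not frames or frames[-1] != frame_id:
            frames.append(frame_id)
    return frames

film = ["A", "B", "C", "D"]          # four consecutive 24fps film frames
fields = telecine_23(film)
assert len(fields) == 10             # 24 * 10/4 = 60 fields per second
assert inverse_telecine(fields) == film  # the original frames come back exactly
```

Since every field is cut from a whole progressive film frame, no information is lost; the deinterlacer only has to detect the 2:3 cadence and undo it.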

I may be wrong, but it seems to me that reference is merely talking about taking higher-framerate 1080i film content and recombining the separate fields into lower-framerate 1080p.

That couldn't work on the 360 at 1080i/30 FPS unless the 1080p picture was running at 15 FPS.
 
"1080i gaming is not 1080i film. Not even close."

As long as it's 30 FPS or below it's the same thing. It's still deinterlaced and displayed at 60 Hz on a progressive-scan HDTV.

Since it's clear you totally don't understand, let me explain: film is originally captured at 24 FPS. This is not a problem because film has perfect motion blur. When viewed on a TV, it's often scaled to 60 fields per second when interlaced and 30 FPS when progressive, because it really doesn't matter too much as long as it's over 24 FPS. Thus, for film, 1080i and 1080p are very comparable. However, games are not the same thing. No game has perfect motion blur; in fact, a game would need to run at something like 200 FPS before all effects of framerate became non-apparent. That said, you've clearly described the game as 1080i at 30 FPS, which is neither the 60 fields per second of 1080i film nor the perfectly motion-blurred 24 FPS of film in its original format. Like the previous poster said, this isn't going to work at all without turning it into a 15 FPS slideshow, nor do I believe the game ever ran at 1080i in the first place.
 
And, for clarification, if it was 1080i it would have to be 60fps (interlaced frames) that the console outputs (not necessarily what the game is running at), or you'd get some disastrous 15 FPS slideshow when the interlaced frames were combined.
There is no such thing as "interlaced frames"; frames are deinterlaced from every two interlaced fields of content. In the case of 1080i, that means 60 interlaced fields of 1920x540 which combine to make 30 progressive 1920x1080 frames. That said, I'm certain that Oblivion renders the same on the 360 regardless of whether you set the console to output 720p or 1080i, and I highly doubt that any 360 game ever bothers to render interlaced fields rather than progressive frames no matter what the 360 is set to output.
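The field/frame arithmetic in that post can be illustrated with a small Python sketch. Plain lists stand in for scanlines here (this is illustrative code, not any real video pipeline): when a game renders 30 progressive frames a second, each frame splits into two 540-line fields, and weaving those fields back together recovers the frame with no loss.

```python
def split_fields(frame):
    """Split one progressive frame (a list of scanlines) into top/bottom fields."""
    return frame[0::2], frame[1::2]  # even-numbered lines, odd-numbered lines

def weave(top, bottom):
    """Deinterlace by weaving: re-interleave two fields into one full frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

# A toy 1080-line frame: each "scanline" is just its line number here.
frame = list(range(1080))
top, bottom = split_fields(frame)
assert len(top) == len(bottom) == 540  # 1080i fields are 1920x540
assert weave(top, bottom) == frame     # weaving recovers the frame exactly
```

The catch is that weaving is only lossless when both fields come from the same moment in time, i.e. a 30 FPS progressive source; with true 60-fields-per-second interlaced content the two fields show different moments and must be deinterlaced differently.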
 
Oblivion is one of the 360 games (among many other confirmed ones) which IMO runs sub-720p, and probably the one where the difference in image quality between 480p and 720p is least noticeable (even on my 50" Sony Bravia).

So I absolutely doubt it runs at anything above 720p on the 360.
 
Oblivion on the 360 looks like 1280x720 with 2xAA and basic trilinear filtering to me, regardless of output resolution.
 
Oblivion on the 360 looks like 1280x720 with 2xAA and basic trilinear filtering to me, regardless of output resolution.

Actually, the crappy texture filtering is mostly on the ground; other surfaces usually have better filtering. But the IQ still makes your eyes bleed. I hope it's better in the PS3 version.
 
There is no such thing as "interlaced frames"; frames are deinterlaced from every two interlaced fields of content.

I might have taken your quote out of context, but just to add in: everything last gen (on SD) running at 60 FPS was either rendered in interlaced frames or was running at 640x240, right?
 
Interlaced rendering was an exception and never the rule, definitely for Xbox (I'm not sure how many titles did interlaced rendering on PS2 or GCN, but it's likely not many either). Achieving 60 FPS was not contingent on interlaced frames or rendering at a lower resolution, e.g. the Team Ninja games.
 
Interlaced rendering was an exception and never the rule, definitely for Xbox (I'm not sure how many titles did interlaced rendering on PS2 or GCN, but it's likely not many either). Achieving 60 FPS was not contingent on interlaced frames or rendering at a lower resolution, e.g. the Team Ninja games.

Well, it sure seems strange how they could achieve 60 FPS without interlaced rendering (unless of course you render all 60 frames progressive and only show half of each frame).
 
Well, it sure seems strange how they could achieve 60 FPS without interlaced rendering (unless of course you render all 60 frames progressive and only show half of each frame).
Very easily, the same way the HD DVD addon puts out 1080i60 even though we're only presenting it with 24 frames a second. The 360 has a chip that outputs any resolution and framerate, irrespective of the source resolution and framerate. It'll just repeat frames as necessary (and create fields for interlaced output).
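The frame-repetition behaviour described here can be modelled in Python. This is a sketch, not the 360's actual scaler logic: it just maps each output interval to the most recent source frame, which naturally produces 2x repeats for a 30 FPS source on a 60 Hz output and the 2:3 cadence for a 24 FPS source.

```python
from collections import Counter

def repeat_to_output(src_fps, out_fps, seconds=1):
    """For each output interval, pick the latest available source frame index."""
    out = []
    for k in range(out_fps * seconds):
        t = k / out_fps                 # time of this output frame/field
        out.append(int(t * src_fps))    # index of the most recent source frame
    return out

# A 30fps game on a 60Hz output: every source frame is simply shown twice.
assert repeat_to_output(30, 60) == [f for f in range(30) for _ in range(2)]

# 24fps film on a 60Hz output: frames repeat in an alternating 2/3 cadence.
counts = Counter(repeat_to_output(24, 60))
assert sorted(counts.values()) == [2] * 12 + [3] * 12  # 12*2 + 12*3 = 60
```

In other words, the output rate is fixed by the display chain, and the scaler fills it by repeating whatever the game last rendered; the source framerate is unchanged.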
 
Very easily, the same way the HD DVD addon puts out 1080i60 even though we're only presenting it with 24 frames a second. The 360 has a chip that outputs any resolution and framerate, irrespective of the source resolution and framerate. It'll just repeat frames as necessary (and create fields for interlaced output).

Well, if the source is 24 FPS and the output is 60 Hz interlaced, that doesn't make it 60 FPS; it's 24 FPS at 60 Hz interlaced. If you want 60 FPS, you've got to have 60 FPS in the framebuffer regardless of what output Hz you are using.

For instance, if you have 30 FPS progressive in the framebuffer and an interlaced output at 60 Hz, that does not make it 60 FPS even though you are seeing 60 different half-frames (fields).
 
Hmm? That's weird. I see nothing demanding about Oblivion's graphics, and it's only a first-gen 360 game. I see no reason why some improvements such as 1080p can't be achieved on either console, especially when we are expecting future games that will visually blow away Oblivion in every way.

Oblivion is in no way an example of top graphics for me.
 
It's not for you, but it brings quite a lot of PCs to their knees, so yes, it is sort of hardware-demanding.
 
Hmm? That's weird. I see nothing demanding about Oblivion's graphics, and it's only a first-gen 360 game. I see no reason why some improvements such as 1080p can't be achieved on either console, especially when we are expecting future games that will visually blow away Oblivion in every way.

Oblivion is in no way an example of top graphics for me.

I'm not expecting future games to visually blow away Oblivion in output resolution, just in every other way :)

The "Oblivion looks like crap" effect is due mostly to a) bad art direction b) horrible shaders (you can find interesting analyses of Oblivion and other games, e.g. done with NVPerfHUD). It is demanding on the GPU, and I don't expect Bethesda have spent lots of times optimizing the PS3 version - just getting it to run. They have expansions to ship, after all.
 
It's not for you, but it brings quite a lot of PCs to their knees, so yes, it is sort of hardware-demanding.

Its only "demanding" because the game had poor optimization. Its hard to think that Oblivion is that ridiculously demanding that even sli and xf rigs cant put out decent frames at higher resolutions while other demanding games works out fine.
 