Oblivion at 1080p on PS3?

It seems IGN is claiming that Oblivion will run at 1080p on the PS3. Here is the link:

http://ps3.ign.com/articles/755/755571p1.html

Is this possible, or is it another misquote like the one where Motorstorm was supposed to run at 1080p? The reason I ask is that, from what I remember, the 360 version's framerate was nothing to write home about. How the hell are they going to pull off 1080p on the PS3?
 
Would be nice but I doubt it. I don't think they would be bitching about the 2x BD drive if they managed to achieve 1080p on the PS3.
 

Drive speed shouldn't really have anything to do with feasibility of a given resolution when the assets are the same. They can still bitch about it while doing wonderful things elsewhere...

I highly doubt we'd see it at 1080p though. It runs crappily enough on the PC and drops a few frames in certain areas on the 360 (not bad though).
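
Just to put rough numbers on the resolution side of that, a minimal back-of-the-envelope sketch (illustrative figures only: plain colour + depth buffers, no AA, nothing console-specific):

```python
# Rough pixel-count arithmetic: the per-frame cost of a resolution is GPU
# fill rate and framebuffer memory, not disc throughput. Illustrative only.

def framebuffer_mb(width, height, bytes_per_pixel=8):
    """Assumes 32-bit colour + 32-bit depth/stencil per pixel, no AA."""
    return width * height * bytes_per_pixel / (1024 ** 2)

for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    print(f"{name}: {w * h:,} pixels, ~{framebuffer_mb(w, h):.1f} MB of buffers")

# 1080p pushes ~2.25x the pixels of 720p every frame; the disc still only
# has to deliver the same textures and geometry either way.
print(f"pixel ratio 1080p / 720p: {1920 * 1080 / (1280 * 720):.2f}x")
```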
 

I'm not too sure about this...

Running Oblivion at home on my 64-bit Athlon 3500+ with 2 GB of RAM and a GF 7800 GTX, it's pretty clear that overall the game is CPU-bound.

Graphically the game never really pushes my card to the end of its rendering tether, thanks to such fugly "aggressive" LOD in the outdoor environments and on the characters, weapons, etc., which don't look great from a distance and have a tendency to pop in and out of the ether quite frequently (a toy sketch of what I mean is at the end of this post).

I can imagine that with Cell doing all the environment streaming (assisted by the HDD cache), assisting in rendering, and churning through the Radiant AI, with all those little SPUs bursting with processing, 1080p Oblivion on the PS3 is not too far out of the realm of possibility.

Also remember that Bethesda got hold of PS3 dev kits quite early, so the PS3 version may have had even more time to mature than the PC counterpart ever did.
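
To illustrate what I mean by "aggressive" LOD, here's a toy sketch of distance-banded LOD selection. The thresholds are completely made up and have nothing to do with Oblivion's actual settings; it's only meant to show why tight bands read as pop-in:

```python
# Hypothetical distance-based LOD picker -- not Oblivion's actual system,
# just an illustration of why short distance bands cause visible pop-in.
from dataclasses import dataclass

@dataclass
class LodBand:
    max_distance: float  # metres (made-up thresholds)
    mesh: str            # which detail level to draw

# "Aggressive" LOD = the high-detail band is very short, so the switch to a
# low-poly mesh (or to nothing at all) happens close to the camera.
AGGRESSIVE_LODS = [
    LodBand(25.0, "high_poly"),
    LodBand(60.0, "low_poly"),
    LodBand(120.0, "billboard"),   # beyond this: culled entirely
]

def pick_lod(distance: float, bands=AGGRESSIVE_LODS) -> str:
    for band in bands:
        if distance <= band.max_distance:
            return band.mesh
    return "culled"

# Walking toward an object: each threshold crossing is a visible "pop".
for d in (130, 110, 55, 20):
    print(f"{d:>4} m -> {pick_lod(d)}")
```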
 

You must be playing a different version of Oblivion to me! My (overclocked) 7900 GTX really struggles with some of the outdoor parts with lots of vegetation, frequently dropping down to under 30 fps and sometimes to under 20.

Granted, I'm running with HDR on and maximum view distance on everything, but I've turned vegetation density down in the config file and I'm "only" running at 1600 x 1000 rather than 1920 x 1080. Unless there's been some major reworking of the game (or they've turned the detail right down), I can't see them using 1920 x 1080 for the game.
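
For what it's worth, the jump from 1600 x 1000 to 1920 x 1080 is roughly 30% more pixels per frame; quick arithmetic below, nothing more:

```python
# 1600 x 1000 vs. 1920 x 1080: how much more per-frame work, pixel-wise.
current = 1600 * 1000   # 1,600,000 pixels
target = 1920 * 1080    # 2,073,600 pixels
print(f"{target / current:.2f}x the pixels per frame")  # ~1.30x
```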
 
But the 360 version uses HDR and, I suppose, some kind of AA (at least it looks that way to me). Since HDR + AA doesn't seem to be RSX's strong suit, they could have gone for a higher resolution instead.
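
Purely as back-of-the-envelope buffer-size arithmetic (assumed formats, and ignoring how the real hardware tiles or compresses anything), the trade-off looks something like this:

```python
# Raw framebuffer sizes for "HDR + AA" versus "higher resolution, no AA".
# Formats are assumptions for illustration; eDRAM tiling on Xenos and
# compression on RSX make the real picture more complicated.

def buffer_mb(w, h, color_bytes, depth_bytes=4, samples=1):
    return w * h * samples * (color_bytes + depth_bytes) / (1024 ** 2)

# 720p with FP16 HDR colour (8 bytes/px) and 4x multisampling
print(f"720p FP16 + 4xAA : {buffer_mb(1280, 720, 8, samples=4):.1f} MB")
# 1080p with plain RGBA8 colour (4 bytes/px) and no multisampling
print(f"1080p RGBA8 no AA: {buffer_mb(1920, 1080, 4, samples=1):.1f} MB")
```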
 

Oblivion (X360) is HDR + 4xAA (maybe it's 2x, I haven't played in a good while).

archangelmorph said:
Running Oblivion at home on my 64-bit Athlon 3500+ with 2 GB of RAM and a GF 7800 GTX, it's pretty clear that overall the game is CPU-bound.

What graphics settings are you running? An X1900 XTX struggles with Oblivion outdoors when you start maxing things out.


archangelmorph said:
I can imagine that with Cell doing all the environment streaming (assisted by the HDD cache), assisting in rendering, and churning through the Radiant AI, with all those little SPUs bursting with processing, 1080p Oblivion on the PS3 is not too far out of the realm of possibility.

Going by Bethesda's history of making horribly inefficient and unoptimized games, I think that's a big stretch.
 
But the 360 version uses HDR and, I suppose, some kind of AA (at least it looks that way to me). Since HDR + AA doesn't seem to be RSX's strong suit, they could have gone for a higher resolution instead.

Yeah, I'm guessing this as well. And then they probably found they could manage it in 1080p instead. I'd rather they'd copied NAO32 or something. ;)
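
For anyone curious, a rough sketch of the LogLuv-style packing that NAO32 is usually described as using: HDR colour squeezed into an ordinary 8-bit-per-channel target so normal MSAA and blending still work. The constants and ranges here are illustrative, not the actual shipped shader:

```python
# LogLuv-style HDR packing into an RGBA8 target (illustrative sketch only).
import math

# sRGB -> CIE XYZ (D65) matrix
M = [[0.4124, 0.3576, 0.1805],
     [0.2126, 0.7152, 0.0722],
     [0.0193, 0.1192, 0.9505]]

def encode_logluv(r, g, b):
    X = M[0][0]*r + M[0][1]*g + M[0][2]*b
    Y = M[1][0]*r + M[1][1]*g + M[1][2]*b
    Z = M[2][0]*r + M[2][1]*g + M[2][2]*b
    denom = X + 15*Y + 3*Z + 1e-8
    u, v = 4*X/denom, 9*Y/denom            # CIE u'v' chromaticity
    le = (math.log2(Y + 1e-8) + 16) / 32   # log2 luminance mapped to [0, 1]
    le = min(max(le, 0.0), 1.0)
    hi, lo = divmod(int(le * 65535), 256)  # 16-bit log luminance, split in two
    return hi, lo, int(u * 255), int(v * 255)   # four 8-bit channels

def decode_luminance(hi, lo):
    le = (hi * 256 + lo) / 65535
    return 2 ** (le * 32 - 16)

hi, lo, u8, v8 = encode_logluv(6.0, 3.0, 0.5)    # an HDR-ish colour
print(hi, lo, u8, v8, decode_luminance(hi, lo))  # luminance survives 8-bit storage
```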
 
I'm running Oblivion in 1080p on the 360 right now.

Oblivion is one of the few games to render in 720p or 1080i depending on the mode chosen. I have the VGA cables and a native 1080p Westinghouse LCD. Oblivion runs at 30 fps; 1080i at 30 fps upscaled through VGA to 1080p on my native 1080p LCD = exactly the same PQ as native 1080p rendering at 30 fps.

Been there, done that a long time ago.
 

:LOL: If you seriously think 1080i @ 30fps is 1080p @ 30fps, then I don't know what to say.
 
Yeah, I'm guessing this as well. And then they probably found they could manage it in 1080p instead. I'd rather they'd copied NAO32 or something. ;)


I'd rather they optimize the existing stuff first*. :p


*and by that I mean they could probably do a lot better with stuff they already know given enough time.
 
IGN's information is usually a pile of rubbish. When someone at GAF made the joke that the X360 was getting a Metal Gear Trilogy, they actually put it up on their website :LOL:
 
"If you seriously think 1080i @ 30fps is 1080p @ 30fps, then I don't what to say."

1080I at 30FPS run on a native 1080P set is the same as 1080P native, A Native 1080P set like mine cannot display an interlaced signal so it deinterlaces it and displays it at 60 HZ, there is no loss in information compared to a 1080P signal this way if a game is running at 30 fps.


From wiki
http://en.wikipedia.org/wiki/1080p

1080i film-based content can become true 1080p


The following examples refer to content that is encoded in progressive-scan form during recording or transmission—what would be considered "native" progressive signals. However, where 24 fps film-based material is concerned, a 1080i encoded/transmitted stream can become a true "1080p" signal during playback by deinterlacing to re-combine the split field pairs into progressive film-scanned frames. Regarding 24 fps film-source material presented in conventional 1080i60 form, the deinterlacing process that achieves this goal is usually referred to as "3:2 pulldown reversal" [also known as "inverse telecine"]. The importance of this is that, where film-based content is concerned, all 1080-interlaced signals are potentially 1080p signals given the proper deinterlacing. As long as no additional image-degradation steps were applied during signal mastering (such as excessive vertical-pass filtering), the image from a properly deinterlaced film-source 1080i signal and a native-encoded 1080p signal will look approximately the same. It should be noted that Blu-ray Disc and HD DVD sources are 1080p with no vertical filtering, therefore, 1080i output from players can be perfectly reconstructed to 1080p with 3:2 pulldown reversal.
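
Here's the rough arithmetic I mean, with the assumption spelled out; it only holds if the console really renders full 1920x1080 frames and splits each one into two fields:

```python
# Field/frame arithmetic for 30 fps content carried over a 1080i60 signal.
# Assumption (the whole argument rests on it): the console renders full
# 1920x1080 frames and splits each one into two fields from the same instant.

FIELD = 1920 * 540    # one 1080i field = every other scanline
FRAME = 1920 * 1080   # one full progressive frame

fields_per_second = 60   # 1080i60 output
game_fps = 30            # rendered frames per second

print(f"fields per rendered frame: {fields_per_second // game_fps}")   # 2
print(f"two fields = {2 * FIELD:,} px, one frame = {FRAME:,} px")      # equal

# If both fields really come from the same rendered frame, weaving them back
# together recovers all 2,073,600 pixels, i.e. nothing is lost at 30 fps.
```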
 
Swanlee said:
I'm running Oblivion in 1080p on the 360 right now.

Oblivion is one of the few games to render in 720p or 1080i depending on the mode chosen. I have the VGA cables and a native 1080p Westinghouse LCD. Oblivion runs at 30 fps; 1080i at 30 fps upscaled through VGA to 1080p on my native 1080p LCD = exactly the same PQ as native 1080p rendering at 30 fps.

Been there, done that a long time ago.

Where did you get the idea that Oblivion or any other game renders at 1080i on your 360?
 
Developers commented on this on the Elder Scrolls forums before the game was released: it renders in 720p or 1080i depending on your output mode, not just 720p like most games. "Rendered" was specifically asked about, so there was no confusion.
 
"If you seriously think 1080i @ 30fps is 1080p @ 30fps, then I don't what to say."

1080I at 30FPS run on a native 1080P set is the same as 1080P native, A Native 1080P set like mine cannot display an interlaced signal so it deinterlaces it and displays it at 60 HZ, there is no loss in information compared to a 1080P signal this way if a game is running at 30 fps.


From wiki
http://en.wikipedia.org/wiki/1080p

1080i film-based content can become true 1080p


The following examples refer to content that is encoded in progressive-scan form during recording or transmission—what would be considered "native" progressive signals. However, where 24 fps film-based material is concerned, a 1080i encoded/transmitted stream can become a true "1080p" signal during playback by deinterlacing to re-combine the split field pairs into progressive film-scanned frames. Regarding 24 fps film-source material presented in conventional 1080i60 form, the deinterlacing process that achieves this goal is usually referred to as "3:2 pulldown reversal" [also known as "inverse telecine"]. The importance of this is that, where film-based content is concerned, all 1080-interlaced signals are potentially 1080p signals given the proper deinterlacing. As long as no additional image-degradation steps were applied during signal mastering (such as excessive vertical-pass filtering), the image from a properly deinterlaced film-source 1080i signal and a native-encoded 1080p signal will look approximately the same. It should be noted that Blu-ray Disc and HD DVD sources are 1080p with no vertical filtering, therefore, 1080i output from players can be perfectly reconstructed to 1080p with 3:2 pulldown reversal.

:LOL: 1080i gaming is not 1080i film. Not even close.
 
Swanlee said:
I'm running Oblivion in 1080p on the 360 right now.

Oblivion is one of the few games to render in 720p or 1080i depending on the mode chosen. I have the VGA cables and a native 1080p Westinghouse LCD. Oblivion runs at 30 fps; 1080i at 30 fps upscaled through VGA to 1080p on my native 1080p LCD = exactly the same PQ as native 1080p rendering at 30 fps.

Been there, done that a long time ago.

I don't remember that being the case at all. I played Oblivion with the dashboard set to 1080i and it was definitely 720p upscaled, as was every other 360 game I've played -- you could see the upscaling... it definitely wasn't real 1080i.

And, for clarification, if it were 1080i, the console would have to output 60 fields per second (not necessarily what the game is running at), or you'd get some disastrous 15 fps slideshow when the interlaced fields were combined. The game may run at 30 fps in progressive formats, but it still needs to send out two interlaced fields for each progressive frame. Also, unless they do some voodoo with interlaced rendering, it would be using a 1920x1080 frame buffer, which would make 1080p on PS3 possible, but it sounds unlikely that Oblivion renders at 1080 at all. Anyone else able to make a claim supporting Swanlee? My experience differs from his (see my first paragraph).
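
To put plain numbers on the field-rate point (simple arithmetic, nothing measured):

```python
# Field-rate arithmetic for a 1080i output. Illustrative numbers only.

output_fields_per_sec = 60   # a 1080i60 signal always carries 60 fields/s
game_fps = 30                # how often the rendered picture actually changes

# Each rendered frame has to be sent as two fields to keep the signal at 60.
print(f"fields per rendered frame: {output_fields_per_sec // game_fps}")   # 2

# If the console only produced 30 fields/s, weaving pairs back together
# would give 30 / 2 = 15 complete pictures per second -- the slideshow.
print(f"full frames from 30 fields/s: {30 // 2} per second")

# Rendering real 1080i still means rasterising into a 1920x1080 buffer
# (full frames are drawn, then alternating scanlines are sent per field).
print(f"1080 buffer: {1920 * 1080:,} px vs 720p: {1280 * 720:,} px")
```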
 