"No 1080p For PS3 Games"

At a price...

Only if Cell is the limit, and worryingly, it seems that so far it is, for whatever reason.

Maybe (almost certainly) the devs aren't pushing the parallel architecture a lot at the moment, which means the Cell output is less than what it would be if it were used "properly".

I think it's normal, and the funny thing is that because of this, I think 1080p will become less and less frequent as time goes by, as the PS3 becomes less Cell-bound. It really always depends on the type of game. I'm sure Sony will eventually boast that there are hundreds of 1080p games on PS3 without telling us that 90% of those are pinball and silly Japanese mahjong games...
 

Why is Cell the limit on 1080p? Isn't memory bandwidth the issue?
 

Well, I'm just guessing here :smile: You know, if it ain't RSX, it probably is Cell.
The point is that if developers can push higher resolutions like 1080p, it means that most probably RSX has time to spare (waiting for data to come from Cell, so it could also be a bandwidth issue like you say). So instead of leaving the RSX idle, they just go for 1080p.

It's actually a bit more complicated than that, but you get the drift. Either that or RSX has a lot more fillrate than we thought - which so far doesn't seem to be the case, as we know a lot about it.

Personally I don't think it's a Cell<-->RSX bandwidth issue, as there is one big fat pipe between those two chips.

This is all just my opinion from the info we have around here; I'm sure real developers could explain this much better than I can.
 

I think the devs are going for 1080p because Sony is pushing them to do so. To differentiate from the X360, obviously, no matter the cost.
The first and obvious price is losing several megabytes of memory that could have been used for more textures, sound effects or whatever. 1280*720p + 32 bit Z/stencil takes up about 7 MB, 1080p needs 16 MB. Add 4x AA and it needs 64 MB. Use some secondary render targets and you're spending 100 MB.
And then there's the fillrate, bandwidth and shading power all going to waste. And the higher resolution will also make the texture pixels more obvious.

So I'm sorry, but I just can't agree with you. The price is there on the graphics side, and it seems foolish to me to waste a lot of these already precious resources just for bragging. MGS4 seems to have gotten a little downgrade in the process, as a prime example... I'm quite sure it's just marketing at play again, the same way it has been with the CPU designs (more FPU power at the cost of everything else).
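Those framebuffer figures are easy to sanity-check. A quick sketch of the arithmetic, assuming 32-bit color plus 32-bit Z/stencil, with MSAA multiplying both (the `framebuffer_mb` helper is just for illustration, not any real SDK call):

```python
# Hypothetical helper: bytes per pixel for color + Z/stencil,
# multiplied by the multisample count, converted to MiB.
def framebuffer_mb(width, height, color_bytes=4, zstencil_bytes=4, msaa=1):
    return width * height * (color_bytes + zstencil_bytes) * msaa / (1024 * 1024)

print(round(framebuffer_mb(1280, 720)))          # ~7 MB: 720p color + 32-bit Z/stencil
print(round(framebuffer_mb(1920, 1080)))         # ~16 MB: 1080p color + Z/stencil
print(round(framebuffer_mb(1920, 1080, msaa=4))) # ~63 MB: 1080p with 4x multisampling
```

The 4xAA case lands at roughly the quoted 64 MB; FP16 color would double the color term again.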
 
Those are really worst-case and highly improbable figures. 4xAA at 1080p? Who's gonna do that?! The probable consumption is about 2x what a 720p buffer uses, and if you add AA to your 720p buffer and not your 1080p buffer, not even that. If you're using a hundred megabytes for the 1080p version of a game, you'd be using something like 75 MB for that same game at 720p, whereas you make it sound like 100 MB for 1080p and 7 MB for 720p!

The RAM cost is, by my dodgy guesswork, maybe an extra 20 MB. Texture resolution can stay the same as 720p. Perhaps compress things a little more or lose a little geometry detail to free up that RAM? The concern as I see it is shader requirements and bandwidth getting gobbled, especially with overdraw. This would leave less available for texturing and shading per pixel. If you can provide procedural textures direct from Cell to RSX (dunno how feasible that is), you could gain an extra 20 GB/s for fine-grain texturing.

I don't think it's so black and white. I'd have to see the difference between 720p-optimized rendering and 1080p-optimized rendering, which will be hard to compare as the same game won't come with optimizations for both. We'll just need to compare lots of different 720p and 1080p games and see a general difference. But on a 1080p set the improved clarity might be worth it. Don't PC gamers generally choose higher resolutions with less AA over lower resolutions with more AA?

Of course 1080p sets are a minority among current HDTV owners, but it seems to me 1080p is something of a standard for new buyers. That needs figures to support it, but if new HDTV sales over the next 5 years are mostly 1080p, there's reason to support them from day one. If you only support the current 720p standard and future buyers decide on 1080p sets, you've missed out on providing for that. I'm sure anyone buying a 1080p set would like media to view on it!
 
Shame there still isn't any word on whether 1080p games on PS3 will always render at 1080p then scale down for the vast majority of people without 1080p TVs (giving some very nice SSAA in the process), or whether it will render at a lower resolution entirely, like the 360 does sometimes.
Guess we won't know for a while.
 
They'd better flippen' do! That OS SPE and 96 MB of OS RAM ought to be doing something useful with a game...

If they render the games for SD sets at SD resolutions with no AA, it'd be a joke, wasting 50%+ of the available graphics performance. It's such a crazy, ridiculous concept, totally gimping the 80+% of gamers in the world for the sake of a bog-standard bilinear downsample step, that it makes me grumpy.
 

Well, we know that Lair is rendering at 1080p. Eggbrecht confirmed that in an IGN interview. So that's at least one game.
 

Yeah, I know that. :smile: The question is:

Will the games be rendered internally always at the same resolution, or will 480p output mean that PS3 will render internally at 480p?

If it always renders at 720p or 1080p, it means that 480p users will get some really nice free SSAA thrown in. That would definitely be nice. Heck, that eliminates even the HDR-AA issue with NVIDIA GPUs.
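That "free SSAA" is just supersampling: render more samples than the output needs and average down. A toy NumPy sketch of the idea, illustrative only (nothing here is PS3-specific, and real video scalers handle non-integer ratios with bilinear filtering rather than a clean 2x box filter):

```python
import numpy as np

# Ordered-grid supersampling in miniature: render at 2x the output
# resolution, then box-filter each 2x2 block down to one output pixel.
def downsample_2x(image):
    h, w, c = image.shape
    return image.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

hires = np.random.rand(1080, 1920, 3)   # stand-in for a 1080p render
lores = downsample_2x(hires)            # 4 samples averaged per output pixel
print(lores.shape)                      # (540, 960, 3)
```

Every output pixel is the average of four rendered samples, which is exactly why the downscaled image looks antialiased for free.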
 

Heh, that's what people were expecting from the 360, and it's obviously the most logical solution, but as we've seen, it wasn't the case 100% of the time. Hopefully Sony will decide on the most logical solution, considering SD users are still the vast majority.
 

If they're not improving the frame-rate by rendering at a lower resolution, they'd better be rendering at the higher one with subsequent downscaling, otherwise they will feel my wrath. :devilish:
 
Those are really worst-case and highly improbable figures.

Not really. I haven't even gotten into FP16 buffers.

I've only added framebuffer memory requirements to put things into perspective. The single 720p framebuffer + Z-buffer is the simplest case; but if the engine starts to use HDR it'll probably need more than a single buffer, and may even require FP16.

4xAA at 1080p? Who's gonna do that?!

I recall it being mentioned for Warhawk...

The RAM cost is, by my dodgy guesswork, maybe an extra 20 MB.

No, the RAM cost is rather more proportional and depends on the way the given game's renderer works. The framebuffer requires 2.25x the memory, although some additional buffers for effects may work at a lower res as well.

I don't have exact data on any game engines, but we know that UE3 does some sort of deferred rendering; that Heavenly Sword uses a lot of different buffers and composites them together in the pixel shader; and that Halo 3 uses two buffers just to do HDR. Next gen games are far more complex than what we've seen so far; they need a lot of memory just for the rendering, and the requirements increase proportionally with the screen resolution.
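The 2.25x figure is just the ratio of pixel counts between the two resolutions, so any buffer allocated per pixel scales by the same factor:

```python
# 1080p has 2.25x the pixels of 720p, so per-pixel buffers scale by 2.25x.
ratio = (1920 * 1080) / (1280 * 720)
print(ratio)  # 2.25
```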

Texture resolution can stay the same as 720p. Perhaps compress things a little more or lose a little geometry detail to free up that RAM?

20 MB already accounts for 5 uncompressed 1024*1024*32 textures. Using 4:1 compression, that's already 20 textures you're giving up... and it can easily take more than 20 MB to render at 1080p, depending on the engine.
20 MB of geometry is also quite a lot to just throw out of a game level. And don't forget that some of that 512 MB of memory will be reserved for the OS, too.
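The texture arithmetic above works out like this (assumed sizes, purely for illustration):

```python
# One 1024x1024 texture at 32 bpp, uncompressed.
texture_bytes = 1024 * 1024 * 4           # 4 MiB per texture
budget = 20 * 1024 * 1024                 # the ~20 MB extra that 1080p can eat
print(budget // texture_bytes)            # 5 uncompressed textures
print(budget // (texture_bytes // 4))     # 20 textures at 4:1 compression
```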

If you can provide procedural textures direct from Cell to RSX (dunno how feasible that is) you could gain an extra 20 GB/s for fine-grain texturing.

Don't get me started on procedural textures again ;)
 
No, the RAM cost is rather more proportional and depends on the way the given game's renderer works. The framebuffer requires 2.25x the memory, although some additional buffers for effects may work at a lower res as well.
We need something of a study to try and pin down the costs.

20 MB already accounts for 5 uncompressed 1024*1024*32 textures. Using 4:1 compression, that's already 20 textures you're giving up... and it can easily take more than 20 MB to render at 1080p, depending on the engine.
20 MB of geometry is also quite a lot to just throw out of a game level. And don't forget that some of that 512 MB of memory will be reserved for the OS, too.
It's definitely using resources, for sure. The question is whether the increase in fidelity is worth it. Some say no, others might say yes. I haven't seen enough of 720p and 1080p games to comment. In other words, will that loss of 20 textures be something gamers notice more than the increased resolution? It's always been a balancing act between resolution and content, and I think a lot of decisions are made without proper investigation because that'd be too costly. A middle-of-the-road stance sounds sensible, but maybe when users get to see the top-res solution, they'll decide they prefer it?

Don't get me started on procedural textures again ;)
Hah! You're a Luddite. You just know in a couple of years all you artists will be out of a job as procedural content creates everything on the fly :p
 
Get out of here

I think the devs are going for 1080p because Sony is pushing them to do so...

Sure, pal. Big mean ol' Sony is killing the industry, am I right?

RR7 looks great and runs at a smooth 60 fps. So does Virtua Tennis 3.

But mean ol' Sony is making them do something they don't want to do, right? :rolleyes:
 
The above post is exactly what this forum doesn't need.

I don't necessarily agree with Laa-Yosh but his post was totally acceptable, that's his opinion and he laid it out in a very nice manner. Just because you feel defensive about it doesn't mean you can tell people to get out of here. If anything, you should get out of here, after that silly reply.

Reputation given accordingly.
 
Well, my bad, and I apologize, but it irks me that he (of all people, knowing the position he holds) would say something like that.

Sony is making EA make 'SKATE' in 1080p? Sony is making Namco make RR7 in 1080p, making Sega create VT3 in 1080p? I mean, come on, it doesn't make sense.

Why would he say something like that? It's not like those games look bad. And it's also not like those games had to sacrifice a lot to get running in 1080p.

Isn't it possible that the devs noticed that their game ran really well, tried the 1080p resolution, and saw that they could make it work? What's wrong with that?
 

You don't know, I don't know, he doesn't know.

He can have his opinions like you have yours.

Stop being so defensive, because in this case you're the one looking like the troll for attacking him like that, not him for simply stating what he thinks.
 

Like I said, I apologize for the attack. But which is more likely to be true? Sony making certain games 1080p "no matter what the cost", like Laa-Yosh says, or the Sega, F5, Namco, and EA devs noticing that the "cost" wasn't that bad and the games would run nicely?
 