Image Quality and Framebuffer Speculations for WIP/alpha/beta/E3 games *Read the first post*

Oh-mi-god, please don't misread or quote me as misquoting joker as having said anything remotely similar to "PS3 is inferior to 360." The way I paraphrased him was essentially, "PS3 is inferior to the 360, comma, in terms of rendering transparencies."

Yes I know, which is why my comments were strictly confined to the area of transparencies and particles. The point I was making is that what joker's talking about isn't something that's going to be manifest as "particles in X 360 game look better than in Y PS3 game" since resolution is just one factor out of many that will contribute to overall quality. KZ2's particles look great for a variety of reasons despite being 1/4 resolution, and no doubt there are plenty of games with full-res particles that don't look nearly as good.
 
Your initial statement implied that if a KZ2 explosion were freeze-framed, its low resolution particle effects would be readily apparent. But to my eye, the duration and intensity of some of the explosions I linked to were neither "quick flashes" nor low res.

Your mileage will vary. To my eye it's painfully obvious, but I can appreciate that many won't notice. That's the main reason the reduced frame buffer for transparencies technique came to be, because it was determined that many would not be able to spot the difference, so we went for it! It's commonplace on PS3 titles now. I pushed it even further on a title I won't name and went with 1/16th size particle buffer on the PS3 version and I bet many didn't notice either.
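To put rough numbers on the savings joker describes (my own back-of-envelope arithmetic, using an illustrative 720p RGBA8 target, not figures from any actual game):

```python
# Back-of-envelope sketch: memory and fill cost of a downsized
# transparency buffer, assuming a 1280x720 RGBA8 (4 bytes/pixel) target.
# Numbers are illustrative, not from any shipped title.

def buffer_bytes(width, height, bytes_per_pixel=4, scale=1):
    """Size of a particle buffer downscaled by `scale` per axis."""
    return (width // scale) * (height // scale) * bytes_per_pixel

full      = buffer_bytes(1280, 720)            # full-res pass
quarter   = buffer_bytes(1280, 720, scale=2)   # "1/4 resolution" (2x2 smaller)
sixteenth = buffer_bytes(1280, 720, scale=4)   # "1/16th size" buffer

print(full // 1024, "KiB full res")        # 3600 KiB
print(quarter // 1024, "KiB at 1/4")       # 900 KiB
print(sixteenth // 1024, "KiB at 1/16")    # 225 KiB
# Fill-rate cost for blending scales the same way: each screen's worth of
# overdraw touches 1/4 or 1/16 as many pixels.
```

The fill savings, not the memory, is the real prize here, since every layer of blended overdraw multiplies the per-pixel cost.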


Statix said:
Why couldn't they simply take the performance hit during zoom-ins, and drop down to 50hz or what have you w/ v-sync? The last 60hz title I could recall playing was COD4, and even that game most definitely could not maintain a full 60hz framerate in many in-game scenes and situations.

I don't know if a few transparent elements in a scene causing a few dropped frames are a big enough problem to justify implementing a software scaling system in a game on a console that has no built-in scaler. That in itself introduces additional processing and additional memory requirements, since both a 630p and a 720p framebuffer have to be in memory at the same time for the software scaling to work.

Software scaling is dirt cheap compared to alpha blending. The hit on alpha blending on PS3 can be brutally high, measured in many milliseconds depending on what you are doing. As for why they did what they did, it's far easier to spot frame rate drops compared to a resolution drop. So if they were indeed dropping frames, then they made the right choice. I would be curious to know why though. Any Capcom insiders here? :)


Statix said:
I can't think of any real-life in-game applications, multiplatform or otherwise, where the 360 rendered higher-fidelity particles/transparencies than the PS3 (version).

Ironically it's easiest to spot it on PS3 1st party games, although they can get away with it since there is no other version to compare it to. On multiplatform games we do lots of voodoo to try and not make it blatantly visible between both versions. There are lots of games where the difference is visible though, but I'll leave that as an exercise for the reader :)
 
Yes I know, which is why my comments were strictly confined to the area of transparencies and particles. The point I was making is that what joker's talking about isn't something that's going to be manifest as "particles in X 360 game look better than in Y PS3 game" since resolution is just one factor out of many that will contribute to overall quality. KZ2's particles look great for a variety of reasons despite being 1/4 resolution, and no doubt there are plenty of games with full-res particles that don't look nearly as good.

2x2 smaller res particles look good if the source data is low frequency (colors vary smoothly from pixel to pixel with no hard edges). Smoke is a very good example of this, and most explosion textures are also rather low frequency. Rendering particles like these at 2x2 lower resolution with bilinear upscaling looks almost as good as full-res rendering. However, when rendering high-frequency transparencies such as sparks, electric arcs, and sniper scope beams, the 2x2 lower-resolution rendering starts to look bad (especially when the particles are small or far away from the camera).
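The low-frequency vs. high-frequency distinction can be shown with a toy 1D sketch (purely illustrative, no real rendering involved): downsample a signal 2x, upscale it back with linear interpolation, and measure the error. Smooth content survives; a one-pixel "spark" does not.

```python
# Toy 1D illustration: downsample 2x by averaging, upscale back with
# linear interpolation, compare against the original. A smooth gradient
# ("smoke") reconstructs well; a single bright pixel ("spark") gets
# halved and smeared. Purely illustrative.

def downsample2(sig):
    return [(sig[i] + sig[i + 1]) / 2 for i in range(0, len(sig), 2)]

def upsample2(sig):
    out = []
    for i in range(len(sig)):
        nxt = sig[i + 1] if i + 1 < len(sig) else sig[i]
        out += [sig[i], (sig[i] + nxt) / 2]   # linear interpolation
    return out

def max_error(sig):
    rec = upsample2(downsample2(sig))
    return max(abs(a - b) for a, b in zip(sig, rec))

smoke = [i / 15 for i in range(16)]            # smooth gradient (low freq)
spark = [0.0] * 7 + [1.0] + [0.0] * 8          # one bright pixel (high freq)

print(max_error(smoke))   # small error
print(max_error(spark))   # large error: the spark is halved and smeared
```

This is the same reason a thin sniper beam or electric arc falls apart at quarter resolution while a smoke plume doesn't: the downsample simply can't represent the high-frequency detail.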

We are rendering our particles at 2x2 lower resolution on Xbox 360. We do this because our particle system is designed to run a huge number of particles (10,000 depth-sorted particles per frame is a common case) and all our particles are per-pixel lit and real-time shadowed. However, we render the 2x2 lower-res particles using 4xMSAA to get perfect depth silhouettes and some additional softening. 2x2 lower res with 4xMSAA consumes the same amount of backbuffer bandwidth as full-res rendering. On Xbox 360, however, the backbuffer bandwidth is almost infinite because of the eDRAM, so using this technique the result is basically a pure 4x boost in particle fill rate.

All in all it's a developer choice. With 2x2 lower res you can render more particles and get nicer lighting for the particles for the same performance. However you lose some particle detail on high frequency particles.
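The bandwidth accounting sebbbi describes works out neatly in numbers (my arithmetic, using illustrative 720p figures):

```python
# Sketch of the sample accounting for "2x2 lower res + 4xMSAA": it touches
# the same number of backbuffer samples as full-res 1xAA, but shades/fills
# only 1/4 as many pixels. Illustrative 720p numbers.

W, H = 1280, 720

full_res_samples = W * H * 1                   # full res, no MSAA
low_res_samples  = (W // 2) * (H // 2) * 4     # 2x2 lower res, 4xMSAA
low_res_pixels   = (W // 2) * (H // 2)         # pixels actually filled

assert full_res_samples == low_res_samples     # same backbuffer sample traffic
print(full_res_samples // low_res_pixels)      # 4 -> ~4x particle fill boost
```

With the eDRAM absorbing the sample traffic essentially for free, the 4x reduction in filled pixels is where the speedup comes from.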
 
Software scaling is dirt cheap compared to alpha blending. The hit on alpha blending on PS3 can be brutally high, measured in many milliseconds depending on what you are doing. As for why they did what they did, it's far easier to spot frame rate drops compared to a resolution drop. So if they were indeed dropping frames, then they made the right choice. I would be curious to know why though. Any Capcom insiders here? :)
If the framerate drop can be counteracted by as slight a drop in rendering resolution as 720p to 630p, then I would assume that the overall framerate dip couldn't have exceeded ~10-15 fps in the first place. Having the zoom-in/close-up parts of the game run at 45-50 fps, or even 30 fps, wouldn't be that noticeable a drop to the average unwitting gamer, especially considering the camera cuts in such a swift and sudden fashion that said close-up shots are almost disconnected from the normal in-game action anyway.

On multiplatform games we do lots of voodoo to try and not make it blatantly visible between both versions. There are lots of games where the difference is visible though, but I'll leave that as an exercise for the reader :)
But surely there's at least some noticeable visual evidence of the quality difference in some multiplatform games? Otherwise, it wouldn't be as big a deal or developmental obstacle as you're making it seem (specifically as it relates to 360 versus PS3).
 
I'm getting a little confused though - aren't there two major different ways of handling transparency? And what is the effect again of deferred rendering on transparency?

I was always under the impression that at least part of the difference in handling transparency in multi-platform games is a result of two almost diametrically opposite techniques being optimal for 360 vs PS3.
 
RSX has "peak performance" and "realistic performance". There are many gotchas with RSX, meaning that hitting peak performance in a game product is unlikely. sebbbi mentioned some issues; there are many others as well. For example, half floats. To get RSX to perform you *must* use half-precision floats anywhere and everywhere possible. This helps it dual-issue commands and hence reduces a shader's pass count. Of course, half-precision floats aren't always enough precision for the task at hand, so you get forced to use regular floats, which hurts performance. Sometimes even just one float used in a shader can cut that shader's performance by a very measurable amount. On Xenos, you don't have to care.

The bandwidth I was referring to was frame buffer bandwidth for transparency lookup. PS3 is permanently handicapped in this area. All games suffer from it; even first party titles like KZ2 have very low resolution particles, explosions, etc. They are hard to spot in that game because explosions last just a fraction of a second, but if you could freeze-frame on an explosion pic you'd see how low res it really is. SF4, though, has particle effects that like to linger on screen, so they can't take the low-res route on those because it would look really bad. During an SF4 zoom-in sequence, the transparent particles could result in many screens' worth of overdraw, which RSX would struggle to maintain at 60fps. That might be why they drop the main render rez on a zoom.
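To see why "many screens' worth of overdraw" collides with a 60fps budget, here is a rough cost sketch. The throughput figure is hypothetical, chosen only to illustrate the shape of the problem, not a real RSX measurement:

```python
# Rough overdraw arithmetic with a HYPOTHETICAL blended-pixel throughput,
# purely to show why screens of alpha overdraw strain a 16.7 ms frame.

def blend_cost_ms(screens_of_overdraw, width=1280, height=720,
                  blended_pixels_per_ms=0.5e6):   # hypothetical throughput
    """Milliseconds spent blending `screens_of_overdraw` full screens."""
    pixels = width * height * screens_of_overdraw
    return pixels / blended_pixels_per_ms

for screens in (1, 4, 8):
    print(screens, "screens ->", round(blend_cost_ms(screens), 1), "ms")
# At this assumed rate, a handful of full screens of blended overdraw eats
# most of the 16.7 ms frame; at quarter resolution the same overdraw costs
# one quarter as much.
```

Whatever the real throughput number is, the cost scales linearly with overdraw, which is why a zoomed-in, particle-heavy shot is the worst case.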

What about these?

http://farm4.static.flickr.com/3401/3233057121_0da3d27cb6_o.jpg

http://farm4.static.flickr.com/3319/3233057039_6ff53593b5_o.jpg

http://farm4.static.flickr.com/3384/3233055989_41b7caa29a_o.jpg
 
This has gone nowhere. All based on speculative observation. "To my eye it looks like this!", "Well, to my eye it appears like this!" Nothing to back anything up.
 
This has gone nowhere. All based on speculative observation. "To my eye it looks like this!", "Well, to my eye it appears like this!" Nothing to back anything up.

Judging by the screenshots posted from some games here, it is blatantly obvious the effects are upscaled.
 
If the framerate drop can be counteracted by as slight a drop in rendering resolution as 720p to 630p, then I would assume that the overall framerate dip couldn't have exceeded ~10-15 fps in the first place. Having the zoom-in/close-up parts of the game run at 45-50 fps, or even 30 fps, wouldn't be that noticeable a drop to the average unwitting gamer, especially considering the camera cuts in such a swift and sudden fashion that said close-up shots are almost disconnected from the normal in-game action anyway.

Remember the flak EA got for Madden running at 30fps on PS3 compared to 60fps on 360? No one wants a repeat of that. I'd sooner drop the rez than drop the framerate that dramatically, even if just for that zoom sequence. I'd bet more would notice the fps drop than the resolution drop; people just aren't that rez sensitive. Anyways, we don't know how much the fps hit was. 720p->630p may seem small, like just a 12% or so drop, but it really depends on how many layers of transparency there are; the hit can be far more than 12%. I don't have the actual game so I have no clue exactly what they are doing, I'm just going by screenshots.
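The resolution-drop numbers are easy to check. The sub-720p width isn't stated in this thread, so the sketch below assumes width scales proportionally with height, which is only an assumption:

```python
# Checking the 720p -> 630p drop. The actual sub-res width is not stated
# here, so assume width scales with height (an assumption, not a fact).

full_w, full_h = 1280, 720
sub_h = 630
sub_w = round(full_w * sub_h / full_h)     # 1120 if scaled proportionally

vertical_drop = 1 - sub_h / full_h                       # 12.5% fewer lines
pixel_drop = 1 - (sub_w * sub_h) / (full_w * full_h)     # ~23.4% fewer pixels

print(f"{vertical_drop:.1%} fewer lines, {pixel_drop:.1%} fewer pixels")
# The per-frame saving on the transparency pass then multiplies by the
# number of blended layers, so the effective win grows with overdraw.
```

So the "12% or so" figure is the vertical drop only; counted in pixels it's closer to a quarter, before the overdraw multiplier even enters into it.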


Statix said:
But surely there's at least some noticeable visual evidence of the quality difference in some multiplatform games? Otherwise, it wouldn't be as big a deal or developmental obstacle as you're making it seem (specifically as it relates to 360 versus PS3).

There are definite visual differences, but in the end all that matters is whether the masses will notice. More often than not they don't. Look at past history on B3D. The game Fracture was much blurrier on PS3, yet people here didn't notice. GTA4 PS3 was blurrier, people didn't notice. Bioshock PS3 was blurrier, people didn't notice. If they can't see the obvious blur difference in those examples, are they really going to be able to spot downsized transparencies that flash briefly on screen? Probably not. Hence it's now widespread practice. In any case, it's not a development obstacle anymore. It was at launch, but now a downsized transparency pass is a standard part of PS3 best practices; all games that have a sizeable amount of transparency do it.


Moon Light Knight said:

Yeah that pic is a decent example of a downsized transparency buffer. It's a relatively low noise piece of art so they can get away with it, but you can see some artifacting as a result. Look at some of the areas where that explosion is in contact with Drake, like his neck and right tricep area. Do you see where it looks a bit more blocky in those areas? That's an artifact from blending back the downsized buffer to the full size main scene. There are ways to reduce that artifact though.


Arwin said:
I'm getting a little confused though - aren't there two major different ways of handling transparency? And what is the effect again of deferred rendering on transparency?

I was always under the impression that at least part of the difference in handling transparency in multi-platform games is a result of two almost diametrically opposite techniques being optimal for 360 vs PS3.

As far as I know all games do transparency the same way. Deferred doesn't make a difference; I believe KZ2 has a separate transparency pass after everything else. In the end the issue is the same. When you draw a transparent pixel you must look up the current color at that same pixel to blend against it, then write out the new color. It's the lookup part that is slow on PS3. If you have a small amount of it, then no problem. If you have a lot of it, or worse yet many screens' worth of it, then you need to go downsized. MGS4, Motorstorm 2, Resistance 2, KZ2, etc. all do it. Can you spot it? For most people I'd say probably not, so don't worry about it.
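The read-modify-write being described is standard "over" blending, shown here in scalar form. The expensive step on RSX is the destination fetch, not the math itself:

```python
# Standard "over" alpha blending in scalar form: the destination pixel
# must be READ before the result can be written. That framebuffer read is
# the slow part being discussed, not the arithmetic.

def blend_over(src_rgb, src_alpha, dst_rgb):
    """dst = src*a + dst*(1-a), applied per channel."""
    return tuple(s * src_alpha + d * (1 - src_alpha)
                 for s, d in zip(src_rgb, dst_rgb))

# 50% grey smoke over a red background pixel:
print(blend_over((0.5, 0.5, 0.5), 0.5, (1.0, 0.0, 0.0)))
# -> (0.75, 0.25, 0.25)
# With N overlapping particles this read/blend/write happens N times per
# pixel, so overdraw depth, not particle count, sets the cost.
```

This also shows why the blend is order-dependent, which is why the 10,000 particles mentioned earlier have to be depth sorted before drawing.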

This has gone nowhere. All based on speculative observation. "To my eye it looks like this!", "Well, to my eye it appears like this!" Nothing to back anything up.

I word it that way to try and play nice :) But the reality is that they are obviously downsized transparency effects; it couldn't be any more obvious to me. If you can't tell, then no big deal, just enjoy the game. Make no mistake though, they are reduced sizes. Incidentally, it was Sony that recommended the downsized transparency pass to other developers at a PS3 Devcon ages ago, as a way for us multiplatform devs to cope with the PS3's lack of eDRAM.
 
Wow, I never noticed while playing, but those ARE really low resolution...

Huh? They look high res to me. That is why I posted them. In Uncharted there is a super slow motion option. You can use it to take a look at the particle effects in Uncharted. They seem very well done compared to most games this gen.
The image is from here.

The Article links to some videos of the particle effects between the two games. Here is one from Uncharted.
http://www.youtube.com/watch?v=0zHn3BCpNps&feature=channel_page

http://farm4.static.flickr.com/3384/3233905146_84404c95aa_o.jpg
 
I word it that way to try and play nice :) But the reality is that they are obviously downsized transparency effects; it couldn't be any more obvious to me. If you can't tell, then no big deal, just enjoy the game. Make no mistake though, they are reduced sizes. Incidentally, it was Sony that recommended the downsized transparency pass to other developers at a PS3 Devcon ages ago, as a way for us multiplatform devs to cope with the PS3's lack of eDRAM.

So developers are not using the eDRAM for free 4xAA as originally intended? Or have they found a way to use that 10MB for both free 4xAA and particle effects at the same time?
 
The 10 MB of eDRAM is probably not enough for what it's intended for; that's why the original requirement that all 360 games run at 720p with 4xAA never happened, even from launch day.
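The capacity squeeze is simple arithmetic, assuming the usual 4-byte color plus 4-byte depth per sample for a Xenos-style framebuffer:

```python
# Why 10 MB of eDRAM is tight for 720p + 4xAA, assuming 4-byte color plus
# 4-byte depth per sample (8 bytes total) for the framebuffer.

def framebuffer_mb(width, height, msaa, bytes_per_sample=8):
    """Color+depth framebuffer footprint in MB for a given MSAA level."""
    return width * height * msaa * bytes_per_sample / (1024 ** 2)

print(round(framebuffer_mb(1280, 720, 1), 2))   # ~7.03 MB: fits in 10 MB
print(round(framebuffer_mb(1280, 720, 4), 2))   # ~28.12 MB: does not fit
# At 720p 4xAA the buffers need ~28 MB, so the scene has to be rendered in
# eDRAM-sized tiles with the geometry re-submitted per tile -- a real cost,
# which is one reason the 720p/4xAA mandate didn't stick.
```

So "free" 4xAA was only free in bandwidth terms; the capacity side forces tiling, and that is the cost developers opted out of.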
 
Just a little break from SF4: any update on Star Ocean 4? Is it really 900x512, lower than the WWE game on PS3? How about the AA? And what was the resolution for Infinite Undiscovery?
 
Can someone have a stab at what it renders the map/overworld/walking-around bit at?

http://image.com.com/gamespot/images/2009/041/946860_20090211_screen011.jpg

Just a little break from SF4: any update on Star Ocean 4? Is it really 900x512, lower than the WWE game on PS3? How about the AA? And what was the resolution for Infinite Undiscovery?

I asked about that shot because it looks higher than 900x512. So :p now that we have gotten that KZ2 mess out of the way, how about it, anyone feel like it? :p
 
KZ2 is actually the first game to solve the low-res particle issues that PS3 games have had for a long time.

http://www3.telus.net/public/dhwag/particles_WAW_PS3.jpg

http://www3.telus.net/public/dhwag/particles_WAW_360.jpg


These shots are from COD: WaW, and the difference is painfully obvious; this is probably why some people feel that the PS3 version is running at a lower res.


http://www3.telus.net/public/dhwag/particles_KZ2.jpg

Now this one's a similar shot from KZ2; no visible aliasing at all.


http://image.com.com/gamespot/images/2008/127/928377_20080507_screen003.jpg

This one's from the earlier development shots, in which the low res particles are clearly evident.


http://www3.telus.net/public/dhwag/particles_KZ2b.jpg

Such artifacts are no longer present in the demo, probably thanks to an additional filtering pass.

IMO, this certainly is an excellent solution, and I definitely look forward to seeing more devs use it ;)
 
But surely there's at least some noticeable visual evidence of the quality difference in some multiplatform games? Otherwise, it wouldn't be as big a deal or developmental obstacle as you're making it seem (specifically as it relates to 360 versus PS3).
Try checking out GRAW 2.
 
Moderators may wish to delete this post later, but I came across this article a few minutes ago and it seemed odd that the explanations alongside the screens are so similar to the discussion here.

http://www.gamezine.co.uk/news/game-types/fighting-games/street-fighter-iv-ps3-sub-hd-in-close-up-$1270459.htm

Any of you gents happen to be the author here? I really don't like plagiarism.
 
Moderators may wish to delete this post later, but I came across this article a few minutes ago and it seemed odd that the explanations alongside the screens are so similar to the discussion here.

http://www.gamezine.co.uk/news/game-types/fighting-games/street-fighter-iv-ps3-sub-hd-in-close-up-$1270459.htm

Any of you gents happen to be the author here? I really don't like plagiarism.


those came from this

http://d.hatena.ne.jp/yoda-dip-jp/archive?word=*[Game+Compare]

and I think we should all give KZ2 and SF4 a rest...
 