HD problems in Xbox 360 and PS3 (Zenji Nishikawa article @ Game Watch)

one

http://www.watch.impress.co.jp/game/docs/20060426/3dhd.htm

Zenji Nishikawa has uploaded the latest installment of his article series on 3D game technologies. This one covers concerns about sub-HD rendering on the next-gen consoles. Since most of the issues Nishikawa explains have already been discussed on this forum, I'll just summarize them. The article also contains anonymous developer quotes, all of which I've translated below.

-----------------------------------------------------------------------------
  • The RAM bandwidth of the Xbox 360 GPU is almost equal to that of a RADEON X1600 XT, and it's shared with the CPU via UMA.
  • Without the eDRAM pixel processor doing 4xMSAA, the fill rate of the GPU core itself is 4 billion texels/sec, almost equal to a GeForce 7600 GT.
  • While the Xbox 360 has 3.5 times the memory bandwidth of the original Xbox, 720p has 3 times the pixels and so needs roughly 3 times the bandwidth. That leaves only about 0.5x of headroom, which is insufficient for the multiple texture lookups complex shaders perform.
  • eDRAM is implemented to mitigate the impact of the low memory bandwidth, but FP10 + 2xMSAA at 720p overflows it and requires Predicated Tiling (see the footprint sketch after this list).
  • Tile rendering has many performance demerits:
    • In games with many characters, like N3, the cost of overlapped geometry grows large unless LOD is implemented.
    • Lens effects, refraction, HDR effects such as bloom and glare, and other frame-buffer filtering cause overlapped drawing near tile boundaries.
    • Objects that cross tile boundaries can't use the cache efficiently.
    • CPU L2 cache locking is practically unusable.
  • Since textures are stored in the shared 512MB RAM, texture lookups consume shared memory bandwidth regardless of the eDRAM size or the use of tile rendering. Normal mapping and shadow mapping require many texture lookups.
  • So the last resort is to use the display controller to upscale the image and skip tile rendering entirely, for example by rendering FP10-32bit / 960x540 / 2xMSAA / 32bit Z (8MB).
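
To make the eDRAM arithmetic above concrete, here's a quick back-of-the-envelope sketch. The 10MB eDRAM figure, the resolutions, and the formats are from the article; the byte counts (FP10 and RGBA8 at 4 bytes/pixel, Z at 4) and the little helper itself are my own illustration, so treat it as a sketch rather than how the hardware actually allocates:

```python
# Rough eDRAM footprints for the configurations mentioned in the article.
# Assumed byte counts: FP10 (10:10:10:2) and RGBA8 = 4 bytes/pixel,
# 32bit Z = 4 bytes/pixel; MSAA multiplies both color and Z storage.

EDRAM_BYTES = 10 * 1024 * 1024  # Xenos eDRAM: 10 MB

def footprint(width, height, msaa=1, color_bytes=4, z_bytes=4):
    """Bytes of eDRAM needed for color + Z at the given MSAA level."""
    return width * height * msaa * (color_bytes + z_bytes)

configs = {
    "1280x720 FP10 2xMSAA (why tiling is needed)": footprint(1280, 720, 2),
    "960x540  FP10 2xMSAA (the 'last resort')":    footprint(960, 540, 2),
    "880x720  FP10 2xMSAA (Developer B, below)":   footprint(880, 720, 2),
    "640x480  FP10 4xMSAA (Developer D, below)":   footprint(640, 480, 4),
}

for name, size in configs.items():
    fits = "fits" if size <= EDRAM_BYTES else "does NOT fit"
    print(f"{name}: {size / 2**20:.2f} MB -> {fits} in 10 MB eDRAM")

# And the bandwidth bullet: 720p really is 3x the pixels of 640x480.
print(f"pixel ratio 720p vs 480p: {1280 * 720 / (640 * 480):.1f}x")
```

The numbers land close to the article's (roughly 8MB for 960x540 and just under 10MB for Developer B's 880x720 below), with 720p at 2xMSAA coming out around 14MB, hence the tiling requirement.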
Developer A: Even 2xMSAA is not required by Microsoft anymore.
Developer B: FP10-32bit / 880x720 / 32bit Z / 2xMSAA (9.9MB) rendered to look right when upscaled to 16:9 is also possible.
Developer C: You can render at some low resolution, then create the 720p frame for display with your own shader. While converting the low-res frame into a 720p frame in the shader you can apply color dithering, which may give smoother color expression and alleviate the precision deficiency of FP10 (a rough sketch of this idea follows these quotes).
Developer D: At any rate, I want to reduce jaggies. Since the eDRAM pixel processor is penalty-free up to 4xMSAA, it will be interesting to see it fully exploited. Though without tile rendering that means 640x480 with 4xMSAA and FP10-32bit, aliasing-free images will look totally different from what we've seen in older games.
Developer E: If you take HDR rendering as a premise, the PS3 is worse off than the Xbox 360.
  • Since the PS3 doesn't support an FP10-32bit buffer, FP16-64bit HDR requires twice the bandwidth of the Xbox 360, and the PS3 has no eDRAM to mitigate the impact. It's possible that the pseudo-HDR employed on Xbox and in DX8 titles, which uses a conventional 32bit buffer (8bit int per ARGB channel), will often be used on the PS3. The display controller may also be used to upscale sub-HD images to an HD resolution.
Developer F: As for resolution, I think modest is OK. Since the RSX in the PS3 is a shader monster, adding more information to each pixel by executing ultra-advanced shaders and then antialiasing it completely should make it look more real. I'd give priority to the realism packed into one pixel rather than to HD resolution.
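
Developer C's idea (render low-res, build the 720p frame yourself, and dither during the conversion) can be sketched outside of shader code. Here's a minimal illustration in Python/NumPy, assuming a plain bilinear upscale and a standard 4x4 Bayer ordered dither; the article doesn't specify either, so these are stand-ins for whatever filter and dither pattern a developer would actually pick:

```python
import numpy as np

# 4x4 Bayer ordered-dither thresholds, scaled to +/- half an 8-bit step.
BAYER4 = (np.array([[ 0,  8,  2, 10],
                    [12,  4, 14,  6],
                    [ 3, 11,  1,  9],
                    [15,  7, 13,  5]]) / 16.0 - 0.5)

def upscale_and_dither(src, out_w=1280, out_h=720):
    """Bilinearly upscale a float image (h, w, 3) and dither down to 8-bit."""
    in_h, in_w, _ = src.shape
    # Source sample positions for each destination pixel (bilinear).
    ys = (np.arange(out_h) + 0.5) * in_h / out_h - 0.5
    xs = (np.arange(out_w) + 0.5) * in_w / out_w - 0.5
    y0 = np.clip(np.floor(ys).astype(int), 0, in_h - 2)
    x0 = np.clip(np.floor(xs).astype(int), 0, in_w - 2)
    fy = np.clip(ys - y0, 0.0, 1.0)[:, None, None]
    fx = np.clip(xs - x0, 0.0, 1.0)[None, :, None]
    a = src[y0][:, x0];     b = src[y0][:, x0 + 1]
    c = src[y0 + 1][:, x0]; d = src[y0 + 1][:, x0 + 1]
    up = (a * (1 - fx) + b * fx) * (1 - fy) + (c * (1 - fx) + d * fx) * fy
    # Ordered dither: add a sub-LSB threshold before quantizing, trading
    # banding for noise (the "smooth color expression" the developer means).
    noise = np.tile(BAYER4, (out_h // 4 + 1, out_w // 4 + 1))[:out_h, :out_w, None]
    return np.clip(np.round(up * 255 + noise), 0, 255).astype(np.uint8)

frame_540p = np.random.rand(540, 960, 3).astype(np.float32)  # stand-in frame
frame_720p = upscale_and_dither(frame_540p)
print(frame_720p.shape)  # (720, 1280, 3)
```

The dither is the interesting part: quantizing a low-precision source to the display naturally bands, and the threshold pattern smears that banding into noise the eye tolerates better.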
 
Cheers one!

Edit - To be honest, I did expect MS to drop the internal HD rendering restriction...
 
Thanks, one.

The most interesting thing to me was "CPU L2 cache locking is practically unusable." I would like some more info on that (unless they're referring to the procedural synthesis stuff for which the CPU thread will obviously have to regenerate the geometry for each tile).
Also, I do understand the remarks about transforming geometry straddling tile boundaries more than once in the cases mentioned, but we're talking huge tiles here (4 max?), and they make it sound like the tiles are 64x64 or something.
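
For what it's worth, the tile sizes implied by the article's numbers back this up. A quick calculation, assuming 10MB of eDRAM and 8 bytes per sample for color + Z (the even horizontal split is my simplification of how the tiles would be laid out):

```python
import math

EDRAM_BYTES = 10 * 1024 * 1024  # assumed 10 MB of eDRAM

def tiles_needed(width, height, msaa, bytes_per_sample=8):
    """Predicated tiles needed if color + Z cost 8 bytes per sample."""
    frame_bytes = width * height * msaa * bytes_per_sample
    return math.ceil(frame_bytes / EDRAM_BYTES)

for msaa in (2, 4):
    n = tiles_needed(1280, 720, msaa)
    print(f"720p {msaa}xMSAA: {n} tiles of roughly 1280x{math.ceil(720 / n)}")
```

That comes out to 2 tiles (about 1280x360 each) at 2xMSAA and 3 tiles at 4xMSAA, i.e. a few screen-wide bands, nothing remotely like 64x64.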

Looking forward to 1080p on the PS3... ;)
 
london-boy said:
So much for "HD consoles" in the "HD Era".... More like "Upscaling consoles"...

That talk is about the X360; the PS3 should handle it just fine (but w/o anti-aliasing).

Different systems, different tradeoffs: the XBox360 will provide "good" HDR + MSAA at a not-so-HD resolution, while the PS3 will give you HD resolution but with lower MSAA or HDR quality...
 
It also explains that 1080p thing... If the PS3 is going to upscale sub-HD renders anyway, it might as well upscale all the way to 1080p... Nice...:rolleyes:
 
Ingenu said:
That talk is about the X360; the PS3 should handle it just fine (but w/o anti-aliasing).

Different systems, different tradeoffs: the XBox360 will provide "good" HDR + MSAA at a not-so-HD resolution, while the PS3 will give you HD resolution but with lower MSAA or HDR quality...

We don't know that, really. In the end I'm sure a developer or two will find it very convenient to lower the res just enough to save that last bit of performance, knowing the output will be upscaled anyway...
 
Thankfully I don't care, being on an SDTV, and I think buckets of AA at SD res will suit me fine. Give me TV-quality photorealism at SD resolution over PC-quality 2000x1500 any time :D

Also, HD resolution isn't ruled out by this talk; only HD with AA seems unlikely. That just means the jump comes in either resolution or antialiasing, which seems fair. Expecting absolutely everything to improve in quality over last gen is asking a bit much. And they don't appear to be considering clever techniques versus standard brute-force methods. I'll cite NAO32 of course as an example of HD with AA, and I'm sure XB360 has tricks it can pull to improve things.
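
On NAO32: it has been publicly described on this forum as storing HDR color in a LogLuv-style encoding inside an ordinary 8:8:8:8 buffer, which is what lets it coexist with MSAA at HD resolution. A rough Python sketch of that kind of encoding follows; the matrix and the 16/8/8 bit split follow the commonly circulated LogLuv shader, but the helper names and details here are my illustration, not the shipped implementation:

```python
import numpy as np

# RGB -> intermediate matrix from the widely posted LogLuv shader code
# (rows chosen so the ratios below give chromaticity-like values in [0,1]).
M = np.array([[0.2209, 0.3390, 0.4184],
              [0.1138, 0.6780, 0.7319],
              [0.0102, 0.1130, 0.2969]])

def logluv_encode(rgb):
    """Pack one linear HDR RGB value into 4 bytes: (u, v, Le_hi, Le_lo)."""
    xp, y, zp = np.maximum(rgb @ M, 1e-6)
    u = int(round(255 * xp / zp))            # 8-bit chromaticity
    v = int(round(255 * y / zp))             # 8-bit chromaticity
    le = float(np.clip(2.0 * np.log2(y) + 127.0, 0.0, 255.0))
    hi = int(le)                             # high 8 bits of log-luminance
    lo = int(round(255 * (le - hi)))         # low 8 bits of log-luminance
    return u, v, hi, lo

def logluv_decode_luminance(hi, lo):
    """Recover luminance Y from the two log-luminance bytes."""
    return 2.0 ** (((hi + lo / 255.0) - 127.0) / 2.0)

# Round-trip check on an HDR value well above 1.0:
u, v, hi, lo = logluv_encode(np.array([4.0, 2.0, 1.0]))
print(u, v, hi, lo, logluv_decode_luminance(hi, lo))
```

The appeal over FP16 is that color stays 32 bits per pixel, so bandwidth and MSAA behave as they would for an ordinary RGBA8 target; the cost is encode/decode ALU work, and alpha blending in a log space isn't straightforward.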

It'd be nice to have official confirmation on the AA mandates for XB360. Have they really dropped the mandatory AA?
 
Ingenu said:
That talk is about the X360; the PS3 should handle it just fine (but w/o anti-aliasing).

Different systems, different tradeoffs: the XBox360 will provide "good" HDR + MSAA at a not-so-HD resolution, while the PS3 will give you HD resolution but with lower MSAA or HDR quality...
What's the use of having HD if your AA is sub-par?
 
Alpha_Spartan said:
What's the use of having HD if your AA is sub-par?
Because it looks better on HD screens? Would you rather play a PC game at 640x480 with 4xAA, or the same game at 1024x768 with no AA?
 
Ingenu said:
while the PS3 will give you HD resolution but with lower MSAA or HDR quality

If you're referring to what the article calls "pseudo-HDR" techniques: as far as dynamic range alone goes, that needn't be the case at all, though tradeoffs elsewhere may be introduced.

And obviously in both cases, and for 360 too, while this article may try to indicate possibly typical trends (?), it's ultimately a case-by-case thing that is dependent on the game and the developer.
 
Do you guys think there's any truth to the rumours that Japanese developers are behind the curve with the new consoles because they're unfamiliar with the PC-based GPUs that both new consoles use?
 
It's hard to tell how much effort they've made to date, with some notable exceptions obviously, though I hope we'll see "proper efforts" as of E3. You can be sure some of them have been R&Ding with PC GPUs in preparation for next-gen before kits for the systems became available. And it's not like every Western developer has a tonne of PC experience either. Look at someone like Factor5 - console-only, pretty much, but they seem to be doing wonderful things on next-gen hardware.

In the end, if you've a solid understanding of the fundamentals, and excellent application skill, that'll probably translate to any hardware and you'll probably flourish even if there's some "get to know you time". In terms of accessibility, also, GPUs with high level shading languages and the like are probably a lot easier to get to grips with than things like an EE was for them at the time ;) So no, I'm not sure if they're at a particular disadvantage, at least for the more talented ones.

edit - I mean, to concretise that, the first developer known to be upscaling from a non-HD resolution wasn't Japanese (!)
 
scooby_dooby said:
Do you guys think there's any truth to the rumours that Japanese Developers are behind the curve with the new consoles because they are unfamiliar with the PC-based GPU's that both the new consoles use??

Boggle.

Despite the mountain of information on the PS3 rendering architecture that has been publicly available for the past year...
 