HD problems in Xbox 360 and PS3 (Zenji Nishikawa article @ Game Watch)

Mefisutoferesu said:
:???:

It's probably going to piss everyone off and you can beat me for it if you feel the need, but hearing all this stuff I can't help but think that the 360 really was originally intended to have games run at 480p and not 720p. I know, I know, but if you're gonna rip me apart and tell me I'm wrong, please, do explain why I'm wrong for thinking this way.

360 originally only had 256MB of UMA!
 
Question

Dave Baumann said:
Given that this was bulleted in the context of tiling, I think this was already known. I forget the tarty name MS gave to the principle of rendering vertex data on the CPU and streaming it directly to Xenos, but this is known not to work with tiling - the processing needs to be recalculated per tile.

So is it possible but slow, because the vertex processing is repeated for every tile, or is it not possible at all? If it is possible but only too slow for a good frame rate (because a 3-tile frame needs 3x the CPU vertex processing), then why not reduce the amount of vertex processing (fewer vertices?) so the frame rate can stay fast? Thank you.
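For context on where these tile counts come from, here is a back-of-the-envelope sketch, assuming Xenos' commonly quoted 10MB of eDRAM and 8 bytes per sample (32-bit colour plus 32-bit Z); these constants are the usual public figures, not an official formula:

```python
import math

EDRAM_BYTES = 10 * 1024 * 1024  # Xenos embedded framebuffer memory

def tiles_needed(width, height, bytes_per_sample, aa_samples):
    """How many eDRAM tiles a framebuffer configuration requires."""
    fb_bytes = width * height * bytes_per_sample * aa_samples
    return math.ceil(fb_bytes / EDRAM_BYTES)

# 32-bit colour + 32-bit Z = 8 bytes per sample
print(tiles_needed(1280, 720, 8, 1))  # no AA: 1 tile, fits outright
print(tiles_needed(1280, 720, 8, 2))  # 2xAA: 2 tiles
print(tiles_needed(1280, 720, 8, 4))  # 4xAA: 3 tiles
```

For CPU-side vertex work that has to be redone per tile, the 3-tile 4xAA case is exactly the 3x repetition the question describes; on the GPU side, predicated tiling tries to cut that cost by skipping draw calls that don't touch the current tile.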
 
Xboxguy said:
Hell, quite frankly, the Xenos die alone is bigger than G71 WITHOUT EDRAM, so it's just more proof of how inefficient ATI has gotten.

G71? The one built to make effective use of dynamic branching, vertex texturing, and using very flexible design and.... oh, wait, wrong one.

Or the one unnecessarily fueling your rampant pessimism?

If developers plan to program for Xenos like it's any other GPU, or like G7x, then yeah, I'll agree they wasted a lot. Probably not the eDRAM, but the rest of the die. If DB, VTF, use of the chip for non-GPU and/or non-standard workload processing, etc. takes off, then I'll continue disagreeing. If developers take tiling into account and most of the problems go away because it's part of their development instead of a switchbox on the side, I'll continue disagreeing. If the features make that "lesser" amount of shading power more effective than more raw "shading power" without those features would have been, then I'd say it's worth it. If ports suffer some at the expense of exclusives, well, I say it's worth it.

But never let my words stop you. I think we've been over this enough, anyhow.
 
Fafalada said:
Hence the remark about cache locking being "useless".
Anyway designing a renderer around tiling does introduce extra complexities and considerations you otherwise don't need to make - which makes me wonder if that's actually the source of most complaints there (ie. - damn thing doesn't 'just work', we have to reimplement our pipelines for it).

Afaik it was never mandatory, but my info could be outdated. But from what I've read they included both 720P w/o AA, and 1066x600 (or something thereabouts) with 2xAA as "minimum allowed" resolutions.

It's easier now than it was 6+ months ago. The X360 APIs are still evolving to some extent, and a lot of the tools needed to make a good engine that exploits tiling well are around now, although my understanding is that even the current XDK doesn't expose all of the functionality.

The TRC is extremely non-specific; technically the minimum res is 720P and you have to provide something to address some of the aliasing. Like all TRCs they're negotiable to some extent and MS will give exemptions for the right titles, although I suspect they will be harder to get as excuses like "launch game" start to disappear. It'd be pretty hard for me to justify anything less than 720P with 2x hardware AA to anyone I work for.

The only reasonable excuse I've heard for lack of AA, other than not enough time, is the one Epic uses to justify it in the Unreal3 engine games. Basically their shadowing algorithm projects screen space pixels back into light space, so they would have to do 2x or 4x the work for shadowing if AA was enabled, but that's true regardless of the AA implementation.
 
ERP said:
The only reasonable excuse I've heard for lack of AA, other than not enough time, is the one Epic uses to justify it in the Unreal3 engine games. Basically their shadowing algorithm projects screen space pixels back into light space, so they would have to do 2x or 4x the work for shadowing if AA was enabled, but that's true regardless of the AA implementation.

Do you have any idea whether UE3 supports tiling yet, or will developers have to modify the engine to do so themselves?

Anyway, another question: how big a difference is there between a true internally rendered 720p image and one rendered at 960x540 or 880x720 (which should fit the eDRAM) and then upscaled? Is there a lot of detail loss? Are there any images around that one can compare? A lot of people say that even old DVD movies look really good upscaled, and considering that one usually sits quite a bit away from the TV, will the difference be noticeable if an image is 960x540 and then upscaled?...
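As a rough check of the question's premise (assuming 8 bytes per sample for 32-bit colour plus 32-bit Z, and Xenos' 10MB of eDRAM), both quoted resolutions do fit with 2xAA while native 720p does not:

```python
EDRAM_BYTES = 10 * 1024 * 1024
BYTES_PER_SAMPLE = 8  # 32-bit colour + 32-bit Z

def fb_bytes(width, height, aa_samples):
    """Framebuffer footprint in eDRAM for a given mode."""
    return width * height * BYTES_PER_SAMPLE * aa_samples

for w, h in [(1280, 720), (960, 540), (880, 720)]:
    size = fb_bytes(w, h, 2)  # 2xAA
    print(f"{w}x{h} 2xAA: {size / 2**20:.1f} MB, "
          f"fits={size <= EDRAM_BYTES}, "
          f"{w * h / (1280 * 720):.0%} of 720p's pixels")
```

So 960x540 renders just 56% of the pixels of native 720p, which is where any detail loss would come from; the upscaler can hide some of it, but it cannot invent the missing samples.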
 
scooby_dooby said:
MGS4 is the lone standout, but when is it coming out? 2007?

It definitely seems that developers like Ubisoft, Epic and Gearbox (and of course Ninja Theory) are setting the bar for graphics, while projects like N3, Dead Rising, & Lost Planet are certainly nice, but not in the same league.

Anyways, it was just something I read; I thought it was worth some discussion. I don't put a lot of weight in it myself, as I think it's more a matter of X360 Japanese titles being lower budget. It is interesting to consider, though, that Japanese developers probably have a steeper learning curve since they were so predominantly PS2 focused.


A lone standout in a sea of US/Euro games. The reality here is that so far we've just seen a lot more from US and European developers, first and foremost because not many Japanese developers are making games on the only next-gen console available (compared to the number of US and Euro ones already making loads of games, of course). And secondly because we've seen very little on the platform we know will get the most Japanese games.

Once the PS3 comes out and the Japanese developers come out for a fight, we'll see what they can really do with the hardware. Personally I'm not worried at all, having seen what they came up with on consoles with very big restrictions.

God knows what they'll do with all they have in Cell, and in the 3 cores of the X360! There are so many possibilities, and I think on average the Japanese developers are the ones doing things more "strangely" than US ones, who seem to be stuck on a very PC-ish look and feel most of the time (Sony's first-party games like Jak, Ratchet and GOW being the few exceptions here).
 
Titanio said:
The resolutions quoted as examples aren't SD. They're generally about twice that. They'll probably still look better on a HDTV than a SDTV.

Of course. I have to agree it is not what we expected, but have a look at my X360 on the 1280x720 projector in my basement and you'll see life is good. ;)

Platon said:
Anyway, another question: how big a difference is there between a true internally rendered 720p image and one rendered at 960x540 or 880x720 (which should fit the eDRAM) and then upscaled?

I'd like some facts on this too, because I have a feeling people are not going to notice this. Apparently PGR3 used a lower resolution in game, and true HD for photo-mode and menus, but I have yet to see the first review pointing out the differences in picture quality between these.

I for one sure don't notice it at all.

Apart from the idea MS gave us, I'd say it's nitpicking.
 
ERP said:
The only reasonable excuse I've heard for lack of AA, other than not enough time, is the one Epic uses to justify it in the Unreal3 engine games. Basically their shadowing algorithm projects screen space pixels back into light space, so they would have to do 2x or 4x the work for shadowing if AA was enabled, but that's true regardless of the AA implementation.
It would probably be less efficient, but they could compute the shadow contribution in their colour rendering pass; multisampling or not, it would not make any difference.
 
pipo said:
I'd like some facts on this too, because I have a feeling people are not going to notice this. Apparently PGR3 used a lower resolution in game, and true HD for photo-mode and menus, but I have yet to see the first review pointing out the differences in picture quality between these.

I for one sure don't notice it at all.

I haven't played it yet, but I'd be lying if I said I never heard anyone complain about its IQ.

ERP said:
The only reasonable excuse I've heard for lack of AA, other than not enough time, is the one Epic uses to justify it in the Unreal3 engine games. Basically their shadowing algorithm projects screen space pixels back into light space, so they would have to do 2x or 4x the work for shadowing if AA was enabled, but that's true regardless of the AA implementation.

Thanks. I think that's the first time I've read a good explanation for the UE3 and no tiling/aa issue. Is this a situation that'll persist in the longer term, do you know?
 
Titanio said:
I haven't played it yet, but I'd be lying if I said I never heard anyone complain about its IQ.

That's beside the point. The difference between in-game and the HD rendering modes is what we're talking about here... :)
 
Well, if PGR3 is an example of this upscaling trend, then I'm not happy at all. The game has horrible IQ compared to other Xbox360 games and I wouldn't want the majority of games to have the same IQ.
 
pipo said:
That's beside the point. The difference between in-game and the HD rendering modes is what we're talking about here... :)

I know, but usually when complaints are made, talk of upscaling comes into the picture.

I think it's fair to say that 720p with 2xAA would be cleaner than 1024x576 with 2xAA upscaled, if you put them side by side? 2xAA doesn't remove all aliasing, and upscaling will make whatever aliasing remains more pronounced.
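A toy illustration of that last point (using nearest-neighbour point sampling for simplicity, not how a real video scaler filters): after a non-integer upscale, identical 1-pixel details land at different widths, so residual stair-stepping becomes uneven and more visible:

```python
def upscale_nearest(row, out_len):
    """Nearest-neighbour upscale of one scanline to out_len samples."""
    scale = len(row) / out_len
    return [row[int(i * scale)] for i in range(out_len)]

# two scanlines, each with a single bright pixel, one position apart
a = upscale_nearest([0, 0, 0, 1, 0, 0, 0, 0], 10)
b = upscale_nearest([0, 0, 0, 0, 1, 0, 0, 0], 10)
print(a, sum(a))  # this detail stays 1 output pixel wide...
print(b, sum(b))  # ...its neighbour becomes 2 output pixels wide
```

A filtering scaler blurs instead of duplicating, but the underlying problem is the same: whatever aliasing survives 2xAA gets spread over more screen pixels.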
 
nAo said:
It would probably be less efficient, but they could compute the shadow contribution in their colour rendering pass; multisampling or not, it would not make any difference.
If they are still allocating one shadow map per object, I doubt the little efficiency loss in shaders would make a difference. :p
 
I have really enjoyed this thread, it harks back to the B3D of yore and reminds me why I came here in the first place.

Well done guys for the informative, civil discussion.

It's great that this info wasn't used to spark a flamewar.

When I think back to how the PS2 struggled in its first year and look at what the software can do now, I have much greater confidence in developers being able to wring the last drop of performance out of any given system, given the time. With both the Xbox360 and PS3 there is so much potential to be unlocked once the complexities of the systems have been better understood.
 
Titanio said:
I think it's fair to say that 720p with 2xAA would be cleaner than 1024x576 with 2xAA upscaled, if you put them side by side?

I'd say yes, and I'd like to see it all the way native too. Whether people would notice the difference is a different thing, though.

london-boy said:
Well, if PGR3 is an example of this upscaling trend, then i'm not happy at all. The game has horrible IQ compared to other Xbox360 games and i wouldn't want the majority of games to have the same IQ.

The problem is we can't tell if it's to do with the scaling. For all we know every launch game could use this trick...

Edit - concerning PGR3, I suspect they were shooting for 720p and scaled back very late in the process, so I'd guess a lot of the textures are far from optimised for the in-game resolution. Speculation, of course.
 
I find it rather humorous that a bunch of you guys are looking at this as a reason why the PS3 will be better off than the 360. In case you didn't notice, RSX has far less bandwidth than Xenos. 80% of the PS2's BW was for the FB. Two thirds of the BW of the older 3DLabs Wildcat series (separate texture and FB memory) was for the FB. Framebuffer access occupies the lion's share of BW, so there's a reason ATI abandoned its previous architectures to make Xenos the way they did.

Anyway, a lot of that article sounds like they don't know what they're talking about and misinterpreted the devs' comments.
-"Lens effect, refraction, HDR effects such as bloom and glare" are all things which are unaffected by tiling. You have to resolve your scene and write it to main memory before you can do any of these effects. It sits as a whole image there.
-Xenos has 4 gigapixels per second fillrate, not texels. And if you're talking about bandwidth limitations, why bother mentioning this?
-"It leaves only 0.5 times headroom" is just silly. Say 17% more if you want to say something meaningful. Moreover, you've got all your bandwidth for your CPU and texture/vertex access. Eliminating Z and colour bandwidth, which the original XBox still needed, is enormous.
-I don't know what the heck the "Developer C" quote is talking about.

I don't see how geometry can be that big of a deal, either. We keep hearing again and again that geometry isn't the bottleneck nowadays, so even sending the geometry down twice with a simple scissor test (i.e. no predication) should be an easy alternative. A decent culling system in your engine should keep the geometry increase well under a full factor of two, and also the cache locking is unaffected this way. And why the heck would a developer not implement any LOD? That's something that helps you whether you have tiling or not.
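A minimal sketch of that culling argument, with entirely hypothetical draw-call bounding boxes and a two-tile split: resubmitting geometry per tile only approaches a full 2x if every draw straddles the boundary, and a simple bounds-vs-tile rejection test keeps the overhead well below that:

```python
# Frame split into two horizontal tile bands (x0, y0, x1, y1).
tiles = [(0, 0, 1280, 360), (0, 360, 1280, 720)]

def overlaps(box, tile):
    """Axis-aligned rectangle intersection test."""
    return (box[0] < tile[2] and box[2] > tile[0] and
            box[1] < tile[3] and box[3] > tile[1])

# hypothetical screen-space bounding boxes for five draw calls
draws = [(0, 0, 200, 200), (500, 100, 700, 300),
         (600, 350, 800, 500),   # only this one straddles both tiles
         (100, 400, 400, 700), (900, 500, 1100, 650)]

submissions = sum(1 for t in tiles for d in draws if overlaps(d, t))
print(f"{submissions} submissions for {len(draws)} draws")  # 6 for 5 -> 1.2x
```

With more tiles or bigger objects the ratio creeps up, but in a scene where most draws are localised, the geometry increase stays far under 2x, which is the point being made.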

Whatever. Just your usual stuff from a site trying to look more knowledgeable and connected than they really are.
 
Mintmaster -

BW concerns raised in the article don't wholly relate to the FB. The PS3 doesn't come away unscathed either; in fact, one of the developers recommends sticking to 32-bit buffers to limit FB BW consumption. So don't put Xenos up on a cross just yet ;)

Aside from that point, I think you're writing off the concerns raised way too easily, particularly when we're seeing manifestations of those concerns all over the place. As far as the direct quotes go, at least, these are the actual experiences of some developers. Ultimately, theirs (the developers') is the experience that matters. And there are evidently issues with tiling, ones that are having an actual impact on games, be they exactly as the author describes or not.

As for the credibility of the article, one might be able to comment on the history of the author. But again, the issues they address aren't exactly new or inconsistent with what we've seen ourselves.
 
Titanio said:
Mintmaster -

BW concerns raised in the article don't relate to the FB.
Certainly their reasoning is up the swanny though. That the XB360 only has 0.5 times the available BW for texture lookups over the XB is silly, because the BW consumed by that 3x resolution increase doesn't, in the main, come from system BW. The XB360 has 3.5x the system RAM BW, and while a lot of that was FB BW on the XB, most of it isn't FB BW on the XB360. For texture lookups the XB360 should have well in excess of 3.5x the available BW.

And Mintmaster's point on LOD is absolutely right. LOD is already used for its other benefits; it's not something regrettably added to accommodate tile rendering. Likewise, lens effects and bloom are applied to the whole image and have no concerns with straddling tiles, unless you're being pretty stupid and applying these effects per tile. HDR tonemapping per tile using average scene brightness will be something to behold when applied to a bright sky in the top tile and dark architecture in the bottom ;)

I'm seeing a lot of questionable opinions in that article. But like I say, I don't really care as long as the games look good!
 
Shifty Geezer said:
Certainly their reasoning is up the swanny though. That the XB360 only has 0.5 times the available BW for texture lookups over the XB is silly, because the BW consumed by that 3x resolution increase doesn't, in the main, come from system BW. The XB360 has 3.5x the system RAM BW, and while a lot of that was FB BW on the XB, most of it isn't FB BW on the XB360. For texture lookups the XB360 should have well in excess of 3.5x the available BW.

I agree with this.

Shifty Geezer said:
And Mintmaster's point on LOD is absolutely right. LOD is already used for its other benefits; it's not something regrettably added to accommodate tile rendering.

I agree, but the point is that, LOD or not, tiling brings issues for some developers, and in some quite high-profile and far-reaching cases (like UE3). We can talk about what should or shouldn't be a problem, but there's also the matter of what actually is the case.
 
Thread Pruned

The thread was pruned of unnecessary posts and the replies to those.
 