HD problems in Xbox 360 and PS3 (Zenji Nishikawa article @ Game Watch)

fearsomepirate said:
I'm curious, and no one's really said...

1. Are any games other than PGR3 rendering below 720p internally?
2. How many games maintain steady framerates (ignoring gussied-up PS2 ports from launch like Gun)?
3. In games like Oblivion that don't run smoothly, does dropping them to 480p fix that?

I believe answers to those three questions will shed some light onto how real this problem is.

1. No.
2. The majority of X360 games are ports, unfortunately. The only games I've experienced stuttering in are Oblivion, Quake and PDZ; the rest have run without a hitch.
3. No. The stuttering in Oblivion is mainly the loading between areas.
 
nAo said:
They asked me the same question when I was at primary school..lol :p

You're nasty... :cry:

But you like Nutella like me, so I forgive you ! :)

(Utterly forgot the front buffer was in UMA. When I'm not well enough to work, I shouldn't reply to any topic in Beyond3D's forums either ^^)
 
fearsomepirate said:
I'm curious, and no one's really said...

1. Are any games other than PGR3 rendering below 720p internally?
2. How many games maintain steady framerates (ignoring gussied-up PS2 ports from launch like Gun)?
3. In games like Oblivion that don't run smoothly, does dropping them to 480p fix that?

I believe answers to those three questions will shed some light onto how real this problem is.

1. It would take someone with a development box to go through all the games and make framebuffer grabs to be sure, but I'm fairly certain the rest of the games do render at 1280x720, at least when the 360 is set to output a resolution higher than 480 lines.

2. I haven't seen all the games in action, let alone given them a full playthrough, but of what I have seen, most games have some minor framerate fluctuations. Not many are bad enough that I'd consider them worth taking issue with, and some are very solid as well.

3. Not in Oblivion; 480p is just as rough as 720p or 1080i, and judging by the images from each I'm fairly certain the game renders at the same resolution regardless and uses the 360's internal scaler to resample it for output. CoD2 is an example of the other scenario, where 480p (and 480i, as well as 640x480 and 848x480 over VGA) does run notably smoother than the higher resolutions, and those lower resolutions also get some nice AA which the higher resolutions lack.
 
kyleb said:
CoD2 is an example of the other scenario, where 480p (and 480i, as well as 640x480 and 848x480 over VGA) does run notably smoother than the higher resolutions, and those lower resolutions also get some nice AA which the higher resolutions lack.


Interesting. Is 480p widescreen (848x480) or 4:3 (640x480)? (I'm assuming 640...)
 
Of course, maybe the framerate question wasn't good, because IMO Xbox has made a lot of gamers immune to bad framerates. I guess I would say that if the Xbox version of Deus Ex: IW doesn't make you claw your eyes out, or if you can go back and play Conker 64 without your head asploding, you're not on the same page with me. Some say that the framerate in Oblivion is pretty bad outdoors; some say it's not that bad.
 
Hardknock said:
1. No.

'We don't know' is a better answer. The only way you could know for sure that PGR3 was the only one would be to check each title on a debug kit, where you could capture shots from the framebuffer at native resolution.
 
fearsomepirate said:
Of course, maybe the framerate question wasn't good, because IMO Xbox has made a lot of gamers immune to bad framerates. I guess I would say that if the Xbox version of Deus Ex: IW doesn't make you claw your eyes out, or if you can go back and play Conker 64 without your head asploding, you're not on the same page with me. Some say that the framerate in Oblivion is pretty bad outdoors; some say it's not that bad.
This definitely is possible. While playing Shadow of the Colossus you're sort of aware that the framerate is pretty variable but it doesn't seem to be hugely noticeable. If that makes sense :p
 
So, which uses more bandwidth: 720p's worth of pixels in one go, or 720p's worth of pixels broken into 3 or 4 chunks?
I'm guessing the 3 or 4 chunks will use slightly more (assuming there's a slight overhead with each chunk).
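For what it's worth, here is a rough back-of-the-envelope comparison. It assumes a plain 32-bit colour 720p target and that splitting the frame into tiles only adds re-submitted geometry rather than extra per-pixel traffic; both are assumptions for illustration, not measured behaviour.

```cpp
#include <cstdio>

int main() {
    // Rough, illustrative numbers only.
    const double width = 1280, height = 720;
    const double bytes_per_pixel = 4.0;   // 32-bit colour
    const double frame_mb = width * height * bytes_per_pixel / (1024 * 1024);

    // Writing the finished frame out to main memory moves the same number of
    // bytes whether it is done in one pass or in 3-4 tiles.
    printf("Pixels written, single pass: %.1f MB\n", frame_mb);
    printf("Pixels written, 4 tiles    : %.1f MB (4 x %.2f MB)\n",
           frame_mb, frame_mb / 4.0);

    // The extra cost of tiling is geometry that straddles tile boundaries and
    // gets processed again for each tile it touches.  Guessing at, say, 20%
    // of triangles overlapping a boundary, the overhead is vertex/command
    // work rather than framebuffer bandwidth:
    const double resubmitted_fraction = 0.20;   // pure assumption
    printf("Approx. extra geometry work from tiling: ~%.0f%%\n",
           resubmitted_fraction * 100.0);
    return 0;
}
```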
 
expletive said:
Interesting, is 480p widescreen (850 X 480) or 4:3 (640x480)? (i'm assuming 640...)
Best I can tell, 480p output, as well as both 640x480 and 848x480 through VGA, are all the same. I'd guess it is actually rendering at 640x480 regardless of whether widescreen is on or off, but obviously that is *just* my best guess from comparing those options.
 
kyleb said:
Best I can tell, 480p output, as well as both 640x480 and 848x480 through VGA, are all the same. I'd guess it is actually rendering at 640x480 regardless of whether widescreen is on or off, but obviously that is *just* my best guess from comparing those options.

There is certainly a 16:9 480-line mode in the APIs; what people are using is down to the application.

Although it might be easier to just leave it in 720p mode and let it downsample, so you only have two setups: widescreen and 4:3.
 
Titanio said:
'We don't know', is a better answer. The only way you could know for sure if PGR3 was the only one would be to check each title on a debug kit, where you could capture shots from the framebuffer in native resolution.

Just like with PGR3, I'm pretty sure we would have heard if any other games were under 720p. There's no way they could keep that a secret with all those debug units out there.
 
Hardknock said:
Just like with PGR3, I'm pretty sure we would have heard if any other games were under 720p. There's no way they could keep that a secret with all those debug units out there.

I think we were 'lucky' to hear about PGR3. There were only a couple of people involved in getting the word out about that - most with debugs kept entirely mum about it. Generally speaking, airing a company's dirty laundry doesn't get you in their good books. So I wouldn't bank on hearing about this in every case, at all. By the sounds of that article there will be more going forward, but I doubt we'll ever be explicitly told that that's the case.
 
I believe eDRAM makes less sense this generation compared to the previous one.
Most of the time we need to render slow opaque pixels and fast, relatively simple transparent pixels.
The first case can be managed by a standard GPU with an external memory pool, while the second case can be addressed with a small (i.e. 64x64) on-chip tile cache, if we use our powerful CPUs to tile all our transparent geometry and submit it in tile order.
It would make perfect sense on a console.
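A minimal sketch of what that CPU-side binning step might look like; the tile size, data layout and names below are invented for illustration and aren't from any real engine or SDK.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Bin transparent triangles into small screen tiles (e.g. 64x64) on the CPU,
// then submit them tile by tile so a tiny on-chip tile cache could absorb the
// read-modify-write blending traffic.

struct ScreenTri {                 // triangle already projected to screen space
    float x[3], y[3];
    uint32_t drawId;               // whatever handle the submit path needs
};

constexpr int kTileSize = 64;
constexpr int kTilesX   = (1280 + kTileSize - 1) / kTileSize;   // 20 tiles across
constexpr int kTilesY   = ( 720 + kTileSize - 1) / kTileSize;   // 12 tiles down

using TileBins = std::vector<std::vector<uint32_t>>;            // tri indices per tile

void binTransparentTris(const std::vector<ScreenTri>& tris, TileBins& bins)
{
    bins.assign(kTilesX * kTilesY, {});
    for (uint32_t i = 0; i < tris.size(); ++i) {
        const ScreenTri& t = tris[i];
        // Conservative screen-space bounding box of the triangle.
        const float minX = std::min({t.x[0], t.x[1], t.x[2]});
        const float maxX = std::max({t.x[0], t.x[1], t.x[2]});
        const float minY = std::min({t.y[0], t.y[1], t.y[2]});
        const float maxY = std::max({t.y[0], t.y[1], t.y[2]});
        const int tx0 = std::clamp(int(minX) / kTileSize, 0, kTilesX - 1);
        const int tx1 = std::clamp(int(maxX) / kTileSize, 0, kTilesX - 1);
        const int ty0 = std::clamp(int(minY) / kTileSize, 0, kTilesY - 1);
        const int ty1 = std::clamp(int(maxY) / kTileSize, 0, kTilesY - 1);
        // Add the triangle to every tile its bounding box touches.
        for (int ty = ty0; ty <= ty1; ++ty)
            for (int tx = tx0; tx <= tx1; ++tx)
                bins[ty * kTilesX + tx].push_back(i);
    }
    // If the input list is already sorted back-to-front, each bin preserves
    // that order; the renderer then walks the bins one tile at a time and
    // issues all of a tile's draws before moving on to the next tile.
}
```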
 
That's an interesting idea. I've always thought that the (potential) role of framebuffer cache in discussions like these has been oft overlooked, but it's always been hard to quantify its contribution. An approach like that seems a more concrete way to leverage it effectively.
 
nAo said:
I believe eDRAM makes less sense this generation compared to the previous one.
Most of the time we need to render slow opaque pixels and fast, relatively simple transparent pixels.

It's pretty interesting that you just came right out and said that. WOW!
 
nAo said:
I believe eDRAM makes less sense this generation compared to the previous one.

It does make 'less' sense in some ways. One is that the 3MB and 4MB of embedded memory on the GCN and PS2 were fairly significant portions of total memory: e.g. the PS2 has 36MB of memory (32MB general, 4MB video), meaning ~11% is for video functions, while the 360 has 522MB of memory (512MB general, 10MB eDRAM), meaning ~2% is specifically for the framebuffer. Of course there are tradeoffs: texture detail has increased, as have the extra layers for normals and whatnot. In that sense it is not easy to compare the 360's eDRAM with the previous generation's, as they are used differently (the 360's eDRAM does no texturing).

On the other hand, the area where it makes 'more' sense is bandwidth in general.

Let's assume MS had decided to use a 256-bit bus and no eDRAM. The backbuffer is a major client for bandwidth, so your large 512MB pool of memory becomes saturated by backbuffer traffic.

Obviously MS averted this situation by using eDRAM. Sony, on the other hand, especially if developers chase some of their E3 2005 bullet points (128-bit HDR, 1080p games, etc., which we all know are just bullet points that the major, graphically aggressive games won't pursue), will see the 22.4GB/s of GDDR3 saturated. So ~50MB of framebuffer in a 256MB pool of memory ends up saturating the memory bandwidth, and 4/5ths of the 256MB of GDDR3 goes underutilized.

There are drawbacks to every design decision. A 256-bit bus still would not have met all the bandwidth needs MS was/is aiming for and runs into the above problem; it also has pad issues which could complicate future price reductions. 20-30MB of eDRAM would have been cost prohibitive. The alternative was tiling at 720p when MSAA is enabled. I had originally planned to respond to the initial article, but it seems Mintmaster has hit on most of its errors. It comes across as technical, appealing to developers with technical language, but it makes basic mistakes. Still, the discussion spawned from it has been pretty solid.
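To put rough numbers on that saturation argument, here is an illustrative back-of-the-envelope calculation. The overdraw factor, per-fragment traffic and buffer layout are assumptions made for the example, not measured data.

```cpp
#include <cstdio>

int main() {
    // What a 1080p, 128-bit (4xFP32) HDR colour target would do to RSX's
    // 22.4 GB/s of GDDR3, under very rough assumptions.
    const double w = 1920, h = 1080, fps = 60;
    const double color_bytes = 16;   // 128-bit HDR colour
    const double z_bytes     = 4;    // 32-bit depth/stencil

    // Single colour buffer plus Z; a front buffer on top of this pushes the
    // total toward the ~50MB figure mentioned above.
    const double buffer_mb = w * h * (color_bytes + z_bytes) / (1024 * 1024);

    // Assume ~4 fragments per pixel on average (overdraw/blending), each doing
    // an uncompressed colour read+write and Z read+write.
    const double overdraw           = 4;                              // assumption
    const double bytes_per_fragment = 2 * color_bytes + 2 * z_bytes;  // 40 B
    const double traffic_gbs = w * h * overdraw * bytes_per_fragment * fps / 1e9;

    printf("Framebuffer (colour + Z): ~%.0f MB\n", buffer_mb);                       // ~40 MB
    printf("Framebuffer traffic     : ~%.0f GB/s at %.0f fps\n", traffic_gbs, fps);  // ~20 GB/s
    printf("GDDR3 budget            : 22.4 GB/s, before any texture or vertex traffic\n");
    return 0;
}
```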
 
Acert93 said:
Let's assume MS had decided to use a 256-bit bus and no eDRAM. The backbuffer is a major client for bandwidth, so your large 512MB pool of memory becomes saturated by backbuffer traffic.

Obviously MS averted this situation by using eDRAM. Sony, on the other hand, especially if developers chase some of their E3 2005 bullet points (128-bit HDR, 1080p games, etc., which we all know are just bullet points that the major, graphically aggressive games won't pursue), will see the 22.4GB/s of GDDR3 saturated. So ~50MB of framebuffer in a 256MB pool of memory ends up saturating the memory bandwidth, and 4/5ths of the 256MB of GDDR3 goes underutilized.

I think the point is that, finding yourself with this kind of setup, you're not going to do that - saturate bandwidth using a fraction of memory. No PS3 dev of sound mind is going to saturate GDDR3 with framebuffer bandwidth while idling most of the VRAM.

nAo's suggestion, and others, may be ways to (perhaps dramatically) increase framebuffer activity without running out of main memory bandwidth. You will find things like that being done - I don't know about nAo's idea specifically, but we're already seeing other approaches that you wouldn't see used in, say, PCs, or indeed in any other system to date AFAIK.

I suppose it is fair to say, though, that we're looking at things from the viewpoint of perhaps a more competent developer - which would affect the argument about eDRAM usage on 360 as well, if we were to look at it from that kind of perspective.

Whether eDRAM is less of a good idea ultimately depends on whether it is (typically) used for what it was designed for.
 
nAo said:
I believe eDRAM makes less sense this generation compared to the previous one.
Most of the time we need to render slow opaque pixels and fast, relatively simple transparent pixels.
The first case can be managed by a standard GPU with an external memory pool, while the second case can be addressed with a small (i.e. 64x64) on-chip tile cache, if we use our powerful CPUs to tile all our transparent geometry and submit it in tile order.
It would make perfect sense on a console.

I agree in principle, although I thought Sony's and Nintendo's trade-offs last gen didn't make much sense either.
I think what makes it make sense is the narrow bus to memory. If they didn't have eDRAM they'd need a 256-bit bus, or two buses, to make any sort of high-def rendering practical.
Tiling is interesting, but I've yet to see real evidence that it actually reduces transistor counts at a given performance level.

I think it's way too early to be declaring MS's or Sony's graphics chip choices good or bad; we have to wait and see.

MS is attempting to address developers' biggest complaints about the Xbox.
Sony's doing the same by taking a PC graphics part and putting it in the PS3.
Nintendo did the same thing with the GameCube's memory performance vs the N64.

In almost all cases the manufacturers seem to overreact.

I personally have a soft spot for Xenos because it's architecturally interesting, and in some ways extremely clever.

But personally I'm far more worried about the processors and their memory interfaces than about either of the graphics chips.
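To put some rough numbers on the narrow-bus point above: the sketch below is peak-rate arithmetic using the commonly quoted Xenos figures, with worst-case uncompressed per-sample traffic assumed, so treat it as illustrative only.

```cpp
#include <cstdio>

int main() {
    // Peak-rate arithmetic; real workloads never sustain these peaks, but the
    // size of the gap is the point.
    const double clock_hz = 500e6;  // GPU clock
    const double rops     = 8;      // pixels per clock
    const double msaa     = 4;      // samples per pixel with 4xMSAA
    // Worst-case uncompressed traffic per sample: colour read+write for
    // blending (4+4 bytes) plus Z read+write (4+4 bytes).
    const double bytes_per_sample = 16;

    const double rop_gbs = clock_hz * rops * msaa * bytes_per_sample / 1e9;
    printf("Peak ROP traffic, 4xMSAA   : %.0f GB/s\n", rop_gbs);  // 256 GB/s
    printf("128-bit GDDR3 (as built)   : 22.4 GB/s\n");
    printf("Hypothetical 256-bit GDDR3 : ~45 GB/s\n");
    return 0;
}
```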
 
The eDRAM is probably shit.

That was obvious from the get-go.

We'll see how it turns out, but certainly PS3 games so far look to be at least on par.

The design decision looks to be a bad one, and frankly, an armchair guy like me could see it immediately.

There's simply no way to waste that many transistors on not processing and get away with it.

Interestingly, the cost breakdowns of PS3 I've seen put the GPU at $70, while I typically see $100 or more for Xenos.

Considering that overall G71/RSX is smaller, and there is one die instead of two, this doesn't surprise me.

Yet it appears to outpower Xenos in shader power.

The PS3 will be expensive, but that's because of Blu-ray and Cell. Clearly they owned Xenos on cost/performance with RSX.

And the bus from the shader core to the eDRAM... I bet that adds more cost to the X360. I bet the cost ends up more than for a dual 128-bit-bus design. There is a lot of extra complexity there.

Basically, the eDRAM is why MS will lose this generation and billions of dollars, quite probably. That's not an exaggeration. If they had dedicated those transistors to shaders, they would be outpowering the PS3, and let's face it, that's what people care about.

Think about the E3 Halo 3 trailer. People are really only interested in the graphics. I mean really, that's what we're all thinking: how will it LOOK? Can the X360 hack it?

That's videogames in a nutshell. They are always about hardware, not software.
 