HD problems in Xbox 360 and PS3 (Zenji Nishikawa article @ Game Watch)

nAo said:
Sorry..cache for what?
Well, I probably shouldn't have said "a lot" since that's rather vague, but I was talking about read and write buffers for pixels. I guess it's sort of related to the more complex memory controller.
That's probably true. At the same time I don't think that a 32 GBytes/s bus that connects the main die with the edram die comes for free..;-)
Not free, but it's not a big deal. There's a very limited number of packet types being communicated, and the short distance allows a few high-speed connections with no need for error correction. I don't think it's anything like an AGP or PCI Express bus. The other stuff I mentioned is definitely bigger.

I stand by my opinion; you will be surprised. I have no doubt that sooner or later (ok... maybe later) you will think: god, how did they do that? ;)
I dunno. I keep up on graphics literature, and that's only happened to me a couple of times in the last decade. Heck, I've been waiting for incredible techniques that are 4 years old to get implemented.

The last time my jaw dropped was the first time I saw dependent texturing (EMBM). Seeing bumpy reflections and beautiful water really popped out in a sea of bland textures. RTHDRIBL was pretty cool too. As for reacting in a truly "OMG how'd they do that" way, it last happened playing a game called Motorhead (1998). Couldn't believe it ran so fast in software at 640x480 on a Pentium 166. They had maybe 10 CPU cycles per pixel.
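(A back-of-the-envelope check on that figure, with assumed numbers - roughly 55 fps and one instruction per cycle are my guesses, not anything from the game:)

```python
# Quick sanity check of the ~10 CPU cycles per pixel figure (frame rate is assumed).
cpu_hz = 166_000_000               # Pentium 166
pixels_per_frame = 640 * 480
fps = 55                           # assumed frame rate
print(cpu_hz / (pixels_per_frame * fps))   # ~9.8 cycles per pixel
```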

Never saw anything on a console that was as mind-blowing as you're describing. Still, maybe your prediction will come true...
 
Hardknock said:
But they never said HDR.
Are you saying, based on your observation of the Warhawk GDC clip, that this game is not employing any kind of HDR precision?

Because if that's fake HDR, then I want more. There were plenty of smooth tone-mapping transitions too, but that's a little difficult to show in a screenshot :D
 
Mintmaster said:
Well, I probably shouldn't have said "a lot" since that's rather vague, but I was talking about read and write buffers for pixels.
In current GPUs those are so small they barely register in terms of die space used. Apparently only a few odd freaks like nAo and me think larger sized pixel caches would be useful.:devilish:

Never saw anything on a console that was as mind-blowing as you're describing.
Well, I do think nAo was also saying you'd be surprised given your actual expectations of this hardware. Whether something has wow-factor in absolute terms is mostly a matter of individual perspective anyway (I still remember how many people were wowed by Wing Commander's sprite graphics, where I found it a huge step back from the space shooters before it, but I digress :p ).

Interesting. I always thought 2MB seemed like too little, but this is what I kept reading about PS2. The original 3DFX Voodoo gave you decent graphics with only this much memory so I figured it was possible.
Well, textures in V1 games were also comparable to the PS1 generation, not PS2. At any rate, textures are stored mostly in main memory. The way some of us use eDRAM, it would be rather inconvenient to statically allocate large parts of it for storage anyhow.

Smoke/fog/fire/dust/fur/grass will always look better with more pixels.
I don't necessarily agree with this - Warhawk already shows one interesting way to do quality volume smoke without going overdraw-happy. Not to say everything has an alternative, but I do believe we have lots of room to rethink particle approaches on the new machines.

I kept thinking they just wrote horrible code, but seeing it again and again makes me think they must be doing something useful that gobbles the power.
To be fair, writing code that uses too many resources is the easiest thing under the sun :p It doesn't mean it's horrible either; devs are only human, and with the time constraints we work under, questionable design decisions are unavoidable.

Regarding bandwidth, IHV's obviously make their decisions for a reason, and stripped down value cards with half the bus width show notable performance drops too.
Well, someone smart once said here that marketing makes most of the hardware-design decisions, and I'd say the rest are dictated by the target platform - and closed boxes have quite different requirements than the PC.
For one, you absolutely don't care how your product runs existing/legacy software/benchmarks. ;)
 
I think the eDRAM daughter die is a waste. MS has dropped the 4xAA requirement because tiling was a last-minute, broken decision made to go along with their HD-era marketing. They know it's unfair to force their devs to tile their next-gen engines.

A MotoGP dev had this to say at NeoGAF:
It's no longer required, but it is still strongly recommended. The game I was working on originally had 4xAA, then got scaled back to 2xAA, and it still chokes slightly here and there.

No matter how well a game uses anti-aliasing, every one is going to have some aliasing issues, and they stick out like a sore thumb in HD (hence Microsoft's requirement in the first place). Every game is still going to have AA implemented, obviously, but it's not really free as advertised.
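To put rough numbers on why AA forces tiling at 720p, here's a back-of-the-envelope sketch (assuming the usual 32-bit colour plus 32-bit Z per sample - a simplification, not the exact layout of any particular game):

```python
import math

# Rough eDRAM footprint / tile-count estimate (assumed: 32-bit colour + 32-bit Z per sample).
EDRAM_BYTES = 10 * 1024 * 1024   # the 10 MB daughter die

def tiles_needed(width, height, msaa, bytes_per_sample=4 + 4):
    framebuffer = width * height * msaa * bytes_per_sample
    return framebuffer, math.ceil(framebuffer / EDRAM_BYTES)

for msaa in (1, 2, 4):
    size, tiles = tiles_needed(1280, 720, msaa)
    print(f"720p {msaa}xAA: {size / 2**20:.1f} MB -> {tiles} tile(s)")
# 1xAA: ~7.0 MB -> 1 tile, 2xAA: ~14.1 MB -> 2 tiles, 4xAA: ~28.1 MB -> 3 tiles
```

Anything above one tile means geometry straddling tile boundaries gets processed more than once, which is where the cost comes from.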

The 100mio transistors should have been shader logic. How many of those 48 shaders are usable IRL? Why are we not seeing it IRL?
 
tema said:
The 100mio transistors should have been shader logic. How many of those 48 shaders are usable IRL? Why are we not seeing it IRL?
Remember, it's all about die space. 100 million transistors on the eDRAM die (DRAM plus some logic) does not necessarily translate into 100 million transistors of pure logic. Memory cells are usually very compact.

Keep in mind the developers haven't spent a year with the 360 hardware yet; it probably has some mileage left. :smile:

The learning curve may have been a bit steeper than expected though.
 
mckmas8808 said:
Can anybody explain to us how Heavenly Sword and Warhawk (two games that could be day-one launch titles at best, launch-window games at worst) have both HDR and AA? I read a lot of the NAO32 material and it basically makes sense to me in a small way, but what about Warhawk?

I take it that both of these games are internally rendering at at least 720p, so what are the devs doing to make it happen? Is it the hardware, or is it more a case of very talented devs making this happen?

If you had read the forums 9 months ago, you would have thought that HDR + 4xAA on the PS3 was almost impossible to get at 720p. I'm just very curious, because it wasn't long ago that the Xbox 360 was known to have "free" 2xAA and basically free 4xAA, yet it doesn't seem to be that way (obviously at 720p or higher).

I just don't get it. I had prepared my mind for non-AA games on the PS3 at higher resolutions, yet now everything that I learn says that's false. Can somebody please explain before I ramble myself to death?

Oh, and does anybody think that MGS4 will use 2xAA when it's released, given that it's obvious they are using a good deal of HDR?

Thanks everyone.:smile:

*Disclaimer: This is in no way a diss to the Xbox 360 or its developers. Just need more understanding on what was said months ago (pre GDC) and now (post GDC).

It could be 600p and upscaled to xxxp to get an AA effect... who will know for now? :devilish:
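On the NAO32 reference above: the general trick, as it has been discussed publicly, is to store an HDR colour in an ordinary 8:8:8:8 target by keeping log-luminance in two 8-bit channels and CIE chromaticity in the other two, so the hardware still sees a plain 32-bit buffer and MSAA keeps working. Here's a minimal illustrative sketch of that idea; the constants, dynamic range and channel layout are my own assumptions, not the actual Heavenly Sword implementation:

```python
import math

# Illustrative LogLuv-style packing of an HDR linear-RGB colour into four bytes:
# log2(luminance) split across two 8-bit channels, CIE (u', v') chromaticity in the other two.

def rgb_to_xyz(r, g, b):
    # Standard Rec.709/sRGB primaries -> CIE XYZ.
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    return x, y, z

def to_byte(value01):
    return max(0, min(255, int(value01 * 255.0 + 0.5)))

def encode_logluv(r, g, b, log_min=-16.0, log_max=16.0):
    x, y, z = rgb_to_xyz(r, g, b)
    denom = x + 15.0 * y + 3.0 * z + 1e-6
    u = 4.0 * x / denom                      # CIE u', roughly in [0, 0.62]
    v = 9.0 * y / denom                      # CIE v', roughly in [0, 0.59]
    # Normalise log2(luminance) over an assumed dynamic range, then split into two bytes.
    le = (math.log2(max(y, 1e-6)) - log_min) / (log_max - log_min)
    le16 = max(0, min(65535, int(le * 65535.0 + 0.5)))
    return le16 >> 8, le16 & 0xFF, to_byte(u / 0.62), to_byte(v / 0.59)

print(encode_logluv(2.5, 1.0, 0.2))          # an HDR-ish orange packed into 4 bytes
```

Decoding just reverses the steps: rebuild luminance from the two log bytes, then X and Z from (u', v').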
 
tema said:
The 100mio transistors should have been shader logic. How many of those 48 shaders are usable IRL? Why are we not seeing it IRL?
Dude, for the last time AA is not the only use of the eDRAM. It helps nearly all the time. Also, if you want to compare with the PS3 design, remember that there are many reasons for having a single memory pool, and hence a single bus.

The best graphics expert in the world could not "see" how much processing is used to create a game. Saying that we're not seeing it is just ignorant.
 
Crossbar said:
Keep in mind the developers haven't spend a year with the 360 hw yet, it probably has some mileage left. :smile:

Not to nitpick, but I think some developers HAVE spent a year on the platform by now, as we were seeing realtime footage this time last year... but there's obviously lots of room for improvement.
 
london-boy said:
Not to nitpick, but I think some developers HAVE spent a year on the platform by now, as we were seeing realtime footage this time last year... but there's obviously lots of room for improvement.
Sorry if I am misinformed; I thought MS started handing out the beta kits after E3 last year, with volumes appearing first in July.
http://gamesindustry.biz/content_page.php?aid=9914

I also thought this post from yesterday kind of confirmed they were pretty late, at least with the final kits.
titanio said:
-Initial tapeouts for Xenon and Xenos were Oct 04 and Sept 04 respectively. Final tapeouts were Jan 05 and July 05 respectively - ATi ran into a spot of bother getting Xenos finalised on time.
 
Part of this is because Japanese developers used the PlayStation 2 as their base console, so they never really learned to use the flexible shaders that were introduced with DirectX 8 (after the PS2 came out). Aside from that, with 64-bit HDR the Xbox 360 needs a LOT of bandwidth and the eDRAM isn't enough for that, so only 2xAA can be used; but with 32-bit HDR there is enough bandwidth for the 360's eDRAM to use 4xAA. Again, this is all about doing HDR and AA at the same time: current PC GPUs find it very hard to do much AA with 32-bit HDR, let alone 64-bit HDR. It all depends on what the developers want. If a developer doesn't use HDR and instead uses techniques like bloom or int8, then 4xAA can basically be free and a lot of bandwidth is saved. But developers these days want to tell consumers about the HDR used in their games, so they go along using it rather than innovating enough to apply other forms of HDR *alternatives*.
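Whether you read that as bandwidth or as eDRAM capacity, the arithmetic behind the trade-off is the same doubling: FP16 is 8 bytes of colour per sample against 4 for a 32-bit format. A rough sketch with assumed layouts (colour plus a 32-bit Z buffer at 720p), not official figures:

```python
# Rough 720p render-target footprints vs the 10 MB eDRAM (assumed layouts, not official numbers).
EDRAM_MB = 10
PIXELS = 1280 * 720

for name, colour_bytes in (("64-bit FP16 colour", 8), ("32-bit FP10/8888 colour", 4)):
    for msaa in (1, 2, 4):
        mb = PIXELS * msaa * (colour_bytes + 4) / 2**20   # colour + 32-bit Z per sample
        note = "fits in one tile" if mb <= EDRAM_MB else "needs tiling"
        print(f"{name}, {msaa}xAA: {mb:.1f} MB ({note})")
```

Halving the colour format halves the per-sample cost, which is where the extra AA headroom comes from; exactly what fits without tiling obviously depends on the real buffer layout.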
 
tema said:
I think the eDRAM daughter die is a waste. MS has dropped the 4xAA requirement because tiling was a last-minute, broken decision made to go along with their HD-era marketing. They know it's unfair to force their devs to tile their next-gen engines.

A MotoGP dev had this to say at NeoGAF:


The 100mio transistors should have been shader logic. How many of those 48 shaders are usable IRL? Why are we not seeing it IRL?

Do you have a link for this quote?
 
Fafalada said:
In current GPUs those are so small they barely register in terms of die space used. Apparently only a few odd freaks like nAo and me think larger sized pixel caches would be useful.:devilish:
Apparently you both can't see that Xenos EDRAM is one mutha of a pixel (and MSAA) cache.

Jawed
 
Jawed said:
Apparently you both can't see that Xenos EDRAM is one mutha of a pixel (and MSAA) cache.
My main point was exactly this: most of the time I don't need such a big cache; a way smaller one would be enough to cover those cases where we need a huge fill rate.
MSAA would help you on slow and opaque pixels, and we can happily live without eDRAM in that case. With long shaders on opaque pixels the associated cost of MSAA is very small.
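A toy cost model of what nAo is describing (numbers entirely made up, just to show the shape of the argument): if a pixel costs whichever is larger - its shader time or its ROP/bandwidth time - then on long-shader pixels the extra samples from MSAA hide behind the shader work.

```python
# Toy per-pixel cost model (made-up numbers): cost = max(shader work, ROP/bandwidth work),
# and MSAA mostly inflates the latter.
def pixel_cost(shader_cycles, rop_cycles_per_sample, msaa):
    return max(shader_cycles, rop_cycles_per_sample * msaa)

for shader in (4, 16, 64):                   # "fast" pixels vs "slow" long-shader pixels
    c1 = pixel_cost(shader, 2, 1)
    c4 = pixel_cost(shader, 2, 4)
    print(f"shader={shader:>2} cycles: 1xAA={c1}, 4xAA={c4}, overhead={c4 / c1 - 1:.0%}")
# Short shaders pay heavily for 4xAA; long shaders barely notice it.
```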
 
kabacha said:
Part of this is because Japanese developers used the PlayStation 2 as their base console, so they never really learned to use the flexible shaders that were introduced with DirectX 8 (after the PS2 came out). Aside from that, with 64-bit HDR the Xbox 360 needs a LOT of bandwidth and the eDRAM isn't enough for that, so only 2xAA can be used; but with 32-bit HDR there is enough bandwidth for the 360's eDRAM to use 4xAA. Again, this is all about doing HDR and AA at the same time: current PC GPUs find it very hard to do much AA with 32-bit HDR, let alone 64-bit HDR. It all depends on what the developers want. If a developer doesn't use HDR and instead uses techniques like bloom or int8, then 4xAA can basically be free and a lot of bandwidth is saved. But developers these days want to tell consumers about the HDR used in their games, so they go along using it rather than innovating enough to apply other forms of HDR *alternatives*.

Your part about Japan's developers is "pure" BS.
 
nAo said:
My main point was exactly this: most of the time I don't need such a big cache; a way smaller one would be enough to cover those cases where we need a huge fill rate.
MSAA would help you on slow and opaque pixels, and we can happily live without eDRAM in that case. With long shaders on opaque pixels the associated cost of MSAA is very small.

Do you have one now then, a "way smaller one" that is?
 
nAo said:
My main point was exactly this: most of the time I don't need such a big cache; a way smaller one would be enough to cover those cases where we need a huge fill rate.
But Xenos EDRAM isn't there merely to deliver twice the fillrate of RSX; it's also there so that texturing out of "VRAM" can proceed at a healthier pace, and to provide XB360 with a unified memory architecture.

And it also makes for an easier transition to smaller die sizes (65nm, 45nm) where a 128-bit bus shrinks more readily than a 256-bit bus would (the pad-density problem).

On top of that, if you look at the RAM in each Cell SPE, you see it's 6x as dense as the logic. It consumes ~33% of the die area of an SPE, but consists of 2x the number of transistors as the logic does (14m transistors of memory versus 7m of logic). If Xenos EDRAM has anything like the same density, then excluding the 25m transistors of logic, the 80m transistors of EDRAM would translate into about 13m transistors of logic, in terms of area.

But that excludes the cache that the ROPs require, now that EDRAM has been cut out. It also doesn't account for the fact that the ROPs would have to increase in complexity to support non-EDRAM raster operations against GDDR3. And they would also have to increase in complexity/count in order to compensate for the slowness of GDDR3 in comparison with EDRAM.

So, after all that, you might have about 5m transistors of extra shader logic in Xenos by dropping the EDRAM. Enough for about 4 shader pipes and their associated register file + scheduling/arbitration.
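Spelling out that arithmetic (all figures are this post's own estimates; the ~8M "overhead" is just the gap implied by the 13M-to-5M step, not a measured number):

```python
# Reproduce the die-area estimate above (all inputs are the post's own figures).
edram_transistors = 80e6   # EDRAM cells on the daughter die (the 25M of ROP logic excluded)
density_ratio = 6          # DRAM assumed ~6x as dense as logic, per the SPE comparison

logic_equivalent = edram_transistors / density_ratio    # ~13M transistors' worth of logic area
overhead = 8e6             # bigger ROPs + pixel caches once EDRAM is gone (implied estimate)
spare_for_shaders = logic_equivalent - overhead         # ~5M transistors for extra shader pipes

print(f"~{logic_equivalent / 1e6:.0f}M logic-equivalent; ~{spare_for_shaders / 1e6:.0f}M left for shaders")
```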

---

The EDRAM is intrinsic to the system design for XB360.

Jawed
 
Jawed said:
The EDRAM is intrinsic to the system design for XB360.
No one is arguing against Xenos; the whole system was built around that eDRAM and it makes perfect sense.
The main point - and this is the last time I'm going to repeat it (I promise) - is that this generation eDRAM makes less sense: we could happily live without it, since in the vast majority of games most of the stuff on screen will be generated by 'slow' pixels.
 