MS big screw up: EDRAM

Status
Not open for further replies.

Bill

Banned
Why waste 105 million transistors?

With John Carmack now saying PS3 is probably more powerful, I think X360 is in trouble.

EDRAM was a feature of Gamecube. That was a failure then. Why recycle it?
 
Last edited by a moderator:
I expect a lot of "The EDRAM is hardly useless" replies.

But it is.

I would MUCH rather have 337 million logic transistors.

Or even 280 if 337 on one die was too much.
 
GameCube being weaker than Xbox doesn't make eDRAM a waste. On the contrary, the eDRAM in the 360 allows for incredible bandwidth alleviation. I'll readily admit I've been putting MS's claims under the microscope today, but as for the eDRAM itself, I still think it will come out as well worth its transistors.
 
IT'S NOT!

They could have used a 128-bit bus just like PS3. At 720p they'd be fine.

Gamecube was a failed console.

And in the other forum discussion with ERP, it now seems eDRAM AA is dicey itself.
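The 128-bit-bus claim above can be sanity-checked with back-of-the-envelope arithmetic. A minimal sketch; the 700 MHz GDDR3 clock, 4x sample count, overdraw and read-modify-write factors are all my illustrative assumptions, not published specs:

```python
# Sanity check: would a shared 128-bit GDDR3 bus be "fine" at 720p?
# All workload figures below are illustrative assumptions.

BUS_WIDTH_BITS = 128
MTRANSFERS_PER_S = 1400e6            # assumed 700 MHz GDDR3, double data rate
bus_bw = BUS_WIDTH_BITS // 8 * MTRANSFERS_PER_S   # bytes/second

pixels = 1280 * 720                  # 720p
bytes_per_sample = 4 + 4             # 32-bit color + 32-bit Z/stencil
samples = 4                          # 4x multisampling
overdraw = 3                         # assumed average overdraw
rmw = 2                              # blending/Z test reads before each write
fps = 60

rop_traffic = pixels * samples * bytes_per_sample * overdraw * rmw * fps

print(f"bus bandwidth: {bus_bw / 1e9:.1f} GB/s")   # 22.4 GB/s
print(f"ROP traffic:   {rop_traffic / 1e9:.1f} GB/s")
```

Under these assumptions, framebuffer traffic alone would eat roughly half the shared bus before a single texel is fetched, and that is exactly the traffic Xenos moves onto the eDRAM. Whether a shared bus is "fine" then depends on how much texture and CPU traffic remains.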
 
Bill, from reading your past posts, you seem quite focused on raw power. Maybe in that sense Xenos would have been better off re-allocating transistors from eDRAM to the chip itself. But the design is more one of finesse than brute strength, and the eDRAM allows for the implementation of several effects - at minimal performance loss - that in a traditional environment would thoroughly tax bandwidth and performance.

I think Xenos' core design philosophy is all about flexibility/elegance, and the unified shaders, MEMEXPORT functionality, and eDRAM all combine to produce a design that is incredibly flexible.

In addition, Xenos is presumably somewhat forward-compatible with DX10, which should make future titles easier to port to the 360 from the PC and give developers that much more ease in cross-platform development. Lower costs and effort will of course hopefully result in more titles.
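Whether those effects stay cheap depends on the render target actually fitting in the 10 MB, which is where the earlier "eDRAM AA is dicey" remark comes from. A quick sizing sketch; the 32-bit color plus 32-bit depth/stencil per-sample formats are my assumption:

```python
# Does a 720p framebuffer fit in Xenos' 10 MB of eDRAM?
EDRAM_BYTES = 10 * 1024 * 1024

def framebuffer_bytes(width, height, msaa_samples):
    bytes_per_sample = 4 + 4          # 32-bit color + 32-bit depth/stencil
    return width * height * msaa_samples * bytes_per_sample

for msaa in (1, 2, 4):
    size = framebuffer_bytes(1280, 720, msaa)
    tiles = -(-size // EDRAM_BYTES)   # ceiling division: render passes needed
    print(f"{msaa}xAA: {size / 2**20:5.1f} MB -> {tiles} tile(s)")
```

A plain 720p target (~7 MB) fits in one pass, but 2x and 4x AA overflow the 10 MB, so the frame has to be split into tiles and rendered in multiple passes - the complication being debated in that other thread.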
 
Bill said:
IT'S NOT!

They could have used a 128-bit bus just like PS3. At 720p they'd be fine.

Gamecube was a failed console.

And in the other forum discussion with ERP, it now seems eDRAM AA is dicey itself.

Gamecube failing had nothing to do with eDRAM; the PS2 had it, and it won the console war this gen.
 
If you could have the PS3 with 16 pipes but eDRAM, would you?

Hell no.

I listened to "the Cube is efficient" for years. It was underpowered, is what.

I'm an MS ******. I hope I'm not banned for my opinion.
 
Bill said:
If you could have the PS3 with 16 pipes but eDRAM, would you?

Hell no.

I listened to "the Cube is efficient" for years. It was underpowered, is what.

I'm an MS ******. I hope I'm not banned for my opinion.


If you get banned it won't be for your opinion; it will be for how you present it.
 
Bill said:
I hope I'm not banned for my opinion.
There is this misconception that if it's opinion, it's not only safe to post but its owner has the right to bring it up many times. Not all opinions are equally valid. The underpinnings of yours are quite suspect; they lack in-depth analysis and settle for cursory data. I have about as convincing an opinion about liking apple pie as you do against eDRAM.

As with ANY console, you will have to wait a good long time before you truly see what can be done.
 
I agree; if it were up to me I would have used those transistors for logic and not eDRAM.

But if Microsoft and ATI decided to include it, then it must contribute more to performance than it takes away.
 
Well, most everybody here is probably going to disagree with me.

And there's almost no doubt in my mind I'm right about it anyway.

Oh well.

I ask again, who would favor a 16-pipe RSX with 10 MB eDRAM?

Few people, basically.

You want data? I don't know what you mean.

I'd rather have general-purpose power than eDRAM that seems to serve almost no purpose. Normal GPUs just don't have eDRAM. They're fine. It's simply not needed.

If it ends up PS3 is crippled due to lack of bandwidth and Xenos isn't, then it was worth it. But that probably isn't going to happen.
 
Bill said:
Well, most everybody here is probably going to disagree with me.

And there's almost no doubt in my mind I'm right about it anyway.

Oh well.

I ask again, who would favor a 16-pipe RSX with 10 MB eDRAM?

Nobody, basically.

I guess it's a good thing for Microsoft that Xenos isn't RSX, then.
 
I'll tell you another thing that convinced me.

ATI's utter lack of a coherent explanation of why it's not on PCs.

Uh, 'cause consoles are subsidized?

So??!! All the more reason to get the most performance possible per dollar!
 
Bill said:
Why waste 105 million transistors?
How is it a waste of transistors? Firstly, that figure contains the ROP circuitry - go and find out how many transistors are dedicated to such tasks in a PC GPU; secondly, ask yourself this - how many transistors are in a single DDR-SDRAM memory chip? Now add those two figures together...
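The second question can be made concrete with quick arithmetic. A sketch only: the one-transistor-per-bit DRAM cell model is the standard 1T1C assumption, and the array/logic split below is my estimate, not a published breakdown:

```python
# A DRAM cell stores one bit with one transistor and one capacitor (1T1C),
# so the storage array of a 10 MB eDRAM has a countable transistor cost.
MB = 1024 * 1024
storage_bits = 10 * MB * 8
cell_transistors = storage_bits          # one transistor per bit

TOTAL_QUOTED = 105_000_000               # daughter-die figure from this thread
logic_transistors = TOTAL_QUOTED - cell_transistors   # ROPs, AA/Z logic, interface

print(f"memory cells:    {cell_transistors / 1e6:.1f}M transistors")
print(f"everything else: {logic_transistors / 1e6:.1f}M transistors")
```

So roughly 84M of the 105M would be the memory array itself, before counting any ROP or AA/Z logic at all.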

Bill said:
And there's almost no doubt in my mind I'm right about it anyway.
So what's the point of this thread then? This forum isn't a soap-box for people to rant away - if you want to do that, then go somewhere else.
 
I figure 10 million make up 8 ROPs.

So yeah, they wasted 95 million.

I don't understand your second question.

"So what's the point of this thread then? This forum isn't a soap-box for people to rant away - if you want to do that, then go somewhere else."

I'm just commenting on the situation.
 
If I could have 337 million transistors of logic, I'd ditch the eDRAM so fast...

What is that, 80 ALUs?

Heck, I think I'd rather have even R520 than eDRAM Xenos.

On long shaders it's properly programmed for, it's probably more powerful than RSX, at 321 million transistors.
 
Bill said:
EDRAM was a feature of Gamecube. That was a failure then. Why recycle it?

How much RAM did the GC have? 40MB.

That's ~63% as much RAM as Xbox, and roughly the same as PS2. Gamecube had graphics better than PS2 in titles like Resident Evil, which are comparable to all but the best Xbox games like SC: 3.

Anyway, I don't see how GC was a failure; it had much less third-party support, and with 2/3rds of the RAM it was able to produce graphics that could not be done on a PS2 and were extremely close to any Xbox game.

All at a lower price than either competitor. That sounds like a win to me.

I think I'll trust the ATI and MS engineers; ATI seems to know what it's doing.
 
But they had terrible sales.

Not a win.

I always considered PS2 and GC about equal.

At 32 megs of RAM versus 24 on GC, I might prefer PS2.
 