MS big screw up: EDRAM

Shifty Geezer said:
Cell definitely. Xenos still uses shaders in the same way PCs have been using them for years, and there are very finite uses on very definite, well-known data structures. Cell is so open-ended that even data structure access can be modelled differently: the interoperability between SPEs, the management of code chunks and buffered data, the creation of algorithms that work in a stream-friendly way...
I think Cell is at least better than the Emotion Engine in that regard, with better documentation and samples, and C/C++ friendliness. Also, in the next month (or next week?) the Cell simulator will be released publicly, so you may see some grass-roots research in the near future.
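For anyone curious what "stream-friendly" means in practice, here's a minimal Python sketch of the double-buffered pattern SPE code leans on (illustration only; real SPE code would issue mfc_get/mfc_put DMAs into the 256 KB local store):

Code:
# Minimal sketch of the double-buffered streaming pattern Cell SPE code
# leans on: start fetching chunk n+1 while computing on chunk n.
# Plain Python stand-ins; on a real SPE the "fetch" would be mfc_get/mfc_put
# DMAs into the 256 KB local store.
def stream_process(chunks, compute):
    results = []
    buf = [None, None]
    buf[0] = chunks[0]                 # "DMA in" the first chunk
    for n in range(len(chunks)):
        if n + 1 < len(chunks):
            buf[(n + 1) % 2] = chunks[n + 1]   # prefetch (would overlap compute)
        results.append(compute(buf[n % 2]))
    return results

print(stream_process([[1, 2], [3, 4], [5, 6]], sum))  # -> [3, 7, 11]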
 
I would maintain the goal for a GPU is simple: put out the best possible graphics per transistor budget

No, the goal should be to put out the best possible graphics within a certain monetary budget. Cost does not always increase linearly with the number of transistors, and that is a key point you need to realise. RAM takes up less die area than a similar number of transistors of other logic. The eDRAM may account for 80 million transistors, but that doesn't mean you could include 80 million transistors of vertex/pixel shaders etc. in its place at the same cost.
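To make that concrete, a back-of-the-envelope sketch; both density figures are purely illustrative assumptions, not real 90 nm process numbers:

Code:
# Back-of-the-envelope die-area comparison. Both density figures are
# illustrative ASSUMPTIONS, not actual 90 nm process numbers; the point
# is only that dense DRAM cells and random logic cost different area
# per transistor.
EDRAM_XTORS_PER_MM2 = 4.0e6   # assumed: 1T1C eDRAM packs very densely
LOGIC_XTORS_PER_MM2 = 1.0e6   # assumed: shader/control logic is much sparser

def die_area_mm2(transistors, density_per_mm2):
    return transistors / density_per_mm2

budget = 80e6  # the ~80M transistors cited for the eDRAM
print(die_area_mm2(budget, EDRAM_XTORS_PER_MM2))   # 20 mm^2 as eDRAM
print(die_area_mm2(budget, LOGIC_XTORS_PER_MM2))   # 80 mm^2 as shader logic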
 
dukmahsik said:
I wonder which has a steeper learning curve... Cell or Xenos?

Just look at the results of what developers have accomplished with the two sets of hardware so far. It should be crystal clear.

Although on an x86-oriented site like this one, "steeper learning curve" is assumed to mean anything not "like an x86 PC running DirectX."
 
Xbox 360 is dead.

All hail Xbox 360.

Instant Ban!

Six pages arguing eDRAM's application to anti-aliasing, and I bet not a single poster can even come close to writing anti-aliasing algorithms. Leave the programming to those who know how. (PS to Bill: mentioning actual code is also an instant ban, threat, and post erasure by board ops; but I am hoping that, being page six, it will not be noticed.)
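For what it's worth (ban incoming), the resolve at the heart of the AA debate is conceptually just an average. A toy Python sketch, illustration only, since the real Xenos resolve runs in fixed-function hardware on the eDRAM daughter die as the buffer is copied out to RAM:

Code:
# Toy 4x-supersample resolve (straight box filter), for illustration only.
def resolve_4x(samples, width, height):
    # `samples` is a flat list of (2*width) * (2*height) sub-sample values;
    # average each 2x2 block down to one output pixel.
    out = []
    for y in range(height):
        for x in range(width):
            acc = 0.0
            for sy in range(2):
                for sx in range(2):
                    acc += samples[(2 * y + sy) * (2 * width) + (2 * x + sx)]
            out.append(acc / 4.0)
    return out

print(resolve_4x([0, 0, 1, 1,
                  0, 0, 1, 1,
                  1, 1, 0, 0,
                  1, 1, 0, 0], 2, 2))  # -> [0.0, 1.0, 1.0, 0.0]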

:rolleyes:
 
Bill said:
"Bandwidth is a problem if you do a lot of post-processing effects and more if you share it with the cpu."

True. But they could have done two 128-bit busses like PS3.

I understand; 256-bit seemed too expensive.
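For reference, peak bus bandwidth is just width times transfer rate. A quick sketch; the clocks are the commonly cited retail figures, taken as assumptions:

Code:
# Peak bandwidth of a DDR-style bus = (width in bytes) x (transfers/sec).
# The clock figures below are the commonly cited retail numbers; treat
# them as assumptions for illustration.
def peak_bandwidth_gb_s(bus_bits, transfers_per_s):
    return (bus_bits / 8) * transfers_per_s / 1e9

# Xbox 360 unified GDDR3: 128-bit bus at 700 MHz, double-pumped (1400 MT/s)
print(peak_bandwidth_gb_s(128, 1.4e9))   # ~22.4 GB/s, shared with the CPU
# A hypothetical 256-bit bus at the same clock:
print(peak_bandwidth_gb_s(256, 1.4e9))   # ~44.8 GB/s, but more pins/board cost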

Not trying to be militant.

Here's the simple point an EDRAM defender needs to prove: that not having it would have otherwise crippled the system. Or show bandwidth-limited benchmarks at around 720p.

Otherwise, it does nothing.
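As a rough sanity check on that challenge, here's what raw framebuffer traffic at 720p looks like under some assumed overdraw and AA settings (real traffic depends on Z-culling, compression, and blending):

Code:
# Rough framebuffer-traffic estimate at 720p. Overdraw, fps, and AA level
# are assumptions here; real traffic depends on Z-culling, compression,
# and blending. This only puts the numbers on a scale.
def fb_traffic_gb_s(w, h, fps, overdraw, msaa=1, bytes_color=4, bytes_z=4):
    # per drawn sample: one color write, one Z read, one Z write
    per_frame = w * h * msaa * overdraw * (bytes_color + 2 * bytes_z)
    return per_frame * fps / 1e9

print(fb_traffic_gb_s(1280, 720, 60, overdraw=3))           # no AA: ~2 GB/s
print(fb_traffic_gb_s(1280, 720, 60, overdraw=3, msaa=4))   # 4xAA: ~8 GB/s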

Let me put it another way: if the X360 is not putting out better graphics than the PS3, the EDRAM failed, plain and simple.

After all, MS paid for more transistors in the GPU than Sony did.
But MS isn't mandating that all next-gen games on the system must look next-gen. They can't afford to.
 