Jaws said: Sorry, but I'm confused about this thread's facts/rumours/speculation being blurred. The only facts we have are the joint Sony-nVidia press release last December, and maybe any direct quotes from Huang.
So in order to help things along, how about agreeing on certain fundamental points, on a point-by-point basis, as STRONG/WEAK rumours/speculation, and once we've agreed on all the STRONG speculation, then see what can be inferred from them?
I'll start with these points, Strong/Weak rumours,
1. Aggregate two-way CELL<=>GPU bus bandwidth, ~77 GB/s, divided into 12 8-bit lanes. It's asymmetric, with a 7:5 ratio. STRONG.
2. The GPU will have eDRAM. STRONG.
3. Vertex work predominantly done on CELL/ SPE's. STRONG.
4. Deferred rendering. WEAK.
5. Total XDR RAM, 256MB. STRONG.
6. One CELL/8 SPEs will be used as CPU. STRONG.
Please add more key points and once everyone's in agreement then infer what we can from all the STRONG points?
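Point 1's numbers can be sanity-checked with a little arithmetic. The ~77 GB/s aggregate and the 7:5 lane split come from the thread itself; the 6.4 GB/s per-lane rate below is an assumption chosen to make the aggregate work out, not a confirmed spec:

```python
# Sanity check on the rumoured CELL<=>GPU bus figures (point 1).
# Only the ~77 GB/s aggregate and the 7:5 split are from the thread;
# the per-lane rate is an assumption.

LANES = 12
PER_LANE_GBPS = 6.4  # GB/s per 8-bit lane (assumed)

aggregate = LANES * PER_LANE_GBPS   # 76.8 GB/s, i.e. the rumoured ~77
outbound = 7 * PER_LANE_GBPS        # 7 lanes one way: 44.8 GB/s
inbound = 5 * PER_LANE_GBPS         # 5 lanes the other way: 32.0 GB/s

print(f"aggregate: {aggregate:.1f} GB/s")
print(f"7:5 split: {outbound:.1f} GB/s / {inbound:.1f} GB/s")
```

So if the rumour holds, the two directions would carry roughly 44.8 and 32 GB/s respectively.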
Jaws said: 2. The GPU will have eDRAM. STRONG.
nAo said: Strong? :? Is this insider info or just speculation?
Jaws said: So you think it's WEAK?
nAo said: Yeah... Nvidia had too little time to add eDRAM to their design. Even if I would like to be wrong on this...
nAo said: Yeah... Nvidia had too little time to add eDRAM to their design. Even if I would like to be wrong on this...
Jaws said: I suspect something along the lines of NV's TurboCache tech being used instead? And what of all the Sony-Tosh investment in eDRAM tech over the years?
Well, can we infer other things from this key issue? I.e. I'm under the impression (correct me if wrong) that in order to emulate the PS2 successfully, you need more than 48 GB/s of bandwidth somewhere in the system, because of the GS's 48 GB/s. If it's not the actual EE+GS IC, then this bandwidth has to be available somewhere else in the system. And if it isn't available, then the EE+GS IC is a given?
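Jaws' emulation argument can be made concrete. The 48 GB/s GS figure is from the thread; the per-link figures below are illustrative assumptions only (the 7:5 split of the rumoured ~77 GB/s bus, plus a guessed XDR main-memory bandwidth):

```python
# Rough check of the PS2-emulation argument: the PS2's GS had
# ~48 GB/s to its embedded DRAM. Which rumoured PS3 links could
# match that? All link figures here are assumptions, not specs.

GS_EDRAM_GBPS = 48.0  # from the thread

links = {
    "CELL->GPU (7 lanes)": 44.8,  # assumed split of the ~77 GB/s bus
    "GPU->CELL (5 lanes)": 32.0,
    "XDR main memory":     25.6,  # illustrative guess
}

for name, bw in links.items():
    verdict = "enough" if bw >= GS_EDRAM_GBPS else "falls short"
    print(f"{name}: {bw:.1f} GB/s -> {verdict} vs the GS's 48 GB/s")
```

Under these assumed figures no single link reaches 48 GB/s, which is the crux of Jaws' point: without eDRAM (or the EE+GS IC itself), where does that bandwidth come from?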
nAo said: Yeah... Nvidia had too little time to add eDRAM to their design. Even if I would like to be wrong on this...
Acert93 said: eDRAM also takes real estate away from other features. It seems reasonable to assume that the PS3 GPU will be 90nm. The NV40 was 222M transistors.
one said: If it's in 65nm (I think that's a more reasonable guess, seeing various news articles) and it's without vertex shaders and PC-specific features, there's more than enough space for eDRAM.
nAo said: The problem is not space, but time. NVIDIA's deal with Sony was finalized too late...
Jaws said: the Xenon is rumoured to have it...
EDRAM is rumored to be present in Xenon, not eDRAM.
Uttar said: My point is just that we don't even have reliable R520 specs to compare it to, so precise GPU performance information wouldn't do us any good. Of course it's gonna be faster than today's high-end PC GPUs, and of course it's going to be clocked at more than 500MHz. But that doesn't tell you much, now does it?
No, indeed that doesn't tell much.
nAo said: The problem is not space, but time. NVIDIA's deal with Sony was finalized too late...
Then why didn't they choose ATi instead, for better integration of eDRAM? Why is it a licensing deal? Why do they use bulk CMOS only for the GPU?
one said: If it's in 65nm (I think that's a more reasonable guess, seeing various news articles) and it's without vertex shaders and PC-specific features, there's more than enough space for eDRAM.
Acert93 said: I know that 65nm is the goal for CELL, but 65nm gives Sony less than a year to get the GPU there, and I think that may be asking a lot. First, nVidia is just moving to 90nm; asking them to move to 65nm in less than a year (if the PS3 is released in Japan in the spring, meaning they have to have enough supplies stocked for a couple of million units by then) is a pretty big task. Throwing in eDRAM, something nVidia has not worked with to my knowledge, on top of a new process is a lot to ask.
Acert93 said:And if the last 2 GPU generations are an indication, complex chips on a new process have horrid availability.
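The 90nm-vs-65nm disagreement is essentially about die area. A naive scaling sketch: under an ideal shrink, area scales with the square of the node ratio. NV40's 222M transistors and its 130nm node are from the thread; the ~287 mm^2 NV40 die size is an outside assumption, and real shrinks never hit the ideal factor:

```python
# Naive area-scaling sketch behind the 90nm vs 65nm argument.
# Assumption: ~287 mm^2 NV40 die at 130nm; ideal shrink scales
# area by (new_node / old_node)**2. Real processes do worse.

NV40_AREA_MM2 = 287.0  # assumed, not from the thread
OLD_NODE_NM = 130.0

for node in (90.0, 65.0):
    scale = (node / OLD_NODE_NM) ** 2
    print(f"{node:.0f}nm: ideal area ~{NV40_AREA_MM2 * scale:.0f} mm^2 "
          f"(x{scale:.2f} of the 130nm die)")
```

A straight 130nm -> 65nm shrink ideally quarters the die, which is why "65nm leaves room for eDRAM" is plausible on paper; Acert93's counterpoint is that the schedule and yields, not the geometry, are the real problem.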
one said: I've thought it's a licensing deal, and that most of the chip implementation will be done by Sony, just like NEC did the eDRAM for ATi in the GameCube with NEC's own technique, no?
So nVIDIA changed its fab from TSMC to IBM recently.
Acert93 said: The PS3 GPU may very well be at 65nm with eDRAM, but without any official information or hints I think we should be careful about getting ahead of ourselves. Setting high expectations that are not met is kinda like anti-hype. E.g. there was a lot of speculation that CELL would have eDRAM, but that also did not come to pass. If CELL doesn't have eDRAM, why would the GPU?
Acert93 said: And since 65nm is a new process and you are talking about needing millions of units quickly, that is a huge hurdle. It is not about the tech alone, but availability. They have to get this out the door at a realistic price, and the quotes from some analysts at the CELL release talking about $500-$700 PS3s are not realistic. If they come in at that price, they are DOA.