RSX and PR bull?

xbdestroya said:
MS and/or ATI might own the patent to *the* eDRAM, but not eDRAM in general. And since Revolution isn't targeting 720p or anything, no need for the exotic tiling scheme.

They own the patent to the logic within the edram. The edram itself is NEC's.
 
Powderkeg said:
They own the patent to the logic within the edram. The edram itself is NEC's.

Yes yes, but what I'm telling you is that Nintendo would be free to use eDRAM sans on-die logic*; and frankly I don't think they would suffer for it as they should be able to include enough eDRAM to handle the full 480p should they wish to, and focus their 'logic' efforts on the GPU itself. It may not provide the same benefits as Microsoft's approach - but it will certainly provide benefits.

*The specific logic being implemented by ATI and MS.
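To put a rough number on "enough eDRAM to handle the full 480p", here's a quick back-of-envelope sketch. The 640x480 resolution and 32-bit colour/Z formats are my own illustrative assumptions, not anything confirmed for Revolution:

```python
# Rough framebuffer sizing for a 480p target. Illustrative numbers only:
# assume 640x480 with 32-bit colour and a 24/8 Z/stencil buffer.
WIDTH, HEIGHT = 640, 480
BYTES_COLOUR = 4   # e.g. RGBA8
BYTES_DEPTH = 4    # e.g. 24-bit Z plus 8-bit stencil

def framebuffer_mb(width, height, samples=1):
    """On-chip memory (MB) needed for colour + Z at a given multisample level."""
    pixels = width * height * samples
    return pixels * (BYTES_COLOUR + BYTES_DEPTH) / (1024 * 1024)

print(f"480p, no AA: {framebuffer_mb(WIDTH, HEIGHT):.2f} MB")     # ~2.34 MB
print(f"480p, 4x AA: {framebuffer_mb(WIDTH, HEIGHT, 4):.2f} MB")  # ~9.38 MB
```

The no-AA figure is only a couple of megabytes, roughly in line with what the GameCube's embedded framebuffer already offered, which is why holding the whole thing on-die without any Xenos-style tiling seems entirely plausible.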
 
Guys... I was kidding with the Xenos comment... My god, these days I only need one post to create discussions with worldwide repercussions!!!

I think I should run for President.
 
Powderkeg said:
Possible, but doubtful, and definitely not with the edram w/logic that the 360 has. MS owns the patent to that.
Just Googled this and all I'm finding is ATi patents for logic+eDRAM. Thus I think you're mistaken that MS holds the patent and that ATi can't use the same tech in Hollywood, barring internal legal conflicts.
 
xbdestroya said:
Yes yes, but what I'm telling you is that Nintendo would be free to use eDRAM sans on-die logic; and frankly I don't think they would suffer for it as they should be able to include enough eDRAM to handle the full 480p should they wish to, and focus their 'logic' efforts on the GPU itself. It may not provide the same benefits as Microsoft's approach - but it will certainly provide benefits.

No argument about that. It's basically what they've already done with the GCN so I would expect a similar move from the Revolution. It works well enough and must certainly be cheaper than MS's design.
 
Shifty Geezer said:
Just Googled this and all I'm finding is ATi patents for logic+eDRAM. Thus I think you're mistaken that MS holds the patent and that ATi can't use the same tech in Hollywood, barring internal legal conflicts.

Perhaps you are right, and I confused this with the procedural synthesis patents.
 
scooby_dooby said:
This interview??



It's a custom version of the G70; they had been working on the G70 for 18 months at the time of the interview, and it didn't make economic sense to design a custom PS3 GPU when they were designing their new chip anyway.

He does not say they were designing RSX for two years concurrently with the G70; he says RSX is a custom G70, and that the G70 had been in development for 18 months.



So clearly they're doing something to make it work with CELL; it will be interesting to see what happens!

Ken Kutaragi said:
Those who aren't in the know seem to think it's an off-the-shelf PC GPU, but in reality, they are totally different in their architectures.
Was Ken Kutaragi lying through his teeth? If Sony and Nvidia's comments weren't so damn contradictory, we wouldn't be half as confused about the RSX as we currently are.
 
Ken and David can both be correct in their statements; one could easily argue that a chip based mostly on the G70 architecture could still be 'totally different' from a standard video-card GPU. Things can get 'lost in translation', and I could well believe that when Ken was talking about 'architecture' he may not have been referring to the technological base per se; perhaps the I/O interface, the TurboCache, or the RAM setup. So Ken's statement, though opening up the possibility of something exotic, doesn't really do too much to prove it either.
 
Well, how about this... just absolutely crazy talk here, but hear me out. What if the RSX were CELL-based in architectural design, with G70 shaders replacing the SPEs/PPEs? Version posted a picture of something like this way, way back. Yeah, yeah, like I said, crazy talk, BUT it would allow RSX to be based on the G70 and have a completely different architecture at the same time.
 
Alpha_Spartan said:
Was Ken Kutaragi lying through his teeth? If Sony and Nvidia's comments weren't so damn contradictory, we wouldn't be half as confused about the RSX as we currently are.

That's why I think it's best to see through the PR (as much as possible), as that statement from Kutaragi could very well be about the FlexIO he talks about.
 
Maybe we can chalk that up to translation issues... and I'm saying this because Kutaragi-san deserves the benefit of the doubt. But as translated, it doesn't sound like he's talking about one aspect of the chip (i.e. the memory controller interface), but rather about the whole GPU.

In the back of my mind I have a feeling that Sony wants the public to believe that the RSX is a completely different GPU that uses some features of the G70, but other than that it's a new GPU that somehow has the same theoretical peaks as a G70 overclocked to 550 MHz, while in reality the RSX is a slightly modified G70 shrunk to 90 nm and clocked at 550 MHz.
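And the numbers actually support that reading: theoretical peaks are just per-clock throughput times clock speed, so an unmodified G70 design run at 550 MHz reproduces the quoted figures on its own. A quick sketch, using the commonly quoted G70 per-clock figures (treat them as ballpark, not gospel):

```python
# Peak throughput scales linearly with clock: peak = per-clock throughput * frequency.
# The per-clock figure below is the commonly quoted G70 one (24 fragment pipes,
# 2 vec4 MADD ALUs each, i.e. 16 flops per pipe per clock); ballpark only.
G70_CLOCK_MHZ = 430   # stock 7800 GTX core clock
RSX_CLOCK_MHZ = 550   # quoted RSX clock

FRAGMENT_FLOPS_PER_CLOCK = 24 * 2 * 4 * 2   # pipes * ALUs * vec4 * MADD (2 flops) = 384

def peak_gflops(clock_mhz, flops_per_clock=FRAGMENT_FLOPS_PER_CLOCK):
    """Peak programmable fragment-shader GFLOPS at a given core clock."""
    return clock_mhz * 1e6 * flops_per_clock / 1e9

print(f"G70 @ 430 MHz: {peak_gflops(G70_CLOCK_MHZ):.0f} GFLOPS")  # ~165
print(f"RSX @ 550 MHz: {peak_gflops(RSX_CLOCK_MHZ):.0f} GFLOPS")  # ~211
```

In other words, nothing in the quoted peaks requires anything more exotic than a clock bump.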
 
REYES pipeline!

We know the 136 inst/cycle figure matches the E3 numbers, but the DOTs/cycle don't. There are NV patents for geometry shaders, programmable primitive processors, etc. The 24 PS units would largely remain unmodified, but the vertex/geometry pipeline and triangle setup would get modified so that micro-polygons can be shaded like conventional fragments using the PS units!

*Warning: Extreme speculation!*
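For anyone unfamiliar with the term, the core REYES idea is just this: chop each primitive into sub-pixel micro-polygons, then shade those micro-polygons with the same kind of program you'd run on a fragment. A toy sketch of the dicing/shading step (purely illustrative, nothing RSX-specific):

```python
# Toy illustration of the REYES dice-and-shade idea: split a patch into
# sub-pixel "micro-polygons", then run an ordinary pixel-shader-style function
# on each one. Purely conceptual; nothing here is RSX-specific.
from dataclasses import dataclass

@dataclass
class MicroPolygon:
    x: float   # screen-space centre of the micro-polygon
    y: float
    u: float   # parametric coordinates, fed to the shader as interpolants
    v: float

def dice(x0, y0, x1, y1, max_size=0.5):
    """Dice an axis-aligned patch into micro-polygons no larger than max_size pixels."""
    nx = max(1, int((x1 - x0) / max_size))
    ny = max(1, int((y1 - y0) / max_size))
    for j in range(ny):
        for i in range(nx):
            yield MicroPolygon(
                x=x0 + (i + 0.5) * (x1 - x0) / nx,
                y=y0 + (j + 0.5) * (y1 - y0) / ny,
                u=(i + 0.5) / nx,
                v=(j + 0.5) / ny,
            )

def shade(mp):
    """Stand-in 'pixel shader'; in the speculation above, the 24 PS units would run this."""
    return (mp.u, mp.v, 1.0 - mp.u)   # simple UV gradient as the 'colour'

micropolys = list(dice(0.0, 0.0, 4.0, 4.0))
colours = [shade(mp) for mp in micropolys]
print(f"{len(micropolys)} micro-polygons shaded")  # 64 for a 4x4 pixel patch at 0.5 px dicing
```

The speculation above amounts to doing the dicing in a modified setup/geometry stage and leaving the shading to the existing fragment hardware.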
 
Jaws said:
*Warning: Extreme wishful thinking!*
fixed :p

If RSX is to have redundancy built in, won't we be seeing some extra pipes just waiting around to cover for manufacturing defects, otherwise going unused?
 
That could easily be an extra quad or something else, though. Going to 90nm, it should be easy enough. Not to mention the 'shadow transistors' that might already be present on G70. ;)

(Not saying I immediately buy into the REYES theory - but I like the spice!)
 
Shifty Geezer said:

It ain't over 'till the fat lady sings!

Shifty Geezer said:
If RSX is to have redundancy built in, won't we be seeing some extra pipes just waiting around to cover for manufacturing defects, otherwise going unused?

Not sure what you mean?
 
Alpha_Spartan said:
In the back of my mind I have a feeling that Sony wants the public to believe that the RSX is a completely different GPU that uses some features of the G70, but other than that it's a new GPU that somehow has the same theoretical peaks as a G70 overclocked to 550 MHz, while in reality the RSX is a slightly modified G70 shrunk to 90 nm and clocked at 550 MHz.

Well, that's what I believe too.

The only thing I also know is that nVidia never gives away specs about future parts.
One example is when they talked about their "refresh" part to an analyst and said we shouldn't expect any major performance increase. But the G70 gave quite a lot of that, in my mind.

So, in my own interest of keeping up, I'm arguing more from what I believe the RSX is, and trying to see what the changes would be if it isn't, etc. Of course, one thing to watch is whether nVidia releases a 90nm G70 to counter R580; if they do, my guess would lean towards that part.
 
Shifty's referring to an allusion by Kutaragi to 'redundancy' being built into RSX, similar to the way it's built into Cell in order to improve yields.

@overclocked: G72
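To make the yield argument concrete, here's a toy redundancy model in the spirit of Cell's spare SPE. Every number in it (die areas, defect density, quad count) is made up purely for illustration; none of it is a real RSX or Cell figure:

```python
# Toy "spare unit" yield model. All areas, densities and counts are invented
# for illustration; none are real RSX or Cell numbers.
import math

def zero_defect_prob(area_cm2, defects_per_cm2):
    """Probability a block of the given area has zero random defects (Poisson model)."""
    return math.exp(-area_cm2 * defects_per_cm2)

def chip_yield(quad_area, other_area, n_quads, n_required, defect_density):
    """Yield when only n_required of the n_quads on the die need to be defect-free."""
    p_quad = zero_defect_prob(quad_area, defect_density)
    p_other = zero_defect_prob(other_area, defect_density)
    # Binomial sum over the acceptable numbers of good quads.
    p_enough = sum(
        math.comb(n_quads, k) * p_quad**k * (1 - p_quad)**(n_quads - k)
        for k in range(n_required, n_quads + 1)
    )
    return p_other * p_enough

# Need 6 working quads: compare a die with exactly 6 against one carrying a 7th spare.
print(f"no spare : {chip_yield(0.2, 1.0, 6, 6, 0.5):.1%}")   # ~33%
print(f"one spare: {chip_yield(0.2, 1.0, 7, 6, 0.5):.1%}")   # ~52%
```

The spare quad costs some die area but buys a big jump in usable chips, which is presumably the whole point of the 'built-in redundancy' Kutaragi was alluding to.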
 
You are forgetting that RSX may well lack the whole MPEG decoder logic, which is about 20 million transistors alone (and some other PC-only features).

Despite that, it has many more transistors than the current G70. :rolleyes:
 