Could the RSX be more powerful than originally thought?

BOOMEXPLODE said:
The "old" news and it's thread is here. Presumably all this discussion as it relates to G70 could take place there.

Steering things back on course: how could this "rumour" relate to RSX? Say G70 really does have 32 pipelines, and hypothetically the other 8 HAD to be disabled to achieve reasonable yields. Wouldn't the same be true for RSX as well?

Oh I know where the thread is BOOM, I'm all up in it. ;)

Anyway, the eight pipes did have to be disabled for reasonable yields, yes. But with the caveat that a 32-pipe design was always there on the die, allowing NVidia to collect 'perfect' chips in order to prepare their eventual launch of a 32-pipe part.

How does it relate to RSX? OK, for a minute let's just assume that G70 = RSX and forget everything else we know. Now, at 90nm yields should improve after a point; add to that the fact that, since Sony has no intention of bringing out a 'full' RSX, they also do not need to differentiate it the way PC card manufacturers must in order to justify highly divergent price points. That means RSX would only need the bare minimum of 'redundant' architecture disabled, in all likelihood a single quad, giving 28 pixel pipes.
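
Just to put some rough numbers on the redundancy argument, here's a quick back-of-the-envelope sketch; the defect density, die area, and quad count below are purely illustrative assumptions, not real RSX or G70 figures:

```python
import math

# Illustrative assumptions only -- not real RSX/G70 numbers.
defect_density = 0.5   # defects per cm^2 (assumed)
die_area_cm2   = 3.0   # a ~300 mm^2 die (assumed)
num_quads      = 8     # a hypothetical 32-pipe part = 8 quads of 4 pipes

def defect_free(area_cm2):
    """Simple Poisson yield model: chance a region of this area has no defects."""
    return math.exp(-defect_density * area_cm2)

# Pretend all defects land in the quads, split evenly between them.
p_quad_ok = defect_free(die_area_cm2 / num_quads)

# Yield if all 8 quads must work (a 'full' 32-pipe part)...
yield_all_8 = p_quad_ok ** num_quads
# ...versus yield if one dead quad is tolerable (sell it as a 28-pipe part).
yield_7_of_8 = yield_all_8 + num_quads * p_quad_ok ** 7 * (1 - p_quad_ok)

print(f"perfect dice:        {yield_all_8:.1%}")
print(f"7-of-8 quads usable: {yield_7_of_8:.1%}")
```

With those made-up numbers, tolerating a single dead quad more than doubles the number of sellable dice, which is the whole appeal of shipping a part with one quad fused off.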

G70, on the other hand, would need 8 pipes disabled for several possible reasons; from a business standpoint, not the least of them is that it's harder to charge hundreds more for a 32-pipe card when the in-family competition is a 28-pipe card than when it's a 24-pipe card.
 
Well, G70 has some performance headroom left in its clock speed, too. Presumably when R520 finally gets here there will be a refresh of the 7800 with higher clock speeds and faster memory. And maybe 32 pipes.

Assuming all that is true, and G70 has some pipes disabled, it does seem likely that Sony in 2006 would have less of a need (if any) to disable pipes in the RSX. And maybe that explains why any mention of fill-rate was conspicuously absent from the PS3 announcement: maybe they are not exactly sure how many pipes they would have to disable (if any), and this is something that's still up in the air, to be determined by the fabs' performance. Of course, that's a whole lot of speculation!
 
Regarding the yields, you have to consider that this is a part Sony will use for 5+ years; on a 90nm process that has matured in their fabs, it will be no problem spitting out dies. Remember that 90nm will be at the end of its cycle when they fab these, and a 65nm shrink is underway and planned for BOTH RSX and CELL.
 
overclocked said:
Regarding the yields, you have to consider that this is a part Sony will use for 5+ years; on a 90nm process that has matured in their fabs, it will be no problem spitting out dies. Remember that 90nm will be at the end of its cycle when they fab these, and a 65nm shrink is underway and planned for BOTH RSX and CELL.

I agree with the 65nm logic and the idea that Sony might go for broke with that in mind, especially since Nagasaki is focused around that process and it will be the primary fab building these things, but 90nm isn't mastered yet to the degree implied by 65nm's looming onset. For example, we have yet to see a single GPU come out on 90nm (unless you count Sony's GS!). Further indication of complications in even 90nm at this point might be reflected in the troubles ATI is having in steadying their yields pre-R520 launch.
 
Isn't it possible for the PS3 chips to be the fully functional chips, and the chips with disabled quads to become the PC parts?
 
With RSX having Flex-IO right on it, there's a pretty big difference right there compared to the G70 part. Plus, binning huge 110nm parts for possible PS3 use doesn't make much sense: 90nm parts will be cheaper, run cooler, and leave more leeway to choose pipeline specs, since there are more available cores on the wafer to pick from. TSMC makes the desktop chips; RSX is a Sony-specific part that they themselves will manufacture. Getting 90nm up and ramped to production tempo is the thing, though; it's certainly giving ATi grief with the R520 situation.
 
xbdestroya wrote:

I agree with the 65nm logic and the idea that Sony might go for broke with that in mind, especially since Nagasaki is focused around that process and it will be the primary fab building these things, but 90nm isn't mastered yet to the degree implied by 65nm's looming onset. For example, we have yet to see a single GPU come out on 90nm (unless you count Sony's GS!). Further indication of complications in even 90nm at this point might be reflected in the troubles ATI is having in steadying their yields pre-R520 launch.

Well, all fabs have yield issues at the beginning.
Think about this: you have the "new" PS2 built on 90nm, and you have the PSP on 90nm. All of these exist NOW, so I'd say that's pretty good confirmation of the semiconductor abilities Sony has NOW on 90nm.
If you think a whole freaking year further ahead, you will see that 90nm will be as mature as it gets.

For the record, I believe the transition to 65nm will happen in late 2006 and ship the following quarter.

Edit: I don't know what you mean by "Sony go broke"...
 
Hey guys, what is the advantage for Sony in going to 65nm tech for RSX and CELL in the future? What would it do, make the machine smaller?
 
mckmas8808 wrote:

Hey guys, what is the advantage for Sony in going to 65nm tech for RSX and CELL in the future? What would it do, make the machine smaller?

Cheaper, less heat, etc... Think roughly twice the chip output from the same wafer.
 
overclocked said:
mckmas8808 wrote:

Hey guys, what is the advantage for Sony in going to 65nm tech for RSX and CELL in the future? What would it do, make the machine smaller?

Cheaper, less heat, etc... Think roughly twice the chip output from the same wafer.
And thus better yields, since a "bad" chip at 65nm wastes less of the wafer than a 90nm one.
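
A crude sketch of that point; again, the wafer size, die area, and defect density below are assumptions picked for illustration, and the area scaling of a 90nm-to-65nm shrink is idealized:

```python
import math

WAFER_DIAMETER_MM = 300   # assumed 300 mm wafers
DEFECT_DENSITY    = 0.5   # defects per cm^2 (assumed)

def gross_dies_per_wafer(die_area_mm2):
    """Very rough gross die count; ignores edge losses and scribe lines."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area / die_area_mm2)

def poisson_yield(die_area_mm2):
    """Chance a single die is defect-free under a simple Poisson model."""
    return math.exp(-DEFECT_DENSITY * die_area_mm2 / 100)  # mm^2 -> cm^2

die_90nm = 300.0                        # hypothetical ~300 mm^2 die at 90nm
die_65nm = die_90nm * (65 / 90) ** 2    # ideal shrink, roughly 156 mm^2

for label, area in (("90nm", die_90nm), ("65nm", die_65nm)):
    gross = gross_dies_per_wafer(area)
    good  = int(gross * poisson_yield(area))
    print(f"{label}: ~{gross} gross dies/wafer, ~{good} good dies/wafer")
```

Smaller dies mean both more candidates per wafer and a smaller chance that any given die catches a defect, which is why the good-die count improves by quite a bit more than 2x in this toy example.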
 
overclocked said:
Well, all fabs have yield issues at the beginning.
Think about this: you have the "new" PS2 built on 90nm, and you have the PSP on 90nm. All of these exist NOW, so I'd say that's pretty good confirmation of the semiconductor abilities Sony has NOW on 90nm.
If you think a whole freaking year further ahead, you will see that 90nm will be as mature as it gets.

For the record, I believe the transition to 65nm will happen in late 2006 and ship the following quarter.

Edit: I don't know what you mean by "Sony go broke"...

LOL, I think you and I agree, overclocked - we're just talking about different things. The jump from 90nm to 65nm is fairly aggressive this cycle, and though a lot of firms are building on the process and have been for a while, some others, like the graphics companies, have yet to tread down the path. That's because their product cycle is so short and tight that when something goes wrong it can derail an entire generation. That's why NVidia, since NV30, has gone the route of launching top chips on mature processes. ATI seems a little more risk-oriented in that regard; it has its benefits and drawbacks.

I agree that any other sort of long-life product is best served by moving to smaller processes as soon as possible and just working out any growing (or shrinking) pains along the way. Intel and AMD chips are great examples, as are memory chips and, of course, console chips. Server chips, though, tend to be an example of the opposite: manufacturers tend to wait to adopt a shrink, for yield-guarantee reasons.

As for 'go for broke,' that's just American slang by which I meant: Sony has little to lose in pushing a more aggressive system spec for RSX for the PS3 launch; even if the initial yields might be terrible, the long-term benefits for the console might justify the short wait for 65nm. 8)
 
Can nVidia have divided loyalties?

On the one hand, they have to provide a good product to showcase their technology on the PS3.

But on the other hand, not make it too good so that people bypass buying their $600 (!) cards.

Who would decide the ultimate pipeline configuration for the RSX, Sony or nVidia?
 
wco81 said:
Can nVidia have divided loyalties?

On the one hand, they have to provide a good product to showcase their technology on the PS3.

But on the other hand, not make it too good so that people bypass buying their $600 (!) cards.

Who would decide the ultimate pipeline configuration for the RSX, Sony or nVidia?

I think NVidia would decide, and Sony would approve or disapprove for their PS3, within the range of action allowed by their agreement.

NVidia isn't at risk of losing PC gamers willing to spend the cash, because the two areas are just so divergent.

(I don't think their loyalties risk being overly divided, if at all)
 
wco81 said:
Can nVidia have divided loyalties?

On the one hand, they have to provide a good product to showcase their technology on the PS3.

But on the other hand, not make it too good so that people bypass buying their $600 (!) cards.

Who would decide the ultimate pipeline configuration for the RSX, Sony or nVidia?

Really doesn't matter. Sony's paying NVidia a royalty per chip and is fabbing the chip themselves. They can do what they want for the most part in terms of pricing and, at some level, deciding what chip yields from the pipeline standpoint are acceptable.
 
LOL, I think you and I agree, overclocked - we're just talking about different things. The jump from 90nm to 65nm is fairly aggressive this cycle, and though a lot of firms are building on the process and have been for a while, some others, like the graphics companies, have yet to tread down the path. That's because their product cycle is so short and tight that when something goes wrong it can derail an entire generation. That's why NVidia, since NV30, has gone the route of launching top chips on mature processes. ATI seems a little more risk-oriented in that regard; it has its benefits and drawbacks.

I agree that any other sort of long-life product is best served by moving to smaller processes as soon as possible and just working out any growing (or shrinking) pains along the way. Intel and AMD chips are great examples, as are memory chips and, of course, console chips. Server chips, though, tend to be an example of the opposite: manufacturers tend to wait to adopt a shrink, for yield-guarantee reasons.

As for 'go for broke,' that's just American slang by which I meant: Sony has little to lose in pushing a more aggressive system spec for RSX for the PS3 launch; even if the initial yields might be terrible, the long-term benefits for the console might justify the short wait for 65nm.

Ok got your POV :D

I'm only saying that many believe 90nm is not mature, but to me it will be as mature as it gets when Sony launches the PS3. AND the transition to 65nm won't be as painful, because you, or rather Sony, have already covered that with the non-tight timeframe.
 
Really doesn't matter. Sony's paying NVidia a royalty per chip and is fabbing the chip themselves. They can do what they want for the most part in terms of pricing and, at some level, deciding what chip yields from the pipeline standpoint are acceptable.

Last I heard it was $5 per chip...
 
xbdestroya said:
wco81 said:
Can nVidia have divided loyalties?

On the one hand, they have to provide a good product to showcase their technology on the PS3.

But on the other hand, not make it too good so that people bypass buying their $600 (!) cards.

Who would decide the ultimate pipeline configuration for the RSX, Sony or nVidia?

I think NVidia would decide, and Sony would approve or disapprove for their PS3, within the range of action allowed by their agreement.

NVidia isn't at risk of losing PC gamers willing to spend the cash, because the two areas are just so divergent.

(I don't think their loyalties risk being overly divided, if at all)
I agree... the markets seem to be somewhat separate... although, IMO, if keyboard and mouse peripherals ever gain a large user base on any console, I believe there will be a huge number of PC gamers flocking over to said console.
 
Brick_top said:
DaveBaumann said:
Ironic that someone here links to a story on another site that quite obviously originates from this site!

I had the same thought just before I saw your post
:D

This has been irking me for a while, so I feel that, as involved as I am in this thread, it's only right to address it, even at the risk of losing some of the anonymity we all value on the Internet. ;)

Anyway, I know we all have a 'center of the universe' attitude here at B3D, myself included, but I actually got the idea to write that article (yes, I am the author!) a day or two after the chip was revealed. Sorry Dave, but I noticed the discrepancy between die size and transistor count on my own. :p 110nm, 302 million transistors, and a 334mm^2 die area - it's just weird!
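
For anyone who wants to redo that sanity check, the arithmetic is short; note that the NV40 figures below are from my own recollection (roughly 222 million transistors on a ~287 mm^2 die at 130nm), so treat them as assumptions:

```python
# Rough sanity check on G70's announced numbers. The NV40 figures are
# from memory and may be a bit off -- treat them as assumptions.

g70_transistors_m = 302      # million, from the announcement
g70_die_mm2       = 334      # mm^2, from the announcement
g70_process_nm    = 110

nv40_transistors_m = 222     # million (assumed)
nv40_die_mm2       = 287     # mm^2 (assumed)
nv40_process_nm    = 130

nv40_density = nv40_transistors_m / nv40_die_mm2          # Mtrans per mm^2
# Idealized density gain from a 130nm -> 110nm linear shrink
expected_density = nv40_density * (nv40_process_nm / g70_process_nm) ** 2
expected_die = g70_transistors_m / expected_density

print(f"G70 density as announced:     {g70_transistors_m / g70_die_mm2:.2f} Mtrans/mm^2")
print(f"Density expected from a shrunk NV40: {expected_density:.2f} Mtrans/mm^2")
print(f"Die size 302M transistors 'should' need: ~{expected_die:.0f} mm^2 vs 334 mm^2 announced")
```

Even with idealized scaling, 302 million transistors 'should' fit in noticeably less than 334 mm^2, which is exactly the kind of gap that extra, disabled hardware would explain.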

Well, so there I was with this theory when Dave posted his news brief. What this did for me was two things: first, it scared me into getting my article out, since I was worried it would tip others off and I would lose the story to some other PlayStation or gaming site (PSINext's competitors, of course); and secondly, in the thread here in the news section discussing it, a post by Weeds (viewable here) seemed to confirm through driver info something I had been wondering about. I already had the die-area oddity, and for a high-end part it didn't make any sense for the GTX not to be a dual-slot, higher-clocked card - especially if this was really going to be their top part. But Weeds' post of the driver info put the missing third support into place, which I felt made the theory fully 'legitimate.'

To go further, my background is more business oriented than it is tech (though certainly I hold my own), and I really feel that some of the great business aspects I raised in the article have gone sadly unnoticed. :cry:

This in particular:

In the meantime, what launching with a crippled G70 offers NVidia is the opportunity to command the higher price point allowed to the #1 graphics card, and to significantly reduce yield issues by needing only to focus on getting chips out the door with 24 of the 32 pipelines functional. This scenario would also explain why this is one of the few chip/card combinations in recent memory to be available immediately at launch. With the R520 currently experiencing manufacturing troubles and expected to launch late this summer, this gives NVidia the chance to exploit the $600 price point with the GTX until R520 launches, and then counter with a 'full' G70 card to maintain that price point, and hopefully the performance lead.

I could have gone deeper into it I guess, but I can't emphasize enough what this will do for NVidia's GPU margins - it's crazy.

First, let's just forget about extra pipes and everything and focus on the notion that there will be an 'Ultra' or 7900 card released in the future. The yields of this chip will necessarily be lower than those of the GTX variant, and it likely would have occupied the $600 price point, with the GTX likely at $500 or thereabouts. What this means for NVidia is that they can garner those higher margins on what is, in fact, a fairly high-volume, non-top-end part for them. Then when the ATI cards launch in September or whenever, they can launch the 7900 Ultra Extreme (yeah, I made that up) to have a $600+ part out in the field, and drop the 7800 GTX to ~$450-500 - the price it would normally have launched at anyway! ATI just won't be able to exploit the same price buffer unless the R520 launches and utterly dominates the 7900.

Three months of insane profits is what the GTX is going to represent for NVidia. It cannot be overstated. Go to Newegg - people are buying these things! The reviews are pouring in, and the cards are still available. Ultras and XTPEs never existed in these volumes, never commanded these prices (on the MSRP side); it's like the perfect storm.

Ok I'll end my babbling here - my pride just couldn't take it any longer. 8)
 
mckmas8808 said:
Hey guys, what is the advantage for Sony in going to 65nm tech for RSX and CELL in the future? What would it do, make the machine smaller?

Yeah. PSThree 8)
 