Any chance the PS3 GPU is based on the gen after G70?

Almasy

I've been thinking: it has been rumored that the next-generation card from NVidia will hit retailers in June. I don't really follow graphics card makers' roadmaps, but if the RSX is indeed based on it, wouldn't that make it slightly outdated if the PS3 is indeed slated to launch in spring 2006? What are the chances the RSX incorporates technology from G70's successor, much like the NV2A featured in the Xbox?

:?: :?:
 
Almasy said:
I've been thinking: it has been rumored that the next-generation card from NVidia will hit retailers in June. I don't really follow graphics card makers' roadmaps, but if the RSX is indeed based on it, wouldn't that make it slightly outdated if the PS3 is indeed slated to launch in spring 2006? What are the chances the RSX incorporates technology from G70's successor, much like the NV2A featured in the Xbox?

:?: :?:

I think there's a decent chance, but if you go to the poll on the same subject you'll readily see that I'm in a minority. ;)
 
xbdestroya said:
Almasy said:
I've been thinking: it has been rumored that the next-generation card from NVidia will hit retailers in June. I don't really follow graphics card makers' roadmaps, but if the RSX is indeed based on it, wouldn't that make it slightly outdated if the PS3 is indeed slated to launch in spring 2006? What are the chances the RSX incorporates technology from G70's successor, much like the NV2A featured in the Xbox?

:?: :?:

I think there's a decent chance, but if you go to the poll on the same subject you'll readily see that I'm in a minority. ;)

I saw the topic, but I didn't want to focus on making a poll, but rather on the possibility of what I proposed being a reality, and the reasons for it.

To V3:

Oh, I really, really don't see the RSX being a downgraded G70. I don't know much about this, but looking at the Xbox relative to the state of graphics cards at the time, I don't see how that could be a possibility.
 
V3 said:
Well, let's just hope the RSX is not a downgraded G70.
I think there is a decent chance that it will be an 'upgraded' G70.

G70 = 110nm.
RSX = 90nm.

G70 = ~430MHz (supposedly)
RSX = 550MHz

A die shrink should allow some increase in clock speed, so even if they are the same core, I imagine the clock speed increase will help a good deal.

I think the best info we can hope for in the next week is the transistor count of the G70 for the PC (with Computex we might end up with that info). If it's also ~300M (or less), then we can assume that the RSX is >= the G70 clock for clock (and with the ~120MHz speed increase it'll be even more). I realize transistor count doesn't exactly equate to performance, but in this situation it's an acceptable comparison.
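
As a rough back-of-the-envelope sketch (Python) of what the clock bump alone would buy, assuming the rumored 430MHz PC clock holds and the cores really are the same clock for clock:

# Rumored/announced clocks; only meaningful if the cores are otherwise
# identical, which is exactly the open question here.
g70_clock_mhz = 430   # rumored PC G70 clock
rsx_clock_mhz = 550   # figure from Sony's slides

speedup = rsx_clock_mhz / g70_clock_mhz
print(f"Clock bump: {rsx_clock_mhz - g70_clock_mhz} MHz "
      f"(~{(speedup - 1) * 100:.0f}% higher throughput, all else equal)")
# -> Clock bump: 120 MHz (~28% higher throughput, all else equal)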
 
Bobbler, the smaller process may simply be used to increase yields, lower power requirements, and lower heat output, plus allow an increase in MHz.
 
Bobbler said:
V3 said:
Well, let's just hope the RSX is not a downgraded G70.
I think there is a decent chance that it will be an 'upgraded' G70.

G70 = 110nm.
RSX = 90nm.

G70 = ~430MHz (supposedly)
RSX = 550MHz

They should stay at 430MHz at 90nm. The G70 is very power hungry for a console.
 
jvd said:
Bobbler, the smaller process may simply be used to increase yields, lower power requirements, and lower heat output, plus allow an increase in MHz.

Isn't that what I said, essentially?

The reduction in die size usually leads to increased yields (due to more dies per wafer), lower power consumption (usually; power leakage was, hopefully, taken into account in the design of the GPU), and lower heat output is a given. Those are pretty much the core reasons for reducing die size, along with increasing clock speed. None of the points either of us mentioned are mutually exclusive.

So, I'll assume you agree with me and just had a funny way of saying it ;)
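
To put some idealized numbers on the die-shrink point, here's a sketch assuming a straight linear shrink from 110nm to 90nm, which real designs never quite achieve:

# Idealized 110nm -> 90nm shrink: scale each dimension by 90/110.
old_node_nm, new_node_nm = 110, 90

linear_scale = new_node_nm / old_node_nm   # ~0.82 per dimension
area_scale = linear_scale ** 2             # ~0.67 of the original die area

print(f"Die area: ~{area_scale:.0%} of the 110nm die")
print(f"Candidate dies per wafer: ~{1 / area_scale:.2f}x as many")
# A smaller die also gives each chip fewer chances to catch a defect,
# so functional yield improves on top of the raw die count.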

Apoc said:
They should stay at 430MHz at 90nm. The G70 is very power hungry for a console.

Power consumption of the G70 is currently unknown. The 'info' that's been released has been speculative at best. It has the same power envelope as the current-gen GPUs (~75W from the bus and ~75W from the plug). That info is, of course, also assuming that the G70 and the RSX are identical.

Stating they should stay at 430MHz doesn't even make sense with the information that's available at this point. Power consumption should go down a decent amount with 90nm, and, let's be honest here, as long as heat isn't prohibitive, power consumption on a console is moot. Heat will be the main factor in the clock speeds of all the chips going into the consoles (that, and yields, of course).
 
What are the chances the RSX incorporates technology from G70's successor, much like the NV2A featured in the Xbox?
I doubt it... RSX is a late-to-the-party project.
 
Isn't that what I said, essentially?

The reduction in die size usually leads to increased yields (due to more dies per wafer), lower power consumption (usually; power leakage was, hopefully, taken into account in the design of the GPU), and lower heat output is a given. Those are pretty much the core reasons for reducing die size, along with increasing clock speed. None of the points either of us mentioned are mutually exclusive.

So, I'll assume you agree with me and just had a funny way of saying it

No, because I think this

I think there is a decent chance that it will be an 'upgraded' G70.

isn't true. It will just have the proper support for Cell and for accessing the XDR RAM.

At 110nm the 6800 Ultra at 400MHz requires two power connectors, consumes a lot of power, and produces a lot of heat. The G70 on 110nm will require more power and produce more heat. The G70 on 90nm should reduce this, but to what extent?
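
For a feel of 'to what extent': dynamic power scales roughly with switched capacitance times voltage squared times frequency. The capacitance factor and voltages below are purely illustrative assumptions, nothing published for the G70 or RSX:

# Dynamic power is roughly proportional to C * V^2 * f.
def relative_dynamic_power(cap_scale, volt_scale, freq_scale):
    return cap_scale * volt_scale ** 2 * freq_scale

scale = relative_dynamic_power(cap_scale=0.67,        # ideal 110nm -> 90nm area shrink
                               volt_scale=1.3 / 1.4,  # hypothetical core-voltage drop
                               freq_scale=550 / 430)  # rumored clock bump
print(f"Relative dynamic power: ~{scale:.2f}x the 110nm part")   # ~0.74x

Leakage gets worse at 90nm, though, so the real saving would be smaller than the dynamic figure alone suggests.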

With the drop in process we are also increasing the clock speed of the chip.


I think what we are going to see is 90nm used for getting the GPU as cool as possible, getting it to use as little power as possible, and getting its yields as high as possible.

I don't see them upgrading it at all. Adding more pipelines or other features would increase the power draw and heat output.


This is IMO, of course.
 
jvd said:
Isn't that what I said, essentially?

The reduction in die size usually leads to increased yields (due to more dies per wafer), lower power consumption (usually; power leakage was, hopefully, taken into account in the design of the GPU), and lower heat output is a given. Those are pretty much the core reasons for reducing die size, along with increasing clock speed. None of the points either of us mentioned are mutually exclusive.

So, I'll assume you agree with me and just had a funny way of saying it

No, because I think this

I think there is a decent chance that it will be an 'upgraded' G70.

isn't true. It will just have the proper support for Cell and for accessing the XDR RAM.

At 110nm the 6800 Ultra at 400MHz requires two power connectors, consumes a lot of power, and produces a lot of heat. The G70 on 110nm will require more power and produce more heat. The G70 on 90nm should reduce this, but to what extent?

With the drop in process we are also increasing the clock speed of the chip.


I think what we are going to see is 90nm used for getting the GPU as cool as possible, getting it to use as little power as possible, and getting its yields as high as possible.

I don't see them upgrading it at all. Adding more pipelines or other features would increase the power draw and heat output.


This is IMO, of course.

I see what you mean now.

I agree with you... When I said that the RSX would be an 'upgraded' G70 (I put upgraded in quotes for a reason), I meant it would probably be the same as the G70 but at a higher clock speed. The quickest way to find this out will be to see whether the G70 has the same transistor count as the RSX (it should give us a hint as to whether the G70 and RSX are similar; if the G70 is at something like 350M, then we know the RSX won't have everything the G70 does, etc. -- of course this still doesn't tell us everything, it only really confirms their similarity).

Also, for clarification's sake, the 6800 Ultra PCI-E only has one power connector (75W; the 'normal' power plug doesn't deliver that much juice). The G70 will also only have one PCI-E power connector, and thus the same power envelope (it can't draw more than 150W, and that's even pushing it -- it's not going that high). To be fair, the ATI X800 line has the same power plugs, and the higher-end versions draw pretty similar amounts of power (maybe slightly lower; I don't have any numbers offhand). We'll see though.
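
A quick sanity check on that envelope (75W from the slot and 75W from a single six-pin connector are the standard PCI-E limits):

# Standard PCI-E power budget for a one-connector card.
slot_watts = 75      # the PCI-E slot itself
six_pin_watts = 75   # one auxiliary power connector

print(f"One-connector board limit: {slot_watts + six_pin_watts} W")   # 150 W
# So a single-connector G70 is capped at 150W by the spec, and in
# practice should sit comfortably under that ceiling.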
 
I agree with the whole 'NVidia solution is late to the party' thing, but hasn't NVidia been working on G70 and G80 development concurrently? I just throw that out there because if we're looking at time constraints as our biggest limiting factor, and we're assuming a PC adaptation, I still think there's the possibility that the adapted chip is a G80 precursor in the same way that R500 is an R600 precursor. Well, not the same way in terms of 'ground up,' but you catch my logic.
 
Uh... didn't Ken Kutaragi say the RSX had a completely different architecture? So it isn't a re-clocked G70 or whichever card you think it might be, because it's completely different, supposedly...

so end it! ;P
 
wishcraft said:
Uh... didn't Ken Kutaragi say the RSX had a completely different architecture? So it isn't a re-clocked G70 or whichever card you think it might be, because it's completely different, supposedly...

so end it! ;P

He did... but NVidia said it wasn't.


Also, IIRC, they said their PC cards would be more powerful than the RSX at its launch.
 
SanGreal said:
He did... but NVidia said it wasn't.


Also, IIRC, they said their PC cards would be more powerful than the RSX at its launch.

Well, they said the PC would have more powerful options available shortly after the PS3's launch, which still gives us the possibility of the G80 as the upper-range determiner.

I agree, though, that Kutaragi's statements aren't anything to go by when you have NVidia saying something contradictory.
 
I agree, though, that Kutaragi's statements aren't anything to go by when you have NVidia saying something contradictory.

It's abundantly clear he's being creative with the description.
 
nAo said:
What are the chances the RSX incorporates technology from G70's successor, much like the NV2A featured in the Xbox?
I doubt it... RSX is a late-to-the-party project.

How is it you are able to get away with saying that and no one jumps on you :?:


;)
 
I think it's the G70 with FlexIO and the higher clock speed.
IMO, if MS had gone out with their GPU at 450MHz, I think it's quite possible the RSX would have been 500MHz instead on the slides from Sony.
The PS3 is quite far from launch, and I think the final speeds of the RSX and Cell are naturally not set in stone.

We are going to know more when we see what CineFX-4 and Intellisample-4 bring to the table.
 
overclocked said:
We are going to know more when we see what CineFX-4 and Intellisample-4 bring to the table.
Every new iteration of Intellisample tech has brought very small improvements, so I'm not expecting too much there.
 