Next NV High-end

Hard to imagine Dave is referring to anything but R580. There are ongoing rumblings that TSMC is still struggling to deliver a 6-quad R520, so it has to be another ASIC, and I don't think Wavey's tentacles are rooted deeply enough in NVDA for him to be confident enough for a "heh". :LOL:
 
Skrying said:
I didn't say size, I said transistor budget. Two different things, really. Complexity would go up another level with 8 quads, and therefore make it harder on Nvidia. I don't see a need for it for a good while, if ever, though.

Yes, you said transistor budget, and I should have been clearer with my reply.
I still think that getting hung up on transistor budgets for these extreme-performance SKUs has failed in the speculation on forums like this almost every generation, before the facts have come out.

As for the part I speculated about, they could sell the ones that "fail" as a GTX/GT or something.
I think nVidia will probably be more aggressive now and release a "refresh" about 6 months from when they launched the GTX, or a little longer if they haven't gotten all of the mid- and low-end SKUs out.

As you can get cards from almost every vendor at higher speeds, many close to 500MHz core and 1300MHz memory, I don't think the same core but faster is anything they would consider. But an 8-quad/10 VS/16 ROP part at, say, 500MHz core and 1400-1600MHz memory would make a great refresh part, IMO.
 
Skrying said:
I didn't say size, I said transistor budget. Two different things, really. Complexity would go up another level with 8 quads, and therefore make it harder on Nvidia. I don't see a need for it for a good while, if ever, though.
I'm not sure why that would be such a challenge as far as complexity is concerned. I mean, the quads themselves are already designed. Very few things would have to change in the chip design to make the move from 24 pipelines to 32.

The primary difficulty would be in rerouting the chip so that everything fits with the new dimensions. But since one would have to do that to some extent anyway with a move from 110nm to 90nm, I don't see the great difficulty.

The only potential reason I could see for nVidia not to go for 32 pipelines with the G75 would be that they deem it too costly to manufacture (for example, due to yield issues, it may not be feasible right now to build a chip on 90nm quite as large as one can realistically build on 110nm).
 
geo said:
Well, it's really the wrong thread for it, but just for grins and giggles, oh technacious one, I'll give you one "fact" (not so much, but play along, 'kay?), and you give me another:

"Fact" 1: R520 is 295m transistors.

"Fact" 2: R580 is ???m transistors.

Fact 1 is incorrect.
 
By more or less than 5% absolute?

Edit: And don't forget the quote marks. I'm playing our Great Game here, not setting up as Oracle. :LOL:
 
Chalnoth said:
I'm not sure why that would be such a challenge as far as complexity is concerned. I mean, the quads themselves are already designed. Very few things would have to change in the chip design to make the move from 24 pipelines to 32.

The primary difficulty would be in rerouting the chip so that everything fits with the new dimensions. But since one would have to do that to some extent anyway with a move from 110nm to 90nm, I don't see the great difficulty.

The only potential reason I could see for nVidia not to go for 32 pipelines with the G75 would be that they deem it too costly to manufacture (for example, due to yield issues, it may not be feasible right now to build a chip on 90nm quite as large as one can realistically build on 110nm).

What if the G70 chips are "failed" G75 (32 pipe) chips? Perhaps they release the G75 when they build up enough chips with 32 working pipes. Food for thought.
 
ondaedg said:
What if the G70 chips are "failed" G75 (32 pipe) chips? Perhaps they release the G75 when they build up enough chips with 32 working pipes. Food for thought.
Well, we heard that as a possibility some time ago, but I don't buy it. I don't think that ~300 million transistors is enough for a 32-pipe chip. What's more, given the incredible overclocking potential that some are getting (I'm seeing people clocking G70s up to 600-700MHz on the Futuremark ORB), on the surface it would seem that G70 yields are pretty damned good right now.
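The transistor-budget intuition here can be sketched with rough arithmetic. The NV40 and G70 counts below are the commonly cited figures and should be treated as illustrative assumptions, not numbers confirmed in this thread:

```python
# Back-of-envelope sketch of the "300M isn't enough for 32 pipes" argument.
# NV40 ~222M and G70 ~302M are commonly cited counts (assumptions here).

NV40_TRANSISTORS = 222_000_000  # 16 pipelines (4 quads)
G70_TRANSISTORS = 302_000_000   # 24 pipelines (6 quads)

# Marginal cost of the two extra quads G70 added over NV40
# (overstated, since some of those 80M went to pipeline modifications):
per_quad = (G70_TRANSISTORS - NV40_TRANSISTORS) / 2

# Naive extrapolation to an 8-quad (32-pipeline) part:
est_32_pipe = G70_TRANSISTORS + 2 * per_quad

print(f"~{per_quad / 1e6:.0f}M transistors per extra quad")
print(f"~{est_32_pipe / 1e6:.0f}M for a 32-pipe chip")
```

Even this crude extrapolation lands well above 300M, which is the point being made: a 32-pipe chip would not fit in G70's budget.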
 
overclocked said:
You will soon find out geo, patience my friend...

Well, see, I'm trying to keep my game going. So if it's 5% or less I just change "Guess 1" to 280-310m, and ask Technacious (I like that for your new Title, Jawed :LOL: ) to give me "Guess 2", same % spread. All in good, clean B3D fun.

Edit: Edited for the quote mark impaired.
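The "5% or less" band being played with works out to simple arithmetic around the claimed 295m figure:

```python
# The 5% band around "Fact" 1, as stated upthread.
claimed = 295  # millions of transistors
low, high = claimed * 0.95, claimed * 1.05
print(f"{low:.0f}m - {high:.0f}m")  # the 280-310m spread of "Guess" 1
```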
 
Frankly, I was expecting far fewer transistors for G70 myself, but then again I wasn't counting on any internal modifications to the pipeline either. From what I've been gathering lately, the 80M added over NV40 for 2 more quads plus the modifications is justified.
 
overclocked said:
Care to explain? A 90nm G70 with 8 quads/10 VS/16 ROPs should be a fair amount smaller than the current incarnation on 110nm.

It costs lots of money that is better invested in the next gen; that's why no one will do it unless they REALLY are forced to.
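The shrink claim above can be illustrated with a naive area-scaling sketch. Both the 110nm die size used and the ideal area scaling (area proportional to feature size squared) are assumptions for illustration; real processes rarely shrink perfectly:

```python
# Naive die-area sketch for the 110nm -> 90nm argument.
# The 110nm die size is an often-quoted figure, used here as an assumption.

G70_DIE_MM2 = 333.0        # 6-quad G70 on 110nm (assumed)
shrink = (90 / 110) ** 2   # ideal area scaling factor, ~0.67

# Same 6-quad design shrunk to 90nm:
shrunk_6_quad = G70_DIE_MM2 * shrink

# Crude upper bound for 8 quads: scale the whole die by 8/6
# (ignores fixed-cost blocks that don't grow with quad count):
shrunk_8_quad = shrunk_6_quad * (8 / 6)

print(f"6 quads at 90nm: ~{shrunk_6_quad:.0f} mm^2")
print(f"8 quads at 90nm: ~{shrunk_8_quad:.0f} mm^2")
```

Under these assumptions, even the 8-quad version comes out smaller than the current 110nm die, which is the point overclocked is making; the counterargument in the reply is about cost, not area.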
 
ondaedg said:
What if the G70 chips are "failed" G75 (32 pipe) chips? Perhaps they release the G75 when they build up enough chips with 32 working pipes. Food for thought.

You'd have to starve if that were your primary food...
 
Isn't the R520 320 million transistors? I thought I read that somewhere. Me, I think it's gonna be more like 270 million.

Also, the R520 had leaks and so has taken a long time to sort out. The R580 has no leaks, so ATI are stocking them up atm. Last I read, ATI's CEO said the R580 was coming along nicely.

US
 
geo said:
Well, see, I'm trying to keep my game going. So if it's 5% or less I just change "Fact 1" to 280-310m, and ask Technacious (I like that for your new Title, Jawed :LOL: ) to give me "Fact 2", same % spread. All in good, clean B3D fun.

Yes, but on the downside, it's this kind of thing that makes insomnia and bipolar disorder common among tech nerds like us... :)
 
Evildeus said:
Fact 1 is a fact? Since when?

I guess the use of quote marks isn't as obvious a tip as I was hoping. Edited upstream to use Jawed's formulation instead.
 