Nvidia GT300 core: Speculation

Status
Not open for further replies.

Shtal

Veteran
Hardspell has released a list of possible specifications for the GeForce GTX 350 graphics processor (GPU):

* NVIDIA GeForce GTX 350
* GT300 core
* 55nm technology
* 576 sq.mm die area
* 512bit GDDR5 memory controller
* 2GB GDDR5 memory, double that of the GTX 280
* 480 stream processors
* 64 raster operation units (ROPs), the same as the GTX 280
* 216 GB/s memory bandwidth
* Default clock speeds: core 830 MHz, shader 2075 MHz, memory 3360 MHz (effective)
* Pixel fill-rate: 36.3 Gpixels/s
* Texture fill-rate: 84.4 Gtexels/s
* DirectX 10, no DX 10.1 support yet.
http://futuremark.yougamers.com/forum/showthread.php?t=85567
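As a quick sanity check, the quoted memory bandwidth does follow from the rumored bus width and effective memory clock (a minimal sketch using only the figures listed above):

```python
# Check that the rumored GTX 350 memory bandwidth is self-consistent:
# bandwidth (GB/s) = (bus width in bits / 8) * effective memory data rate (GT/s)
bus_width_bits = 512            # rumored 512-bit GDDR5 controller
effective_mem_clock_gts = 3.36  # 3360 MHz effective

bandwidth_gb_s = (bus_width_bits / 8) * effective_mem_clock_gts
print(round(bandwidth_gb_s, 2))  # 215.04 -- matches the quoted ~216 GB/s after rounding
```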
 
That'd be quite the trick if they could pull it off: double the units and a 50% faster shader clock in the same area with an optical shrink. I really don't know why they'd bother with 2GB of RAM.
 
Without simply killing off an entire chip, I doubt we'll see the transition to a smaller chip happen right away. It's likely still too soon for them to have decided that massive monolithic chips were a bad idea.
 
What should they have learned and when?

What should they have learned?

Don't produce >400mm2 chips unless you can sell them for >$1K.

Intel can get away with it because Xeon MP and IPF chips sell for $600 minimum, whereas an entire GT200 card was $600 and included DRAM, heatsink, board, etc.

DK
 
What should they have learned?

Don't produce >400mm2 chips unless you can sell them for >$1K.

Intel can get away with it because Xeon MP and IPF chips sell for $600 minimum, whereas an entire GT200 card was $600 and included DRAM, heatsink, board, etc.

DK

What is it with all the armchair experts here? Nvidia has been operating a highly profitable business for years and years. The low volume high-end segment halo effect is just one aspect of this.

Are you guys saying they should be looking to AMD for advice on how to run a business, just because AMD managed to pull off one promising video card generation?
 
What is it with all the armchair experts here? Nvidia has been operating a highly profitable business for years and years. The low volume high-end segment halo effect is just one aspect of this.

Are you guys saying they should be looking to AMD for advice on how to run a business, just because AMD managed to pull off one promising video card generation?

No, but when a company lowers the price of its latest high-end graphics card from $649 to $449 within three weeks of launch, and partners are already offering cash-backs of up to $120 to early adopters... then something didn't go as planned and there is a lesson to be learned. You're probably smart enough to figure out which lesson that is.
 
I really don't get the point of all this. The only lesson to be learned from it is that GT200's architecture, relative to RV770's, sucks. And that it pays to have a better architecture and not to underestimate the other guy.

The rest has very little to do with it, TBH (unless you think GT200 could be clocked at 800MHz if only intra-chip variability wasn't such a problem, which seems a tad extreme to me; and that's not the main problem anyway).
 
No, but when a company lowers the price of its latest high-end graphics card from $649 to $449 within three weeks of launch, and partners are already offering cash-backs of up to $120 to early adopters... then something didn't go as planned and there is a lesson to be learned. You're probably smart enough to figure out which lesson that is.

Cancel all current chip designs immediately and scramble to go the way of the Rage Fury Maxx and the Voodoo 5?
 
What is it with all the armchair experts here? Nvidia has been operating a highly profitable business for years and years. The low volume high-end segment halo effect is just one aspect of this.

NV40 (GF6800) was big, fast, expensive, and the technological leader. NV43 was small and very fast, and NV40 cast quite a positive light on sales of that product.

G70 (GF7800) was big, expensive, and still the technological leader. G73 was very small, not as fast as users expected, but good and very popular. Margins must have been very good.

G71 (GF7900) was very small and expensive, but not the technological leader. nVidia's margins must have been extreme.

G80 (GF8800) was very big, expensive, and the technological leader. G84 was quite big and not very fast, but it was the only mainstream part supporting DX10, and many users bought it just because of G80's success.

G200 (GTX200) is huge, inexpensive (for customers), and not the technological leader. Margins must be low. G200 simply can't help with mainstream sales, because there is no profitable ~150mm2 mainstream part, only the 290mm2 G92 (and G92b is not widely available). So the whole strategy of promoting mainstream parts through high-end products doesn't work in this case.
 
Is it just me, or are the fill rates a little weird with the 830MHz core clock :???:

To produce those fill rates (or close to them anyway) would require 44 ROPs and 102 texture units at those speeds.

I find that highly unlikely. I guess there could be other clock domains, but I think it's more likely that these specs are simply fake.
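The unit counts cited above can be back-calculated from the rumored fill rates and core clock (a minimal sketch of that arithmetic, assuming one pixel per ROP and one texel per texture unit per clock):

```python
# Back-calculate the unit counts implied by the rumored fill rates:
# units = fill rate (operations/s) / core clock (cycles/s)
core_clock_hz = 830e6        # rumored 830 MHz core clock

pixel_fill_rate = 36.3e9     # quoted pixel fill rate (pixels/s)
texture_fill_rate = 84.4e9   # quoted texture fill rate (texels/s)

rops = pixel_fill_rate / core_clock_hz
tmus = texture_fill_rate / core_clock_hz
print(round(rops), round(tmus))  # 44 102 -- the ~44 ROPs and ~102 texture units noted above
```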
 
What should they have learned?

Don't produce >400mm2 chips unless you can sell them for >$1K.

Intel can get away with it because Xeon MP and IPF chips sell for $600 minimum, whereas an entire GT200 card was $600 and included DRAM, heatsink, board, etc.

DK

Yes, but didn't people like PdM (was he really banned from RWT?!?) and others report in the past that those 400+ mm^2 IPF chips cost Intel something like $150 (more or less) to actually make?

(I remember an old RWT discussion about the manufacturing costs of Itanium 2 chips.)
 