Roy Taylor: "Only Brits care about ATI"

Well that clearly contradicts NVidia's heretofore "don't risk high-end on a new process" maxim. Coinciding nicely with 65nm :p
Yeah, it kinda does. Although as you'll see shortly, 325mm² is pretty much low-end for them nowadays! :p Ah, and that said, pre-R300 the largest NV/ATI GPU was ~150mm²! But then again, ASPs for the GF4 only went from $199 to $399... Even if you consider inflation, the industry sure has changed a lot.

Anyhow, as I said G73 was indeed back from the fab so they had the opportunity to test the process and use that to see how to optimize synthesis for the process. So from a manufacturing perspective, this isn't really any different from starting with a low-end chip even though it *might* not have gone to volume production (or perhaps it did). As for G98, expect it in June or, worst-worst-case, July.

One way or another, they both seem to have made an awful lot of rubbish in the chipset arena. SB600 still appearing on new mobos? No thanks.
Heh, I certainly agree with you there, and I'm glad to have gone for an Intel chipset this generation. SB700 is certainly nicer than SB600 at least, not perfect but heh - let's see if SB750 does more than add RAID5...
 
Yeah, it kinda does. Although as you'll see shortly, 325mm² is pretty much low-end for them nowadays! :p
Maybe they're exiting the ultra-low-end discrete part of the market? Handing it over to IGP? And just dumping end of line GPUs there too.

Ah, and that said, pre-R300 the largest NV/ATI GPU was ~150mm²! But then again, ASPs for the GF4 only went from $199 to $399... Even if you consider inflation, the industry sure has changed a lot.
The performance range across a family of GPUs also seems to have magnified somewhat.

Anyhow, as I said G73 was indeed back from the fab so they had the opportunity to test the process and use that to see how to optimize synthesis for the process. So from a manufacturing perspective, this isn't really any different from starting with a low-end chip even though it *might* not have gone to volume production (or perhaps it did).
Well, even if NVidia did execute G78 properly, it appears to have been for no benefit - it doesn't seem possible to disentangle the G73/G98/G92 crunch from 65nm's vandalism. Regardless, NVidia seems to have left itself wide open to increased risk with the timing, apparently being as aggressive as ATI with 65nm but doing a worse job overall.

I can see why you thought NVidia would be simultaneous with ATI in transferring to 55nm, but it hasn't come to pass.

It seems like 40/45 will be a close race.

Jawed
 
Maybe they're exiting the ultra-low-end discrete part of the market? Handing it over to IGP? And just dumping end of line GPUs there too.
No, as I said, they just screwed up. Don't read too much into it; G98 will still be released, just way later than expected (but AFAICT still in time for the Back-to-School cycle to replace the 80nm 8300/8400GS).

Well, even if NVidia did execute G78 properly, it appears to have been for no benefit
Probably, yes, although who knows - for example, for all we know, it might have sold significantly in China.

I can see why you thought NVidia would be simultaneous with ATI in transferring to 55nm, but it hasn't come to pass.
Yes, it certainly hasn't. And neither have my G9x predictions. Oops!

It seems like 40/45 will be a close race.
Remember there's no 45G, only 40G. So there won't be a half-node this time around, and we'll need to wait ~18 months for 32G which is a ~40% shrink. And then presumably ~12 months later we'll have a 28nm half-node...

And yes, I'm very curious about the timeframe for 45G at NV and ATI, because as I said it's a 3x+ improvement in perf/mm² over 65nm (presumably closer to 2x once you consider that you'll likely want to optimize more for power, but still). So it doesn't matter what the wafer costs or the process yields are; whoever is there first has an obvious advantage (unlike 80/65, which IMO offered only a small advantage), as long as his chip yields are good enough (i.e. if his design isn't well tuned to the process characteristics and thus yields badly or has awful variability, that would still be a big problem).
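Purely as my own back-of-the-envelope illustration (not from the thread): if you treat the node name as the linear feature size - an idealization, since real layout rules don't scale perfectly with the marketing name - the shrink and density numbers above fall out of simple area arithmetic:

```python
# Idealized node scaling: area goes with the square of the linear dimension.
def area_scale(old_nm, new_nm):
    """Fraction of the old die area a shrunk design would occupy."""
    return (new_nm / old_nm) ** 2

print(area_scale(40, 32))  # 0.64 -> the ~40% shrink from 40G to 32G
print(area_scale(65, 40))  # ~0.38 -> ~2.6x density going 65nm to 40G
```

With a modest clock bump on top of the ~2.6x density gain, a 3x+ perf/mm² improvement over 65nm is plausible - before you spend some of it back on power optimization.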
 
No, as I said, they just screwed up. Don't read too much into it; G98 will still be released, just way later than expected (but AFAICT still in time for the Back-to-School cycle to replace the 80nm 8300/8400GS).

Probably, yes, although who knows - for example, for all we know, it might have sold significantly in China.
Other markets are definitely the wild card in all this, e.g. G98 might be solely aimed at China with IGP for US/EU. Or the other way round. The whole global thing isn't well understood around here...

Remember there's no 45G, only 40G. So there won't be a half-node this time around, and we'll need to wait ~18 months for 32G which is a ~40% shrink. And then presumably ~12 months later we'll have a 28nm half-node...

And yes, I'm very curious about the timeframe for 45G at NV and ATI, because as I said it's a 3x+ improvement in perf/mm² over 65nm (presumably closer to 2x once you consider that you'll likely want to optimize more for power, but still). So it doesn't matter what the wafer costs or the process yields are; whoever is there first has an obvious advantage (unlike 80/65, which IMO offered only a small advantage), as long as his chip yields are good enough (i.e. if his design isn't well tuned to the process characteristics and thus yields badly or has awful variability, that would still be a big problem).
It seems NVidia is in more need of advanced processes than ATI, since the latter is only planning to build RV chips - NVidia, on the other hand, is practically forced to implement each high-end GPU on a new-ish process (only ~3 months old). Presumably they'll get better at it.

Jawed
 
Isn't the 9300 GS just a renaming of this one?
It's mostly available in China at the moment, but it exists.
Interesting, so PV3 with full VC-1 decode - clearly an important product (EDIT, whoops, was important until HD-DVD died). Also this appears to be the "spec" of what I was calling G88, but it was "recoded as G98" as revealed in the name 8400GS. The original 8400GS was, presumably, G86 with bits turned off.

EDIT: hmm this G98 also appears to have bits turned off:

http://en.expreview.com/2007/12/04/born-for-hd-first-review-of-g98-8400gs/?page=4

:???:

So only about 6 months late, de-prioritised presumably due to the rush to deliver G92 :?:

Jawed
 
For the last time: they screwed up. Stop reading so much into it. It'll still be released worldwide, and G92 has nothing to do with it. Stop trying to find an excuse for NVIDIA, damnit! ;)
 
For the last time: they screwed up. Stop reading so much into it. It'll still be released worldwide, and G92 has nothing to do with it. Stop trying to find an excuse for NVIDIA, damnit! ;)
Well, this is very curious. G98, it turns out, is G86-shrunk (and G96 is G84-shrunk). So was G98 really supposed to replace G86 within only 3 months?

So I'm thinking G86 was meant to have been released before Christmas 2006 and so would have had an ~8 month life before being replaced by G98.

Jawed
 
Since getting a PS3 again recently after HD DVD disappeared, I've taken to asking the PS3 what video format it's playing back off the BR discs I watch. A good proportion are VC-1, so the decode assist there isn't useless now HD DVD is dead.

And a quick Google shows that the MPEG-2/H.264/VC-1 ratio on BR discs isn't highly in favour of any one of them.

Doesn't stop the products being crap of course, just pointing out that the feature still has merit.
 
Since getting a PS3 again recently after HD DVD disappeared, I've taken to asking the PS3 what video format it's playing back off the BR discs I watch. A good proportion are VC-1, so the decode assist there isn't useless now HD DVD is dead.
I did some codec marketshare analysis on Blu-ray discs recently. H.264 is the dominant force, but VC-1 is approaching a 1/3 share (high twenties, IIRC). Basically, adoption of MPEG-2 is obviously trailing off for BRD, where it had a very high initial adoption rate. However, now that they're getting a little more maturity, people are obviously jumping to the newer codecs - but rather than everyone just moving off MPEG-2 to H.264, many are moving to VC-1 (probably because the cost of authoring under VC-1 is closer to MPEG-2).
 
Since getting a PS3 again recently after HD DVD disappeared, I've taken to asking the PS3 what video format it's playing back off the BR discs I watch. A good proportion are VC-1, so the decode assist there isn't useless now HD DVD is dead.

And a quick Google shows that the MPEG-2/H.264/VC-1 ratio on BR discs isn't highly in favour of any one of them.

Doesn't stop the products being crap of course, just pointing out that the feature still has merit.

Excellent point. Although I must admit the best encodes/transfers I've seen yet on Blu-ray have been MPEG-2, much to my surprise.
 