D9P/G94: 9600 GT on the 19th of February

Born to kill the HD3850? This card was intended to go up against RV670 in Q1 2008... There was supposed to be only a one-month gap between releases (RV670 in January, D9P on the 14th of February). But then something unexpected happened... the A11 silicon of RV670 came back faultless, so it got released much sooner than anticipated, and the more expensive G92 had to be pushed down to compete against RV670.

So by the time the GeForce 9600 GT comes out, AMD will have had a 3-4 month lead in the same segment.

I know what happened: with the 3850, ATI now has a one-quarter lead in the $170 segment, which is not bad at all, but then what? One good quarter per year in a segment is not enough for anything.

The cheapest HD3850 is already available for €140 over here, so NV will need to match the price to remain competitive.

NV can easily make money on anything, so it won't be hard for them even if they sell the card at a lower margin.
 
My main focus is on D9E being late, for what it's worth
Sure, D9E certainly looks late in terms of tape-out date, at the very least. FP64 support is a good indication of that. Presumably G92 is late too, although whether that's because they expected an A11 to be sellable or due to a tape-out delay is anyone's guess; presumably the former though, since the latter isn't something you should base your plans on.

Oh, and the November 2007 1-billion-transistor GPU (the one PC Inlife posted before that description on the graph got blacked out) can't be G92-GX2; that would be ~1.5 billion transistors across two chips.
I wouldn't take that graph too seriously; if D9E turns out to be ~1B transistors, that could just be a coincidence and that graph might even have been made without any insider info.

Nope, chipsets, IGPs and laptop parts are the exception that proves the rule
Well, just to make sure: we're talking northbridges here. NVIDIA's single-chip southbridge+northbridge always lags behind GPUs process-wise, and there are good technical reasons for that. And laptop parts are the same chips as the desktop ones, so I'm not sure I get your point there either. Either way, yes, small chips tend to be the first ones to a new process node.

which means that the budget process nodes, which are normally the first to leave prototyping, are the ideal match.
I see no evidence that the variety of process nodes used for those is any different from that of GPUs. There aren't a trillion official process variants at TSMC, you know... And I very much doubt that northbridges are manufactured on a low-power process made with handhelds in mind!
 
Does anyone think the 9600 GT will also draw its power directly from the PCI-E slot, like the 8600 GT does? If not, I'll have to get a new PSU.
 
zshadow, you could use a simple PCI-Express power adapter.
 
Does anyone think the 9600 GT will also draw its power directly from the PCI-E slot, like the 8600 GT does? If not, I'll have to get a new PSU.

Considering the 8800 GT's 105W TDP, it's possible this part could get away with just the 75W from the motherboard via the PCI-E v1 spec. If not, he's right... there are always adapters... or PCI-E v2 motherboards.
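For anyone who wants to sanity-check that, here's a rough sketch of the power-budget math. The 75W slot and 75W-per-6-pin figures are from the PCIe spec as I remember it; the 8800 GT's ~105W TDP is the number quoted above, and the 9600 GT figure below is pure guesswork for illustration:

```python
import math

# Rough PCI Express power-budget check (illustrative numbers only).
# A PCIe x16 slot is specced for up to 75 W; each 6-pin auxiliary
# connector is good for up to another 75 W.
SLOT_POWER_W = 75
SIX_PIN_POWER_W = 75

def six_pins_needed(tdp_w: float) -> int:
    """Smallest number of 6-pin connectors needed on top of slot power."""
    extra = max(0.0, tdp_w - SLOT_POWER_W)
    return math.ceil(extra / SIX_PIN_POWER_W)

# 8800 GT TDP as quoted above; the 9600 GT TDP is a guess, not a spec.
for name, tdp_w in [("8800 GT", 105), ("9600 GT (guess)", 70)]:
    pins = six_pins_needed(tdp_w)
    if pins == 0:
        print(f"{name}: slot power alone should be enough")
    else:
        print(f"{name}: needs {pins}x 6-pin on top of the slot")
```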
___________________________________________

So, going on the current hypothetical model:

9600GT: 4 clusters, 64 shaders, 32TA/32TF, 16 ROPs, 256-bit @ 2GHz effective (64 GB/s)


Would the lineup look something like this?

(D9E)9800GTX: 8 clusters, 192 shaders, 64/64, 32 ROPs, 512-bit
(D9E)9800GTS: 6 clusters, 160 shaders, 48/48, 24 ROPs, 384-bit
(D8P)8800GTS: 8 clusters, 128 shaders, 64/64, 16 ROPs, 256-bit
(D8P)8800GT: 8 clusters, 112 shaders, 56/56, 16 ROPs, 256-bit
(D8P)8800GS: 6 clusters, 96 shaders, 48/48, 12 ROPs, 192-bit?
(D9P)9600GTS: 4 clusters, 64 shaders, 32/32, 16 ROPs, 256-bit?
- with lower-clocked or cut-down derivatives?
(D9M)????: 2 clusters, 32 shaders, 16/16, 8 ROPs, 256-bit?
- with lower-clocked or cut-down derivatives?
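For what it's worth, the bandwidth figure above is just bus width times effective memory clock; a quick sketch of that arithmetic (the clocks below are placeholders I picked for illustration, not confirmed specs):

```python
def memory_bandwidth_gb_s(bus_width_bits: int, effective_clock_ghz: float) -> float:
    """Peak memory bandwidth = bus width in bytes x effective data rate."""
    return (bus_width_bits / 8) * effective_clock_ghz

# Placeholder configs matching the hypothetical lineup above;
# effective memory clocks are guesses, not confirmed specs.
configs = {
    "9600 GT (rumoured)": (256, 2.0),  # 256-bit @ 2 GHz effective -> 64 GB/s
    "8800 GT":            (256, 1.8),  # 256-bit @ 1.8 GHz effective -> 57.6 GB/s
}

for name, (bus_bits, clk_ghz) in configs.items():
    print(f"{name}: {memory_bandwidth_gb_s(bus_bits, clk_ghz):.1f} GB/s")
```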

That would be an interesting lineup. D9M could be a "fixed 8600GTS" with higher bandwidth on a smaller process (55/65nm), allowing for a decent mid-range card out of the bottom end, and perhaps a cheap 128-bit, 16-shader part out of the same chip... a la the 8400 GS.
 
I hope it's not true :( Expreview said D9E is a GF7950GX2-like card :(
Well, it seems that after a poor 2007 in the GPU market we will see no new technologies or architectures from NVIDIA even in 2008 :(

http://en.expreview.com/?p=133

What happened to NVIDIA, that 14 months after launching G80 we still don't have a new single high-end GPU (like they did with the GF6800 and GF7800)??
 
If D9E is based on the GF7950GX2 design, then we are looking at similar clock speeds for D9E, ~500MHz :( and by calculation that would be about a 20% improvement over the G80 Ultra.
 
If D9E is based on the GF7950GX2 design, then we are looking at similar clock speeds for D9E, ~500MHz :( and by calculation that would be about a 20% improvement over the G80 Ultra.

But how do you know what the specs of a single D9E are??

Do you think it's 2x D9P (GF9600)?? If it is, then I doubt it will beat 2x HD3870, but if it's 2x G92 (GF8800GT), then I ask: is there any sense in naming it GF9800??

If all of this is true, something bad is going on with NVIDIA :(
 
Remember the rumour of 192 (2x96?) SPs @ 2.5GHz. ;)

I can hardly believe that a 2008 performance GPU, and the base of a GX2-type card, as the successor to 128 SPs @ 1.5GHz, will have only 64 (MADD+MUL) SPs.
 
What happened to NVIDIA, that 14 months after launching G80 we still don't have a new single high-end GPU (like they did with the GF6800 and GF7800)??

Sorry, but I for one don't think Nvidia is resting on their laurels. They haven't needed to push out anything faster than the 8800 GTX/Ultra because ATI had no competition for it.

I think they went back to the drawing board and are bringing out something special in March (or whenever it gets released).

Think of this like the NV40/G7x era... where ATI was behind for a time and had no response to Nvidia's top product.

Nvidia didn't waste time trying to tweak the tech but rather came out with newer, better tech (G80).

US
 
Because it comes with DX10.1 and with 8800 GT performance... and, as Arun suggested, because they can.

The GT will be obsolete once the new cards hit... you can't kid yourself.

US
 
Was that sarcasm or are you being serious? I fail to see the benefit of such a move, as a 9600 GT that offers the same performance as an 8800 GT yet costs less isn't going to help the bottom line at all.
I was being perfectly serious. Here's a hint: what is the 9600 GT's rumoured die size again? And the bottom line is proportional to volume * gross profit per unit. Emphasis on volume... :)
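To make that volume * gross-profit-per-unit point concrete, here's a toy example; every price, cost and volume in it is made up purely to show the shape of the argument:

```python
# Toy illustration of "bottom line = volume x gross profit per unit".
# All numbers below are invented: the point is only that a smaller,
# cheaper-to-make die sold at a lower price can still earn more in
# total if it ships in much higher volume.

def gross_profit(units: int, asp: float, unit_cost: float) -> float:
    """Total gross profit = volume x (average selling price - cost per unit)."""
    return units * (asp - unit_cost)

# Hypothetical parts: a big die at a high price in modest volume
# vs. a small die sold cheaper but in far greater volume.
big_die   = gross_profit(units=500_000,   asp=299.0, unit_cost=180.0)  # ~$59.5M
small_die = gross_profit(units=3_000_000, asp=169.0, unit_cost=110.0)  # ~$177M

print(f"big die:   ${big_die / 1e6:.1f}M gross profit")
print(f"small die: ${small_die / 1e6:.1f}M gross profit")
```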
 