Nvidia GT300 core: Speculation

Maybe it would be more functional if it were placed on a board? Surely Nvidia wouldn't want you to pry a functioning GPU off of a board.

When was the last time they put A1 silicon on a board? (edit: at least that's what the link states; I can only see a black nothing there...)
 
Yeah, apparently people don't think there's enough bad press about Nvidia already. Now they're digging up old shots of GT200 to make bad Star Trek jokes :LOL:
 
I think one interpretation of Nvidia's insistence that DX11 graphics will not be the prime mover of new graphics cards (this year) is that (unlike ATI) they will only have one full DX11 part this year: GT300 at the high end (not counting SP-disabled top-end parts one rung down the performance ladder as a separate part).

They will want to compete against the 57xx with their new G200 respin parts, and those are not fully DX11 compliant (although I bet a little market haze on just how much they 'support' DX11 is to be expected).

Hence they will insist that CUDA / DirectCompute (which will run in 4.x guise on both last-gen parts and the rumored GT21x parts) is more of a motivation to buy video cards. After all, the $200-and-under parts probably account for a lot more of the market than the $250+ parts, so winning the top end of the market alone, while taking a real beating in the performance mid-range, will not be good enough for Nvidia.

I think they may have been blindsided by the 3-way and 6-way output cards from ATI - that's a real market differentiator that they may not have seen coming in time.
 
I don't think Nvidia has anything to worry about from Eyefinity. But I think you're spot on with the G200 respins. Those are going to be DX10.1 parts going up against AMD's DX11 lineup, and from early signs they aren't very fast either (at least the entry-level stuff). So it looks like AMD will be having a very merry Christmas this year.
 
Makes sense. Very possibly what they're attempting to do.
 
Sorry to disappoint your enthusiasm for G(T)300 being DOA. The ASIC shown is most definitely a G80, since the number and placement of the SMD parts on the package match exactly.

Granted, the picture was chosen quite cleverly, since the rumored die size for G(T)300 is about the same as the G80's, and there isn't exactly an abundance of naked G80 shots floating around to compare it with.

auntie edit says:
Here are some pics of a naked G80-chip; a GT200 looks like this
 
They will want to compete against the 57xx with their new G200 respin parts, and those are not fully DX11 compliant (although I bet a little market haze on just how much they 'support' DX11 is to be expected).

Can those G200-based GPUs really stand up against the 57xx?

Hence they will insist that CUDA / DirectCompute (which will run in 4.x guise on both last-gen parts and the rumored GT21x parts) is more of a motivation to buy video cards.

How would CUDA help move hardware? PhysX? Hopefully that's not the only reason. :?:
 