NVIDIA GT200 Rumours & Speculation Thread

A non-disclosure agreement for NVIDIA partners and for websites doing launch reviews of the cards.

New card prices are anywhere from $450 to $650 USD.
 
What is an NDA?

Cool, thanks. Can't wait until the 17th. I'm wanting to use the Step-Up feature on my 8800 GT without spending more than $50. Are there estimated price drops coming for the current cards, and what will the prices be for the new cards?

Thanks for the quick replies.

NDA stands for Non-Disclosure Agreement. In other words, the people who know about the cards, the press etc., aren't allowed to talk about them before the NDA expires.
 

NDA is something that most people seem to choose to ignore.

Chris
 
Obviously, yes. And it's not the first time the thought has crossed my mind whether one is actually better off signing an NDA or not.
 
Have you guys heard about this?

http://www.hardware-infos.com/news.php?news=2113

A 30-watt idle TDP for the GTX 280.

According to the slides, it's true.

NV is going to promote idle power consumption a lot... with just a small mention of load power consumption. :p But I guess most review sites will see through their PR plan. ;)

[Image: gtx280powersj4.jpg — GTX 280 power consumption slide]
 

Idle power consumption is one area where AMD was sticking it to them hard. If the GTX 280's idle power consumption is actually lower than the G92's, I would say that's a big deal. In terms of load performance per watt, they actually had an advantage, IIRC. Besides, the slide also shows that load power consumption is much higher, so it's not like they're hiding that fact...
 
Holy crap... 147 W instead of 80 W at load!!!!!

It's one huge, hot chip indeed. And I don't care about a 20 W advantage at idle if there's a 70 W disadvantage at load.

Hmm, come to think of it... if for every 9 hours my computer is on I play 3D games for 2 hours and idle for the other 7, it could balance out. But someone who spends $650 on a graphics card would probably like to waste more of his life playing 3D games, naturally.
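To make that break-even point explicit, here is a back-of-the-envelope sketch. The 20 W idle advantage, 70 W load penalty, and the 7 h / 2 h split are the figures from the post above; everything else is just arithmetic, not data from the slide.

```python
# Back-of-the-envelope check of the idle-vs-load trade-off described above.
# Figures (20 W, 70 W, 7 h, 2 h) are taken from the post; nothing here is
# measured data.

IDLE_ADVANTAGE_W = 20   # watts saved at idle vs. the competing card
LOAD_PENALTY_W = 70     # extra watts drawn under 3D load
IDLE_HOURS = 7          # hours spent idle per 9-hour session
GAMING_HOURS = 2        # hours spent gaming per 9-hour session

saved_wh = IDLE_ADVANTAGE_W * IDLE_HOURS   # 140 Wh saved while idling
extra_wh = LOAD_PENALTY_W * GAMING_HOURS   # 140 Wh extra while gaming

print(f"Energy saved at idle:   {saved_wh} Wh")
print(f"Extra energy at load:   {extra_wh} Wh")
print(f"Net per 9-hour session: {extra_wh - saved_wh:+d} Wh")
```

With those numbers the two effects cancel almost exactly (140 Wh saved vs. 140 Wh spent); gaming more than ~2 hours per 9-hour session tips the balance against the hotter chip.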
 
I don't think they'll just shrink it to 40 nm; they'll likely want to change the ALU:TEX ratio further, as I said. One big problem at 40 nm is how to have enough bandwidth without being pad/pin limited. Assuming that's not too much of a problem, I can easily imagine a line-up along these lines (a rough sanity check of the numbers is sketched after this post):
16C/512SP | 384-bit GDDR5 | 2.5TFlops+
8C/256SP | 384-bit GDDR3 | 1.25TFlops+
4C/128SP | 256-bit GDDR3 | 600GFlops+
2C/64SP | 128-bit GDDR3 | 300GFlops+

As for 55 nm, we might see a 10C shrink of GT200, a 1C low-end discrete & IGP SKU, and a 4-TMU/8-SP ultra-low-end IGP SKU.
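For what it's worth, those peak-FLOPS figures follow from the usual counting for this architecture family: SPs × shader clock × 3 FLOPs per SP per clock (MAD + MUL dual issue). Below is a minimal sketch, assuming a ~1.6 GHz shader clock and ballpark per-pin data rates (~3.6 Gbps for GDDR5, ~2.0 Gbps for GDDR3); none of those clocks or rates come from the post, they are illustrative assumptions.

```python
# Rough sanity check of the speculated line-up's peak-FLOPS and bandwidth
# figures. Clock and memory data rates are assumptions, not leaked specs.

SHADER_CLOCK_GHZ = 1.6        # assumed shader clock
FLOPS_PER_SP_PER_CLOCK = 3    # MAD + MUL, the usual G80/GT200-style counting

# (clusters, SPs, bus width in bits, assumed Gbps per pin)
lineup = [
    (16, 512, 384, 3.6),  # GDDR5
    (8,  256, 384, 2.0),  # GDDR3
    (4,  128, 256, 2.0),  # GDDR3
    (2,   64, 128, 2.0),  # GDDR3
]

for clusters, sps, bus_bits, gbps in lineup:
    tflops = sps * SHADER_CLOCK_GHZ * FLOPS_PER_SP_PER_CLOCK / 1000
    bandwidth_gbs = bus_bits / 8 * gbps   # bytes transferred per second
    print(f"{clusters:>2}C/{sps:>3}SP | {bus_bits}-bit | "
          f"{tflops:4.2f} TFLOPs | {bandwidth_gbs:5.1f} GB/s")
```

At a ~1.6 GHz shader clock this reproduces the quoted tiers (512 SP ≈ 2.46 TFLOPs, 256 SP ≈ 1.23 TFLOPs, and so on), which is presumably where the "2.5 TFlops+" style numbers come from.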
 
Does "idle" mean only in a Hybrid Power mobo?, where the IGP is providing the display and GT200 is literally "turned off".

Jawed
 
I was thinking... the separate NVIO chip might help with the "idle" and "Blu-ray" cases: just power the main chip down and use only the NVIO.
 
Does "idle" mean only in a Hybrid Power mobo?, where the IGP is providing the display and GT200 is literally "turned off".

Jawed

Hybrid Power = zero watts, because the GPU is completely turned off. Since it's basically been posted already, I will confirm that the GTX 200 cards consume less power at idle than an 8800 GTX, 9800 GTX and 9800 GX2, by a significant margin.
 