NVIDIA GF100 & Friends speculation

If they have one and release it too early, those that just bought a card won't be happy. :oops:
Those that waited will be smiling, though. :p

Well, it's almost 6 months since the 5870 launched. If they launch a refresh in late April or May, it will be 7-8 months after.
 
If they have one and release it too early, those that just bought a card won't be happy. :oops:
Those that waited will be smiling, though. :p

That's quite a bold statement, I must say. I personally still expect the GTX470 to be quite a competitive part against the 5870, but nothing really groundbreaking. The GTX480 is out of that equation for many reasons, price/performance ratio among them.
 
Despite the hairsplitting above, it's definitely not the number of days NV's PR stated in the past :LOL:
 
Wow, so Charlie's been off by a mere 5 Watts, and Nvidia pulled several engineers just to prove him wrong?

What I don't get, though, is that he says they've been shaving off 5 Watts while upping the TDP by 50 Watts at the same time. Anyone care to enlighten me on this one?

Plus: I wonder why he cares so much about OEMs/ODMs: I mean, GF100 was supposed to ship in low quantities and high prices. How many OEMs apart from Alienware will initially build systems for products like this? And then notebooks: I don't think anyone expected GF100 to show up in notebooks, right?
 
Wow, so Charlie's been off by a mere 5 Watts, and Nvidia pulled several engineers just to prove him wrong?

What I don't get, though, is that he says they've been shaving off 5 Watts while upping the TDP by 50 Watts at the same time. Anyone care to enlighten me on this one?
This one is puzzling me, too.
The only "sensible" explanation I can come up with is that both the GTX480 and 470 would have a TDP of 275W, but that's just not a real possibility, is it?
 
This pic was linked earlier already, but someone on another forum pointed out something in it that I hadn't noticed before. After looking through the naked PCB shots, those boards have it too (or at least I think it's the same thing; it's in a different position on that revision, but it's still 2-pin like it should be): the S/PDIF input for audio. So still no native audio from nV's high end.

One of the box shots from a few days ago shows "7.1 audio" as a feature.

[attached image: 14407_01.jpg]
 
[attached images: FRONT / BACK]
Mod Edit: 3MB PNG (?!) pics ahoy!

Compare the GTX280, GTX285, 8800GTS, and "Fermi" (GTX470) coolers.
The Fermi heatsink may be a little bigger than the 8800GTS heatsink, but the fan is smaller!

The Fermi cooler is maybe the cheapest one in this comparison. :LOL:

And this thing is supposed to run so hot?! :LOL:
 
Kaotik,
I wouldn't think so, but you never know. CPU vendors have been doing class TDPs for years now.

FenderBender,
7.1 audio: board-level feature or chip-level feature?
 
Plus: I wonder why he cares so much about OEMs/ODMs: I mean, GF100 was supposed to ship in low quantities and high prices. How many OEMs apart from Alienware will initially build systems for products like this? And then notebooks: I don't think anyone expected GF100 to show up in notebooks, right?

One possible explanation of that piece is that NV was trying to get the TDP down to 225W but could only manage to reduce it by 5W. I don't know how close that is to the truth.
 
One possible explanation of that piece is that NV was trying to get the TDP down to 225W but could only manage to reduce it by 5W. I don't know how close that is to the truth.
But nevertheless, OEMs tend to utilize large quantities of cheap boards, especially in their line-ups with more crowded boxes.

Neither fits what we've been led to expect from the GTX4xx.
 
This one is puzzling me, too.
The only "sensible" explanation I can come up with is that both the GTX480 and 470 would have a TDP of 275W, but that's just not a real possibility, is it?

The only official figure so far has been 225W for the GTX470, and now this official information puts the GTX480 at 50W more. It still doesn't make much sense, as I'd have imagined everyone knew the GTX480 would draw more power, and 50W more doesn't sound out of the ordinary.
 
Guys, about testing at 2560x1600, I just noticed that this is a trend that happened with all the major card releases: 8800GTX, HD2900, and even HD5870:
HD2900: http://www.anandtech.com/video/showdoc.aspx?i=2988&p=19
8800GTX: http://www.anandtech.com/video/showdoc.aspx?i=2870&p=26
HD5870: http://www.anandtech.com/video/showdoc.aspx?i=3643&p=18

Especially in the cases of 8800GTX vs X1950 and 8800GTX vs HD2900, the size of the video RAM seemed irrelevant in those reviews.

How would that translate to the upcoming situation of GTX4xx vs HD5870?!
 
Guys, about testing at 2560x1600, I just noticed that this is a trend that happened with all the major card releases: 8800GTX, HD2900, and even HD5870
Perhaps because it's hard to find enough interesting games that require that much graphics power?

Hell, even Crysis & Warhead are still common, even though they're quite bad at showing pure GPU performance, especially on AMD hardware, since their driver doesn't quite handle those games' draw calls properly.
 