NVIDIA GF100 & Friends speculation

Nvidia's biggest concern must be: how on earth are they going to sell these as HPC cards? They are clearly huge power hogs, and probably too hot for a full rack, no?

What do you think? My bet is that there will be no Tesla card until B1.

Hence why only 448-CC parts will be used. If you look at the power/noise characteristics of the GTX 470, it's downright similar to the GTX 285/275. So I don't think it would be too bad, but the GTX 480, on the other hand, is where things start to get a little extreme for anyone's liking.
 
I'm disappointed. The cards aren't much faster, use way more power and cost more. I was hoping for some price changes on the AMD side (mostly because I try not to spend more than $200 on a card, and everything under that price point isn't worth upgrading to from a 4850 1 GB right now), since Nvidia only launched higher-end parts. But it looks like AMD doesn't have to do anything at this point to combat these cards. The most I think I can hope for is the 5850 going down to its MSRP. So it looks like I have to wait till the fall before I decide to buy something.
 
I really can't fathom how these cards can be called a success. Not with all that heat, power and noise at load.

Most consumers don't really shop based on efficiency, with the exception of mobile products where it affects battery life. If they did, there wouldn't be so many gas-guzzlers on the road or so many incandescents and halogens in people's homes. The customer mostly cares about performance and cost.
 
I skimmed through the reviews and found that only [H] had a disappointed take on the matter. The rest were saying that the 480 is kinda good.
Bit-tech's report was kinda negative on it too... they even called it a potential flop.

Most reviews were pretty muted from what I've seen. Saying "kinda good" is a nice way of saying "meh..."
 
Most consumers don't really shop based on efficiency, with the exception of mobile products where it affects battery life. If they did, there wouldn't be so many gas-guzzlers on the road or so many incandescents and halogens in people's homes. The customer mostly cares about performance and cost.
The thing is that the cost adds up. If you're a big torrenter and keep your computer on 24/7, 25 W higher idle power works out to about $20 per year. 120 W higher peak power could mean a new power supply, so that's another $50.
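For anyone who wants to check that figure, here's a quick back-of-the-envelope; the ~$0.09/kWh electricity rate is my own assumption, not a number from this thread:

[code]
// Rough cost of 25 W of extra idle draw running 24/7.
// The electricity price is an assumed value.
#include <cstdio>

int main()
{
    const double extra_idle_watts = 25.0;          // higher idle draw
    const double hours_per_year   = 24.0 * 365.0;  // machine on 24/7
    const double price_per_kwh    = 0.09;          // assumed rate, USD

    const double kwh  = extra_idle_watts * hours_per_year / 1000.0; // ~219 kWh
    const double cost = kwh * price_per_kwh;                        // ~$20

    std::printf("%.0f kWh/year -> about $%.0f per year\n", kwh, cost);
    return 0;
}
[/code]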

I agree with you that higher clocks and 512 SPs would have lessened the perf/mm², and consumers don't care about that when purchasing anyway, but as it is ATI feels no pressure to reduce prices. The 5870 is still above its launch MSRP after six months.
 
Most consumers don't really shop based on efficiency, with the exception of mobile products where it affects battery life. If they did, there wouldn't be so many gas-guzzlers on the road or so many incandescents and halogens in people's homes. The customer mostly cares about performance and cost.

I dunno. Most people who buy these high-end cards are gaming enthusiasts. They usually aren't too impressed with cards that overheat and make too much noise. I know I'm not.
 
I've said it in the other thread: the GF100 launch is the second-weakest Nvidia launch in their history.

The only good thing they have going for them is that there is a lot of room to improve. Sadly, that only bodes well for the next generation; with this one, consumers have to "beta test" it for Nvidia. :p
 
Gaming is all well and good, but does anyone know if any reviewers have compute benchmarks (OpenCL/CUDA) to look at? All I can find is gaming atm.
 
Gaming is all well and good, but does anyone know if any reviewers have compute benchmarks (OpenCL/CUDA) to look at? All I can find is gaming atm.
Where are the consumer apps to bench these things? All I know of is a DX compute shader (DirectCompute) implementation of video transcoding in Win7. No benches yet. :???:
 
PowerDirector is one of those - it's halfway to being usable (unlike pure transcoders like you-know-which) and relatively widespread.

We didn't do bar charts, but we benchmarked it nevertheless (with Cat 10.3a & GF 197.17 drivers - fresh as always):
http://www.pcgameshardware.de/aid,7...as-GF100-Generation/Grafikkarte/Test/?page=18

The gist:
For a 1920x1080 (NTSC) target conversion, the GTX 480 basically halves the time a GTX 285 needs. The HD 5870 sits roughly in the middle between the two.
 
And why is it that ATI owns in DiRT 2 DX11 on [H] but gets slammed on almost every other site? DX9 path?
It is the application/game that decides which DX level to use, right? If so, in what way can the driver influence which DX rendering path is preferred? If the reviews are comparing a DX9 path against the DX11 path, is it then not simply a case of the reviewers being clueless?
 
It is the application/game that decides which DX level to use, right? If so, in what way can the driver influence which DX rendering path is preferred? If the reviews are comparing a DX9 path against the DX11 path, is it then not simply a case of the reviewers being clueless?

a) [H] managed to avoid that trap.

b) Drivers have been known to detect apps based on their names and cheat occasionally, but silently switching from DX11 to DX9 would be a massive hack.

c) Apps usually query the driver and choose which codepath to use. It is possible to build a driver that detects DiRT 2 and "suggests" that DX11 is unavailable, pushing the game onto its DX9 path. What exactly is going on here is another matter.
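To illustrate point c), here's a minimal sketch using the public D3D11 API (nothing DiRT 2 specific, and the fallback messages are just illustrative): the game asks the runtime/driver for the best feature level it can get and picks a renderer based on what comes back, so a driver that "suggested" a lower level for a detected executable would quietly push the app onto its DX9 path.

[code]
// Minimal sketch of feature-level detection; link against d3d11.lib.
#include <windows.h>
#include <d3d11.h>
#include <cstdio>

int main()
{
    // Feature levels we would like, best first.
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0,   // full DX11: tessellation, CS 5.0
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_3,    // DX9-class hardware
    };

    ID3D11Device*     device = NULL;
    D3D_FEATURE_LEVEL got    = D3D_FEATURE_LEVEL_9_1;

    HRESULT hr = D3D11CreateDevice(
        NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0,
        wanted, (UINT)(sizeof(wanted) / sizeof(wanted[0])),
        D3D11_SDK_VERSION, &device, &got, NULL);

    if (FAILED(hr)) {
        std::printf("No D3D11 device at all -> legacy DX9 renderer\n");
        return 0;
    }

    // The game chooses its codepath from what the driver reported.
    if (got >= D3D_FEATURE_LEVEL_11_0)
        std::printf("Driver reports FL 11_0 -> DX11 path\n");
    else
        std::printf("Driver reports a lower level -> DX9/DX10 fallback path\n");

    device->Release();
    return 0;
}
[/code]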
 
There's a switch in DiRT 2's config.xml (it has some complicated name, really) which allows you to force DX9 mode. Everyone should have known that by now.
 
I just clicked on the wattage/temp/sound test on HardOCP. http://www.hardocp.com/article/2010/03/26/nvidia_fermi_gtx_470_480_sli_review/7

If someone had posted that video as a "leak" before the launch, they would have been flamed to death with "NO WAY!", "you'll be proven wrong!" and so on. Frankly, why anyone would want to subject themselves to this is beyond me. It's hitting those temps and noise levels outside a case. In a case, where most of them will be, things will be worse. Don't even think about SLI unless you're properly deaf.
 
The thing is that the cost adds up. If you're a big torrenter and keep your computer on 24/7, 25 W higher idle power works out to about $20 per year. 120 W higher peak power could mean a new power supply, so that's another $50.

I agree with you that higher clocks and 512 SPs would have lessened the perf/mm², and consumers don't care about that when purchasing anyway, but as it is ATI feels no pressure to reduce prices. The 5870 is still above its launch MSRP after six months.

Aye, as I posted elsewhere, this is one of those truly rare instances where early adopters got the best value. Anyone who got a 5870 for 379 USD at launch should be laughing it up right now. I know I am. :)

It's also rather amazing that the GTX 480 is actually louder than the HD 2900 XT, which wasn't exactly quiet at load.

Regards,
SB
 
Warning to dual-monitor users looking at a GTX 480

http://www.legitreviews.com/article/1258/15/


"We are currently keeping memory clock high to avoid some screen flicker when changing power states, so for now we are running higher idle power in dual-screen setups. Not sure when/if this will be changed. Also note we're trading off temps for acoustic quality at idle. We could ratchet down the temp, but need to turn up the fan to do so. Our fan control is set to not start increasing fan until we're up near the 80's, so the higher temp is actually by design to keep the acoustics lower." - NVIDIA PR

Yes, I saw that too. Ouch!! They're basically saying there has to be some sort of trade-off. Well, I say thumbs DOWN to the new Fermis. ;-(

I'm surprised NV PR didn't spin it to make you believe you also got a free griddle to go with it. (see sig):p
 