NVIDIA Fermi: Architecture discussion

Is there a difference between 300 watts and 300 watts? And why shouldn't they just use the single-PCB cooling system of the GTX 295 cards?

Fermi cards are shorter. GTX295 TDP should be 289W. ;)
http://www.gpureview.com/geforce-gtx-295-card-603.html

Was this heard from anyone reputable? I think the most common expectation is that, on a single-chip basis, Fermi's top grade will be faster than a single Cypress.

Is PCWorld (quoting Thermaltake and CoolerMaster) reputable? If yes, then yes, it was heard from someone reputable. ;)
 
A driver problem with their own software? At least don't show it to the world, then. I sincerely miss the point of showing an in-house demo that crashes or runs very slowly in certain situations.
I mean, you are showing a triple-SLI Fermi setup... It should be fast as hell, not lagging like a 400-buck PC...
NVIDIA should know that, given how fond they are of marketing.

You really find it objectionable that a demo of new technology running on unreleased hardware and drivers can crash? I take it you have never experienced a crash in commercial software before? :LOL:

The demo itself is pretty cool. I didn't get a feel for the graphics but constructing the sled out of all physically simulated parts (including the engine!) was a nice showcase for PhysX. It was actually much more of a physics demo than DX11 but that's not surprising I guess since it's easier to show a difference with the former.
 
You really find it objectionable that a demo of new technology running on unreleased hardware and drivers can crash? I take it you have never experienced a crash in commercial software before? :LOL:

The demo itself is pretty cool. I didn't get a feel for the graphics but constructing the sled out of all physically simulated parts (including the engine!) was a nice showcase for PhysX. It was actually much more of a physics demo than DX11 but that's not surprising I guess since it's easier to show a difference with the former.

Yes I have, many times actually.
But I also find it objectionable that a demo made by NVIDIA crashes on NVIDIA hardware during a public event. That's it. ;)
 
Was this heard from anyone reputable? I think the most common expectation is that, on a single-chip basis, Fermi's top grade will be faster than a single Cypress.
The site has some reputation, but they sometimes give credit to false rumors too... as always, what matters is who said it, and that's something we don't know as readers.

Stephane insisted that the power draw figure was not a guess, though, so the board he's seen must have been really hot and noisy.
 
That kills the board in the market. Green IT is something you cannot afford to ignore today. If the performance is anything less than breathtaking, this thing is NV30 all over again.
 
I call BS on the 300 W TDP for a single-chip Fermi-based GeForce. That's what we're expecting for the X2 card.
And if the GTX 380 is really only on par with an HD 5870, then it's definitely an epic fail. Though that contradicts what LegitReviews and that other guy said before, so there are conflicting rumors from entirely different sources. It's wait and see.
 
Both have the same length - 10.5" - and 300 W is only 3.8% higher than 289 W. And we don't know if Fermi will need exactly 300 W.
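Just to spell that percentage out:

\[
\frac{300 - 289}{289} \approx 0.038 \;\approx\; 3.8\%
\]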

Actually we don't know anything. I was just guessing. ;)

I call BS on the 300 W TDP for a single-chip Fermi-based GeForce. That's what we're expecting for the X2 card.
And if the GTX 380 is really only on par with an HD 5870, then it's definitely an epic fail. Though that contradicts what LegitReviews and that other guy said before, so there are conflicting rumors from entirely different sources. It's wait and see.

That is for sure. However, I wouldn't compare an anonymous BSter on a forum with the website of a paper magazine. ;)
 
GF100 will be faster than Cypress... Why worry?

I guess the worry is how it achieves that. Rumors of 300 W for a single-chip Fermi don't really bode well for a Cypress killer. That's a horrible cost for the performance crown.
 
Yeah, but the question is, can it really be just on par with the 5870?

For example, we can make the (rather) safe assumption that the ALUs will be at least as powerful as the GTX 285's. Also, the clocks can't be very low... otherwise no one would ever design water cooling for Fermi ;).

Are there any areas where Fermi should, in theory, be only slightly (under 30%) better than the GTX 285? The only idea that comes to mind is the new memory hierarchy - could they have somehow screwed that up? But maybe that's just because I haven't read much about it.
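A quick back-of-envelope on the raw ALU side, taking the published GTX 285 numbers (240 ALUs at 1476 MHz with MAD+MUL dual issue) and GF100's 512 FMA ALUs as given:

\[
\text{GTX 285: } 240 \times 3 \times 1.476\,\text{GHz} \approx 1063\ \text{GFLOPS}, \qquad
\text{GF100: } 512 \times 2 \times f_{\text{shader}}
\]

So on paper GF100 matches the GTX 285 at a shader clock of roughly 1063/1024 ≈ 1.04 GHz, and needs only about 1.35 GHz to be 30% ahead. On the ALU side alone, a sub-30% gain would imply surprisingly low clocks.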
 
Is it just me, or is this all down to some weird typo? I wouldn't expect a GF100 to surpass a 5970 (as opposed to a 5870) without serious overclocking.
 
I guess the worry is how it achieves that. Rumors of 300 W for a single-chip Fermi don't really bode well for a Cypress killer. That's a horrible cost for the performance crown.
Anything less than a 30% lead over the 5870 would be a fail. Either that, or they need to price it at the same level as the 5870.
 
I guess the worry is how it achieves that. Rumors of 300 W for a single-chip Fermi don't really bode well for a Cypress killer. That's a horrible cost for the performance crown.

Nah, only the French site is saying that. At least based on the translation provided.

There's no way that a single Fermi chip will suck that much power.
 
LOL, three Fermi boards on a single loop -- the water on the drain end should be hot enough to boil a chicken. :LOL:

Hard to tell, but it looks like it may have included the CPU, northbridge, AND power MOSFETs on the same loop. Unless they were using two or more quad-fan radiators, there's no way that setup had any benefit over normal air cooling.
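For what it's worth, a rough estimate of the loop's temperature rise, assuming ~300 W per board and a fairly typical ~1.5 L/min (≈0.025 kg/s) water flow:

\[
\Delta T = \frac{P}{\dot m \, c_p} \approx \frac{3 \times 300\ \text{W}}{0.025\ \text{kg/s} \times 4186\ \text{J/(kg·K)}} \approx 9\ \text{K}
\]

So the water at the drain end would come out warm rather than chicken-boiling hot; the real question is whether the radiators can dump 900+ W continuously.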
 
GF100 will be faster than Cypress... Why worry?
It'd better be 50% faster if it draws 60% more current.
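Rough numbers behind that, assuming the commonly quoted 188 W TDP for the HD 5870 and the rumored 300 W for a single-chip Fermi:

\[
\frac{300\ \text{W}}{188\ \text{W}} \approx 1.6, \qquad \frac{1.5}{1.6} \approx 0.94
\]

So even at +50% performance it would still come out slightly behind Cypress on performance per watt.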

Fun fact of the week: they did just what I expected them to do, i.e. show Heaven with tessellation but with the FPS counter disabled... is it so slow that they don't want to show it? Since the launch target is somewhere in March, they could very well claim it's due to the drivers not being ready yet; it would be less of a lie than some of the things they've said recently...
 