GF100 evaluation thread

Whaddya think?

  • Yay! for both: 13 votes (6.5%)
  • 480 roxxx, 470 is ok-ok: 10 votes (5.0%)
  • Meh for both: 98 votes (49.2%)
  • 480's ok, 470 suxx: 20 votes (10.1%)
  • WTF for both: 58 votes (29.1%)

  Total voters: 199
  Poll closed.
In this case, couldn't the TDP be considered a little... adventurous?

Nvidia would most likely have to actually define what their TDP is first, and they've been consistently vague. Unlike, say, Intel or AMD, Nvidia/ATI don't really publish real technical specs for their products.
 
Nvidia would most likely have to actually define what their TDP is first, and they've been consistently vague. Unlike, say, Intel or AMD, Nvidia/ATI don't really publish real technical specs for their products.

Ahh ok, thanks!
 
Nvidia would most likely have to actually define what their TDP is first, and they've been consistently vague. Unlike, say, Intel or AMD, Nvidia/ATI don't really publish real technical specs for their products.

They did (thermal and power specs) up to GTX 295. They don't anymore.
 
They did (thermal and power specs) up to GTX 295. They don't anymore.

Not sure what you mean. NVIDIA publishes thermal and power specifications for GTX 470/480 too:

http://www.nvidia.co.uk/object/product_geforce_gtx_480_uk.html

The one line item that is no longer present from before is "Maximum Graphics Card Power". This spec is pretty meaningless to most users. The far more important spec is "Minimum System Power Requirements". Interestingly, that requirement is 680W for GTX 295, 600W for GTX 480, and 550W for GTX 470. The maximum supported GPU temp is the same for all three of these cards.

Lots of people in the past have used very power hungry setups such as 8800 GTX SLI, GTX 280 SLI, etc. I don't remember most reviewers making too much of a fuss about power consumption or temps back then for those high-end gaming cards. But all of a sudden the reviewers go on and on about it? Go figure. It will be interesting to see what consumers say about the cards when they are available.
 
Well, no big deal was made about it when the cards were roughly equal in power/heat, but in the lead-up to this launch, since AMD was known to have an advantage going in, it became a talking point, much like the R600 issues.

Basically, if two things are roughly comparable on a given measurement, it becomes a non-issue, because reviews exist to find differences, not equivalences.
 
There's also the matter of the extra power draw being worthwhile. Few people complained about the high TDP on high-end Core i7s because they trounce the competition. Most reviewers weren't very happy when the Phenom II 965 launched with a 140W TDP. It was the same with NV30 and R600: lots of power/heat, not much performance in return.

In a way, GF100 is the same. It's ~15% faster than Cypress (in its current form anyway) but draws 50-60% more power.
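
For perspective, a quick back-of-the-envelope calculation using those rough figures (illustrative numbers from this post, not measurements):

```python
# Rough perf-per-watt comparison using the figures quoted above:
# GTX 480 ~15% faster than Cypress while drawing ~50-60% more power.
relative_perf = 1.15                  # GTX 480 vs. HD 5870 performance
for relative_power in (1.50, 1.60):   # low and high ends of the power range
    perf_per_watt = relative_perf / relative_power
    print(f"At {relative_power:.0%} power: {perf_per_watt:.2f}x Cypress "
          f"perf/W ({1 - perf_per_watt:.0%} worse)")
# At 150% power: 0.77x Cypress perf/W (23% worse)
# At 160% power: 0.72x Cypress perf/W (28% worse)
```

So on these numbers, GF100 lands at roughly three quarters of Cypress's performance per watt.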
 
If I recall correctly, ATI had a pronounced advantage in terms of power consumption at load with HD 4870 vs GTX 280. We are simply seeing more of the same with HD 5870 vs GTX 480. But somehow the power consumption at load seems to be a much bigger issue now than it ever was before in the eyes of the reviewers. Of course, the 4870 cards did have some issues with idle power consumption and GPU temps which were rectified with the 4890 and 5xxx cards, but we've seen a pronounced discrepancy in load power consumption for well over a year now.

I can't help but think that immature drivers are handicapping GTX 470/480 to some extent, especially at 2560x1600. In some games at high settings these cards are blazingly fast (to the point where GTX 480 is nipping at the heels of HD 5970), but in other games it is completely the opposite (to the point where GTX 480 is no better than, or even worse than, HD 5870). If NVIDIA wants to save face this go-round, they will have to rely on their driver team for the time being.

Maybe it was a mistake for NVIDIA to have released the GF100 graphics architecture whitepaper a couple months ago. One thing I have always enjoyed reading about when a new architecture is ready to launch (other than testing of gaming performance and image quality) is some details about the new architecture. By pre-announcing the graphics architectural details, we were in a sense robbed of what may have been the most interesting details about GF100, in favor of benchmarking in the latest round of previews. Basically after being strung along for six months with info trickling out once in a while, the only thing left to show was the performance, and that was a bit of a letdown, so the new product launch fizzled for many people.

So what happened to all the close scrutiny that most reviewers used to give towards image quality? It seems that very very few reviewers paid any attention to that. I would have loved to see more testing and image quality comparisons with the new 32x CSAA mode.
 
I can't help but think that immature drivers are handicapping GTX 470/480 to some extent, especially at 2560x1600. In some games at high settings these cards are blazingly fast (to the point where GTX 480 is nipping at the heels of HD 5970), but in other games it is completely the opposite (to the point where GTX 480 is no better than, or even worse than, HD 5870). If NVIDIA wants to save face this go-round, they will have to rely on their driver team for the time being.

If, for example, game X where a 480 fares relatively well is setup-limited, how on earth is NV's driver team going to change anything for applications that aren't setup-limited? I have no doubt there's a good chance that future Forceware drivers will improve overall performance, but I'd still prefer to have each game's result analyzed to detect what the limiting factor is in each case before coming to any conclusion.

So what happened to all the close scrutiny that most reviewers used to give towards image quality? It seems that very very few reviewers paid any attention to that. I would have loved to see more testing and image quality comparisons with the new 32x CSAA mode.

I don't think the expansion from CSAA to TMAA is possible yet in the 197 drivers that the reviews are based on so far.
 
Not sure what you mean. NVIDIA publishes thermal and power specifications for GTX 470/480 too:

http://www.nvidia.co.uk/object/product_geforce_gtx_480_uk.html

The one line item that is no longer present from before is "Maximum Graphics Card Power". This spec is pretty meaningless to most users. The far more important spec is "Minimum System Power Requirements". Interestingly, that requirement is 680W for GTX 295, 600W for GTX 480, and 550W for GTX 470. The maximum supported GPU temp is the same for all three of these cards.

Lots of people in the past have used very power hungry setups such as 8800 GTX SLI, GTX 280 SLI, etc. I don't remember most reviewers making too much of a fuss about power consumption or temps back then for those high-end gaming cards. But all of a sudden the reviewers go on and on about it? Go figure. It will be interesting to see what consumers say about the cards when they are available.

I prefer "Maximum Graphics Card Power" to "Minimum System Power Requirements", since the latter is a bit vague. I mean, what CPU/RAM/HDD/etc. configuration do they have in mind when giving those specs? Is the CPU overclocked, and if so, by how much? Do they assume 4GB, 8GB, or 16GB of RAM? One HDD or four?

To me, the information on the card's own power draw is far more meaningful, as I'm the one who knows how much power the rest of my system draws, and I'm the one who can tell whether my PSU is enough for the entire system or not.
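
For illustration, here's roughly how that math works out once you have a per-card spec (a minimal sketch; every component wattage below is a made-up placeholder, not a measurement):

```python
# Back-of-the-envelope PSU sizing from a per-card power spec.
# All wattages are illustrative placeholders -- substitute your own parts.
components = {
    "CPU": 130,                # e.g. a high-end quad-core at stock clocks
    "Motherboard/RAM": 50,
    "HDDs (x2)": 20,
    "Fans/misc": 20,
}
card_max_power = 250           # the card's "Maximum Graphics Card Power" spec

system_draw = sum(components.values()) + card_max_power
headroom = 0.80                # avoid loading the PSU past ~80% of its rating
psu_needed = system_draw / headroom
print(f"Estimated load: {system_draw} W -> PSU of ~{psu_needed:.0f} W or more")
# Estimated load: 470 W -> PSU of ~588 W or more
```

That's exactly the calculation a blanket "Minimum System Power Requirements" figure hides from you.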

Anyhow, reviewers don't go on about the power draw of SLI setups because it's obvious that they are power hogs. On the other hand, you now have a single card that definitely stands out in terms of power draw. That's what the fuss was all about (it was the same with R600).

If I recall correctly, ATI had a pronounced advantage in terms of power consumption at load with HD 4870 vs GTX 280.

IIRC, GTX 280 had slightly better perf/W than 4870.
 
So what happened to all the close scrutiny that most reviewers used to give towards image quality? It seems that very very few reviewers paid any attention to that. I would have loved to see more testing and image quality comparisons with the new 32x CSAA mode.

I don't know if it's due to AA, but X-Bit noticed IQ issues in Metro 2033. Look here
 
I'm trying to set up 3D Vision to work with GeForce GTX 480 to watch the 2010 Masters Tournament in 3D, so far without success. I have a Samsung 2233RZ monitor, and every time I try to enable "Stereoscopic 3D" from the control panel I get a blank screen saying "check signal cable". I started searching online but couldn't find any reviews/articles testing 3D Vision with GeForce GTX 480. Has anyone else seen any?

I know 3D Vision Surround won't work until the R256 drivers, but how about plain 3D Vision on GeForce GTX 480 with the current 197.17 beta drivers?

I'll switch to a GeForce GTX 285 now and see if I can get it working with the 197.25 drivers.


edit: never mind, switching to a DL-DVI cable helped a lot :oops:
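
For what it's worth, a quick pixel-clock estimate shows why the dual-link cable matters, assuming the 2233RZ is driven at its native 1680x1050 at 120 Hz (the blanking figures below are CVT reduced-blanking approximations, not the panel's exact timings):

```python
# Why single-link DVI can't drive 1680x1050 @ 120 Hz.
# Blanking values approximate CVT reduced blanking; exact timings vary.
h_active, v_active, refresh = 1680, 1050, 120
h_total = h_active + 160       # approximate horizontal blanking
v_total = v_active + 33        # approximate vertical blanking

pixel_clock_mhz = h_total * v_total * refresh / 1e6
print(f"Required pixel clock: ~{pixel_clock_mhz:.0f} MHz")
print("Single-link DVI tops out at 165 MHz; dual-link at 330 MHz")
# Required pixel clock: ~239 MHz -> dual-link required
```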
 
Maybe they just allocated more chips to the professional cards in the end, where they can at least sell them at a profit.
But if they can't manufacture the chips in real quantity, then it's the same disaster as the FX 5800 in the end. :oops:

Actually... the article is another stab at Fuad and his amazingly imaginative sources at NV. Here's the thing they do; I've figured out their algorithm:

They take Cypress's availability at some point in time without mentioning the low number they have in mind, state that availability at launch will be much better for their cards (multiples of Cypress's numbers; note the omitted "launch" here), and then after that mention suggested Cypress launch quantities.

They let you fill in the numbers, and people like Fuad will, without the slightest doubt, post these "tens of thousands" of cards across multiple posts during the week.

When you hear the real numbers from partners, such as a "projected" 50 cards for the month of April for one region, you keep wondering why Fuad never checks with those sources. They seem to have much more realistic numbers than nVidia's media and PR outlet does.
 
Actually... the article is another stab at Fuad and his amazingly imaginative sources at NV. Here's the thing they do; I've figured out their algorithm:

They take Cypress's availability at some point in time without mentioning the low number they have in mind, state that availability at launch will be much better for their cards (multiples of Cypress's numbers; note the omitted "launch" here), and then after that mention suggested Cypress launch quantities.

They let you fill in the numbers, and people like Fuad will, without the slightest doubt, post these "tens of thousands" of cards across multiple posts during the week.

When you hear the real numbers from partners, such as a "projected" 50 cards for the month of April for one region, you keep wondering why Fuad never checks with those sources. They seem to have much more realistic numbers than nVidia's media and PR outlet does.
Didn't Theo also claim more than 50k for launch? We'll see if GF100 availability is really 2.5 times that of Cypress, and then we'll know who is smoking something :cool:
 
Didn't Theo also claim more than 50k for launch? We'll see if GF100 availability is really 2.5 times that of Cypress, and then we'll know who is smoking something :cool:
Yes he did.
Theo's Cypress launch numbers were pretty far off, though... if he doesn't know the numbers from ~6 months ago, do we really think he knows the GF100 numbers before launch? GF100 launch numbers are supposed to be at least 3-4x lower than Cypress's, and that's being optimistic.
 
Maybe they just allocated more chips to the professional cards in the end, where they can at least sell them at a profit.
But if they can't manufacture the chips in real quantity, then it's the same disaster as the FX 5800 in the end. :oops:


If it's true... have we ever seen something like that? I mean, a chip that can't really be produced?

What can Nvidia do except work hard and fast on an easier-to-produce chip?
 
2 neliz
Are you saying that Charlie's numbers are correct? :oops:
The last number I heard was 5000 cards for Europe... and it was supposed to be the "official", real number. A direct claim from NV-Europe. (Of course, I don't mean they said this to me directly.)
 