NVIDIA GF100 & Friends speculation

Discussion in 'Architecture and Products' started by Arty, Oct 1, 2009.

  1. ChrisRay

    ChrisRay R.I.P. 1983- Veteran

    You don't expect there to be good idle power behaviour, i.e. GT200-like? Sure, power is going up; it has in all designs. But if gaming isn't 100% of your computer's usage, then at idle you're likely using less power now than you were with the G92 designs.
     
  2. One that is fast enough to drive my screen with max settings, passively cooled, and cheap. Other than that, I don't care; to me, it should be a black box.

    Ya think I might know a bit more about it, and may in fact be working on an article right now, just taking a break as I eat? :) You might want to stay tuned to S|A for the next few hours. That said, no, I don't agree with it. I understand why, but in this case, I think it was their only option, and a bad one at that. The price paid for it is too high, and the demos they showed off to the press were very specific for a reason.

    -Charlie
     
  3. skinnyq

    skinnyq Newcomer

    Was performance per mm² ever that important to end users?
    A 225W TDP is still somewhat reasonable if the performance jump is big.
     
  4. Working on just that now.

    -Charlie
     
  5. A.L.M.

    A.L.M. Newcomer

    It was a post or an article that said something like "22k Vantage performance, equal to +20% over a HD 5870 @ 1GHz". That was my point; I don't know if I'm being clear. :wink:

    I was just saying that trying to get a ratio between RV870 and GF100 from those numbers alone is pointless. Even if it seems quite clear that GF100 > RV870, it's quite difficult to say by how much from what we know as of now... We just have to wait a few more hours. :wink:
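    A rough sketch of why that ratio is underdetermined (Python; the 22k score is the rumoured figure above, everything else, the clock-scaling factors in particular, is an assumption picked purely for illustration):

        # If ~22k Vantage is "+20% over a HD 5870 @ 1 GHz", the implied
        # 1 GHz baseline is 22000 / 1.20 ~= 18300. How that maps onto a
        # GF100 vs. *stock* HD 5870 (850 MHz) ratio depends on how well
        # the 5870 scales with core clock, which we don't know.
        rumoured_gf100 = 22000
        implied_5870_at_1ghz = rumoured_gf100 / 1.20            # ~18333
        for scaling in (0.6, 0.8, 1.0):                         # assumed clock-scaling factors
            stock_5870 = implied_5870_at_1ghz / (1 + scaling * (1000 / 850 - 1))
            print(f"scaling {scaling:.1f}: GF100 / stock HD 5870 ~= {rumoured_gf100 / stock_5870:.2f}")

    Depending on the assumed scaling, the same 22k figure gives anything from roughly 1.3x to 1.4x over a stock 5870, which is the point: the leak alone doesn't pin the ratio down.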
     
  6. Florin

    Florin Merrily dodgy Veteran Subscriber

    Because without bad-mouthing you'd suddenly have 16 awake hours a day left with nothing to do?
     
  7. Squilliam

    Squilliam Beyond3d isn't defined yet Veteran

    Funnily enough, I'm using a G92 in this computer, haha. What I'm saying is that the majority of the market does care that a card delivers great performance within a certain, smaller TDP envelope: OEMs and typical lower-level enthusiast buyers, who, going by the pie charts released by AMD, make up the bulk of the market by volume.

    I was responding to your comment about the 'majority' of the market. If you read that at face value, the 'majority' are OEMs and low-level consumer designs. Sure, there's a segment of people who buy high-end graphics cards and be damned about the power usage, but increasingly even these people are becoming aware of overall power draw, as seen in the growing popularity of Bronze, Silver and Gold 80 Plus power supplies.
     
  8. CarstenS

    CarstenS Legend Subscriber

    I'm very concerned about my power bill - but only after I've secured "enough" power to run my games the way I want them to run. (Dragon Age: Origins @ 2560x1600 w/ 4x SGSSAA and 16:1 AF, for example, right now.)

    I save power by using a netbook for surfing the net and leaving my game rig completely switched off when I'm not playing.


    Looking forward to the read!
     
  9. argor

    argor Newcomer

  10. trinibwoy

    trinibwoy Meh Legend

    Oh I assumed you would have suggestions on how Nvidia's engineers could have spent their transistor budget more efficiently.

    Hmmm, that's not surprising, is it? AMD's demos also focused on tessellation. It's the new shit, right? :) I'm sure this article you're working on will be a classic; looking forward to an in-depth analysis of where Fermi went wrong...
     
  11. Squilliam

    Squilliam Beyond3d isn't defined yet Veteran

    That's a lot of letters! Try saying the actual words behind them and it'd be quite a mouthful.

    Anyway, you ARE in Germany, so I wouldn't be surprised to find you pay twice as much for your electricity as we enlightened New Zealanders with our lovely renewable electricity. So what does power efficiency mean in terms of computing over there? What's the attitude towards power-hungry graphics cards and TVs?
     
  12. jimmyjames123

    jimmyjames123 Regular

    GF100 NDA breaking at ~ 9pm tonight. Chuck getting ready to post an anti-GF100 article at SA tonight. Coincidence? I think not. ;)

    I've said it before, and I'll say it again: there is a big difference between people being positive and hopeful about their favorite/preferred vendor, as opposed to being negative and hateful about their least favorite/preferred vendor. The latter attitude is unfortunate, as it creates conflict between fans of each vendor, and it takes focus away from what is truly important (i.e. details on, and celebration of, a brand new, radically improved product).
     
  13. CarstenS

    CarstenS Legend Subscriber

    I'm afraid I don't quite understand?

    Yes, electricity is quite expensive here in Germany, I think. It's about 20 Euro-cents per kWh (roughly 30 US-cents); our household pays roughly 400-500 Euros (about 650 US dollars, I think) per year. And yes, I like power-efficient PC equipment - but I have other priorities when selecting parts for my gaming rig, because it is only turned on for gaming, which is vastly dwarfed time-wise by surfing the internet or watching movies (on my girlfriend's PC). So priority number one is enough PC power (fill, tex, bandwidth, FLOPS etc. -> fps for my games). Since the gaming rig is only on a few (single-digit) hours a week, even a 100-watt difference would amount to only 10.x Euros a year, which is around 2-2.5 percent of my electricity bill.

    I think I already said that (without the numbers) in my previous post.
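    For what it's worth, a quick sanity check of that figure (Python; the 10 hours a week and the 450 Euro bill are assumptions chosen to match the rough numbers in the post, not stated facts):

        # Extra annual cost of a card that draws 100 W more while gaming.
        extra_watts = 100
        hours_per_week = 10                  # "a few (single-digit) hours a week", rounded up
        price_per_kwh = 0.20                 # EUR, as stated above
        kwh_per_year = extra_watts / 1000 * hours_per_week * 52    # ~52 kWh
        cost_per_year = kwh_per_year * price_per_kwh                # ~10.4 EUR
        share_of_bill = cost_per_year / 450                         # vs. a 400-500 EUR annual bill
        print(f"{cost_per_year:.1f} EUR/year, about {share_of_bill:.1%} of the bill")

    which lines up with the "10.x Euros a year" above and works out to a couple of percent of the bill.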
     
  14. Andrew

    Andrew Newcomer

    Judging by the past, it's so you can continue making false statements and outlandish predictions. I just love how you gloated about being correct on the HD 5xxx shader count, yet your own articles over the previous year had guesses ranging from 900 to 2000+.

    To keep you honest in the future, I propose any and all 'grand claims' should have a penalty when they turn out completely false. Perhaps something along the lines of how Intel executives made you wear a bunny suit for losing a bet. ;)

     
  15. Silus

    Silus Banned

    Heh I love it. A couple of months ago, tessellation was the big thing for some people (here and in other forums). Now that leaks seem to suggest that Fermi's very good at it, the same people call it a gimmick. It never fails :)

    I'm really curious about the alleged decrease in TMUs vs. GT200, because the leaked numbers suggest much higher performance than GT200. The in-depth architecture analysis will be very interesting.
     
  16. Silus

    Silus Banned

  17. Rangers

    Rangers Legend

    If it is 200% as fast as a GTX 285, well, that's what it should be. It remains baffling that ATI could only get 40% more performance out of what is basically 2x a 4890. They couldn't even manage 50-60%, let alone, say, 90%. It's not all that surprising this may come back to bite ATI. And don't blame memory bandwidth; tests show the 5870 gains more from overclocking the core than the memory.

    As for all this super-duper Fermi tessellation stuff, what good is it? We mostly only get console ports on PC today. Who will design a game that uses this?
     
  18. digitalwanderer

    digitalwanderer Dangerously Mirthful Legend

    LOVE the bunny suit! You did forget to mention how he's been right about Fermi so far in spite of so many people trying to bash him along the way. ;)

    Are you the same guy who started this site? If so, please update it...I'm dying to see what happens next. :yep2:
     
  19. HKS

    HKS Newcomer

    Yes, and remember what good old Mr. nvidia-hater (Charlie) wrote about the GT300:

    http://www.theinquirer.net/inquirer/news/1137331/a-look-nvidia-gt300-architecture

    :lol:
     
  20. DavidGraham

    DavidGraham Veteran
