I think the majority of the perception issues with Fermi don't start with our marketing but rather their own. Heck, even JHH admitted that.
...one video?...
Where is the video? I want that video. I will put it on my facebook wall.
That's exactly what I am talking about. You are proving that GF100 is not good, and that it needed that step in the right direction to make you want to own it...
If they make the 580 using the 460 as a base, then it will be a much more acceptable product for gamers, and not just for all the different CUDA, muda and stuff that sucks...
Unfortunately that's not the case. JHH might have wished for it, but it's not the case...
Well, the folks who actually own one seem to be happy. When you have people screaming from the rooftops about power consumption differences that are less than a single light bulb, then you know you're in loony territory.
Based on what (watt)? Running Furmark, the 480 pulls about 125W more than an HD 5870. Now think about how long you leave a light bulb (or fridge, or AC, or TV) on versus how much time you spend loading your graphics card in a given month or year, and then get back to me on how Fermi breaks the bank.
That is a pretty significant stat. Are you suggesting that anyone who considers 125W significant is in loony territory? Please respond so I can classify you quickly.
I've got my answer, and I think others, including the mods, did too.
Yes, anyone who considers an extra 125W of load power draw from a graphics card as having a significant impact on their electric bill is either crazy or has never actually seen an electric bill. Have you?
So you're trolling for the mods?
It's simple math.
125w/1000 * # hours playing games * $/kWh. What would your gaming habit cost you currently if you had a 480 in your rig?
If I played 2 hours a day every single day it would cost me $1.35 a month. And I live in NY. Oh, lordie!
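For anyone who wants to plug in their own numbers, here is a minimal sketch of that formula in Python. The 30 days/month and the roughly $0.18/kWh rate are assumptions chosen to reproduce the $1.35/month figure quoted above; the only numbers taken from the thread are the 125W delta and the 2 hours/day of gaming.

```python
# Rough monthly cost of an extra 125 W of load power draw while gaming.
# Assumptions (not from the thread): 30 days per month, ~$0.18 per kWh
# as a ballpark NY residential rate that matches the $1.35/month figure.

def extra_monthly_cost(extra_watts: float, hours_per_day: float,
                       dollars_per_kwh: float, days_per_month: int = 30) -> float:
    """Extra electricity cost per month for the given load-power delta."""
    extra_kwh = (extra_watts / 1000.0) * hours_per_day * days_per_month
    return extra_kwh * dollars_per_kwh

if __name__ == "__main__":
    cost = extra_monthly_cost(extra_watts=125, hours_per_day=2, dollars_per_kwh=0.18)
    print(f"Extra cost: ${cost:.2f} per month")  # -> Extra cost: $1.35 per month
```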
I'm sorry your electric bill feels that way. Anyway, people who run air conditioning in the summer would likely feel differently, due to the extra cost of heat removal.
Anyway, on to other matters entirely. The probable reason GF110 (or GF100b, or whatever you prefer to call it) is based on the GF100 architecture rather than a scaled-up GF104 is that the same chips will be used for the Quadro line, which needs capabilities that the more gaming-focused GF104 lacks.
I think your point would be more effective by comparing it to the opportunity cost of wasting so much of your life gaming.
Oh please, drop your pitchfork and look in the mirror. I'm not the one calling people loonies for having a certain preference; perf/watt is a valid metric no matter which way you slice it. A difference of more than 125W under load and 30W at idle is pretty significant. Just because you don't mind it doesn't make others loonies. Next time, think twice before you start calling people names just because they don't share the same opinion as you.