NVIDIA GF100 & Friends speculation

I think the majority of the perception issues with Fermi don't start with our marketing but rather their own. Heck, even JHH admitted that.
 
That's exactly what I am talking about. You are proving that GF100 is not good, and that it needed that step in the right direction to make you own it...
If they make the 580 using the 460 as a base, then it will be a much more acceptable product for gamers, not just for different CUDA, muda and stuff that sucks... :LOL:

I proved GF100 is not good because I bought a 460? :) Wow, you're completely insane and seem to be enjoying it wholeheartedly too.
 
Well, the folks who actually own one seem to be happy. When you have people screaming from the rooftops about power consumption differences that are less than a single lightbulb, then you know you're in loony territory.
Unfortunately that's not the case. JHH might have wished for it, but it's not the case... :LOL:

Can we move on from the Fermi is a success/failure redundant talk and back to Fermi's friends talk please?
 
Unfortunately that's not the case. JHH might have wished for it, but it's not the case... :LOL:

Based on what (watt)? Running FurMark, the 480 pulls about 125W more than an HD 5870. Now think about how long you leave a light bulb (or fridge or AC or TV) on vs how much time you spend loading your graphics card in a given month/year, and then get back to me on how Fermi breaks the bank.
 
Based on what (watt)? Running FurMark, the 480 pulls about 125W more than an HD 5870. Now think about how long you leave a light bulb (or fridge or AC or TV) on vs how much time you spend loading your graphics card in a given month/year, and then get back to me on how Fermi breaks the bank.
That is a pretty significant stat... are you suggesting that anyone who considers 125W significant is in loony territory? Please respond so I can classify you quickly. :rolleyes:
 
That is a pretty significant stat... are you suggesting that anyone who considers 125W significant is in loony territory? Please respond so I can classify you quickly. :rolleyes:

Yes, anyone who considers an extra 125W of load power draw from a graphics card as having a significant impact on their electric bill is either crazy or has never actually seen an electric bill. Have you?
 
Yes, anyone who considers an extra 125W of load power draw from a graphics card as having a significant impact on their electric bill is either crazy or has never actually seen an electric bill. Have you?
I've got my answer and I think others including the mods did too. :cool:
 
So you're trolling for the mods? :rolleyes:

It's simple math.

125W/1000 * # hours playing games * $/kWh. What would your gaming habit cost you currently if you had a 480 in your rig?

If I played 2 hours a day every single day it would cost me $1.35 a month. And I live in NY. Oh, lordie!
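For anyone who wants to check that, here's a quick sketch in Python (the ~$0.18/kWh rate is just my assumption to make the NY figure work out; plug in your own numbers):

Code:
# Monthly cost of an extra 125W of load power draw (rough sketch, Python 3).
# The electricity rate is an assumption (~NY residential); adjust to taste.
EXTRA_WATTS = 125        # FurMark delta vs. the HD 5870 quoted above
HOURS_PER_DAY = 2        # gaming time
DAYS_PER_MONTH = 30
RATE_PER_KWH = 0.18      # $/kWh, assumed

extra_kwh = EXTRA_WATTS / 1000 * HOURS_PER_DAY * DAYS_PER_MONTH  # 7.5 kWh
print(f"~{extra_kwh:.1f} kWh -> ${extra_kwh * RATE_PER_KWH:.2f} a month")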
 
So you're trolling for the mods? :rolleyes:

It's simple math.

125W/1000 * # hours playing games * $/kWh. What would your gaming habit cost you currently if you had a 480 in your rig?

If I played 2 hours a day every single day it would cost me $1.35 a month. And I live in NY. Oh, lordie!

I'm sorry your electric bill feels that way. :( Anyway, it is likely that people who run air-con in the summer would also feel differently due to the extra cost of heat removal.

Anyway, onto other matters entirely. The probable reason why GF110 (or GF100b, or whatever you want to call it) is based on the GF100 architecture rather than a scaled-up GF104 is that they will use the same chips for their Quadro line, which needs capabilities that the more gaming-focused GF104 lacks.

P.S. If anyone is wondering, we (ATI fanatics) are all going to strip ourselves naked and paint ourselves red whilst dancing round the campfire. Afterwards we're all going to talk about the R300 chip and then fall asleep.
 
Everyone put their troll masks away and leave Halloween for actual children, mkay? The signal-to-bait ratio of the last two pages is depressing.
 
I'm sorry your electric bill feels that way. :( Anyway, it is likely that people who run air-con in the summer would also feel differently due to the extra cost of heat removal.

Perhaps, but the same analysis would apply with respect to absolute costs.
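To put a rough number on the air-con point: if the AC moves about 3W of heat per watt it draws (a COP of ~3 is my assumption; real units vary), the cooling overhead stays in the same small ballpark:

Code:
# Rough extra AC cost for pumping the GPU's waste heat back outside (Python 3).
# COP of ~3 is assumed; rate and hours reused from the sketch above.
EXTRA_WATTS = 125
COP = 3.0                # watts of heat removed per watt the AC draws (assumed)
HOURS_PER_DAY = 2
DAYS_PER_MONTH = 30
RATE_PER_KWH = 0.18      # $/kWh, assumed

ac_watts = EXTRA_WATTS / COP                                # ~42W at the wall
ac_kwh = ac_watts / 1000 * HOURS_PER_DAY * DAYS_PER_MONTH   # ~2.5 kWh
print(f"AC overhead: ~{ac_watts:.0f}W -> ${ac_kwh * RATE_PER_KWH:.2f} a month")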

Anyway, onto other matters entirely. The probable reason why GF110 (or GF100b, or whatever you want to call it) is based on the GF100 architecture rather than a scaled-up GF104 is that they will use the same chips for their Quadro line, which needs capabilities that the more gaming-focused GF104 lacks.

Depends on which rumours are true. I've seen everything from a simple respin all the way to a smaller die with 300m fewer transistors. If it's the former then yeah it will most likely find a home in Quadro/Tesla too. But if they did silly things like reduce the amount of L2, number of ROPs or number of schedulers per SM then it's less likely. They could also reduce the GPC count from 4 to 2, remove the artificial limits on rasterization throughput and end up in about the same ballpark. Would they bother fiddling with all that? IMO they would if they cared at all about making a play for the dual-GPU crown.
 
Yes, anyone who considers an extra 125W of load power draw from a graphics card as having a significant impact on their electric bill is either crazy or has never actually seen an electric bill. Have you?

It's not just about the electric bill. I for one would shy away from anything that draws over 200W, because I don't want to reluctantly gaze at my computer during hot summer days, wondering if I should really launch a video game considering the temperature.

In fact, I already sort of do that, and I only have a C2D E8400 with an HD 4850. I don't want to make it any worse.
 
I'm sorry your electric bill feels that way. :( Anyway, it is likely that people who run air-con in the summer would also feel differently due to the extra cost of heat removal.

I normally don't take part in these threads, but I don't think that a graphics card will have much, if any, effect on the electric bill due to heat generation. Usually a thermostat is placed somewhat centrally in a house, mounted about 5 feet up, and I have never seen a computer set up anywhere near a thermostat. They are usually in the bedroom, office, or the basement, not in the hallway.

For example, in my house, which is a two story, the thermostat is located centrally on the first floor between the dining area and the living room. The computers are now set up in the basement, but at one time they were set up on the second floor in one of the bedrooms. I do not think that either location would have any effect on the AC. The in-laws' house down in 'Bama is a ranch with the thermostat located in the hallway on the way to the bedrooms. I do not think the PC set up in one of the bedrooms would have any effect either.

Disclaimer: My desktop has a GTS 250, my son's desktop has a 4350. The old Gateway has the original Radeon Mobility (I believe), and the eMachines laptop has Intel integrated graphics, the 4500M.

Jim
 
If I played 2 hours a day every single day it would cost me $1.35 a month. And I live in NY. Oh, lordie!
I think your point would be more effective if you compared it to the opportunity cost of wasting so much of your life gaming.

(Mint hides from the B3D mob out to lynch him for such outrageous blasphemy...)
 
Heh, I was tempted to play that card too but figured I could make my point with just dollars and cents. But yes, if graphics card power consumption is an issue on the electric bill you're probably spending too much time inside :)
 
So you're trolling for the mods? :rolleyes:
Oh please, drop your pitchfork and look in the mirror. I'm not the one calling people loonies for having a certain preference; perf/watt is a valid metric no matter which way you slice it. A difference of >125W under load and 30W at idle is pretty significant. Just because you don't mind it does not make others loonies. Next time, think twice before you start calling people names just because they don't share the same opinion as you.

Sorry Pete, but that needed to be said.
 
Conspiracy theory: The GTX580 uses the same chip as the GTX480. No respin.
Perhaps NVidia collected all the best chips from the past 6 months of GF100 production, the ones where all 16 SMs worked without voltage boosts. There were too few of these to be released at launch, but now they have a 6 month inventory and can have a non-paper launch under a new model number. Hopefully the 6 months of chip production improved their defect rate enough that they can now keep the 16 SM GTX580 in stock.

Just a wild theory. But it'd explain why it's pin-compatible.
A more impressive GF110 design would be based on the 48 SP per SM arrangement seen in GF104, which gives better performance for the area. Why not scale that up to, say, 576 SPs in 12 SMs, which would use comparable die area to the 512 SPs of GF100? (Hard to judge, since die area isn't 100% SPs, but it's something in the GF100 530mm^2 range...)

I don't think the above theory is true, but it's almost a little plausible. One fact that argues against it: if 512 SP chips were rare to start, why not use those chips for Tesla and make the Tesla cards more powerful than the GTX480? That'd make the compute guys happy, and you'd still skim off a couple of the rare crazy rich gamers buying 4x priced Teslas just to play Battlefield BC2 at one extra FPS.
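To make the back-of-envelope SP math above explicit (the 12 SM part is pure speculation from this thread, not a real product; the GF100 and GF104 numbers are the full-die configs):

Code:
# SP counts: shipped full-die GF100/GF104 vs. the speculative 48 SP/SM scale-up.
configs = {
    "GF100 (full die, ~530mm^2)": (16, 32),   # (SMs, SPs per SM) -> 512 SPs
    "GF104 (full die)":           (8, 48),    # -> 384 SPs
    "Speculative 12 SM GF110":    (12, 48),   # the 576 SP idea above
}
for name, (sms, sp_per_sm) in configs.items():
    print(f"{name:28s} {sms * sp_per_sm} SPs")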
 