Nvidia BigK GK110 Kepler Speculation Thread

Both. Well, mostly long idle, but looking at The Tech Report and HardWare.fr (my most common references) the 7970 GE has a ~2W advantage in idle. Not huge but I think the typical user will spend a lot more time in 2D mode than 3D.

As for long idle, I care about power, but for various reasons my computer needs to remain on for long periods of time when I'm not actively using it, so it matters quite a bit to me. I'm sure I'm not the only one.

Seriously, how much of a difference, in dollars or euros, would that rough 2W gap actually make on an electricity bill? If you use common light bulbs, for example, swapping them for economy bulbs will save you a lot more electricity, and they last longer too.
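For what it's worth, here is a quick back-of-envelope sketch of that question; the around-the-clock idle time and the 0.25 EUR/kWh rate are assumed round numbers, not figures from anyone's actual bill:

```python
# Back-of-envelope: yearly cost of a ~2 W idle-power difference.
# The 24 h/day duty cycle and the 0.25 EUR/kWh rate are assumptions,
# not figures from the thread.
idle_delta_w = 2                  # W, idle power gap between the two cards
hours_per_year = 24 * 365         # worst case: the machine idles around the clock
price_per_kwh = 0.25              # EUR per kWh, assumed retail rate

energy_kwh = idle_delta_w * hours_per_year / 1000
cost_eur = energy_kwh * price_per_kwh
print(f"{energy_kwh:.1f} kWh/year, about {cost_eur:.2f} EUR/year")
# -> ~17.5 kWh/year, i.e. roughly 4.4 EUR/year even in this worst case
```

So even a machine that never sleeps is looking at a few euros per year from that gap.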
 
Seriously, how much of a difference, in dollars or euros, would that rough 2W gap actually make on an electricity bill? If you use common light bulbs, for example, swapping them for economy bulbs will save you a lot more electricity, and they last longer too.
It's great that GPUs have come a long way in this field, but it amuses me how review websites suddenly start measuring long idle too. More 'content' means more clicks, I guess.

If this GPU generation is any indication, the laptop makers don't seem to be too concerned about long idle power either...
 
I had a NAS PC expressly for leaving it on, but now I'm back to the main PC only; maybe buying parts for the NAS, putting it together, etc. is more wasteful than just leaving the main PC on if it was going to be on for long stretches anyway.

As my electricity is nuclear and winter is coming, I've even begun to run BOINC (CPU only), and I feel this makes great sense in my case; otherwise, low idle wattage is a good thing, I think.
 
Seriously, how much of a difference, in dollars or euros, would that rough 2W gap actually make on an electricity bill? If you use common light bulbs, for example, swapping them for economy bulbs will save you a lot more electricity, and they last longer too.

The same argument can be used against the bulb: if you take a typical Western household with heating, cooking, television, cars and, in the case of US households, even home A/C, then changing light bulbs accounts for only about a 1% reduction in energy use.
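A rough sanity check of that ~1% figure, with assumed round numbers for a typical US household (the consumption and driving figures below are guesses for illustration, not data from the thread):

```python
# Rough sanity check of the "~1% of household energy" claim, using assumed
# round numbers for a typical US household (none of these come from the thread).
electricity_kwh = 11000                        # yearly household electricity use
other_energy_kwh = 20000                       # heating, cooking, hot water (often gas)
car_energy_kwh = 2 * 12000 / 25 * 33.7         # two cars, 12k miles/yr, 25 mpg, ~33.7 kWh/gallon

incandescent_kwh = 10 * 60 * 3 * 365 / 1000    # ten 60 W bulbs, 3 h/day
economy_kwh = 10 * 14 * 3 * 365 / 1000         # ~14 W economy replacements
savings_kwh = incandescent_kwh - economy_kwh

total_kwh = electricity_kwh + other_energy_kwh + car_energy_kwh
print(f"bulb savings: {savings_kwh:.0f} kWh/year = "
      f"{100 * savings_kwh / total_kwh:.1f}% of total household energy")
# -> roughly 500 kWh/year, on the order of 1% once heating and cars
#    are counted in the total
```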
 
Implosion won't get any better if prices stay as high as they are, rather the contrary. Simple market rules: the higher the price, the lower the demand.

Don't know where to put it but here the discussion can go on...

DRAM Contract Price Falls Below US$16 Due to Weak Demand

I don't know what to think either. Despite such low memory prices, demand is not what was expected. Other components' prices probably have an influence; memory alone is useless.

However, I found a pretty interesting comment below the article.

2012 was a good year for consumers buying DDR3 memory; I expect them to clean up next year. 2013 will be the year of VERY CHEAP PC parts if you have the patience to wait. Video cards, followed by motherboards, are heading for the largest tumble. Best, Liquid Cool
 
Seriously, how much of a difference, in dollars or euros, would that rough 2W gap actually make on an electricity bill?

Wrong focus. The differences between boards are marginal. A few performance percentage points, a few watts, etc. Different people will focus on different marginal benefits. Arguing about the objective importance of those differences is pointless.

All things discussed so far, I'd buy a Radeon. But my next purchase will be an nvidia card. I don't know that it'll be an improvement, but my current card flashes across monitors when starting/stopping movies. That's a problem because I use the computer mainly for video editing (if I were using Adobe products, CUDA would be a selling point as well, but I don't). This bug has been present in every driver drop I've ever gotten, though it used to be much worse. At some point, scrolling content on one monitor caused the content on my other monitor to tear and otherwise freak out. Well, it's mostly better now, so it's just another marginal benefit -- but it's driving me absolutely bat$%^* crazy. Objectively silly. Subjectively very important.

Of course, I don't know when my next purchase will be, because the current crop of cards is overpriced from my perspective (I don't play games, I "need" the card for my titling software and transition effects).

It's just as likely that the card after that will be a Radeon. I can see myself getting upset about the lack of DirectCompute performance, the poor quality of nvenc, the lack of extreme low power idles (though not likely the 1-2 watt difference in 2D mode :shrug:). But right now, those aren't the marginal features that make the difference.

YMWV
-Dave
 
Seems we can't keep this thread on topic no matter what :LOL: Typically an economy light bulb uses less than 1/3rd the wattage of a conventional one; one of the downsides is that it takes a bit of time until they're properly warmed up and reach their maximum output. It really depends on how many light bulbs someone is using at a time and how many are going to get exchanged. For lighting exclusively, if you exchanged them all in a household the difference isn't just 1% at all, and location makes no difference, since light bulbs are light bulbs. Compared to total electricity consumption the difference might be small, but the same is true for any other appliance or device if you don't isolate the consumption of that particular case.

Besides, heating and air-conditioning consumption depends heavily on how "power conscious" the devices/installation really are, how well maintained they are, how well insulated the apartment or building is, etc.
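To illustrate the "lighting exclusively versus total consumption" point with some assumed numbers (ten bulbs, 3 h/day, an 11,000 kWh/year household; all of these are guesses, not measurements):

```python
# Sketch of the "lighting exclusively vs. total electricity" framing.
# Ten 60 W bulbs swapped for 18 W economy bulbs (under 1/3 the wattage),
# 3 h/day each; the 11,000 kWh/year household figure is assumed.
bulbs, hours_per_day = 10, 3
old_kwh = bulbs * 60 * hours_per_day * 365 / 1000
new_kwh = bulbs * 18 * hours_per_day * 365 / 1000
household_kwh = 11000

lighting_cut = 100 * (old_kwh - new_kwh) / old_kwh
household_cut = 100 * (old_kwh - new_kwh) / household_kwh
print(f"lighting consumption drops {lighting_cut:.0f}%, "
      f"total electricity drops {household_cut:.1f}%")
# -> lighting itself falls ~70%, but the whole electricity bill only ~4%
```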
 
Yes, Nvidia's brand is stronger but WHY?

Sorry that I have to quote myself but perhaps you missed it:

Because those rich milksops (or sissies) don't have a damn clue about what they're buying. Perhaps GeForce sounds better to them than Radeon, which is actually kind of true.
The truth is that people who are deep enough into the hardware field are more willing to buy Radeons.

Apple or nvidia addiction, it is the same. People in the know would prefer to go for HTC or Samsung. :mrgreen:

So, why do people prefer the iPhone, and iStuff in general?
It is simply a matter of what people like more. It's like the question of why person A has a lot of fans despite being whatever, while person B doesn't look any worse but people just don't like him as much.

Simple psychology, very weird.
 
And much like a sporting event, the speculation leading up to the event is often more entertaining than the event itself! :D

There's always some sort of frustration when you find out that you've been dead wrong about it, though, whether we'd like to admit it or not.

That said, either I'm missing something or the difference in power consumption between a K10@225W and a GTX 690@300W seems quite big. Of course I realize that there are sizeable frequency differences and the K10 probably lacks the turbo mode, but if those early K20 specs are for real (13 SMX, 705MHz, 320-bit bus, 5GB, 225W TDP), then I'd rather be safe than sorry. Going up to, say, 850MHz plus turbo, even with 14 SMXs and 3GB of 1250MHz GDDR5 on a 384-bit bus, doesn't sound "that" encouraging at this point for a ~250W TDP under today's conditions.
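For reference, here is the paper math behind those rumored configurations. The 192 SP cores per SMX, the 1/3 DP rate, and the 5 Gbps effective data rate for 1250MHz GDDR5 are my assumptions about GK110, not confirmed specs:

```python
# Paper math for the rumored K20 configurations quoted above. The 192 SP
# cores per SMX, the 1/3 DP rate and the 5 Gbps GDDR5 data rate are
# assumptions about GK110, not confirmed figures.
def sp_gflops(smx, mhz, cores_per_smx=192):
    return smx * cores_per_smx * 2 * mhz / 1000   # 2 FLOPs/core/clock (FMA)

def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(f"13 SMX @ 705 MHz: {sp_gflops(13, 705):.0f} GFLOPS SP, "
      f"{sp_gflops(13, 705) / 3:.0f} GFLOPS DP")
print(f"14 SMX @ 850 MHz: {sp_gflops(14, 850):.0f} GFLOPS SP, "
      f"{sp_gflops(14, 850) / 3:.0f} GFLOPS DP")
print(f"320-bit @ 5 Gbps: {bandwidth_gbs(320, 5):.0f} GB/s, "
      f"384-bit @ 5 Gbps: {bandwidth_gbs(384, 5):.0f} GB/s")
# -> ~3519 vs ~4570 GFLOPS SP and 200 vs 240 GB/s for the two rumored setups
```

Under those assumptions the 850MHz/14 SMX case asks for roughly 30% more arithmetic throughput and 20% more bandwidth than the rumored 225W part, which is why a ~250W TDP looks optimistic.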

We'll find out eventually next year I guess.
 
Seriously, how much of a difference, in dollars or euros, would that rough 2W gap actually make on an electricity bill? If you use common light bulbs, for example, swapping them for economy bulbs will save you a lot more electricity, and they last longer too.

Not much; however, it's about as significant as the 20~60W difference under full load, which applies for much shorter periods, unless you really game a lot.

And that was my point: for usage patterns I believe to be typical, the energy consumption contest is roughly a wash.
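A quick sketch of why it comes out as a wash under one assumed usage pattern (the 20 h idle / 2 h gaming split and the low end of the 20~60W range are my assumptions, not measured data):

```python
# Why the contest is "roughly a wash" under an assumed usage pattern:
# ~20 h/day at idle vs ~2 h/day gaming, taking the low end (20 W) of the
# quoted 20~60 W load difference. The splits themselves are assumptions.
idle_delta_w, idle_hours = 2, 20
load_delta_w, load_hours = 20, 2

idle_wh = idle_delta_w * idle_hours   # daily Wh saved by the lower idle draw
load_wh = load_delta_w * load_hours   # daily Wh lost to the higher load draw
print(f"{idle_wh} Wh/day saved at idle vs {load_wh} Wh/day extra under load")
# -> 40 Wh vs 40 Wh: with these numbers the two effects cancel out; at the
#    60 W end, or with longer gaming sessions, the load difference dominates
```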
 
@UniversalTruth I'm sure that is what AMD has been telling themselves for years now. How is that working out?

IOW, you are wrong.
 
There's always some sort of frustration when you find out that you've been dead wrong about it, though, whether we'd like to admit it or not.

That said, either I'm missing something or the difference in power consumption between a K10@225W and a GTX 690@300W seems quite big.

Nvidia's funny TDP rating system. http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_690/26.html The GTX 690 draws less average power, the same peak power, and less maximum power than the GTX 480, yet the GTX 480 was listed as a 250 watt card, while the GTX 690 is 300 watts.
 
It's great that GPUs have come a long way in this field, but it amuses me how review websites suddenly start measuring long idle too. More 'content' means more clicks, I guess.

If this GPU generation is any indication, the laptop makers don't seem to be too concerned about long idle power either...
I don't see the humor. It's a useful data point, and previously there was no difference between monitor-on and monitor-off idle power.
 
Sorry that I have to quote myself but perhaps you missed it:

So, why do people prefer the iPhone, and iStuff in general?
It is simply a matter of what people like more. It's like the question of why person A has a lot of fans despite being whatever, while person B doesn't look any worse but people just don't like him as much.

Simple psychology, very weird.
nVidia had much better parts than ATI by most measures for many years back in the early days of 3D. At the time, ATI parts had very widespread adoption, but very poor 3D capabilities. People would swap out an ATI card to put in an nVidia card. I don't think this perception of ATI's poor quality ever quite left them.

Now, nVidia has had its setbacks. The GeForce FX was a huge one, with ATI's parts of the time dramatically outperforming them (at least in most games... I still remember being supremely annoyed when City of Heroes played better on my GeForce Ti4200 than on a Radeon 9700 Pro; given the 9700's capabilities, that should never have happened). But they bounced back relatively quickly with the GeForce 6 series, and have maintained parts that were highly competitive with ATI/AMD's in one way or another ever since. Even when AMD/ATI has been ahead in one way or another since the FX debacle, there has never been a time when its parts were a knock-down, clear win for consumers, but there have been a few times when nVidia's were. The GeForce 8 series and the GeForce 6xx series in particular have been quite dramatic wins for nVidia, though the 6xx series less so.

This isn't to say that they always won all of the benchmarks, but the GeForce 8 series combined dramatic performance with incredible computing power and tremendous improvements in visual quality, while the GeForce 6xx series pushed a dramatic increase in performance per watt, with the possibility of some truly incredible performance in the possible upcoming "Big Kepler" part.

nVidia's consistent delivery and attention to consumer interests has earned them a big following. It really shouldn't be much of a surprise.
 
I would say that Cypress and Juniper were as clear wins for AMD as anything Nvidia had done since the 8 series. The problem is they didn't build on it and they've just slowly slipped back into Nvidia's grasp.
 
nVidia had much better parts than ATI by most measures for many years back in the early days of 3D. At the time, ATI parts had very widespread adoption, but very poor 3D capabilities. People would swap out an ATI card to put in an nVidia card. I don't think this perception of ATI's poor quality ever quite left them.

I don't think that's fair. You are talking about pure framerates, but as far as I know it was exactly ATi and Matrox delivering 'much' superior image quality in those years. And I don't like nvidia because I once owned a Matrox G200, changed it for a GeForce 2, and the image quality was dramatically worse. From that moment on, no respect whatsoever for that company.

the GeForce 8 series combined dramatic performance with incredible computing power and tremendous improvements in visual quality

I had the chance to compare an X1600 Pro with an 8500GT, and the ATi card offered superior 3D image quality. ;)
 
It's great that GPUs have come a long way in this field, but it amuses me how review websites suddenly start measuring long idle too. More 'content' means more clicks, I guess.
For anyone who uses their PC for games a few hours a day but leaves it on the rest of the time, it's a very important factor; likewise for businesses, internet cafes, etc. It is also a key metric in some regulatory specs.
 
And I don't like nvidia because I once owned a Matrox G200, changed it for a GeForce 2, and the image quality was dramatically worse. From that moment on, no respect whatsoever for that company.
That's some very sensible and rational thinking when making a 2012 purchasing decision. Let's also not forget the uselessness of the NV1 and 3DMark-early-this-century.
 