NVIDIA Kepler speculation thread

As far as I know (and maybe this was previously discussed here), Bitcoin mining is not currently profitable, meaning the cost of electricity is usually higher than the value of the coins mined.

Whether the 7970 changed that, I don't know. But it's not all that likely it did.
 
Is this meant to be the 690 (as some gaffers say) or just a 670 Ti, etc.? I assume the latter; am I wrong?

Funny, they may get the salvage parts out while the 680 itself still isn't available :LOL: That's gotta be a first.

Impossible to know from this screenshot... but I'll bet on the 660/670. It would be much more profitable in terms of sales to release it as fast as they can.
 
Really? It helps to measure integer performance, does it not?

No, not really. The performance you get on Bitcoin is not relevant to any loads other than crypto. There are a lot of other kinds of integer loads (DNA sequencing, for one) for which Bitcoin performance is an absolutely awful predictor.

To have good "integer performance" for generic loads, you need to be good at a lot of things, from the arithmetic units to the cache system. BTC mining really only cares about how many bit shifts and bitwise logic operations you can do per second. Nothing else matters. You could remove the memory interface and caches from an AMD GPU and it would still rock at BTC, while that would make it completely unusable for anything else.

So it's a bad benchmark.
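For concreteness, here's a minimal sketch (Python, with 32-bit wraparound emulated via a mask; this is just textbook SHA-256, not anything from this thread) of the core of one SHA-256 round, the primitive that Bitcoin mining hammers on billions of times per second. As it shows, a round is nothing but rotates, shifts, XOR/AND, and 32-bit adds, with no memory access in sight:

[code]
MASK = 0xFFFFFFFF  # emulate 32-bit integer wraparound in Python

def rotr(x, n):
    """32-bit rotate right -- the dominant operation in SHA-256."""
    return ((x >> n) | (x << (32 - n))) & MASK

def sha256_round_core(a, b, c, e, f, g, h, k, w):
    """Core arithmetic of one SHA-256 round: only bitwise ops and adds."""
    s1  = rotr(e, 6) ^ rotr(e, 11) ^ rotr(e, 25)
    ch  = (e & f) ^ (~e & MASK & g)
    t1  = (h + s1 + ch + k + w) & MASK
    s0  = rotr(a, 2) ^ rotr(a, 13) ^ rotr(a, 22)
    maj = (a & b) ^ (a & c) ^ (b & c)
    t2  = (s0 + maj) & MASK
    return t1, t2   # new e = d + t1, new a = t1 + t2 (both mod 2^32)
[/code]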
 
Communication will slow down the computation, no doubt. But that does not change the fact that GK104 obviously has quite low raw integer performance (only integer adds are reasonably fast). In typical gaming and a lot of GPGPU workloads this may not matter much, but cryptography and also some other mathematical problems are valid application fields of GPGPU.

But is the Bitcoin mining benchmark a good indicator of general integer performance? Probably not.
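To put a rough number on how purely ALU-bound this workload is, here's a back-of-envelope model (Python; every figure is my own rough assumption, not a measurement): a Bitcoin hash is double SHA-256 over the 80-byte header, so if you assume some count of simple 32-bit integer ops per compression, the hash-rate ceiling is just op throughput divided by ops per hash. Caches and memory don't appear anywhere in the formula.

[code]
OPS_PER_COMPRESSION = 2500   # rough guess: 64 rounds + message schedule
COMPRESSIONS_PER_HASH = 3    # 2 for the 80-byte header + 1 for the re-hash
                             # (miners cache the first block's midstate,
                             # which effectively cuts this to ~2)

def hashrate_ceiling(int_ops_per_second):
    """Upper bound on hashes/s for a purely ALU-bound miner."""
    return int_ops_per_second / (OPS_PER_COMPRESSION * COMPRESSIONS_PER_HASH)

# e.g. a hypothetical GPU sustaining 2e12 simple integer ops/s:
print(f"~{hashrate_ceiling(2e12) / 1e6:.0f} MH/s ceiling")
[/code]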

That is a misconception, in my opinion. Newer hardware is going to provide higher performance/Watt. What is cost-effective today won't be cost-effective tomorrow, as the effort needed to mine new bitcoins rises. Therefore, at some point you need to replace the hardware with something else to keep the mining profitable. That also means there is a limited lifespan over which a piece of hardware can be used. For that simple reason, the initial cost of buying the hardware of course plays a role. If a certain piece of hardware costs less to buy and achieves a higher absolute performance, you can tolerate a worse performance/W and still reach a higher profit.

No, my point is that the GPU is not and will not be the best choice of hardware for Bitcoin mining, simply because FPGAs/ASICs are and will be much better. And since Bitcoin mining is all about money, performance per watt counts: if everyone else is achieving better performance per watt than you, you are going to lose money. Furthermore, from what I've seen, a standalone FPGA miner is already cheaper than a high-end GPU right now (with better performance and performance per watt). It's only going to get cheaper later if Bitcoin proves to be useful, not to mention if someone actually decides to make an ASIC for it.
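To illustrate the performance-per-watt squeeze with a toy model (all numbers below are invented placeholders, not real GPU or FPGA specs): revenue scales with hash rate while cost scales with wattage, so as revenue per hash falls with rising difficulty, the device with worse perf/W goes underwater first.

[code]
def daily_profit(mhash_per_s, watts, usd_per_mhash_day, usd_per_kwh=0.10):
    revenue = mhash_per_s * usd_per_mhash_day
    power_cost = watts / 1000 * 24 * usd_per_kwh   # kWh per day * price
    return revenue - power_cost

# Hypothetical GPU (600 MH/s, 250 W) vs. FPGA board (800 MH/s, 80 W),
# as revenue per MH/s per day falls with rising difficulty:
for rate in (0.004, 0.002, 0.001):
    gpu, fpga = daily_profit(600, 250, rate), daily_profit(800, 80, rate)
    print(f"rate {rate:.3f}: GPU {gpu:+6.2f}  FPGA {fpga:+6.2f} USD/day")
# The GPU hits break-even while the FPGA is still in the black.
[/code]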
 
The market dictates that the long-term price of bitcoins will depend on the supply of bitcoin (fixed) versus demand. Demand is driven by its usefulness as a currency. That's what makes the whole thing a joke. The speculators should ride the hype train while it's rolling but there's a cliff waiting at the end of the tracks.

Of course, my point is "IF" Bitcoin ever gets really useful. Personally I don't see that because there are just too many problems. But since this thread is really not about Bitcoin I think I'll stop here.
 
Maybe they mean the GTX 680 is finally coming to Newegg. :p
The GTX 680 has been "coming to Newegg" many times over the last two weeks.

I have gotten many Auto-Notifies on various vendors' GTX 680s over the last two weeks. That said, when I did check (usually an hour or more later) they were again sold out. So demand is still high and supply still low.
 
I saw one in stock on Newegg this AM.
Tigerdirect has two models in stock right now.

Not in the market though. My 580s will hold me over a while longer.
 
Really? It helps to measure integer performance, does it not? What about hashCat? Does that app not count either just because AMD's GPUs are so much better than nvidia's?

Other tests may stress different parts of the chip, such as caches or float/double performance, but that doesn't make Bitcoin or hashCat any less useful in providing information about the chip being tested.

-FUDie

They are synthetics and they basically don't touch mem at all. Not very useful if you want a holistic picture.
 
But is the Bitcoin mining benchmark a good indicator of general integer performance? Probably not.
What is "general integer performance" on a GPU anyway?
No, my point is that the GPU ...
I got your point the first time already. My point is that you can't neglect the initial investment cost. You can run an FPGA for quite some years before the accumulated power costs start to approach the initial investment (for a high-end GPU it is also more than a year). In that timeframe, newer hardware will have made the particular FPGA in question obsolete. The goal has to be a reasonably high ROI, i.e. a high return in relation to the initial investment. It has to be profitable within one or two years at the latest, because after that a new hardware generation will probably crush the old one. A low initial investment helps with this.
Only the latest FPGA miners are really faster than GPUs for a somewhat similar price as a high-end GPU (but the current ones are quite bad deals in this respect, so with FPGAs you still pay more money per unit of hashing speed upfront if you don't want to go big, which necessitates investing north of $10,000).
And developing and manufacturing a custom ASIC for this purpose is quite unfeasible at the moment. You don't have a high-volume market (as for GPUs and even FPGAs) over which the huge costs can be distributed.
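As a hedged sketch of this ROI argument (all prices, hash rates, and wattages below are invented for illustration, not real product figures): what matters is payback time, i.e. initial cost divided by net daily profit, so a cheaper device can pay for itself sooner even with worse performance per watt.

[code]
def payback_days(price_usd, mhash_per_s, watts,
                 usd_per_mhash_day=0.004, usd_per_kwh=0.10):
    net = mhash_per_s * usd_per_mhash_day - watts / 1000 * 24 * usd_per_kwh
    return float("inf") if net <= 0 else price_usd / net

# Hypothetical $400 GPU vs. $1000 FPGA board:
print(f"GPU : {payback_days(400, 600, 250):.0f} days to break even")
print(f"FPGA: {payback_days(1000, 800, 80):.0f} days to break even")
# Despite worse perf/W, the cheaper GPU breaks even first here (~222 vs.
# ~332 days) -- and it has to, before newer hardware obsoletes both.
[/code]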

But I guess we should leave this stupid BTC profitability discussion to the guys doing the mining. :LOL:
 
What is "general integer performance" on a GPU anyway?
I got your point the first time already. My point is that you can't neglect the initial investment cost. You can run an FPGA for quite some years before the integrated power costs starts to approach the initial investment (for a high end GPU it is also more than a year). In that timeframe, newer hardware will made the particular FPGA in question obsolete. The goal has to be a reasonably high ROI, i.e. a high return in relation to the initial investment. It has to be profitable within one or two years at the latest because after that a new hardware generation will probably crush the old one. A low initial investment helps with this.
Only the latest FPGA miners are really faster than GPUs for a somewhat similar price as a high end GPU (but the current ones are quite bad deals in this respect so with FPGAs you still pay more money per hashing speed upfront if you don't want to go big which neccesitates investing north of 10,000$).
And developing and manufacturing a custom ASIC for this purpose is quite unfeasible in the moment. You don't have a high volume market (as for GPUs and even FPGAs) where the huge costs can be distributed.

But I guess we should leave this stupid BTC profitability discussion to the guys doing the mining. :LOL:
Also, once you use BTC mining to pay off your GPU, you can use it to game and whatnot, or even just resell it on eBay. If you use an FPGA board and pay it off, you get....
 
It's about as useful a benchmark as, say, FurMark or OCCT. No real-world counterpart, but it's still seen time after time in reviews.

Bitcoin mining would be just as useful for judging a video card as those. :) Well, except people actually do Bitcoin mining. So I guess it's slightly more useful as a GPU metric.

Regards,
SB
 
If bitcoin is really what all that fuss is about, maybe AMD should keep all their GPU chips and build their own mining farm instead.

They might finally... turn profitable.
 