NVIDIA Maxwell Speculation Thread

Litecoin is the more relevant benchmark, as Bitcoin is now dominated by ASICs (from what I've heard). But AMD apparently retains a significant advantage in Litecoin.

Where exactly do you see that "significant advantage"?

[Image: Litecoin.png (LTC mining performance chart)]

But the Maxwell architecture's improvements allow the 60 W GeForce GTX 750 Ti to outperform the 140 W GeForce GTX 660 and approach AMD's 150 W Radeon R7 265, which just launched, isn't available at retail yet, but is expected to sell for the same $150. On a scale of performance (in kH/s) per watt, that puts Nvidia way out ahead of AMD. Today, four GM107-based cards in a mining rig should be able to outperform a Radeon R9 290X for less money, using less power.
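The "performance per watt" comparison in that quote is just kH/s divided by board power. A quick sketch of the arithmetic, using placeholder hash rates (NOT the review's measured values; plug in your own benchmarks):

```python
# Perf-per-watt comparison for Litecoin (scrypt) GPU mining.
# Hash rates below are illustrative placeholders, not measurements.
cards = {
    # name: (hash rate in kH/s, board power in W)
    "GTX 750 Ti": (260.0, 60.0),   # placeholder figures
    "GTX 660":    (170.0, 140.0),  # placeholder figures
    "R7 265":     (280.0, 150.0),  # placeholder figures
}

def efficiency(khs: float, watts: float) -> float:
    """Mining efficiency in kH/s per watt of board power."""
    return khs / watts

for name, (khs, watts) in cards.items():
    print(f"{name:>10}: {efficiency(khs, watts):.2f} kH/s per W")
```

Even with rough numbers, a card at similar hash rate but less than half the board power dominates this metric, which is the point the review is making.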
 
Why is the marketing material mainly comparing the card to the GTX 550 Ti, which is a 40nm chip with half the transistors?
Sounds like they're avoiding the comparisons to the GTX 650 Ti...
Anandtech's take on the matter: "The downside to this is that the long upgrade cycle for video cards – many users are averaging 4 years these days…" and "NVIDIA is making a major effort to target GTX 550 Ti and GTS 450 owners as those cards turn 3-4 years old, with the GTX 750 series able to easily double their performance while reducing power consumption."

I suspect that, in GPU land, there are two major groups: the ultra-enthusiasts who upgrade whenever something major comes out (but they aren't the target of a 750) and those who, indeed, skip generations.

It makes sense then for marketing to focus on the performance increase within that segment.
 
cudaMiner is still not optimized for this new CUDA compute device model. There should be even more performance squeezed out in the future.
After clicking the links, that's what happened to me as well. However, on the new tab/page, select the address and press Enter again; that worked for me.
Links are fixed now.
 
Is there a large discrete notebook market below that? My laptop world is skewed by Apple, where you have a GK107 or nothing. ;)
Yes, very much so. Note the cost on any notebooks using Iris Pro...

Where exactly do you see that "significant advantage"?

While GPU perf/w is accurate, from a serious mining perspective these are not sensible solutions because you still have to deal with a ~60-100W baseline platform power; platform hash/w is what matters.
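The platform-overhead point is easy to quantify: a fixed baseline (CPU, motherboard, PSU losses) gets amortized across however many cards share the system. A sketch with an assumed 75 W baseline and placeholder per-card figures:

```python
# Why platform hash/W differs from GPU hash/W: a fixed baseline draw
# (CPU, motherboard, PSU losses) is shared by all cards in the rig.
# All numbers here are illustrative assumptions, not measurements.
PLATFORM_W = 75.0  # assumed baseline platform power draw

def platform_efficiency(khs_per_card: float, watts_per_card: float,
                        num_cards: int) -> float:
    """Whole-system kH/s per watt, including the platform baseline."""
    total_khs = khs_per_card * num_cards
    total_w = watts_per_card * num_cards + PLATFORM_W
    return total_khs / total_w

# One hypothetical 60 W / 260 kH/s card vs. four in the same box:
print(platform_efficiency(260, 60, 1))  # baseline dominates one small card
print(platform_efficiency(260, 60, 4))  # baseline amortized over four
```

A single low-power card takes a big relative hit from the baseline; filling the slots amortizes it, which is why miners favor dense rigs of the highest-hash-rate cards they can fit.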
 
Do I dare to dream of a GTX x60-class Maxwell card at 100 W that outperforms the GTX 760 (170 W)?

Do I dare to dream of a GTX x70-class Maxwell card at 150 W that outperforms the GTX 770 (230 W)?

Do I dare to dream of a GTX x80 Ti-class Maxwell card at 200 W that outperforms the TITAN BLACK (250 W) and writes new world records for performance per watt, making that graph even more hilariously lopsided?
 
While GPU perf/w is accurate, from a serious mining perspective these are not sensible solutions because you still have to deal with a ~60-100W baseline platform power; platform hash/w is what matters.

You do realize that this is the first Maxwell part released and that others will follow. In the first Maxwell generation we have the GM107 today, and the GM106 will follow soon. The 20 nm GM2xx parts come in 2H2014.
 
VERY impressive gains on the power envelope, looks like a good candidate for my new laptop ;)
But for the actual product in question, it seems quite overpriced, as I think it's largely irrelevant to most whether such a retail desktop card draws 60 or 100 W... It's not enough to be a problem in any way (neither noise nor power) in the usual case.
Don't get me wrong, for laptop performance and some SFFs it matters, and for a card with twice the performance and power consumption it would matter a lot more, but not for this particular product segment.

...maybe in optimizing cgminer to be more GCN friendly with profiler, where you can easily get +10-20%? :p

Yep, the cgminer numbers on that graph are very low (i.e. badly tuned); I just tested this stock 7850 at ~290, and the 265 should be clearly above that with its 1400 MHz memory.
Litecoin mining is also very bandwidth-limited (that's the whole point of Litecoin compared to Bitcoin: to avoid the ASIC miners), which is likely why it's doing so much better relative to GCN than in the pure on-chip number crunching of the Bitcoin numbers, especially with its great bandwidth efficiency.

And of course people are already paying the premium for the 290s (instead of going with, e.g., twice as many 270Xs) because you can only fit so many graphics cards (PCIe slots) in a system, which also requires a motherboard, RAM, CPU, HDD, PSU...
 
You do realize that this is the first Maxwell part released and that others will follow. In the first Maxwell generation we have the GM107 today, and the GM106 will follow soon. The 20 nm GM2xx parts come in 2H2014.
I'm sure he realizes that. But miners need the performance now, while the gold rush is still profitable. It doesn't matter what happens 6 months from now.
 
I'm sure he realizes that. But miners need the performance now, while the gold rush is still profitable. It doesn't matter what happens 6 months from now.

It will if the GM106/GM204 & GM200 turn out to be better than any other solution available in the future including ASICs.

The use of scrypt (a password-based key derivation function) in their proof-of-work algorithm, rather than Bitcoin's SHA-256, makes dedicated hardware more difficult to develop. So, GPUs still rule, even if increasing difficulties make the investment in equipment and power greater than current returns.
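Litecoin's proof-of-work is scrypt over the 80-byte block header, with the header itself as the salt and parameters N=1024, r=1, p=1, producing a 32-byte hash. The memory-hard step is easy to demonstrate with Python's standard library, which exposes scrypt directly (requires OpenSSL 1.1+):

```python
import hashlib

def litecoin_pow_hash(header: bytes) -> bytes:
    """Litecoin-style PoW hash: scrypt(header, salt=header,
    N=1024, r=1, p=1, dklen=32). With r=1 and N=1024 this touches
    128*N*r = 128 KiB of scratch memory per hash, which is what
    makes cheap SHA-256-style ASICs hard to build for it."""
    return hashlib.scrypt(header, salt=header, n=1024, r=1, p=1, dklen=32)

# Dummy 80-byte header (a real one holds version, prev hash,
# merkle root, time, bits, nonce):
h = litecoin_pow_hash(b"\x00" * 80)
print(h.hex())
```

The 128 KiB working set per hash is small by CPU standards but large enough that, in 2014, it favored GPUs with plenty of memory bandwidth over fixed-function hashing silicon.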

http://www.tomshardware.com/reviews/geforce-gtx-750-ti-review,3750-17.html
 