Nvidia BigK GK110 Kepler Speculation Thread

Didn't someone recently publish a study showing that tri-SLI had the least microstutter? Even less than quad?

You mean the botched THG review 1-2 years ago? Not really. They ran into a CPU bottleneck with 3 GPUs, thus microstutter obviously decreased. They only provided one single diagram with frametimes, so there is basically no data that would support their claims.
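For what it's worth, the reason a single frametime diagram proves little is that microstutter shows up in the *variation* between consecutive frametimes, not in average fps. A minimal sketch of one possible metric (this is illustrative, not the methodology THG or anyone else actually used; the function name and normalization are my own):

```python
def microstutter_index(frametimes_ms):
    """Mean absolute difference between consecutive frametimes,
    normalized by the mean frametime. 0 = perfectly even pacing."""
    if len(frametimes_ms) < 2:
        return 0.0
    deltas = [abs(b - a) for a, b in zip(frametimes_ms, frametimes_ms[1:])]
    mean_ft = sum(frametimes_ms) / len(frametimes_ms)
    return (sum(deltas) / len(deltas)) / mean_ft

even = [16.7] * 8                       # smooth ~60 fps pacing
alternating = [10.0, 23.4] * 4          # classic AFR-style microstutter, same average
print(microstutter_index(even))         # 0.0
print(microstutter_index(alternating))  # high despite identical average fps
```

Both traces average ~60 fps, but only the second one stutters, which is exactly why you need full frametime data (not one chart) to support claims about which SLI configuration stutters least.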
 
Hmm... I ask myself: is it better to sell two cards at $500 or one at $1000?

One at $1000, obviously, as the costs of producing and distributing two cards are double those of one, i.e. more profit.

2 at $600 or 1 at $1000 is another question though. But the crazy thing is even $600 is ridiculously expensive. $1000 is totally insane.
 
These distribution costs... given that you can buy from China and have it shipped to Europe for free, I would assume distribution costs are measured in pennies.
 

If I am not mistaken, these 1k prices include taxes (VAT).

If the GeForce Titan 6GB version is released with an MSRP anywhere close to $849 USD, that will be a fair and reasonable price given current street prices:

- Asus Ares 2 (HD 7970 GHz Ed. 3GB x2): $1499 USD (http://www.newegg.com/Product/Product.aspx?Item=N82E16814121717&Tpk=Ares%202)
- GTX 690 (2GB x2): $999 USD (http://www.newegg.com/Product/Product.aspx?Item=N82E16814130781)
- HD 7970 GHz Ed. 6GB: $599 USD (http://www.newegg.com/Product/Product.aspx?Item=N82E16814202005)
- GTX 680 4GB: $529 USD (http://www.newegg.com/Product/Product.aspx?Item=N82E16814500288)

GeForce Titan should have far higher performance per watt than any 7970 GHz Ed. or 7970 GHz Ed. x2 card, somewhat higher performance per watt than any GTX 680 card, and similar or better performance per dollar than the 6GB 7970 GHz Ed. and the 4GB GTX 680.
 
You mean the botched THG review 1-2 years ago? Not really. They ran into a CPU bottleneck with 3 GPUs, thus microstutter obviously decreased. They only provided one single diagram with frametimes, so there is basically no data that would support their claims.

Seemed far more recent. Think it had 79xx and Kepler in it... I'll dig for it in a bit.
 
In that case, Nvidia and AMD still have the cards built for them, which is quite different from building them themselves.

I wouldn't be surprised at all if, say, eVGA shares the same production line with other vendors to save on costs.

Of course, neither company mass-produces cards in-house in any appreciable quantity. I can't even remember which was the last product generation where AMD manufactured a significant number of cards themselves; it has been a long, long time since they did. The HD 2xxx generation? I can't recall whether they were still making cards themselves then, or whether all the ATI-branded cards were being made by Sapphire at that point.

Regards,
SB
 
IIRC, ATI at least had a (small) facility in Canada where they put together some prototype boards and whatnot. Not something you would buy in a shop, of course, and I'm not sure whether that has since been optimized away...
 
IIRC, ATI at least had a (small) facility in Canada where they put together some prototype boards and whatnot. Not something you would buy in a shop, of course, and I'm not sure whether that has since been optimized away...

I believe ATI's manufacturing of retail products ended with the All in Wonder.
 
They ran into a CPU bottleneck with 3 GPUs.

Speaking of bottlenecks... just look at how poorly the Intel Core i7-3960X Extreme Edition (3.30GHz) performs. It's only a single game, but still.

[benchmark chart]


http://www.techspot.com/review/603-best-graphics-cards/page4.html
 
^ That is a matter of perspective. If your system is bogged down due to the CPU, no matter the exact reason, it's a bottleneck. If one were to follow your logic, you could blame almost every game out there for piss-poor coding because it doesn't run 6x as fast on a hexacore as on a single core.
 
Speaking of bottlenecks... just look at how poorly the Intel Core i7-3960X Extreme Edition (3.30GHz) performs. It's only a single game, but still.

[benchmark chart]


http://www.techspot.com/review/603-best-graphics-cards/page4.html

I don't get what point you're trying to make. Clearly the Core i7-3960X isn't a poor CPU; that game is simply CPU-heavy and likely doesn't make full use of all 12(!) threads. And it's still able to hit 75fps, which should be enough by anyone's standards.
 
Oh, I agree. I'm just pointing out that it's not actually the fault of the CPU.

I think that is just semantics. A bottleneck is a bottleneck, the actual cause of this bottleneck doesn't change this. Btw I don't find it smart to judge games this way. We don't know how (in)efficient the code really is, we know nothing about the background. Maybe it just cannot work any better, who knows?

It would be equally silly to say that games are badly coded because there is a GPU bottleneck at 1600p with 4xAA. Bottlenecks are completely normal.
 