Nvidia BigK GK110 Kepler Speculation Thread

Oh, I didn't know MSI Afterburner was basically made by Rivatuner's author.
I remember when, just four years ago, I still had XP, Rivatuner and nHancer. It was tweaking and gaming heaven (and now all of them are deprecated).

The second post is interesting; it hints at TDP target switching.
End-user control of the TDP is something that should probably be generalized. Imagine I have a 150W TDP GPU: it would be worth it to drop the TDP to something like 75W to play Counter-Strike, then run it at full power (or even with a small overclocking margin) to play Far Cry 3. A game profile could trigger this automatically (otherwise you'd need a launch script, as sketched below).
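
A minimal sketch of that launch-script idea in Python, wrapping nvidia-smi's power-limit switch (hedged: this needs admin rights, only boards/drivers that expose the limit accept it, and the game path and wattages below are made-up examples):

```python
# Sketch of a per-game TDP launch script. "nvidia-smi -pl <watts>" sets the
# board power limit on GPUs that support it; it requires admin privileges.
import subprocess

def set_power_limit(watts):
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

def launch_with_tdp(game_exe, low_watts, full_watts):
    set_power_limit(low_watts)       # e.g. 75 W for a light game
    try:
        subprocess.run([game_exe])   # blocks until the game exits
    finally:
        set_power_limit(full_watts)  # restore full power afterwards

# Hypothetical example, matching the 150 W GPU above:
# launch_with_tdp(r"C:\Games\cstrike\hl.exe", low_watts=75, full_watts=150)
```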
 
This is exactly what you can do with AMD's PowerTune, although I don't think you can select a profile per game (yet).
 
Or when you overclock a Kepler GPU, since basically you raise the TDP target, to roughly 115-120%.

Effectively it is not per game.
 
You can't select it per profile, period... It's one global setting for any and all profiles. So disappointing; AMD's profile system really sucks.
 
On a standard 680 you can change the TDP power target from 80% to 132%, at least in the EVGA Precision X tool. Different models have different ranges.
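
Taking the 680's reference board power of roughly 195 W as the 100% point, that slider range works out to something like this (back-of-the-envelope only):

```python
# Rough wattage implied by an 80-132% power-target range on a GTX 680,
# assuming its ~195 W reference board power as the 100% baseline.
BASE_TDP_W = 195

for pct in (80, 100, 132):
    print(f"{pct:3d}% target ~= {BASE_TDP_W * pct / 100:.0f} W")
# ->  80% target ~= 156 W
# -> 100% target ~= 195 W
# -> 132% target ~= 257 W
```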
 
Can you change it through the command line?
Rivatuner had a really nice feature: it could give you a shortcut to change a single setting, if you asked for it. You got a .lnk file (a standard shortcut) on your desktop and could move it somewhere else.
I had a separate program that handled hotkeys and ran the .lnk files, so I had win+F1..win+F4 to toggle between gamma 1.0, 1.1, 1.2 and 1.3, and this worked all the time except perhaps in games that read the keyboard especially raw (id Software games, I think, though ctrl+alt+F1..F4 could work then).
For playing the odd game that absolutely needs driver gamma (glquake and Counter-Strike 1.x) I had a .bat script doing "set gamma to 1.2 or 1.3; run game; set gamma back to 1.0" (see the sketch below).
Later I found gamma settings integrated into Nvidia's game profiles, but that feature came pretty late.
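
For what it's worth, that "set gamma; run game; restore gamma" trick can be sketched on Windows via the Win32 SetDeviceGammaRamp call (a rough sketch: the game path is hypothetical, and some drivers clamp or reject ramps they consider out of range):

```python
# Sketch of the "set gamma; run game; restore gamma" .bat idea in Python,
# using the Win32 SetDeviceGammaRamp API from gdi32 (Windows only).
import ctypes
import subprocess

def set_gamma(gamma):
    # Build a 3x256 ramp of 16-bit values: output = (i/255)^(1/gamma)
    ramp = (ctypes.c_ushort * (3 * 256))()
    for i in range(256):
        v = min(65535, int((i / 255.0) ** (1.0 / gamma) * 65535 + 0.5))
        ramp[i] = ramp[256 + i] = ramp[512 + i] = v
    hdc = ctypes.windll.user32.GetDC(0)  # device context for the whole screen
    ctypes.windll.gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp))
    ctypes.windll.user32.ReleaseDC(0, hdc)

set_gamma(1.3)                                     # brighten for the game
subprocess.run([r"C:\Games\glquake\glquake.exe"])  # hypothetical path
set_gamma(1.0)                                     # back to normal
```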
 
The Radeon 6990 had that feature specifically advertised, using a switch on the card itself, I think.
It was just a dual-BIOS thing (useful when your users will flash stupid modded BIOSes); I suppose it's intended to be flipped while the PC is turned off.
 
If PC Perspective and TR already have review samples, the actual launch can't be very far now. Is there any rumored release date? I haven't heard anything yet.
 
It's been rumoured to launch at the end of this month.
 
Thanks. It seems like quite a lot of time for reviews then, if sites already have samples. Perhaps something will leak out.
 
Can't wait as I'm in dire need of an upgrade.
I am probably not in dire need per se, but I certainly feel the "upgrade itch". My base system (mobo, CPU, RAM) will turn four years old very soon, in a few weeks at most. What worries me, however, is price... The rumored $900 RRP is, shall we say, ludicrous really. Even $600 is pushing it for a video card. I don't know if I can honestly justify paying this much for just one card, let alone two for my preferred SLI fix.

If Titan really costs $900, it had better be a 300W monster from the outset; anything half-assed for that amount of money is just going to turn into a bad joke - and by half-assed I mean anything that isn't full-bore, balls-out super awesome. For nine hundred fucking dollars, this card better rock. It better rule. Dominate. Stomp. Crush!

Anything, even slightly less than all of the above, and it is instant fail.
 
You are not going to hear me argue against that. The bottom line for me is this: if 2 x 680 in SLI is faster than one Titan and cheaper, then the 680 wins. But if the PCMark screenshots are anything to go by, that's probably not even true... in which case I can handle that price. So basically I am agreeing with you. I am running 285s in SLI at the moment, and I think I have milked as much as I can out of those cards; they are not even DirectX 11 capable! The wait is killing me. I know it's supposed to be released at the end of this month, or at least we should get to read reviews by then. So many awesome games are waiting to be enjoyed after the upgrade.
 
Don't forget that the gameplay experience with AFR is a little worse than with a single GPU, even with SLI. So even if there is a 20% performance difference, I would still pick Titan over two 680s, especially since it has more memory.
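
A toy illustration (made-up numbers) of why AFR can feel worse even at the same average frame rate: the average hides uneven frame spacing.

```python
# Toy numbers only: AFR can match a single GPU's average FPS while
# delivering frames unevenly (microstutter), which is what you notice.
single_gpu_ms = [20, 20, 20, 20]   # steady 20 ms cadence
afr_ms        = [10, 30, 10, 30]   # same average, alternating gaps

for name, times in (("single GPU", single_gpu_ms), ("AFR", afr_ms)):
    avg_fps = 1000 / (sum(times) / len(times))
    print(f"{name}: {avg_fps:.0f} FPS average, frame times {times} ms")
# Both report 50 FPS, but the 10/30 ms alternation looks less smooth
# than the steady 20 ms delivery.
```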
 
I guess it depends on individual need. I have a 2560x1600 30-inch display, and playing games at that resolution, as I'm sure you know, is quite taxing on any hardware. Memory does not seem to be the problem; the outright horsepower of the chip is what matters. I highly doubt Titan will be able to run Crysis 3 at that resolution without SLI.
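
On raw pixel counts alone, that resolution is nearly double 1080p, which is most of the extra load:

```python
# 2560x1600 pushes roughly twice the pixels of 1080p every single frame.
pixels_30in  = 2560 * 1600   # 4,096,000 pixels
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
print(pixels_30in / pixels_1080p)   # ~1.98x the per-frame shading work
```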
 
I doubt that Titan is going to look particularly impressive in price/performance ratio compared to, say, a 2 x 670 setup. It's a super-enthusiast product, where value is not that great. Unlike the Extreme CPUs from Intel, though, it should provide a great single-GPU power upgrade.
 
I agree with you for the most part. The Intel CPU is only worth it if you know someone who works at Intel so you can get the employee discount. ;)
 