NVIDIA Kepler speculation thread

Why is that important for you now? High power consumption and low performance per watt never bothered you before.

Personally, I'll be getting 2 HD 7970. I waited to see what Nvidia had, and I am not impressed.

I'm still trying to figure out who pays $500 for a card and plays at 1080p or less!

I have been much more impressed with SLI drivers than with CFX, but I'd rather wait for a single card that can match 2x 580 or 2x 6970.
 
I have been using CrossFire since the 48xx series and have not had any problems with any of the games I play. I will admit that I am probably not the typical gamer; I do not play FPS games, I like my RPGs and sandbox games.

Edit: I am on the 69xx series now, and will soon upgrade to a new motherboard and CPU.
 
Finally figured out how the "average" boost clock works.

Base: 1006
Boost: 1058
Difference: 52 MHz
Max: 1006 + (52*2) = 1110

The "average" number is just the halfway point in the default power profile. They apparently step the clock at 8 13 MHz intervals, so 1006, 1019, 1032, 1045, 1058, 1071, 1084, 1097, 1110 by default (depending on power consumption). Thus, every 680 will be allowed to clock up to 1100 Mhz by default, it is just how often it does this will depend on the unique power consumption of each board/chip. The clock offset just changes the minimum from which the 8 boost intervals start (which are still power limited).

From what I have gathered, they have been fairly conservative in the profiling, especially regarding power consumption. I'd guess virtually every 680 will easily hit the same clocks as the review samples, though in some cases the user might have to increase the power consumption limit by a small amount. Hopefully, that will rarely be the case though. It will certainly be interesting to see the results from actual users.

Sorry if this has been posted before... haven't been keeping up.
 
Hitting the same clocks != hitting those clocks all of the time. More stressful scenes may invoke limitations sooner. So while every card might be able to hit 1110 MHz under certain conditions, it certainly will not do so under every condition (and I believe there is variation in those clocks in the reviews that monitored them).

I look forward to a review which monitors the clocks of a number of GTX 680s in the same tests.
 
It is kind of annoying that not a single reviewer appeared to even acknowledge the possibility of a "golden sample" skewing their results under this new boost tech. Dereliction of duty, frankly.
 
That's a possibility with any review sample of any tech product; not sure why you expect Kepler to get special treatment about that.
 
GPU Boost Hypocrisy

It sure is funny to see all the red-headed stepchildren crying and howling about Nvidia's GPU Boost technology when the same technology is built into the latest CPUs from Intel and AMD. I don't see any calls for websites that benchmark those CPUs with a Turbo Mode to disable Turbo Mode when benchmarking.


Yet you see posts here on B3D, and saturating the S|A forums, about how despicable Nvidia is and how all reviewers should disable GPU Boost when benchmarking.


Hypocrisy at its finest.


Nvidia has instituted for Kepler-based hardware a new technology called GPU Boost. Similar to Intel's Turbo Boost and AMD's Turbo Core
http://www.pcmag.com/article2/0,2817,2402021,00.asp


New AMD Turbo CORE Technology
Both the AMD Phenom™ II X6 1055T and 1090T come equipped with AMD's new Turbo CORE technology. AMD Turbo CORE technology is a performance boosting technology that automatically switches from six cores to three turbocharged core for applications that just need raw speed over multiple cores. While in Turbo CORE mode, the AMD Phenom™ II X6 1090T shifts frequency speed from 3.2GHz on six cores, to 3.6GHz on three cores, making it the fastest processor AMD has ever created.
http://www.amd.com/us/products/desktop/processors/phenom-ii/Pages/phenom-ii-key-architectural-features.aspx



Get an extra burst of raw speed when you need it most with AMD Turbo CORE Technology.
http://www.amd.com/us/products/desktop/processors/amdfx/Pages/amdfx.aspx



The embedded Radeon GPU also features GPU TurboCore, which boosts engine speed from 276 MHz to 400 MHz.
http://www.techpowerup.com/144260/AMD-C-60-Gets-TurboCore-to-CPU-and-GPU.html



Intel Turbo Boost is a technology implemented by Intel in certain versions of their Nehalem- and Sandy Bridge-based CPUs, including Core i5 and Core i7 that enables the processor to run above its base operating frequency via dynamic control of the CPU's "clock rate".
http://en.wikipedia.org/wiki/Intel_Turbo_Boost
 
Well, I installed my GK104 yesterday afternoon. In Battlefield 3, Alan Wake and Skyrim the card was 'Boosting' up to 1123 MHz (up from 1006 MHz stock).
 
Yet you see posts here on B3D, and saturating the S|A forums, about how despicable Nvidia is and how all reviewers should disable GPU Boost when benchmarking.

Hypocrisy at its finest.
Hi pot, meet kettle :rolleyes:

Calling people hypocrites without actually addressing their point.

:rolleyes::rolleyes::rolleyes:

Come on people, this forum is better than that lowest-common-denominator crap! :devilish:
 
It sure is funny to see all the red-headed stepchildren crying and howling about Nvidia's GPU Boost technology when the same technology is built into the latest CPUs from Intel and AMD. I don't see any calls for websites that benchmark those CPUs with a Turbo Mode to disable Turbo Mode when benchmarking.
How about thinking twice before you elevate the discussion with such elaborate phrases?

1. CPU Boost has guaranteed values for all chips and doesn't boost further; there is no lottery here.
2. CPU Boost can be turned off.

CPU Boost has no relevance to what nVidia is doing, other than the general idea, which is why the comparison was made.
 
Why is that important for you now? High power consumption and low performance per watt never bothered you before.
I think you've made up an imaginary person and substituted him in my place. Either that, or you're thinking of what was going on years ago. Because I've cared a fair amount about power consumption for easily the last five years.

Personally, I'll be getting 2 HD 7970. I waited to see what Nvidia had, and I am not impressed.
It's lower-priced, performs better, uses less power. What more could you possibly want?

But personally I still lean a bit towards nVidia parts for the little things. I do have an ATI GPU on my laptop, so I'm not talking completely out of my backside when I say that I generally prefer nVidia's drivers. They have more options and better Linux support for the most part (ATI does have better open-source drivers available, which allows KMS, which in turn allows for faster wake times from sleep mode). So even given very similar parts, I would still prefer nVidia. Just makes me happy that it's a better part all-around.
 
It's lower-priced, performs better, uses less power. What more could you possibly want?

It uses less power under full load, but more power when idle (displays off), due to the lack of anything like ZeroCore.

So if your primary concern is maximum power draw, GK104 wins. If it's total energy consumption per day, Tahiti wins.
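
As a back-of-the-envelope illustration of how those two can trade places (Python; every wattage and hour figure below is a made-up placeholder, not a measurement; substitute numbers from an actual review):

```python
# Rough daily-energy comparison. All figures are illustrative placeholders,
# NOT measured values for any real card.

def daily_wh(load_w, idle_w, long_idle_w, load_h, idle_h, long_idle_h):
    """Watt-hours per day for a given usage split."""
    return load_w * load_h + idle_w * idle_h + long_idle_w * long_idle_h

# Hypothetical day: 3 h gaming, 5 h desktop idle, 16 h with displays off.
usage = dict(load_h=3, idle_h=5, long_idle_h=16)

card_a = daily_wh(load_w=170, idle_w=15, long_idle_w=15, **usage)  # no ZeroCore-style long idle
card_b = daily_wh(load_w=200, idle_w=15, long_idle_w=3, **usage)   # near-off with displays off

print(f"card A: {card_a} Wh/day, card B: {card_b} Wh/day")
# Despite the higher load draw, card B ends up lower once long-idle hours dominate.
```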
 
For me it's not so much power draw or heat that's important; as long as they are reasonable, I don't really care which GPU wins. However, what those two metrics enable with regards to noise is very relevant to me. The quieter the GPU the better. I learned this the painful way with my current 4890. Whatever I get next, I want it to be a lot quieter than this!
 
For me it's not so much power draw or heat that's important; as long as they are reasonable, I don't really care which GPU wins. However, what those two metrics enable with regards to noise is very relevant to me. The quieter the GPU the better. I learned this the painful way with my current 4890. Whatever I get next, I want it to be a lot quieter than this!

Stock cooling is rarely the best for noise. I was able to quiet my 6970s down to a whisper with a couple of aftermarket Scythe heatsinks and PWM fans, and my current 580s are extremely quiet on water.
 
But personally I still lean a bit towards nVidia parts for the little things. I do have an ATI GPU on my laptop, so I'm not talking completely out of my backside when I say that I generally prefer nVidia's drivers. They have more options and better Linux support for the most part (ATI does have better open-source drivers available, which allows KMS, which in turn allows for faster wake times from sleep mode). So even given very similar parts, I would still prefer nVidia. Just makes me happy that it's a better part all-around.

I have an SLI laptop here; it's dead from bumpgate, so I hope you can understand why I prefer to go with AMD products.
 
Well, I installed my GK104 yesterday afternoon. In Battlefield 3, Alan Wake and Skyrim the card was 'Boosting' up to 1123 MHz (up from 1006 MHz stock).

So, either you have been sold a sample meant for reviewers by accident, or maybe 1100-ish MHz is not as out of the ordinary as some people feared.
 
It uses less power under full load, but more power when idle (displays off), due to the lack of anything like ZeroCore.

So if your primary concern is maximum power draw, GK104 wins. If it's total energy consumption per day, Tahiti wins.
I'm not sure that really helps for a single video card, provided I turn my computer off or put it to sleep when not in use (which I do). Tomshardware has some power consumption numbers that seem to back this up:
http://www.tomshardware.com/reviews/geforce-gtx-680-sli-overclock-surround,3162-15.html

Of course, being able to power down completely when the display is off is nice, but otherwise even the idle consumption is on par with or higher than what nVidia offers. And just putting the computer to sleep is even better.
 