How much does an extra 1GB of GDDR5 cost when it comes to power usage?
I assume 13.3W @ 1600MHz (6400MHz effective). The bottom VRM readings belong to FBVDD; 1.6V is what 6Gbps memory chips use.
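As a back-of-the-envelope check, here is a sketch of where a figure like 13.3W/GB could come from, assuming the power is read off the FBVDD rail and scales roughly linearly with capacity. The 16.6A current value below is a hypothetical placeholder chosen to illustrate the math, not a measurement:

```python
# Rough estimate of GDDR5 power per extra 1 GB, assuming the FBVDD
# rail readings quoted above (1.6 V for 6 Gbps parts) and that power
# scales roughly linearly with the number of memory chips.
# The current figure below is a hypothetical placeholder, not a
# measured value.

FBVDD_VOLTS = 1.6          # memory I/O rail for 6 Gbps GDDR5
RAIL_CURRENT_AMPS = 16.6   # hypothetical VRM current reading for 2 GB
CAPACITY_GB = 2

total_watts = FBVDD_VOLTS * RAIL_CURRENT_AMPS
watts_per_gb = total_watts / CAPACITY_GB
print(f"~{total_watts:.1f} W total, ~{watts_per_gb:.1f} W per GB")
# With these placeholder numbers: ~26.6 W total, ~13.3 W per GB,
# matching the 13.3 W/GB guess at 1600 MHz (6400 MHz effective).
```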
For 2D I'm not sure how that fits, but for TDP that's another story. It's still unclear how TDP is determined now that the card can auto-overclock itself (game profiling?). The question is whether the 195W is based on just the stock clock or whether it includes auto overclocks. However, the peaks and valleys do suggest that it's ramping the clock rate up and down. I guess we will know more once the reviews are out.

Maybe the card overclocks and then goes back to normal, lowering consumption, but it pains me to see a 75-100W difference occur within GT1 and GT2, especially looking at the curves (both tests start high and then go down), while in GT3 you get a peak followed directly by a drop.
I can only think of some sort of game profiling for 3DMark11. That may not be indicative of how the card will work in most other games. I guess we will have to wait and see.

That doesn't explain why, in 3DMark11, the demand on the GPU (look at the fps) gets harder in each GPU test while the power draw decreases. If you look at the AMD red line, you can see clearly that its power draw increases gradually from the start of the test to the end, whereas the GTX's drops severely (assuming, of course, that their curves are accurate).
TBH I think there's a much simpler explanation: AMD drops to "low-power 3D clocks" or some such, as does the 680, but the 680 simply has lower clocks/voltages in that state. AMD's heuristics may be more conservative about ramping down, but we see that once a low-power period lasts long enough, it can drop below the 680.
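For illustration, here is a minimal sketch of that hysteresis idea, where a card only falls to its low-power 3D clocks after the load has stayed low for a sustained stretch. All thresholds and clock values are invented, since neither vendor documents its real heuristics:

```python
# A minimal sketch of the hysteresis idea suggested above: the driver
# only drops to low-power 3D clocks after the load has stayed low for
# long enough. All thresholds and clock values here are made up for
# illustration; neither vendor documents the real heuristics.

LOW_LOAD = 0.3        # utilization below this counts as "low power"
HOLD_SAMPLES = 5      # how long load must stay low before down-clocking

def pick_clock(samples, high_mhz=1006, low_mhz=324):
    """Return low clocks only after a sustained low-load period."""
    recent = samples[-HOLD_SAMPLES:]
    if len(recent) == HOLD_SAMPLES and all(u < LOW_LOAD for u in recent):
        return low_mhz
    return high_mhz

trace = [0.9, 0.8, 0.2, 0.2, 0.1, 0.1, 0.1, 0.1, 0.1]
history = []
for u in trace:
    history.append(u)
    print(f"load={u:.1f} -> clock={pick_clock(history)} MHz")
# The last samples show the drop: a long enough low-load stretch is
# what finally lets the card fall below its competitor's power draw.
```

A more conservative heuristic would simply use a larger HOLD_SAMPLES, delaying the drop, which would fit the AMD behavior described above.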
These drops make no sense to me at all. Isn't dynamic overclocking supposed to fill the TDP budget?
Why the drops in power usage, then? If anything, it should always stay at the set TDP limit.
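One hedged guess at the answer: boost can only fill the TDP budget up to a hard clock ceiling, so on lighter scenes the card maxes out its clock first and power falls below the cap. A toy model, with purely illustrative numbers and no claim to match NVIDIA's actual boost algorithm:

```python
# A sketch of why a TDP-limited boost can still show power drops:
# the boost clock has a hard ceiling, so on light scenes the card hits
# max clock first and the power draw falls below the cap. Numbers are
# illustrative only.

TDP_CAP_W = 195
BASE_MHZ, MAX_BOOST_MHZ = 1006, 1110
POWER_PER_MHZ = 0.19          # hypothetical W per MHz at full load

def boosted_power(load):
    """load in [0,1]; returns (clock, power) under a TDP cap."""
    clock = BASE_MHZ
    while clock < MAX_BOOST_MHZ:
        if (clock + 13) * POWER_PER_MHZ * load > TDP_CAP_W:
            break
        clock += 13               # one boost bin
    return clock, clock * POWER_PER_MHZ * load

for load in (1.0, 0.8, 0.5):
    clock, power = boosted_power(load)
    print(f"load={load:.1f}: {clock} MHz, ~{power:.0f} W")
# -> at full load the cap binds (~194 W); at lighter loads the clock
#    ceiling binds instead and power sits well below 195 W.
```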
Anything is possible until the official benchmark release date.

The folks at Anandtech's forum seem to believe that the review is fake because it's not possible to run 8x MSAA in BF3.
Which is negated when loaded.

A pic from the link posted earlier with the leaked benchmark results...
I'm not sure how accurate this is, but what got my attention were the consumption rates when there is no load.
We are of different opinions then :smile:. I don't know of many who game so often in a 24-hour period that the difference in load power would matter that much. However, 2D mode, where some can go days without playing a game, is a different situation.

Which is negated when loaded.
AMD will have lower consumption when idle, but is saving 20 watts at idle while burning 50 watts more under load really a plus?
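Whether it's a plus comes down to the usage split, which is simple arithmetic to check. The hour figures below are just example scenarios, not survey data:

```python
# Quick arithmetic for the idle-vs-load trade-off above: 20 W saved at
# idle against 50 W extra under load. The daily usage split decides
# which card wins; the hours below are example scenarios only.

IDLE_SAVING_W = 20
LOAD_PENALTY_W = 50

def net_wh_per_day(idle_hours, gaming_hours):
    """Positive means the idle-friendly card saves energy overall."""
    return IDLE_SAVING_W * idle_hours - LOAD_PENALTY_W * gaming_hours

for idle_h, game_h in ((8, 1), (6, 3), (2, 6)):
    net = net_wh_per_day(idle_h, game_h)
    verdict = "saves" if net > 0 else "costs"
    print(f"{idle_h} h idle / {game_h} h gaming: {verdict} {abs(net)} Wh/day")
# Break-even is at 2.5 h of idle per 1 h of gaming (50/20), which is
# why light gamers and heavy gamers reach opposite conclusions here.
```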
I gave a possible explanation.

Well, since the actual layout of the RF apparently doesn't match the supposed number of SIMDs, it may very well be a plausible assumption. Moreover, if Kepler treats the RF as a single address space, like in Fermi...
To get maximum utilization, the processing of the pixels in a quad has to get out of sync. I really doubt that's happening with dynamic scheduling right now.

Not sure if it's clever, but here's a wild guess: 6 ALUs, one LD/ST unit, and one SFU sit right next to each other. This mini-cluster is then replicated 8 times vertically and another 4 times horizontally. I'd guess 1 in 6 (maybe 1 in 3) ALUs are DP-capable, and that groups of 4 threads (pixel quads) are handled by a single mini-cluster (to get full ALU utilization, you need an SP IPC of 1.5). This way, each set of 8 mini-clusters handles a 32-wide warp per cycle and is associated with a single dual-issue scheduler.
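The arithmetic behind that guess is easy to verify, and for what it's worth, 6 × 8 × 4 = 192 matches the per-SMX core count attributed to GK104:

```python
# Sanity-checking the layout guess above with plain arithmetic.
# Mini-cluster: 6 ALUs + 1 LD/ST + 1 SFU, replicated 8x vertically
# and 4x horizontally; each mini-cluster works on one pixel quad
# (4 threads).

ALUS_PER_CLUSTER, LDST_PER_CLUSTER, SFU_PER_CLUSTER = 6, 1, 1
ROWS, COLS = 8, 4
THREADS_PER_CLUSTER = 4                      # one pixel quad

clusters = ROWS * COLS
print("ALUs :", clusters * ALUS_PER_CLUSTER)   # 32 * 6 = 192
print("LD/ST:", clusters * LDST_PER_CLUSTER)   # 32
print("SFUs :", clusters * SFU_PER_CLUSTER)    # 32

# Eight mini-clusters cover one 32-wide warp per cycle:
print("warp width:", 8 * THREADS_PER_CLUSTER)  # 32

# Full ALU utilization needs 6 ALU ops for 4 threads each cycle,
# i.e. a single-precision IPC of 6/4 = 1.5 per thread, as claimed.
print("required SP IPC:", ALUS_PER_CLUSTER / THREADS_PER_CLUSTER)
```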
Then there's the new 3D Vision Surround, bolstered by redesigned display logic, which addresses the two-display limitation of previous NVIDIA GPUs. You can now connect as many as four monitors to a GeForce Kepler GPU, enabling 3-monitor HD 3D Vision Surround setups; you no longer need more than one GeForce GPU to connect more than two monitors. The new 3D Vision Surround is said to work in conjunction with Adaptive V-Sync to give the center display a higher frame rate (since it's at the focus of your central vision), at the expense of the frame rates of the two side displays (which sit mostly in your peripheral vision). This ensures a balanced, high-performance experience with multi-monitor gaming setups.
TPU
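Here is a speculative sketch of how that center-vs-side balancing might look. This is a guess at the behavior, not NVIDIA's actual Adaptive V-Sync logic, and every number is invented:

```python
# A speculative sketch of the center-vs-side balancing described above:
# give the center display a higher frame-rate target and let the side
# displays absorb the slack. This is a guess at the behavior, not
# NVIDIA's actual Adaptive V-Sync logic; all numbers are invented.

def split_budget(total_fps_budget, center_weight=2.0):
    """Split a total frame-rate budget across 3 displays, favoring
    the center one (at the focus of your central vision)."""
    shares = [1.0, center_weight, 1.0]        # left, center, right
    scale = total_fps_budget / sum(shares)
    return [round(s * scale) for s in shares]

left, center, right = split_budget(120)
print(f"left={left} fps, center={center} fps, right={right} fps")
# -> left=30 fps, center=60 fps, right=30 fps
```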
I never indicated any exact wattage numbers. However, it is what it is based on that chart, if it's true.

Do you honestly think it will draw more than 20W at idle? The GTX 560 Ti draws less than 20W, and GK104 is even smaller; even Tahiti draws less than 15W when it's not in its zero-power state.
Are you talking about this?

These drops make no sense to me at all. Isn't dynamic overclocking supposed to fill the TDP budget? Why the drops in power usage, then? If anything, it should always stay at the set TDP limit.