> I think NV has to use ultra high leakage chips to reach GTX480 clock.

That cannot be the only reason, because the GTX470 consumes only ~60% of the GTX480's idle power, while the die itself is obviously the same.
> I think NV has to use ultra high leakage chips to reach GTX480 clock.

Why would there be a correlation between high leakage and high clocks?
> Why would there be a correlation between high leakage and high clocks?

High leakage means the chip can achieve a given clock at a lower voltage. You can't bump up the voltage without limit, so high leakage is good for clocks.
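To make that trade-off concrete, here is a toy first-order sketch (every constant is an illustrative guess, not a measured GF100 number): a lower-Vth, leakier corner of the same design reaches a target frequency at a lower supply voltage, paying for it in leakage power instead of dynamic power.

```python
# Toy model of the leakage/voltage/frequency trade-off.
# All constants are illustrative, not measured silicon values.
import math

V_TH_TYPICAL = 0.40   # threshold voltage, typical-corner die (V)
V_TH_LEAKY   = 0.32   # fast/leaky corner: lower Vth -> faster but leakier

def max_freq(v_dd, v_th, k=2.0e9):
    """Alpha-power-law-style estimate of achievable frequency at a supply voltage."""
    return k * (v_dd - v_th) ** 1.3 / v_dd

def leakage_power(v_dd, v_th, i0=500.0, s=0.1):
    """Subthreshold leakage grows exponentially as Vth drops."""
    return v_dd * i0 * math.exp(-v_th / s)

def dynamic_power(v_dd, freq, c_eff=2.0e-8):
    """Classic C * V^2 * f switching power."""
    return c_eff * v_dd ** 2 * freq

def voltage_for(target_hz, v_th):
    """Smallest supply voltage (in 5 mV steps) that reaches the target frequency."""
    v = v_th + 0.01
    while max_freq(v, v_th) < target_hz:
        v += 0.005
    return v

target = 1.4e9  # an arbitrary shader-clock target
for name, v_th in [("typical", V_TH_TYPICAL), ("leaky", V_TH_LEAKY)]:
    v = voltage_for(target, v_th)
    p = dynamic_power(v, target) + leakage_power(v, v_th)
    print(f"{name:8s} die: {v:.3f} V, ~{p:.0f} W at {target / 1e9:.1f} GHz")
```

With these made-up numbers the leaky die hits the clock at roughly 1.20 V versus 1.40 V for the typical die, trading lower dynamic power for higher leakage power, which is the argument being made above.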
> I/O could affect it too. Basically, what I have heard (and this is from months back, and everyone blasted me for it) is that the controller is basically the same as in the GT215, and at high frequencies it causes way too much interference, presumably because of leakage.

I was blasting you about using metal spins to reduce leakage. Linking I/O performance and interference to leakage isn't much better... You're good about knowing dates and early specs, much better than most. Don't spoil it by overreaching.
> Linking I/O performance and interference to leakage isn't much better...

High leakage => high temp => more hysteresis than expected?
> 5970 Cypress dies are also high-leakage parts.

5970 dies are low-leakage, to keep power consumption in check.
> High leakage means the chip can achieve a given clock at a lower voltage. You can't bump up the voltage without limit, so high leakage is good for clocks.

But max clock is determined by the max clock achievable in the "worst" part of the chip, whereas the high leakage would be the result of the leakage of all transistors. Is there really any correlation between the average and the extremes?
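One way to see why the two can still correlate is that process variation has a die-wide (systematic) component on top of the local per-path component: a fast-corner die is fast and leaky everywhere at once. A toy Monte Carlo sketch, with entirely made-up constants, assuming a global process shift plus independent per-path variation:

```python
# Monte Carlo sketch of the "average vs extremes" question.
# Assumption: each die has a global process shift affecting every transistor,
# plus independent local variation per critical path. Constants are made up.
import math
import random

def simulate_die(n_paths=1000):
    # Die-wide corner: positive = faster transistors everywhere, but leakier.
    global_speed = random.gauss(0.0, 0.10)
    # Subthreshold leakage tracks the die-wide corner roughly exponentially.
    leakage = math.exp(3.0 * global_speed)
    # Each path adds local random variation; the slowest path sets max clock.
    delays = [1.0 - global_speed + random.gauss(0.0, 0.03) for _ in range(n_paths)]
    return leakage, 1.0 / max(delays)

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

dies = [simulate_die() for _ in range(2000)]
leak, fmax = zip(*dies)
print(f"corr(die leakage, die Fmax) = {pearson(leak, fmax):.2f}")
```

With the global component dominating, leakage and max clock correlate strongly across dies even though each die's max clock is set by its single slowest path; if local variation dominated instead, the correlation would largely wash out.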
What about Dave's theory of needing to keep it cool to keep power consumption in check? Sounds like a risky proposition to me but it's possible.
Does anyone know where the post in question is located? I have been unable to find it and am currently involved in a discussion about it elsewhere, I'd like to cite it if at all possible.
Not quite what I was thinking of. ISTR discussion in this thread about GF100 consuming less power when running at lower temperatures. This stems from power consumption results varying between review sites, with some sites using higher fan speeds and achieving seemingly lower power consumption.
That's inherently physics: cool conducts better than warm, right?
> Resistance to electron flow most certainly increases with heat.

Semiconductor physics fail.
> Semiconductor physics fail.

Er, well, while it is true that increases in temperature increase the number of mobile electrons/holes in a semiconductor, processors rely on doping for the majority of their mobile electrons/holes, so that effect is small, and it is generally offset by other deleterious effects, as far as I am aware.
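For what it's worth, the usual explanation for the fan-speed/power observation above is not conductor resistance at all but leakage, which grows roughly exponentially with junction temperature. A minimal sketch with illustrative numbers (assuming leakage doubles every ~25 degC, a common rule of thumb, not a measured GF100 figure):

```python
# Toy model of why a cooler-running chip can draw measurably less power:
# subthreshold leakage grows roughly exponentially with junction temperature.
# All constants are illustrative guesses, not measured values.
import math

def leakage_watts(temp_c, p_leak_25c=20.0, doubling_deg=25.0):
    """Leakage power, assumed to double every `doubling_deg` degC above 25 degC."""
    return p_leak_25c * 2.0 ** ((temp_c - 25.0) / doubling_deg)

P_DYNAMIC = 180.0  # switching power; roughly temperature-independent

for temp in (65, 80, 95):
    leak = leakage_watts(temp)
    print(f"{temp} degC junction: ~{P_DYNAMIC + leak:.0f} W total ({leak:.0f} W leakage)")
```

Under these assumptions, the same card drawing ~240 W at a 65 degC junction draws ~320 W at 95 degC, which would explain review sites with higher fan speeds measuring seemingly lower power consumption. The feedback also runs the other way: a hotter chip leaks more, which makes it hotter still.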