However, I don't know whether Overdrive can change the voltage.
SW-wise it would be easy, but controllable voltage regulators are way more expensive than fixed ones. I say no, for cost reasons.
If PowerPlay is what AMD promises, it would be necessary.
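Since this came up: the "SW-wise it would be easy" part of PowerPlay-style DVFS is essentially a table of clock/voltage performance states plus a policy for switching between them. Here's a minimal sketch in Python; the state names, values, and interface are entirely hypothetical illustrations, not AMD's actual tables or driver code:

```python
# Sketch of PowerPlay-style DVFS: a table of performance states, each
# pairing a core clock with a voltage. All values below are made up
# for illustration only.

PERF_STATES = [
    # (label, core clock in MHz, core voltage in V)
    ("idle",  300, 0.90),
    ("video", 450, 1.00),
    ("3d",    775, 1.26),
]

def pick_state(gpu_load_percent):
    """Pick a performance state from measured GPU load (illustrative policy)."""
    if gpu_load_percent < 10:
        return PERF_STATES[0]
    if gpu_load_percent < 60:
        return PERF_STATES[1]
    return PERF_STATES[2]

def apply_state(state):
    label, mhz, volts = state
    # With a fixed regulator, only the clock could actually change; a
    # digitally programmable VRM is what makes the voltage step possible.
    print(f"set core clock to {mhz} MHz, core voltage to {volts:.2f} V ({label})")

apply_state(pick_state(gpu_load_percent=75))
```

The cost question above is exactly about that last comment: the software side is a lookup table, but the voltage step needs a programmable regulator on the board.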
BTW, performance slides for the HD3800 series:
http://we.pcinlife.com/thread-844605-1-1.html
Yep, should be fully controllable. Oops, wait: they do have a digital power supply, right? That should be able to do it. Correcting myself here.
The kookiest slides I've seen in a while...
Fully Dynamic (now it's clear why it's restricted to 825 MHz in Overdrive)
I'm curious why AMD didn't replace Serious Sam with Crysis in their benchmark set.
Interesting results, those at http://we.pcinlife.com/thread-844605-1-1.html
As you can see, this longer board follows the trail of the 8800GTX/Ultra PCB. Unusually, there is no significant increase in the height of the board, meaning this dual-GPU card will fit in a more compact chassis.
The price of the 3870 X2 will be between 399 and 499 USD/EUR, or 249 to 299 quid (yes, life sucks in Blighty).
RV670 chips are working at 750+ MHz each. 775 MHz should be reachable by launch time; who knows, maybe even 800 MHz for the GPU. When it comes to overclocking, we have no idea at the moment, but the memory should be able to work at 2.25 GHz, yielding a combined bandwidth of 144 GB/s. Not too shabby.
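For the record, that 144 GB/s figure checks out, assuming each RV670 keeps its usual 256-bit memory bus and 2.25 GHz is the effective GDDR4 data rate:

```python
# Sanity check of the quoted combined bandwidth, assuming a 256-bit
# memory bus per RV670 and 2.25 GHz effective GDDR4 data rate.

effective_rate_hz = 2.25e9   # memory transfers per second
bus_width_bits = 256         # per GPU
gpus = 2

per_gpu_gb_s = effective_rate_hz * bus_width_bits / 8 / 1e9
print(per_gpu_gb_s)          # 72.0 GB/s per GPU
print(per_gpu_gb_s * gpus)   # 144.0 GB/s combined, matching the quoted figure
```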
[...]
Bear in mind that the G92_300 series, or D8E, the GeForce 8850GX2 or 8950GX2, whatever Nvidia's marketeers decide to call it, is also a dual-chip board. It is widely expected that Nvidia's card will follow the 7900GX2 and 7950GX2: two PCBs connected with 20-or-so pins, all working in harmony with Nvidia's BR04 chip.
The Inquirer has a picture of an R680 a.k.a. Radeon HD 3870 X2 card:
We already have HD3850s and HD3870s here, ready.
Pretty sweet OC. And look at the temps.
To 860 MHz! Not too shabby on the stock cooler.
416 shaders? Bug in GPU-Z?
More than likely; GPU-Z thought for a few versions that my X1800XL had only 6 VS units, too.