AMD: Volcanic Islands R1100/1200 (8***/9*** series) Speculation/ Rumour Thread

Yeah, 800W at 12 volts is nearly 67 amperes, which is an insane amount of power draw. How was that measured, exactly?
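(For anyone checking the arithmetic: it's just I = P / V. A quick sketch, using the 800 W and 12 V figures from the post above:)

```python
def current_amps(power_watts: float, voltage_volts: float) -> float:
    """Back-of-envelope current draw: I = P / V."""
    return power_watts / voltage_volts

# 800 W pulled through the 12 V rail:
print(round(current_amps(800, 12), 1))  # 66.7 A, i.e. "nearly 67 amperes"
```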
 
800? :oops:

Gah, I assume liquid nitrogen cooling then or surely your voltage regs would blow to hell from such a load... Out of curiosity, what vcore was that at?

No, just a custom water cooling loop. It's EVGA 780Ti Classified Hydro Copper. It has quite a beefy PCB. I had it set a bit over 1.5v for the core on a software tool, but the actual voltage in the Ti-models has been reported to be higher than what the software says, so probably around 1.6v. I got two of these puppies.

Yeah, 800W at 12 volts is nearly 67 amperes, which is an insane amount of power draw. How was that measured, exactly?

Precision X power draw readings have been reported to be quite accurate. That's what I used. It's very easy to make my AX860 shut down with a stock CPU and only the GPU OC'd when you crank up the voltages; with both OC'd I managed to shut down an AX1200. Overclock.net has plenty of similar stories. These cards can draw some serious power when the voltages are cranked up.
 
I'm amazed the voltage regulator hardware on your vidcard could cope with that kind of amperage (500A@1.6V!!!) You'd think those tiny little ICs would be reduced to smoking ruins in a flash and a bang, like that old dual-GPU board NV launched a bunch of years ago now. :LOL:

Still, I wouldn't expect your GPU to last very long at 1.6V, that can't be healthy at today's geometries... For curiosity's sake, how high did it clock at that voltage level?
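(That 500 A figure is the same I = P / V arithmetic applied on the other side of the VRM; a sketch, assuming the full ~800 W quoted earlier is delivered at the ~1.6 V core voltage:)

```python
# Current on the VRM output side if ~800 W reaches the ~1.6 V core rail.
# Both numbers are taken from the posts above, not measured here.
core_power_w = 800.0
core_voltage_v = 1.6
print(core_power_w / core_voltage_v)  # ~500 A through the output stage
```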
 
I'm amazed the voltage regulator hardware on your vidcard could cope with that kind of amperage (500A@1.6V!!!) You'd think those tiny little ICs would be reduced to smoking ruins in a flash and a bang, like that old dual-GPU board NV launched a bunch of years ago now. :LOL:

Still, I wouldn't expect your GPU to last very long at 1.6V, that can't be healthy at today's geometries... For curiosity's sake, how high did it clock at that voltage level?

That particular card was only able to hit around 1390MHz on the core. My second card passed Fire Strike with 1441MHz on the core at 1.35v in the software, but that's also probably closer to 1.45v, or at least 1.4v, in reality. I'm not done with that card yet, as I've only run it with the AX860 so far, and it begins to shut down at around 1.37v in the software if the CPU is also OC'd. I'll probably do some runs during the weekend.

I'm currently 38th in the single-card Fire Strike hall of fame with 13451 points.
http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+performance+preset/version+1.1/1+gpu

Perhaps these should be in some other thread though :)
 
I'm amazed the voltage regulator hardware on your vidcard could cope with that kind of amperage (500A@1.6V!!!) You'd think those tiny little ICs would be reduced to smoking ruins in a flash and a bang, like that old dual-GPU board NV launched a bunch of years ago now. :LOL:

Still, I wouldn't expect your GPU to last very long at 1.6V, that can't be healthy at today's geometries... For curiosity's sake, how high did it clock at that voltage level?


You should look more at what overclockers have done to their GPUs and CPUs... don't forget we're talking about a short benchmark run here, not long-term transistor stability.

38th with just 1441MHz is a bit surprising, Grall... what does HWBot say about this? (I haven't looked, and don't want to waste time on it anyway.) (I did check your score on 3DMark, but didn't dig any further.)
 
I'd put my money on basic Asetek units for the GPUs and a basic heatsink for the rest

Money well spent ;) http://videocardz.com/50151/closer-look-radeon-r9-295x2
[Image: AMD Radeon R9 295X2 liquid cooler]
 
Breaking the 375 Watt PCIe Limit

Talk here is sure different now that AMD is breaking the 375-watt limit than it was when Nvidia was rumored to be releasing a card speculated to need more than 375 watts.

What changed? Has there been some hardware change that now allows more than 375 watts?
 
Where in that preview are those cards that went "over the limit of officially allowed power through PCI Express powerplugs"? I don't see any mentioned in the article.

I didn't say they were mentioned in the article, but the 6990 or the 7990 (can't remember which) had 375W and 450W BIOSes with only 375W power connectors
 
I didn't say they were mentioned in the article, but the 6990 or the 7990 (can't remember which) had 375W and 450W BIOSes with only 375W power connectors

Anandtech said:
AMD refers to the 6990 as a 450W card. At default clocks it has a rated TDP of 375W but the cooler itself is designed to take 450W, which is why AMD went with so many design changes such as the dual-exhaust system and the exotic thermal compound.
http://www.anandtech.com/show/4209/amds-radeon-hd-6990-the-new-single-card-king/4
 
Don't get me wrong, I'm a current AMD owner and have been since ... what, probably 2006? I moved away from NV's 7800GT and into a pair of crossfired 3870's, then a pair of 4850's, a single 5850, and currently a single 7970 OC edition.

But nothing in these benches is impressing me any more. I'm handwaving off the general challenges of AFR methods and just looking at the power, price and performance here. The 295X2's frame pacing is nearly always better than it was with the GCN 1.0 cards, but it's still consistently worse than NV's offering. Absolute performance is certainly even with NV, but at a higher price and higher power consumption. The only thing really going for AMD here is the water cooling, which keeps the noise considerably lower than the competition's.

Nothing here strikes me as truly impressive. Not that it should, I suppose; it's CF-on-a-stick, and I wouldn't buy an equivalent SLI-on-a-stick from NV either. Still, it's a LOT of power, a LOT of money, and the performance just doesn't justify it to me.

Guess it's time to go back to NV again after the long hiatus.
 
What about the Titan-Z? Only in very low resolution cases will the "Z" be faster than the 295x, unless you're talking compute performance with double precision math.
 
You mean Titan Z.

2 x 290X = $1100 (ignoring mark-ups)
1 x CLLC + extra waterblock = ~$150+
Then mix in the miners and the Titan Z's ridiculous $2999

=

$1499
 
What about the Titan-Z?

It is missing from the review, which is astonishing given the part of the market these products target.
At the least it could have given a better idea of the distribution of forces between the two camps; as it is, it just gives the wrong impression that there is no competition whatsoever.
 
It is missing from the review, which is astonishing given the part of the market these products target.
At the least it could have given a better idea of the distribution of forces between the two camps; as it is, it just gives the wrong impression that there is no competition whatsoever.

Every article I have read does mention the Titan Z in the conclusion... but anyway, we have no benchmarks to compare it with, and that card won't be released until the end of the month.

Anandtech sums it up well enough, I think:
NVIDIA has announced their own dual-GPU card for later this month, the GeForce GTX Titan Z, but priced at $3000 and targeted more heavily at compute users than it is gamers, the GTX Titan Z is going to reside in its own little niche, leaving the 295X2 alone in the market at half the price.

We’ll see what GTX Titan Z brings to the table later this month, but no matter what AMD is going to have an incredible edge on price that we expect will make most potential buyers think twice, despite the 295X2’s own $1500 price tag.

Ultimately while this outcome does put the 295X2 in something of a “winner by default” position, it does not change the fact that AMD has put together a very solid card, and what’s by far their best dual-GPU card yet. Between the price tag and the unconventional cooler it’s certainly a departure from the norm, but for those buyers who can afford and fit this beastly card, it sets a new and very high standard for just what a dual-GPU should do.
That said, maybe Nvidia will release a 790 version in the same price range really soon.
 
It looks like the water pump and fan run at a raised level at idle, so the noise floor is higher at low power.
Coil noise is being commented on. Perhaps AMD's engineers and QA testers all have high-frequency hearing loss?
 