AMD: R9xx Speculation

I don't see anything there explicitly precluding using two 8-pin sockets. It's just populating a card with two 8-pin plugs that's definitely out of spec. And since a 6-pin plug fits fine in an 8-pin socket, couldn't it then be seen as working normally and get certified for operating parameters that are within spec?

Any further OC voodoo enabled by sensing additional out-of-spec power would thus be neither tested nor certified, and would remain entirely at the user's risk?

They include 6-pin + 8-pin for 225W cards as an option when you draw 75W from each - if 8-pin + 8-pin were an option when drawing only 75W from one and 150W from the other, it would surely be listed too.
 
They include 6-pin + 8-pin for 225W cards as an option when you draw 75W from each - if 8-pin + 8-pin were an option when drawing only 75W from one and 150W from the other, it would surely be listed too.
Huh? I mean if it were to draw 150W from an 8-pin connector and 75W from a 6-pin connector placed in either of the two (8-pin) sockets.
A 300W add-in card can receive power by the following methods:
75W from x16 PCIe® connector plus 150W from a 2x4 connector plus 75W from a 2x3 connector
This could still apply whether or not the second physical socket is actually a 6-pin one.
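For reference, the budget arithmetic behind those spec configurations is just additive per power source. A rough sketch below: the 75W slot / 75W 6-pin / 150W 8-pin figures are the spec's, while the Python helper itself is purely illustrative.

```python
# In-spec power available per source (PCIe CEM figures).
LIMITS_W = {"slot": 75, "6pin": 75, "8pin": 150}

def board_power_budget(*connectors):
    """In-spec board power: the x16 slot plus any listed aux connectors."""
    return LIMITS_W["slot"] + sum(LIMITS_W[c] for c in connectors)

print(board_power_budget("6pin", "8pin"))  # 300W: the spec's 6-pin + 8-pin config
print(board_power_budget("8pin", "8pin"))  # 375W: 2x 8-pin, beyond the 300W class
```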

Edit: Bad car analogy time. It's similar to building a track car designed for peak performance on slicks, but one that has to be road legal too. No RSA would test the car on slicks, since that's out of spec, but that doesn't stop the manufacturer from suggesting that the car will only perform to its fullest potential once you take it to the track and put slicks on it.
 
Brazos/Zacate/Ontario ASP is $34, calculated from AMD's financials: revenue from those APU chips divided by the number of APUs shipped.
Are you sure the Brazos revenue was broken out? I didn't try to look it up, but it's unusual for a company to give that level of specificity.
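Either way, an ASP figure like that would just come from a straight division, something along these lines (the two inputs below are placeholder numbers picked only to illustrate the arithmetic, not AMD's actual figures):

```python
# ASP = segment revenue / units shipped. Placeholder values, not AMD data.
apu_revenue_usd = 170_000_000
apu_units_shipped = 5_000_000

asp = apu_revenue_usd / apu_units_shipped
print(f"ASP: ${asp:.2f}")  # $34.00 with these placeholder inputs
```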
 
If those pictures are correct, and if those are DP 1.2 connectors, that beastie could theoretically support 12x 1920x1200 monitors or 8x 2560x1600 monitors. :oops:
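Rough bandwidth check behind those counts, assuming four DP 1.2 outputs on the card, HBR2 links, and standard CVT timings at 24 bpp / 60 Hz; this is only a back-of-the-envelope sketch, since the real limit also depends on how many display pipelines the GPU has.

```python
# DP 1.2 HBR2: 4 lanes x 5.4 Gbps, 8b/10b coding -> ~17.28 Gbps usable per link.
LINK_GBPS = 17.28
PORTS = 4  # assumed number of DP outputs

def monitors_per_port(pixel_clock_mhz, bpp=24):
    """How many MST streams of a given mode fit in one DP 1.2 link."""
    stream_gbps = pixel_clock_mhz * bpp / 1000
    return int(LINK_GBPS // stream_gbps)

print(PORTS * monitors_per_port(193.25))  # 1920x1200 @ 60 Hz (CVT) -> 12 total
print(PORTS * monitors_per_port(348.5))   # 2560x1600 @ 60 Hz (CVT) -> 8 total
```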

Regards,
SB
[Image: Radeon HD 6990 board details]

http://www.donanimhaber.com/ekran-k...ran-karti-Tum-detaylariyla-Radeon-HD-6990.htm
 
I understand that hasn't been an issue with all the AIB cards, but if AMD doesn't care whether their official flagship is approved by the PCI SIG, why do they bother getting any of their products approved by the PCI SIG in the first place?
I'm willing to bet that you worry about this way more than anyone at the PCI SIG.

How many people, do you think, who were planning to buy this card, are not going to do so if it turns out that there's no official PCIe logo on the box?

The PCI SIG doesn't approve anything. They publish specs. They organize plugfests for compatibility testing for those who wish to participate. They have a logo, the usage of which is probably restricted by some terms (I don't know if power consumption is part of those). That's about it.
 
I also expected it to stay within 300W, with PowerTune throttling it most of the time (maybe only hitting max frequencies with CF inactive), and then maybe have something like a +50% PowerTune setting to set it free.
 
It has 2x8pin power, I think breaking the 300W limit is a given.

I would expect PowerTune to be set to 300W as a default value, but with a +33.3% setting available; hence the second 8-pin connector. If I were AMD, I would actually test and validate the card for high clocks at 400W, encouraging users to "overclock*" it to those levels, with a simple warning that it takes the card out of PCI-E specs.

*With the card specified for such clocks, it technically wouldn't be overclocking.
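As a sketch of that arithmetic (illustrative only, not AMD's actual PowerTune algorithm): a default 300W cap with a user-selectable percentage offset.

```python
def power_limit(default_w=300, offset_pct=0.0):
    """Board power cap with a user-selectable percentage offset."""
    return default_w * (1 + offset_pct / 100)

print(power_limit())                  # 300.0 W default cap
print(power_limit(offset_pct=33.3))   # ~400 W with the suggested +33.3% setting
```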
 
It has 2x8pin power, I think breaking the 300W limit is a given.
Agreed.

Basically AMD is just releasing another 4870 X2, only this time they officially acknowledge its true power consumption. That's all there is to it, in my opinion.
 
But what's the reasoning behind such a move?
Do they know nVidia is doing the same? I mean, why else would they break it when they could make a card that would wipe the floor with the 5970 within the 300W barrier anyway?
 
450W max PowerTune TDP for the OC version (selected with a switch on top of the card) and 375W for the "normal" version.

Who needs to adhere to PCI-E SIG certification anyway? Certainly not AMD. :p
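For what it's worth, those two figures line up neatly with the connector budget arithmetic (a quick illustration; the 20% relation is just an observation, not a confirmed mechanism):

```python
in_spec_w = 75 + 150 + 150            # slot + two 8-pin connectors = 375W
oc_switch_w = in_spec_w * 120 // 100  # a 20% bump lands exactly on 450W
print(in_spec_w, oc_switch_w)         # 375 450
```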
 
But what's the reasoning behind such a move?
Do they know nVidia is doing the same?

Yes, they probably know that, or at least are prepared for it. This is as far as they can go, and they went there. Now let's see what nVidia brings and whether they have improved their SLI scaling; otherwise the 6990 is looking pretty good.
 