AMD: Southern Islands (7*** series) Speculation/ Rumour Thread

About 40 more watts under full load and a purchase price higher than the 7970's, for performance that often doesn't exceed the GTX 680... definitely a fail.

http://www.hardocp.com/article/2012/06/21/amd_radeon_hd_7970_ghz_edition_video_card_review/1
Well, there are also reviews, which show quite the opposite:
TechReport said:
The Radeon HD 7970 GHz Edition has indeed recaptured the single-GPU performance title for AMD; it's even faster than Zotac's GTX 680 AMP! Edition. And at $499.99, the 7970 GHz Edition is unambiguously a better value than the stock-clocked GeForce GTX 680. Everything seems to be going AMD's way—even our power consumption results turned out to be closer than expected.
According to ComputerBase.de, the GE (compared to the GTX 680) draws 14% more power while offering 15-20% more performance with MSAA 8× enabled. Hardware.fr's results show 10-12% better performance than the GTX 680.

The performance difference in many tests is very similar to the HD 6970 vs. GTX 580 delta. The latter consumed ~50 watts more and nobody called it a fail.
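
A quick back-of-the-envelope check of those numbers (just a sketch using the rounded percentages quoted above, not any fresh measurements):

```python
# Perf/watt ratio of the 7970 GHz Edition vs. the GTX 680, using the
# ComputerBase.de figures quoted above: ~14% more power, 15-20% more
# performance with 8x MSAA. Rounded review percentages, nothing more.
power_increase = 1.14
perf_increase_low, perf_increase_high = 1.15, 1.20

for perf in (perf_increase_low, perf_increase_high):
    ratio = perf / power_increase
    print(f"{(perf - 1) * 100:.0f}% faster at 14% more power "
          f"-> {ratio:.2f}x the perf/watt of the GTX 680")
# Roughly 1.01x and 1.05x, i.e. at worst on par in efficiency while
# being faster outright in these particular tests.
```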
 
Well, there are also reviews, which show quite the opposite:

According to ComputerBase.de, the GE (compared to the GTX 680) draws 14% more power while offering 15-20% more performance with MSAA 8× enabled. Hardware.fr's results show 10-12% better performance than the GTX 680.

The performance difference in many tests is very similar to the HD 6970 vs. GTX 580 delta. The latter consumed ~50 watts more and nobody called it a fail.

TechReport tested power consumption with Batman, which is much faster on Kepler. I suspect the latter reaches higher utilization in this game, thereby closing the power gap.

Still, the new 7970 is at the same 250W TDP as the old one, so it's probable that there were equally power hungry 7970s floating around already, just at 925MHz and with accordingly lower performance.
 
Not sure why anyone cares about the reference cooler since it won't be used.

In that case, instead of dropping the ball on the cooler you dropped the ball on communication with reviewers.

If people had understood that the reference cooler wasn't going to be used, nobody would have cared about the noise measurements. Because this wasn't communicated, people made reasonable assumptions; you have shipped similarly loud products in the past, after all.

Hate to say it, but Nvidia wouldn't have made this mistake.
 
TechReport tested power consumption with Batman, which is much faster on Kepler. I suspect the latter reaches higher utilization in this game, thereby closing the power gap.

Still, the new 7970 is at the same 250W TDP as the old one, so it's probable that there were equally power hungry 7970s floating around already, just at 925MHz and with accordingly lower performance.
Don't forget, TDP isn't the same as power consumption. They're related, but a 250W TDP doesn't mean the card draws 250W.
 
In that case, instead of dropping the ball on the cooler you dropped the ball on communication with reviewers.

If people had understood that the reference cooler wasn't going to be used, nobody would have cared about the noise measurements. Because this wasn't communicated, people made reasonable assumptions; you have shipped similarly loud products in the past, after all.

Hate to say it, but Nvidia wouldn't have made this mistake.
It was communicated, hence why you saw many note it.
 
The GHz Edition ASIC is explicitly designed to fit within the same thermal / electrical / physical footprint as the standard 7970, and, given that every one of the partners has transitioned to their own designs, they will be using their own fansink designs. The "reference" design is not much more than our qualification mule now and is sampled for the purposes of performance testing.

Why then did many reviews show increased power consumption of 10-20% (the variation between reviews is quite large)?
Also, one would think it is a bad idea to let a card this loud be reviewed, no matter whether it will be sold as such or not. A bad impression can stick, and people don't always differentiate.

Don't forget, TDP isn't the same as power consumption. They're related, but a 250W TDP doesn't mean the card draws 250W.

AFAIK the 250W is the maximum board power, and it is exceeded occasionally - for example, the 254W (just for the card itself) in the HT4U review. They tested HawX. In Furmark, the 250W is exceeded in almost every review on the web.
 
It was communicated, hence why you saw many note it.

I only saw TechReport note it, but they didn't sound completely certain, with language like "you won't likely see".

On the other hand, Hardware Canucks seemed to think that no partners were going to come out with custom GHz editions at all, and would instead rely on their current crop of non-reference cards.

Anandtech complained about the noise of the card in their conclusions, and other reviewers followed suit.

Obviously, simply mentioning the fact wasn't enough. When you send a review sample with such unappealing reference cooling, you need to make sure to drive the point home.

Heck, you could have even given review sites some photos of upcoming GHz edition cards from partners. That way it's impossible to miss the message, you help build PR for partners, and you create hype over the upcoming designs. Much better than the majority of readers thinking "Good performance but too noisy" because they don't know better.
 
Why then did many reviews show increased power consumption of 10-20% (the variation between reviews is quite large)?
All the boards are designed to run at a set maximum power level. With a boost state you are bringing more apps (and for more of the time in a given app) closer to the board limits. PowerTune clamped high-power apps down to the peak, but there were a lot of cases where apps were not at that peak; boost will take those cases and bring them closer to the peak.

Even a test like Furmark has power variation and may not be at the PowerTune limits 100% of the time; boost will enable it to use the max power budget for more of the time it is being rendered.
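
To make that concrete, here is a minimal sketch of a boost-within-a-power-cap loop; this is my simplification for illustration, not AMD's actual PowerTune algorithm, and the clocks and wattages are placeholders:

```python
# Simplified model: the GPU steps its clock up towards the boost clock
# as long as the projected board power stays under the fixed cap.
POWER_CAP_W = 250.0       # assumed maximum board power (PowerTune limit)
BASE_CLOCK_MHZ = 1000.0   # GHz Edition base clock
BOOST_CLOCK_MHZ = 1050.0  # GHz Edition boost clock
STEP_MHZ = 5.0

def boosted_clock(power_at_base_w: float) -> float:
    """Clock the boost logic settles at, crudely assuming power scales
    linearly with clock for a given workload."""
    clock = BASE_CLOCK_MHZ
    while clock + STEP_MHZ <= BOOST_CLOCK_MHZ:
        projected = power_at_base_w * (clock + STEP_MHZ) / BASE_CLOCK_MHZ
        if projected > POWER_CAP_W:
            break
        clock += STEP_MHZ
    return clock

# A typical game leaves headroom below the cap, so it boosts fully;
# a Furmark-like load already sits near the cap and barely moves.
print(boosted_clock(200.0))  # -> 1050.0
print(boosted_clock(249.0))  # -> 1000.0
```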
 
I only saw TechReport note it, but they didn't sound completely certain, with language like "you won't likely see".

On the other hand, Hardware Canucks seemed to think that no partners were going to come out with custom GHz editions at all, and would instead rely on their current crop of non-reference cards.

Precisely. As mentioned, ease of deployment was the key here; it is fully expected that partners will quickly deploy using their current designs. Over the coming months you may see specific new designs, but initially it will ramp on the current design.
 
All the boards are designed to run at a set maximum power level. With a boost state you are bringing more apps (and for more of the time in a given app) closer to the board limits. PowerTune clamped high-power apps down to the peak, but there were a lot of cases where apps were not at that peak; boost will take those cases and bring them closer to the peak.

Even a test like Furmark has power variation and may not be at the PowerTune limits 100% of the time; boost will enable it to use the max power budget for more of the time it is being rendered.

I think it is well known how boost works from Kepler ;)
The problem I see is the lower power efficiency:

All measurements below are for the card only, not the whole system:
HT4U: 21% more power (HawX)
PCGH: 33% more power (BFBC2)
TechPowerUp: 28% more power (Crysis 2)
Hardware.fr: 20% more power (Anno 2070)

All for a maximum of 13% better performance. It is especially worrying that HT4U measures 254W for the card itself during gaming while the TDP is only 250W.
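
Putting those card-only power deltas against the ~13% performance gain gives a rough efficiency picture (a sketch; each site tested a different game, so treat the ratios loosely):

```python
# Perf/watt of the GHz Edition relative to the stock HD 7970, using the
# card-only power increases listed above and the ~13% performance gain.
power_deltas = {
    "HT4U (HawX)": 1.21,
    "PCGH (BFBC2)": 1.33,
    "TechPowerUp (Crysis 2)": 1.28,
    "Hardware.fr (Anno 2070)": 1.20,
}
perf_gain = 1.13  # "a maximum of 13% better performance"

for site, power in power_deltas.items():
    print(f"{site}: perf/watt {perf_gain / power:.2f}x of the stock 7970")
# Every ratio lands below 1.0 (roughly 0.85-0.94x), which is the
# efficiency regression being pointed out here.
```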

Edit:
Interesting what no-X says. To clarify: What is the maximum board power for the 7970 GHE?
Btw would AMD have enabled this boost if Nvidia had not implemented it first?
 
Still, the new 7970 is at the same 250W TDP as the old one, so it's probable that there were equally power hungry 7970s floating around already, just at 925MHz and with accordingly lower performance.
Well, the numbers aren't identical.

For the HD 7970, AMD stated "typical board power ~210 W" and "maximum board power 250 W".

The HD 7970 GE guide says: "typical power draw ~250 W" / "typical board power ~250 W"

So, the PowerTune limit ("maximum power") probably wasn't changed, but the typical power draw was raised by 40 watts.
 
Even a test like Furmark has power variation and may not be at the PowerTune limits 100% of the time; boost will enable it to use the max power budget for more of the time it is being rendered.

So you're saying that Hardware Canucks misinterpreted what the board partners were saying? That they will come out with GHz editions, but they may just be minimally altered versions of existing OC cards with, say, more stringent binning and boost functionality added? HC makes it sound like board partners aren't even going to bother releasing new cards.

One other thing mentioned, I think also at HC, was that some GHz Editions might use 5.5GHz memory instead of the 6GHz memory in the review unit. Is this true? If so, it seems to be bordering on false advertising.
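
For what it's worth, the bandwidth arithmetic behind that memory question (assuming the 7970's 384-bit bus; the 5.5GHz figure is just the rumour mentioned above):

```python
# Peak memory bandwidth = bus width (bytes) x effective data rate.
BUS_WIDTH_BITS = 384

def bandwidth_gbs(effective_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s for a given effective memory data rate."""
    return BUS_WIDTH_BITS / 8 * effective_rate_gbps

print(bandwidth_gbs(6.0))   # 288.0 GB/s - the GHz Edition as reviewed
print(bandwidth_gbs(5.5))   # 264.0 GB/s - a hypothetical 5.5GHz variant
# That would be roughly an 8% bandwidth cut while carrying the same name.
```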
 
Well, there are also reviews, which show quite the opposite:

According to ComputerBase.de, the GE (compared to the GTX 680) draws 14% more power while offering 15-20% more performance with MSAA 8× enabled. Hardware.fr's results show 10-12% better performance than the GTX 680.

The performance difference in many tests is very similar to the HD 6970 vs. GTX 580 delta. The latter consumed ~50 watts more and nobody called it a fail.

So I'd pay $1 for each extra watt over the 7970 to get 10% more performance on average... this GHz Edition is only marketing; it has no convincing argument, not only against the GTX 680 but also against the "old" 7970 (and its overclocked versions).
 
Don't forget, TDP isn't the same as power consumption. They're related, but a 250W TDP doesn't mean the card draws 250W.

No, but it means that some cards come close, or it would be set lower. So the GHz Edition may draw more on average, but the most power-hungry GE card probably doesn't draw more than the most power-hungry standard 7970.


Well, the numbers aren't identical.

For the HD 7970, AMD stated "typical board power ~210 W" and "maximum board power 250 W".

The HD 7970 GE guide says: "typical power draw ~250 W" / "typical board power ~250 W"

So, the PowerTune limit ("maximum power") probably wasn't changed, but the typical power draw was raised by 40 watts.

That's to be expected with Boost.
 
So the GHz Edition may draw more on average, but the most power-hungry GE card probably doesn't draw more than the most power-hungry standard 7970.
Indeed.

Btw would AMD have enabled this boost if Nvidia had not implemented it first?
PowerTune is a very different thing in terms of implementation from NVIDIA's solution, and it has its own development path. There was a discussion about enabling it on SI at launch, but the schedule wasn't favourable. Development continued (witness Trinity) post-SI launch, and XT2 picked it up as it went through the qual path.
 
As TechReport said, it really is a mojo thing. Nvidia has this image, especially with the execution of the 680 launch, that makes it a more desirable product. I myself will be looking for one of these GHz Editions with a nice custom cooler, mainly because of the extra memory, better compute potential and lower price (680s are gouged through the roof over here, currently $100-200 more than equivalent 7970s). That said, I don't think the general gaming and hardware enthusiast community will be at all swayed by this re-launch - and a lot of that will have to do with the noise, power and heat increases reported in most of the reviews. It really hasn't been made clear that this is (apparently) not an issue, and I think it has left AMD in exactly the same position (image-wise) they were in before launching this product - possibly even worse.
 
Why does the 7970 look like the best GPU... and the worst GPU at the same time...? The paper specs look good... I think we are past the point of GPU design, and AMD should start looking at game design support and... their driver-level support... a lot of today's games seem to be built for Nvidia... so IWATP about Nvidia mojo.

IMHO I would rather pick up the 7970 over the 680 as it stands... of course, if AMD would sell a reference 7970 at $399, it would be back to the good ol' days of AMD love.
 
Why does the 7970 look like the best GPU... and the worst GPU at the same time...? The paper specs look good... I think we are past the point of GPU design, and AMD should start looking at game design support and... their driver-level support... a lot of today's games seem to be built for Nvidia... so IWATP about Nvidia mojo.

IMHO I would rather pick up the 7970 over the 680 as it stands... of course, if AMD would sell a reference 7970 at $399, it would be back to the good ol' days of AMD love.

Pure power and compute throughput don't necessarily translate to the game engine and environment, and you have mentioned one reason... the drivers, or engine development... the choices made during the development of the game, and especially how well it is optimised.

The 7970 has something like 1 TFLOPS more in SP and 5x more TFLOPS in DP (not important for games anyway)... The GK20 with a dual Kepler is just at 4.5 TFLOPS... really close to the 7970... I have no doubt why Nvidia has some problems with their GK110 against GCN... they will not be able to match 2x Kepler, and GCN has a good lead in the compute part now. (Hence, maybe, why Nvidia skipped the GK100 and went straight to GK110.)

Nvidia has announced 1 TFLOPS in DP for the GK110, and the 7970 (a gaming part) is already at 1 TFLOPS DP...
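
For anyone wanting to sanity-check those TFLOPS figures, the usual theoretical-throughput arithmetic is ALUs × 2 ops (FMA) × clock; a small sketch using commonly cited specs (the ALU counts, clocks and DP rates here are my assumptions, not taken from the post above):

```python
def sp_tflops(alus: int, clock_ghz: float) -> float:
    """Theoretical single-precision TFLOPS: ALUs x 2 FLOPs/cycle x clock."""
    return alus * 2 * clock_ghz / 1000.0  # GFLOPS -> TFLOPS

# HD 7970 GHz Edition (Tahiti): 2048 ALUs at 1.05 GHz, DP at 1/4 rate.
tahiti_sp = sp_tflops(2048, 1.05)
tahiti_dp = tahiti_sp / 4
# GTX 680 (GK104): 1536 ALUs at ~1.006 GHz, DP at 1/24 rate.
gk104_sp = sp_tflops(1536, 1.006)
gk104_dp = gk104_sp / 24

print(f"7970 GHz Edition: {tahiti_sp:.2f} SP / {tahiti_dp:.2f} DP TFLOPS")
print(f"GTX 680:          {gk104_sp:.2f} SP / {gk104_dp:.2f} DP TFLOPS")
# Roughly 4.30/1.08 vs. 3.09/0.13 TFLOPS, which is where the "about
# 1 TFLOPS more SP and several times the DP rate" comparison comes from.
```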
 
It looks to me like using the Power Control Setting at +20 has more of an effect on the GHz Edition than on the previous 7970.
 
TechReport tested power consumption with Batman, which is much faster on Kepler. I suspect the latter reaches higher utilization in this game, thereby closing the power gap.

Still, the new 7970 is at the same 250W TDP as the old one, so it's probable that there were equally power hungry 7970s floating around already, just at 925MHz and with accordingly lower performance.

Speaking of the old 7970, 250W is the max board power... not the TDP... that is the maximum the card is designed to run at. The TDP varied between 160 and 190W; it is a bit higher for the 7970 GHz, but it still varies a lot from one game to another.
 