AMD: Sea Islands R1100 (8*** series) Speculation/Rumour Thread

Jaredpace
I'm with your reasoning there. It makes sense.
But...
I guess the marketing department feels differently about your suggestion.
It will be somewhat confusing with that many models in each series.
But at least two models should be doable, e.g. an 8970 and an 8980.
 
Why? Nothing I have seen changes the decision to be deterministic or not on the current products.
Actively using PowerTune like you seemingly did on that "no-additional-power-plug-so-let's-use-every-watt-we-have" HD7750 would be a very interesting option for a dual Tahiti card, though. Just like the "hard" 75W wall for the HD7750, there is a "hard" 300W limit for the HD7990 - so the situation seems comparable (though on diametrically opposed ends of the spectrum, of course).

I won't argue against being deterministic @stock settings. I really see your point there.

But given that you said PowerTune is highly programmable, I think a lot of people would really appreciate a kind of software switch in CCC that allows them to easily put their card into a "PowerBoost" mode that dynamically raises clocks/voltages to make full use of the given power budget at all times.

That way, you could basically keep the deterministic approach @stock settings - but provide your customers with all the advantages of easily and automatically pushing their specific card to its individual limits within a hard 300W power envelope.
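Roughly what I have in mind, as a very crude sketch (all names, numbers and step sizes are made up for illustration - this is not AMD's actual PowerTune firmware):

```python
# Crude sketch of the proposed opt-in "PowerBoost" mode.
# All constants and names are invented for illustration only.

BOARD_POWER_LIMIT_W = 300    # hard limit for a dual-GPU board
BASE_CLOCK_MHZ      = 850    # guaranteed, deterministic stock clock
MAX_BOOST_MHZ       = 1100   # per-SKU validated ceiling
CLOCK_STEP_MHZ      = 25     # adjustment per control interval

def pick_clock(current_mhz, estimated_power_w, power_boost_enabled):
    """Return the engine clock for the next control interval."""
    if not power_boost_enabled:
        return BASE_CLOCK_MHZ                                      # stock: deterministic
    if estimated_power_w > BOARD_POWER_LIMIT_W:
        return max(BASE_CLOCK_MHZ, current_mhz - CLOCK_STEP_MHZ)   # back off
    return min(MAX_BOOST_MHZ, current_mhz + CLOCK_STEP_MHZ)        # spend the headroom
```

With the switch off, nothing changes versus today's stock behaviour; with it on, the card creeps up toward whatever headroom the specific chip leaves under the 300W cap.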
 
And average and max framerates go up in multi-game reviews, resulting in the card earning a higher "performance percentage" in the rankings. Kudos to NV for flipping PowerTune over and renaming it.
 
Dave Baumann said:
Why? Nothing I have seen changes the decision to be deterministic or not on the current products. On the contrary in fact.
Overclocking has always been a gamble. I like the fact that overclocking is automatic and hassle free. I understand that it complicates things for reviewers, but that's a fair trade-off in my very personal opinion.
 
I wonder how people will feel about it being a fair trade-off when the reviewed products perform better than what they get from their own off-the-shelf part?
 
At the risk of sounding cynical, they will probably never know about it.
 
I see what you did there :D

The most probable candidate for performance schemes based on more aggressive power thresholds would be the HD7750 - and look what I found (I should read way more midrange reviews):

A stock PowerTune setting that's actually limiting gaming performance:

[image: HD 7750 clock throttling under the stock PowerTune limit]


And what's the result of PowerTune actually kicking in at stock settings?

Nearly 50% better perf/watt than GTX680
:oops:

As I said earlier: a detailed perf/power review of an OCed HD7970 @stock PowerTune limits would be really interesting. It should behave rather similarly to what Nvidia did on the GTX680 - just "self-made".

To be honest, with a "light" OC on the 7970s (1125MHz), I have never set PowerTune to +20%. I have always left it at default (and I can say there is never any throttling down). When I OC them to 1150-1200MHz+, I need to increase the Vcore with MSI AB, so I don't know how PowerTune behaves in that case. (I always double-check the GPU core speed at the same time, and I have never seen any throttling.)

Overclocking has always been a gamble. I like the fact that overclocking is automatic and hassle free. I understand that it complicates things for reviewers, but that's a fair trade-off in my very personal opinion.

Yes and no - we are not talking about an extreme OC. It seems most games are running at 1100-1110MHz max and some move between 1058 and 1100MHz; that's just 100MHz more. I have yet to see a 7970 that can't be set to 1100MHz (the max CCC limit is 1125MHz) without any hassle (outside of fan speed, of course).

Don't forget, GPU Boost on the 680 is also linked to temperatures (I think SKYMTL showed it in his review). The cooler is really good, but I'm not sure you will get the same fps after an hour in a BF3 multiplayer match as you get in the 110-second fixed single-player benchmark. Of course this won't change your experience dramatically, as the game will still run well, and you will probably not notice anything.
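The effect being described would look roughly like this (invented temperature bins and clocks - not NVIDIA's actual GPU Boost tables, just the shape of the behaviour):

```python
# Invented numbers illustrating a temperature-dependent boost cap.

def max_boost_mhz(gpu_temp_c):
    """Highest boost clock allowed at a given GPU temperature (made-up bins)."""
    if gpu_temp_c < 70:
        return 1110   # cool: short canned benchmarks tend to live here
    if gpu_temp_c < 80:
        return 1084
    if gpu_temp_c < 90:
        return 1058   # an hour into a BF3 match you may be here instead
    return 1006       # fall back toward base clock when hot
```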
 
Overclocking has always been a gamble. I like the fact that overclocking is automatic and hassle free. I understand that it complicates things for reviewers, but that's a fair trade-off in my very personal opinion.
I think the problem with Nvidia's approach really isn't the approach itself, but the fact that it's a stock setting and can't be easily turned on/off.

As I said earlier, I really think it would be a great idea to actually give customers the option to enable that kind of "automatic and hassle free" power-defined dynamic overclocking on their card if they choose to do so. It would basically just be a way to automatically unleash the "bonus" reserves of your specific chip without having to do a manual and "brute", i.e. frequency-defined, overclock.

Problems start when you make that kind of chip-quality-dependent auto-overclocking a default setting.

Had Nvidia marketed their GTX680 as a stock 1GHz card (the clock speed they actually guarantee) with a simple switch to turn on (chip-quality-dependent) GPU Boost, it would have been a great and honest way to implement that kind of feature.

That being said, typical Boost rates of Nvidia's GTX680 seem small enough not to alarm any customer who actually draws a dud - very much like Alexko said (that's not to say I'd prefer their way of implementing it, though).


In the light of this very interesting discussion, I'd really dig an HD7990 that comes @ "hard" 850MHz 3D stock clocks - but has a kind of CCC switch that allows customers (and OC-interested reviewers, for that matter) to turn on an option to dynamically boost clock speeds up to a (previously validated) limit (say 1100MHz) in situations where the power budget isn't fully exhausted by the default clocks. Actual performance benefits of such a feature will be chip-specific - but as long as you market it as a free "bonus" on top of the guaranteed stock performance, I don't see a problem.
 
Yes, but it is also the "bonus" for those that are interested in it. Why turn the default product performance into a "gamble"?

Because it is already a "gamble"? Performance may vary with OS version, 3rd-party software, monthly driver releases, etc.

For the market it is targeted at, this variance is acceptable.
 
Nice mockup Mianca :)
I would like to add a level to your sketch.
In my opinion as an end user, I would prefer a SmartBoost "automatic power deterministic" checkbox, and that's it. I'd like it to be super easy to use for everyone. SmartBoost should then overclock the card based on the specific chip at hand.

Or.
You could choose the checkbox "Manual overclock" with all the settings a geek needs, just like your mockup.
 

Is it?
Think about it this way: reviews, at least at first, generally get cherry-picked "press samples", which consist of pretty much the best chips out there.
They get a huge boost, you get the impression that card X is this much faster than card Z, so you go and buy card X. Once you get your card, you'll soon find out that despite having a machine identical to the one in the benchmark you viewed, your performance is a few percent off the pace set in the review.

Wouldn't you feel cheated at that point?

The big issue is that reviews might show more than just a percent or two of extra performance simply because they happen to have a better chip than your retail sample.
 
Nice mockup Mianca :)
I would like to add a level to your sketch.
In my opinion as an end user, I would prefer a SmartBoost "automatic power deterministic" checkbox, and that's it. I'd like it to be super easy to use for everyone. SmartBoost should then overclock the card based on the specific chip at hand.
Yeah, you're probably right. Better keep it simple.

I'd still like an "advanced" tab hidden somewhere that allows me to fine-tune power target and base clocks, though. Max Boost clock is probably not needed - why would you put that on anything but the max value anyway? Don't remember why I even put it in there - possibly just because I had a fourth slider to use for something :D

So, yeah. Maybe just a big red "Enable SmartBoost" button plus an "advanced settings" button that opens up some fine-tuning sliders for those who care.

[mockup: "Enable SmartBoost" button with an "advanced settings" panel of fine-tuning sliders]


Now, let's hope that function follows form - and they actually implement something like this in their upcoming cards. :smile:
 
Since Dave is actively participating in this thread I'd like to take the chance and ask about how GPU-Boost is different from what Intel and AMD are employing in their CPUs. Based on TDP headroom, clocks are increased - shouldn't this also be subject to individual ASIC variation?
 
Llano and Bulldozer use activity counters, like Cayman & S.I., and therefore produce deterministic results. Intel relies on analog measurements detailed by David Kanter here: http://www.realworldtech.com/page.cfm?ArticleID=RWT042011004622&p=2

So Intel and NVIDIA use a similar system, although Intel's is much more sophisticated.
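For what it's worth, the difference boils down to where the power number comes from. A toy sketch (invented counters and weights, not any vendor's actual firmware):

```python
# Toy illustration of digital estimation vs. analog measurement.
# Counters, weights and numbers are made up.

# Cayman/GCN, Llano, Bulldozer style: power is *inferred* from activity
# counters using fixed per-SKU weights, so every chip of a given SKU computes
# the same number for the same workload -> deterministic clock behaviour.
ACTIVITY_WEIGHTS_W = {"alu_busy": 0.8, "tex_fetch": 0.5, "mem_req": 0.3}

def estimated_power_w(counters, idle_w=30.0):
    return idle_w + sum(ACTIVITY_WEIGHTS_W[k] * counters[k]
                        for k in ACTIVITY_WEIGHTS_W)

# Intel Turbo / GTX680 GPU Boost style: power is *measured* via on-board
# current/voltage sensing, so leakage and chip quality feed straight into the
# boost decision -> clocks vary from one physical chip to the next.
def measured_power_w(sensed_current_a, sensed_voltage_v):
    return sensed_current_a * sensed_voltage_v
```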
 

Thx! So Turbo won't be dependent on the cooling solution employed, right?
What caught my attention in Anand's piece about Bulldozer was this part:
"The APM modules samples a number of performance counter signals and these samples are used to estimate dynamic power with 98% accuracy."

So static power cannot be a factor in the power management then? As must be the case with GCN chips, since they're also using digitally obtained estimates from activity counters. I'm not sure if I'm missing something, but wouldn't the set values then have to contain worst-case guard bands for the most leaky chip that will ever get past their screening process?
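If that's how it works, the budget handed to the estimator would presumably look something like this (purely illustrative numbers):

```python
# Purely illustrative numbers - just to show what a worst-case leakage
# guard band would mean for the dynamic budget every chip is allowed to use.

TDP_W               = 250.0  # board power limit the SKU must never exceed
WORST_CASE_STATIC_W = 45.0   # leakage of the leakiest die that passes screening
DYNAMIC_BUDGET_W    = TDP_W - WORST_CASE_STATIC_W  # what the counter-based
                                                   # estimate is checked against

# A low-leakage chip might really only burn ~25 W statically, so it leaves
# ~20 W of the budget permanently on the table - the price of determinism.
```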
 